Physical concepts in the development of constitutive equations
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1985-01-01
Proposed viscoplastic material models incorporate observed material response in their formulation but do not generally draw on principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses about material response have been made from first principles, and many of these hypotheses have been tested experimentally. The proposed viscoplastic theories should therefore be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics, and quantum mechanics, together with the effects of defects, is reviewed for its application to the development of constitutive laws.
The maximum entropy production principle: two basic questions.
Martyushev, Leonid M
2010-05-12
The overwhelming majority of maximum entropy production applications to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics, and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise available to date.
Statistical Thermodynamics and Microscale Thermophysics
NASA Astrophysics Data System (ADS)
Carey, Van P.
1999-08-01
Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.
Comparison of Classical and Quantum Mechanical Uncertainties.
ERIC Educational Resources Information Center
Peslak, John, Jr.
1979-01-01
Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)
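A worked example gives the flavor of such comparisons (a sketch from standard results, not reproduced from the article): for a classical harmonic oscillator of amplitude A, the time-averaged position density is $p_{cl}(x) = 1/\big(\pi\sqrt{A^2 - x^2}\big)$, which gives $\Delta x = A/\sqrt{2}$ and $\Delta p = m\omega A/\sqrt{2}$, so

\[ (\Delta x\,\Delta p)_{cl} = \tfrac{1}{2} m \omega A^2 = \frac{E}{\omega}, \]

while the quantum stationary state $|n\rangle$ gives $(\Delta x\,\Delta p)_{qm} = \hbar\,(n + \tfrac{1}{2})$; substituting $E_n = \hbar\omega(n + \tfrac{1}{2})$ into the classical product shows the two agree level by level.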
A first principles calculation and statistical mechanics modeling of defects in Al-H system
NASA Astrophysics Data System (ADS)
Ji, Min; Wang, Cai-Zhuang; Ho, Kai-Ming
2007-03-01
The behavior of defects and hydrogen in Al was investigated by first-principles calculations and statistical mechanics modeling. The formation energies of different defects in the Al+H system, such as an Al vacancy, interstitial H, and multiple H atoms in an Al vacancy, were calculated by a first-principles method. The defect concentration in thermodynamic equilibrium was studied by a total free-energy calculation, including configurational entropy and defect-defect interactions, from the low-concentration limit to the hydride limit. In our grand canonical ensemble model, the hydrogen chemical potential under different environments plays an important role in determining the defect concentrations and properties of the Al-H system.
Twenty-five years of maximum-entropy principle
NASA Astrophysics Data System (ADS)
Kapur, J. N.
1983-04-01
The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.
Quantum formalism for classical statistics
NASA Astrophysics Data System (ADS)
Wetterich, C.
2018-06-01
In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
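The boundary-to-bulk picture can be illustrated with a toy computation on a one-dimensional Ising chain, where the left/right partial partition vectors play the role of the wave functions that the transfer matrix evolves from hypersurface to hypersurface (a minimal sketch under invented parameters, not code from the paper):

import numpy as np

# Minimal sketch: information transport from a boundary into the bulk of a
# 1D Ising chain. The partial partition vectors act as the "wave functions"
# evolved step by step by the transfer matrix.
beta, J, N = 1.0, 1.0, 20
T = np.array([[np.exp(beta * J), np.exp(-beta * J)],
              [np.exp(-beta * J), np.exp(beta * J)]])   # transfer matrix
sz = np.diag([1.0, -1.0])                               # local spin observable

left = np.array([1.0, 0.0])    # boundary spin clamped "up"
right = np.ones(2)             # free boundary at the far end

def expectation(site):
    """<s_site>: evolve boundary information to the site, then contract."""
    l, r = left.copy(), right.copy()
    for _ in range(site):
        l = l @ T              # Schroedinger-picture step toward the bulk
    for _ in range(N - 1 - site):
        r = T @ r
    return (l @ sz @ r) / (l @ r)   # analogue of <q|A|q> / <q|q>

print([round(expectation(i), 4) for i in range(N)])  # decays into the bulk

The printed profile shows the boundary information decaying into the bulk, which is the classical-statistical analogue of the local probabilistic evolution discussed in the abstract.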
NASA Astrophysics Data System (ADS)
Stapp, Henry P.
2011-11-01
The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.
Dynamic principle for ensemble control tools.
Samoletov, A; Vasiev, B
2017-11-28
Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
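As a concrete point of reference, here is a minimal sketch (invented parameters, illustrative only, not the authors' generalized scheme) of the Nosé-Hoover-Langevin thermostat named above, in which an auxiliary variable steers the kinetic energy toward its canonical value while an Ornstein-Uhlenbeck noise term restores ergodicity:

import numpy as np

# Minimal Nose-Hoover-Langevin thermostat on a 1D harmonic oscillator:
# xdot = p/m, pdot = -k x - xi p, and xi relaxes toward the kinetic-energy
# imbalance with Langevin noise (stationary variance kT/Q).
rng = np.random.default_rng(0)
m, k, kT, Q, gamma, dt = 1.0, 1.0, 1.0, 1.0, 1.0, 1e-3
x, p, xi = 1.0, 0.0, 0.0
p2 = []
for step in range(1_000_000):
    x += dt * p / m
    p += dt * (-k * x - xi * p)
    xi += dt * ((p * p / m - kT) / Q - gamma * xi) \
          + np.sqrt(2.0 * gamma * kT / Q * dt) * rng.normal()
    if step % 50 == 0:
        p2.append(p * p)
print("<p^2>/m =", np.mean(p2) / m, " target kT =", kT)

The long-time average of p^2/m should approach kT, the signature that the canonical measure is left invariant.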
Designing Networks that are Capable of Self-Healing and Adapting
2017-04-01
Using tools from statistical mechanics, combinatorics, Boolean networks, and numerical simulations, and inspired by design principles from biological networks, we develop principles for self-healing networks and their applications, and construct an all-possible-paths model for network adaptation.
Mechanics: Statics; A Syllabus.
ERIC Educational Resources Information Center
Compo, Louis
The instructor's guide presents material for structuring an engineering fundamentals course covering the basic laws of statics as part of a mechanical technology program. Detailed behavioral objectives are described for the following five areas of course content: principles of mechanics, two-dimensional equilibrium, equilibrium of internal…
Lehoucq, R B; Sears, Mark P
2011-09-01
The purpose of this paper is to derive the energy and momentum conservation laws of the peridynamic nonlocal continuum theory using the principles of classical statistical mechanics. The peridynamic laws allow the consideration of discontinuous motion, or deformation, by relying on integral operators. These operators sum forces and power expenditures separated by a finite distance and so represent nonlocal interaction. The integral operators replace the differential divergence operators conventionally used, thereby obviating special treatment at points of discontinuity. The derivation presented employs a general multibody interatomic potential, avoiding the standard assumption of a pairwise decomposition. The integral operators are also expressed in terms of a stress tensor and heat flux vector under the assumption that these fields are differentiable, demonstrating that the classical continuum energy and momentum conservation laws are consequences of the more general peridynamic laws. An important conclusion is that nonlocal interaction is intrinsic to continuum conservation laws when derived using the principles of statistical mechanics.
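For orientation, the peridynamic momentum balance at issue is commonly written as (standard form from the peridynamics literature, not quoted from this paper):

\[ \rho(x)\,\ddot{u}(x,t) = \int_{\mathcal{H}_x} f\big(u(x',t) - u(x,t),\, x' - x\big)\, dV_{x'} + b(x,t), \]

where the integral of the pairwise force density f over the finite horizon $\mathcal{H}_x$ replaces the local divergence term $\nabla \cdot \sigma$ of classical continuum mechanics, so no spatial derivatives of the deformation are needed at points of discontinuity.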
A mechanism producing power law etc. distributions
NASA Astrophysics Data System (ADS)
Li, Heling; Shen, Hongjun; Yang, Bin
2017-07-01
Power-law distributions play an increasingly important role in the study of complex systems. Motivated by the intractability of complex systems, the idea of incomplete statistics is adopted and extended: three different exponential factors are introduced into the equations for the normalization condition, the statistical average, and the Shannon entropy. From the Shannon entropy and the maximum entropy principle, probability distribution functions of exponential form, power-law form, and the product form of a power law and an exponential are then deduced. It is thus shown that the maximum entropy principle can fully replace the equal-probability hypothesis. Because power-law distributions and distributions of the product form, which cannot be derived from the equal-probability hypothesis, can be derived with the aid of the maximum entropy principle, we conclude that the maximum entropy principle is the more fundamental and more broadly applicable principle governing the laws of motion of objects. This principle also reveals an intrinsic link between Nature and the diverse objects of human society, and the principles they all obey.
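The textbook MaxEnt calculation behind these claims can be stated in two lines (a standard illustration, not the authors' incomplete-statistics derivation). Maximizing the Shannon entropy $S = -\int p(x) \ln p(x)\,dx$ subject to normalization and to constraints on $\langle \ln x \rangle$ and $\langle x \rangle$ gives

\[ p(x) \propto e^{-\lambda \ln x - \mu x} = x^{-\lambda} e^{-\mu x}; \]

keeping only the $\langle x \rangle$ constraint yields the pure exponential, keeping only $\langle \ln x \rangle$ yields the pure power law, and keeping both yields the product form discussed above.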
Kantardjiev, Alexander A
2015-04-05
A cluster of strongly interacting ionization groups with irregular ionization behavior in a protein molecule is suggestive of a specific structure-function relationship. However, the computational treatment of such clusters is unconventional (e.g., a naive self-consistent iterative algorithm fails to converge). A rigorous treatment requires evaluating Boltzmann-averaged statistical mechanics sums, with an electrostatic energy estimate for each microstate. irGPU (Irregular strong interactions in proteins--a GPU solver) is a novel solution to a versatile problem in protein biophysics: the atypical protonation behavior of coupled groups. The computational severity of the problem is alleviated by parallelization (via GPU kernels), applied both to the evaluation of electrostatic interactions (including explicit electrostatics via the fast multipole method) and to the estimation of statistical mechanics sums (partition functions). Special attention is given to ease of use and to the encapsulation of theoretical details without sacrificing the rigor of the computational procedures. irGPU is not just a solution in principle but a promising practical application with the potential to entice the community into a deeper understanding of the principles governing biomolecular mechanisms. © 2015 Wiley Periodicals, Inc.
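To make the Boltzmann-averaged sums concrete, here is a brute-force sketch of a coupled-sites titration calculation, the kind of quantity irGPU evaluates at scale on GPUs; the pKa values and interaction energies below are invented for illustration, not the solver's actual electrostatic model:

import itertools
import numpy as np

# Brute-force titration of N coupled ionizable sites: enumerate all 2^N
# protonation microstates and Boltzmann-average the protonation of each site.
ln10, kT = np.log(10.0), 0.593          # kcal/mol at ~298 K
pKa_intr = np.array([4.0, 6.5, 7.2])    # hypothetical intrinsic pKa values
W = np.array([[0.0, 1.2, 0.3],          # hypothetical pairwise interaction
              [1.2, 0.0, 0.8],          # energies between protonated sites
              [0.3, 0.8, 0.0]])

def site_protonation(pH):
    """Boltzmann-averaged protonation of each site at a given pH."""
    N = len(pKa_intr)
    Z, avg = 0.0, np.zeros(N)
    for s in itertools.product([0, 1], repeat=N):   # all 2^N microstates
        s = np.array(s)
        E = kT * ln10 * np.sum(s * (pH - pKa_intr)) + 0.5 * s @ W @ s
        w = np.exp(-E / kT)
        Z += w                                       # partition function
        avg += w * s
    return avg / Z

print(site_protonation(pH=7.0))

The exponential cost of the microstate enumeration is exactly what motivates the GPU parallelization described in the abstract.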
Rigorous force field optimization principles based on statistical distance minimization
Vlcek, Lukas; Chialvo, Ariel A.
2015-10-12
We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model's static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
Quantum Mechanics and the Principle of Least Radix Economy
NASA Astrophysics Data System (ADS)
Garcia-Morales, Vladimir
2015-03-01
A new variational method, the principle of least radix economy, is formulated. The mathematical and physical relevance of the radix economy, also called digit capacity, is established, showing how physical laws can be derived from this concept in a unified way. The principle reinterprets and generalizes the principle of least action yielding two classes of physical solutions: least action paths and quantum wavefunctions. A new physical foundation of the Hilbert space of quantum mechanics is then accomplished and it is used to derive the Schrödinger and Dirac equations and the breaking of the commutativity of spacetime geometry. The formulation provides an explanation of how determinism and random statistical behavior coexist in spacetime and a framework is developed that allows dynamical processes to be formulated in terms of chains of digits. These methods lead to a new (pre-geometrical) foundation for Lorentz transformations and special relativity. The Parker-Rhodes combinatorial hierarchy is encompassed within our approach and this leads to an estimate of the interaction strength of the electromagnetic and gravitational forces that agrees with the experimental values to an error of less than one thousandth. Finally, it is shown how the principle of least-radix economy naturally gives rise to Boltzmann's principle of classical statistical thermodynamics. A new expression for a general (path-dependent) nonequilibrium entropy is proposed satisfying the Second Law of Thermodynamics.
Phenomenology of small violations of Fermi and Bose statistics
NASA Astrophysics Data System (ADS)
Greenberg, O. W.; Mohapatra, Rabindra N.
1989-04-01
In a recent paper, we proposed a ``paronic'' field-theory framework for possible small deviations from the Pauli exclusion principle. This theory cannot be represented in a positive-metric (Hilbert) space. Nonetheless, the issue of possible small violations of the exclusion principle can be addressed in the framework of quantum mechanics, without being connected with a local quantum field theory. In this paper, we discuss the phenomenology of small violations of both Fermi and Bose statistics. We consider the implications of such violations in atomic, nuclear, particle, and condensed-matter physics and in astrophysics and cosmology. We also discuss experiments that can detect small violations of Fermi and Bose statistics or place stringent bounds on their validity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallavotti, G.
It is shown that the chaoticity hypothesis recently introduced in statistical mechanics, which is analogous to Ruelle's principle for turbulence, implies the Onsager reciprocity and the fluctuation-dissipation theorem in various reversible models for coexisting transport phenomena.
Statistical analogues of thermodynamic extremum principles
NASA Astrophysics Data System (ADS)
Ramshaw, John D.
2018-05-01
As shown by Jaynes, the canonical and grand canonical probability distributions of equilibrium statistical mechanics can be simply derived from the principle of maximum entropy, in which the statistical entropy $S = -k_B \sum_i p_i \log p_i$ is maximised subject to constraints on the mean values of the energy E and/or number of particles N in a system of fixed volume V. The Lagrange multipliers associated with those constraints are then found to be simply related to the temperature T and chemical potential μ. Here we show that the constrained maximisation of S is equivalent to, and can therefore be replaced by, the essentially unconstrained minimisation of the obvious statistical analogues of the Helmholtz free energy F = E - TS and the grand potential J = F - μN. Those minimisations are more easily performed than the maximisation of S because they formally eliminate the constraints on the mean values of E and N and their associated Lagrange multipliers. This procedure significantly simplifies the derivation of the canonical and grand canonical probability distributions, and shows that the well known extremum principles for the various thermodynamic potentials possess natural statistical analogues which are equivalent to the constrained maximisation of S.
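The canonical case makes the equivalence concrete (a sketch of the standard steps, with notation matching the abstract): minimising the statistical analogue of F subject only to normalisation,

\[ F[p] = \sum_i p_i E_i + k_B T \sum_i p_i \log p_i, \qquad \frac{\partial}{\partial p_i}\Big( F[p] + \alpha \sum_j p_j \Big) = 0, \]

gives $E_i + k_B T(\log p_i + 1) + \alpha = 0$, hence

\[ p_i = \frac{e^{-E_i/k_B T}}{Z}, \qquad Z = \sum_i e^{-E_i/k_B T}, \]

with no Lagrange multiplier needed for the mean energy, which is the simplification the paper exploits.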
The development of ensemble theory. A new glimpse at the history of statistical mechanics
NASA Astrophysics Data System (ADS)
Inaba, Hajime
2015-12-01
This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.
Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems
NASA Astrophysics Data System (ADS)
Gogolin, Christian; Eisert, Jens
2016-05-01
We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.
Domain generality vs. modality specificity: The paradox of statistical learning
Frost, Ram; Armstrong, Blair C.; Siegelman, Noam; Christiansen, Morten H.
2015-01-01
Statistical learning is typically considered to be a domain-general mechanism by which cognitive systems discover the underlying distributional properties of the input. Recent studies examining whether there are commonalities in the learning of distributional information across different domains or modalities consistently reveal, however, modality and stimulus specificity. An important question is, therefore, how and why a hypothesized domain-general learning mechanism systematically produces such effects. We offer a theoretical framework according to which statistical learning is not a unitary mechanism but a set of domain-general computational principles that operate in different modalities and are therefore subject to the specific constraints characteristic of their respective brain regions. This framework offers testable predictions and we discuss its computational and neurobiological plausibility. PMID:25631249
NASA Technical Reports Server (NTRS)
Speziale, Charles G.
1988-01-01
The invariance of constitutive equations in continuum mechanics is examined from a basic theoretical standpoint. It is demonstrated that constitutive equations which are not form invariant under arbitrary translational accelerations of the reference frame violate the Einstein equivalence principle. Furthermore, by making use of an analysis based on statistical mechanics, it is argued that any frame-dependent terms in constitutive equations must arise from the intrinsic spin tensor and are negligible provided that the ratio of microscopic to macroscopic time scales is extremely small. The consistency of these results with existing constitutive theories is discussed in detail along with possible avenues of future research.
Mechanics, Waves and Thermodynamics
NASA Astrophysics Data System (ADS)
Ranjan Jain, Sudhir
2016-05-01
Figures; Preface; Acknowledgement; 1. Energy, mass, momentum; 2. Kinematics, Newton's laws of motion; 3. Circular motion; 4. The principle of least action; 5. Work and energy; 6. Mechanics of a system of particles; 7. Friction; 8. Impulse and collisions; 9. Central forces; 10. Dimensional analysis; 11. Oscillations; 12. Waves; 13. Sound of music; 14. Fluid mechanics; 15. Water waves; 16. The kinetic theory of gases; 17. Concepts and laws of thermodynamics; 18. Some applications of thermodynamics; 19. Basic ideas of statistical mechanics; Bibliography; Index.
Non Kolmogorov Probability Models Outside Quantum Mechanics
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2009-03-01
This paper is devoted to analysis of main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, Heisenberg principle, "deterministic" and "exact" theories, laws of chance, notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR-chameleon experiment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mehra, J.
1987-05-01
In this paper, the main outlines of the discussions of Niels Bohr with Albert Einstein, Werner Heisenberg, and Erwin Schroedinger during 1920-1927 are treated. From the formulation of quantum mechanics in 1925-1926 and wave mechanics in 1926, there emerged Born's statistical interpretation of the wave function in summer 1926, and on the basis of the quantum mechanical transformation theory - formulated in fall 1926 by Dirac, London, and Jordan - Heisenberg formulated the uncertainty principle in early 1927. At the Volta Conference in Como in September 1927 and at the fifth Solvay Conference in Brussels the following month, Bohr publicly enunciated his complementarity principle, which had been developing in his mind for several years. The Bohr-Einstein discussions about the consistency and completeness of quantum mechanics and of physical theory as such - formally begun in October 1927 at the fifth Solvay Conference and carried on at the sixth Solvay Conference in October 1930 - were continued during the next decades. All these aspects are briefly summarized.
Generalized statistical mechanics approaches to earthquakes and tectonics.
Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios
2016-12-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
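For reference, the entropy underlying NESM is the Tsallis form

\[ S_q = k\, \frac{1 - \sum_i p_i^q}{q - 1}, \]

which recovers the Boltzmann-Gibbs entropy as $q \to 1$; its maximization under suitable constraints yields q-exponential distributions, $p(x) \propto [1 - (1 - q)\beta x]^{1/(1-q)}$, whose asymptotic power-law tails match the scale-invariant features of seismicity described above.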
Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray
2014-05-13
The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.
Principle of maximum entropy for reliability analysis in the design of machine components
NASA Astrophysics Data System (ADS)
Zhang, Yimin
2018-03-01
We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
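A minimal numerical sketch of the PME step, recovering a density on an interval from a few moments by minimizing the convex dual of the entropy functional (the interval and moment values below are invented, not taken from the paper):

import numpy as np
from scipy.optimize import minimize
from scipy.integrate import quad

# Maximum-entropy PDF p(x) ~ exp(sum_k lambda_k x^k) on [a, b], fitted so
# that its first K moments match prescribed targets.
a, b = 0.0, 1.0
moments = np.array([0.5, 0.3])          # target <x>, <x^2> (hypothetical)
K = len(moments)

def logZ(lam):
    f = lambda x: np.exp(sum(l * x**(k + 1) for k, l in enumerate(lam)))
    return np.log(quad(f, a, b)[0])

def dual(lam):
    # gradient of this dual is <x^k> - mu_k, so its minimum matches moments
    return logZ(lam) - lam @ moments

lam = minimize(dual, np.zeros(K)).x
Z = np.exp(logZ(lam))
pdf = lambda x: np.exp(sum(l * x**(k + 1) for k, l in enumerate(lam))) / Z
print("lambda =", lam, " check <x> =", quad(lambda x: x * pdf(x), a, b)[0])

The same construction extends to the non-normal parameter distributions of the machine components discussed above, with the state-function moments supplying the constraints.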
Statistical mechanical theory for steady state systems. VI. Variational principles
NASA Astrophysics Data System (ADS)
Attard, Phil
2006-12-01
Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.
Do the Modified Uncertainty Principle and Polymer Quantization predict same physics?
NASA Astrophysics Data System (ADS)
Majumder, Barun; Sen, Sourav
2012-10-01
In this Letter we study the effects of the Modified Uncertainty Principle as proposed in Ali et al. (2009) [5] in simple quantum mechanical systems and study its thermodynamic properties. We have assumed that the quantum particles follow Maxwell-Boltzmann statistics with no spin. We compare our results with the results found in the GUP and polymer quantum mechanical frameworks. Interestingly, we find that the corrected thermodynamic quantities are exactly the same as the polymer results, but the length scale considered has a theoretically different origin. Hence we express the need for further study to investigate whether these two approaches are conceptually connected at the fundamental level.
Constitutive Modeling, Nonlinear Behavior, and the Stress-Optic Law
2011-01-01
estimates of D̂ from dynamic mechanical measurements. Some results are shown in Figure 58 for a filled EPDM rubber [116]. There is rough agreement with... elastomers and filler-reinforced rubber. 5.1 Linearity and the superposition principle. The problem of analyzing viscoelastic mechanical behavior is greatly... deformation such as shear. For crosslinked rubber the strain can be defined in terms of the strain function suggested by the statistical theories of…
The actual content of quantum theoretical kinematics and mechanics
NASA Technical Reports Server (NTRS)
Heisenberg, W.
1983-01-01
First, exact definitions are supplied for the terms: position, velocity, energy, etc. (of the electron, for instance), such that they are valid also in quantum mechanics. Canonically conjugated variables are determined simultaneously only with a characteristic uncertainty. This uncertainty is the intrinsic reason for the occurrence of statistical relations in quantum mechanics. Mathematical formulation is made possible by the Dirac-Jordan theory. Beginning from the basic principles thus obtained, macroscopic processes are understood from the viewpoint of quantum mechanics. Several imaginary experiments are discussed to elucidate the theory.
The unrealized promise of infant statistical word-referent learning
Smith, Linda B.; Suanda, Sumarga H.; Yu, Chen
2014-01-01
Recent theory and experiments offer a new solution as to how infant learners may break into word learning, by using cross-situational statistics to find the underlying word-referent mappings. Computational models demonstrate the in-principle plausibility of this statistical learning solution and experimental evidence shows that infants can aggregate and make statistically appropriate decisions from word-referent co-occurrence data. We review these contributions and then identify the gaps in current knowledge that prevent a confident conclusion about whether cross-situational learning is the mechanism through which infants break into word learning. We propose an agenda to address that gap that focuses on detailing the statistics in the learning environment and the cognitive processes that make use of those statistics. PMID:24637154
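The core statistical mechanism is simple to state in code. A toy sketch of cross-situational co-occurrence learning follows (the vocabulary and scenes are invented for illustration; real models add decay, competition, and attention):

from collections import Counter, defaultdict

# Each "scene" pairs a few heard words with a few candidate referents; any
# single scene is ambiguous, but co-occurrence counts accumulated across
# scenes single out the correct word-referent mapping.
scenes = [({"ball", "dog"}, {"BALL", "DOG"}),
          ({"ball", "cup"}, {"BALL", "CUP"}),
          ({"dog", "cup"},  {"DOG", "CUP"}),
          ({"ball"},        {"BALL"})]

counts = defaultdict(Counter)
for words, referents in scenes:
    for w in words:
        for r in referents:
            counts[w][r] += 1          # aggregate evidence across scenes

for w, c in counts.items():
    referent, n = c.most_common(1)[0]  # statistically appropriate decision
    print(f"{w} -> {referent} (seen together {n}x)")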
Statistical Mechanical Proof of the Second Law of Thermodynamics based on Volume Entropy
NASA Astrophysics Data System (ADS)
Campisi, Michele
2007-10-01
As pointed out in [M. Campisi, Stud. Hist. Phil. Mod. Phys. 36 (2005) 275-290], the volume entropy (that is, the logarithm of the volume of phase space enclosed by the constant-energy hyper-surface) provides a good mechanical analogue of thermodynamic entropy because it satisfies the heat theorem and it is an adiabatic invariant. This property explains the "equal" sign in Clausius' principle ($S_f \ge S_i$) in a purely mechanical way and suggests that the volume entropy might explain the "larger than" sign (i.e., the Law of Entropy Increase) if non-adiabatic transformations were considered. Based on the principles of quantum mechanics, here we prove that, provided the initial equilibrium satisfies the natural condition of decreasing ordering of probabilities, the expectation value of the volume entropy cannot decrease for arbitrary transformations performed by some external sources of work on an insulated system. This can be regarded as a rigorous quantum mechanical proof of the Second Law.
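For reference, the quantities involved are (standard classical definitions, not the paper's notation):

\[ \Omega(E) = \int d\Gamma\, \Theta\big(E - H(\Gamma)\big), \qquad S_\Omega = k_B \log \Omega(E); \]

with the temperature defined by $T^{-1} = \partial S_\Omega / \partial E$, the combination $(dE + P\,dV)/T$ is an exact differential (the heat theorem), and $S_\Omega$ is invariant under slow (adiabatic) transformations, which is what qualifies it as a mechanical analogue of thermodynamic entropy.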
ERIC Educational Resources Information Center
Grenn, Michael W.
2013-01-01
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of…
A Comparative Analysis of the Minuteman Education Programs as Currently Offered at Six SAC Bases.
1980-06-01
Prerequisite courses: Principles of Marketing (3); Business Statistics (3); Business Law (3); Management ...; Principles of Marketing (3); Mathematics Methods I ... Total prerequisite hours: 26. Required graduate courses: Policy Formulation and Administration (3); Management ...; Business and Economic Statistics (3); Intermediate Business and Economic Statistics (3); Principles of Management (3); Corporation Finance (3); Principles of Marketing (3).
On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics
NASA Astrophysics Data System (ADS)
Busch, Paul; Quadt, Ralf
1990-10-01
Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Shu-Kun
1996-12-31
The Gibbs paradox statement of the entropy of mixing has been regarded as the theoretical foundation of statistical mechanics, quantum theory and biophysics. However, all the relevant chemical experimental observations and logical analyses indicate that the Gibbs paradox statement is false. I prove that this statement is wrong: the Gibbs paradox statement implies that entropy decreases with increasing symmetry (as represented by a symmetry number σ; see any statistical mechanics textbook). From group theory, any system has at least the symmetry number σ = 1, corresponding to the identity operation for a strictly asymmetric system. It follows that the entropy of a system would be equal to, or less than, zero. However, from either the von Neumann-Shannon entropy formula ($S = -\sum_i p_i \ln p_i$) or the Boltzmann entropy formula ($S = \ln w$) and the original definition, entropy is non-negative. Therefore, this statement is false. It should not be a surprise that, for the first time, many outstanding problems, such as the validity of Pauling's resonance theory, the explanation of second-order phase transition phenomena, the biophysical problem of protein folding and the related hydrophobic effect, etc., can be solved. Empirical principles such as the Pauli principle (and Hund's rule) and the HSAB principle, etc., can also be given a theoretical explanation.
Physics of Electronic Materials
NASA Astrophysics Data System (ADS)
Rammer, Jørgen
2017-03-01
1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.
Reinterpreting maximum entropy in ecology: a null hypothesis constrained by ecological mechanism.
O'Dwyer, James P; Rominger, Andrew; Xiao, Xiao
2017-07-01
Simplified mechanistic models in ecology have been criticised for the fact that a good fit to data does not imply the mechanism is true: pattern does not equal process. In parallel, the maximum entropy principle (MaxEnt) has been applied in ecology to make predictions constrained by just a handful of state variables, like total abundance or species richness. But an outstanding question remains: what principle tells us which state variables to constrain? Here we attempt to solve both problems simultaneously, by translating a given set of mechanisms into the state variables to be used in MaxEnt, and then using this MaxEnt theory as a null model against which to compare mechanistic predictions. In particular, we identify the sufficient statistics needed to parametrise a given mechanistic model from data and use them as MaxEnt constraints. Our approach isolates exactly what mechanism is telling us over and above the state variables alone. © 2017 John Wiley & Sons Ltd/CNRS.
Statistical mechanics and thermodynamic limit of self-gravitating fermions in D dimensions.
Chavanis, Pierre-Henri
2004-06-01
We discuss the statistical mechanics of a system of self-gravitating fermions in a space of dimension D. We plot the caloric curves of the self-gravitating Fermi gas giving the temperature as a function of energy and investigate the nature of phase transitions as a function of the dimension of space. We consider stable states (global entropy maxima) as well as metastable states (local entropy maxima). We show that for D ≥ 4, there exists a critical temperature (for sufficiently large systems) and a critical energy below which the system cannot be found in statistical equilibrium. Therefore, for D ≥ 4, quantum mechanics cannot stabilize matter against gravitational collapse. This is similar to a result found by Ehrenfest (1917) at the atomic level for Coulomb forces. This makes the dimension D = 3 of our Universe very particular, with possible implications regarding the anthropic principle. Our study joins a long tradition of scientific and philosophical papers that examined how the dimension of space affects the laws of physics.
Study of pre-seismic kHz EM emissions by means of complex systems
NASA Astrophysics Data System (ADS)
Balasis, Georgios; Papadimitriou, Constantinos; Eftaxias, Konstantinos
2010-05-01
The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems, ranging from particle physics to the economies of societies. A corollary is that transferring ideas and results between investigators in hitherto disparate areas will cross-fertilize and lead to important new results. It is well known that Boltzmann-Gibbs statistical mechanics works best for systems composed of subsystems that are either independent or interacting via short-range forces, and whose subsystems can access all of the available phase space. For systems exhibiting long-range correlations, memory, or fractal properties, non-extensive Tsallis statistical mechanics becomes the most appropriate mathematical framework. As mentioned, a central property of the magnetic storm, solar flare, and earthquake preparation processes is the possible occurrence of coherent large-scale collective behavior with a very rich structure, resulting from the repeated nonlinear interactions among the constituents. Consequently, non-extensive statistical mechanics is an appropriate framework to investigate universality, if any, in magnetic storm, solar flare, earthquake, and pre-failure EM emission occurrence. A model for earthquake dynamics based on a non-extensive Tsallis formulation, starting from first principles, has recently been introduced. This approach leads to a Gutenberg-Richter-type law for the magnitude distribution of earthquakes which provides an excellent fit to seismicities generated in various large geographic areas usually identified as "seismic regions". We examine whether the Gutenberg-Richter law corresponding to non-extensive Tsallis statistics is able to describe the distribution of amplitudes of earthquakes, pre-seismic kHz EM emissions (electromagnetic earthquakes), solar flares, and magnetic storms. The analysis shows that the introduced non-extensive model provides an excellent fit to the experimental data, incorporating the characteristics of universality by means of non-extensive statistics into the extreme events under study.
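A sketch of the kind of fit described, using the q-exponential form that non-extensive statistics predicts for relative cumulative distributions (synthetic heavy-tailed data stand in for a real catalogue of EM-emission amplitudes or earthquake magnitudes):

import numpy as np
from scipy.optimize import curve_fit

def q_exp(x, q, beta):
    """Tsallis q-exponential; reduces to the ordinary exponential as q -> 1."""
    return np.maximum(1.0 + (q - 1.0) * beta * x, 0.0) ** (1.0 / (1.0 - q))

rng = np.random.default_rng(0)
x = np.sort(rng.pareto(2.5, 5000))                  # heavy-tailed stand-in data
ccdf = 1.0 - np.arange(1, x.size + 1) / x.size      # empirical P(X > x)
(q, beta), _ = curve_fit(q_exp, x[:-1], ccdf[:-1], p0=(1.5, 1.0))
print(f"fitted q = {q:.3f}, beta = {beta:.3f}")     # q > 1 flags a fat tail

A fitted q significantly above 1 is the signature of the long-range correlations and fat tails that motivate the non-extensive description.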
Boltzmann, Darwin and Directionality theory
NASA Astrophysics Data System (ADS)
Demetrius, Lloyd A.
2013-09-01
Boltzmann's statistical thermodynamics is a mathematical theory which relates the macroscopic properties of aggregates of interacting molecules with the laws of their interaction. The theory is based on the concept of thermodynamic entropy, a statistical measure of the extent to which energy is spread throughout macroscopic matter. Macroscopic evolution of material aggregates is quantitatively explained in terms of the principle: Thermodynamic entropy increases as the composition of the aggregate changes under molecular collision. Darwin's theory of evolution is a qualitative theory of the origin of species and the adaptation of populations to their environment. A central concept in the theory is fitness, a qualitative measure of the capacity of an organism to contribute to the ancestry of future generations. Macroscopic evolution of populations of living organisms can be qualitatively explained in terms of a neo-Darwinian principle: Fitness increases as the composition of the population changes under variation and natural selection. Directionality theory is a quantitative model of the Darwinian argument of evolution by variation and selection. This mathematical theory is based on the concept of evolutionary entropy, a statistical measure which describes the rate at which an organism appropriates energy from the environment and reinvests this energy into survivorship and reproduction. According to directionality theory, microevolutionary dynamics, that is evolution by mutation and natural selection, can be quantitatively explained in terms of a directionality principle: Evolutionary entropy increases when the resources are diverse and of constant abundance, but decreases when the resource is singular and of variable abundance. This report reviews the analytical and empirical support for directionality theory, and invokes the microevolutionary dynamics of variation and selection to delineate the principles which govern the macroevolutionary dynamics of speciation and extinction. We also elucidate the relation between thermodynamic entropy, which pertains to the extent of energy spreading and sharing within inanimate matter, and evolutionary entropy, which refers to the rate of energy appropriation from the environment and allocation within living systems. We show that the entropic principle of thermodynamics is the limit as R → 0, M → ∞ (where R denotes the resource production rate and M denotes population size) of the entropic principle of evolution. We exploit this relation between the thermodynamic and evolutionary tenets to propose a physico-chemical model of the transition from inanimate matter, which is under thermodynamic selection, to living systems, which are subject to evolutionary selection. Topics treated include: life history variation and the evolution of senescence; the evolutionary dynamics of speciation and extinction; and evolutionary trends in body size. The origin of sporadic forms of cancer and neurological diseases, and the evolution of cooperation, are important recent applications of directionality theory. These applications, which draw from the medical sciences and sociobiology, appeal to methods which lie outside the formalism described in this report.
A companion review, Demetrius and Gundlach (submitted for publication), gives an account of these applications. An important aspect of this report pertains to the connection between statistical mechanics and evolutionary theory and its implications for understanding the processes which underlie the emergence of living systems from inanimate matter, a problem which has recently attracted considerable attention: Morowitz (1992), Eigen (1992), Dyson (2000), Pross (2012). The connection between the two disciplines can be addressed by appealing to certain extremal principles which are considered the mainstay of the respective theories. The extremal principle in statistical mechanics can be stated as follows:
Jarzynski equality in the context of maximum path entropy
NASA Astrophysics Data System (ADS)
González, Diego; Davis, Sergio
2017-06-01
In the global framework of finding an axiomatic derivation of nonequilibrium statistical mechanics from fundamental principles, such as the maximum path entropy (also known as the Maximum Caliber principle), this work proposes an alternative derivation of the well-known Jarzynski equality, a nonequilibrium identity of great importance today due to its applications to irreversible processes: biological systems (protein folding), mechanical systems, among others. This equality relates the free energy difference between two equilibrium thermodynamic states to the work performed when going between those states, through an average over a path ensemble. In this work the analysis of Jarzynski's equality is performed using the formalism of inference over path space. This derivation highlights the wide generality of Jarzynski's original result, which could even be used in non-thermodynamical settings such as social, financial, and ecological systems.
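For reference, the identity in question reads

\[ \big\langle e^{-\beta W} \big\rangle = e^{-\beta \Delta F}, \qquad \beta = 1/(k_B T), \]

where the average runs over realizations of the nonequilibrium protocol connecting the two equilibrium states; Jensen's inequality then gives $\langle W \rangle \ge \Delta F$, so the second-law inequality follows as a corollary of the equality.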
A concurrent multiscale micromorphic molecular dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Shaofan, E-mail: shaofan@berkeley.edu; Tong, Qi
2015-04-21
In this work, we have derived a multiscale micromorphic molecular dynamics (MMMD) from first principles to extend the (Andersen)-Parrinello-Rahman molecular dynamics to the mesoscale and continuum scale. The multiscale micromorphic molecular dynamics is a concurrent three-scale dynamics that couples a fine-scale molecular dynamics, a mesoscale micromorphic dynamics, and a macroscale nonlocal particle dynamics together. By choosing proper statistical closure conditions, we have shown that the original Andersen-Parrinello-Rahman molecular dynamics is the homogeneous and equilibrium case of the proposed multiscale micromorphic molecular dynamics. Specifically, we have shown that the Andersen-Parrinello-Rahman molecular dynamics can be rigorously formulated and justified from first principles, and that its general inhomogeneous case, i.e., the three-scale concurrent multiscale micromorphic molecular dynamics, can take into account macroscale continuum mechanics boundary conditions without the limitation of atomistic boundary conditions or periodic boundary conditions. The discovered multiscale structure and the corresponding multiscale dynamics reveal a seamless transition from the atomistic scale to the continuum scale and the intrinsic coupling mechanism among them, based on a first-principles formulation.
Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum
NASA Astrophysics Data System (ADS)
Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.
The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.
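The statistical-manifold geometry invoked here is that of the Fisher-Rao metric (standard definition):

\[ g_{ij}(\theta) = \int p(x \mid \theta)\, \partial_{\theta^i} \log p(x \mid \theta)\; \partial_{\theta^j} \log p(x \mid \theta)\, dx, \]

which measures the statistical distinguishability of neighbouring distributions; its geodesics are the paths referred to in the abstract, and the quantum setting replaces this with the corresponding geometry on the space of quantum states.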
ERIC Educational Resources Information Center
Sevilla, F. J.; Olivares-Quiroz, L.
2012-01-01
In this work, we address the concept of the chemical potential μ in classical and quantum gases towards the calculation of the equation of state μ = μ(n, T), where n is the particle density and T the absolute temperature, using the methods of equilibrium statistical mechanics. Two cases seldom discussed in elementary textbooks are…
A statistical mechanics model for free-for-all airplane passenger boarding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steffen, Jason H. (Fermilab)
2008-08-01
I discuss a model for free-for-all passenger boarding which is employed by some discount air carriers. The model is based on the principles of statistical mechanics, where each seat in the aircraft has an associated energy which reflects the preferences of travelers. As each passenger enters the airplane, they select a seat using Boltzmann statistics, proceed to that location, load their luggage, and sit down; the partition function seen by the remaining passengers is then modified to reflect this fact. I discuss the various model parameters and make qualitative comparisons of this passenger boarding model with those that involve assigned seats. The model can be used to predict the probability that certain seats will be occupied at different times during the boarding process. These results might provide a useful description of this boarding method. The model is a relatively unusual application of undergraduate-level physics and describes a situation familiar to many students and faculty.
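A minimal sketch of the seat-selection rule described above: each free seat is drawn with Boltzmann probability proportional to exp(-E/T), after which it leaves the partition function. The seat-energy assignment and temperature below are illustrative assumptions, not the paper's parameters:

```python
# Toy Boltzmann seat selection for free-for-all boarding (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
rows, cols, T = 30, 6, 1.0

# Assumed preference energy: front rows cheaper; window < aisle < middle seats.
r, c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
energy = 0.05 * r + np.where((c == 0) | (c == cols - 1), 0.0,
                             np.where((c == 2) | (c == 3), 0.1, 0.5))
energy = energy.ravel()
free = np.ones(energy.size, dtype=bool)
order = []

for _ in range(energy.size):          # each boarding passenger in turn
    w = np.exp(-energy[free] / T)     # Boltzmann weights over the free seats
    idx = np.flatnonzero(free)[rng.choice(w.size, p=w / w.sum())]
    free[idx] = False                 # the partition function seen by the rest shrinks
    order.append(idx)

print("first ten seats chosen (row, col):",
      [(i // cols, i % cols) for i in order[:10]])
```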
On the statistical mechanics of species abundance distributions.
Bowler, Michael G; Kelly, Colleen K
2012-09-01
A central issue in ecology is that of the factors determining the relative abundance of species within a natural community. The proper application of the principles of statistical physics to species abundance distributions (SADs) shows that simple ecological properties could account for the near universal features observed. These properties are (i) a limit on the number of individuals in an ecological guild and (ii) per capita birth and death rates. They underpin the neutral theory of Hubbell (2001), the master equation approach of Volkov et al. (2003, 2005) and the idiosyncratic (extreme niche) theory of Pueyo et al. (2007); they result in an underlying log series SAD, regardless of neutral or niche dynamics. The success of statistical mechanics in this application implies that communities are in dynamic equilibrium and hence that niches must be flexible and that temporal fluctuations on all sorts of scales are likely to be important in community structure.
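A one-line sketch of why linear per capita rates yield a log series (the standard master-equation argument, stated here for orientation): with birth and death rates $b_n = bn$ and $d_n = dn$, the stationary balance $P(n+1)/P(n) = b_n/d_{n+1}$ gives

$$ \frac{P(n+1)}{P(n)} = \frac{x\,n}{n+1}, \qquad x = \frac{b}{d} < 1 \;\;\Longrightarrow\;\; P(n) \propto \frac{x^{\,n}}{n}, $$

which is Fisher's log-series distribution.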
Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics
NASA Astrophysics Data System (ADS)
Sugiyama, Masaru
Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous from the viewpoint of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws is highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize into a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise of ordinary thermodynamics, is violated, and accordingly the additivity postulate for thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions, such as self-gravitating systems and nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state. The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader about one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one, by Tsallis and Brigatti, presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might seem completely arbitrary, but Abe's article explains how Tsallis' generalization of the statistical entropy can be uniquely characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion of the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness under generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems.
Finally, Beck presents the novel idea of so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts, including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed. Nonextensive statistical mechanics is already a well-studied field, and a large number of works are available in the literature. The interested reader is recommended to visit the URL http://tsallis.cat.cbpf.br/TEMUCO.pdf, where one can find a comprehensive list of references to more than one thousand papers, including important results that, for lack of space, could not be mentioned in the present issue. Despite the many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities, not only in the field of nonextensive statistical mechanics itself but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who contributed to making this issue possible. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.
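For readers new to the field, the nonadditive entropy at the heart of this issue is, in its standard form (stated here for orientation),

$$ S_q = k\,\frac{1-\sum_i p_i^{\,q}}{q-1}, \qquad \lim_{q\to 1} S_q = -k\sum_i p_i \ln p_i, $$

with the pseudo-additivity $S_q(A+B) = S_q(A) + S_q(B) + \frac{1-q}{k}\,S_q(A)\,S_q(B)$ for independent subsystems, which reduces to ordinary additivity only at $q = 1$.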
Maximum caliber inference of nonequilibrium processes
NASA Astrophysics Data System (ADS)
Otten, Moritz; Stock, Gerhard
2010-07-01
Thirty years ago, Jaynes suggested a general theoretical approach to nonequilibrium statistical mechanics, called maximum caliber (MaxCal) [Annu. Rev. Phys. Chem. 31, 579 (1980)]. MaxCal is a variational principle for dynamics in the same spirit that maximum entropy is a variational principle for equilibrium statistical mechanics. Motivated by the success of maximum entropy inference methods for equilibrium problems, in this work the MaxCal formulation is applied to the inference of nonequilibrium processes. That is, given some time-dependent observables of a dynamical process, one constructs a model that reproduces these input data and moreover, predicts the underlying dynamics of the system. For example, the observables could be some time-resolved measurements of the folding of a protein, which are described by a few-state model of the free energy landscape of the system. MaxCal then calculates the probabilities of an ensemble of trajectories such that on average the data are reproduced. From this probability distribution, any dynamical quantity of the system can be calculated, including population probabilities, fluxes, or waiting time distributions. After briefly reviewing the formalism, the practical numerical implementation of MaxCal in the case of an inference problem is discussed. Adopting various few-state models of increasing complexity, it is demonstrated that the MaxCal principle indeed works as a practical method of inference: The scheme is fairly robust and yields correct results as long as the input data are sufficient. As the method is unbiased and general, it can deal with any kind of time dependency such as oscillatory transients and multitime decays.
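A toy illustration of the inference scheme, under assumptions chosen here for brevity (two states, short trajectories, a single constraint on the mean number of switches): the MaxCal distribution over trajectories is P(Γ) ∝ exp(-λ n_switch(Γ)), with λ fixed by matching the data.

```python
# Toy maximum-caliber inference over all two-state trajectories of length L,
# constrained by an assumed "measured" mean number of switches.
import itertools
import numpy as np

L, target = 12, 2.5                                  # assumed observed mean switches
trajs = np.array(list(itertools.product([0, 1], repeat=L)))
n_sw = np.abs(np.diff(trajs, axis=1)).sum(axis=1)    # switches per trajectory

def mean_sw(lam):
    w = np.exp(-lam * n_sw)
    return (w * n_sw).sum() / w.sum()

# mean_sw(lam) decreases in lam; solve mean_sw(lam) = target by bisection.
lo, hi = -10.0, 10.0
for _ in range(80):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if mean_sw(mid) > target else (lo, mid)

lam = 0.5 * (lo + hi)
w = np.exp(-lam * n_sw); w /= w.sum()
print(f"lambda = {lam:.3f}, check <n_switch> = {(w * n_sw).sum():.3f}")
# Predict a quantity not used as input, e.g. the no-switching probability:
print(f"predicted P(no switch at all) = {w[n_sw == 0].sum():.4f}")
```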
THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.
Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil
2016-10-01
In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but it does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground-truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed that Cluster Shade was strongly correlated with the microarchitecture of the trabecular bone and related to its mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.
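A minimal sketch of the GLCM computation, assuming scikit-image (graycomatrix/graycoprops, skimage ≥ 0.19) and a synthetic stand-in image; Cluster Shade is not among graycoprops' built-in properties, so it is evaluated directly from the normalized matrix:

```python
# GLCM texture features on a toy 2D image (stand-in for a micro-CT projection).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(2)
img = (rng.random((128, 128)) ** 2 * 16).astype(np.uint8)   # skewed, 16 gray levels

glcm = graycomatrix(img, distances=[1], angles=[0], levels=16,
                    symmetric=True, normed=True)
P = glcm[:, :, 0, 0]                                        # normalized GLCM

# Cluster Shade = sum_{i,j} (i + j - mu_i - mu_j)^3 P(i, j)
i, j = np.meshgrid(np.arange(16), np.arange(16), indexing="ij")
mu_i, mu_j = (i * P).sum(), (j * P).sum()
cluster_shade = ((i + j - mu_i - mu_j) ** 3 * P).sum()

print("contrast     :", graycoprops(glcm, "contrast")[0, 0])
print("homogeneity  :", graycoprops(glcm, "homogeneity")[0, 0])
print("cluster shade:", cluster_shade)
```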
NASA Astrophysics Data System (ADS)
Obuchi, Tomoyuki; Monasson, Rémi
2015-09-01
The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons for the success of the MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by the measurements of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.
Seeing is believing: good graphic design principles for medical research.
Duke, Susan P; Bancken, Fabrice; Crowe, Brenda; Soukup, Mat; Botsis, Taxiarchis; Forshee, Richard
2015-09-30
Have you noticed when you browse a book, journal, study report, or product label how your eye is drawn to figures more than to words and tables? Statistical graphs are powerful ways to transparently and succinctly communicate the key points of medical research. Furthermore, the graphic design itself adds to the clarity of the messages in the data. The goal of this paper is to provide a mechanism for selecting the appropriate graph to thoughtfully construct quality deliverables using good graphic design principles. Examples are motivated by the efforts of a Safety Graphics Working Group that consisted of scientists from the pharmaceutical industry, Food and Drug Administration, and academic institutions.
On Some Assumptions of the Null Hypothesis Statistical Testing
ERIC Educational Resources Information Center
Patriota, Alexandre Galvão
2017-01-01
Bayesian and classical statistical approaches are based on different types of logical principles. In order to avoid mistaken inferences and misguided interpretations, the practitioner must respect the inference rules embedded into each statistical method. Ignoring these principles leads to the paradoxical conclusions that the hypothesis…
The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model
NASA Astrophysics Data System (ADS)
Verkley, Wim; Severijns, Camiel
2014-05-01
Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy, obtained from a numerical integration, and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2], which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using "stationarity constraints", i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. Severijns, 2014: The maximum entropy principle applied to a dynamical system proposed by Lorenz, Eur. Phys. J. B, 87:7, http://dx.doi.org/10.1140/epjb/e2013-40681-2 (open access).
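A minimal integration of the Lorenz '96 system described above, estimating the average total energy that serves as the entropy-maximization constraint; N, F, the step size and run length are illustrative choices, not those of the paper:

```python
# Lorenz '96: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F, periodic in i.
import numpy as np

N, F, dt, n_steps = 40, 8.0, 0.01, 50_000

def rhs(x):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

rng = np.random.default_rng(3)
x = F + 0.01 * rng.standard_normal(N)    # perturbed rest state

energies = []
for step in range(n_steps):              # classical fourth-order Runge-Kutta
    k1 = rhs(x); k2 = rhs(x + 0.5 * dt * k1)
    k3 = rhs(x + 0.5 * dt * k2); k4 = rhs(x + dt * k3)
    x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    if step > 10_000:                    # discard the transient
        energies.append(0.5 * np.dot(x, x))

print(f"average total energy <E> = {np.mean(energies):.2f}")
```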
A quantum framework for likelihood ratios
NASA Astrophysics Data System (ADS)
Bond, Rachael L.; He, Yang-Hui; Ormerod, Thomas C.
The ability to calculate precise likelihood ratios is fundamental to science, from Quantum Information Theory through to Quantum State Estimation. However, there is no assumption-free statistical methodology to achieve this. For instance, in the absence of data relating to covariate overlap, the widely used Bayes’ theorem either defaults to the marginal probability driven “naive Bayes’ classifier”, or requires the use of compensatory expectation-maximization techniques. This paper takes an information-theoretic approach in developing a new statistical formula for the calculation of likelihood ratios based on the principles of quantum entanglement, and demonstrates that Bayes’ theorem is a special case of a more general quantum mechanical expression.
A second order thermodynamic perturbation theory for hydrogen bond cooperativity in water
NASA Astrophysics Data System (ADS)
Marshall, Bennett D.
2017-05-01
It has been extensively demonstrated through first-principles quantum mechanics calculations that water exhibits strong hydrogen bond cooperativity. Equations of state developed from statistical mechanics typically assume pairwise additivity, meaning they cannot account for these 3-body and higher cooperative effects. In this paper, we extend a second order thermodynamic perturbation theory to correct for hydrogen bond cooperativity in four-site water. We demonstrate that the theory predicts hydrogen bonding structure consistent with spectroscopy, neutron diffraction, and molecular simulation data. Finally, we implement the approach into a general equation of state for water.
Applications of the principle of maximum entropy: from physics to ecology.
Banavar, Jayanth R; Maritan, Amos; Volkov, Igor
2010-02-17
There are numerous situations in physics and other disciplines which can be described at different levels of detail in terms of probability distributions. Such descriptions arise either intrinsically as in quantum mechanics, or because of the vast amount of details necessary for a complete description as, for example, in Brownian motion and in many-body systems. We show that an application of the principle of maximum entropy for estimating the underlying probability distribution can depend on the variables used for describing the system. The choice of characterization of the system carries with it implicit assumptions about fundamental attributes such as whether the system is classical or quantum mechanical or equivalently whether the individuals are distinguishable or indistinguishable. We show that the correct procedure entails the maximization of the relative entropy subject to known constraints and, additionally, requires knowledge of the behavior of the system in the absence of these constraints. We present an application of the principle of maximum entropy to understanding species diversity in ecology and introduce a new statistical ensemble corresponding to the distribution of a variable population of individuals into a set of species not defined a priori.
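The "correct procedure" referred to above is, in its standard form (assumed here), the maximization of the relative entropy with respect to a prior $q$ describing the unconstrained system: maximizing $-\sum_i p_i \ln(p_i/q_i)$ subject to constraints $\sum_i p_i f_i = \bar f$ yields

$$ p_i \;\propto\; q_i\, e^{-\lambda f_i}, $$

so that different choices of prior $q$, for instance distinguishable versus indistinguishable individuals, lead to different inferred distributions.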
Applying Statistical Process Control to Clinical Data: An Illustration.
ERIC Educational Resources Information Center
Pfadt, Al; And Others
1992-01-01
Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…
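A minimal individuals (X) control chart in the spirit of the article, with fabricated data: baseline observations set the center line and 3-sigma limits (sigma estimated from the mean moving range divided by the d2 constant 1.128), and later sessions are flagged when they fall outside:

```python
# Individuals control chart with fabricated clinical-style data.
import numpy as np

baseline = np.array([12, 14, 13, 15, 14, 13, 12, 14, 15, 13], dtype=float)
treatment = np.array([13, 11, 10, 9, 8, 9, 7], dtype=float)

center = baseline.mean()
sigma = np.abs(np.diff(baseline)).mean() / 1.128   # d2 constant for subgroups of 2
ucl, lcl = center + 3 * sigma, center - 3 * sigma

print(f"CL = {center:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
for t, y in enumerate(treatment, start=1):
    flag = "signal" if (y > ucl or y < lcl) else "in control"
    print(f"session {t}: {y:.1f} -> {flag}")
```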
NASA Astrophysics Data System (ADS)
Pavlos, George; Malandraki, Olga; Pavlos, Evgenios; Iliopoulos, Aggelos; Karakatsanis, Leonidas
2017-04-01
As the solar plasma lives far from equilibrium, it is an excellent laboratory for testing non-equilibrium statistical mechanics. In this study, we present the highlights of Tsallis non-extensive statistical mechanics as concerns their application to solar plasma dynamics, especially to solar wind phenomena and the magnetosphere. We present some new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as their effect on Earth's magnetosphere. The results concern Tsallis non-extensive statistics, and in particular the estimation of the Tsallis q-triplet (qstat, qsen, qrel) of SEP time series observed in interplanetary space and of magnetic field time series of the ICME observed at Earth, resulting from the solar eruptive activity of March 7, 2012 at the Sun. For the magnetic field, we used a multi-spacecraft approach based on data from the ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis, different time periods were considered, sorted as "quiet", "shock" and "aftershock", while different space domains, such as interplanetary space (near Earth at L1 and upstream of the Earth's bow shock), the Earth's magnetosheath and the magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the SEP profile in time, and of the magnetic field dynamics in both time and space domains during the shock event, in terms of the rate of entropy production, relaxation dynamics and non-equilibrium metastable stationary states. So far, Tsallis non-extensive statistical theory and the Tsallis extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal a strongly universal character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014, 2015, 2016; Karakatsanis et al. 2013). The Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long spatio-temporal correlations, fractional acceleration and non-equilibrium stationary states (NESS), or non-equilibrium self-organization processes and non-equilibrium phase transition and topological phase transition processes, according to Zelenyi and Milovanov (2004). In this direction, our results clearly reveal strong self-organization and the development of macroscopic ordering of the plasma system, related to the strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasma region during the CME event. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
Information transport in classical statistical systems
NASA Astrophysics Data System (ADS)
Wetterich, C.
2018-02-01
For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.
Analysis of surface sputtering on a quantum statistical basis
NASA Technical Reports Server (NTRS)
Wilhelm, H. E.
1975-01-01
Surface sputtering is explained theoretically by means of a 3-body sputtering mechanism involving the ion and two surface atoms of the solid. By means of quantum-statistical mechanics, a formula for the sputtering ratio S(E) is derived from first principles. At low ion energies, the theoretical sputtering ratio S(E) is proportional to the square of the difference between the incident ion energy and the threshold energy for sputtering of surface atoms, in agreement with experiment. Extrapolation of the theoretical sputtering formula to larger ion energies indicates that S(E) reaches a saturation value and finally decreases at high ion energies. The theoretical sputtering ratios S(E) for wolfram, tantalum, and molybdenum are compared with the corresponding experimental sputtering curves in the low-energy region from the threshold sputtering energy to 120 eV above the respective threshold energy. Theory and experiment are shown to be in good agreement.
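The near-threshold behavior reported above can be summarized schematically (with the constant $C$ left unspecified) as

$$ S(E) \;\approx\; C\,\big(E - E_{\mathrm{th}}\big)^{2}, \qquad E \gtrsim E_{\mathrm{th}}, $$

with saturation and an eventual decrease of $S(E)$ at high ion energies.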
Principles for valid histopathologic scoring in research
Gibson-Corley, Katherine N.; Olivier, Alicia K.; Meyerholz, David K.
2013-01-01
Histopathologic scoring is a tool by which semi-quantitative data can be obtained from tissues. Initially, a thorough understanding of the experimental design, study objectives and methods is required to allow the pathologist to appropriately examine tissues and develop lesion scoring approaches. Many principles go into the development of a scoring system, such as tissue examination, lesion identification, scoring definitions and consistency in interpretation. Masking (a.k.a. "blinding") of the pathologist to experimental groups is often necessary to constrain bias, and multiple mechanisms are available to accomplish it. Development of a tissue scoring system requires appreciation of the attributes and limitations of the data (e.g. nominal, ordinal, interval and ratio data) to be evaluated. Incidence, ordinal and rank methods of tissue scoring are demonstrated along with key principles for statistical analyses and reporting. Validation of a scoring system occurs through two principal measures: 1) validation of repeatability and 2) validation of tissue pathobiology. Understanding key principles of tissue scoring can help in the development and/or optimization of scoring systems so as to consistently yield meaningful and valid scoring data.
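Because ordinal lesion scores are not interval data, rank-based tests are the appropriate default for comparing groups; a minimal sketch with fabricated 0-4 scores (scipy assumed):

```python
# Rank-based comparison of fabricated ordinal lesion scores (0-4 scale).
from scipy.stats import mannwhitneyu

control = [0, 1, 0, 1, 2, 1, 0, 1]
treated = [2, 3, 2, 4, 3, 3, 2, 4]

stat, p = mannwhitneyu(control, treated, alternative="two-sided")
print(f"Mann-Whitney U = {stat}, p = {p:.4g}")
```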
Thermodynamic evolution far from equilibrium
NASA Astrophysics Data System (ADS)
Khantuleva, Tatiana A.
2018-05-01
The presented model of the thermodynamic evolution of an open system far from equilibrium is based on modern results of nonequilibrium statistical mechanics, the nonlocal theory of nonequilibrium transport developed by the author, and the Speed Gradient principle introduced in the theory of adaptive control. The transition to a description of the system's internal structure evolution at the mesoscopic level allows new insight into the stability problem of non-equilibrium processes. The new model is applied to a number of specific tasks.
Statistical Analysis of Physiological Signals
NASA Astrophysics Data System (ADS)
Ruiz, María G.; Pérez, Leticia
2003-07-01
In spite of two hundred years of clinical practice, homeopathy still lacks a scientific basis. Its fundamental laws, the similia principle and the activity of so-called ultra-high dilutions, are controversial issues that fit neither mainstream medicine nor the current physico-chemical framework. Aside from its clinical efficacy, the identification of physico-chemical parameters as markers of the homeopathic effect would allow the construction of mathematical models [1], which in turn could provide clues regarding the mechanism involved.
NASA Astrophysics Data System (ADS)
Knuth, K. H.
2001-05-01
We consider the application of Bayesian inference to the study of self-organized structures in complex adaptive systems. In particular, we examine the distribution of elements, agents, or processes in systems dominated by hierarchical structure. We demonstrate that results obtained by Caianiello [1] on Hierarchical Modular Systems (HMS) can be found by applying Jaynes' Principle of Group Invariance [2] to a few key assumptions about our knowledge of hierarchical organization. Subsequent application of the Principle of Maximum Entropy allows inferences to be made about specific systems. The utility of the Bayesian method is considered by examining both successes and failures of the hierarchical model. We discuss how Caianiello's original statements suffer from the Mind Projection Fallacy [3] and we restate his assumptions thus widening the applicability of the HMS model. The relationship between inference and statistical physics, described by Jaynes [4], is reiterated with the expectation that this realization will aid the field of complex systems research by moving away from often inappropriate direct application of statistical mechanics to a more encompassing inferential methodology.
Ab initio joint density-functional theory of solvated electrodes, with model and explicit solvation
NASA Astrophysics Data System (ADS)
Arias, Tomas
2015-03-01
First-principles guided design of improved electrochemical systems has the potential for great societal impact by making non-fossil-fuel systems economically viable. Potential applications include improvements in fuel cells, solar-fuel systems ("artificial photosynthesis"), supercapacitors and batteries. Economical fuel-cell systems would enable zero-carbon-footprint transportation, solar-fuel systems would directly convert sunlight and water into hydrogen fuel for such fuel-cell vehicles, supercapacitors would enable nearly full recovery of energy lost during vehicle braking, thus extending electric vehicle range and acceptance, and economical high-capacity batteries would be central to mitigating the indeterminacy of renewable resources such as wind and solar. Central to the operation of all of the above electrochemical systems is the electrode-electrolyte interface, whose underlying physics is quite rich, yet remains remarkably poorly understood. The essential technical challenge underlying the first-principles studies that could explore this physics is the need to properly represent, simultaneously, electron-transfer events at the electrode, which demand a quantum mechanical description, and multiscale phenomena in the liquid environment, such as the electrochemical double layer (ECDL) and its associated shielding, which demand a statistical description. A direct ab initio approach to this challenge would, in principle, require statistical sampling and thousands of repetitions of already computationally demanding quantum mechanical calculations. This talk will begin with a brief review of a recent advance, joint density-functional theory (JDFT), which allows for a fully rigorous and, in principle, exact representation of the thermodynamic equilibrium between a system described at the quantum-mechanical level and a liquid environment, but without the need for costly sampling. We shall then demonstrate how this approach applies in the electrochemical context, how it is needed for a realistic description of solvated electrode systems, and how simple "implicit" polarized continuum methods fail radically in this context. Finally, we shall present a series of results relevant to battery, supercapacitor, and solar-fuel systems, one of which has led to a recent invention disclosure for improving battery cycle lifetimes. Supported as a part of the Energy Materials Center at Cornell, an Energy Frontier Research Center funded by DOE/BES (award DE-SC0001086) and by the New York State Division of Science, Technology and Innovation (NYSTAR, award 60923).
A novel conceptual framework for understanding the mechanism of adherence to long term therapies
Reach, Gérard
2008-01-01
The World Health Organization claimed recently that improving patient adherence to long term therapies would be more beneficial than any biomedical progress. First, however, we must understand its mechanisms. In this paper I propose a novel approach using concepts elaborated in a field rarely explored in medicine, the philosophy of mind. While conventional psychological models (e.g., the Health Belief Model) provide explanations and predictions which have only a statistical value, the philosophical assumption that mental states (e.g., beliefs) are causally efficient (mental causation) can provide the basis for a causal theory of health behaviors. This paper shows that nonadherence to long term therapies can be described as the medical expression of a philosophical concept, that is, weakness of will. I use philosophical explanations of this concept to suggest a mechanistic explanation of nonadherence. I propose that it results from the failure of two principles of rationality. First, a principle of continence, described by the philosopher Donald Davidson in his explanation of weakness of will. This principle exhorts us to act after having considered all available arguments and according to which option we consider best. However, patients conforming to this principle of continence should rationally be nonadherent. Indeed, when patients face a choice between adherence and nonadherence, they must decide, in general, between a large, but delayed reward (e.g., health) and a small, but immediate reward (e.g., smoking a cigarette). According to concepts elaborated by George Ainslie and Jon Elster, the force of our desires is strongly influenced by the proximity of reward. This inter-temporal choice theory on one hand, and the mere principle of continence on the other, should therefore lead to nonadherence. Nevertheless, adherence to long term therapies is possible, as a result of the intervention of an additional principle, the principle of foresight, which tells us to give priority to mental states oriented towards the future.
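A sketch of the inter-temporal mechanism invoked above, using Ainslie-style hyperbolic discounting V = A/(1 + kD), with illustrative amounts, delays and discount rate; note the preference reversal as the small reward draws near:

```python
# Hyperbolic discounting and preference reversal (illustrative parameters).
def value(amount, delay, k=1.0):
    return amount / (1.0 + k * delay)

small_amount, large_amount = 1.0, 3.0
gap = 5.0                                # the large reward arrives 5 time units later

for t_small in [0.1, 1.0, 5.0, 20.0]:    # time remaining until the small reward
    v_s = value(small_amount, t_small)
    v_l = value(large_amount, t_small + gap)
    choice = "small/immediate" if v_s > v_l else "large/delayed"
    print(f"delay to small = {t_small:5.1f}: choose {choice} "
          f"(V_small = {v_s:.3f}, V_large = {v_l:.3f})")
```

Far from both rewards the large delayed reward dominates; close to temptation the ordering flips, which is the weakness-of-will pattern the paper describes.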
Perlovsky, Leonid I
2016-01-01
Is it possible to turn psychology into "hard science"? Physics of the mind follows the fundamental methodology of physics in all areas where physics has been developed. What is common among Newtonian mechanics, statistical physics, quantum physics, thermodynamics, the theory of relativity, astrophysics… and the theory of superstrings? What all areas of physics share is the methodology discussed in the first few lines of the paper. Is physics of the mind possible? Is it possible to describe the mind based on a few first principles, as physics does, for the mind with all its variabilities and uncertainties, from perception and elementary cognition to emotions, abstract ideas, and high cognition? Is it possible to turn psychology and neuroscience into "hard" sciences? The paper discusses established first principles of the mind, their mathematical formulations, and a mathematical model of the mind derived from these first principles: mechanisms of concepts, emotions, instincts, behavior, language, cognition, intuitions, the conscious and unconscious, abilities for symbols, and the functions of the beautiful and of musical emotions in cognition and evolution. Some of the theoretical predictions have been experimentally confirmed. This research has won national and international awards. In addition to summarizing existing results, the paper describes new theoretical and experimental developments. It also discusses unsolved theoretical problems as well as experimental challenges for future research.
Humidity Sensors Principle, Mechanism, and Fabrication Technologies: A Comprehensive Review
Farahani, Hamid; Wagiran, Rahman; Hamidon, Mohd Nizar
2014-01-01
Humidity measurement is one of the most significant issues in various areas of application such as instrumentation, automated systems, agriculture, climatology and GIS. The numerous sorts of humidity sensors fabricated and developed for industrial and laboratory applications are reviewed and presented in this article. The survey concentrates on RH sensors based upon their organic and inorganic functional materials, e.g., porous ceramics (semiconductors), polymers, ceramic/polymer composites and electrolytes, as well as on conduction mechanisms and fabrication technologies. A significant aim of this review is to provide a distinct categorization according to the state of the art of humidity sensor types, principles of operation, sensing substances, transduction mechanisms, and production technologies. Furthermore, performance characteristics of the different humidity sensors, such as electrical and statistical data, are detailed, giving added value to the report. A comparison of the overall prospects of the sensors reveals that there are still drawbacks regarding the efficiency of sensing elements and conduction values. The flexibility offered by thick-film and thin-film processes, either in the preparation of materials or in the choice of shape and size of the sensor structure, provides advantages over other technologies. These ceramic sensors show a faster response than other types.
Localization in quantum field theory
NASA Astrophysics Data System (ADS)
Balachandran, A. P.
In non-relativistic quantum mechanics, Born’s principle of localization is as follows: For a single particle, if a wave function ψK vanishes outside a spatial region K, it is said to be localized in K. In particular, if a spatial region K‧ is disjoint from K, a wave function ψK‧ localized in K‧ is orthogonal to ψK. Such a principle of localization does not exist compatibly with relativity and causality in quantum field theory (QFT) (Newton and Wigner) or interacting point particles (Currie, Jordan and Sudarshan). It is replaced by symplectic localization of observables as shown by Brunetti, Guido and Longo, Schroer and others. This localization gives a simple derivation of the spin-statistics theorem and the Unruh effect, and shows how to construct quantum fields for anyons and for massless particles with “continuous” spin. This review outlines the basic principles underlying symplectic localization and shows or mentions its deep implications. In particular, it has the potential to affect relativistic quantum information theory and black hole physics.
Statistical mechanics and scaling of fault populations with increasing strain in the Corinth Rift
NASA Astrophysics Data System (ADS)
Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter
2015-12-01
Scaling properties of fracture/fault systems are studied in order to characterize the mechanical properties of rocks and to provide insight into the mechanisms that govern fault growth. A comprehensive image of the fault network in the Corinth Rift, Greece, obtained through numerous field studies and marine geophysical surveys, allows for the first time such a study over the entire area of the Rift. We compile a detailed fault map of the area and analyze the scaling properties of fault trace-lengths by using a statistical mechanics model, derived in the framework of generalized statistical mechanics and the associated maximum entropy principle. Within this framework, a range of asymptotic power-law to exponential-like distributions is derived that can well describe the observed scaling patterns of fault trace-lengths in the Rift. Systematic variations, and in particular a transition from asymptotic power-law to exponential-like scaling, are observed as a function of increasing strain in distinct strain regimes in the Rift, providing quantitative evidence for such crustal processes in a single tectonic setting. These results indicate the organization of the fault system as a function of brittle strain in the Earth's crust and suggest that there are different mechanisms for fault growth in the distinct parts of the Rift. In addition, other factors such as fault interactions and the thickness of the brittle layer affect how the fault system evolves in time. The results suggest that regional strain, fault interactions and the boundary condition of the brittle layer may control fault growth and the fault network evolution in the Corinth Rift.
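The family of distributions referred to above can be written compactly with the $q$-exponential, the standard form in generalized statistical mechanics (assumed here):

$$ P(l) \;\propto\; \exp_q\!\big(-l/l_0\big), \qquad \exp_q(x) \equiv \big[1+(1-q)\,x\big]_+^{1/(1-q)}, $$

which reduces to a pure exponential as $q \to 1$ and decays asymptotically as the power law $l^{-1/(q-1)}$ for $q > 1$, spanning exactly the crossover discussed above.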
Mechanism-based Pharmacovigilance over the Life Sciences Linked Open Data Cloud.
Kamdar, Maulik R; Musen, Mark A
2017-01-01
Adverse drug reactions (ADR) result in significant morbidity and mortality in patients, and a substantial proportion of these ADRs are caused by drug-drug interactions (DDIs). Pharmacovigilance methods are used to detect unanticipated DDIs and ADRs by mining Spontaneous Reporting Systems, such as the US FDA Adverse Event Reporting System (FAERS). However, these methods do not provide mechanistic explanations for the discovered drug-ADR associations in a systematic manner. In this paper, we present a systems pharmacology-based approach to perform mechanism-based pharmacovigilance. We integrate data and knowledge from four different sources using Semantic Web Technologies and Linked Data principles to generate a systems network. We present a network-based Apriori algorithm for association mining in FAERS reports. We evaluate our method against existing pharmacovigilance methods for three different validation sets. Our method has AUROC statistics of 0.7-0.8, similar to current methods, and event-specific thresholds generate AUROC statistics greater than 0.75 for certain ADRs. Finally, we discuss the benefits of using Semantic Web technologies to attain the objectives for mechanism-based pharmacovigilance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yong, E-mail: 83229994@qq.com; Ge, Hao, E-mail: haoge@pku.edu.cn; Xiong, Jie, E-mail: jiexiong@umac.mo
The fluctuation theorem is one of the major achievements in the field of nonequilibrium statistical mechanics of the past two decades. There exist very few results on the steady-state fluctuation theorem for the sample entropy production rate, in terms of a large deviation principle, for diffusion processes, owing to technical difficulties. Here we give a proof of the steady-state fluctuation theorem for a diffusion process in magnetic fields, with explicit expressions for the free energy function and the rate function. The proof is based on the Karhunen-Loève expansion of a complex-valued Ornstein-Uhlenbeck process.
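In its standard large-deviation form (stated here for orientation; the paper derives the explicit expressions for the magnetic-field case), the steady-state fluctuation theorem says that the sample entropy production rate $S_t/t$ satisfies

$$ \lim_{t\to\infty}\frac{1}{t}\,\ln\frac{P\!\left(S_t/t \approx s\right)}{P\!\left(S_t/t \approx -s\right)} = s, $$

equivalently, the rate function obeys $I(-s) = I(s) + s$ and the free energy function the Gallavotti-Cohen symmetry $e(\lambda) = e(1-\lambda)$.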
Quantum theory of multiscale coarse-graining.
Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A
2018-03-14
Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
Proof of the Spin Statistics Connection 2: Relativistic Theory
NASA Astrophysics Data System (ADS)
Santamato, Enrico; De Martini, Francesco
2017-12-01
The traditional standard theory of quantum mechanics is unable to solve the spin-statistics problem, i.e. to justify the utterly important "Pauli Exclusion Principle", except by adopting the complex apparatus of standard relativistic quantum field theory. In a recent paper (Santamato and De Martini in Found Phys 45(7):858-873, 2015) we presented a proof of the spin-statistics connection in the nonrelativistic approximation on the basis of the "Conformal Quantum Geometrodynamics". In the present paper, using the same theory, the proof of the spin-statistics theorem is extended to the relativistic domain in the general scenario of curved spacetime. The relativistic approach allows us to formulate a theory that is manifestly Weyl gauge invariant at every step and to emphasize some fundamental aspects of group theory in the demonstration. No relativistic quantum field operators are used, and the particle exchange properties are drawn from the conservation of the intrinsic helicity of elementary particles. It is therefore this property, not considered in standard quantum mechanics, which determines the correct spin-statistics connection observed in Nature (Santamato and De Martini in Found Phys 45(7):858-873, 2015). The present proof of the spin-statistics theorem is simpler than the one presented in Santamato and De Martini (Found Phys 45(7):858-873, 2015), because it is based on symmetry group considerations only, without recourse to frames attached to the particles. Second quantization and anticommuting operators are not necessary.
ERIC Educational Resources Information Center
Hartsoe, Joseph K.; Barclay, Susan R.
2017-01-01
The purpose of this study was to investigate faculty belief, knowledge, and confidence in the principles of Universal Design for Instruction (UDI). Results yielded statistically significant correlations between participants' belief and knowledge of the principles of UDI. Furthermore, findings yielded statistically significant differences between…
Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems.
Liu, Xinzijian; Liu, Jian
2018-03-14
An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.
Intuitive statistics by 8-month-old infants
Xu, Fei; Garcia, Vashti
2008-01-01
Human learners make inductive inferences based on small amounts of data: we generalize from samples to populations and vice versa. The academic discipline of statistics formalizes these intuitive statistical inferences. What is the origin of this ability? We report six experiments investigating whether 8-month-old infants are “intuitive statisticians.” Our results showed that, given a sample, the infants were able to make inferences about the population from which the sample had been drawn. Conversely, given information about the entire population of relatively small size, the infants were able to make predictions about the sample. Our findings provide evidence that infants possess a powerful mechanism for inductive learning, either using heuristics or basic principles of probability. This ability to make inferences based on samples or information about the population develops early and in the absence of schooling or explicit teaching. Human infants may be rational learners from very early in development.
Cumulative (Dis)Advantage and the Matthew Effect in Life-Course Analysis
Bask, Miia; Bask, Mikael
2015-01-01
To foster a deeper understanding of the mechanisms behind inequality in society, it is crucial to work with well-defined concepts associated with such mechanisms. The aim of this paper is to define cumulative (dis)advantage and the Matthew effect. We argue that cumulative (dis)advantage is an intra-individual micro-level phenomenon, that the Matthew effect is an inter-individual macro-level phenomenon and that an appropriate measure of the Matthew effect focuses on the mechanism or dynamic process that generates inequality. The Matthew mechanism is, therefore, a better name for the phenomenon, where we provide a novel measure of the mechanism, including a proof-of-principle analysis using disposable personal income data. Finally, because socio-economic theory should be able to explain cumulative (dis)advantage and the Matthew mechanism when they are detected in data, we discuss the types of models that may explain the phenomena. We argue that interactions-based models in the literature traditions of analytical sociology and statistical mechanics serve this purpose.
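A minimal toy of a Matthew mechanism (a Pólya-urn-style "success breeds success" rule, chosen here for illustration, not the paper's measure): each unit of reward goes to an individual with probability proportional to current holdings, so early luck compounds into macro-level inequality.

```python
# Polya-urn-style cumulative advantage simulation (illustrative parameters).
import numpy as np

rng = np.random.default_rng(4)
n_agents, n_steps = 100, 10_000
wealth = np.ones(n_agents)            # everyone starts equal

for _ in range(n_steps):
    # reward allocated with probability proportional to current wealth
    winner = rng.choice(n_agents, p=wealth / wealth.sum())
    wealth[winner] += 1.0

share_top10 = np.sort(wealth)[-10:].sum() / wealth.sum()
print(f"share held by the top 10% of agents: {share_top10:.2%}")
```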
NASA Astrophysics Data System (ADS)
Palacios, Patricia
2018-05-01
In this paper, I compare the use of the thermodynamic limit in the theory of phase transitions with the infinite-time limit in the explanation of equilibrium statistical mechanics. In the case of phase transitions, I will argue that the thermodynamic limit can be justified pragmatically since the limit behavior (i) also arises before we get to the limit and (ii) for values of N that are physically significant. However, I will contend that the justification of the infinite-time limit is less straightforward. In fact, I will point out that even in cases where one can recover the limit behavior for finite t, i.e. before we get to the limit, one cannot recover this behavior for realistic time scales. I will claim that this leads us to reconsider the role that the rate of convergence plays in the justification of infinite limits and calls for a revision of the so-called Butterfield's principle.
Speckle Filtering of GF-3 Polarimetric SAR Data with Joint Restriction Principle.
Xie, Jinwei; Li, Zhenfang; Zhou, Chaowei; Fang, Yuyuan; Zhang, Qingjun
2018-05-12
Polarimetric SAR (PolSAR) scattering characteristics of imagery are always obtained from second-order moment estimation of multi-polarization data, that is, the estimation of covariance or coherency matrices. Due to the extra path lengths traveled by signals reflected from separate scatterers within a resolution cell, speckle noise always exists in SAR images and has a severe impact on the scattering performance, especially in single-look complex images. In order to achieve high accuracy in estimating covariance or coherency matrices, three aspects are taken into consideration: (1) the edges and texture of the scene should remain distinct after speckle filtering; (2) the statistical characteristics of the selected pixels should be similar to those of the object pixel; and (3) the polarimetric scattering signature should be preserved, in addition to speckle reduction. In this paper, a joint restriction principle is proposed to meet these requirements. Three different restriction principles are introduced into the speckle-filtering processing. First, a new template, more suitable for point or line targets, is designed to ensure morphological consistency. Then, an extended sigma filter is used to restrict the pixels in the aforementioned template to an identical statistical characteristic. Finally, a polarimetric similarity factor is applied to the same pixels to guarantee similar polarimetric features among the candidate pixels. This processing procedure is named speckle filtering with a joint restriction principle, and the approach is applied to GF-3 polarimetric SAR data acquired over San Francisco, CA, USA. Its effectiveness in keeping image sharpness and preserving the scattering mechanism, as well as in reducing speckle, is validated by comparison with boxcar filters and the refined Lee filter.
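A single-channel sketch of the sigma-filter restriction (the second of the three principles): only neighbors whose intensity lies within two speckle standard deviations of the center pixel enter the average. The full method applies this jointly with the template and polarimetric-similarity restrictions to the covariance matrices; the window size, look number, and the toy gamma-distributed image below are assumptions:

```python
# Single-channel sigma filter on a toy multiplicative-speckle image.
import numpy as np

def sigma_filter(img, half=3, n_looks=4):
    cv = 1.0 / np.sqrt(n_looks)       # speckle coefficient of variation, std/mean
    out = np.empty_like(img, dtype=float)
    pad = np.pad(img.astype(float), half, mode="reflect")
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            win = pad[r:r + 2 * half + 1, c:c + 2 * half + 1]
            center = pad[r + half, c + half]
            lo, hi = center * (1 - 2 * cv), center * (1 + 2 * cv)
            sel = win[(win >= lo) & (win <= hi)]   # statistically similar pixels
            out[r, c] = sel.mean() if sel.size else center
    return out

noisy = np.random.default_rng(5).gamma(4.0, 25.0, (64, 64))  # toy 4-look speckle
print("std before/after:", noisy.std().round(1), sigma_filter(noisy).std().round(1))
```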
12 CFR 741.6 - Financial and statistical and other reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... greater, but may reflect regulatory accounting principles other than GAAP if the credit union has total... GAAP means generally accepted accounting principles, as defined in § 715.2(e) of this chapter. GAAP is...
First principles statistical mechanics of alloys and magnetism
NASA Astrophysics Data System (ADS)
Eisenbach, Markus; Khan, Suffian N.; Li, Ying Wai
Modern high-performance computing resources are enabling the exploration of the statistical physics of phase spaces of increasing size and with higher fidelity of the systems' Hamiltonians. For selected systems, this now allows the combination of density functional theory based first-principles calculations with classical Monte Carlo methods for parameter-free, predictive thermodynamics of materials. We combine our locally self-consistent real-space multiple scattering method for solving the Kohn-Sham equation with Wang-Landau Monte Carlo calculations (WL-LSMS). In the past we have applied this method to the calculation of Curie temperatures in magnetic materials. Here we will present direct calculations of the chemical order-disorder transitions in alloys. We present our calculated transition temperature for the chemical ordering in CuZn and the temperature dependence of the short-range order parameter and specific heat. Finally, we will present the extension of the WL-LSMS method to magnetic alloys, thus allowing the investigation of the interplay of magnetism, structure, and chemical order in ferrous alloys. This research was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Science and Engineering Division, and used Oak Ridge Leadership Computing Facility resources at Oak Ridge National Laboratory.
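For readers unfamiliar with the flat-histogram approach named in this abstract, the following is a minimal sketch of Wang-Landau sampling on a toy 2D Ising model; the lattice size, flatness criterion, and modification-factor schedule are illustrative assumptions, and the DFT energies of the actual WL-LSMS method are replaced here by a textbook Hamiltonian.

```python
import numpy as np

# Minimal Wang-Landau sampler estimating the density of states g(E)
# for a small 2D Ising model. The first-principles WL-LSMS method in
# the abstract replaces this toy Hamiltonian with DFT energies; the
# lattice size, flatness test, and ln-f schedule below are illustrative.
rng = np.random.default_rng(0)
L = 4                                   # lattice side (toy size)
N = L * L
spins = rng.choice([-1, 1], size=(L, L))

def total_energy(s):
    # each nearest-neighbor bond counted once via right/down neighbors
    return -np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1)))

# single-spin flips change E in steps of 4, starting from E = -2N
energies = np.arange(-2 * N, 2 * N + 1, 4)
index = {e: i for i, e in enumerate(energies)}
ln_g = np.zeros(len(energies))          # running estimate of ln g(E)
hist = np.zeros(len(energies))
ln_f = 1.0                              # modification factor, halved on flatness
E = total_energy(spins)

while ln_f > 1e-4:
    for _ in range(10000):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb       # energy change from flipping (i, j)
        a, b = index[E], index[E + dE]
        # accept with min(1, g(E)/g(E')) to flatten the energy histogram
        if np.log(rng.random()) < ln_g[a] - ln_g[b]:
            spins[i, j] *= -1
            E += dE
        k = index[E]
        ln_g[k] += ln_f
        hist[k] += 1
    visited = hist > 0
    if hist[visited].min() > 0.8 * hist[visited].mean():   # flatness check
        hist[:] = 0
        ln_f /= 2.0

print("ln g(E) up to a constant:", ln_g - ln_g[index[-2 * N]])
```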
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gevorkyan, A. S., E-mail: g-ashot@sci.am; Sahakyan, V. V.
We study classical 1D Heisenberg spin glasses in the framework of the nearest-neighbor model. Based on the Hamilton equations, we obtain a system of recurrence equations which allows us to perform node-by-node calculations of a spin chain. It is shown that calculation from the first principles of classical mechanics leads to an ℕℙ-hard problem, which, however, in the limit of statistical equilibrium can be computed by a ℙ algorithm. For the partition function of the ensemble, a new representation is offered in the form of a one-dimensional integral of the spin chains' energy distribution.
On thermalization of electron-positron-photon plasma
NASA Astrophysics Data System (ADS)
Siutsou, I. A.; Aksenov, A. G.; Vereshchagin, G. V.
2015-12-01
Recently progress has been made in understanding the thermalization mechanism of relativistic plasma starting from a non-equilibrium state. Relativistic Boltzmann equations were solved numerically for homogeneous isotropic plasma, with collision integrals for two- and three-particle interactions calculated from first principles by means of QED matrix elements. All particles were assumed to obey Boltzmann statistics. In this work we follow plasma thermalization while accounting for Bose enhancement and Pauli blocking in particle interactions. Our results show that in equilibrium photons reach a Bose-Einstein distribution and electrons a Fermi-Dirac one.
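As a reference point for the thermalization result, a small sketch of the three equilibrium occupation functions involved; the temperature, chemical potential, and energy grid are arbitrary illustrative choices.

```python
import numpy as np

# Equilibrium occupation numbers that the thermalizing plasma should
# approach: Bose-Einstein for photons, Fermi-Dirac for electrons, with
# the Boltzmann exponential as the dilute (classical) limit.
# Units, temperature, and chemical potential here are illustrative.
kT = 1.0                          # temperature in energy units (assumption)
mu = 0.0                          # chemical potential (zero for photons)
eps = np.linspace(0.1, 10.0, 5)   # sample energies

boltzmann = np.exp(-(eps - mu) / kT)
bose_einstein = 1.0 / (np.exp((eps - mu) / kT) - 1.0)
fermi_dirac = 1.0 / (np.exp((eps - mu) / kT) + 1.0)

for e, nb, nbe, nfd in zip(eps, boltzmann, bose_einstein, fermi_dirac):
    print(f"eps={e:5.2f}  Boltzmann={nb:.4f}  BE={nbe:.4f}  FD={nfd:.4f}")
# At eps >> kT all three agree: Bose enhancement and Pauli blocking
# matter only where occupation numbers are of order one.
```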
Ensemble inequivalence and Maxwell construction in the self-gravitating ring model
NASA Astrophysics Data System (ADS)
Rocha Filho, T. M.; Silvestre, C. H.; Amato, M. A.
2018-06-01
The statement that Gibbs equilibrium ensembles are equivalent is a baseline assumption in many approaches in equilibrium statistical mechanics. However, it is a known fact that for some physical systems this equivalence does not hold. In this paper we illustrate from first principles the inequivalence between the canonical and microcanonical ensembles for a system with long-range interactions. We make use of molecular dynamics simulations and Monte Carlo simulations to explore the thermodynamic properties of the self-gravitating ring model and discuss under what conditions the Maxwell construction is applicable.
Stochastic mechanics of reciprocal diffusions
NASA Astrophysics Data System (ADS)
Levy, Bernard C.; Krener, Arthur J.
1996-02-01
The dynamics and kinematics of reciprocal diffusions were examined in a previous paper [J. Math. Phys. 34, 1846 (1993)], where it was shown that reciprocal diffusions admit a chain of conservation laws, which close after the first two laws for two disjoint subclasses of reciprocal diffusions, the Markov and quantum diffusions. For the case of quantum diffusions, the conservation laws are equivalent to Schrödinger's equation. The Markov diffusions were employed by Schrödinger [Sitzungsber. Preuss. Akad. Wiss. Phys. Math Kl. 144 (1931); Ann. Inst. H. Poincaré 2, 269 (1932)], Nelson [Dynamical Theories of Brownian Motion (Princeton University, Princeton, NJ, 1967); Quantum Fluctuations (Princeton University, Princeton, NJ, 1985)], and other researchers to develop stochastic formulations of quantum mechanics, called stochastic mechanics. We propose here an alternative version of stochastic mechanics based on quantum diffusions. A procedure is presented for constructing the quantum diffusion associated to a given wave function. It is shown that quantum diffusions satisfy the uncertainty principle, and have a locality property, whereby given two dynamically uncoupled but statistically correlated particles, the marginal statistics of each particle depend only on the local fields to which the particle is subjected. However, like Wigner's joint probability distribution for the position and momentum of a particle, the finite joint probability densities of quantum diffusions may take negative values.
Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
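A quick numerical check of the maximum-entropy selection argument, under the assumption of a single mean constraint on a nonnegative variable; the competing distributions compared are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy import stats

# Numerical check of the maximum-entropy argument used in the abstract:
# among nonnegative distributions constrained to a common mean, the
# exponential has the largest differential entropy. The particular
# competitors compared here (uniform, half-normal) are illustrative.
mean = 2.0   # common mean constraint (arbitrary choice)

candidates = {
    "exponential": stats.expon(scale=mean),
    "uniform[0,2m]": stats.uniform(loc=0.0, scale=2.0 * mean),
    # half-normal with scale chosen so its mean equals `mean`
    "half-normal": stats.halfnorm(scale=mean * np.sqrt(np.pi / 2.0)),
}

for name, dist in candidates.items():
    print(f"{name:>14}: mean={dist.mean():.3f}  entropy={dist.entropy():.4f}")
# The exponential's entropy (1 + ln(mean) ~ 1.693 for mean=2) exceeds
# the others, consistent with Jaynes-style selection of the distribution.
```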
Resolving phase stability in the Ti-O binary with first-principles statistical mechanics methods
NASA Astrophysics Data System (ADS)
Gunda, N. S. Harsha; Puchala, Brian; Van der Ven, Anton
2018-03-01
The Ti-O system consists of a multitude of stable and metastable oxides that are used in wide-ranging applications. In this work we investigate phase stability in the Ti-O binary from first principles. We perform a systematic search for ground state structures as a function of oxygen concentration by considering oxygen-vacancy and/or titanium-vacancy orderings over four parent crystal structures: (i) hcp Ti, (ii) ω-Ti, (iii) rocksalt, and (iv) hcp oxygen containing interstitial titanium. We explore phase stability at finite temperature using cluster expansion Hamiltonians and Monte Carlo simulations. The calculations predict a high oxygen solubility in hcp Ti and the stability of suboxide phases that undergo order-disorder transitions upon heating. Vacancy-ordered rocksalt phases are also predicted at low temperature that disorder to form an extended solid solution at high temperatures. Predicted stable and metastable phase diagrams are qualitatively consistent with experimental observations; however, important discrepancies are revealed between first-principles density functional theory predictions of phase stability and the current understanding of phase stability in this system.
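The cluster expansion Hamiltonian mentioned here is conventionally written as an expansion of the configurational energy in cluster basis functions; the notation below is the generic textbook form, not copied from the paper.

```latex
% Generic cluster expansion of the configurational energy: effective
% cluster interactions V_alpha multiplying cluster functions Phi_alpha
% of the site occupation variables sigma_i (notation conventional):
E(\boldsymbol{\sigma}) \;=\; V_{0} \;+\; \sum_{\alpha} V_{\alpha}\,\Phi_{\alpha}(\boldsymbol{\sigma}),
\qquad
\Phi_{\alpha}(\boldsymbol{\sigma}) \;=\; \prod_{i \in \alpha} \sigma_{i}.
```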
Information-theoretic approach to interactive learning
NASA Astrophysics Data System (ADS)
Still, S.
2009-01-01
The principles of statistical mechanics and information theory play an important role in learning and have inspired both theory and the design of numerous machine learning algorithms. The new aspect in this paper is a focus on integrating feedback from the learner. A quantitative approach to interactive learning and adaptive behavior is proposed, integrating model- and decision-making into one theoretical framework. This paper follows simple principles by requiring that the observer's world model and action policy should result in maximal predictive power at minimal complexity. Classes of optimal action policies and of optimal models are derived from an objective function that reflects this trade-off between prediction and complexity. The resulting optimal models then summarize, at different levels of abstraction, the process's causal organization in the presence of the learner's actions. A fundamental consequence of the proposed principle is that the learner's optimal action policies balance exploration and control as an emerging property. Interestingly, the explorative component is present in the absence of policy randomness, i.e. in the optimal deterministic behavior. This is a direct result of requiring maximal predictive power in the presence of feedback.
The development of principled connections and kind representations.
Haward, Paul; Wagner, Laura; Carey, Susan; Prasada, Sandeep
2018-07-01
Kind representations draw an important distinction between properties that are understood as existing in instances of a kind by virtue of their being the kind of thing they are and properties that are not understood in this manner. For example, the property of barking for the kind dog is understood as being had by dogs by virtue of the fact that they are dogs. These properties are said to have a principled connection to the kind. In contrast, the property of wearing a collar is not understood as existing in instances by virtue of their being dogs, despite the fact that a large percentage of dogs wear collars. Such properties are said to have a statistical connection to the kind. Two experiments tested two signatures of principled connections in 4-7 year olds and adults: (i) that principled connections license normative expectations (e.g., we judge there to be something wrong with a dog that does not bark), and (ii) that principled connections license formal explanations which explain the existence of a property by reference to the kind (e.g., that barks because it is a dog). Experiment 1 showed that both the children and adults have normative expectations for properties that have a principled connection to a kind, but not those that have a mere statistical connection to a kind. Experiment 2 showed that both children and adults are more likely to provide a formal explanation when explaining the existence of properties with a principled connection to a kind than properties with statistical connections to their kinds. Both experiments showed no effect of age (over ages 4, 7, and adulthood) on the extent to which participants differentiated principled and statistical connections. We discuss the implications of the results for theories of conceptual representation and for the structure of explanation. Copyright © 2018 Elsevier B.V. All rights reserved.
Edwards statistical mechanics for jammed granular matter
NASA Astrophysics Data System (ADS)
Baule, Adrian; Morone, Flaviano; Herrmann, Hans J.; Makse, Hernán A.
2018-01-01
In 1989, Sir Sam Edwards made the visionary proposition to treat jammed granular materials using a volume ensemble of equiprobable jammed states in analogy to thermal equilibrium statistical mechanics, despite their inherent athermal features. Since then, the statistical mechanics approach for jammed matter—one of the very few generalizations of Gibbs-Boltzmann statistical mechanics to out-of-equilibrium matter—has garnered an extraordinary amount of attention by both theorists and experimentalists. Its importance stems from the fact that jammed states of matter are ubiquitous in nature appearing in a broad range of granular and soft materials such as colloids, emulsions, glasses, and biomatter. Indeed, despite being one of the simplest states of matter—primarily governed by the steric interactions between the constitutive particles—a theoretical understanding based on first principles has proved exceedingly challenging. Here a systematic approach to jammed matter based on the Edwards statistical mechanical ensemble is reviewed. The construction of microcanonical and canonical ensembles based on the volume function, which replaces the Hamiltonian in jammed systems, is discussed. The importance of approximation schemes at various levels is emphasized leading to quantitative predictions for ensemble averaged quantities such as packing fractions and contact force distributions. An overview of the phenomenology of jammed states and experiments, simulations, and theoretical models scrutinizing the strong assumptions underlying Edwards approach is given including recent results suggesting the validity of Edwards ergodic hypothesis for jammed states. A theoretical framework for packings whose constitutive particles range from spherical to nonspherical shapes such as dimers, polymers, ellipsoids, spherocylinders or tetrahedra, hard and soft, frictional, frictionless and adhesive, monodisperse, and polydisperse particles in any dimensions is discussed providing insight into a unifying phase diagram for all jammed matter. Furthermore, the connection between the Edwards ensemble of metastable jammed states and metastability in spin glasses is established. This highlights the fact that the packing problem can be understood as a constraint satisfaction problem for excluded volume and force and torque balance leading to a unifying framework between the Edwards ensemble of equiprobable jammed states and out-of-equilibrium spin glasses.
Forecasting runout of rock and debris avalanches
Iverson, Richard M.; Evans, S.G.; Mugnozza, G.S.; Strom, A.; Hermanns, R.L.
2006-01-01
Physically based mathematical models and statistically based empirical equations each may provide useful means of forecasting runout of rock and debris avalanches. This paper compares the foundations, strengths, and limitations of a physically based model and a statistically based forecasting method, both of which were developed to predict runout across three-dimensional topography. The chief advantage of the physically based model results from its ties to physical conservation laws and well-tested axioms of soil and rock mechanics, such as the Coulomb friction rule and effective-stress principle. The output of this model provides detailed information about the dynamics of avalanche runout, at the expense of high demands for accurate input data, numerical computation, and experimental testing. In comparison, the statistical method requires relatively modest computation and no input data except identification of prospective avalanche source areas and a range of postulated avalanche volumes. Like the physically based model, the statistical method yields maps of predicted runout, but it provides no information on runout dynamics. Although the two methods differ significantly in their structure and objectives, insights gained from one method can aid refinement of the other.
Applications of Principled Search Methods in Climate Influences and Mechanisms
NASA Technical Reports Server (NTRS)
Glymour, Clark
2005-01-01
Forest and grass fires cause economic losses in the billions of dollars in the U.S. alone. In addition, boreal forests constitute a large carbon store; it has been estimated that, were no burning to occur, an additional 7 gigatons of carbon would be sequestered in boreal soils each century. Effective wildfire suppression requires anticipation of locales and times for which wildfire is most probable, preferably with a two to four week forecast, so that limited resources can be efficiently deployed. The United States Forest Service (USFS) and other experts and agencies have developed several measures of fire risk combining physical principles and expert judgment, and have used them in automated procedures for forecasting fire risk. Forecasting accuracies for some fire risk indices in combination with climate and other variables have been estimated for specific locations, with the value of fire risk index variables assessed by their statistical significance in regressions. In other cases, the MAPSS forecasts [23, 24] for example, forecasting accuracy has been estimated only with simulated data. We describe alternative forecasting methods that predict fire probability by locale and time using statistical or machine learning procedures trained on historical data, and we give comparative assessments of their forecasting accuracy for one fire season, April-October 2003, for all U.S. Forest Service lands. Aside from providing an accuracy baseline for other forecasting methods, the results illustrate the interdependence between the statistical significance of prediction variables and the forecasting method used.
Screening Health Risk Assessment Burn Pit Exposures, Balad Air Base, Iraq and Addendum Report
2008-05-01
...risk uses principles drawn from many scientific disciplines, including chemistry, toxicology, physics, mathematics, and statistics. Because the data... natural chemicals in plants (called flavonoids) also act on the Ah-receptor and could potentially block the effects of dioxins. One more reason to...
Gamma-Ray Telescope and Uncertainty Principle
ERIC Educational Resources Information Center
Shivalingaswamy, T.; Kagali, B. A.
2012-01-01
Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…
A Bayesian perspective on Markovian dynamics and the fluctuation theorem
NASA Astrophysics Data System (ADS)
Virgo, Nathaniel
2013-08-01
One of E. T. Jaynes' most important achievements was to derive statistical mechanics from the maximum entropy (MaxEnt) method. I re-examine a relatively new result in statistical mechanics, the Evans-Searles fluctuation theorem, from a MaxEnt perspective. This is done in the belief that interpreting such results in Bayesian terms will lead to new advances in statistical physics. The version of the fluctuation theorem that I will discuss applies to discrete, stochastic systems that begin in a non-equilibrium state and relax toward equilibrium. I will show that for such systems the fluctuation theorem can be seen as a consequence of the fact that the equilibrium distribution must obey the property of detailed balance. Although the principle of detailed balance applies only to equilibrium ensembles, it puts constraints on the form of non-equilibrium trajectories. This will be made clear by taking a novel kind of Bayesian perspective, in which the equilibrium distribution is seen as a prior over the system's set of possible trajectories. Non-equilibrium ensembles are calculated from this prior using Bayes' theorem, with the initial conditions playing the role of the data. I will also comment on the implications of this perspective for the question of how to derive the second law.
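The fluctuation theorem discussed here can be checked exactly for a small example. Below is a sketch for a two-state Markov chain obeying detailed balance and started out of equilibrium; the rates and initial distribution are invented, and the entropy-production expression used is the detailed-balance special case in which it telescopes to boundary terms.

```python
import numpy as np

# Exact check of the transient fluctuation theorem for a two-state
# Markov chain satisfying detailed balance, started away from
# equilibrium. With detailed balance the trajectory entropy production
# telescopes to boundary terms:
#   sigma(x0, xn) = ln[p0(x0)/pi(x0)] - ln[p0(xn)/pi(xn)],
# so it depends only on the endpoints. Rates and p0 are illustrative.
T = np.array([[0.9, 0.2],      # T[i, j] = P(next = i | current = j)
              [0.1, 0.8]])
pi = np.array([2 / 3, 1 / 3])  # stationary distribution
assert np.allclose(T @ pi, pi)
# detailed balance: T[i, j] * pi[j] == T[j, i] * pi[i]
assert np.isclose(T[0, 1] * pi[1], T[1, 0] * pi[0])

p0 = np.array([0.2, 0.8])      # non-equilibrium initial ensemble
n = 10                         # trajectory length (steps)
Tn = np.linalg.matrix_power(T, n)

# joint endpoint probabilities and their entropy-production values
probs, sigmas = [], []
for x0 in (0, 1):
    for xn in (0, 1):
        probs.append(p0[x0] * Tn[xn, x0])
        sigmas.append(np.log(p0[x0] / pi[x0]) - np.log(p0[xn] / pi[xn]))

# fluctuation theorem: P(sigma = A) / P(sigma = -A) = exp(A)
for p, s in zip(probs, sigmas):
    if s > 1e-12:
        q = sum(pp for pp, ss in zip(probs, sigmas) if np.isclose(ss, -s))
        print(f"A={s:+.4f}  P(A)/P(-A)={p / q:.6f}  exp(A)={np.exp(s):.6f}")
```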
Liu, Wei; Ding, Jinhui
2018-04-01
The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
A statistical framework for applying RNA profiling to chemical hazard detection.
Kostich, Mitchell S
2017-12-01
Use of 'omics technologies in environmental science is expanding. However, application is mostly restricted to characterizing molecular steps leading from toxicant interaction with molecular receptors to apical endpoints in laboratory species. Use in environmental decision-making is limited, due to difficulty in elucidating mechanisms in sufficient detail to make quantitative outcome predictions in any single species or in extending predictions to aquatic communities. Here we introduce a mechanism-agnostic statistical approach, supplementing mechanistic investigation by allowing probabilistic outcome prediction even when understanding of molecular pathways is limited, and facilitating extrapolation from results in laboratory test species to predictions about aquatic communities. We use concepts familiar to environmental managers, supplemented with techniques employed for clinical interpretation of 'omics-based biomedical tests. We describe the framework in step-wise fashion, beginning with single test replicates of a single RNA variant, then extending to multi-gene RNA profiling, collections of test replicates, and integration of complementary data. In order to simplify the presentation, we focus on using RNA profiling for distinguishing presence versus absence of chemical hazards, but the principles discussed can be extended to other types of 'omics measurements, multi-class problems, and regression. We include a supplemental file demonstrating many of the concepts using the open source R statistical package. Published by Elsevier Ltd.
Statistical detection of systematic election irregularities
Klimek, Peter; Yegorov, Yuri; Hanel, Rudolf; Thurner, Stefan
2012-01-01
Democratic societies are built around the principle of free and fair elections, and that each citizen’s vote should count equally. National elections can be regarded as large-scale social experiments, where people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data representation, we find that vote distributions of elections with alleged fraud show a kurtosis substantially exceeding the kurtosis of normal elections, depending on the level of data aggregation. As an example, we show that reported irregularities in recent Russian elections are, indeed, well-explained by systematic ballot stuffing. We develop a parametric model quantifying the extent to which fraudulent mechanisms are present. We formulate a parametric test detecting these statistical properties in election results. Remarkably, this technique produces robust outcomes with respect to the resolution of the data and therefore, allows for cross-country comparisons. PMID:23010929
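A synthetic sketch of the kurtosis signature described in the abstract; all parameters (district count, vote-share means, fraction of stuffed districts) are invented for illustration and are not the paper's data.

```python
import numpy as np
from scipy.stats import kurtosis

# Synthetic illustration of the paper's kurtosis signature: compare the
# distribution of winner vote shares across districts for a "clean"
# election and one with ballot stuffing in a fraction of districts.
# All parameters below are invented for the demonstration.
rng = np.random.default_rng(42)
n_districts = 5000

clean = rng.normal(0.55, 0.07, n_districts).clip(0, 1)

stuffed = clean.copy()
frauded = rng.random(n_districts) < 0.15          # 15% of districts
stuffed[frauded] = rng.normal(0.92, 0.04, frauded.sum()).clip(0, 1)

print("excess kurtosis, clean  :", round(kurtosis(clean), 2))
print("excess kurtosis, stuffed:", round(kurtosis(stuffed), 2))
# Ballot stuffing adds a secondary cluster near 100% share, fattening
# the tail and driving the kurtosis well above the normal baseline of
# zero -- the statistical fingerprint the paper exploits.
```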
Superstatistical Energy Distributions of an Ion in an Ultracold Buffer Gas
NASA Astrophysics Data System (ADS)
Rouse, I.; Willitsch, S.
2017-04-01
An ion in a radio frequency ion trap interacting with a buffer gas of ultracold neutral atoms is a driven dynamical system which has been found to develop a nonthermal energy distribution with a power law tail. The exact analytical form of this distribution is unknown, but has often been represented empirically by q-exponential (Tsallis) functions. Based on the concepts of superstatistics, we introduce a framework for the statistical mechanics of an ion trapped in an rf field subject to collisions with a buffer gas. We derive analytic ion secular energy distributions from first principles both neglecting and including the effects of the thermal energy of the buffer gas. For a buffer gas with a finite temperature, we prove that Tsallis statistics emerges from the combination of a constant heating term and multiplicative energy fluctuations. We show that the resulting distributions essentially depend on experimentally controllable parameters, paving the way for an accurate control of the statistical properties of ion-atom hybrid systems.
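A minimal superstatistics sketch of the mechanism described: exponential energy statistics at fixed inverse temperature, combined with Gamma-distributed temperature fluctuations, yield a q-exponential (Tsallis) marginal; the Gamma parameters are illustrative, not fitted to ion-trap data.

```python
import numpy as np

# Superstatistics sketch: if the ion's energy is exponentially
# distributed at fixed inverse temperature beta, but beta itself
# fluctuates with a Gamma density (shape k, scale theta), the marginal
# energy distribution is a q-exponential (Tsallis) power law:
#   p(E) = integral of beta*exp(-beta*E) * Gamma(beta; k, theta) dbeta
#        = k*theta * (1 + theta*E)^(-(k+1)).
# The Gamma parameters below are illustrative.
rng = np.random.default_rng(1)
k, theta = 3.0, 0.5
n = 200_000

beta = rng.gamma(shape=k, scale=theta, size=n)   # fluctuating temperature
E = rng.exponential(scale=1.0 / beta)            # energy given beta

# Compare the sampled density with the analytic q-exponential prediction.
edges = np.logspace(-1, 2, 20)
hist, _ = np.histogram(E, bins=edges, density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
analytic = k * theta * (1.0 + theta * centers) ** (-(k + 1))

for c, h, a in zip(centers[::4], hist[::4], analytic[::4]):
    print(f"E={c:8.3f}  sampled={h:.3e}  q-exponential={a:.3e}")
```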
Unifying hydrotropy under Gibbs phase rule.
Shimizu, Seishi; Matubayasi, Nobuyuki
2017-09-13
The task of elucidating the mechanism of solubility enhancement using hydrotropes has been hampered by the wide variety of phase behaviour that hydrotropes can exhibit, encompassing near-ideal aqueous solution, self-association, micelle formation, and micro-emulsions. Instead of taking a field guide or encyclopedic approach to classify hydrotropes into different molecular classes, we take a rational approach aiming at constructing a unified theory of hydrotropy based upon the first principles of statistical thermodynamics. Achieving this aim can be facilitated by the two key concepts: (1) the Gibbs phase rule as the basis of classifying the hydrotropes in terms of the degrees of freedom and the number of variables to modulate the solvation free energy; (2) the Kirkwood-Buff integrals to quantify the interactions between the species and their relative contributions to the process of solubilization. We demonstrate that the application of the two key concepts can in principle be used to distinguish the different molecular scenarios at work under apparently similar solubility curves observed from experiments. In addition, a generalization of our previous approach to solutes beyond dilution reveals the unified mechanism of hydrotropy, driven by a strong solute-hydrotrope interaction which overcomes the apparent per-hydrotrope inefficiency due to hydrotrope self-clustering.
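For reference, the Kirkwood-Buff integral invoked above is conventionally defined from the pair correlation function as follows (standard definition, stated here for convenience):

```latex
% Kirkwood-Buff integral between species i and j, defined from the
% radial distribution function g_{ij}(r):
G_{ij} \;=\; \int_{0}^{\infty} \bigl[\, g_{ij}(r) - 1 \,\bigr]\, 4\pi r^{2}\, \mathrm{d}r .
```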
Step-wise pulling protocols for non-equilibrium dynamics
NASA Astrophysics Data System (ADS)
Ngo, Van Anh
The fundamental laws of thermodynamics and statistical mechanics, and our deeper understanding of quantum mechanics, have been rebuilt in recent years. This is partly because of the increasing power of today's computing resources, which allow direct insight into the connections among the laws of thermodynamics, the statistical nature of our world, and the concepts of quantum mechanics, connections which have not yet been understood. But the most important reason, and the ultimate goal, is to understand the mechanisms, statistics and dynamics of biological systems, whose prevailing non-equilibrium processes violate the fundamental laws of thermodynamics, deviate from statistical mechanics, and complicate quantum effects. I believe that investigation of the fundamental laws of non-equilibrium dynamics will be frontier research for at least several more decades. One of these fundamental laws was first discovered in 1997 by Jarzynski: the so-called Jarzynski Equality. Since then, different proofs, alternative descriptions of Jarzynski's Equality, and its further developments and applications have quickly accumulated. My understanding, development and application of an alternative theory of Jarzynski's Equality form the bulk of this dissertation. The core of my theory is based on stepwise pulling protocols, which provide deeper insight into how fluctuations of reaction coordinates contribute to free-energy changes along a reaction pathway. We find that the most optimal pathways, having the largest contribution to free-energy changes, follow the principle of detailed balance. This is a glimpse of why the principle of detailed balance appears so powerful for sampling the most probable statistics of events. In a further development of Jarzynski's Equality, I have used it in the formalism of diagonal entropy to propose a way to extract useful thermodynamic quantities such as temperature, work and free-energy profiles from far-from-equilibrium ensembles, which can be used to characterize non-equilibrium dynamics. Furthermore, we have applied the stepwise pulling protocols and Jarzynski's Equality to investigate the ion selectivity of potassium channels via molecular dynamics simulations. The mechanism of potassium ion selectivity has remained poorly understood for over fifty years, although a Nobel Prize was awarded for the discovery of the molecular structure of a potassium-selective channel in 2003. In one year of performing simulations, we were able to reproduce the major results on ion selectivity accumulated over fifty years. We have gone further to propose a new model for ion selectivity based on the structural rearrangement of the selectivity filter of potassium-selective KcsA channels. This structural rearrangement had not previously been shown to play such a pivotal role in selecting and conducting potassium ions while effectively rejecting sodium ions. Using the stepwise pulling protocols, we are also able to estimate conductance for ion channels, which remains elusive with other methods. In the light of ion channels, we have also investigated how a synthetic channel of telomeric G-quadruplex conducts different types of ions. These two studies on ion selectivity not only constitute an interesting part of this dissertation, but will also enable us to further explore a new set of ion-selectivity principles.
Besides the focus of my dissertation, I used million-atom molecular dynamics simulations to investigate the mechanical properties of body-centered-cubic (BCCS) and face-centered-cubic (FCCS) supercrystals of DNA-functionalized gold nanoparticles. These properties are valuable for examining whether such supercrystals can be used in gene delivery and gene therapy. The formation of such ordered supercrystals is useful for protecting DNAs or RNAs from being attacked and destroyed by enzymes in cells. I also performed all-atom molecular dynamics simulations to study a pure oleic acid (OA) membrane in water, which results in a triple-layer structure. The simulations show that the trans-membrane movement of water and OAs is cooperative and correlated, and agrees with experimentally measured absorption rates. The simulation results support the idea that OA flip-flop is more favorable than transport by means of functional proteins. This study might provide further insight into how primitive cell membranes work, and how the interplay and correlation between water and fatty acids may occur.
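The stepwise pulling idea at the heart of this dissertation can be illustrated with a toy check of Jarzynski's Equality: an overdamped Brownian particle in a harmonic trap whose center is moved in discrete steps. Since translating the trap leaves the free energy unchanged, the equality predicts <exp(-W/kT)> = 1 even though the average work is positive. All parameters are illustrative and not taken from the dissertation.

```python
import numpy as np

# Stepwise pulling check of Jarzynski's Equality for an overdamped
# Brownian particle in a harmonic trap U(x, c) = 0.5*kappa*(x - c)^2.
# The trap center c is moved in discrete steps; work accrues as the
# potential-energy jump at each step, and the particle relaxes by
# Langevin dynamics in between. Translating the trap leaves the free
# energy unchanged (dF = 0), so Jarzynski predicts <exp(-W/kT)> = 1
# even though <W> > 0. All parameters are illustrative.
rng = np.random.default_rng(7)
kT, kappa, gamma = 1.0, 2.0, 1.0
dt, relax_steps = 1e-3, 400          # Langevin step and steps per stage
n_steps, dc = 10, 0.5                # pulling stages and step size
n_traj = 20_000

x = rng.normal(0.0, np.sqrt(kT / kappa), n_traj)   # equilibrium at c = 0
W = np.zeros(n_traj)
c = 0.0
noise_amp = np.sqrt(2.0 * kT * dt / gamma)

for _ in range(n_steps):
    # instantaneous trap jump: work = U(x, c + dc) - U(x, c)
    W += 0.5 * kappa * ((x - c - dc) ** 2 - (x - c) ** 2)
    c += dc
    # overdamped relaxation in the displaced trap
    for _ in range(relax_steps):
        x += -(kappa / gamma) * (x - c) * dt + noise_amp * rng.normal(size=n_traj)

print("<W>/kT       =", W.mean() / kT)            # > 0 (dissipation)
print("<exp(-W/kT)> =", np.exp(-W / kT).mean())   # ~ 1 (Jarzynski, dF = 0)
```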
NASA Astrophysics Data System (ADS)
Raman, Kumar; Papanikolaou, Stefanos; Fradkin, Eduardo
2007-03-01
We construct a two-dimensional microscopic model of interacting quantum dimers that displays an infinite number of periodic striped phases in its T=0 phase diagram. The phases form an incomplete devil's staircase and the period becomes arbitrarily large as the staircase is traversed. The Hamiltonian has purely short-range interactions, does not break any symmetries, and is generic in that it does not involve the fine tuning of a large number of parameters. Our model, a quantum mechanical analog of the Pokrovsky-Talapov model of fluctuating domain walls in two-dimensional classical statistical mechanics, provides a mechanism by which striped phases with periods large compared to the lattice spacing can, in principle, form in frustrated quantum magnetic systems with only short-ranged interactions and no explicitly broken symmetries. Please see cond-mat/0611390 for more details.
Back to basics: an introduction to statistics.
Halfens, R J G; Meijers, J M M
2013-05-01
In the second article in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.
How Many Is Enough?—Statistical Principles for Lexicostatistics
Zhang, Menghan; Gong, Tao
2016-01-01
Lexicostatistics has been applied in linguistics to inform phylogenetic relations among languages. There are two important yet not well-studied parameters in this approach: the conventional size of vocabulary list to collect potentially true cognates and the minimum matching instances required to confirm a recurrent sound correspondence. Here, we derive two statistical principles from stochastic theorems to quantify these parameters. These principles validate the practice of using the Swadesh 100- and 200-word lists to indicate degree of relatedness between languages, and enable a frequency-based, dynamic threshold to detect recurrent sound correspondences. Using statistical tests, we further evaluate the generality of the Swadesh 100-word list compared to the Swadesh 200-word list and other 100-word lists sampled randomly from the Swadesh 200-word list. All these provide mathematical support for applying lexicostatistics in historical and comparative linguistics. PMID:28018261
Variation Principles and Applications in the Study of Cell Structure and Aging
NASA Technical Reports Server (NTRS)
Economos, Angelos C.; Miquel, Jaime; Ballard, Ralph C.; Johnson, John E., Jr.
1981-01-01
In this report we have attempted to show that "some reality lies concealed in biological variation". This "reality" has its principles, laws, mechanisms, and rules, only a few of which we have sketched. A related idea we pursued was that important information may be lost in the process of ignoring frequency distributions of physiological variables (as is customary in experimental physiology and gerontology). We suggested that it may be advantageous to expand one's "statistical field of vision" beyond simple averages +/- standard deviations. Indeed, frequency distribution analysis may make visible some hidden information not evident from a simple qualitative analysis, particularly when the effect of some external factor or condition (e.g., aging, dietary chemicals) is being investigated. This was clearly illustrated by the application of distribution analysis in the study of variation in mouse liver cellular and fine structure, and may be true of fine structural studies in general. In living systems, structure and function interact in a dynamic way; they are "inseparable," unlike in technological systems or machines. Changes in fine structure therefore reflect changes in function. If such changes do not exceed a certain physiologic range, a quantitative analysis of structure will provide valuable information on quantitative changes in function that may not be possible or easy to measure directly. Because there is a large inherent variation in fine structure of cells in a given organ of an individual and among individuals, changes in fine structure can be analyzed only by studying frequency distribution curves of various structural characteristics (dimensions). Simple averages +/- S.D. do not in general reveal all information on the effect of a certain factor, because often this effect is not uniform; on the contrary, this will be apparent from distribution analysis because the form of the curves will be affected. We have also attempted to show in this chapter that similar general statistical principles and mechanisms may be operative in biological and technological systems. Despite the common belief that most biological and technological characteristics of interest have a symmetric bell-shaped (normal or Gaussian) distribution, we have shown that more often than not, distributions tend to be asymmetric and often resemble a so-called log-normal distribution. We saw that at least three general mechanisms may be operative, i.e., nonadditivity of influencing factors, competition among individuals for a common resource, and existence of an "optimum" value for a studied characteristic; more such mechanisms could exist.
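The "nonadditivity of influencing factors" mechanism invoked above can be sketched in a few lines: multiplicative combination of many positive factors produces a right-skewed, approximately log-normal distribution, while additive combination produces a symmetric one. The factor count and spread below are arbitrary.

```python
import numpy as np
from scipy import stats

# Sketch of the "nonadditivity" mechanism discussed above: when many
# independent positive factors act multiplicatively, the resulting
# characteristic is log-normally distributed (right-skewed), whereas
# additive factors produce the familiar symmetric Gaussian. Factor
# count and spread are arbitrary illustrative choices.
rng = np.random.default_rng(3)
n_individuals, n_factors = 50_000, 30
factors = rng.uniform(0.8, 1.25, size=(n_individuals, n_factors))

additive = factors.sum(axis=1)         # additive influences -> normal
multiplicative = factors.prod(axis=1)  # multiplicative -> log-normal

for name, sample in [("additive", additive),
                     ("multiplicative", multiplicative),
                     ("log(multiplicative)", np.log(multiplicative))]:
    print(f"{name:>20}: skewness = {stats.skew(sample):+.3f}")
# The multiplicative trait is right-skewed; its logarithm is symmetric,
# the defining signature of a log-normal frequency distribution.
```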
Neighbor effect in complexation of a conjugated polymer.
Sosorev, Andrey; Zapunidi, Sergey
2013-09-19
Charge-transfer complex (CTC) formation between a conjugated polymer and low-molecular-weight organic acceptor is proposed to be driven by the neighbor effect. Formation of a CTC on the polymer chain results in an increased probability of new CTC formation near the existing one. We present an analytical model for CTC distribution considering the neighbor effect, based on the principles of statistical mechanics. This model explains the experimentally observed threshold-like dependence of the CTC concentration on the acceptor content in a polymer:acceptor blend. It also allows us to evaluate binding energies of the complexes.
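A minimal statistical-mechanics sketch of how a neighbor effect sharpens a filling curve into a threshold, using a 1D lattice gas with nearest-neighbor attraction solved by a transfer matrix; this is a generic toy model with invented parameters, not the authors' CTC model.

```python
import numpy as np

# Minimal transfer-matrix sketch of a neighbor effect on a 1D chain:
# treat site occupation (CTC formed / not formed) as a lattice gas with
# on-site energy set by a chemical potential mu plus an attraction eps
# between occupied neighbors. Larger eps makes the occupancy curve
# threshold-like, qualitatively as in the abstract. Parameters invented.
def occupancy(mu, eps, beta=1.0):
    # symmetric transfer matrix for occupations s, s' in {0, 1}
    M = np.array([[1.0, np.exp(beta * mu / 2)],
                  [np.exp(beta * mu / 2), np.exp(beta * (mu + eps))]])
    evals, evecs = np.linalg.eigh(M)
    v = evecs[:, np.argmax(evals)]       # dominant eigenvector
    return v[1] ** 2 / (v @ v)           # mean site occupation

for eps in (0.0, 2.0, 4.0):
    curve = [round(occupancy(mu, eps), 2) for mu in (-4, -2, 0, 2, 4)]
    print(f"eps={eps}: occupancy over mu=-4..4 ->", curve)
# With eps = 0 the filling curve is a gentle sigmoid; increasing the
# neighbor attraction sharpens it toward the observed threshold-like
# dependence on acceptor content.
```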
The visual system’s internal model of the world
Lee, Tai Sing
2015-01-01
The Bayesian paradigm has provided a useful conceptual theory for understanding perceptual computation in the brain. While the detailed neural mechanisms of Bayesian inference are not fully understood, recent computational and neurophysiological works have illuminated the underlying computational principles and representational architecture. The fundamental insights are that the visual system is organized as a modular hierarchy to encode an internal model of the world, and that perception is realized by statistical inference based on such internal model. In this paper, I will discuss and analyze the varieties of representational schemes of these internal models and how they might be used to perform learning and inference. I will argue for a unified theoretical framework for relating the internal models to the observed neural phenomena and mechanisms in the visual cortex. PMID:26566294
Experimental Realization of a Thermal Squeezed State of Levitated Optomechanics
NASA Astrophysics Data System (ADS)
Rashid, Muddassar; Tufarelli, Tommaso; Bateman, James; Vovrosh, Jamie; Hempston, David; Kim, M. S.; Ulbricht, Hendrik
2016-12-01
We experimentally squeeze the thermal motional state of an optically levitated nanosphere by fast switching between two trapping frequencies. The measured phase-space distribution of the center of mass of our particle shows the typical shape of a squeezed thermal state, from which we infer up to 2.7 dB of squeezing along one motional direction. In these experiments the average thermal occupancy is high and, even after squeezing, the motional state remains in the remit of classical statistical mechanics. Nevertheless, we argue that the manipulation scheme described here could be used to achieve squeezing in the quantum regime if preceded by cooling of the levitated mechanical oscillator. Additionally, a higher degree of squeezing could, in principle, be achieved by repeating the frequency-switching protocol multiple times.
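A classical-ensemble sketch of the frequency-switching protocol; masses, temperatures, and frequencies are arbitrary, and the quantum regime discussed in the abstract is of course beyond this classical picture.

```python
import numpy as np

# Classical-ensemble sketch of squeezing by frequency switching: prepare
# a thermal state in a trap of frequency w1, suddenly switch to w2, and
# let the ensemble evolve for a quarter period. Position and momentum
# variances are exchanged (scaled by w2), leaving one quadrature below
# its thermal value. Mass, temperature and frequencies are arbitrary.
rng = np.random.default_rng(5)
m, kT = 1.0, 1.0
w1, w2 = 1.0, 2.0
n = 100_000

x = rng.normal(0.0, np.sqrt(kT / (m * w1 ** 2)), n)   # thermal at w1
v = rng.normal(0.0, np.sqrt(kT / m), n)

t = np.pi / (2.0 * w2)                                # quarter period at w2
x_t = x * np.cos(w2 * t) + (v / w2) * np.sin(w2 * t)
v_t = -w2 * x * np.sin(w2 * t) + v * np.cos(w2 * t)

var_thermal = kT / (m * w1 ** 2)                      # reference variance at w1
squeezing_db = 10.0 * np.log10(var_thermal / x_t.var())
print(f"position variance: thermal={var_thermal:.3f}, after switch={x_t.var():.3f}")
print(f"velocity variance: thermal={kT / m:.3f}, after switch={v_t.var():.3f}")
print(f"squeezing of x-quadrature: {squeezing_db:.2f} dB")
# For w2 = 2*w1 the position variance drops by (w1/w2)^2 = 1/4, about
# 6 dB, at the cost of an inflated momentum variance.
```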
Compression selective solid-state chemistry
NASA Astrophysics Data System (ADS)
Hu, Anguang
Compression selective solid-state chemistry refers to mechanically induced selective reactions of solids under thermomechanical extreme conditions. Advanced quantum solid-state chemistry simulations, based on density functional theory with localized basis functions, were performed to provide remarkable insight into the bonding pathways of high-pressure chemical reactions, in full agreement with experiments. These pathways clearly demonstrate reaction mechanisms in unprecedented structural detail, showing not only the chemical identity of reactive intermediates but also how atoms move along the reaction coordinate associated with a specific vibrational mode, directed by the induced chemical stress that occurs during bond breaking and forming. This indicates that chemical bonds in solids can be broken and formed precisely under compression, as we wish. It can be realized by strongly coupling mechanical work to an initiating vibrational mode while all other modes are suppressed under compression, so that ultrafast reactions take place isothermally in a few femtoseconds. Thermodynamically, such reactions correspond to an entropy-minimum process on an isotherm where compression can force the thermal expansion coefficient to zero. Combining a significantly brief reaction process with specific mode selectivity, both statistical laws and the quantum uncertainty principle can be bypassed to break chemical bonds precisely, establishing the fundamental principles of compression selective solid-state chemistry. Naturally this leads to understanding the "alchemy" needed to purify, grow, and perfect certain materials such as emerging novel disruptive energetics.
What is the uncertainty principle of non-relativistic quantum mechanics?
NASA Astrophysics Data System (ADS)
Riggs, Peter J.
2018-05-01
After more than ninety years of discussions over the uncertainty principle, there is still no universal agreement on what the principle states. The Robertson uncertainty relation (incorporating standard deviations) is given as the mathematical expression of the principle in most quantum mechanics textbooks. However, the uncertainty principle is not merely a statement of what any of the several uncertainty relations affirm. It is suggested that a better approach would be to present the uncertainty principle as a statement about the probability distributions of incompatible variables and the resulting restrictions on quantum states.
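For reference, the Robertson relation mentioned above, in its standard textbook form:

```latex
% Robertson uncertainty relation for observables A and B, with the
% position-momentum pair as the familiar special case:
\sigma_{A}\,\sigma_{B} \;\ge\; \tfrac{1}{2}\,\bigl|\langle [\hat{A},\hat{B}] \rangle\bigr|,
\qquad
\sigma_{x}\,\sigma_{p} \;\ge\; \frac{\hbar}{2}.
```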
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. The book:
- Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians
- Covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more
- Deemphasizes computer coding in favor of basic principles
- Explains how to write out properly factored statistical expressions representing Bayesian models
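As a taste of the Markov chain Monte Carlo material the book covers, a minimal Metropolis sampler for a one-parameter posterior; the data and prior below are invented for illustration and are not an example from the book.

```python
import numpy as np

# Minimal Metropolis sampler for a one-parameter ecological example of
# the kind treated with MCMC: a detection probability p with a flat
# Beta(1, 1) prior and y successes out of n binomial trials. The data
# below are invented for illustration.
rng = np.random.default_rng(11)
y, n = 14, 40                      # observed detections / trials (made up)

def log_post(p):
    if not 0.0 < p < 1.0:
        return -np.inf             # outside the support
    return y * np.log(p) + (n - y) * np.log(1.0 - p)   # flat prior

samples, p = [], 0.5
for _ in range(20_000):
    prop = p + rng.normal(0.0, 0.1)          # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(p):
        p = prop                             # Metropolis accept
    samples.append(p)

burned = np.array(samples[2_000:])           # discard burn-in
print("posterior mean:", burned.mean())      # ~ (y+1)/(n+2) = 0.357
print("95% interval  :", np.percentile(burned, [2.5, 97.5]))
```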
CADDIS Volume 4. Data Analysis: Basic Principles & Issues
Use of inferential statistics in causal analysis, introduction to data independence and autocorrelation, methods to identify and control for confounding variables, and references for the Basic Principles section of Data Analysis.
Teaching for Statistical Literacy: Utilising Affordances in Real-World Data
ERIC Educational Resources Information Center
Chick, Helen L.; Pierce, Robyn
2012-01-01
It is widely held that context is important in teaching mathematics and statistics. Consideration of context is central to statistical thinking, and any teaching of statistics must incorporate this aspect. Indeed, it has been advocated that real-world data sets can motivate the learning of statistical principles. It is not, however, a…
Use of Statistical Heuristics in Everyday Inductive Reasoning.
ERIC Educational Resources Information Center
Nisbett, Richard E.; And Others
1983-01-01
In everyday reasoning, people use statistical heuristics (judgmental tools that are rough intuitive equivalents of statistical principles). Use of statistical heuristics is more likely when (1) sampling is clear, (2) the role of chance is clear, (3) statistical reasoning is normative for the event, or (4) the subject has had training in…
Statistical principle and methodology in the NISAN system.
Asano, C
1979-01-01
The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is applicable to both statistical situations, confirmatory analysis and exploratory analysis, and is intended to embody statistical wisdom and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594
Residual-QSAR. Implications for genotoxic carcinogenesis
2011-01-01
Introduction: Both main types of carcinogenesis, genotoxic and epigenetic, were examined in the context of non-congenericity and similarity, respectively, for the structure of ligand molecules, emphasizing the role of quantitative structure-activity relationship ((Q)SAR) studies in accordance with OECD (Organisation for Economic Co-operation and Development) regulations. The main purpose of this report involves electrophilic theory and the need for meaningful physicochemical parameters to describe genotoxicity by a general mechanism.
Residual-QSAR Method: The double or looping multiple linear correlation was examined by comparing the direct and residual structural information against the observed activity. A self-consistent equation of observed-computed activity was assumed to give maximum correlation efficiency for those situations in which the direct correlations gave non-significant statistical information. Alternatively, it was also suited to describe slow and apparently non-noticeable cancer phenomenology, with special application to non-congeneric molecules involved in genotoxic carcinogenesis.
Application and Discussions: The QSAR principles were systematically applied to a given pool of molecules with genotoxic activity in rats to elucidate their carcinogenic mechanisms. Once defined, the endpoint associated with ligand-DNA interaction was used to select variables that retained the main Hansch physicochemical parameters of hydrophobicity, polarizability and stericity, computed by the custom PM3 semiempirical quantum method. The trial and test sets of working molecules were established by implementing the normal Gaussian principle of activities that applies when the applicability domain is not restrained to congeneric compounds, as in the present study. The application of the residual, self-consistent QSAR method and the factor (or average) method yielded results characterized by extremely high and low correlations, respectively, with the latter resembling the direct activity-to-parameter QSARs. Nevertheless, such contrasted correlations were further incorporated into the advanced statistical minimum-paths principle, which selects the minimum hierarchy from Euclidean distances between all considered QSAR models for all combinations and considered molecular sets (i.e., school and validation). This ultimately led to a mechanistic picture based on the identified alpha, beta and gamma paths connecting structural indicators (i.e., the causes) to the global endpoint, with all included causes. The molecular mechanism preserved the self-consistent feature of the residual QSAR, with each descriptor appearing twice in the course of one cycle of ligand-DNA interaction through inter- and intra-cellular stages.
Conclusions: Both basal features of the residual-QSAR principle, self-consistency and suitability for non-congeneric molecules, make it appropriate for conceptually assessing the mechanistic description of genotoxic carcinogenesis. Additionally, it could be extended to enriched physicochemical structural indices by considering molecular fragments or structural alerts (or other molecular residues), providing more detailed maps of chemical-biological interactions and pathways. PMID:21668999
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
2017-06-01
The history of mathematical modeling outside physics has been dominated by the use of classical mathematical models, C-models, primarily those of a probabilistic or statistical nature. More recently, however, quantum mathematical models, Q-models, based in the mathematical formalism of quantum theory, have become more prominent in psychology, economics, and decision science. The use of Q-models in these fields remains controversial, in part because it is not entirely clear whether Q-models are necessary for dealing with the phenomena in question or whether C-models would still suffice. My aim, however, is not to assess the necessity of Q-models in these fields, but instead to reflect on what the possible applicability of Q-models may tell us about the corresponding phenomena there, vis-à-vis quantum phenomena in physics. In order to do so, I shall first discuss the key reasons for the use of Q-models in physics. In particular, I shall examine the fundamental principles that led to the development of quantum mechanics. Then I shall consider a possible role of similar principles in using Q-models outside physics. Psychology, economics, and decision science borrow already available Q-models from quantum theory, rather than derive them from their own internal principles, while quantum mechanics was derived from such principles, because there was no readily available mathematical model to handle quantum phenomena, although the mathematics ultimately used in quantum mechanics did in fact exist then. I shall argue, however, that the principle perspective on mathematical modeling outside physics might help us to understand better the role of Q-models in these fields and possibly to envision new models, conceptually analogous to but mathematically different from those of quantum theory, helpful or even necessary there or in physics itself. I shall suggest one possible type of such models, singularized probabilistic, SP, models, some of which are time-dependent, TDSP-models. The necessity of using such models may change the nature of mathematical modeling in science and, thus, the nature of science, as happened in the case of Q-models, which not only led to a revolutionary transformation of physics but also opened new possibilities for scientific thinking and mathematical modeling beyond physics.
NASA Technical Reports Server (NTRS)
Wolf, S. F.; Lipschutz, M. E.
1993-01-01
Multivariate statistical analysis techniques (linear discriminant analysis and logistic regression) can provide powerful discrimination tools which are generally unfamiliar to the planetary science community. Fall parameters were used to identify a group of 17 H chondrites (Cluster 1) that were part of a coorbital stream which intersected Earth's orbit in May, from 1855-1895, and can be distinguished from all other H chondrite falls. Using multivariate statistical techniques, it was demonstrated that by a totally different criterion, the labile trace element contents (hence thermal histories) of 13 Cluster 1 meteorites are distinguishable from those of 45 non-Cluster 1 H chondrites. Here, we focus upon the principles of multivariate statistical techniques and illustrate their application using non-meteoritic and meteoritic examples.
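A sketch of the discrimination workflow described above, on synthetic data with scikit-learn; the group sizes echo the abstract, but the feature values are invented, so only the workflow, not the numbers, is meaningful.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Sketch of the multivariate discrimination described above, on
# synthetic data: two "meteorite" groups that differ modestly in a few
# labile trace-element contents. Feature means/spreads are invented;
# the point is the workflow (LDA + cross-validation), not the values.
rng = np.random.default_rng(13)
n1, n2, n_elements = 13, 45, 5          # group sizes echo the abstract

cluster1 = rng.normal(loc=1.0, scale=0.4, size=(n1, n_elements))
others = rng.normal(loc=0.6, scale=0.4, size=(n2, n_elements))

X = np.vstack([cluster1, others])
labels = np.array([1] * n1 + [0] * n2)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, labels, cv=5)
print("cross-validated discrimination accuracy:", scores.mean().round(2))
```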
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klink, W.H.; Wickramasekara, S., E-mail: wickrama@grinnell.edu; Department of Physics, Grinnell College, Grinnell, IA 50112
2014-01-15
In previous work we have developed a formulation of quantum mechanics in non-inertial reference frames. This formulation is grounded in a class of unitary cocycle representations of what we have called the Galilean line group, the generalization of the Galilei group that includes transformations amongst non-inertial reference frames. These representations show that in quantum mechanics, just as is the case in classical mechanics, the transformations to accelerating reference frames give rise to fictitious forces. A special feature of these previously constructed representations is that they all respect the non-relativistic equivalence principle, wherein the fictitious forces associated with linear acceleration can equivalently be described by gravitational forces. In this paper we exhibit a large class of cocycle representations of the Galilean line group that violate the equivalence principle. Nevertheless the classical mechanics analogues of these cocycle representations all respect the equivalence principle. -- Highlights: •A formulation of Galilean quantum mechanics in non-inertial reference frames is given. •The key concept is the Galilean line group, an infinite dimensional group. •A large class of general cocycle representations of the Galilean line group is constructed. •These representations show violations of the equivalence principle at the quantum level. •At the classical limit, no violations of the equivalence principle are detected.
NASA Astrophysics Data System (ADS)
Beretta, Gian Paolo
2014-10-01
By suitable reformulations, we cast the mathematical frameworks of several well-known different approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. Actually, the present formulation constitutes a generalization also for the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric with respect to which to measure the length of a trajectory in state space. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to the Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field which generalizes the Onsager generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the different approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle. Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent, numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far nonequilibrium states. The mathematical frameworks we consider are the following: (A) statistical or information-theoretic models of relaxation; (B) small-scale and rarefied gas dynamics (i.e., kinetic models for the Boltzmann equation); (C) rational extended thermodynamics, macroscopic nonequilibrium thermodynamics, and chemical kinetics; (D) mesoscopic nonequilibrium thermodynamics, continuum mechanics with fluctuations; and (E) quantum statistical mechanics, quantum thermodynamics, mesoscopic nonequilibrium quantum thermodynamics, and intrinsic quantum thermodynamics.
Sustaining organizational culture change in health systems.
Willis, Cameron David; Saul, Jessie; Bevan, Helen; Scheirer, Mary Ann; Best, Allan; Greenhalgh, Trisha; Mannion, Russell; Cornelissen, Evelyn; Howland, David; Jenkins, Emily; Bitz, Jennifer
2016-01-01
The questions addressed by this review are: first, what are the guiding principles underlying efforts to stimulate sustained cultural change; second, what are the mechanisms by which these principles operate; and, finally, what are the contextual factors that influence the likelihood of these principles being effective? The paper aims to discuss these issues. The authors conducted a literature review informed by rapid realist review methodology that examined how interventions interact with contexts and mechanisms to influence the sustainability of cultural change. Reference and expert panelists assisted in refining the research questions, systematically searching published and grey literature, and helping to identify interactions between interventions, mechanisms and contexts. Six guiding principles were identified: align vision and action; make incremental changes within a comprehensive transformation strategy; foster distributed leadership; promote staff engagement; create collaborative relationships; and continuously assess and learn from change. These principles interact with contextual elements such as local power distributions, pre-existing values and beliefs and readiness to engage. Mechanisms influencing how these principles sustain cultural change include activation of a shared sense of urgency and fostering flexible levels of engagement. The principles identified in this review, along with the contexts and mechanisms that influence their effectiveness, are useful domains for policy and practice leaders to explore when grappling with cultural change. These principles are sufficiently broad to allow local flexibilities in adoption and application. This is the first study to adopt a realist approach for understanding how changes in organizational culture may be sustained. Through doing so, this review highlights the broad principles by which organizational action may be organized within enabling contextual settings.
Quantum Mechanics predicts evolutionary biology.
Torday, J S
2018-07-01
Nowhere are the shortcomings of conventional descriptive biology more evident than in the literature on Quantum Biology. In the on-going effort to apply Quantum Mechanics to evolutionary biology, merging Quantum Mechanics with the fundamentals of evolution as the First Principles of Physiology (namely negentropy, chemiosmosis, and homeostasis) offers an authentic opportunity to understand how and why physics constitutes the basic principles of biology. Negentropy and chemiosmosis confer determinism on the unicell, whereas homeostasis constitutes Free Will because it offers a probabilistic range of physiologic set points. Similarly, on this basis several principles of Quantum Mechanics also apply directly to biology. The Pauli Exclusion Principle is both deterministic and probabilistic, whereas non-localization and the Heisenberg Uncertainty Principle are both probabilistic, providing for the first time the long-sought-after ontologic and causal continuum from physics to biology and evolution as the holistic integration recognized as consciousness. Copyright © 2018 Elsevier Ltd. All rights reserved.
Properties of J^P = 1/2^+ baryon octets at low energy
NASA Astrophysics Data System (ADS)
Kaur, Amanpreet; Gupta, Pallavi; Upadhyay, Alka
2017-06-01
The statistical model in combination with the detailed balance principle is able to phenomenologically calculate and analyze spin- and flavor-dependent properties like magnetic moments (with effective masses, with effective charge, or with both effective mass and effective charge), quark spin polarization and distribution, the strangeness suppression factor, and the \overline{d}-\overline{u} asymmetry incorporating the strange sea. The s\overline{s} in the sea is said to be generated via the basic quark mechanism but suppressed by the strange quark mass factor m_s > m_{u,d}. The magnetic moments of the octet baryons are analyzed within the statistical model, by putting emphasis on the SU(3) symmetry-breaking effects generated by the mass difference between the strange and non-strange quarks. The work presented here assumes hadrons with a sea having an admixture of quark gluon Fock states. The results obtained have been compared with theoretical models and experimental data.
The statistical geometry of transcriptome divergence in cell-type evolution and cancer.
Liang, Cong; Forrest, Alistair R R; Wagner, Günter P
2015-01-14
In evolution, body plan complexity increases due to an increase in the number of individualized cell types. Yet, there is very little understanding of the mechanisms that produce this form of organismal complexity. One model for the origin of novel cell types is the sister cell-type model. According to this model, each cell type arises together with a sister cell type through specialization from an ancestral cell type. A key prediction of the sister cell-type model is that gene expression profiles of cell types exhibit tree structure. Here we present a statistical model for detecting tree structure in transcriptomic data and apply it to transcriptomes from ENCODE and FANTOM5. We show that transcriptomes of normal cells harbour substantial amounts of hierarchical structure. In contrast, cancer cell lines have less tree structure, suggesting that the emergence of cancer cells follows different principles from that of evolutionary cell-type origination.
Quantum-statistical theory of microwave detection using superconducting tunnel junctions
NASA Astrophysics Data System (ADS)
Deviatov, I. A.; Kuzmin, L. S.; Likharev, K. K.; Migulin, V. V.; Zorin, A. B.
1986-09-01
A quantum-statistical theory of microwave and millimeter-wave detection using superconducting tunnel junctions is developed, with a rigorous account of quantum, thermal, and shot noise arising from fluctuation sources associated with the junctions, signal source, and matching circuits. The problem of noise characterization in the quantum sensitivity range is considered and a general noise parameter Θ_N is introduced. This parameter is shown to be an adequate figure of merit for most receivers of interest, while some devices can require a more complex characterization. Analytical expressions and/or numerically calculated plots of Θ_N are presented for the most promising detection modes, including parametric amplification, heterodyne mixing, and quadratic videodetection, using both the quasiparticle-current and the Cooper-pair-current nonlinearities. Ultimate minimum values of Θ_N for each detection mode are compared and found to be in agreement with limitations imposed by the quantum-mechanical uncertainty principle.
Quantitative analysis of spatial variability of geotechnical parameters
NASA Astrophysics Data System (ADS)
Fang, Xing
2018-04-01
Geotechnical parameters are the basic inputs to geotechnical engineering design, and they have strong regional characteristics. Their spatial variability is now widely recognized and is gradually being introduced into reliability analysis in geotechnical engineering. Based on geostatistical theory, the spatial variability of geotechnical parameters is quantitatively analyzed here, and the correlation coefficients between parameters are evaluated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object; it contains 68 boreholes and 9 mechanically distinct strata. The parameters considered are water content, natural unit weight, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion, and SP index. Correlation coefficients between the geotechnical parameters are computed according to statistical correlation principles, and from these coefficients the regional behavior of the parameters is characterized.
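A standard geostatistical tool behind this kind of analysis is the empirical semivariogram, γ(h) = ½ E[(Z(x+h) − Z(x))²]. The minimal Python sketch below is our illustration, not the paper's code: the borehole coordinates and water-content values are made up, and the paper's actual estimator may differ.

    import numpy as np

    def empirical_semivariogram(coords, values, lags, tol):
        # Classical (Matheron) estimator: half the mean squared increment
        # over all point pairs whose separation falls within lag +/- tol.
        coords = np.asarray(coords, float)
        values = np.asarray(values, float)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq = (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)      # count each pair once
        d, sq = d[iu], sq[iu]
        gamma = []
        for h in lags:
            sel = np.abs(d - h) <= tol
            gamma.append(0.5 * sq[sel].mean() if sel.any() else np.nan)
        return np.array(gamma)

    # usage with hypothetical borehole positions and water-content values
    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 100, size=(68, 2))           # 68 boreholes
    w = 20 + 0.05 * xy[:, 0] + rng.normal(0, 1, 68)  # gentle spatial trend
    print(empirical_semivariogram(xy, w, lags=[10, 20, 40], tol=5))

A semivariogram that keeps rising with lag, as this toy trend produces, is the quantitative signature of spatial variability the abstract describes.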
Automated sampling assessment for molecular simulations using the effective sample size
Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.
2010-01-01
To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of the sampling quality is needed. Statistical mechanics principles suggest the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to both traditional dynamics simulations as well as more modern (e.g., multicanonical) approaches. Our procedure is tested in a variety of systems from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
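The variance-based idea can be made concrete with a short sketch. This is our construction, not the authors' code: it splits a single trajectory into blocks and compares the observed variance of block-wise state populations against the binomial variance expected from independent sampling (the published method instead derives variances from many independent simulations).

    import numpy as np

    def effective_sample_size(labels, n_blocks=10):
        # If the trajectory held N truly independent configurations, the
        # state-population estimate from each block would fluctuate with
        # variance p(1-p)/(N/n_blocks); invert that to estimate N.
        labels = np.asarray(labels)
        blocks = np.array_split(labels, n_blocks)
        ess = np.inf
        for s in np.unique(labels):
            p_hat = np.array([(b == s).mean() for b in blocks])
            p, var = p_hat.mean(), p_hat.var(ddof=1)
            if 0 < p < 1 and var > 0:
                ess = min(ess, n_blocks * p * (1 - p) / var)
        return ess  # the worst-sampled state limits overall quality

    # usage: a correlated two-state trajectory (50-frame dwell times)
    rng = np.random.default_rng(0)
    traj = np.repeat(rng.integers(0, 2, size=200), 50)
    print(effective_sample_size(traj))  # far below the 10000 raw frames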
Total Quality Leadership as it Applies to the Surface Navy
1990-12-01
with statistical control methods. Dr. Deming opened the eyes of the Japanese. They embraced his ideas and accepted his 14 principles of management shown... move closer to fully embracing Deming's fourteen principles of management. 3. Shipboard Leadership Compared to TQL. Many activities on board Navy ships... The results of the comparison of Deming's principles of management and the Navalized TQL principles show that both similarities and differences appear.
The energetic cost of walking: a comparison of predictive methods.
Kramer, Patricia Ann; Sylvester, Adam D
2011-01-01
The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is "best", but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. Although we used modern humans as our model organism, these results can be extended to other species.
Relativistic quantum chaos-An emergent interdisciplinary field.
Lai, Ying-Cheng; Xu, Hong-Ya; Huang, Liang; Grebogi, Celso
2018-05-01
Quantum chaos refers to the study of quantum manifestations or fingerprints of classical chaos. A vast majority of the studies were for nonrelativistic quantum systems described by the Schrödinger equation. Recent years have witnessed a rapid development of Dirac materials such as graphene and topological insulators, which are described by the Dirac equation in relativistic quantum mechanics. A new field has thus emerged: relativistic quantum chaos. This Tutorial aims to introduce this field to the scientific community. Topics covered include scarring, chaotic scattering and transport, chaos regularized resonant tunneling, superpersistent currents, and energy level statistics, all in the relativistic quantum regime. As Dirac materials have the potential to revolutionize solid-state electronic and spintronic devices, a good understanding of the interplay between chaos and relativistic quantum mechanics may lead to novel design principles and methodologies to enhance device performance.
Mechanical Metamaterials with Negative Compressibility Transitions
NASA Astrophysics Data System (ADS)
Motter, Adilson
2015-03-01
When tensioned, ordinary materials expand along the direction of the applied force. In this presentation, I will explore network concepts to design metamaterials exhibiting negative compressibility transitions, during which the material undergoes contraction when tensioned (or expansion when pressured). Such transitions, which are forbidden in thermodynamic equilibrium, are possible during the decay of metastable, super-strained states. I will introduce a statistical physics theory for negative compressibility transitions, derive a first-principles model to predict these transitions, and present a validation of the model using molecular dynamics simulations. Aside from its immediate mechanical implications, our theory points to a wealth of analogous inverted responses, such as inverted susceptibility or heat-capacity transitions, allowed when considering realistic scales. This research was done in collaboration with Zachary Nicolaou, and was supported by the National Science Foundation and the Alfred P. Sloan Foundation.
The computational nature of memory modification.
Gershman, Samuel J; Monfils, Marie-H; Norman, Kenneth A; Niv, Yael
2017-03-15
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature.
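A minimal sketch of the structure-learning step follows, under our own simplifying assumptions (a Chinese-restaurant-process prior and a stand-in flat likelihood for a new cause; this is not the authors' implementation):

    import numpy as np

    def latent_cause_posterior(similarities, counts, alpha=1.0):
        # One inference step in a latent-cause model (schematic sketch).
        # 'similarities' holds the likelihood of the current observation
        # under each existing cause; 'counts' holds how often each cause
        # was inferred before. Returns a posterior over old causes plus
        # one potential new cause.
        # CRP prior: popular causes are more likely; alpha controls the
        # propensity to posit a brand-new cause.
        prior = np.append(counts, alpha).astype(float)
        prior /= prior.sum()
        # A new cause is maximally uncommitted; we model its likelihood
        # with the mean of the existing ones (an assumption).
        lik = np.append(similarities, np.mean(similarities))
        post = prior * lik
        return post / post.sum()

    # usage: two old causes; the observation resembles cause 0, so the
    # old memory (cause 0) becomes eligible for modification
    print(latent_cause_posterior(similarities=[0.8, 0.1], counts=[5, 2]))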
Specific features of goal setting in road traffic safety
NASA Astrophysics Data System (ADS)
Kolesov, V. I.; Danilov, O. F.; Petrov, A. I.
2017-10-01
Road traffic safety (RTS) management is inherently a branch of cybernetics and therefore requires clear formalization of the task. The paper aims at identifying the specific features of goal setting in RTS management under the system approach. It presents the results of cybernetic modeling of the cause-to-effect mechanism of a road traffic accident (RTA); here, the mechanism itself is viewed as a complex system. The designed management goal function focuses on minimizing the difficulty of achieving the target goal. Optimization of the target goal has been performed using the Lagrange principle. The resulting algorithms have passed software testing. The key role of the obtained solution in tactical and strategic RTS management is considered. The dynamics of the management effectiveness indicator has been analyzed based on ten-year statistics for Russia.
Application of the principle of similarity to fluid mechanics
NASA Technical Reports Server (NTRS)
Hendricks, R. C.; Sengers, J. V.
1979-01-01
The principle of similarity applied to fluid mechanics is described and illustrated. The concept of transforming the conservation equations by combining similarity principles for thermophysical properties with those for fluid flow is examined. The usefulness of the procedure is illustrated by applying such a transformation to calculate two-phase critical mass flow through a nozzle.
Physical Regulation of the Self-Assembly of Tobacco Mosaic Virus Coat Protein
Kegel, Willem K.; van der Schoot, Paul
2006-01-01
We present a statistical mechanical model based on the principle of mass action that explains the main features of the in vitro aggregation behavior of the coat protein of tobacco mosaic virus (TMV). By comparing our model to experimentally obtained stability diagrams, titration experiments, and calorimetric data, we pin down three competing factors that regulate the transitions between the different kinds of aggregated state of the coat protein. These are hydrophobic interactions, electrostatic interactions, and the formation of so-called “Caspar” carboxylate pairs. We suggest that these factors could be universal and relevant to a large class of virus coat proteins. PMID:16731551
Efficient Parameter Searches for Colloidal Materials Design with Digital Alchemy
NASA Astrophysics Data System (ADS)
Dodd, Paul M.; Geng, Yina; van Anders, Greg; Glotzer, Sharon C.
Optimal colloidal materials design is challenging, even for high-throughput or genomic approaches, because the design space provided by modern colloid synthesis techniques can easily have dozens of dimensions. In this talk we present the methodology of an inverse approach we term 'digital alchemy' to perform rapid searches of design-parameter spaces with up to 188 dimensions that yield thermodynamically optimal colloid parameters for target crystal structures with up to 20 particles in a unit cell. The method relies only on fundamental principles of statistical mechanics and Metropolis Monte Carlo techniques, and yields particle attribute tolerances via analogues of familiar stress-strain relationships.
Hauptmann, C; Roulet, J-C; Niederhauser, J J; Döll, W; Kirlangic, M E; Lysyansky, B; Krachkovskyi, V; Bhatti, M A; Barnikol, U B; Sasse, L; Bührle, C P; Speckmann, E-J; Götz, M; Sturm, V; Freund, H-J; Schnell, U; Tass, P A
2009-12-01
In the past decade deep brain stimulation (DBS), the application of electrical stimulation to specific target structures via implanted depth electrodes, has become the standard treatment for medically refractory Parkinson's disease and essential tremor. These diseases are characterized by pathological synchronized neuronal activity in particular brain areas. We present an external trial DBS device capable of administering effectively desynchronizing stimulation techniques developed with methods from nonlinear dynamics and statistical physics according to a model-based approach. These techniques exploit either stochastic phase resetting principles or complex delayed-feedback mechanisms. We explain how these methods are implemented into a safe and user-friendly device.
From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''
NASA Astrophysics Data System (ADS)
Bergeron, H.
2001-09-01
Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L²-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both logical and analytical levels). To obtain this unified framework, we split quantum theory in two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L²(ℝⁿ); the Heisenberg rule [p_i, q_j] = -iℏδ_{ij} with p = -iℏ∇, the free Hamiltonian H = -ℏ²Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate physical ideas and equations of ordinary classical statistical mechanics. So, the question of a "true quantization" with "ℏ" must be seen as an independent physical problem not directly related to quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). Moreover spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].
Natural image statistics mediate brightness 'filling in'.
Dakin, Steven C; Bex, Peter J
2003-11-22
Although the human visual system can accurately estimate the reflectance (or lightness) of surfaces under enormous variations in illumination, two equiluminant grey regions can be induced to appear quite different simply by placing a light-dark luminance transition between them. This illusion, the Craik-Cornsweet-O'Brien (CCOB) effect, has been taken as evidence for a low-level 'filling-in' mechanism subserving lightness perception. Here, we present evidence that the mechanism responsible for the CCOB effect operates not via propagation of a neural signal across space but by amplification of the low spatial frequency (SF) structure of the image. We develop a simple computational model that relies on the statistics of natural scenes actively to reconstruct the image that is most likely to have caused an observed series of responses across SF channels. This principle is tested psychophysically by deriving classification images (CIs) for subjects' discrimination of the contrast polarity of CCOB stimuli masked with noise. CIs resemble 'filled-in' stimuli; i.e. observers rely on portions of the stimuli that contain no information per se but that correspond closely to the reported perceptual completion. As predicted by the model, the filling-in process is contingent on the presence of appropriate low SF structure.
Intermediate/Advanced Research Design and Statistics
NASA Technical Reports Server (NTRS)
Ploutz-Snyder, Robert
2009-01-01
The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and the intermediate/advanced statistical procedures consistent with such designs.
The Statistical Power of Planned Comparisons.
ERIC Educational Resources Information Center
Benton, Roberta L.
Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…
Yuan, Zhongshang; Ji, Jiadong; Zhang, Tao; Liu, Yi; Zhang, Xiaoshuai; Chen, Wei; Xue, Fuzhong
2016-12-20
Traditional epidemiology often pays more attention to the identification of a single factor than to the pathway that is related to a disease, and therefore, it is difficult to explore the disease mechanism. Systems epidemiology aims to integrate putative lifestyle exposures and biomarkers extracted from multiple omics platforms to offer new insights into the pathway mechanisms that underlie disease at the human population level. One key but inadequately addressed question is how to develop powerful statistics to identify whether one candidate pathway is associated with a disease. Bearing in mind that a pathway difference can result not only from changes in the nodes but also from changes in the edges, we propose a novel statistic for detecting group differences between pathways, which, in principle, captures both node and edge changes while simultaneously accounting for the pathway structure. The proposed test has been proven to follow the chi-square distribution, and various simulations have shown it has better performance than other existing methods. Integrating genome-wide DNA methylation data, we analyzed one real data set from the Bogalusa cohort study and significantly identified a potential pathway, Smoking → SOCS3 → PIK3R1, which was strongly associated with abdominal obesity. The proposed test was powerful and efficient at identifying pathway differences between two groups, and it can be extended to other disciplines that involve statistical comparisons between pathways. The source code in R is available on our website. Copyright © 2016 John Wiley & Sons, Ltd.
Sensitivity to the Sampling Process Emerges From the Principle of Efficiency.
Jara-Ettinger, Julian; Sun, Felix; Schulz, Laura; Tenenbaum, Joshua B
2018-05-01
Humans can seamlessly infer other people's preferences, based on what they do. Broadly, two types of accounts have been proposed to explain different aspects of this ability. The first account focuses on spatial information: Agents' efficient navigation in space reveals what they like. The second account focuses on statistical information: Uncommon choices reveal stronger preferences. Together, these two lines of research suggest that we have two distinct capacities for inferring preferences. Here we propose that this is not the case, and that spatial-based and statistical-based preference inferences can both be explained by the single assumption that agents are efficient. We show that people's sensitivity to spatial and statistical information when they infer preferences is best predicted by a computational model of the principle of efficiency, and that this model outperforms dual-system models, even when the latter are fit to participant judgments. Our results suggest that, as adults, a unified understanding of agency under the principle of efficiency underlies our ability to infer preferences. Copyright © 2018 Cognitive Science Society, Inc.
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the agricultural mechanics component of the Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…
NASA Astrophysics Data System (ADS)
Nguyen, Huu Chuong; Szyja, Bartłomiej M.; Doltsinis, Nikos L.
2014-09-01
Density functional theory (DFT) based molecular dynamics simulations have been performed of a 1,4-benzenedithiol molecule attached to two gold electrodes. To model the mechanical manipulation in typical break junction and atomic force microscopy experiments, the distance between two electrodes was incrementally increased up to the rupture point. For each pulling distance, the electric conductance was calculated using the DFT nonequilibrium Green's-function approach for a statistically relevant sample of configurations extracted from the simulation. With increasing mechanical strain, the formation of monoatomic gold wires is observed. The conductance decreases by three orders of magnitude as the initial twofold coordination of the thiol sulfur to the gold is reduced to a single S-Au bond at each electrode and the order in the electrodes is destroyed. Independent of the pulling distance, the conductance was found to fluctuate by at least two orders of magnitude depending on the instantaneous junction geometry.
Einstein's equivalence principle in quantum mechanics revisited
NASA Astrophysics Data System (ADS)
Nauenberg, Michael
2016-11-01
The gravitational equivalence principle in quantum mechanics is of considerable importance, but it is generally not included in physics textbooks. In this note, we present a precise quantum formulation of this principle and comment on its verification in a neutron diffraction experiment. The solution of the time dependent Schrödinger equation for this problem also gives the wave function for the motion of a charged particle in a homogeneous electric field, which is also usually ignored in textbooks on quantum mechanics.
Teaching Principles of Linkage and Gene Mapping with the Tomato.
ERIC Educational Resources Information Center
Hawk, James A.; And Others
1980-01-01
A three-point linkage system in tomatoes is used to explain concepts of gene mapping, linkage, and statistical analysis. The system is designed for teaching the effective use of statistics, and the power of genetic analysis from statistical analysis of phenotypic ratios. (Author/SA)
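As an illustration of the kind of statistical analysis such an exercise teaches, here is a short Python sketch with hypothetical two-locus testcross counts (the counts and the 1:1:1:1 expectation are our assumptions for a testcross, not data from the article):

    from scipy.stats import chisquare

    # Hypothetical testcross counts for two tomato loci (AaBb x aabb).
    # Under independent assortment the four phenotype classes appear in
    # a 1:1:1:1 ratio; a large deviation suggests the loci are linked.
    observed = [402, 96, 104, 398]  # parental, recomb., recomb., parental
    total = sum(observed)

    stat, p = chisquare(observed)   # equal expected counts by default
    print(f"chi2 = {stat:.1f}, p = {p:.2e}")  # tiny p: reject independence

    # In a testcross the map distance follows directly from the
    # recombinant fraction (1 map unit = 1% recombination).
    rf = (observed[1] + observed[2]) / total
    print(f"recombinant fraction = {rf:.2f} -> ~{100 * rf:.0f} map units")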
Rational integration of noisy evidence and prior semantic expectations in sentence interpretation.
Gibson, Edward; Bergen, Leon; Piantadosi, Steven T
2013-05-14
Sentence processing theories typically assume that the input to our language processing mechanisms is an error-free sequence of words. However, this assumption is an oversimplification because noise is present in typical language use (for instance, due to a noisy environment, producer errors, or perceiver errors). A complete theory of human sentence comprehension therefore needs to explain how humans understand language given imperfect input. Indeed, like many cognitive systems, language processing mechanisms may even be "well designed"--in this case for the task of recovering intended meaning from noisy utterances. In particular, comprehension mechanisms may be sensitive to the types of information that an idealized statistical comprehender would be sensitive to. Here, we evaluate four predictions about such a rational (Bayesian) noisy-channel language comprehender in a sentence comprehension task: (i) semantic cues should pull sentence interpretation towards plausible meanings, especially if the wording of the more plausible meaning is close to the observed utterance in terms of the number of edits; (ii) this process should asymmetrically treat insertions and deletions due to the Bayesian "size principle"; such nonliteral interpretation of sentences should (iii) increase with the perceived noise rate of the communicative situation and (iv) decrease if semantically anomalous meanings are more likely to be communicated. These predictions are borne out, strongly suggesting that human language relies on rational statistical inference over a noisy channel.
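A toy sketch of such a rational noisy-channel comprehender follows. This is our construction for illustration: the candidate sentences, their priors, and the geometric edit-distance likelihood are assumptions, not the authors' model.

    import numpy as np

    def edit_distance(a, b):
        # Standard Levenshtein distance over word sequences.
        m, n = len(a), len(b)
        d = np.zeros((m + 1, n + 1), int)
        d[:, 0], d[0, :] = range(m + 1), range(n + 1)
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                d[i, j] = min(d[i-1, j] + 1, d[i, j-1] + 1,
                              d[i-1, j-1] + (a[i-1] != b[j-1]))
        return d[m, n]

    def noisy_channel_posterior(observed, prior, noise=0.1):
        # P(intended | observed) ~ P(intended) * P(observed | intended),
        # with a likelihood that decays geometrically in edit distance.
        scores = {s: p * noise ** edit_distance(observed.split(), s.split())
                  for s, p in prior.items()}
        z = sum(scores.values())
        return {s: v / z for s, v in scores.items()}

    # usage: a semantically plausible meaning one edit away wins
    obs = "the mother gave the candle the daughter"
    prior = {"the mother gave the candle the daughter": 0.01,
             "the mother gave the candle to the daughter": 0.99}
    print(noisy_channel_posterior(obs, prior))

With these numbers the one-insertion reading dominates, reproducing prediction (i): semantic cues pull interpretation toward the plausible meaning when its wording is close to the observed utterance.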
Applying Regression Analysis to Problems in Institutional Research.
ERIC Educational Resources Information Center
Bohannon, Tom R.
1988-01-01
Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)
Werner, Gerhard
2009-04-01
In this theoretical and speculative essay, I propose that insights into certain aspects of neural system functions can be gained from viewing brain function in terms of the branch of Statistical Mechanics currently referred to as "Modern Critical Theory" [Stanley, H.E., 1987. Introduction to Phase Transitions and Critical Phenomena. Oxford University Press; Marro, J., Dickman, R., 1999. Nonequilibrium Phase Transitions in Lattice Models. Cambridge University Press, Cambridge, UK]. The application of this framework is here explored in two stages: in the first place, its principles are applied to state transitions in global brain dynamics, with benchmarks of Cognitive Neuroscience providing the relevant empirical reference points. The second stage generalizes to suggest in more detail how the same principles could also apply to the relation between other levels of the structural-functional hierarchy of the nervous system and between neural assemblies. In this view, state transitions resulting from the processing at one level are the input to the next, in the image of a 'bucket brigade', with the content of each bucket being passed on along the chain, after having undergone a state transition. The unique features of a process of this kind will be discussed and illustrated.
A sub-ensemble theory of ideal quantum measurement processes
NASA Astrophysics Data System (ADS)
Allahverdyan, Armen E.; Balian, Roger; Nieuwenhuizen, Theo M.
2017-01-01
In order to elucidate the properties currently attributed to ideal measurements, one must explain how the concept of an individual event with a well-defined outcome may emerge from quantum theory, which deals with statistical ensembles, and how different runs issued from the same initial state may end up with different final states. This so-called "measurement problem" is tackled with two guidelines. On the one hand, the dynamics of the macroscopic apparatus A coupled to the tested system S is described mathematically within a standard quantum formalism, where "q-probabilities" remain devoid of interpretation. On the other hand, interpretative principles, aimed to be minimal, are introduced to account for the expected features of ideal measurements. Most of the five principles stated here, which relate the quantum formalism to physical reality, are straightforward and refer to macroscopic variables. The process can be identified with a relaxation of S + A to thermodynamic equilibrium, not only for a large ensemble E of runs but even for its sub-ensembles. The different mechanisms of quantum statistical dynamics that ensure these types of relaxation are exhibited, and the required properties of the Hamiltonian of S + A are indicated. The additional theoretical information provided by the study of sub-ensembles removes Schrödinger's quantum ambiguity of the final density operator for E, which hinders its direct interpretation, and brings out a commutative behaviour of the pointer observable at the final time. The latter property supports the introduction of a last interpretative principle, needed to switch from the statistical ensembles and sub-ensembles described by quantum theory to individual experimental events. It amounts to identifying some formal "q-probabilities" with ordinary frequencies, but only those which refer to the final indications of the pointer. The desired properties of ideal measurements, in particular the uniqueness of the result for each individual run of the ensemble and von Neumann's reduction, are thereby recovered with economic interpretations. The status of Born's rule involving both A and S is re-evaluated, and contextuality of quantum measurements is made obvious.
Fundamentals of Diesel Engines.
ERIC Educational Resources Information Center
Marine Corps Inst., Washington, DC.
This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…
How Do Students in an Innovative Principle-Based Mechanics Course Understand Energy Concepts?
ERIC Educational Resources Information Center
Ding, Lin; Chabay, Ruth; Sherwood, Bruce
2013-01-01
We investigated students' conceptual learning of energy topics in an innovative college-level introductory mechanics course, entitled Matter & Interactions (M&I) Modern Mechanics. This course differs from traditional curricula in that it emphasizes application of a small number of fundamental principles across various scales, involving…
NASA Astrophysics Data System (ADS)
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better understanding of the behavior of these systems.
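In generic MaxEnt terms, the construction the thesis describes has this schematic form (our notation: Λ stands for the single integral of motion; the thesis's own symbols may differ). Maximizing the entropy over network states subject to normalization and the conserved quantity yields a Boltzmann-like steady-state distribution:

    % Maximum entropy with one integral of motion as the constraint:
    \max_{\{p_s\}} \Big[ -\sum_s p_s \ln p_s \Big]
    \quad \text{s.t.} \quad
    \sum_s p_s = 1, \qquad \sum_s p_s \,\Lambda_s = \langle \Lambda \rangle
    \;\;\Longrightarrow\;\;
    p_s = \frac{e^{-\beta \Lambda_s}}{Z(\beta)} ,

where the Lagrange multiplier β plays the role of the parameter the thesis interprets and estimates numerically.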
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018. The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and Chi-Squared distribution.
ERIC Educational Resources Information Center
Tu, Wendy; Snyder, Martha M.
2017-01-01
Difficulties in learning statistics primarily at the college-level led to a reform movement in statistics education in the early 1990s. Although much work has been done, effective learning designs that facilitate active learning, conceptual understanding of statistics, and the use of real-data in the classroom are needed. Guided by Merrill's First…
The computational nature of memory modification
Gershman, Samuel J; Monfils, Marie-H; Norman, Kenneth A; Niv, Yael
2017-01-01
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature. DOI: http://dx.doi.org/10.7554/eLife.23763.001 PMID:28294944
Teaching the principles of statistical dynamics
Ghosh, Kingshuk; Dill, Ken A.; Inamdar, Mandar M.; Seitaridou, Effrosyni; Phillips, Rob
2012-01-01
We describe a simple framework for teaching the principles that underlie the dynamical laws of transport: Fick’s law of diffusion, Fourier’s law of heat flow, the Newtonian viscosity law, and the mass-action laws of chemical kinetics. In analogy with the way that the maximization of entropy over microstates leads to the Boltzmann distribution and predictions about equilibria, maximizing a quantity that E. T. Jaynes called “caliber” over all the possible microtrajectories leads to these dynamical laws. The principle of maximum caliber also leads to dynamical distribution functions that characterize the relative probabilities of different microtrajectories. A great source of recent interest in statistical dynamics has resulted from a new generation of single-particle and single-molecule experiments that make it possible to observe dynamics one trajectory at a time. PMID:23585693
Teaching the principles of statistical dynamics.
Ghosh, Kingshuk; Dill, Ken A; Inamdar, Mandar M; Seitaridou, Effrosyni; Phillips, Rob
2006-02-01
We describe a simple framework for teaching the principles that underlie the dynamical laws of transport: Fick's law of diffusion, Fourier's law of heat flow, the Newtonian viscosity law, and the mass-action laws of chemical kinetics. In analogy with the way that the maximization of entropy over microstates leads to the Boltzmann distribution and predictions about equilibria, maximizing a quantity that E. T. Jaynes called "caliber" over all the possible microtrajectories leads to these dynamical laws. The principle of maximum caliber also leads to dynamical distribution functions that characterize the relative probabilities of different microtrajectories. A great source of recent interest in statistical dynamics has resulted from a new generation of single-particle and single-molecule experiments that make it possible to observe dynamics one trajectory at a time.
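Schematically, the caliber construction parallels ordinary MaxEnt with trajectories in place of microstates (notation assumed here): maximizing the path entropy subject to constrained path averages A_i yields an exponential distribution over microtrajectories Γ.

    % Maximum caliber, schematic: maximize the path entropy ("caliber")
    % over microtrajectories Gamma subject to constrained path averages.
    \mathcal{C} = -\sum_{\Gamma} p_{\Gamma} \ln p_{\Gamma}
      - \sum_i \lambda_i \Big( \sum_{\Gamma} p_{\Gamma}\, A_i(\Gamma) - \langle A_i \rangle \Big)
    \;\;\Longrightarrow\;\;
    p_{\Gamma} = \frac{1}{Z} \exp\!\Big( -\sum_i \lambda_i A_i(\Gamma) \Big) .

The resulting p_Γ is the dynamical distribution function over microtrajectories from which the transport laws named in the abstract follow.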
NASA Astrophysics Data System (ADS)
Michlmayr, Gernot; Cohen, Denis; Or, Dani
2012-05-01
The formation of cracks and the emergence of shearing planes and other modes of rapid macroscopic failure in geologic granular media involve numerous grain-scale mechanical interactions, often generating high-frequency (kHz) elastic waves referred to as acoustic emissions (AE). These acoustic signals have been used primarily for monitoring and characterizing fatigue and progressive failure in engineered systems, with only a few applications concerning geologic granular media reported in the literature. Similar to the monitoring of seismic events preceding an earthquake, AE may offer a means for non-invasive, in-situ assessment of mechanical precursors associated with imminent landslides or other types of rapid mass movements (debris flows, rock falls, snow avalanches, glacier stick-slip events). Despite diverse applications and potential usefulness, a systematic description of the AE method and its relevance to mechanical processes in Earth sciences is lacking. This review is aimed at providing a sound foundation for linking observed AE with various micro-mechanical failure events in geologic granular materials, not only for monitoring of triggering events preceding mass mobilization, but also as a non-invasive tool in its own right for probing the rich spectrum of mechanical processes at scales ranging from a single grain to a hillslope. We first review studies reporting use of AE for monitoring of failure in various geologic materials, and describe AE-generating source mechanisms in mechanically stressed geologic media (e.g., frictional sliding, micro-cracking, particle collisions, rupture of water bridges), including AE statistical features such as frequency content and occurrence probabilities. We summarize available AE sensors and measurement principles. The high sampling rates of advanced AE systems enable detection of numerous discrete failure events within a volume and thus provide access to statistical descriptions of progressive collapse of systems with many interacting mechanical elements, such as the fiber bundle model (FBM). We highlight intrinsic links between AE characteristics and established statistical models often used in structural engineering and material sciences, and outline potential applications for failure prediction and early warning using the AE method in combination with the FBM. The biggest challenge to application of the AE method in the field is strong signal attenuation. We provide an outlook for overcoming such limitations, considering the emergence of a class of fiber-optic based distributed AE sensors and the deployment of acoustic waveguides as part of monitoring networks.
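To illustrate the FBM connection the review highlights, here is a minimal equal-load-sharing fiber bundle simulation in Python. It is our sketch under standard textbook assumptions (uniform strength thresholds, quasi-static loading); each failure avalanche is the loose analogue of one AE event.

    import numpy as np

    def els_fiber_bundle(n=10000, seed=2):
        # Equal-load-sharing bundle: n fibers with random strength
        # thresholds share the applied load equally. Raising the load
        # just enough to break the weakest surviving fiber can trigger
        # an avalanche of further failures as the load redistributes.
        rng = np.random.default_rng(seed)
        thresholds = np.sort(rng.uniform(0, 1, n))
        avalanches, broken = [], 0
        while broken < n:
            start = broken
            load = thresholds[broken] * (n - broken)  # total applied load
            # at fixed load, fibers fail while stress exceeds strength
            while broken < n and thresholds[broken] * (n - broken) <= load:
                broken += 1
            avalanches.append(broken - start)
        return np.array(avalanches)

    sizes = els_fiber_bundle()
    # heavy-tailed avalanche-size statistics, as in AE event catalogs
    vals, counts = np.unique(sizes, return_counts=True)
    print(list(zip(vals[:5], counts[:5])))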
NASA Astrophysics Data System (ADS)
Li, Ziyi
2017-12-01
Generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is the modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, the theoretical results suggest that there should be a "minimum length of observation", which is about the Planck scale (10^{-35} m). Taking this basic scale of existence into account, we need to formulate a new common form of Heisenberg's uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics theories, but the present theory of the femtosecond laser is still established on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy, and pulse time of the femtosecond laser in our work. We also designed three typical systems from micro to macro size to estimate the feasibility of our theoretical model and method, respectively in the chemical solution condition, crystal lattice condition, and nuclear fission reactor condition.
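For reference, the most commonly used one-parameter form of the GUP reads as follows (a standard form from the quantum-gravity literature; the abstract does not specify which variant the authors adopt):

    % One-parameter GUP: the quadratic momentum correction implies a
    % smallest resolvable length of order the Planck length.
    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\Big( 1 + \beta\, (\Delta p)^2 \Big),
    \qquad
    \Delta x_{\min} = \hbar \sqrt{\beta} ,

where β > 0 is set by the minimal-length scale, so that Δx_min is of the order of the 10^{-35} m Planck length quoted above.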
Baucom, Brian R W; Leo, Karena; Adamo, Colin; Georgiou, Panayiotis; Baucom, Katherine J W
2017-12-01
Observational behavioral coding methods are widely used for the study of relational phenomena. There are numerous guidelines for the development and implementation of these methods that include principles for creating new and adapting existing coding systems as well as principles for creating coding teams. While these principles have been successfully implemented in research on relational phenomena, the ever expanding array of phenomena being investigated with observational methods calls for a similar expansion of these principles. Specifically, guidelines are needed for decisions that arise in current areas of emphasis in couple research including observational investigation of related outcomes (e.g., relationship distress and psychological symptoms), the study of change in behavior over time, and the study of group similarities and differences in the enactment and perception of behavior. This article describes conceptual and statistical considerations involved in these 3 areas of research and presents principle- and empirically based rationale for design decisions related to these issues. A unifying principle underlying these guidelines is the need for careful consideration of fit between theory, research questions, selection of coding systems, and creation of coding teams. Implications of (mis)fit for the advancement of theory are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Kuhn's Paradigm and Example-Based Teaching of Newtonian Mechanics.
ERIC Educational Resources Information Center
Whitaker, M. A. B.
1980-01-01
Makes a recommendation for more direct teaching of the basic principles of mechanics. Contends that students currently learn mechanics in terms of standard examples. This causes difficulty when the student is confronted with a problem that can be solved from basic principles, but which does not fit a standard category. (GS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonçalves, L.A.; Olavo, L.S.F., E-mail: olavolsf@gmail.com
Dissipation in Quantum Mechanics took some time to become a robust field of investigation after the birth of the field. The main issue hindering developments in the field is that the quantization process was always tightly connected to the Hamiltonian formulation of Classical Mechanics. In this paper we present a quantization process that does not depend upon the Hamiltonian formulation of Classical Mechanics (although it still departs from Classical Mechanics) and thus overcome the problem of finding, from first principles, a completely general Schrödinger equation encompassing dissipation. This generalized process of quantization is shown to be nothing but an extension of a more restricted version that is shown to produce the Schrödinger equation for Hamiltonian systems from first principles (even for Hamiltonian velocity-dependent potentials). - Highlights: • A quantization process independent of the Hamiltonian formulation of Classical Mechanics is proposed. • This quantization method is applied to dissipative or absorptive systems. • A dissipative Schrödinger equation is derived from first principles.
Theory of atomic spectral emission intensity
NASA Astrophysics Data System (ADS)
Yngström, Sten
1994-07-01
The theoretical derivation of a new spectral line intensity formula for atomic radiative emission is presented. The theory is based on first principles of quantum physics, electrodynamics, and statistical physics. Quantum rules lead to revision of the conventional principle of local thermal equilibrium of matter and radiation. Study of electrodynamics suggests absence of spectral emission from fractions of the numbers of atoms and ions in a plasma due to radiative inhibition caused by electromagnetic force fields. Statistical probability methods are extended by the statement: A macroscopic physical system develops in the most probable of all conceivable ways consistent with the constraining conditions for the system. The crucial role of statistical physics in transforming quantum logic into common sense logic is stressed. The theory is strongly supported by experimental evidence.
Better with Byzantine: Manipulation-Optimal Mechanisms
NASA Astrophysics Data System (ADS)
Othman, Abraham; Sandholm, Tuomas
A mechanism is manipulable if it is in some agents' best interest to misrepresent their private information. The revelation principle establishes that, roughly, anything that can be accomplished by a manipulable mechanism can also be accomplished with a truthful mechanism. Yet agents often fail to play their optimal manipulations due to computational limitations or various flavors of incompetence and cognitive biases. Thus, manipulable mechanisms in particular should anticipate byzantine play. We study manipulation-optimal mechanisms: mechanisms that are undominated by truthful mechanisms when agents act fully rationally, and do better than any truthful mechanism if any agent fails to act rationally in any way. This enables the mechanism designer to do better than the revelation principle would suggest, and obviates the need to predict byzantine agents' irrational behavior. We prove a host of possibility and impossibility results for the concept, which give the impression of broadly limiting possibility. These results are largely in line with the revelation principle, although the considerations are more subtle and the impossibility not universal.
NASA Astrophysics Data System (ADS)
Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos
2014-05-01
A phenomenon is considered "complex" when it involves a system whose phenomenological laws, which describe its global behavior, are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those of economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that all three dynamical systems' observables can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event related to each one of these different systems present striking quantitative similarities. It is also demonstrated that, for the considered systems, the nonextensive parameter q increases as the extreme event approaches, which indicates that the strength of the long-memory / long-range interactions between the constituents of the system increases, characterizing the dynamics of the system.
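The nonextensive framework referred to here is built on the Tsallis entropy (a standard definition we add for reference; the reading of q as interaction strength follows the abstract):

    % Nonextensive (Tsallis) entropy: the entropic index q measures the
    % departure from Boltzmann-Gibbs (BG) statistics; stronger long-range
    % correlations push q above 1.
    S_q = k \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
    \qquad
    \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i \;\; \text{(BG entropy)} .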
Calculations of the surface tensions of liquid metals
NASA Technical Reports Server (NTRS)
Stroud, D. G.
1981-01-01
The understanding of the surface tension of liquid metals and alloys from as close to first principles as possible is discussed. The two ingredients which are combined in these calculations are: the electron theory of metals, and the classical theory of liquids, as worked out within the framework of statistical mechanics. The result is a new theory of surface tensions and surface density profiles purely from knowledge of the bulk properties of the coexisting liquid and vapor phases. It is found that the method works well for the pure liquid metals on which it was tested; the work is extended to mixtures of liquid metals, interfaces between immiscible liquid metals, and to the temperature derivative of the surface tension.
The accuracy of the ATLAS muon X-ray tomograph
NASA Astrophysics Data System (ADS)
Avramidou, R.; Berbiers, J.; Boudineau, C.; Dechelette, C.; Drakoulakos, D.; Fabjan, C.; Grau, S.; Gschwendtner, E.; Maugain, J.-M.; Rieder, H.; Rangod, S.; Rohrbach, F.; Sbrissa, E.; Sedykh, E.; Sedykh, I.; Smirnov, Y.; Vertogradov, L.; Vichou, I.
2003-01-01
A gigantic detector, ATLAS, is under construction at CERN for particle physics research at the Large Hadron Collider, which is to be ready by 2006. An X-ray tomograph has been designed and constructed at CERN in order to control the mechanical quality of the ATLAS muon chambers. We reached measurement accuracies of 2 μm (systematic) and 2 μm (statistical) in the horizontal and vertical directions over a working area of 220 cm (horizontal) × 60 cm (vertical). Here we describe in detail the basic principle chosen to achieve such good accuracy. In order to cross-check our precision, key measurement results are presented.
Zipf's law holds for phrases, not words.
Williams, Jake Ryland; Lessard, Paul R; Desu, Suma; Clark, Eric M; Bagrow, James P; Danforth, Christopher M; Dodds, Peter Sheridan
2015-08-11
With Zipf's law being originally and most famously observed for word frequency, it is surprisingly limited in its applicability to human language, holding over no more than three to four orders of magnitude before hitting a clear break in scaling. Here, building on the simple observation that phrases of one or more words comprise the most coherent units of meaning in language, we show empirically that Zipf's law for phrases extends over as many as nine orders of rank magnitude. In doing so, we develop a principled and scalable statistical mechanical method of random text partitioning, which opens up a rich frontier of rigorous text analysis via a rank ordering of mixed length phrases.
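For readers who want to reproduce the flavor of this result, here is a minimal sketch that computes a rank-frequency distribution for fixed-length phrases and estimates the Zipf exponent. Fixed n-grams stand in for the paper's random text partitioning, and `corpus.txt` is a placeholder for any large plain-text corpus.

```python
import re
from collections import Counter
import numpy as np

def rank_frequency(text, n=2):
    """Count n-gram 'phrases' and return (rank, frequency) arrays.
    A stand-in for the paper's random text partitioning, which draws
    phrase boundaries stochastically rather than using fixed n-grams."""
    words = re.findall(r"[a-z']+", text.lower())
    phrases = zip(*(words[i:] for i in range(n)))
    counts = np.array(sorted(Counter(phrases).values(), reverse=True))
    ranks = np.arange(1, counts.size + 1)
    return ranks, counts

text = open("corpus.txt").read()        # placeholder: any large corpus
ranks, counts = rank_frequency(text, n=2)
# Zipf's law predicts log(freq) ~ -alpha * log(rank); estimate alpha:
alpha = -np.polyfit(np.log(ranks), np.log(counts), 1)[0]
print(f"estimated Zipf exponent alpha = {alpha:.2f}")
```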
Applications of statistics to medical science, IV: survival analysis.
Watanabe, Hiroshi
2012-01-01
The fundamental principles of survival analysis are reviewed. In particular, the Kaplan-Meier method and a proportional hazard model are discussed. This work is the last part of a series in which medical statistics are surveyed.
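Since the abstract centers on the Kaplan-Meier method, a compact sketch of the product-limit estimator follows. The follow-up times are hypothetical and the implementation is a bare-bones illustration, not a substitute for a survival-analysis library.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    `events` is 1 for an observed event (e.g. death), 0 for censoring."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    surv = 1.0
    steps = []
    for t in np.unique(times[events == 1]):       # distinct event times
        d = np.sum((times == t) & (events == 1))  # events at time t
        n = np.sum(times >= t)                    # subjects still at risk
        surv *= 1.0 - d / n                       # product-limit update
        steps.append((float(t), surv))
    return steps

# Hypothetical follow-up times (months) with censoring:
print(kaplan_meier([5, 8, 8, 12, 16, 23], [1, 1, 0, 1, 0, 1]))
```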
42 CFR 417.806 - Financial records, statistical data, and cost finding.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 3 2011-10-01 2011-10-01 false Financial records, statistical data, and cost finding. 417.806 Section 417.806 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF..., statistical data, and cost finding. (a) The principles specified in § 417.568 apply to HCPPs, except those in...
42 CFR 417.806 - Financial records, statistical data, and cost finding.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 3 2010-10-01 2010-10-01 false Financial records, statistical data, and cost finding. 417.806 Section 417.806 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF..., statistical data, and cost finding. (a) The principles specified in § 417.568 apply to HCPPs, except those in...
Apparatus for Teaching Physics.
ERIC Educational Resources Information Center
Minnix, Richard B.; Carpenter, D. Rae, Jr., Eds.
1982-01-01
Thirteen demonstrations using a capacitor-start induction motor fitted with an aluminum disk are described. Demonstrations illustrate principles from mechanics, fluids (Bernoulli's principle), waves (chladni patterns and doppler effect), magnetism, electricity, and light (mechanical color mixing). In addition, the instrument can measure friction…
A Theory of Density Layering in Stratified Turbulence using Statistical State Dynamics
NASA Astrophysics Data System (ADS)
Fitzgerald, J.; Farrell, B.
2016-12-01
Stably stratified turbulent fluids commonly develop density structures that are layered in the vertical direction (e.g., Manucharyan et al., 2015). Within layers, density is approximately constant and stratification is weak. Between layers, density varies rapidly and stratification is strong. A common explanation for the existence of layers invokes the negative diffusion mechanism of Phillips (1972) & Posmentier (1977). The physical principle underlying this mechanism is that the flux-gradient relationship connecting the turbulent fluxes of buoyancy to the background stratification must have the special property of weakening fluxes with strengthening gradient. Under these conditions, the evolution of the stratification is governed by a negative diffusion problem which gives rise to spontaneous layer formation. In previous work on stratified layering, this flux-gradient property is often assumed (e.g., Posmentier, 1977) or drawn from phenomenological models of turbulence (e.g., Balmforth et al., 1998). In this work we develop the theoretical underpinnings of layer formation by applying stochastic turbulence modeling and statistical state dynamics (SSD) to predict the flux-gradient relation and analyze layer formation directly from the equations of motion. We show that for stochastically-forced homogeneous 2D Boussinesq turbulence, the flux-gradient relation can be obtained analytically and indicates that the fluxes always strengthen with stratification. The Phillips mechanism thus does not operate in this maximally simplified scenario. However, when the problem is augmented to include a large-scale background shear, we show that the flux-gradient relationship is modified so that the fluxes weaken with stratification. Sheared and stratified 2D Boussinesq turbulence thus spontaneously forms density layers through the Phillips mechanism. Using SSD (Farrell & Ioannou 2003), we obtain a closed, deterministic dynamics for the stratification and the statistical turbulent state. We show that density layers form as a linear instability of the sheared turbulence, associated with a supercritical bifurcation. We further show that SSD predicts the nonlinear equilibration and maintenance of the layers, and captures the phenomena of layer growth and mergers (Radko, 2007).
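The negative-diffusion idea can be seen in a toy computation. Below is a cartoon of the Phillips mechanism (an illustrative construction, not the paper's SSD closure): a flux-gradient law that weakens beyond a critical gradient lets seeded perturbations steepen into steps.

```python
import numpy as np

# Cartoon of the Phillips negative-diffusion mechanism (an illustrative
# construction, not the SSD closure of the abstract). Buoyancy b obeys
# db/dt = dF/dz with a flux-gradient law F(g), g = db/dz, that weakens
# once g exceeds 1, so the effective diffusivity F'(g) is negative there
# and small perturbations can steepen into steps (layers).
nz, dz, dt, nsteps = 200, 1.0, 0.2, 4000
rng = np.random.default_rng(0)
b = 1.2 * np.arange(nz + 1) * dz            # mean gradient 1.2: unstable
b += 0.05 * rng.standard_normal(nz + 1)     # seed perturbations

flux = lambda g: g / (1.0 + g**2)           # peaks at g = 1, then weakens

for _ in range(nsteps):
    g = np.diff(b) / dz                     # gradients at cell interfaces
    F = flux(g)
    b[1:-1] += dt * np.diff(F) / dz         # conservative interior update

g = np.diff(b) / dz
print(f"gradients now span {g.min():.2f} to {g.max():.2f} (step formation)")
```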
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Malandraki, O.; Khabarova, O.; Livadiotis, G.; Pavlos, E.; Karakatsanis, L. P.; Iliopoulos, A. C.; Parisis, K.
2017-12-01
In this work we study the non-extensivity of solar wind plasma using electric and magnetic field data obtained by in situ spacecraft observations at different dynamical states of the solar wind system, particularly in interplanetary coronal mass ejections (ICMEs), interplanetary shocks, magnetic islands, and near the Earth's bow shock. In particular, we study the energetic-particle non-extensive fractional acceleration mechanism producing kappa distributions, as well as the intermittent turbulence mechanism producing multifractal structures, both related to the Tsallis q-entropy principle. We present new and significant results concerning the dynamics of ICMEs observed at L1 in the near-Earth solar wind environment, their effect on Earth's magnetosphere, and magnetic islands. In-situ measurements of energetic particles at L1 are analyzed in response to major solar eruptive events at the Sun (intense flares, fast CMEs). The statistical characteristics are obtained and compared for the solar energetic particles (SEPs) originating at the Sun, the energetic particle enhancements associated with local acceleration during the CME-driven shock passage over the spacecraft (energetic storm particle events, ESPs), and the energetic particle signatures observed during the passage of the ICME. The results are interpreted in terms of Tsallis non-extensive statistics, in particular the estimation of the Tsallis q-triplet (qstat, qsen, qrel) of electric and magnetic field time series and the kappa distributions of solar energetic particles for the ICME and magnetic islands resulting from solar eruptive activity or internal solar wind dynamics. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the magnetic field dynamics in both the time and space domains during the shock event, in terms of rate of entropy production, relaxation dynamics and non-equilibrium metastable stationary states.
Modern Empirical Statistical Spectral Analysis.
1980-05-01
Akaike, H. (1977). On entropy maximization principle. In P. R. Krishnaiah (Ed.), Applications of Statistics (pp. 27-41). Amsterdam: North-Holland. Parzen, E. (1979). Forecasting and whitening filter estimation. TIMS Studies in the Management Sciences.
Code of Federal Regulations, 2010 CFR
2010-01-01
... relating to an account taken in connection with inactivity, default, or delinquency as to that account... with the creditor's business judgment); (iii) Developed and validated using accepted statistical principles and methodology; and (iv) Periodically revalidated by the use of appropriate statistical...
A solution to the biodiversity paradox by logical deterministic cellular automata.
Kalmykov, Lev V; Kalmykov, Vyacheslav L
2015-06-01
The paradox of biological diversity is the key problem of theoretical ecology. The paradox consists in the contradiction between the competitive exclusion principle and the observed biodiversity. The principle is important as the basis for ecological theory. Using a relatively simple model, we show a mechanism of indefinite coexistence of complete competitors that violates the known formulations of the competitive exclusion principle. This mechanism is based on timely recovery of limiting resources and their spatio-temporal allocation between competitors. Because of the limitations of black-box modeling, it has been difficult to formulate the exclusion principle correctly. Our white-box multiscale model of two-species competition is based on logical deterministic individual-based cellular automata. This approach provides automatic deductive inference on the basis of a system of axioms, and gives direct insight into the mechanisms of the studied system. It is one of the most promising methods of artificial intelligence. We reformulate and generalize the competitive exclusion principle and explain why this formulation provides a solution of the biodiversity paradox. In addition, we propose a principle of competitive coexistence.
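To give a flavor of the modeling approach, the sketch below runs a toy two-species lattice competition. Note that it is a stochastic simplification written for brevity, whereas the authors' automata are strictly deterministic and individual-based; all states and rates here are invented.

```python
import numpy as np

# Toy two-species lattice competition, a stochastic simplification of the
# paper's *deterministic* logical cellular automata. Cell states:
# 0 = free resource, 1/2 = competing species, 3 = resource regenerating.
rng = np.random.default_rng(7)
n, steps, p_die = 64, 300, 0.15
grid = rng.choice([0, 1, 2], size=(n, n))

for _ in range(steps):
    # pick one random von Neumann neighbour per cell (periodic lattice)
    shifts = [np.roll(grid, s, axis=a) for s in (-1, 1) for a in (0, 1)]
    nbrs = np.stack(shifts)                            # shape (4, n, n)
    k = rng.integers(0, 4, size=(n, n))
    pick = np.take_along_axis(nbrs, k[None], axis=0)[0]

    new = grid.copy()
    new[grid == 3] = 0                                 # resource recovers
    colonize = (grid == 0) & ((pick == 1) | (pick == 2))
    new[colonize] = pick[colonize]                     # propagation
    die = ((grid == 1) | (grid == 2)) & (rng.random((n, n)) < p_die)
    new[die] = 3                                       # death ties up resource
    grid = new

print("species 1:", int((grid == 1).sum()),
      " species 2:", int((grid == 2).sum()))
```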
On the thermodynamics of phase transitions in metal hydrides
NASA Astrophysics Data System (ADS)
di Vita, Andrea
2012-02-01
Metal hydrides are solutions of hydrogen in a metal, where phase transitions may occur depending on temperature, pressure, etc. We apply Le Chatelier's principle of thermodynamics to a particular phase transition in TiH_x, which can approximately be described as a second-order phase transition. We show that the fluctuations of the order parameter correspond to fluctuations both of the density of H+ ions and of the distance between adjacent H+ ions. Moreover, as the system approaches the transition and the correlation radius increases, we show, with the help of statistical mechanics, that the statistical weight of modes involving a large number of H+ ions ('collective modes') increases sharply, in spite of the fact that the Boltzmann factor of each collective mode is exponentially small. As a result, the interaction of the H+ ions with collective modes makes a tiny suprathermal fraction of the H+ population appear. Our results hold for similar transitions in metal deuterides, too. A violation of a hitherto undisputed upper bound on hydrogen loading follows.
NASA Astrophysics Data System (ADS)
Barnea, A. Ronny; Cheshnovsky, Ori; Even, Uzi
2018-02-01
Interference experiments have been paramount in our understanding of quantum mechanics and are frequently the basis of testing the superposition principle in the framework of quantum theory. In recent years, several studies have challenged the nature of wave-function interference from the perspective of Born's rule, namely, the manifestation of so-called high-order interference terms in a superposition generated by diffraction of the wave functions. Here we present an experimental test of multipath interference in the diffraction of metastable helium atoms, with large-number counting statistics, comparable to photon-based experiments. We use a variation of the original triple-slit experiment and accurate single-event counting techniques to provide a new experimental bound of 2.9 × 10^-5 on the statistical deviation from the commonly approximated null third-order interference term in Born's rule for matter waves. Our value is on the order of the maximal contribution predicted for multipath trajectories by Feynman path integrals.
Statistical Physics of T-Cell Development and Pathogen Specificity
NASA Astrophysics Data System (ADS)
Košmrlj, Andrej; Kardar, Mehran; Chakraborty, Arup K.
2013-04-01
In addition to an innate immune system that battles pathogens in a nonspecific fashion, higher organisms, such as humans, possess an adaptive immune system to combat diverse (and evolving) microbial pathogens. Remarkably, the adaptive immune system mounts pathogen-specific responses, which can be recalled upon reinfection with the same pathogen. It is difficult to see how the adaptive immune system can be preprogrammed to respond specifically to a vast and unknown set of pathogens. Although major advances have been made in understanding pertinent molecular and cellular phenomena, the precise principles that govern many aspects of an immune response are largely unknown. We discuss complementary approaches from statistical mechanics and cell biology that can shed light on how key components of the adaptive immune system, T cells, develop to enable pathogen-specific responses against many diverse pathogens. The mechanistic understanding that emerges has implications for how host genetics may influence the development of T cells with differing responses to the human immunodeficiency virus (HIV) infection.
Quantum Biometrics with Retinal Photon Counting
NASA Astrophysics Data System (ADS)
Loulakis, M.; Blatsios, G.; Vrettou, C. S.; Kominis, I. K.
2017-10-01
It is known that the eye's scotopic photodetectors, rhodopsin molecules, and their associated phototransduction mechanism leading to light perception, are efficient single-photon counters. We here use the photon-counting principles of human rod vision to propose a secure quantum biometric identification based on the quantum-statistical properties of retinal photon detection. The photon path along the human eye until its detection by rod cells is modeled as a filter having a specific transmission coefficient. Precisely determining its value from the photodetection statistics registered by the conscious observer is a quantum parameter estimation problem that leads to a quantum secure identification method. The probabilities for false-positive and false-negative identification of this biometric technique can readily approach 10^-10 and 10^-4, respectively. The security of the biometric method can be further quantified by the physics of quantum measurements. An impostor must be able to perform quantum thermometry and quantum magnetometry with energy resolution better than 10^-9 ℏ, in order to foil the device by noninvasively monitoring the biometric activity of a user.
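The counting logic behind error probabilities of this order can be illustrated with elementary Poisson and binomial statistics. All numbers below (photons per flash, perception threshold, transmission coefficients, flash count) are illustrative assumptions, not the paper's parameters.

```python
import numpy as np
from scipy.stats import binom, poisson

# Schematic of the counting idea: each dim flash delivers a Poisson number
# of photons; a flash is "seen" if at least K photons survive the eye's
# filter (transmission alpha) and reach the rods. Identification compares
# the observed number of seen flashes against the user's known alpha.
mu, K, N = 30.0, 6, 200                    # photons/flash, threshold, flashes

def p_seen(alpha):
    return 1.0 - poisson.cdf(K - 1, alpha * mu)

alpha_user, alpha_impostor = 0.20, 0.10    # hypothetical filter values
p_u, p_i = p_seen(alpha_user), p_seen(alpha_impostor)

# accept if the seen-count >= c; place c between the two binomial means
c = int(N * (p_u + p_i) / 2)
false_neg = binom.cdf(c - 1, N, p_u)        # genuine user rejected
false_pos = 1.0 - binom.cdf(c - 1, N, p_i)  # impostor accepted
print(f"p_seen(user)={p_u:.3f}, p_seen(impostor)={p_i:.3f}")
print(f"false-negative={false_neg:.2e}, false-positive={false_pos:.2e}")
```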
26 CFR 1.482-1 - Allocation of income and deductions among taxpayers.
Code of Federal Regulations, 2010 CFR
2010-04-01
... section sets forth general principles and guidelines to be followed under section 482. Section 1.482-2... practices, economic principles, or statistical analyses. The extent and reliability of any adjustments will..., extraction, and assembly; (E) Purchasing and materials management; (F) Marketing and distribution functions...
Principles of time evolution in classical physics
NASA Astrophysics Data System (ADS)
Güémez, J.; Fiolhais, M.
2018-07-01
We address principles of time evolution in classical mechanical/thermodynamical systems in translational and rotational motion, in three cases: when there is conservation of mechanical energy, when there is energy dissipation and when there is mechanical energy production. In the first case, the time derivative of the Hamiltonian vanishes. In the second one, when dissipative forces are present, the time evolution is governed by the minimum potential energy principle, or, equivalently, maximum increase of the entropy of the universe. Finally, in the third situation, when internal sources of work are available to the system, it evolves in time according to the principle of minimum Gibbs function. We apply the Lagrangian formulation to the systems, dealing with the non-conservative forces using restriction functions such as the Rayleigh dissipative function.
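A standard way to make the dissipative case concrete (presumably what the abstract's "restriction functions" implement) is the Rayleigh dissipation function appended to the Euler-Lagrange equations:

```latex
\frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot q_i}\right)
  - \frac{\partial L}{\partial q_i}
  = -\,\frac{\partial \mathcal{F}}{\partial \dot q_i},
\qquad
\mathcal{F} = \tfrac{1}{2}\sum_j c_j\,\dot q_j^{\,2},
\qquad
\frac{dE_{\mathrm{mech}}}{dt} = -\,2\mathcal{F} \le 0 .
```

The last relation, valid when the dissipation function is quadratic in the velocities, ties the mechanical energy loss directly to the entropy-increase statement in the abstract.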
Mathematical models of behavior of individual animals.
Tsibulsky, Vladimir L; Norman, Andrew B
2007-01-01
This review focuses on mathematical modeling of the behavior of a whole organism, with special emphasis on models that take a clearly scientific approach and help to understand the mechanisms underlying behavior. The aim is to provide an overview of old and contemporary mathematical models without complex mathematical details. Only deterministic and stochastic, but not statistical, models are reviewed. All mathematical models of behavior can be divided into two main classes. First, models based on the principle of teleological determinism assume that subjects choose the behavior that will lead them to a better payoff in the future. Examples are game theories and operant behavior models, both of which are based on the matching law. The second class of models is based on the principle of causal determinism, which assumes that subjects do not choose from a set of possibilities but rather are compelled to perform a predetermined behavior in response to specific stimuli. Examples are perception and discrimination models, drug effects models and individual-based population models. A brief overview of the utility of each mathematical model is provided in each section.
Maximum entropy production in environmental and ecological systems.
Kleidon, Axel; Malhi, Yadvinder; Cox, Peter M
2010-05-12
The coupled biosphere-atmosphere system entails a vast range of processes at different scales, from ecosystem exchange fluxes of energy, water and carbon to the processes that drive global biogeochemical cycles, atmospheric composition and, ultimately, the planetary energy balance. These processes are generally complex with numerous interactions and feedbacks, and they are irreversible in their nature, thereby producing entropy. The proposed principle of maximum entropy production (MEP), based on statistical mechanics and information theory, states that thermodynamic processes far from thermodynamic equilibrium will adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate. This issue focuses on the latest development of applications of MEP to the biosphere-atmosphere system including aspects of the atmospheric circulation, the role of clouds, hydrology, vegetation effects, ecosystem exchange of energy and mass, biogeochemical interactions and the Gaia hypothesis. The examples shown in this special issue demonstrate the potential of MEP to contribute to improved understanding and modelling of the biosphere and the wider Earth system, and also explore limitations and constraints to the application of the MEP principle.
NASA Astrophysics Data System (ADS)
Delle Site, Luigi
2018-01-01
A theoretical scheme for the treatment of an open molecular system with electrons and nuclei is proposed. The idea is based on the Grand Canonical description of a quantum region embedded in a classical reservoir of molecules. Electronic properties of the quantum region are calculated at constant electronic chemical potential, equal to that of the corresponding (large) bulk system treated at the full quantum level. Instead, the exchange of molecules between the quantum region and the classical environment occurs at the chemical potential of the macroscopic thermodynamic conditions. The Grand Canonical Adaptive Resolution Scheme is proposed for the treatment of the classical environment; this approach can treat the exchange of molecules according to the first principles of statistical mechanics and thermodynamics. The overall scheme is built on the basis of physical consistency, with a corresponding definition of numerical criteria to control the approximations implied by the coupling. Given the wide range of expertise required, this work is intended to provide guiding principles for the construction of a well-founded computational protocol for actual multiscale simulations from the electronic to the mesoscopic scale.
NASA Astrophysics Data System (ADS)
Belof, Jonathan; Orlikowski, Daniel; Wu, Christine; McLaughlin, Keith
2013-06-01
Shock and ramp compression experiments are allowing us to probe condensed matter under extreme conditions where phase transitions and other non-equilibrium aspects can now be directly observed, but first principles simulation of kinetics remains a challenge. A multi-scale approach is presented here, with non-equilibrium statistical mechanical quantities calculated by molecular dynamics (MD) and then leveraged to inform a classical nucleation and growth kinetics model at the hydrodynamic scale. Of central interest is the free energy barrier for the formation of a critical nucleus, with direct NEMD presenting the challenge of the relatively long timescales necessary to resolve nucleation. Rather than attempt to resolve the time-dependent nucleation sequence directly, the methodology derived here builds upon the non-equilibrium work theorem in order to bias the formation of a critical nucleus and thus construct the nucleation and growth rates. Having determined these kinetic terms from MD, a hydrodynamics implementation of Kolmogorov-Johnson-Mehl-Avrami (KJMA) kinetics and metastability is applied to the dynamic compressive freezing of water and compared with recent ramp compression experiments [Dolan et al., Nature (2007)]. Lawrence Livermore National Laboratory is operated by Lawrence Livermore National Security, LLC, for the U.S. Department of Energy, National Nuclear Security Administration under Contract DE-AC52-07NA27344.
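The hydrodynamic-scale ingredient, KJMA transformation kinetics, reduces in the textbook case of constant nucleation rate and interface speed to a one-line formula. In the sketch below, J and G are hand-set toy values; in the abstract's multiscale scheme they would instead be supplied by the MD calculations.

```python
import numpy as np

# Minimal KJMA (Avrami) kinetics sketch: transformed fraction under a
# constant nucleation rate J and interface growth speed G in 3D. The
# extended-volume construction gives X(t) = 1 - exp(-(pi/3) J G^3 t^4).
J, G = 1e-3, 0.5          # nucleation rate and growth speed (toy units)
t = np.linspace(0.0, 10.0, 6)
X = 1.0 - np.exp(-(np.pi / 3.0) * J * G**3 * t**4)
for ti, xi in zip(t, X):
    print(f"t = {ti:5.1f}   transformed fraction X = {xi:.3f}")
```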
ERIC Educational Resources Information Center
Kravchuk, Olena; Elliott, Antony; Bhandari, Bhesh
2005-01-01
A simple laboratory experiment, based on the Maillard reaction, served as a project in Introductory Statistics for undergraduates in Food Science and Technology. By using the principles of randomization and replication and reflecting on the sources of variation in the experimental data, students reinforced the statistical concepts and techniques…
Connectopic mapping with resting-state fMRI.
Haak, Koen V; Marquand, Andre F; Beckmann, Christian F
2018-04-15
Brain regions are often topographically connected: nearby locations within one brain area connect with nearby locations in another area. Mapping these connection topographies, or 'connectopies' in short, is crucial for understanding how information is processed in the brain. Here, we propose principled, fully data-driven methods for mapping connectopies using functional magnetic resonance imaging (fMRI) data acquired at rest by combining spectral embedding of voxel-wise connectivity 'fingerprints' with a novel approach to spatial statistical inference. We apply the approach in human primary motor and visual cortex, and show that it can trace biologically plausible, overlapping connectopies in individual subjects that follow these regions' somatotopic and retinotopic maps. As a generic mechanism to perform inference over connectopies, the new spatial statistics approach enables rigorous statistical testing of hypotheses regarding the fine-grained spatial profile of functional connectivity and whether that profile is different between subjects or between experimental conditions. The combined framework offers a fundamental alternative to existing approaches to investigating functional connectivity in the brain, from voxel- or seed-pair wise characterizations of functional association, towards a full, multivariate characterization of spatial topography.
Applications of quantum entropy to statistics
NASA Astrophysics Data System (ADS)
Silver, R. N.; Martz, H. F.
This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods.
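The classical ME construction that the paper generalizes can be shown in a few lines: maximizing Shannon entropy under a mean constraint yields a Gibbs form whose Lagrange multiplier is fixed by the constraint. The target mean of 4.5 below is Jaynes' traditional loaded-die illustration, not a value from this paper.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum entropy on a die: find p maximizing Shannon entropy subject to
# a mean constraint. The solution is the Gibbs form p_k ~ exp(-lam * k);
# we solve for the multiplier lam that enforces <k> = target.
k = np.arange(1, 7)
target_mean = 4.5

def mean_given_lam(lam):
    w = np.exp(-lam * k)
    return (k * w).sum() / w.sum()

lam = brentq(lambda l: mean_given_lam(l) - target_mean, -5.0, 5.0)
p = np.exp(-lam * k); p /= p.sum()
print("lambda =", round(lam, 4), " p =", np.round(p, 4))
```

The same Lagrange-multiplier logic is what the abstract reinterprets when it reads hyperparameters as multipliers enforcing conservation constraints.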
Teaching the EPR Paradox at High School?
ERIC Educational Resources Information Center
Pospiech, Gesche
1999-01-01
Argues the importance of students at university and in the final years of high school gaining an appreciation of the principles of quantum mechanics. Presents the EPR gedanken experiment (thought experiment) as a method of teaching the principles of quantum mechanics. (Author/CCM)
Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...
The Virginia Department of Transportation's statistical specification for hydraulic cement concrete.
DOT National Transportation Integrated Search
1990-01-01
This report reviews some of the principles relating to the application of statistical concepts to be used in the quality assurance and acceptance testing of hydraulic cement concrete. The problems encountered in developing a workable system without a...
Realism in the Realized Popper's Experiment
NASA Astrophysics Data System (ADS)
Hunter, Geoffrey
2002-12-01
The realization of Karl Popper's EPR-like experiment by Shih and Kim (published 1999) produced the result that Popper hoped for: no ``action at a distance'' on one photon of an entangled pair when a measurement is made on the other photon. This experimental result is interpretable in local realistic terms: each photon has a definite position and transverse momentum most of the time; the position measurement on one photon (localization within a slit) disturbs the transverse momentum of that photon in a non-predictable way in accordance with the uncertainty principle; however, there is no effect on the other photon (the photon that is not in a slit): no action at a distance. The position measurement (localization within a slit) of the one photon destroys the coherence (entanglement) between the photons; i.e. decoherence occurs. This realistic (albeit retrodictive) interpretation of the Shih-Kim realization of what Popper called his ``crucial experiment'' is in accord with Bohr's original concept of the nature of the uncertainty principle as an inevitable effect of the disturbance of the measured system by the measuring apparatus. In this experiment the impact parameter of an incident photon with respect to the centerline of the slit is an uncontrollable parameter of each individual photon scattering event; this impact parameter varies for every incident photon, the variations being a statistical aspect of the beam of photons produced by the experimental arrangement. These experimental results are also in accord with the proposition of Einstein, Podolsky and Rosen's 1935 paper: that quantum mechanics provides only a statistical, physically incomplete theory of microscopic physical processes. The quantum mechanical description of the experiment does not describe or explain the individual photon scattering events that are actually observed; the angle by which an individual photon is scattered is not predictable, because the photon's impact parameter with respect to the centerline of the slit is not observable, and because the electromagnetic interaction between the photon and the matter forming the walls of the slit is not calculable.
Steenkamer, Betty; Baan, Caroline; Putters, Kim; van Oers, Hans; Drewes, Hanneke
2018-04-09
Purpose A range of strategies to improve pharmaceutical care has been implemented by population health management (PHM) initiatives. However, which strategies generate the desired outcomes is largely unknown. The purpose of this paper is to identify guiding principles underlying collaborative strategies to improve pharmaceutical care and the contextual factors and mechanisms through which these principles operate. Design/methodology/approach The evaluation was informed by a realist methodology examining the links between PHM strategies, their outcomes and the contexts and mechanisms by which these strategies operate. Guiding principles were identified by grouping context-specific strategies with specific outcomes. Findings In total, ten guiding principles were identified: create agreement and commitment based on a long-term vision; foster cooperation and representation at the board level; use layered governance structures; create awareness at all levels; enable interpersonal links at all levels; create learning environments; organize shared responsibility; adjust financial strategies to market contexts; organize mutual gains; and align regional agreements with national policies and regulations. Contextual factors such as shared savings influenced the effectiveness of the guiding principles. Mechanisms by which these guiding principles operate were, for instance, fostering trust and creating a shared sense of the problem. Practical implications The guiding principles highlight how collaboration can be stimulated to improve pharmaceutical care while taking into account local constraints and possibilities. The interdependency of these principles necessitates effectuating them together in order to realize the best possible improvements and outcomes. Originality/value This is the first study using a realist approach to understand the guiding principles underlying collaboration to improve pharmaceutical care.
Perez-Cruz, Angel; Stiharu, Ion; Dominguez-Gonzalez, Aurelio
2017-07-20
In recent years paper-based microfluidic systems have emerged as versatile tools for developing sensors in different areas. In this work, we report a novel physical sensing principle for the characterization of liquids using a paper-based hygro-mechanical system (PB-HMS). The PB-HMS is formed by the interaction of liquid droplets and paper-based mini-structures such as cantilever beams. The proposed principle takes advantage of the hygroscopic properties of paper to produce hygro-mechanical motion. The dynamic response of the PB-HMS reveals information about the tested liquid that can be applied to characterize certain properties of liquids. A method to characterize liquids by means of the proposed principle is introduced, and the experimental results show its feasibility. It is expected that the proposed principle may be applied to sense properties of liquids in applications where both disposability and portability are of extreme importance.
Mechanical-Kinetic Modeling of a Molecular Walker from a Modular Design Principle
NASA Astrophysics Data System (ADS)
Hou, Ruizheng; Loh, Iong Ying; Li, Hongrong; Wang, Zhisong
2017-02-01
Artificial molecular walkers beyond burnt-bridge designs are complex nanomachines that potentially replicate biological walkers in mechanisms and functionalities. Improving man-made walkers to the level of performance required for widespread applications remains difficult, largely because their biomimetic design principles involve entangled kinetic and mechanical effects that complicate the link between a walker's construction and its ultimate performance. Here, a synergic mechanical-kinetic model is developed for a recently reported DNA bipedal walker, which is based on a modular design principle potentially enabling many directional walkers driven by a length-switching engine. The model reproduces the experimental data of the walker and identifies its performance-limiting factors. The model also captures features common to the underlying design principle, including counterintuitive performance-construction relations that are explained by detailed balance, entropy production, and bias cancellation. While indicating a low directional fidelity for the present walker, the model suggests the possibility of improving the fidelity above 90% with a more powerful engine, which may be an improved version of the present engine or an entirely new engine motif, thanks to the flexible design principle. The model is readily adaptable to aid these experimental developments towards high-performance molecular walkers.
The Energetic Cost of Walking: A Comparison of Predictive Methods
Kramer, Patricia Ann; Sylvester, Adam D.
2011-01-01
Background The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is “best”, but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. Methodology/Principal Findings We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Conclusion Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. Although we used modern humans as our model organism, these results can be extended to other species. PMID:21731693
Physics of Bacterial Morphogenesis
Sun, Sean X.; Jiang, Hongyuan
2011-01-01
Summary: Bacterial cells utilize three-dimensional (3D) protein assemblies to perform important cellular functions such as growth, division, chemoreception, and motility. These assemblies are composed of mechanoproteins that can mechanically deform and exert force. Sometimes, small-nucleotide hydrolysis is coupled to mechanical deformations. In this review, we describe the general principle for an understanding of the coupling of mechanics with chemistry in mechanochemical systems. We apply this principle to understand bacterial cell shape and morphogenesis and how mechanical forces can influence peptidoglycan cell wall growth. We review a model that can potentially reconcile the growth dynamics of the cell wall with the role of cytoskeletal proteins such as MreB and crescentin. We also review the application of mechanochemical principles to understand the assembly and constriction of the FtsZ ring. A number of potential mechanisms are proposed, and important questions are discussed. PMID:22126993
Assessing the Preparedness Level of Incoming Principles of Accounting Students.
ERIC Educational Resources Information Center
Imel, Phillip W.
2000-01-01
Reports that the introductory level Principles of Accounting classes at Southwest Virginia Community College (SVCC) had high unsuccessful grade rates between 1989 and 1999. Describes a study conducted to determine whether there was a statistical difference in the test scores and GPA of successful versus unsuccessful accounting students. Finds that…
ERIC Educational Resources Information Center
Iyioke, Ifeoma Chika
2013-01-01
This dissertation describes a design for training, in accordance with probability judgment heuristics principles, for the Angoff standard setting method. The new training with instruction, practice, and feedback tailored to the probability judgment heuristics principles was called the Heuristic training and the prevailing Angoff method training…
Cost Finding Principles and Procedures. Preliminary Field Review Edition. Technical Report 26.
ERIC Educational Resources Information Center
Ziemer, Gordon; And Others
This report is part of the Larger Cost Finding Principles Project designed to develop a uniform set of standards, definitions, and alternative procedures that will use accounting and statistical data to find the full cost of resources utilized in the process of producing institutional outputs. This technical report describes preliminary procedures…
Web-based Learning Environments Guided by Principles of Good Teaching Practice.
ERIC Educational Resources Information Center
Chizmar, John F.; Walbert, Mark S.
1999-01-01
Describes the preparation and execution of a statistics course, an undergraduate econometrics course, and a microeconomic theory course that all utilize Internet technology. Reviews seven principles of teaching practice in order to demonstrate how to enhance the quality of student learning using Web technologies. Includes reactions by Steve Hurd…
Modeling Success: Using Preenrollment Data to Identify Academically At-Risk Students
ERIC Educational Resources Information Center
Gansemer-Topf, Ann M.; Compton, Jonathan; Wohlgemuth, Darin; Forbes, Greg; Ralston, Ekaterina
2015-01-01
Improving student success and degree completion is one of the core principles of strategic enrollment management. To address this principle, institutional data were used to develop a statistical model to identify academically at-risk students. The model employs multiple linear regression techniques to predict students at risk of earning below a…
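The kind of model the abstract describes can be sketched in a few lines. Everything below (the predictors, the synthetic coefficients, the 2.0 cutoff) is hypothetical, shown only to make the regress-then-flag workflow concrete.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the preenrollment-risk idea: regress first-term GPA on
# preenrollment predictors, then flag students whose *predicted* GPA
# falls below a threshold. Column names are hypothetical, not the study's.
rng = np.random.default_rng(3)
n = 500
hs_gpa = rng.uniform(2.0, 4.0, n)          # high-school GPA
test_z = rng.normal(0.0, 1.0, n)           # standardized test score
gpa = 0.8 * hs_gpa + 0.3 * test_z + rng.normal(0, 0.4, n)  # synthetic

X = sm.add_constant(np.column_stack([hs_gpa, test_z]))
model = sm.OLS(gpa, X).fit()
predicted = model.predict(X)
at_risk = predicted < 2.0                  # institutional cutoff (assumed)
print(model.params.round(3), " flagged:", int(at_risk.sum()))
```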
Country Education Profiles: Algeria.
ERIC Educational Resources Information Center
International Bureau of Education, Geneva (Switzerland).
One of a series of profiles prepared by the Cooperative Educational Abstracting Service, this brief outline provides basic background information on educational principles, system of administration, structure and organization, curricula, and teacher training in Algeria. Statistics provided by the Unesco Office of Statistics show enrollment at all…
Principled Missing Data Treatments.
Lang, Kyle M; Little, Todd D
2018-04-01
We review a number of issues regarding missing data treatments for intervention and prevention researchers. Many of the common missing data practices in prevention research are still, unfortunately, ill-advised (e.g., use of listwise and pairwise deletion, insufficient use of auxiliary variables). Our goal is to promote better practice in the handling of missing data. We review the current state of missing data methodology and recent missing data reporting in prevention research. We describe antiquated, ad hoc missing data treatments and discuss their limitations. We discuss two modern, principled missing data treatments: multiple imputation and full information maximum likelihood, and we offer practical tips on how to best employ these methods in prevention research. The principled missing data treatments that we discuss are couched in terms of how they improve causal and statistical inference in the prevention sciences. Our recommendations are firmly grounded in missing data theory and well-validated statistical principles for handling the missing data issues that are ubiquitous in biosocial and prevention research. We augment our broad survey of missing data analysis with references to more exhaustive resources.
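To make the multiple-imputation recommendation concrete, here is a minimal sketch using scikit-learn's IterativeImputer as one possible FCS-style engine. The data are synthetic, and pooling is reduced to averaging a single statistic rather than applying full Rubin's rules.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# One imputation per posterior draw; a real analysis would refit the
# substantive model on each completed data set and pool per Rubin's rules.
rng = np.random.default_rng(4)
X = rng.multivariate_normal([0, 0], [[1, .6], [.6, 1]], size=200)
mask = rng.random(X.shape) < 0.2           # 20% missing at random
X_miss = np.where(mask, np.nan, X)

estimates = []
for m in range(5):                         # m = 5 imputations
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    X_completed = imp.fit_transform(X_miss)
    estimates.append(np.corrcoef(X_completed.T)[0, 1])
print("pooled correlation:", round(float(np.mean(estimates)), 3))
```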
Précis of statistical significance: rationale, validity, and utility.
Chow, S L
1998-04-01
The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.
Kawamoto, Taisuke; Ito, Yuichi; Morita, Osamu; Honda, Hiroshi
2017-01-01
Cholestasis is one of the major causes of drug-induced liver injury (DILI), which can result in withdrawal of approved drugs from the market. Early identification of cholestatic drugs is difficult due to the complex mechanisms involved. In order to develop a strategy for mechanism-based risk assessment of cholestatic drugs, we analyzed gene expression data obtained from the livers of rats that had been orally administered 12 known cholestatic compounds repeatedly for 28 days at three dose levels. Qualitative analyses were performed using two statistical approaches (hierarchical clustering and principal component analysis), in addition to pathway analysis. The transcriptional benchmark dose (tBMD) and tBMD 95% lower limit (tBMDL) were used for quantitative analyses, which revealed three compound sub-groups that produced different types of differential gene expression; these groups of genes were mainly involved in inflammation, cholesterol biosynthesis, and oxidative stress. Furthermore, the tBMDL values for each test compound were in good agreement with the relevant no observed adverse effect level. These results indicate that our novel strategy for drug safety evaluation using mechanism-based classification and tBMDL would facilitate the application of toxicogenomics for risk assessment of cholestatic DILI.
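The qualitative step (principal component analysis followed by hierarchical clustering of expression profiles) can be sketched as follows. The expression matrix, group means, and three-cluster choice are synthetic stand-ins, not the rat liver data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

# Project expression profiles (rows = treated samples, columns = genes)
# onto principal components, then cluster samples hierarchically.
rng = np.random.default_rng(5)
expr = np.vstack([rng.normal(mu, 1.0, size=(4, 100))   # 3 mechanism
                  for mu in (-1.0, 0.0, 1.5)])          # sub-groups

scores = PCA(n_components=2).fit_transform(expr)
labels = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")
print("cluster assignments:", labels)
```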
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
NASA Astrophysics Data System (ADS)
Eisenbach, Markus; Larkin, Jeff; Lutjens, Justin; Rennich, Steven; Rogers, James H.
2017-02-01
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn-Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. We reimplement the scattering matrix calculation for GPUs with a block matrix inversion algorithm that only uses accelerator memory. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5PFlop/s and a speedup of 8.6 compared to the CPU only code.
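The block matrix inversion underpinning the GPU port rests on the Schur-complement identity. The NumPy sketch below illustrates it on a small dense matrix; the production code runs blocks in accelerator memory at far larger scale, and the matrix here is random and well-conditioned by construction.

```python
import numpy as np

# Blockwise inversion via the Schur complement: only the leading block of
# the inverse is formed, which is what a local multiple-scattering solver
# needs. For M = [[A, B], [C, D]], inv(M)[:k, :k] = inv(A - B inv(D) C).
rng = np.random.default_rng(9)
n, k = 8, 3                                       # total size, leading block
M = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned
A, B = M[:k, :k], M[:k, k:]
C, D = M[k:, :k], M[k:, k:]

S = A - B @ np.linalg.solve(D, C)                 # Schur complement of D
leading_inv = np.linalg.inv(S)                    # equals inv(M)[:k, :k]

assert np.allclose(leading_inv, np.linalg.inv(M)[:k, :k])
print("leading block of the inverse recovered via Schur complement")
```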
Eisenbach, Markus; Larkin, Jeff; Lutjens, Justin; ...
2016-07-12
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first principles Density Functional theory Kohn–Sham equation for a wide range of materials with a special focus on metals, alloys and metallic nano-structures. It has traditionally exhibited near perfect scalability on massively parallel high performance computer architectures. In this paper, we present our efforts to exploit GPUs to accelerate the LSMS code to enable first principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. We reimplement the scattering matrix calculation for GPUs with a block matrix inversion algorithm that only uses accelerator memory. Finally, using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility we achieve a sustained performance of 14.5PFlop/s and a speedup of 8.6 compared to the CPU only code.
Augmenting Latent Dirichlet Allocation and Rank Threshold Detection with Ontologies
2010-03-01
Probabilistic Latent Semantic Indexing (PLSI) is an automated indexing information retrieval model [20]. It is based on a statistical latent class model which ... uses a statistical foundation that is more accurate in finding hidden semantic relationships [20]. The model uses factor analysis of count data ... principle of statistical inference which asserts that all of the information in a sample is contained in the likelihood function [20].
Statistical Entropy of Dirac Field Outside RN Black Hole and Modified Density Equation
NASA Astrophysics Data System (ADS)
Cao, Fei; He, Feng
2012-02-01
Statistical entropy of the Dirac field in the Reissner-Nordstrom black hole space-time is computed using the state density equation corrected by the generalized uncertainty principle to all orders in the Planck length, together with the WKB approximation. The result shows that the statistical entropy is proportional to the horizon area and is convergent without any artificial cutoff.
A Framework for Thinking about Informal Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie; Rubin, Andee
2009-01-01
Informal inferential reasoning has shown some promise in developing students' deeper understanding of statistical processes. This paper presents a framework to think about three key principles of informal inference--generalizations "beyond the data," probabilistic language, and data as evidence. The authors use primary school classroom…
Cored density profiles in the DARKexp model
NASA Astrophysics Data System (ADS)
Destri, Claudio
2018-05-01
The DARKexp model represents a novel and promising attempt to solve a long standing problem of statistical mechanics, that of explaining from first principles the quasi-stationary states at the end of the collisionless gravitational collapse. The model, which yields good fits to observation and simulation data on several scales, was originally conceived to provide a theoretical basis for the 1/r cusp of the Navarro-Frenk-White profile. In this note we show that it also allows for cored density profiles that, when viewed in three dimensions, in the r→0 limit have the conical shape characteristic of the Burkert profile. It remains to be established whether both cusps and cores, or only one of the two types, are allowed beyond the asymptotic analysis of this work.
Detailed Balance of Thermalization Dynamics in Rydberg-Atom Quantum Simulators.
Kim, Hyosub; Park, YeJe; Kim, Kyungtae; Sim, H-S; Ahn, Jaewook
2018-05-04
Dynamics of large complex systems, such as relaxation towards equilibrium in classical statistical mechanics, often obeys a master equation that captures essential information from the complexities. Here, we find that thermalization of an isolated many-body quantum state can be described by a master equation. We observe sudden quench dynamics of quantum Ising-like models implemented in our quantum simulator, defect-free single-atom tweezers in conjunction with Rydberg-atom interaction. Saturation of their local observables, a thermalization signature, obeys a master equation experimentally constructed by monitoring the occupation probabilities of prequench states and imposing the principle of the detailed balance. Our experiment agrees with theories and demonstrates the detailed balance in a thermalization dynamics that does not require coupling to baths or postulated randomness.
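The experiment's logic, a master equation whose rates satisfy detailed balance with respect to a target distribution, can be illustrated numerically. The energies, base rates, and Metropolis construction below are illustrative choices, not the paper's reconstruction from measured occupation probabilities.

```python
import numpy as np

# Build a master-equation rate matrix satisfying detailed balance with
# respect to a Boltzmann distribution, then check that relaxation from an
# arbitrary initial state reaches that distribution (toy energies).
rng = np.random.default_rng(6)
E = np.array([0.0, 0.5, 1.2, 2.0])         # toy level energies (k_B T = 1)
pi = np.exp(-E); pi /= pi.sum()            # target equilibrium

n = E.size
W = np.zeros((n, n))                       # W[i, j] = rate j -> i
for i in range(n):
    for j in range(i + 1, n):
        w = rng.uniform(0.5, 1.5)          # symmetric base rate
        W[i, j] = w * min(1.0, pi[i] / pi[j])   # Metropolis rates obey
        W[j, i] = w * min(1.0, pi[j] / pi[i])   # pi_j W[i,j] = pi_i W[j,i]
np.fill_diagonal(W, -W.sum(axis=0))        # columns sum to zero

p = np.array([1.0, 0.0, 0.0, 0.0])         # start in the ground state
dt = 0.01
for _ in range(5000):
    p += dt * W @ p                        # dp/dt = W p
print("relaxed:", np.round(p, 3), " target:", np.round(pi, 3))
```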
Biomimetic Phases of Microtubule-Motor Mixtures
NASA Astrophysics Data System (ADS)
Ross, Jennifer
2014-03-01
We try to determine the universal principles of organization at the molecular scale that give rise to architecture on the cellular scale. We are specifically interested in the organization of the microtubule cytoskeleton, a rigid yet versatile network in most cell types. Microtubules in the cell are organized by motor proteins and crosslinkers. This work applies the ideas of statistical mechanics and condensed matter physics to the non-equilibrium pattern formation underlying intracellular organization, using the microtubule cytoskeleton as the building block. We examine these processes in a bottom-up manner by adding increasingly complex protein actors into the system. Our systematic experiments expose nature's laws for organization; this has a large impact on biology and illuminates new frontiers of non-equilibrium physics.
Detailed Balance of Thermalization Dynamics in Rydberg-Atom Quantum Simulators
NASA Astrophysics Data System (ADS)
Kim, Hyosub; Park, YeJe; Kim, Kyungtae; Sim, H.-S.; Ahn, Jaewook
2018-05-01
Dynamics of large complex systems, such as relaxation towards equilibrium in classical statistical mechanics, often obeys a master equation that captures essential information from the complexities. Here, we find that thermalization of an isolated many-body quantum state can be described by a master equation. We observe sudden quench dynamics of quantum Ising-like models implemented in our quantum simulator, defect-free single-atom tweezers in conjunction with Rydberg-atom interaction. Saturation of their local observables, a thermalization signature, obeys a master equation experimentally constructed by monitoring the occupation probabilities of prequench states and imposing the principle of the detailed balance. Our experiment agrees with theories and demonstrates the detailed balance in a thermalization dynamics that does not require coupling to baths or postulated randomness.
Frenetic Bounds on the Entropy Production
NASA Astrophysics Data System (ADS)
Maes, Christian
2017-10-01
We give a systematic derivation of positive lower bounds for the expected entropy production (EP) rate in classical statistical mechanical systems obeying a dynamical large deviation principle. The logic is the same for the return to thermodynamic equilibrium as it is for steady nonequilibria working under the condition of local detailed balance. We recover recently studied "uncertainty" relations for the EP, which appear in studies of the effectiveness of mesoscopic machines. In general, our refinement of the positivity of the expected EP rate is obtained in terms of a positive and even function of the expected current(s) which measures the dynamical activity in the system, a time-symmetric estimate of the changes in the system's configuration. Underdamped diffusions can also be included in the analysis.
Apfelbaum, Keith S; Hazeltine, Eliot; McMurray, Bob
2013-07-01
Early reading abilities are widely considered to derive in part from statistical learning of regularities between letters and sounds. Although there is substantial evidence from laboratory work to support this, how it occurs in the classroom setting has not been extensively explored; there are few investigations of how statistics among letters and sounds influence how children actually learn to read or what principles of statistical learning may improve learning. We examined 2 conflicting principles that may apply to learning grapheme-phoneme-correspondence (GPC) regularities for vowels: (a) variability in irrelevant units may help children derive invariant relationships and (b) similarity between words may force children to use a deeper analysis of lexical structure. We trained 224 first-grade students on a small set of GPC regularities for vowels, embedded in words with either high or low consonant similarity, and tested their generalization to novel tasks and words. Variability offered a consistent benefit over similarity for trained and new words in both trained and new tasks.
NASA Astrophysics Data System (ADS)
Kovalev, A. M.
The problem of the motion of a mechanical system with constraints conforming to Hamilton's principle is stated as an optimum control problem, with equations of motion obtained on the basis of Pontriagin's principle. A Hamiltonian function in Rodrigues-Hamilton parameters for a gyrostat in a potential force field is obtained as an example. Equations describing the motion of a skate on a sloping surface and the motion of a disk on a horizontal plane are examined.
Haranas, Ioannis; Gkigkitzis, Ioannis; Kotsireas, Ilias; Austerlitz, Carlos
2017-01-01
Understanding how the brain encodes information and performs computation requires statistical and functional analysis. Given the complexity of the human brain, simple methods that facilitate the interpretation of statistical correlations among different brain regions can be very useful. In this report we introduce a numerical correlation measure that may serve the interpretation of correlational neuronal data and may assist in the evaluation of different brain states. The description of the dynamical brain system through a global numerical measure may indicate the presence of an action principle, which could facilitate an application of physics principles in the study of the human brain and cognition.
Prodinger, Birgit; Ballert, Carolina S; Brach, Mirjam; Brinkhof, Martin W G; Cieza, Alarcos; Hug, Kerstin; Jordan, Xavier; Post, Marcel W M; Scheel-Sailer, Anke; Schubert, Martin; Tennant, Alan; Stucki, Gerold
2016-02-01
Functioning is an important outcome to measure in cohort studies. Clear and operational outcomes are needed to judge the quality of a cohort study. This paper outlines guiding principles for reporting functioning in cohort studies and addresses some outstanding issues. Principles of how to standardize reporting of data from a cohort study on functioning, by deriving scores that are most useful for further statistical analysis and reporting, are outlined. The Swiss Spinal Cord Injury Cohort Study Community Survey serves as a case in point to provide a practical application of these principles. Development of reporting scores must be conceptually coherent and metrically sound. The International Classification of Functioning, Disability and Health (ICF) can serve as the frame of reference for this, with its categories serving as reference units for reporting. To derive a score for further statistical analysis and reporting, items measuring a single latent trait must be invariant across groups. The Rasch measurement model is well suited to test these assumptions. Our approach is a valuable guide for researchers and clinicians, as it fosters comparability of data, strengthens the comprehensiveness of scope, and provides invariant, interval-scaled data for further statistical analyses of functioning.
Kamiura, Moto; Sano, Kohei
2017-10-01
The principle of optimism in the face of uncertainty is known as a heuristic in sequential decision-making problems. The Overtaking method, which is based on this principle, is an effective algorithm for multi-armed bandit problems, but in previous work it was defined only through a set of heuristic formulation patterns. The objective of the present paper is to redefine the value functions of the Overtaking method and to unify their formulation. The unified Overtaking method is associated with upper bounds of confidence intervals of expected rewards on statistics. Unifying the formulation enhances the universality of the Overtaking method. Consequently, we obtain a new Overtaking method for exponentially distributed rewards, analyze it numerically, and show that it outperforms the UCB algorithm on average. The present study suggests that, in the context of multi-armed bandit problems, the principle of optimism in the face of uncertainty should be regarded not as a heuristic but as a statistics-based consequence of the law of large numbers for the sample mean of rewards and the estimation of upper bounds of expected rewards. Copyright © 2017 Elsevier B.V. All rights reserved.
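For context, here is a minimal sketch of the UCB-style baseline that the Overtaking method is compared against, built on the optimism principle the abstract describes (the reward distributions, horizon, and function names are illustrative assumptions, not the paper's code):

    import math
    import random

    def ucb1(reward_fns, horizon):
        # Optimism in the face of uncertainty: play the arm whose sample
        # mean plus a confidence-width bonus (an upper bound on the
        # expected reward) is largest.
        n = len(reward_fns)
        counts = [1] * n
        sums = [f() for f in reward_fns]      # pull each arm once
        for t in range(n + 1, horizon + 1):
            scores = [sums[i] / counts[i] + math.sqrt(2 * math.log(t) / counts[i])
                      for i in range(n)]
            best = scores.index(max(scores))
            sums[best] += reward_fns[best]()
            counts[best] += 1
        return counts

    # Exponentially distributed rewards, the case analyzed in the paper:
    arms = [lambda: random.expovariate(1.0),   # mean reward 1.0
            lambda: random.expovariate(2.0)]   # mean reward 0.5
    print(ucb1(arms, 10000))  # pull counts concentrate on the first arm

The Overtaking method, as the abstract explains, replaces such heuristic bonus terms with upper bounds derived from the law of large numbers for the sample mean of rewards.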
Non-commutative methods in quantum mechanics
NASA Astrophysics Data System (ADS)
Millard, Andrew Clive
1997-09-01
Non-commutativity appears in physics almost hand in hand with quantum mechanics. Non-commuting operators corresponding to observables lead to Heisenberg's Uncertainty Principle, which is often used as a prime example of how quantum mechanics transcends 'common sense', while the operators that generate a symmetry group are usually given in terms of their commutation relations. This thesis discusses a number of new developments which go beyond the usual stopping point of non-commuting quantities as matrices with complex elements. Chapter 2 shows how certain generalisations of quantum mechanics, from using complex numbers to using other (often non-commutative) algebras, can still be written as linear systems with symplectic phase flows. Chapter 3 deals with Adler's trace dynamics, a non-linear graded generalisation of Hamiltonian dynamics with supersymmetry applications, where the phase space coordinates are (generally non-commuting) operators, and reports on aspects of a demonstration that the statistical averages of the dynamical variables obey the rules of complex quantum field theory. The last two chapters discuss specific aspects of quaternionic quantum mechanics. Chapter 4 reports a generalised projective representation theory and presents a structure theorem that categorises quaternionic projective representations. Chapter 5 deals with a generalisation of the coherent states formalism and examines how it may be applied to two commonly used groups.
Nondestructive methods of integrating energy harvesting systems with structures
NASA Astrophysics Data System (ADS)
Inamdar, Sumedh; Zimowski, Krystian; Crawford, Richard; Wood, Kristin; Jensen, Dan
2012-04-01
Designing an attachment structure that is both novel and meets the system requirements can be a difficult task especially for inexperienced designers. This paper presents a design methodology for concept generation of a "parent/child" attachment system. The "child" is broadly defined as any device, part, or subsystem that will attach to any existing system, part, or device called the "parent." An inductive research process was used to study a variety of products, patents, and biological examples that exemplified the parent/child system. Common traits among these products were found and categorized as attachment principles in three different domains: mechanical, material, and field. The attachment principles within the mechanical domain and accompanying examples are the focus of this paper. As an example of the method, a case study of generating concepts for a bridge mounted wind energy harvester using the mechanical attachment principles derived from the methodology and TRIZ principles derived from Altshuller's matrix of contradictions is presented.
Pilot Study: Impact of Computer Simulation on Students' Economic Policy Performance. Pilot Study.
ERIC Educational Resources Information Center
Domazlicky, Bruce; France, Judith
Fiscal and monetary policies taught in macroeconomic principles courses are concepts that might require both lecture and simulation methods. The simulation models, which apply the principles gleaned from comparative statics to a dynamic world, may give students an appreciation for the problems facing policy makers. This paper is a report of a…
Principled and Statistical Connections in Common Sense Conception
ERIC Educational Resources Information Center
Prasada, Sandeep; Dillingham, Elaine M.
2006-01-01
Nominal concepts represent things as tokens of types. We report six experiments that investigate the nature of the relations we represent between the type of thing something is (e.g. DOG) and its other properties. The experiments provide evidence that we represent principled connections between the type of thing something is (e.g. DOG) and some of…
Student Outcomes in Economics Principles: Online vs. Face-to-Face Delivery
ERIC Educational Resources Information Center
Birkeland, Kathryn; Weinandt, Mandie; Carr, David L.
2015-01-01
This study looks at the performance of students in an online and face-to-face section of economic principles with the same instructor. After controlling for the bias of students selecting the online section and observable characteristics, we did not find any statistical difference in the exam performance of students across delivery modes of the…
Principle of maximum Fisher information from Hardy's axioms applied to statistical systems.
Frieden, B Roy; Gatenby, Robert A
2013-10-01
Consider a finite-sized, multidimensional system in parameter state a. The system is either at statistical equilibrium or general nonequilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N = max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. What happens, then, if the scope of the axiom N = max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I = I(max). This is important because many physical laws have been derived assuming, as a working hypothesis, that I = I(max). These derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations include the de Broglie wave hypothesis, quantum wave equations, Maxwell's equations, new laws of biology (e.g., of Coulomb force-directed cell development and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I = I(max) itself derives from suitably extended Hardy axioms eliminates the need to assume it in these derivations. Thus, uses of I = I(max) and EPI express physics at its most fundamental level, its axiomatic basis in mathematics.
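For reference, the scalar-parameter Fisher information that the principle maximizes has the standard textbook form (not reproduced from the paper itself):

    I(a) = \int p(x \mid a) \left[ \frac{\partial \ln p(x \mid a)}{\partial a} \right]^2 dx, \qquad I = I_{\max}.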
Gardeux, Vincent; Achour, Ikbel; Li, Jianrong; Maienschein-Cline, Mark; Li, Haiquan; Pesce, Lorenzo; Parinandi, Gurunadh; Bahroos, Neil; Winn, Robert; Foster, Ian; Garcia, Joe G N; Lussier, Yves A
2014-01-01
Background: The emergence of precision medicine has allowed the incorporation of individual molecular data into patient care. Indeed, DNA sequencing predicts somatic mutations in individual patients. However, these genetic features overlook dynamic epigenetic and phenotypic response to therapy. Meanwhile, accurate personal transcriptome interpretation remains an unmet challenge. Further, N-of-1 (single-subject) efficacy trials are increasingly pursued, but are underpowered for molecular marker discovery. Method: 'N-of-1-pathways' is a global framework relying on three principles: (i) the statistical universe is a single patient; (ii) significance is derived from genesets/biomodules powered by paired samples from the same patient; and (iii) similarity between genesets/biomodules assesses commonality and differences, within-study and cross-studies. Thus, patient gene-level profiles are transformed into deregulated pathways. From RNA-Seq of 55 lung adenocarcinoma patients, N-of-1-pathways predicts the deregulated pathways of each patient. Results: Cross-patient N-of-1-pathways obtains results comparable with conventional geneset enrichment analysis (GSEA) and differentially expressed gene (DEG) enrichment, validated in three external evaluations. Moreover, heatmap and star plots highlight both individual and shared mechanisms ranging from molecular to organ-systems levels (e.g., DNA repair, signaling, immune response). Patients were ranked based on the similarity of their deregulated mechanisms to those of an independent gold standard, generating unsupervised clusters of diametric extreme survival phenotypes (p=0.03). Conclusions: The N-of-1-pathways framework provides a robust statistical and biologically relevant interpretation of individual disease-free survival that is often overlooked in conventional cross-patient studies. It enables mechanism-level classifiers with smaller cohorts as well as N-of-1 studies. Software: http://lussierlab.org/publications/N-of-1-pathways PMID:25301808
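A minimal sketch of principle (ii), significance powered by paired samples from a single patient, here approximated with a per-geneset Wilcoxon signed-rank test; the data shapes, geneset source, and function names are hypothetical, not the published implementation:

    import numpy as np
    from scipy.stats import wilcoxon

    def deregulated_pathways(tumor, normal, genesets, alpha=0.05):
        # tumor, normal: dicts mapping gene -> expression, both from ONE patient
        hits = []
        for name, genes in genesets.items():
            shared = [g for g in genes if g in tumor and g in normal]
            if len(shared) < 10:
                continue                       # geneset too small to test
            t = np.array([tumor[g] for g in shared])
            c = np.array([normal[g] for g in shared])
            stat, p = wilcoxon(t, c)           # paired, within-patient test
            if p < alpha:
                hits.append((name, p))
        return sorted(hits, key=lambda h: h[1])  # most deregulated first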
Le Châtelier reciprocal relations and the mechanical analog
NASA Astrophysics Data System (ADS)
Gilmore, Robert
1983-08-01
Le Châtelier's principle is discussed carefully in terms of two sets of simple thermodynamic examples. The principle is then formulated quantitatively for general thermodynamic systems. The formulation is in terms of a perturbation-response matrix, the Le Châtelier matrix [L]. Le Châtelier's principle is contained in the diagonal elements of this matrix, all of which exceed one. These matrix elements describe the response of a system to a perturbation of either its extensive or intensive variables. These response ratios are inverses of each other. The Le Châtelier matrix is symmetric, so that a new set of thermodynamic reciprocal relations is derived. This quantitative formulation is illustrated by a single simple example which includes the original examples and shows the reciprocities among them. The assumptions underlying this new quantitative formulation of Le Châtelier's principle are general and applicable to a wide variety of nonthermodynamic systems. Le Châtelier's principle is formulated quantitatively for mechanical systems in static equilibrium, and mechanical examples of this formulation are given.
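In symbols, the abstract's statements amount to the following (a hedged restatement, not necessarily the paper's notation): the diagonal response ratios satisfy L_{ii} > 1 (Le Châtelier's principle), while symmetry of the matrix, L_{ij} = L_{ji}, yields the new reciprocal relations.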
Quantum correlations are tightly bound by the exclusivity principle.
Yan, Bin
2013-06-28
What principle limits the correlations predicted by our current, quantum mechanical description of nature is a fundamental problem in physics. One possible explanation is the "global exclusivity" principle recently discussed in Phys. Rev. Lett. 110, 060402 (2013). In this work we show that this principle actually places a much stronger restriction on the probability distribution. We provide a tight constraint inequality imposed by this principle and prove that the principle singles out quantum correlations in scenarios represented by any graph. Our result implies that the exclusivity principle might be one of the fundamental principles of nature.
Teaching ``The Physics of Energy'' at MIT
NASA Astrophysics Data System (ADS)
Jaffe, Robert
2009-05-01
New physics courses on energy are popping up at colleges and universities across the country. Many require little or no previous physics background, aiming to introduce a broad audience to this complex and critical problem, often augmenting the scientific message with economic and policy discussions. Others are advanced courses, focussing on highly specialized subjects like solar voltaics, nuclear physics, or thermal fluids, for example. About two years ago Washington Taylor and I undertook to develop a course on the ``Physics of Energy'' open to all MIT students who had taken MIT's common core of university level calculus, physics, and chemistry. By avoiding higher level prerequisites, we aimed to attract and make the subject relevant to students in the life sciences, economics, etc. --- as well as physical scientists and engineers --- who want to approach energy issues in a sophisticated and analytical fashion, exploiting their background in calculus, mechanics, and E & M, but without having to take advanced courses in thermodynamics, quantum mechanics, or nuclear physics beforehand. Our object was to interweave teaching the fundamental physics principles at the foundations of energy science with the applications of those principles to energy systems. We envisioned a course that would present the basics of statistical, quantum, and fluid mechanics at a fairly sophisticated level and apply those concepts to the study of energy sources, conversion, transport, losses, storage, conservation, and end use. In the end we developed almost all of the material for the course from scratch. The course debuted this past fall. I will describe what we learned and what general lessons our experience might have for others who contemplate teaching energy physics broadly to a technically sophisticated audience.
Molecular Dynamics of Dense Fluids: Simulation-Theory Symbiosis
NASA Astrophysics Data System (ADS)
Yip, Sidney
Thirty-five years ago, Berni J. Alder showed that Boltzmann-Enskog kinetic theory fails to adequately account for the viscosity of fluids near solid density, as determined by molecular dynamics simulation. This work, along with other notable simulation findings, provided great stimulus to statistical mechanical studies of transport phenomena, particularly in dealing with collective effects in the time correlation functions of liquids. An extended theoretical challenge that remains at best partially resolved is the shear viscosity of supercooled liquids: how can one give a unified explanation of the so-called fragile and strong characteristic temperature behavior, with implications for the dynamics of the glass transition? In this tribute on the occasion of his 90th birthday symposium, we recount a recent study where simulation, combined with heuristic (transition-state) and first principles (linear response) theories, identifies the molecular mechanisms governing glassy-state relaxation. Such an interplay between simulation and theory is progress from the early days; instead of simulation challenging theory, now simulation and theory complement each other.
Why granular media are thermal after all
NASA Astrophysics Data System (ADS)
Liu, Mario; Jiang, Yimin
2017-06-01
Two approaches exist to account for granular behavior. The thermal one considers the total entropy, which includes microscopic degrees of freedom such as phonons; the athermal one (as with the Edwards entropy) takes grains as elementary. Granular solid hydrodynamics (GSH) belongs to the first; DEM, granular kinetic theory and athermal statistical mechanics (ASM) to the second. A careful discussion of their conceptual differences is given here. Three noteworthy insights or results are: (1) While DEM and granular kinetic theory are well justified to take grains as elementary, any athermal entropic consideration is bound to run into trouble. (2) Many general principles are taken as invalid in granular media. Yet within the thermal approach, energy conservation and the fluctuation-dissipation theorem remain valid, granular temperatures equilibrate, and phase space is well explored in a grain at rest. Hence these are abnormalities of the athermal approximation, not of granular media as such. (3) GSH is a wide-ranging continuum mechanical description of granular dynamics.
Generalized uncertainty principle: implications for black hole complementarity
NASA Astrophysics Data System (ADS)
Chen, Pisin; Ong, Yen Chin; Yeom, Dong-han
2014-12-01
At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg Uncertainty Principle is one profound feature of quantum mechanics, which nevertheless may receive corrections when gravitational effects become important. Such a generalized uncertainty principle (GUP) has been motivated not only by quite general considerations of quantum mechanics and gravity, but also by string theoretic arguments. We examine the role of the GUP in the context of black hole complementarity. We find that while complementarity can be violated by large-N rescaling if one assumes only the Heisenberg Uncertainty Principle, the application of the GUP may save complementarity, but only if a certain N-dependence is also assumed. This raises two important questions beyond the scope of this work: whether the GUP really has the proposed form of N-dependence, and whether black hole complementarity is indeed correct.
Physics of mind: Experimental confirmations of theoretical predictions.
Schoeller, Félix; Perlovsky, Leonid; Arseniev, Dmitry
2018-02-02
What is common among Newtonian mechanics, statistical physics, thermodynamics, quantum physics, the theory of relativity, astrophysics and the theory of superstrings? All these areas of physics have in common a methodology, which is discussed in the first few lines of the review. Is a physics of the mind possible? Is it possible to describe how a mind adapts in real time to changes in the physical world through a theory based on a few basic laws? From perception and elementary cognition to emotions and abstract ideas allowing high-level cognition and executive functioning, at nearly all levels of study, the mind shows variability and uncertainties. Is it possible to turn psychology and neuroscience into so-called "hard" sciences? This review discusses several established first principles for the description of mind and their mathematical formulations. A mathematical model of mind is derived from these principles. This model includes mechanisms of instincts, emotions, behavior, cognition, concepts, language, intuitions, and imagination. We clarify fundamental notions such as the opposition between the conscious and the unconscious, the knowledge instinct and aesthetic emotions, as well as humans' universal abilities for symbols and meaning. In particular, the review discusses at length the evolutionary and cognitive functions of aesthetic emotions and musical emotions. Several theoretical predictions are derived from the model, some of which have been experimentally confirmed. These empirical results are summarized and we introduce new theoretical developments. Several unsolved theoretical problems are proposed, as well as new experimental challenges for future research. Copyright © 2017. Published by Elsevier B.V.
Some limit theorems for ratios of order statistics from uniform random variables.
Xu, Shou-Fang; Miao, Yu
2017-01-01
In this paper, we study the ratios of order statistics based on samples drawn from uniform distribution and establish some limit properties such as the almost sure central limit theorem, the large deviation principle, the Marcinkiewicz-Zygmund law of large numbers and complete convergence.
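As a quick empirical illustration of the objects studied, consider the ratio of the two smallest order statistics from a uniform sample (a simulation sketch; the particular ratio is our choice, for illustration):

    import random

    def ratio_of_two_smallest(n, trials=100_000):
        # Ratio U_(1)/U_(2) of the two smallest of n iid Uniform(0,1) draws.
        # This ratio is itself Uniform(0,1), so the sample mean tends to 0.5.
        total = 0.0
        for _ in range(trials):
            u = sorted(random.random() for _ in range(n))
            total += u[0] / u[1]
        return total / trials

    print(ratio_of_two_smallest(10))  # approx 0.5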
Index of Economic Freedom: Unrealized Pedagogical Opportunities
ERIC Educational Resources Information Center
Maier, Mark; Miller, John A.
2017-01-01
Although the Index of Economic Freedom appears in many economics textbooks, their coverage of the index misses opportunities to teach statistical and policy-related concepts important for the principles course. The standard textbook presentation passes up an opportunity to examine the statistical issues of weighting in composite index numbers and…
Challenging Conventional Wisdom for Multivariate Statistical Models with Small Samples
ERIC Educational Resources Information Center
McNeish, Daniel
2017-01-01
In education research, small samples are common because of financial limitations, logistical challenges, or exploratory studies. With small samples, statistical principles on which researchers rely do not hold, leading to trust issues with model estimates and possible replication issues when scaling up. Researchers are generally aware of such…
ERIC Educational Resources Information Center
Brewer, James K.
1988-01-01
Six best-selling introductory behavioral statistics textbooks that were published in 1982 and two well-known sampling theory textbooks were reviewed to determine the presence of rules-of-thumb--useful principles with wide application that are not intended to be strictly accurate. The relative frequency and type of rules are reported along with a…
ERIC Educational Resources Information Center
Conant, Darcy Lynn
2013-01-01
Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…
Embryo mechanics: balancing force production with elastic resistance during morphogenesis.
Davidson, Lance A
2011-01-01
Morphogenesis requires the spatial and temporal control of embryo mechanics, including force production and mechanical resistance to those forces, to coordinate tissue deformation and large-scale movements. Thus, biomechanical processes play a key role in directly shaping the embryo. Additional roles for embryo mechanics during development may include patterning positional information and providing feedback to ensure the success of morphogenetic movements in shaping the larval body and organs. Understanding the multiple roles of mechanics during development requires familiarity with engineering principles of the mechanics of structures, the viscoelastic properties of biomaterials, and the integration of force and stress within embryonic structures as morphogenesis progresses. In this chapter, we review the basic engineering principles of biomechanics as they relate to morphogenesis, introduce methods for quantifying embryo mechanics and the limitations of these methods, and outline a formalism for investigating the role of embryo mechanics in birth defects. We encourage the nascent field of embryo mechanics to adopt standard engineering terms and test methods so that studies of diverse organisms can be compared and universal biomechanical principles can be revealed. Copyright © 2011 Elsevier Inc. All rights reserved.
Developmental Principles: Fact or Fiction
Durston, A. J.
2012-01-01
While still at school, most of us are deeply impressed by the underlying principles that so beautifully explain why the chemical elements are ordered as they are in the periodic table, and may wonder, with the theoretician Brian Goodwin, “whether there might be equally powerful principles that account for the awe-inspiring diversity of body forms in the living realm”. We have considered the arguments for developmental principles, conclude that they do exist and have specifically identified features that may generate principles associated with Hox patterning of the main body axis in bilaterian metazoa in general and in the vertebrates in particular. We wonder whether this exercise serves any purpose. The features we discuss were already known to us as parts of developmental mechanisms and defining developmental principles (how, and at which level?) adds no insight. We also see little profit in the proposal by Goodwin that there are principles outside the emerging genetic mechanisms that need to be taken into account. The emerging developmental genetic hierarchies already reveal a wealth of interesting phenomena, whatever we choose to call them. PMID:22489210
The image recognition based on neural network and Bayesian decision
NASA Astrophysics Data System (ADS)
Wang, Chugege
2018-04-01
Artificial neural network research began in the 1940s and is an important part of artificial intelligence. At present, it is a hot topic in the fields of neuroscience, computer science, brain science, mathematics, and psychology. Thomas Bayes first reported what became Bayesian theory in 1763. After its development in the twentieth century, it has become widespread in all areas of statistics. In recent years, with the solution of the problem of high-dimensional integral calculation, Bayesian statistics has been improved theoretically, solving many problems that cannot be solved by classical statistics, and it is also applied in interdisciplinary fields. In this paper, the related concepts and principles of artificial neural networks are introduced. It also summarizes the basic content and principles of Bayesian statistics, combines artificial neural network technology with Bayesian decision theory, and implements them in several aspects of image recognition, such as an enhanced face detection method based on a neural network and Bayesian decision, as well as image classification based on Bayesian decision. It can be seen that the combination of artificial intelligence and statistical algorithms has long been a hot research topic.
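A minimal sketch of the combination described, with a classifier's posterior estimates fed into a Bayes minimum-risk decision rule (the loss values and posteriors are illustrative assumptions):

    import numpy as np

    def bayes_decision(posteriors, loss):
        # posteriors: shape (n_classes,), e.g. softmax outputs of a neural net
        # loss[d][c]: cost of deciding d when the true class is c
        risks = loss @ posteriors          # expected loss of each decision
        return int(np.argmin(risks))       # minimum-risk decision

    # Face detection example: deciding "face" on a non-face is cheap (1.0),
    # but missing a real face is expensive (5.0).
    loss = np.array([[0.0, 5.0],   # decide non-face
                     [1.0, 0.0]])  # decide face
    p = np.array([0.6, 0.4])       # net's posterior: 60% non-face, 40% face
    print(bayes_decision(p, loss)) # -> 1: the asymmetric loss flips the call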
NASA Astrophysics Data System (ADS)
Dralle, D.; Karst, N.; Thompson, S. E.
2015-12-01
Multiple competing theories suggest that power law behavior governs the observed first-order dynamics of streamflow recessions - the important process by which catchments dry out via the stream network, altering the availability of surface water resources and in-stream habitat. Frequently modeled as dq/dt = -aq^b, recessions typically exhibit a high degree of variability, even within a single catchment, as revealed by significant shifts in the values of "a" and "b" across recession events. One potential source of this variability lies in underlying, hard-to-observe fluctuations in how catchment water storage is partitioned amongst distinct storage elements, each having different discharge behaviors. Testing this and competing hypotheses with widely available streamflow timeseries, however, has been hindered by a power law scaling artifact that obscures meaningful covariation between the recession parameters "a" and "b". Here we briefly outline a technique that removes this artifact, revealing intriguing new patterns in the joint distribution of recession parameters. Using long-term flow data from catchments in Northern California, we explore temporal variations and find that the "a" parameter varies strongly with catchment wetness. We then explore how the "b" parameter changes with "a", and find that measures of its variation are maximized at intermediate "a" values. We propose an interpretation of this pattern based on statistical mechanics, in which "b" can be viewed as an indicator of the catchment "microstate" - i.e., the partitioning of storage - and "a" as a measure of the catchment macrostate (i.e., the total storage). In statistical mechanics, entropy (i.e., microstate variance, here the variance of "b") is maximized for intermediate values of extensive variables (i.e., wetness, "a"), as observed in the recession data. This interpretation of "a" and "b" was supported by model runs using a multiple-reservoir catchment toy model, and lends support to the hypothesis that power law streamflow recession dynamics, and their variations, have their origin in the multiple modalities of storage partitioning.
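A minimal sketch of extracting per-event recession parameters by regressing log(-dq/dt) on log q (a standard technique; the event handling and synthetic data are our assumptions, not the authors' pipeline):

    import numpy as np

    def recession_params(q, dt=1.0):
        # q: streamflow samples from a single recession event (declining limb)
        q = np.asarray(q, dtype=float)
        dqdt = np.diff(q) / dt
        qm = 0.5 * (q[1:] + q[:-1])          # midpoint discharge
        mask = dqdt < 0                      # keep strictly receding steps
        b, log_a = np.polyfit(np.log(qm[mask]), np.log(-dqdt[mask]), 1)
        return np.exp(log_a), b              # a, b in dq/dt = -a * q**b

    q = 10.0 * np.exp(-0.1 * np.arange(30))  # synthetic linear (b = 1) recession
    a, b = recession_params(q)
    print(round(a, 3), round(b, 2))          # approx 0.1 and 1.0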
Statistical modeling implicates neuroanatomical circuit mediating stress relief by ‘comfort’ food
Ulrich-Lai, Yvonne M.; Christiansen, Anne M.; Wang, Xia; Song, Seongho; Herman, James P.
2015-01-01
A history of eating highly palatable foods reduces physiological and emotional responses to stress. For instance, we have previously shown that limited sucrose intake (4 ml of 30% sucrose twice daily for 14 days) reduces hypothalamic-pituitary-adrenocortical (HPA) axis responses to stress. However, the neural mechanisms underlying stress relief by such 'comfort' foods are unclear, and could reveal an endogenous brain pathway for stress mitigation. As such, the present work assessed the expression of several proteins related to neuronal activation and/or plasticity in multiple stress- and reward-regulatory brain regions of rats after limited sucrose (vs. water control) intake. These data were then subjected to a series of statistical analyses, including Bayesian modeling, to identify the most likely neurocircuit mediating stress relief by sucrose. The analyses suggest that sucrose reduces HPA activation by dampening an excitatory basolateral amygdala - medial amygdala circuit, while also potentiating an inhibitory bed nucleus of the stria terminalis principal subdivision-mediated circuit, resulting in reduced HPA activation after stress. Collectively, the results support the hypothesis that sucrose limits stress responses via plastic changes to the structure and function of stress-regulatory neural circuits. The work also illustrates that advanced statistical methods are useful approaches to identify potentially novel and important underlying relationships in biological data sets. PMID:26246177
Superthermal photon bunching in terms of simple probability distributions
NASA Astrophysics Data System (ADS)
Lettau, T.; Leymann, H. A. M.; Melcher, B.; Wiersig, J.
2018-05-01
We analyze the second-order photon autocorrelation function g^(2) with respect to the photon probability distribution and discuss the generic features of a distribution that results in superthermal photon bunching [g^(2)(0) > 2]. Superthermal photon bunching has been reported for a number of optical microcavity systems that exhibit processes such as superradiance or mode competition. We show that a superthermal photon number distribution cannot be constructed from the principle of maximum entropy if only the intensity and the second-order autocorrelation are given. However, for bimodal systems, an unbiased superthermal distribution can be constructed from second-order correlations and the intensities alone. Our findings suggest modeling superthermal single-mode distributions by a mixture of a thermal and a lasing-like state and thus reveal a generic mechanism in the photon probability distribution responsible for creating superthermal photon bunching. We relate our general considerations to a physical system, i.e., a (single-emitter) bimodal laser, and show that its statistics can be approximated and understood within our proposed model. Furthermore, the excellent agreement of the statistics of the bimodal laser and our model reveals that the bimodal laser is an ideal source of bunched photons, in the sense that it can generate statistics that contain no other features but the superthermal bunching.
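A sketch of the proposed model: g^(2)(0) computed from a photon-number distribution that mixes a dim thermal state with a bright Poissonian (lasing-like) state; the mean photon numbers and mixing weight are illustrative assumptions:

    import math
    import numpy as np

    def g2_zero(p):
        # g2(0) = <n(n-1)> / <n>^2 for a photon-number distribution p
        n = np.arange(len(p))
        mean = np.sum(n * p)
        return np.sum(n * (n - 1) * p) / mean**2

    def thermal(nbar, nmax):
        n = np.arange(nmax)
        return nbar**n / (1.0 + nbar)**(n + 1)

    def poissonian(nbar, nmax):
        return np.array([math.exp(-nbar) * nbar**k / math.factorial(k)
                         for k in range(nmax)])

    # Mostly a dim thermal mode, with rare switches to a bright lasing mode.
    p = 0.9 * thermal(0.1, 80) + 0.1 * poissonian(10.0, 80)
    p /= p.sum()
    print(g2_zero(p))  # well above 2: superthermal bunching from the mixture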
The difference between a dynamic and mechanical approach to stroke treatment.
Helgason, Cathy M
2007-06-01
The current classification of stroke is based on causation, also called pathogenesis, and relies on binary logic faithful to the Aristotelian tradition. Accordingly, a pathology is or is not the cause of the stroke, is considered independent of others, and is the target for treatment. It is the subject for large double-blind randomized clinical therapeutic trials. The scientific view behind clinical trials is the fundamental concept that information is statistical, and causation is determined by probabilities. Therefore, the cause and effect relation will be determined by probability-theory-based statistics. This is the basis of evidence-based medicine, which calls for the results of such trials to be the basis for physician decisions regarding diagnosis and treatment. However, there are problems with the methodology behind evidence-based medicine. Calculations using probability-theory-based statistics regarding cause and effect are performed within an automatic system where there are known inputs and outputs. This method of research provides a framework of certainty with no surprise elements or outcomes. However, it is not a system or method that will come up with previously unknown variables, concepts, or universal principles; it is not a method that will give a new outcome; and it is not a method that allows for creativity, expertise, or new insight for problem solving.
Mechatronics design principles for biotechnology product development.
Mandenius, Carl-Fredrik; Björkman, Mats
2010-05-01
Traditionally, biotechnology design has focused on the manufacture of chemicals and biologics. Still, a majority of biotechnology products that appear on the market today is the result of mechanical-electric (mechatronic) construction. For these, the biological components play decisive roles in the design solution; the biological entities are either integral parts of the design, or are transformed by the mechatronic system. This article explains how the development and production engineering design principles used for typical mechanical products can be adapted to the demands of biotechnology products, and how electronics, mechanics and biology can be integrated more successfully. We discuss three emerging areas of biotechnology in which mechatronic design principles can apply: stem cell manufacture, artificial organs, and bioreactors. Copyright 2010 Elsevier Ltd. All rights reserved.
Violation of Bell’s inequality: Must the Einstein locality really be abandoned?
NASA Astrophysics Data System (ADS)
Jung, Kurt
2017-08-01
Since John Bell established his famous inequality, and several independent experiments have confirmed the distinct polarization correlation of entangled photons predicted by quantum mechanics, it has been evident that quantum mechanics cannot be explained by local realistic theories. Actually, the observed polarization correlation can be deduced from wave optical considerations. The correlation has its origin in the phase coupling of the two circularly polarized wave packets leaving the photon source simultaneously. The experimental results violate Bell's inequality although no non-local interactions have to be assumed. Consequently, the principle of locality remains valid within the scope of quantum mechanics. However, the principle of realism has to be replaced by the less stringent principle of contextuality.
Metabolic networks evolve towards states of maximum entropy production.
Unrean, Pornkamol; Srienc, Friedrich
2011-11-01
A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such state we have carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such reduced metabolic network metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations the specific growth rate of the strain continuously increased together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted from the state when the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles. Copyright © 2011 Elsevier Inc. All rights reserved.
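A minimal sketch of the claimed most-likely state, with elementary-mode usage probabilities given by a Boltzmann-type weighting in each mode's entropy production rate (the rates and the weighting parameter beta are illustrative assumptions):

    import numpy as np

    def mode_usage(sigma, beta):
        # Boltzmann-law usage probabilities over elementary modes,
        # weighted by the per-mode entropy production rate sigma.
        w = np.exp(beta * np.asarray(sigma, dtype=float))
        return w / w.sum()

    sigma = np.array([0.2, 0.5, 1.0, 1.3])  # per-mode rates (arbitrary units)
    p = mode_usage(sigma, beta=2.0)
    print(p)          # usage skewed toward high-production modes
    print(p @ sigma)  # network-level entropy production rate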
Spaceborne receivers: Basic principles
NASA Technical Reports Server (NTRS)
Stacey, J. M.
1984-01-01
The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principle of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is applied statistics.
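In the ideal case, the sensitivity expectations mentioned reduce to the standard radiometer equations (textbook forms, not copied from the report):

    \Delta T_{\text{total power}} = \frac{T_{\text{sys}}}{\sqrt{B\tau}}, \qquad \Delta T_{\text{Dicke}} = \frac{2\,T_{\text{sys}}}{\sqrt{B\tau}},

where B is the predetection bandwidth and \tau the integration time; the modulated (Dicke) configuration trades a factor of two in sensitivity for immunity to gain fluctuations.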
Comparative Analysis of Serum (Anti)oxidative Status Parameters in Healthy Persons
Jansen, Eugène HJM; Ruskovska, Tatjana
2013-01-01
Five antioxidant and two oxidative stress assays were applied to serum samples of 43 healthy males. The antioxidant tests showed different inter-assay correlations. A very good correlation of 0.807 was observed between the ferric reducing ability of plasma (FRAP) and total antioxidant status (TAS) assays, and a fair correlation of 0.501 between the biological antioxidant potential (BAP) and TAS assays. There was no statistically significant correlation between the BAP and FRAP assays. The antioxidant assays correlate highly with uric acid, especially the TAS (0.922) and FRAP assays (0.869). The BAP assay has a much lower, not statistically significant correlation with uric acid (0.302), which makes BAP more suitable for assessing antioxidant status. The total thiol assay showed no statistically significant correlation with uric acid (0.114). The total thiol assay, which is based on a completely different principle, showed a good and statistically significant correlation with the BAP assay (0.510), a lower and not significant correlation with the TAS assay (0.279), and no correlation with the FRAP assay (−0.008). The oxy-adsorbent test (OXY) assay has no correlation with any of the other assays tested. The oxidative stress assays, reactive oxygen metabolites (ROM) and total oxidant status (TOS), which are based on different principles, do not show a statistically significant correlation in the serum samples of this study. Both assays showed a negative, but not significant, correlation with the antioxidant assays. In conclusion, the ROM, TOS, BAP and TTP assays are based on different principles and will have additional value when a combination of these assays is applied in large-scale population studies. PMID:23507749
Ea, Vuthy; Sexton, Tom; Gostan, Thierry; Herviou, Laurie; Baudement, Marie-Odile; Zhang, Yunzhe; Berlivet, Soizik; Le Lay-Taha, Marie-Noëlle; Cathala, Guy; Lesne, Annick; Victor, Jean-Marc; Fan, Yuhong; Cavalli, Giacomo; Forné, Thierry
2015-08-15
In higher eukaryotes, the genome is partitioned into large "Topologically Associating Domains" (TADs) in which the chromatin displays favoured long-range contacts. While a crumpled/fractal globule organization has received experimental support at higher-order levels, the organization principles that govern chromatin dynamics within these TADs remain unclear. Using simple polymer models, we previously showed that, in mouse liver cells, gene-rich domains tend to adopt a statistical helix shape when no significant locus-specific interaction takes place. Here, we use data from diverse 3C-derived methods to explore chromatin dynamics within mouse and Drosophila TADs. In mouse Embryonic Stem Cells (mESC), which possess large TADs (median size of 840 kb), we show that the statistical helix model, but not globule models, is relevant not only in gene-rich TADs, but also in gene-poor and gene-desert TADs. Interestingly, this statistical helix organization is considerably relaxed in mESC compared to liver cells, indicating that the impact of the constraints responsible for this organization is weaker in pluripotent cells. Finally, depletion of histone H1 in mESC alters local chromatin flexibility but not the statistical helix organization. In Drosophila, which possesses TADs of smaller sizes (median size of 70 kb), we show that, while chromatin compaction and flexibility are finely tuned according to the epigenetic landscape, chromatin dynamics within TADs is generally compatible with an unconstrained polymer configuration. Models issued from polymer physics can accurately describe the organization principles governing chromatin dynamics in both mouse and Drosophila TADs. However, constraints applied on this dynamics within mammalian TADs have a peculiar impact resulting in a statistical helix organization.
NASA Astrophysics Data System (ADS)
Rauh, A.; Hinterhölzl, R.; Drechsler, K.
2012-05-01
In the automotive industry, finite element simulation is widely used to ensure crashworthiness. Mechanical material data over wide strain rate and temperature ranges are required as a basis. This work proposes a method for reducing the cost of mechanical material characterization by applying the time-temperature superposition principle to elastomeric adhesives. The method is based on the interdependence of time and temperature that is characteristic of the mechanical properties of polymers. Based on the assumption that polymers behave similarly at high strain rates and at low temperatures, a temperature-dominated test program is suggested, which can be used to deduce strain rate dependent material behavior at different reference temperatures. The temperature shift factor is found by means of dynamic mechanical analysis according to the WLF equation, named after Williams, Landel and Ferry. The principle is applied to the viscoelastic properties as well as to the failure properties of the polymer. The applicability is validated with high strain rate tests.
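The temperature shift factor referred to has the standard WLF form (the constants C_1 and C_2 are material-specific and fitted from the dynamic mechanical analysis):

    \log a_T = \frac{-C_1 (T - T_{\text{ref}})}{C_2 + (T - T_{\text{ref}})},

so that behavior measured at temperature T maps onto the reference curve at a time scale shifted by a_T.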
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
Establishing security of quantum key distribution without monitoring disturbance
NASA Astrophysics Data System (ADS)
Koashi, Masato
2015-10-01
In conventional quantum key distribution (QKD) protocols, the information leak to an eavesdropper is estimated through the basic principle of quantum mechanics dictated in the original version of Heisenberg's uncertainty principle. The amount of leaked information on a shared sifted key is bounded from above essentially by using information-disturbance trade-off relations, based on the amount of signal disturbance measured via randomly sampled or inserted probe signals. Here we discuss an entirely different avenue toward the private communication, which does not rely on the information disturbance trade-off relations and hence does not require a monitoring of signal disturbance. The independence of the amount of privacy amplification from that of disturbance tends to give it a high tolerance on the channel noises. The lifting of the burden of precise statistical estimation of disturbance leads to a favorable finite-key-size effect. A protocol based on the novel principle can be implemented by only using photon detectors and classical optics tools: a laser, a phase modulator, and an interferometer. The protocol resembles the differential-phase-shift QKD protocol in that both share a simple binary phase shift keying on a coherent train of weak pulses from a laser. The difference lies in the use of a variable-delay interferometer in the new protocol, which randomly changes the combination of pulse pairs to be superposed. This extra randomness has turned out to be enough to upper-bound the information extracted by the eavesdropper, regardless of how they have disturbed the quantum signal.
On classical mechanical systems with non-linear constraints
NASA Astrophysics Data System (ADS)
Terra, Gláucio; Kobayashi, Marcelo H.
2004-03-01
In the present work, we analyze classical mechanical systems with non-linear constraints in the velocities. We prove that the d'Alembert-Chetaev trajectories of a constrained mechanical system satisfy both Gauss' principle of least constraint and Hölder's principle. In the case of a free mechanics, they also satisfy Hertz's principle of least curvature if the constraint manifold is a cone. We show that the Gibbs-Maggi-Appell (GMA) vector field (i.e. the second-order vector field which defines the d'Alembert-Chetaev trajectories) conserves energy for any potential energy if, and only if, the constraint is homogeneous (i.e. if the Liouville vector field is tangent to the constraint manifold). We introduce the Jacobi-Carathéodory metric tensor and prove Jacobi-Carathéodory's theorem assuming that the constraint manifold is a cone. Finally, we present a version of Liouville's theorem on the conservation of volume for the flow of the GMA vector field.
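For reference, Gauss' principle of least constraint, which the d'Alembert-Chetaev trajectories are shown to satisfy, states that among all accelerations compatible with the constraints the actual one minimizes (a standard statement, not the paper's notation):

    Z(\ddot{q}) = \sum_i m_i \left\lVert \ddot{q}_i - \frac{F_i}{m_i} \right\rVert^2.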
First-principles investigation of mechanical properties of silicene, germanene and stanene
NASA Astrophysics Data System (ADS)
Mortazavi, Bohayra; Rahaman, Obaidur; Makaremi, Meysam; Dianat, Arezoo; Cuniberti, Gianaurelio; Rabczuk, Timon
2017-03-01
Two-dimensional allotropes of group-IV elements, including silicene, germanene and stanene, have recently attracted considerable attention in the nanodevice fabrication industry. These materials, which adopt buckled structures, have lately been fabricated experimentally. In this study, first-principles density functional theory calculations were utilized to investigate the mechanical properties of single-layer, free-standing silicene, germanene and stanene. Uniaxial tensile and compressive simulations were carried out to probe and compare stress-strain properties such as the Young's modulus, Poisson's ratio and ultimate strength. We evaluated the effect of chirality on the mechanical response and bond structure of the 2D substrates. Our first-principles simulations suggest that in all studied samples the application of uniaxial loading can change the electronic character of the buckled structures to metallic. Our investigation provides a general but useful viewpoint on the mechanical properties of silicene, germanene and stanene.
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate: H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
Fine-Grained Sensitivity to Statistical Information in Adult Word Learning
ERIC Educational Resources Information Center
Vouloumanos, Athena
2008-01-01
A language learner trying to acquire a new word must often sift through many potential relations between particular words and their possible meanings. In principle, statistical information about the distribution of those mappings could serve as one important source of data, but little is known about whether learners can in fact track multiple…
ERIC Educational Resources Information Center
Dexter, Franklin; Masursky, Danielle; Wachtel, Ruth E.; Nussmeier, Nancy A.
2010-01-01
Operating room (OR) management differs from clinical anesthesia in that statistical literacy is needed daily to make good decisions. Two of the authors teach a course in operations research for surgical services to anesthesiologists, anesthesia residents, OR nursing directors, hospital administration students, and analysts to provide them with the…
Damned Lies. And Statistics. Otto Neurath and Soviet Propaganda in the 1930s.
ERIC Educational Resources Information Center
Chizlett, Clive
1992-01-01
Examines the philosophical and historical context in which Otto Neurath (1882-1945) worked. Examines critically (in the light of descriptive statistics) the principles of his Isotype Picture Language. Tests Neurath's personal credibility and scientific integrity by looking at his contributions to Soviet propaganda in the early 1930s. (SR)
QuantCrit: Education, Policy, "Big Data" and Principles for a Critical Race Theory of Statistics
ERIC Educational Resources Information Center
Gillborn, David; Warmington, Paul; Demack, Sean
2018-01-01
Quantitative research enjoys heightened esteem among policy-makers, media, and the general public. Whereas qualitative research is frequently dismissed as subjective and impressionistic, statistics are often assumed to be objective and factual. We argue that these distinctions are wholly false; quantitative data is no less socially constructed…
NASA Astrophysics Data System (ADS)
Rajjak Gazi, MD.; Rai, Ashutosh; Kunkri, Samir; Rahaman, Ramij
2010-11-01
Study of non-local correlations in terms of Hardy's argument has been quite popular in quantum mechanics. Hardy's non-locality argument depends on some kind of asymmetry, but a two-qubit maximally entangled state, being symmetric, does not exhibit this kind of non-locality. Here we ask the following question: can this feature be explained by some principle outside quantum mechanics? The no-signaling condition does not provide a solution. But, interestingly, the information causality principle (Pawlowski et al 2009 Nature 461 1101) offers an explanation. It shows that any generalized probability theory which gives completely random results for a local dichotomic observable cannot provide Hardy's non-local correlation if it is restricted by a necessary condition for respecting the information causality principle. In fact, the applied necessary condition imposes even more restrictions on the local randomness of the measured observable. Still, there are some restrictions imposed by quantum mechanics that are not reproduced from the considered information causality condition.
Nondestructive methods of integrating energy harvesting systems for highway bridges
NASA Astrophysics Data System (ADS)
Inamdar, Sumedh; Zimowski, Krystian; Crawford, Richard; Wood, Kristin; Jensen, Dan
2012-04-01
Designing an attachment structure that is both novel and meets the system requirements can be a difficult task, especially for inexperienced designers. This paper presents a design methodology for concept generation of a "parent/child" attachment system. The "child" is broadly defined as any device, part, or subsystem that will attach to any existing system, part, or device called the "parent." An inductive research process was used to study a variety of products, patents, and biological examples that exemplified the parent/child system. Common traits among these products were found and categorized as attachment principles in three different domains: mechanical, material, and field. The attachment principles within the mechanical domain and accompanying examples are the focus of this paper. As an example of the method, a case study is presented in which concepts for a bridge-mounted wind energy harvester are generated using the mechanical attachment principles derived from the methodology and TRIZ principles derived from Altshuller's matrix of contradictions.
Como-Lesko, N; Primavera, L H; Szeszko, P R
1994-08-01
This study investigated high school students' marijuana usage patterns in relation to their harmfulness ratings of 15 licit and illicit drugs, perceived negative consequences from using marijuana, and types of defense mechanisms employed. Subjects were classified into one of five pattern-of-use groups based on marijuana usage: principled nonusers, nonusers, light users, moderate users, and heavy users. Principled nonusers (individuals who have never used marijuana and would not do so if it were legalized) rated marijuana, hashish, cocaine, and alcohol as significantly more harmful than heavy users did. A cluster analysis of the drugs' harmfulness ratings was best fit by a three-cluster solution, with the clusters named medicinal drugs, recreational drugs, and hard drugs. In general, principled nonusers rated negative consequences from using marijuana as significantly more likely to occur than did other groups. Principled nonusers and heavy users utilized reversal from the Defense Mechanism Inventory, which includes repression and denial, significantly more than nonusers, indicating some trait common to the two extreme pattern-of-use groups.
Heisenberg's observability principle
NASA Astrophysics Data System (ADS)
Wolff, Johanna
2014-02-01
Werner Heisenberg's 1925 paper 'Quantum-theoretical re-interpretation of kinematic and mechanical relations' marks the beginning of quantum mechanics. Heisenberg famously claims that the paper is based on the idea that the new quantum mechanics should be 'founded exclusively upon relationships between quantities which in principle are observable'. My paper is an attempt to understand this observability principle, and to see whether its employment is philosophically defensible. Against interpretations of 'observability' along empiricist or positivist lines I argue that such readings are philosophically unsatisfying. Moreover, a careful comparison of Heisenberg's reinterpretation of classical kinematics with Einstein's argument against absolute simultaneity reveals that the positivist reading does not fit with Heisenberg's strategy in the paper. Instead the appeal to observability should be understood as a specific criticism of the causal inefficacy of orbital electron motion in Bohr's atomic model. I conclude that the tacit philosophical principle behind Heisenberg's argument is not a positivistic connection between observability and meaning, but the idea that a theory should not contain causally idle wheels.
NASA Astrophysics Data System (ADS)
Yan, Jiawei; Wang, Shizhuo; Xia, Ke; Ke, Youqi
2018-01-01
We present a first-principles analysis of interfacial disorder effects on spin-dependent tunneling statistics in thin Fe/MgO/Fe magnetic tunnel junctions. We find that interfacial disorder scattering can significantly modulate the tunneling statistics in the minority spin of the parallel configuration (PC), while all other spin channels remain dominated by the Poissonian process. For the minority-spin channel of the PC, interfacial disorder scattering favors the formation of resonant tunneling channels by lifting the limitation of symmetry conservation at low concentration, presenting an important sub-Poissonian process in the PC, but is destructive to the open channels at high concentration. We find that this important modulation of the tunneling statistics is independent of the type of interfacial disorder. A bimodal distribution function of transmission with disorder dependence is introduced and fits our first-principles results very well. Increasing the MgO thickness quickly changes the tunneling in the minority spin of the PC with disorder from a sub-Poissonian to a Poissonian-dominated process. Our results provide a sensitive detection method for ultralow concentrations of interfacial defects.
Khalid, Shahzad; Kappus, Brian; Weninger, Keith; Putterman, Seth
2012-03-09
A strong interaction between a nanosecond laser and a 70 μm radius sonoluminescing plasma is achieved. The overall response of the system results in a factor of 2 increase in temperature as determined by its spectrum. Images of the interaction reveal that light energy is absorbed and trapped in a region smaller than the sonoluminescence emitting region of the bubble for over 100 ns. We interpret this opacity and transport measurement as demonstrating that sonoluminescencing bubbles can be 1000 times more opaque than what follows from the Saha equation of statistical mechanics in the ideal plasma limit. To address this discrepancy, we suggest that the effects of strong Coulomb interactions are an essential component of a first principles theory of sonoluminescence.
Effect of slip-area scaling on the earthquake frequency-magnitude relationship
NASA Astrophysics Data System (ADS)
Senatorski, Piotr
2017-06-01
The earthquake frequency-magnitude relationship is considered from the maximum entropy principle (MEP) perspective. The MEP suggests sampling with constraints as a simple stochastic model of seismicity. The model is based on von Neumann's acceptance-rejection method, with the b-value as the parameter that breaks the symmetry between small and large earthquakes. The Gutenberg-Richter law's b-value forms a link between earthquake statistics and physics. A dependence between the b-value and the rupture area vs. slip scaling exponent is derived. This relationship enables us to explain the observed ranges of b-values for different types of earthquakes. Specifically, the different b-value ranges for tectonic and induced, hydraulic-fracturing seismicity are explained in terms of their different triggering mechanisms: the applied stress increase and fault strength reduction, respectively.
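The sampling scheme the abstract alludes to can be sketched in a few lines. The following is a hypothetical illustration (the b-value, magnitude window, and uniform proposal are assumptions for demonstration, not the paper's calibration) of von Neumann acceptance-rejection sampling that reproduces Gutenberg-Richter magnitude counts.

```python
import random

def sample_magnitudes(n, b=1.0, m_min=2.0, m_max=8.0, rng=random.Random(1)):
    """Von Neumann acceptance-rejection sampling of earthquake magnitudes.
    Proposals are uniform in [m_min, m_max]; the acceptance probability
    10**(-b*(m - m_min)) reproduces the Gutenberg-Richter law
    log10 N(>=m) = a - b*m. The b-value breaks the symmetry between small
    and large events: large magnitudes are rejected more often."""
    samples = []
    while len(samples) < n:
        m = rng.uniform(m_min, m_max)                 # propose
        if rng.random() < 10 ** (-b * (m - m_min)):   # accept/reject
            samples.append(m)
    return samples

mags = sample_magnitudes(10000)
print(sum(1 for m in mags if m >= 4.0))  # counts fall ~10x per magnitude unit
```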
Liu, M; Wei, L; Zhang, J
2006-01-01
Missing data in clinical trials are inevitable. We highlight the ICH guidelines and CPMP points to consider on missing data. Specifically, we outline how missing data issues should be considered when designing, planning and conducting studies so as to minimize their impact. We also go beyond the coverage of the above two documents: we provide a more detailed review of the basic concepts of missing data and frequently used terminologies, give examples of typical missing data mechanisms, and discuss technical details and literature for several frequently used statistical methods and associated software. Finally, we provide a case study in which the principles outlined in this paper are applied to one clinical program at the protocol design, data analysis plan and other stages of a clinical trial.
Ab initio Study on Ionization Energies of 3-Amino-1-propanol
NASA Astrophysics Data System (ADS)
Wang, Ke-dong; Jia, Ying-bin; Lai, Zhen-jiang; Liu, Yu-fang
2011-06-01
Fourteen conformers of 3-amino-1-propanol, minima on the potential energy surface, are examined at the MP2/6-311++G** level. Their relative energies calculated at the B3LYP, MP3 and MP4 levels of theory indicate that the two most stable conformers display intramolecular OH···N hydrogen bonds. The vertical ionization energies of these conformers, calculated with ab initio electron propagator theory in the P3/aug-cc-pVTZ approximation, are in agreement with experimental data from photoelectron spectroscopy. Natural bond orbital analyses were used to explain the differences in the ionization energies of the highest occupied molecular orbitals of the conformers. Combined with statistical mechanics principles, conformational distributions at various temperatures are obtained and the temperature dependence of the photoelectron spectra is interpreted.
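The final step, converting relative conformer energies into temperature-dependent populations, is a standard Boltzmann-weighting exercise. The sketch below uses invented conformer labels and energies (not the MP2 values from the paper) purely to show the mechanics.

```python
import math

# Hypothetical relative conformer energies in kJ/mol (illustrative values only)
energies = {"conf_A": 0.0, "conf_B": 0.7, "conf_C": 3.1, "conf_D": 4.2}

R = 8.314e-3  # gas constant in kJ/(mol*K)

def boltzmann_populations(energies, T):
    """Boltzmann-weighted conformational distribution at temperature T:
    p_i = exp(-E_i / RT) / sum_j exp(-E_j / RT)."""
    weights = {c: math.exp(-E / (R * T)) for c, E in energies.items()}
    Z = sum(weights.values())
    return {c: w / Z for c, w in weights.items()}

# Populations shift toward higher-energy conformers as T rises
for T in (200.0, 298.15, 400.0):
    print(T, boltzmann_populations(energies, T))
```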
Auger recombination in sodium iodide
NASA Astrophysics Data System (ADS)
McAllister, Andrew; Kioupakis, Emmanouil; Åberg, Daniel; Schleife, André
2014-03-01
Scintillators are an important tool used to detect high energy radiation - both in the interest of national security and in medicine. However, scintillator detectors currently suffer from lower energy resolutions than expected from basic counting statistics. This has been attributed to non-proportional light yield compared to incoming radiation, but the specific mechanism for this non-proportionality has not been identified. Auger recombination is a non-radiative process that could be contributing to the non-proportionality of scintillating materials. Auger recombination comes in two types - direct and phonon-assisted. We have used first-principles calculations to study Auger recombination in sodium iodide, a well characterized scintillating material. Our findings indicate that phonon-assisted Auger recombination is stronger in sodium iodide than direct Auger recombination. Computational resources provided by LLNL and NERSC. Funding provided by NA-22.
Many-body formalism for fermions: The partition function
NASA Astrophysics Data System (ADS)
Watson, D. K.
2017-09-01
The partition function, a fundamental tenet in statistical thermodynamics, contains in principle all thermodynamic information about a system. It encapsulates both microscopic information through the quantum energy levels and statistical information from the partitioning of the particles among the available energy levels. For identical particles, this statistical accounting is complicated by the symmetry requirements of the allowed quantum states. In particular, for Fermi systems, the enforcement of the Pauli principle is typically a numerically demanding task, responsible for much of the cost of the calculations. The interplay of these three elements—the structure of the many-body spectrum, the statistical partitioning of the N particles among the available levels, and the enforcement of the Pauli principle—drives the behavior of mesoscopic and macroscopic Fermi systems. In this paper, we develop an approach for the determination of the partition function, a numerically difficult task, for systems of strongly interacting identical fermions and apply it to a model system of harmonically confined, harmonically interacting fermions. This approach uses a recently introduced many-body method that is an extension of the symmetry-invariant perturbation method (SPT) originally developed for bosons. It uses group theory and graphical techniques to avoid the heavy computational demands of conventional many-body methods which typically scale exponentially with the number of particles. The SPT application of the Pauli principle is trivial to implement since it is done "on paper" by imposing restrictions on the normal-mode quantum numbers at first order in the perturbation. The method is applied through first order and represents an extension of the SPT method to excited states. Our method of determining the partition function and various thermodynamic quantities is accurate and efficient and has the potential to yield interesting insight into the role played by the Pauli principle and the influence of large degeneracies on the emergence of the thermodynamic behavior of large-N systems.
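The central object here, Z = Σ_i g_i exp(−E_i/kT), is easy to evaluate once a spectrum and its degeneracies are in hand. The sketch below uses an invented toy spectrum (not the SPT levels of the paper) to show how the partition function yields the mean energy and entropy.

```python
import numpy as np

def thermodynamics(levels, degeneracies, T):
    """Canonical partition function Z = sum_i g_i exp(-E_i/kT) for a toy
    spectrum, with the mean energy computed as a Boltzmann average and the
    entropy obtained from F = -kT ln Z (units with k_B = 1)."""
    beta = 1.0 / T
    E = np.asarray(levels, float)
    g = np.asarray(degeneracies, float)
    w = g * np.exp(-beta * (E - E.min()))   # shift energies for stability
    Z_shifted = w.sum()
    U = (w * E).sum() / Z_shifted           # internal energy <E>
    S = np.log(Z_shifted) + beta * (U - E.min())  # entropy in units of k_B
    return Z_shifted, U, S

# Toy spectrum: evenly spaced levels with growing degeneracy
Z, U, S = thermodynamics(levels=[0, 1, 2, 3], degeneracies=[1, 3, 5, 7], T=2.0)
print(Z, U, S)
```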
Higher-Education Budgeting at the State Level: Concepts and Principles.
ERIC Educational Resources Information Center
Jones, Dennis P.
New approaches to allocating state resources to colleges are discussed. Budgeting and resource allocation principles are considered that: (1) reflect the unique context of higher education; (2) are consistent with sound budgeting and management principles; and (3) represent institutional mechanisms applied at the state level rather than approaches…
Can Evolutionary Principles Explain Patterns of Family Violence?
ERIC Educational Resources Information Center
Archer, John
2013-01-01
The article's aim is to evaluate the application of the evolutionary principles of kin selection, reproductive value, and resource holding power to the understanding of family violence. The principles are described in relation to specific predictions and the mechanisms underlying these. Predictions are evaluated for physical violence perpetrated…
Mechanical Design of Downhole Tractor Based on Two-Way Self-locking Mechanism
NASA Astrophysics Data System (ADS)
Fang, Delei; Shang, Jianzhong; Luo, Zirong; Wu, Guoheng; Liu, Yiying
2018-03-01
Based on horizontal-well tractor technology, a downhole tractor that realizes a two-way self-locking function was developed. To meet the needs of horizontal-well logging for small size, high traction and high reliability, a unique heart-shaped cam was selected as the locking mechanism. The motion principle of the telescopic downhole tractor, the design of the mechanical structure and the locking principle of the locking mechanism are analyzed. Mathematical expressions for the traction are obtained by mechanical analysis of the parallel support rod in the locking mechanism. The force analysis and contour design of the heart-shaped cam are performed, laying the foundation for the development of a tractor prototype.
Principles of Statistics: What the Sports Medicine Professional Needs to Know.
Riemann, Bryan L; Lininger, Monica R
2018-07-01
Understanding the results and statistics reported in original research remains a large challenge for many sports medicine practitioners and, in turn, may be one of the biggest barriers to integrating research into sports medicine practice. The purpose of this article is to provide the minimal essentials a sports medicine practitioner needs to know about interpreting statistics and research results to facilitate the incorporation of the latest evidence into practice. Topics covered include the difference between statistical significance and clinical meaningfulness; effect sizes and confidence intervals; reliability statistics, including the minimal detectable difference and minimal important difference; and statistical power.
1983-12-01
analysis; such work is not reported here. It seems possible that a robust principal component analysis may be informative (see Gnanadesikan (1977)…). Statistics in Atmospheric Sciences, American Meteorological Soc., Boston, Mass. (1979) pp. 46-48. Gnanadesikan, R., Methods for Statistical Data…
NAUSEA and the Principle of Supplementarity of Damping and Isolation in Noise Control.
1980-02-01
New approaches and uses of the statistical energy analysis (NAUSEA) have been considered and developed in recent months. The advances were made…possible in that the requirement, in the old statistical energy analysis, that the dynamic systems be highly reverberant and the couplings between the…analytical consideration in terms of the statistical energy analysis (SEA). A brief discussion and simple examples that relate to these recent advances
A basic introduction to statistics for the orthopaedic surgeon.
Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef
2012-02-01
Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values.
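As a concrete companion to these concepts, the sketch below runs a two-sample t-test on invented data (group sizes, means, and spreads are hypothetical) and ties the resulting p-value to the Type I/Type II error trade-off mentioned above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical outcome scores for two treatment groups
group_a = rng.normal(loc=52.0, scale=8.0, size=30)
group_b = rng.normal(loc=47.0, scale=8.0, size=30)

# Two-sample t-test: the null hypothesis H0 says the group means are equal
t, p = stats.ttest_ind(group_a, group_b)
print(f"t = {t:.2f}, p = {p:.4f}")

# With alpha = 0.05, rejecting H0 when p < alpha caps the Type I error
# (false positive) rate at 5%; failing to reject when a real difference
# exists is a Type II error, which is controlled via sample size and power.
if p < 0.05:
    print("Reject H0 at the 5% level")
```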
Physics of Life: A Model for Non-Newtonian Properties of Living Systems
NASA Technical Reports Server (NTRS)
Zak, Michail
2010-01-01
This innovation proposes the reconciliation of the evolution of life with the second law of thermodynamics via the introduction of the First Principle for modeling the behavior of living systems. The structure of the model is quantum-inspired: it acquires the topology of the Madelung equation, in which the quantum potential is replaced with the information potential. As a result, the model captures the most fundamental property of life: progressive evolution, i.e. the ability to evolve from disorder to order without any external interference. The mathematical structure of the model can be obtained from the Newtonian equations of motion (representing the motor dynamics) coupled with the corresponding Liouville equation (representing the mental dynamics) via information forces. All these specific non-Newtonian properties equip the model with levels of complexity that match the complexity of life, and that makes the model applicable for describing the behavior of ecological, social, and economic systems. Rather than addressing the six aspects of life (organization, metabolism, growth, adaptation, response to stimuli, and reproduction), this work focuses only on biosignatures, i.e. the mechanical invariants of life, and in particular the geometry and kinematics of the behavior of living things. Living things obey the First Principles of Newtonian mechanics. One main objective of this model is to extend the First Principles of classical physics to include the phenomenological behavior of living systems; to develop a new mathematical formalism within the framework of classical dynamics that would allow one to capture the specific properties of natural or artificial living systems, such as the formation of a collective mind based upon abstract images of the selves and non-selves; the exploitation of this collective mind for communications and predictions of future expected characteristics of evolution; and for making decisions and implementing the corresponding corrections if the expected scenario is different from the originally planned one. This approach postulates that even a primitive living species possesses additional, non-Newtonian properties that are not included in the laws of Newtonian or statistical mechanics. These properties follow from a privileged ability of living systems to possess a self-image (a concept introduced in psychology) and to interact with it. The proposed mathematical system is based on the coupling of the classical dynamical system representing the motor dynamics with the corresponding Liouville equation describing the evolution of initial uncertainties in terms of the probability density and representing the mental dynamics. The coupling is implemented by information-based supervising forces that can be associated with self-awareness. These forces fundamentally change the pattern of the probability evolution and, therefore, lead to a major departure of the behavior of living systems from the patterns of both Newtonian and statistical mechanics. This innovation is meant to capture the signature of life based only on observable behavior, not on any biochemistry. This will not prevent the use of this model for developing artificial living systems, as well as for studying some general properties of the behavior of natural, living systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
... money balance, or account activity with the Participant or at a clearing firm that provides clearing... principles of trade, to remove impediments to and perfect the mechanism of a free and open market and a... principles of trade and removes impediments to, and perfects the mechanism of, a free and open market and a...
NASA Astrophysics Data System (ADS)
Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.
2018-01-01
We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.
On cortical coding of vocal communication sounds in primates
NASA Astrophysics Data System (ADS)
Wang, Xiaoqin
2000-10-01
Understanding how the brain processes vocal communication sounds is one of the most challenging problems in neuroscience. Our understanding of how the cortex accomplishes this unique task should greatly facilitate our understanding of cortical mechanisms in general. Perception of species-specific communication sounds is an important aspect of the auditory behavior of many animal species and is crucial for their social interactions, reproductive success, and survival. The principles of neural representations of these behaviorally important sounds in the cerebral cortex have direct implications for the neural mechanisms underlying human speech perception. Our progress in this area has been relatively slow, compared with our understanding of other auditory functions such as echolocation and sound localization. This article discusses previous and current studies in this field, with emphasis on nonhuman primates, and proposes a conceptual platform to further our exploration of this frontier. It is argued that the prerequisite condition for understanding cortical mechanisms underlying communication sound perception and production is an appropriate animal model. Three issues are central to this work: (i) neural encoding of statistical structure of communication sounds, (ii) the role of behavioral relevance in shaping cortical representations, and (iii) sensory-motor interactions between vocal production and perception systems.
Aftershock Energy Distribution by Statistical Mechanics Approach
NASA Astrophysics Data System (ADS)
Daminelli, R.; Marcellini, A.
2015-12-01
The aim of our work is to find the most probable distribution of the energy of aftershocks. We start from one of the fundamental principles of statistical mechanics, which, in the case of aftershock sequences, can be expressed as: the greater the number of different ways in which the energy of aftershocks can be arranged among the energy cells in phase space, the more probable the distribution. We assume that each cell in phase space has the same possibility of being occupied, and that more than one cell in the phase space can have the same energy. Since seismic energy is proportional to products of different parameters, a number of different combinations of parameters can produce the same energy (e.g., different combinations of stress drop and fault area can release the same seismic energy). Let us assume that there are g_i cells in the aftershock phase space characterised by the same released energy ε_i. We can therefore assume that Maxwell-Boltzmann statistics can be applied to aftershock sequences, with the proviso that the judgment on the validity of this hypothesis is the agreement with the data. The aftershock energy distribution can then be written as follows: n(ε) = A g(ε) exp(-βε), where n(ε) is the number of aftershocks with energy ε, and A and β are constants. Under the above hypothesis, we can assume g(ε) is proportional to ε. We selected and analysed different aftershock sequences (data extracted from the earthquake catalogs of SCEC, of INGV-CNT and other institutions) with a minimum retained magnitude ML = 2 (in some cases ML = 2.6) and a time window of 35 days. The results of our model are in agreement with the data, except in the very low energy band, where the model moderately overestimates.
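Since g(ε) ∝ ε makes n(ε) = A ε exp(−βε) a gamma distribution with shape 2, the fit can be checked on synthetic data in a few lines. The sketch below is an illustration with invented parameters, not the catalog analysis of the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

# n(eps) = A*eps*exp(-beta*eps) is a gamma distribution with shape k = 2
# (since g(eps) is proportional to eps), so the model can be exercised
# on synthetic energies.
beta_true = 0.5
energies = rng.gamma(shape=2.0, scale=1.0 / beta_true, size=5000)

# Maximum-likelihood estimate for fixed shape k = 2: mean = k / beta
beta_hat = 2.0 / energies.mean()
print(f"true beta = {beta_true}, estimated beta = {beta_hat:.3f}")

# Binned counts vs. the model prediction A*eps*exp(-beta*eps)
counts, edges = np.histogram(energies, bins=30)
centers = 0.5 * (edges[:-1] + edges[1:])
model = centers * np.exp(-beta_hat * centers)
model *= counts.sum() / model.sum()   # normalize to the total count
print(np.round(counts[:5]), np.round(model[:5]))
```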
NASA Astrophysics Data System (ADS)
Garrett, T. J.; Alva, S.; Glenn, I. B.; Krueger, S. K.
2015-12-01
There are two possible approaches for parameterizing sub-grid cloud dynamics in a coarser-grid model. The most common is to use a fine-scale model to explicitly resolve the mechanistic details of clouds to the best extent possible, and then to parameterize the resulting cloud state for the coarser grid. A second is to invoke physical intuition and some very general theoretical principles from equilibrium statistical mechanics. This approach avoids any requirement to resolve time-dependent processes in order to arrive at a suitable solution. The second approach is widely used elsewhere in the atmospheric sciences: for example, the Planck function for blackbody radiation is derived this way, where no mention is made of the complexities of modeling a large ensemble of time-dependent radiation-dipole interactions in order to obtain the "grid-scale" spectrum of thermal emission by the blackbody as a whole. We find that this statistical approach may be equally suitable for modeling convective clouds. Specifically, we make the physical argument that the dissipation of buoyant energy in convective clouds occurs through mixing across a cloud perimeter. From thermodynamic reasoning, one might then anticipate that vertically stacked isentropic surfaces are characterized by a power law d ln N / d ln P = -1, where N(P) is the number of clouds of perimeter P. In a Giga-LES simulation of convective clouds within a 100 km square domain we find that such a power law does appear to characterize simulated cloud perimeters along isentropes, provided a sufficiently large cloudy sample. The suggestion is that it may be possible to parameterize certain important aspects of cloud state without appealing to computationally expensive dynamic simulations.
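Testing a prediction like d ln N / d ln P = −1 against simulated cloud perimeters reduces to estimating a log-log slope. The sketch below is a generic illustration on synthetic perimeters (the bin count and sample are assumptions), not the Giga-LES analysis itself.

```python
import numpy as np

def loglog_slope(perimeters, bins=20):
    """Estimate the exponent alpha in N(P) ~ P**alpha by least-squares
    regression of log density against log P over logarithmically spaced
    bins. The theory sketched above predicts alpha = -1 along isentropes."""
    p = np.asarray(perimeters, float)
    edges = np.logspace(np.log10(p.min()), np.log10(p.max()), bins + 1)
    counts, _ = np.histogram(p, bins=edges)
    widths = np.diff(edges)
    centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centers
    density = counts / widths
    mask = counts > 0
    slope, _ = np.polyfit(np.log(centers[mask]), np.log(density[mask]), 1)
    return slope

# Synthetic perimeters drawn with density ~ 1/P on [1, 1000]
rng = np.random.default_rng(7)
sample = np.exp(rng.uniform(np.log(1.0), np.log(1000.0), size=20000))
print(loglog_slope(sample))   # should be close to -1
```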
ERIC Educational Resources Information Center
School Science Review, 1989
1989-01-01
Twenty-two activities are presented. Topics include: acid rain, microcomputers, fish farming, school-industry research projects, enzymes, equilibrium, assessment, science equipment, logic, Archimedes principle, electronics, optics, and statistics. (CW)
Role of differential physical properties in the collective mechanics and dynamics of tissues
NASA Astrophysics Data System (ADS)
Das, Moumita
Living cells and tissues are highly mechanically sensitive and active. Mechanical stimuli influence the shape, motility, and functions of cells, modulate the behavior of tissues, and play a key role in several diseases. In this talk I will discuss how collective biophysical properties of tissues emerge from the interplay between differential mechanical properties and statistical physics of underlying components, focusing on two complementary tissue types whose properties are primarily determined by (1) the extracellular matrix (ECM), and (2) individual and collective cell properties. I will start with the structure-mechanics-function relationships in articular cartilage (AC), a soft tissue that has very few cells, and its mechanical response is primarily due to its ECM. AC is a remarkable tissue: it can support loads exceeding ten times our body weight and bear 60+ years of daily mechanical loading despite having minimal regenerative capacity. I will discuss the biophysical principles underlying this exceptional mechanical response using the framework of rigidity percolation theory, and compare our predictions with experiments done by our collaborators. Next I will discuss ongoing theoretical work on how the differences in cell mechanics, motility, adhesion, and proliferation in a co-culture of breast cancer cells and healthy breast epithelial cells may modulate experimentally observed differential migration and segregation. Our results may provide insights into the mechanobiology of tissues with cell populations with different physical properties present together such as during the formation of embryos or the initiation of tumors. This work was partially supported by a Cottrell College Science Award.
Principles of recruitment and retention in clinical trials.
Aitken, Leanne; Gallagher, Robyn; Madronio, Christine
2003-12-01
Efficient and effective recruitment and retention of participants is the largest single component of the study workload and forms an essential component in the conduct of clinical trials. In this paper, we present five principles to guide the processes of both recruitment and retention. These principles include the selection of an appropriate population to adequately answer the research question, followed by the establishment of a sampling process that accurately represents that population. Creation of systematic and effective recruitment mechanisms should be supported by implementation of follow-up mechanisms that promote participant retention. Finally, all activities related to recruitment and retention must be conducted within the framework of ethics and privacy regulations. Adherence to these principles will assist the researcher in achieving the goals of the study within the available resources.
General cognitive principles for learning structure in time and space.
Goldstein, Michael H; Waterfall, Heidi R; Lotem, Arnon; Halpern, Joseph Y; Schwade, Jennifer A; Onnis, Luca; Edelman, Shimon
2010-06-01
How are hierarchically structured sequences of objects, events or actions learned from experience and represented in the brain? When several streams of regularities present themselves, which will be learned and which ignored? Can statistical regularities take effect on their own, or are additional factors such as behavioral outcomes expected to influence statistical learning? Answers to these questions are starting to emerge through a convergence of findings from naturalistic observations, behavioral experiments, neurobiological studies, and computational analyses and simulations. We propose that a small set of principles are at work in every situation that involves learning of structure from patterns of experience and outline a general framework that accounts for such learning.
Exploring the Action Landscape via Trial World-Lines
ERIC Educational Resources Information Center
Joglekar, Yogesh N.; Tham, Weng Kian
2011-01-01
The Hamilton action principle, also known as the principle of least action, and Lagrange equations are an integral part of intermediate and advanced undergraduate mechanics. Although the Hamilton principle is oft stated as "the action for any nearby trial world-line is greater than the action for the classical world-line," the landscape of action…
ERIC Educational Resources Information Center
Lee, Hollylynne S.; Lovett, Jennifer N.; Mojica, Gemma M.
2017-01-01
With online learning becoming a more viable option for teachers to develop their expertise, our report shares one such effort focused on improving the teaching of statistics. We share design principles and learning opportunities, as well as discuss specific impacts evident in classroom teachers' course activity concerning changes to their beliefs…
An Optimization Principle for Deriving Nonequilibrium Statistical Models of Hamiltonian Dynamics
NASA Astrophysics Data System (ADS)
Turkington, Bruce
2013-08-01
A general method for deriving closed reduced models of Hamiltonian dynamical systems is developed using techniques from optimization and statistical estimation. Given a vector of resolved variables, selected to describe the macroscopic state of the system, a family of quasi-equilibrium probability densities on phase space corresponding to the resolved variables is employed as a statistical model, and the evolution of the mean resolved vector is estimated by optimizing over paths of these densities. Specifically, a cost function is constructed to quantify the lack-of-fit to the microscopic dynamics of any feasible path of densities from the statistical model; it is an ensemble-averaged, weighted, squared-norm of the residual that results from submitting the path of densities to the Liouville equation. The path that minimizes the time integral of the cost function determines the best-fit evolution of the mean resolved vector. The closed reduced equations satisfied by the optimal path are derived by Hamilton-Jacobi theory. When expressed in terms of the macroscopic variables, these equations have the generic structure of governing equations for nonequilibrium thermodynamics. In particular, the value function for the optimization principle coincides with the dissipation potential that defines the relation between thermodynamic forces and fluxes. The adjustable closure parameters in the best-fit reduced equations depend explicitly on the arbitrary weights that enter into the lack-of-fit cost function. Two particular model reductions are outlined to illustrate the general method. In each example the set of weights in the optimization principle contracts into a single effective closure parameter.
2009-12-01
events. Work associated with aperiodic tasks has the same statistical behavior and the same timing requirements. The timing deadlines are soft. • Sporadic…answers, but it is possible to calculate how precise the estimates are. Simulation-based performance analysis of a model includes a statistical…to evaluate all possible states in a timely manner. This is the principal reason for resorting to simulation and statistical analysis to evaluate
A variational principle for compressible fluid mechanics: Discussion of the multi-dimensional theory
NASA Technical Reports Server (NTRS)
Prozan, R. J.
1982-01-01
The variational principle for compressible fluid mechanics previously introduced is extended to two dimensional flow. The analysis is stable, exactly conservative, adaptable to coarse or fine grids, and very fast. Solutions for two dimensional problems are included. The excellent behavior and results lend further credence to the variational concept and its applicability to the numerical analysis of complex flow fields.
Protein Multifunctionality: Principles and Mechanisms
Zaretsky, Joseph Z.; Wreschner, Daniel H.
2008-01-01
In this review, the nature of protein multifunctionality is analyzed. The first part of the review discusses the principles of the structural/functional organization of proteins. The second part analyzes the main mechanisms involved in the development of multiple functions in a single gene product. The last part presents a number of examples showing that multifunctionality is a basic feature of biologically active proteins. PMID:21566747
Engineering principles to assure compatible docking between future spacecraft of USA and USSR
NASA Technical Reports Server (NTRS)
Johnson, C. C.
1973-01-01
An androgynous peripheral type docking mechanism concept selected by the U.S. and the USSR is described. The rationale supporting the selection of the concept, the mechanical principles inherent to the concept, and the probable nature of future designs stemming from the concept are discussed. Operational situations prior to docking, impact conditions, energy absorption, and structural joining of two spacecraft are examined.
Econ Simulation Cited as Success
ERIC Educational Resources Information Center
Workman, Robert; Maher, John
1973-01-01
A brief description of a computerized economics simulation model which provides students with an opportunity to apply microeconomic principles along with elementary accounting and statistical techniques. (Author/AK)
Essentials of the disclosure review process: a federal perspective.
Zarate, Alvan O; Zayatz, Laura
2006-09-01
Many researchers need to make arrangements to share de-identified electronic data files. However, the ways in which respondent identity may be protected are not well understood or are assumed to be the special province of large statistical agencies or specialized statisticians. Approaches to data sharing and protecting respondent identity have been pioneered by federal agencies which gather data vital to political and economic decision making. These agencies are required by statutory law both to assure confidentiality and to share data in usable form with other governmental agencies and with scholars who perform needed analyses of those data. The basic principles of disclosure limitation developed by the Census Bureau, the National Center for Health Statistics, and other federal agencies are fundamental to meeting new funding requirements to share and de-identify data, and are often referred to in the literature on data sharing. We describe how these principles are employed by the Disclosure Review Boards (DRBs) of these two agencies, and then state these principles in more general terms that are applicable to any disclosure review process. The kinds of data that academic institutions share may call for less complex or stringent DRBs and specific nondisclosure procedures different from those employed by federal agencies, but the same general principles apply. Specific application of these six principles by non-government researchers will depend on the nature of their data, their own institutional resources, and the likely future usefulness of their data.
Acar, Nihat; Karakasli, Ahmet; Karaarslan, Ahmet; Mas, Nermin Ng; Hapa, Onur
2017-01-01
Volumetric measurements of benign tumors enable surgeons to trace volume changes during follow-up periods. For a volumetric measurement technique to be applicable, it should be easy, rapid, and inexpensive and should carry a high interobserver reliability. We aimed to assess the interobserver reliability of a volumetric measurement technique using the Cavalieri principle of stereological methods. The computerized tomography (CT) scans of 15 patients with a histopathologically confirmed diagnosis of enchondroma, with varying tumor sizes and localizations, were retrospectively reviewed to evaluate the interobserver reliability of the volumetric stereological measurement with the Cavalieri principle, V = t × [(SU × d)/SL]² × ΣP. The volumes of the 15 tumors collected by the observers are demonstrated in Table 1. There was no statistically significant difference between the first and second observers (p = 0.000 and intraclass correlation coefficient = 0.970) or between the first and third observers (p = 0.000 and intraclass correlation coefficient = 0.981). No statistically significant difference was detected between the second and third observers (p = 0.000 and intraclass correlation coefficient = 0.976). The Cavalieri principle with the stereological technique using CT scans is an easy, rapid, and inexpensive technique for the volumetric evaluation of enchondromas with trustworthy interobserver reliability.
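The point-counting estimator quoted above is straightforward to compute. The sketch below mirrors the stated formula with invented slice counts and grid settings (the interpretation of SU/SL as a scale correction is an assumption), purely to show the arithmetic.

```python
def cavalieri_volume(point_counts, t, d, su, sl):
    """Volume by the Cavalieri principle with point counting:
    V = t * ((su * d) / sl)**2 * sum(P), where t is the slice interval,
    d the grid spacing on the displayed image, su/sl an assumed scale
    correction, and point_counts the number of grid points hitting the
    tumor on each CT slice. All values below are illustrative, not taken
    from the study."""
    area_per_point = ((su * d) / sl) ** 2  # area represented by one grid point
    return t * area_per_point * sum(point_counts)

# Example: 5 mm slice interval, 5 mm grid, scale ratio su/sl = 1
volume_mm3 = cavalieri_volume([4, 9, 14, 11, 6], t=5.0, d=5.0, su=1.0, sl=1.0)
print(volume_mm3 / 1000.0, "cm^3")
```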
Spontaneous ultraweak photon emission from biological systems and the endogenous light field.
Schwabl, Herbert; Klima, Herbert
2005-04-01
Still one of the most astonishing biological electromagnetic phenomena is the ultraweak photon emission (UPE) from living systems. Organisms and tissues spontaneously emit measurable intensities of light, i.e. photons in the visible part of the electromagnetic spectrum (380-780 nm), in the range from 1 to 1,000 photons s⁻¹ cm⁻², depending on their condition and vitality. It is important not to confuse UPE from living systems with other biogenic light-emitting processes such as bioluminescence or chemiluminescence. This article examines the empirical phenomenon of UPE with basic considerations from physics on the quantum nature of photons. This leads to a description of the non-thermal origin of this radiation, in good correspondence with the modern understanding of life phenomena as dissipative processes far from thermodynamic equilibrium. UPE also supports the understanding of life-sustaining processes as basically driven by electromagnetic fields. The basic features of UPE, like intensity and spectral distribution, are known in principle for many experimental situations. The UPE of human leukocytes contributes to an endogenous light field of about 10¹¹ photons s⁻¹, which can be influenced by certain factors. Further research is needed to reveal the statistical properties of UPE and in consequence to answer questions about the underlying mechanics of the biological system. In principle, the statistical properties of UPE allow one to reconstruct the phase-space dynamics of the light-emitting structures. Many open questions remain before a proper understanding of the electromagnetic interaction of the human organism can be achieved: which structures act as receptors and emitters for electromagnetic radiation? How is electromagnetic information received and processed within cells?
Fundamental Principles of Proper Space Kinematics
NASA Astrophysics Data System (ADS)
Wade, Sean
It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty, the use of the imaginary number i is restricted to application in the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.
First-principles investigation of polarization and ion conduction mechanisms in hydroxyapatite
NASA Astrophysics Data System (ADS)
Kasamatsu, Shusuke; Sugino, Osamu
We report a first-principles simulation of polarization mechanisms in hydroxyapatite to explain the underlying mechanism behind the reported ion conductivities and polarization under electrical poling at elevated temperatures. It is found that ion conduction occurs mainly in the column of OH⁻ ions along the c-axis through a combination of the flipping of OH⁻ ions, exchange of proton vacancies between OH⁻ ions, and the hopping of the OH⁻ vacancy. The calculated activation energies are consistent with those found in conductivity measurements and thermally stimulated depolarization current measurements.
Grover Search and the No-Signaling Principle
NASA Astrophysics Data System (ADS)
Bao, Ning; Bouland, Adam; Jordan, Stephen P.
2016-09-01
Two of the key properties of quantum physics are the no-signaling principle and the Grover search lower bound. That is, despite admitting stronger-than-classical correlations, quantum mechanics does not imply superluminal signaling, and despite a form of exponential parallelism, quantum mechanics does not imply polynomial-time brute force solution of NP-complete problems. Here, we investigate the degree to which these two properties are connected. We examine four classes of deviations from quantum mechanics, for which we draw inspiration from the literature on the black hole information paradox. We show that in these models, the physical resources required to send a superluminal signal scale polynomially with the resources needed to speed up Grover's algorithm. Hence the no-signaling principle is equivalent to the inability to solve NP-hard problems efficiently by brute force within the classes of theories analyzed.
New approach in the quantum statistical parton distribution
NASA Astrophysics Data System (ADS)
Sohaily, Sozha; Vaziri (Khamedi), Mohammad
2017-12-01
An attempt to find simple parton distribution functions (PDFs) based on a quantum statistical approach is presented. The PDFs described by the statistical model have very interesting physical properties which help in understanding the structure of partons. The longitudinal part of the distribution functions is given by applying the maximum entropy principle. An interesting and simple approach to determining the statistical variables exactly, without fitting or fixing parameters, is surveyed. Analytic expressions for the x-dependent PDFs are obtained in the whole x region [0, 1], and the computed distributions are consistent with experimental observations. The agreement with experimental data gives a robust confirmation of the presented simple statistical model.
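The maximum entropy step can be illustrated with a toy stand-in: among distributions on a grid of momentum fractions with one fixed moment, the entropy maximizer is exponential in x, and the Lagrange multiplier can be solved for numerically. The constraint value and grid below are invented, not the model's actual inputs.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy toy model: among distributions on a grid x in (0, 1]
# with a fixed mean <x> (a stand-in for a momentum-fraction constraint),
# entropy is maximized by p(x) ~ exp(-lam * x); solve for lam numerically.
x = np.linspace(1e-3, 1.0, 500)

def mean_for(lam):
    w = np.exp(-lam * x)
    return (x * w).sum() / w.sum()

target_mean = 0.2  # hypothetical constraint value
lam = brentq(lambda L: mean_for(L) - target_mean, -50.0, 200.0)
p = np.exp(-lam * x)
p /= p.sum()
print(f"lambda = {lam:.3f}, <x> = {(x * p).sum():.3f}")
```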
NASA Astrophysics Data System (ADS)
Artrith, Nongnuch; Urban, Alexander; Ceder, Gerbrand
2018-06-01
The atomistic modeling of amorphous materials requires structure sizes and sampling statistics that are challenging to achieve with first-principles methods. Here, we propose a methodology to speed up the sampling of amorphous and disordered materials using a combination of a genetic algorithm and a specialized machine-learning potential based on artificial neural networks (ANNs). We show for the example of the amorphous LiSi alloy that around 1000 first-principles calculations are sufficient for the ANN-potential assisted sampling of low-energy atomic configurations in the entire amorphous LixSi phase space. The obtained phase diagram is validated by comparison with the results from an extensive sampling of LixSi configurations using molecular dynamics simulations and a general ANN potential trained on ~45,000 first-principles calculations. This demonstrates the utility of the approach for the first-principles modeling of amorphous materials.
Local quantum measurement and no-signaling imply quantum correlations.
Barnum, H; Beigi, S; Boixo, S; Elliott, M B; Wehner, S
2010-04-09
We show that, assuming that quantum mechanics holds locally, the finite speed of information is the principle that limits all possible correlations between distant parties to be quantum mechanical as well. Local quantum mechanics means that a Hilbert space is assigned to each party, and then all local positive-operator-valued measurements are (in principle) available; however, the joint system is not necessarily described by a Hilbert space. In particular, we do not assume the tensor product formalism between the joint systems. Our result shows that if any experiment would give nonlocal correlations beyond quantum mechanics, quantum theory would be invalidated even locally.
Statistical Contact Model for Confined Molecules
NASA Astrophysics Data System (ADS)
Santamaria, Ruben; de la Paz, Antonio Alvarez; Roskop, Luke; Adamowicz, Ludwik
2016-08-01
A theory that describes in a realistic form a system of atoms under the effects of temperature and confinement is presented. The theory departs from a Lagrangian of the Zwanzig type and contains the main ingredients for describing a system of atoms immersed in a heat bath that is also formed by atoms. The equations of motion are derived according to Lagrangian mechanics. The application of statistical mechanics to describe the bulk effects greatly reduces the complexity of the equations. The resultant equations of motion are of the Langevin type, with the viscosity and the temperature of the heat reservoir able to influence the trajectories of the particles. The pressure effects are introduced mechanically by using a container with an atomic structure immersed in the heat bath. The relevant variables that determine the equation of state are included in the formulation. The theory is illustrated by the derivation of the equation of state for a system of 76 atoms confined inside a 180-atom fullerene-like cage immersed in a fluid forming the heat bath at a temperature of 350 K and with a friction coefficient of 3.0 ps⁻¹. The atoms are of the type believed to form the cores of the planets Uranus and Neptune. The dynamic and static pressures of the confined system are varied in the 3-5 kbar and 2-30 Mbar ranges, respectively. The formulation can equally be used to analyze chemical reactions under specific conditions of pressure and temperature, to determine the structure of clusters with their corresponding equation of state, to study the conditions for hydrogen storage, etc. The theory is consistent with the principles of thermodynamics and is intrinsically ergodic, of general use, and the first of its kind.
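The Langevin-type equations of motion described here can be sketched with a simple integrator. The code below is a crude Euler-Maruyama illustration (the harmonic confining force and reduced units are assumptions; only the friction coefficient, 3.0 ps⁻¹, and the particle count echo the abstract), not the paper's actual scheme.

```python
import numpy as np

def langevin_step(x, v, force, dt, gamma, T, mass=1.0, kB=1.0, rng=None):
    """One Euler-Maruyama step of the Langevin equation
    m dv = [F(x) - gamma*m*v] dt + sqrt(2*gamma*m*kB*T) dW,
    where the friction gamma and bath temperature T steer the trajectories
    toward thermal equilibrium with the heat reservoir."""
    rng = rng or np.random.default_rng()
    noise = np.sqrt(2.0 * gamma * mass * kB * T * dt) * rng.standard_normal(x.shape)
    v = v + (force(x) / mass - gamma * v) * dt + noise / mass
    x = x + v * dt
    return x, v

# Harmonic trap as a stand-in for the confining cage; 76 particles in 1D
force = lambda x: -x
x, v = np.zeros(76), np.zeros(76)
rng = np.random.default_rng(5)
for _ in range(10000):
    x, v = langevin_step(x, v, force, dt=0.01, gamma=3.0, T=1.0, rng=rng)
print("kinetic temperature ~", (v**2).mean())   # should fluctuate near T = 1
```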
Design principles of a rotating medium speed mechanism
NASA Technical Reports Server (NTRS)
Hostenkamp, R. G.; Achtermann, E.; Bentall, R. H.
1976-01-01
Design principles of a medium speed mechanism (MSM) are presented, including discussion on the relative merits of beryllium and aluminium as structural materials. Rotating at a speed of 60 rpm, the application envisaged for the MSM was as a despin bearing for the despun platform or despun antenna of a spin stabilized satellite. The MSM was built and tested to qualification level and is currently undergoing real time life testing.
BOOK REVIEW: Conversations on the Dark Secrets of Physics
NASA Astrophysics Data System (ADS)
Teller, Edward
2003-07-01
Over many years Edward Teller delivered a course of Physical Science Appreciation Lectures. This book is based on those lectures, which must have been very stimulating. In the preparation of the book, Edward Teller was assisted by his daughter, Wendy Teller, and also by Wilson Talley. On many pages there are footnotes in the form of conversations between 'ET', who explains, and 'WT', who asks intelligent questions. (It is never clear which 'WT' is which.) I mention these footnotes as they contribute enormously to the charm and humour of the book. The book contains numerous anecdotes, many of which were new to me. The verse in the New Yorker, by Harold Furth, recording the famous meeting between Dr Teller and Dr Anti-Teller, is included. Dr Teller's comment is `The remarkable fact is that Harold got paid for the poem'. Dr Anti-Teller's comment is anti-recorded. The topics in the book include simple mechanics, statistical mechanics, electromagnetism, quantum mechanics and 'uses of new knowledge'. Despite its origins, the book does not avoid mathematics ('I will use mathematics because physics without mathematics is meaningless' (p1)), but Teller does attempt to explain the mathematics he uses. In much of the book the mathematics is at school level, but in his treatment of quantum mechanics he uses differential equations. If one skips past the equations then his final chapters are less mathematically demanding. I have enjoyed reading this book. Teller's approach is refreshing, and his coverage comprehensive and generally authoritative. My only disquiet is over his coverage of electrons in solids, where it would be clearer to consider the one-dimensional case first, before treating the three-dimensional case. There is a substantial discussion on the correspondence principle, wave-particle duality and on the uncertainty principle. His disposal of Schrödinger's notorious cat is masterly. There are questions at the end of each chapter. One question is based on a possible experiment suggested by Einstein to measure both energy and time precisely, thus violating the uncertainty principle. (We are reminded that Einstein was unhappy with the uncertainty principle.) The question is to find the flaw in the argument: we are told it took Bohr a (sleepless?) night to find it. Answers to all the questions are included at the end of the book. The last chapter is the epilogue, 'After the Revolution', in which Teller makes clear his belief that there will continue to be new discoveries in the physical sciences for a long time to come. This is a book which all readers of this journal should enjoy. It may give you fresh insight into some of the topics. Buy a copy, read it and then keep it at your bedside for occasional browsing. Make sure your institutional library has a copy, and recommend it to all physics students, both graduates and undergraduates. P Borcherds
Statistical Entropy of the G-H-S Black Hole to All Orders in Planck Length
NASA Astrophysics Data System (ADS)
Sun, Hangbin; He, Feng; Huang, Hai
2012-02-01
Considering corrections to all orders in the Planck length to the quantum state density from the generalized uncertainty principle, we calculate the statistical entropy of the scalar field near the horizon of the Garfinkle-Horowitz-Strominger (G-H-S) black hole without any artificial cutoff. It is shown that the entropy is proportional to the horizon area.
On a logical basis for division of responsibilities in statistical practice
NASA Technical Reports Server (NTRS)
Deming, W. Edwards
1966-01-01
The purpose of this paper is to explain principles for division of responsibilities between the statistician and the people that he works with, and reasons why this division of responsibilities is important -- that is, how it improves the performance of both statistician and expert in subject-matter. The aim is to find and illustrate principles of practice by which statisticians may make effective use of their knowledge of theory. The specialist in statistical methods may find himself applying the same basic theory in a dozen different fields in a week, rotating through the same projects the next week. Or, he may work day after day primarily in a single substantive field. Either way, he requires rules of practice. A statement of statistical reliability should present any information that might help the reader to form his own opinion concerning the validity of conclusions likely to be drawn from the results. The aim of a statistical report is to protect the client from seeing merely what he would like to see; to protect him from losses that could come from misuse of results. A further aim is to forestall unwarranted claims of accuracy that the client's public might otherwise accept.
The statistical fluctuation study of quantum key distribution in means of uncertainty principle
NASA Astrophysics Data System (ADS)
Liu, Dunwei; An, Huiyao; Zhang, Xiaoyu; Shi, Xuemei
2018-03-01
Imperfect single-photon emission from lasers, photon signal attenuation, and propagation of error have long caused serious difficulties in practical long-distance quantum key distribution (QKD) experiments. In this paper, we study the uncertainty principle in metrology and use this tool to analyze the statistical fluctuation of the number of received single photons, the yield of single photons, and the quantum bit error rate (QBER). We then calculate the error between the measured and real value of every parameter, and account for the propagation of error among all the measured values. We recast the Gottesman-Lo-Lütkenhaus-Preskill (GLLP) formula in terms of these parameters and generate the QKD simulation result. In this study, the safe distribution distance grows with the coding photon length: when the coding photon length is N = 10^{11}, the safe distribution distance reaches almost 118 km, a more conservative bound than the 127 km obtained without the uncertainty principle. Our study is thus in line with established theory while making it more realistic.
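For orientation, a minimal numerical sketch of a GLLP-style key-rate bound with a simple Gaussian statistical-fluctuation correction is given below; the function names, parameter values, and the fluctuation model are illustrative assumptions, not the paper's exact treatment (which uses a metrological uncertainty relation):

```python
import numpy as np

def h2(x):
    """Binary Shannon entropy."""
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def fluctuate(p, n, z=5.0):
    """Worst-case shift of an estimated probability p over n samples
    (simple Gaussian bound; a stand-in for the paper's treatment)."""
    return p + z * np.sqrt(p * (1.0 - p) / n)

def gllp_rate(q_mu, e_mu, q1, e1, f_ec=1.16, q=0.5):
    """GLLP lower bound on the secret key rate per signal pulse."""
    return q * (-q_mu * f_ec * h2(e_mu) + q1 * (1.0 - h2(e1)))

# Toy comparison: asymptotic vs. finite-size (N = 1e11 coded photons).
q_mu, e_mu, q1, e1, N = 5e-3, 0.02, 2.5e-3, 0.03, 1e11
print(gllp_rate(q_mu, e_mu, q1, e1))                 # asymptotic bound
print(gllp_rate(q_mu, fluctuate(e_mu, N * q_mu),     # pessimistic, finite N
                q1, fluctuate(e1, N * q1)))
```

Because the fluctuated error rates are larger, the finite-size rate is smaller, which is the mechanism behind the shorter (118 km vs. 127 km) safe distance quoted above.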
Schwaiberger, David; Pickerodt, Philipp A; Pomprapa, Anake; Tjarks, Onno; Kork, Felix; Boemke, Willehad; Francis, Roland C E; Leonhardt, Steffen; Lachmann, Burkhard
2018-06-01
Adherence to low tidal volume (VT) ventilation and selected positive end-expiratory pressures is low during mechanical ventilation for treatment of the acute respiratory distress syndrome. Using a pig model of severe lung injury, we tested the feasibility of and physiological responses to a novel fully closed-loop mechanical ventilation algorithm based on the "open lung" concept. Lung injury was induced by surfactant washout in pigs (n = 8). Animals were ventilated following the principles of the "open lung approach" (OLA) using a fully closed-loop physiological feedback algorithm for mechanical ventilation. Standard gas exchange, respiratory, and hemodynamic parameters were measured. Electrical impedance tomography was used to quantify regional ventilation distribution during mechanical ventilation. Automated mechanical ventilation provided strict adherence to low-VT ventilation for 6 h in severely lung-injured pigs. Using the "open lung" approach, tidal volume delivery required low lung distending pressures, increased recruitment and ventilation of dorsal lung regions, and improved arterial blood oxygenation. Physiological feedback closed-loop mechanical ventilation according to the principles of the open lung concept is feasible and provides low tidal volume ventilation without human intervention. Of importance, the "open lung approach" ventilation improved gas exchange and reduced lung driving pressures by opening atelectasis and shifting ventilation to dorsal lung regions.
Dirac structures in vakonomic mechanics
NASA Astrophysics Data System (ADS)
Jiménez, Fernando; Yoshimura, Hiroaki
2015-08-01
In this paper, we explore the dynamics of the nonholonomic system called vakonomic mechanics in the context of Lagrange-Dirac dynamical systems, using a Dirac structure and its associated Hamilton-Pontryagin variational principle. We first show the link between vakonomic mechanics and nonholonomic mechanics from the viewpoints of Dirac structures as well as Lagrangian submanifolds. Namely, we clarify that Lagrangian submanifold theory cannot properly represent nonholonomic mechanics, but can represent vakonomic mechanics instead. Second, in order to represent vakonomic mechanics, we employ the space TQ × V*, where a vakonomic Lagrangian is defined from a given Lagrangian (possibly degenerate) subject to nonholonomic constraints. Then, we show how implicit vakonomic Euler-Lagrange equations can be formulated by the Hamilton-Pontryagin variational principle for the vakonomic Lagrangian on the extended Pontryagin bundle (TQ ⊕ T*Q) × V*. Associated with this variational principle, we establish a Dirac structure on (TQ ⊕ T*Q) × V* in order to define an intrinsic vakonomic Lagrange-Dirac system. Furthermore, we also establish another construction for the vakonomic Lagrange-Dirac system using a Dirac structure on T*Q × V*, where we introduce a vakonomic Dirac differential. Finally, we illustrate our theory of vakonomic Lagrange-Dirac systems by some examples such as the vakonomic skate and the vertical rolling coin.
Detection of light-matter interaction in the weak-coupling regime by quantum light
NASA Astrophysics Data System (ADS)
Bin, Qian; Lü, Xin-You; Zheng, Li-Li; Bin, Shang-Wu; Wu, Ying
2018-04-01
"Mollow spectroscopy" is a photon statistics spectroscopy, obtained by scanning the quantum light scattered from a source system. Here, we apply this technique to detect the weak light-matter interaction between the cavity and atom (or a mechanical oscillator) when the strong system dissipation is included. We find that the weak interaction can be measured with high accuracy when exciting the target cavity by quantum light scattered from the source halfway between the central peak and each side peak. This originally comes from the strong correlation of the injected quantum photons. In principle, our proposal can be applied into the normal cavity quantum electrodynamics system described by the Jaynes-Cummings model and an optomechanical system. Furthermore, it is state of the art for experiment even when the interaction strength is reduced to a very small value.
The ambiguity of simplicity in quantum and classical simulation
NASA Astrophysics Data System (ADS)
Aghamohammadi, Cina; Mahoney, John R.; Crutchfield, James P.
2017-04-01
A system's perceived simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Here, we associate simplicity with small model-memory. We see that the notions of absolute physical simplicity at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham's Razor or to the 'elegance' of competing theories, may be fundamentally subjective. Recent rapid progress in quantum computation and quantum simulation suggests that the ambiguity of simplicity will strongly impact statistical inference and, in particular, model selection.
Statistical Mechanics of Viral Entry
NASA Astrophysics Data System (ADS)
Zhang, Yaojun; Dudko, Olga K.
2015-01-01
Viruses that have lipid-membrane envelopes infect cells by fusing with the cell membrane to release viral genes. Membrane fusion is known to be hindered by high kinetic barriers associated with drastic structural rearrangements—yet viral infection, which occurs by fusion, proceeds on remarkably short time scales. Here, we present a quantitative framework that captures the principles behind the invasion strategy shared by all enveloped viruses. The key to this strategy—ligand-triggered conformational changes in the viral proteins that pull the membranes together—is treated as a set of concurrent, bias field-induced activated rate processes. The framework results in analytical solutions for experimentally measurable characteristics of virus-cell fusion and enables us to express the efficiency of the viral strategy in quantitative terms. The predictive value of the theory is validated through simulations and illustrated through recent experimental data on influenza virus infection.
Social Ecology, Genomics, and African American Health: A Nonlinear Dynamical Perspective
Madhere, Serge; Harrell, Jules; Royal, Charmaine D. M.
2009-01-01
This article offers a model that clarifies the degree of interdependence between social ecology and genomic processes. Drawing on principles from nonlinear dynamics, the model delineates major lines of bifurcation involving people's habitat, their family health history, and collective catastrophes experienced by their community. It shows how mechanisms of resource acquisition, depletion, and preservation can lead to disruptions in basic metabolism and in the activity of cytokines, neurotransmitters, and protein kinases, thus giving impetus to epigenetic changes. The hypotheses generated from the model are discussed throughout the article for their relevance to health problems among African Americans. Where appropriate, they are examined in light of data from the National Vital Statistics System. Multiple health outcomes are considered. For any one of them, the model makes clear the unique and converging contributions of multiple antecedent factors. PMID:19672481
Atomistic mechanisms of ReRAM cell operation and reliability
NASA Astrophysics Data System (ADS)
Pandey, Sumeet C.
2018-01-01
We present results from first-principles-based modeling that captures functionally important physical phenomena critical to cell materials selection, operation, and reliability for resistance-switching memory technologies. An atomic-scale description of retention, the low- and high-resistance states (RS), and the sources of intrinsic cell-level variability in ReRAM is discussed. Through results obtained from density functional theory, non-equilibrium Green's function, molecular dynamics, and kinetic Monte Carlo simulations, we report the role of variable-charge vacancy defects and metal impurities in determining the RS, the LRS stability, and electron conduction in such RS. Although the statistical electrical characteristics of oxygen-vacancy ReRAM (Ox-ReRAM) and conductive-bridging RAM (M-ReRAM) are notably different, the underlying electrochemical phenomena describing retention and formation/dissolution of the RS are similar, as discussed here.
Statistical coding and decoding of heartbeat intervals.
Lucena, Fausto; Barros, Allan Kardec; Príncipe, José C; Ohnishi, Noboru
2011-01-01
The heart integrates neuroregulatory messages into specific bands of frequency, such that the overall amplitude spectrum of the cardiac output reflects the variations of the autonomic nervous system. This modulatory mechanism seems to be well adjusted to the unpredictability of the cardiac demand, maintaining a proper cardiac regulation. A longstanding theory holds that biological organisms facing an ever-changing environment are likely to evolve adaptive mechanisms to extract essential features in order to adjust their behavior. The key question, however, has been to understand how the neural circuitry self-organizes these feature detectors to select behaviorally relevant information. Previous studies in computational perception suggest that a neural population enhances information that is important for survival by minimizing the statistical redundancy of the stimuli. Herein we investigate whether the cardiac system makes use of a redundancy reduction strategy to regulate the cardiac rhythm. Based on a network of neural filters optimized to code heartbeat intervals, we learn a population code that maximizes the information across the neural ensemble. The emerging population code displays filter tuning properties whose characteristics explain diverse aspects of the autonomic cardiac regulation, such as the compromise between fast and slow cardiac responses. We show that the filters yield responses that are quantitatively similar to observed heart rate responses during direct sympathetic or parasympathetic nerve stimulation. Our findings suggest that the heart decodes autonomic stimuli according to information theory principles analogous to how perceptual cues are encoded by sensory systems.
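As a rough illustration of the redundancy-reduction idea (not the authors' actual pipeline), one can learn filters over windows of heartbeat intervals with an off-the-shelf ICA and inspect their frequency tuning; the data, window size, and component count below are all assumptions:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(5000)        # synthetic RR intervals (s)

# Each row is a window of consecutive heartbeat intervals ("stimulus patch").
W = 64
X = np.lib.stride_tricks.sliding_window_view(rr, W)[::4]

# ICA maximizes statistical independence across the learned ensemble,
# one classic operationalization of redundancy reduction.
ica = FastICA(n_components=16, random_state=0)
ica.fit(X)

# Frequency tuning of each learned filter (cf. the low- and high-frequency
# bands of heart rate variability tied to sympathetic/parasympathetic drive).
tuning = np.abs(np.fft.rfft(ica.components_, axis=1))
peak_bin = tuning.argmax(axis=1)
```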
On the Correct Analysis of the Foundations of Theoretical Physics
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2007-04-01
The problem of truth in science -- the most urgent problem of our time -- is discussed. A correct theoretical analysis of the foundations of theoretical physics is proposed. The principle of the unity of formal logic and rational dialectics is the methodological basis of the analysis. The main result is as follows: the generally accepted foundations of theoretical physics (i.e., Newtonian mechanics, Maxwell electrodynamics, thermodynamics, statistical physics and physical kinetics, the theory of relativity, quantum mechanics) contain a set of logical errors. These errors are explained by the existence of a global cause: they are a collateral and inevitable result of the inductive way of cognizing Nature, i.e., a result of moving from the formation of separate concepts to the formation of a system of concepts. Consequently, theoretical physics has entered its greatest crisis. It means that physics as a science of phenomena is leaving the stage of progress to a science of essence (information). Acknowledgment: The books "Surprises in Theoretical Physics" (1979) and "More Surprises in Theoretical Physics" (1991) by Sir Rudolf Peierls stimulated my 25-year work.
On spectroscopy for a whole Abelian model
NASA Astrophysics Data System (ADS)
Chauca, J.; Doria, R.
2012-10-01
A whole abelian gauge symmetry, postulated on the meaning of wholeness, is introduced. Various physical areas, such as complexity, statistical mechanics, and quantum mechanics, partially support this approach, in which the whole is at the origin. Moreover, the reductionist crisis posed by quark confinement definitively sustains this insight: it says that fundamental parts cannot be observed in isolation. Consequently, there is an experimental situation in which the parts should be substituted by something more. This leads us to seek to express the wholeness principle within gauge theory. For this, the gauge parameter is reinterpreted: instead of compensating fields, it organizes a systemic gauge symmetry. One thus introduces a set of fields {AμI} rotating under a common gauge symmetry. Given a fields collection {AμI} as origin, the effort of this work is to investigate its spectroscopy: to analyze, for the abelian case, the corresponding quanta; to understand that, for a whole model, diversity replaces elementarity; to derive the associated quantum numbers, such as spin, mass, charge, and discrete symmetries, in terms of such a systemic symmetry; and to observe how the diversity of particles is manifested in terms of wholeness.
Fully Resolved Simulations of Particle-Bed-Turbulence Interactions in Oscillatory Flows
NASA Astrophysics Data System (ADS)
Apte, S.; Ghodke, C.
2017-12-01
Particle-resolved direct numerical simulations (DNS) are performed to investigate the behavior of an oscillatory flow field over a bed of closely packed fixed spherical particles for a range of Reynolds numbers in the transitional and rough turbulent flow regimes. The presence of roughness leads to a substantial modification of the underlying boundary layer mechanism, resulting in increased bed shear stress, reduced near-bed anisotropy, and modification of the near-bed sweep and ejection motions, along with marked changes in turbulent energy transport mechanisms. The resulting flow field is characterized by studying statistical descriptions of the near-bed turbulence for different roughness parameters. A double-averaging technique is employed to reveal spatial inhomogeneities at the roughness scale that provide alternate paths of energy transport in the turbulent kinetic energy (TKE) budget. Spatio-temporal characteristics of unsteady particle forces are investigated in detail by studying their spatial distribution, temporal auto-correlations, frequency spectra, cross-correlations with near-bed turbulent flow variables, and intermittency in the forces using the concept of impulse. These first-principles simulations provide substantial insights into the modeling of incipient motion of sediments.
NASA Astrophysics Data System (ADS)
Singaravelu, J.; Sundaresan, S.; Nageswara Rao, B.
2013-04-01
This article presents a methodology for evaluation of the proof load factor (PLF) for a clamp band system (CBS) made of M250 maraging steel following fracture mechanics principles. The CBS is widely used as a structural element and as a separation system. Using Taguchi's design of experiments and the response surface method (RSM), compact tension specimens were tested to establish an empirical relation for the failure load (Pmax) in terms of the ultimate strength, width, thickness, and initial crack length. The test results for Pmax closely matched the developed RSM empirical relation. Crack growth rates of the maraging steel in different environments were examined. The fracture strength (σf) of center surface-crack and through-crack tension specimens is evaluated using the fracture toughness (KIC). The stress induced in the Marman band at flight loading conditions is evaluated to estimate the higher load factor and PLF. Statistical safety factor and reliability assessments were made for the specified flaw sizes, useful in the development of a fracture control plan for the CBS of launch vehicles.
Transfer of Learning in Quantum Mechanics
NASA Astrophysics Data System (ADS)
Singh, Chandralekha
2005-09-01
We investigate the difficulties that undergraduate students in quantum mechanics courses have in transferring learning from previous courses or within the same course from one context to another by administering written tests and conducting individual interviews. Quantum mechanics is abstract and its paradigm is very different from the classical one. A good grasp of the principles of quantum mechanics requires creating and organizing a knowledge structure consistent with the quantum postulates. Previously learned concepts such as the principle of superposition and probability can be useful in quantum mechanics if students are given the opportunity to build associations between new and prior knowledge. We also discuss the need for better alignment between quantum mechanics and modern physics courses taken previously because semi-classical models can impede internalization of the quantum paradigm in more advanced courses.
Noel, Jean-Paul; Blanke, Olaf; Serino, Andrea
2018-06-06
Integrating information across sensory systems is a critical step toward building a cohesive representation of the environment and one's body, and, as illustrated by numerous illusions, scaffolds the subjective experience of the world and self. In recent years, classic principles of multisensory integration elucidated in the subcortex have been translated into the language of statistical inference understood by the neocortical mantle. Most importantly, a mechanistic systems-level description of multisensory computations via probabilistic population coding and divisive normalization is actively being put forward. In parallel, by describing and understanding bodily illusions, researchers have suggested multisensory integration of bodily inputs within the peripersonal space as a key mechanism in bodily self-consciousness. Importantly, certain aspects of bodily self-consciousness, although still very much a minority, have recently been cast in the light of modern computational understandings of multisensory integration. In doing so, we argue, the field of bodily self-consciousness may borrow mechanistic descriptions regarding the neural implementation of inference computations outlined by the multisensory field. This computational approach, leveraging the understanding of multisensory processes generally, promises to advance scientific comprehension of one of the most mysterious questions puzzling humankind, that is, how our brain creates the experience of a self in interaction with the environment. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
The Statistical Basis of Chemical Equilibria.
ERIC Educational Resources Information Center
Hauptmann, Siegfried; Menger, Eva
1978-01-01
Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)
Violations of the ceiling principle: exact conditions and statistical evidence.
Slimowitz, J R; Cohen, J E
1993-01-01
The National Research Council recommended the use of the ceiling principle in forensic applications of DNA testing on the grounds that the ceiling principle was believed to be "conservative," giving estimates greater than or equal to the actual genotype frequencies in the appropriate reference population. We show here that the ceiling principle can fail to be conservative in a population with two subpopulations and two loci, each with two alleles at Hardy-Weinberg equilibrium, if there is some linkage disequilibrium between loci. We also show that the ceiling principle can fail in a population with two subpopulations and a single locus with two alleles if Hardy-Weinberg equilibrium does not hold. We give explicit analytical formulas to describe when the ceiling principle fails. By showing that the ceiling principle is not always mathematically reliable, this analysis gives users of the ceiling principle the responsibility of demonstrating that it is conservative for the particular data with which it is used. Our reanalysis of VNTR data bases of the FBI provides compelling evidence of two-locus associations within three major ethnic groups (Caucasian, black, and Hispanic) in the United States, even though the loci tested are located on different chromosomes. Before the ceiling principle is implemented, more research should be done to determine whether it may be violated in practice. PMID:8328450
Disturbance, the uncertainty principle and quantum optics
NASA Technical Reports Server (NTRS)
Martens, Hans; Demuynck, Willem M.
1993-01-01
It is shown how a disturbance-type uncertainty principle can be derived from an uncertainty principle for joint measurements. To achieve this, we first clarify the meaning of 'inaccuracy' and 'disturbance' in quantum mechanical measurements. The case of photon number and phase is treated as an example, and it is applied to a quantum non-demolition measurement using the optical Kerr effect.
Two new kinds of uncertainty relations
NASA Technical Reports Server (NTRS)
Uffink, Jos
1994-01-01
We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.
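A standard example of the generalized entropic relations reviewed here is the Białynicki-Birula-Mycielski inequality for the differential entropies of position and momentum, quoted for orientation (the paper's own relations strengthen and generalize results of this kind):

```latex
H(x) + H(p) \;\ge\; \ln(e\pi\hbar),
\qquad
H(x) \;=\; -\!\int |\psi(x)|^{2}\,\ln|\psi(x)|^{2}\,dx .
```

This entropic bound implies, and is strictly stronger than, the textbook Heisenberg relation Δx Δp ≥ ħ/2.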
Engineering principles to assure compatible docking between future spacecraft of USA and USSR
NASA Technical Reports Server (NTRS)
Johnson, C. C.
1975-01-01
Working jointly the USA and the USSR have selected an androgynous, peripheral type docking mechanism concept. The mechanical principles inherent to the concept, the rationale supporting its selection, and the probable nature of future designs stemming from the concept, are described. Operational situations just prior to docking, impact conditions, energy absorption, and the structural joining of the spacecraft, are specified. Docking procedures for the Apollo-Soyuz missions are discussed.
Anti-Mechanized Defense: A Computerized Simulation for Squad Leader Training.
1983-09-01
applicability of cybernetic principles is easily transformed to meet the needs of this research. Specifically, the basic principle governing management...and as in other areas this is better measured in actual field conditions. Command and control (S2A.2): control of organic firepower in general...
A survey of parametrized variational principles and applications to computational mechanics
NASA Technical Reports Server (NTRS)
Felippa, Carlos A.
1993-01-01
This survey paper describes recent developments in the area of parametrized variational principles (PVP's) and selected applications to finite-element computational mechanics. A PVP is a variational principle containing free parameters that have no effect on the Euler-Lagrange equations. The theory of single-field PVP's based on gauge functions (also known as null Lagrangians) is a subset of the inverse problem of variational calculus that has limited value. On the other hand, multifield PVP's are more interesting from theoretical and practical standpoints. Following a tutorial introduction, the paper describes the recent construction of multifield PVP's in several areas of elasticity and electromagnetics. It then discusses three applications to finite-element computational mechanics: the derivation of high-performance finite elements, the development of element-level error indicators, and the constructions of finite element templates. The paper concludes with an overview of open research areas.
Cloud Macroscopic Organization: Order Emerging from Randomness
NASA Technical Reports Server (NTRS)
Yuan, Tianle
2011-01-01
Clouds play a central role in many aspects of the climate system and their forms and shapes are remarkably diverse. Appropriate representation of clouds in climate models is a major challenge because cloud processes span at least eight orders of magnitude in spatial scales. Here we show that there exists order in the cloud size distribution of low-level clouds, and that it follows a power-law distribution with exponent gamma close to 2. gamma is insensitive to yearly variations in environmental conditions, but has regional variations and land-ocean contrasts. More importantly, we demonstrate that this self-organizing behavior of clouds emerges naturally from a complex network model with simple, physical organizing principles: random clumping and merging. We also demonstrate symmetry between clear and cloudy skies in terms of macroscopic organization because of similar fundamental underlying organizing principles. The order in the apparently complex cloud-clear field thus has its root in random local interactions. Studying cloud organization with complex network models is an attractive new approach that has wide applications in climate science. We also propose the concept of a cloud statistical mechanics approach. This approach is fully complementary to deterministic models, and the two approaches provide a powerful framework to meet the challenge of representing clouds in our climate models when working in tandem.
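A standard way to estimate such a power-law exponent from observed cloud sizes is the continuous maximum-likelihood estimator; a minimal sketch follows (the paper's actual fitting procedure is not specified in the abstract, so this is an assumption):

```python
import numpy as np

def powerlaw_mle(sizes, s_min):
    """Clauset-Shalizi-Newman MLE for a continuous power law
    p(s) ~ s**(-gamma) for s >= s_min."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    gamma = 1.0 + s.size / np.sum(np.log(s / s_min))
    stderr = (gamma - 1.0) / np.sqrt(s.size)
    return gamma, stderr

# Self-check on synthetic sizes drawn with gamma = 2 (inverse-CDF sampling).
rng = np.random.default_rng(0)
s = 1.0 * (1.0 - rng.random(100_000)) ** (-1.0 / (2.0 - 1.0))
print(powerlaw_mle(s, 1.0))     # ~ (2.00, 0.003)
```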
Urban growth simulation from "first principles".
Andersson, Claes; Lindgren, Kristian; Rasmussen, Steen; White, Roger
2002-08-01
General and mathematically transparent models of urban growth have so far suffered from a lack in microscopic realism. Physical models that have been used for this purpose, i.e., diffusion-limited aggregation, dielectric breakdown models, and correlated percolation all have microscopic dynamics for which analogies with urban growth appear stretched. Based on a Markov random field formulation we have developed a model that is capable of reproducing a variety of important characteristic urban morphologies and that has realistic microscopic dynamics. The results presented in this paper are particularly important in relation to "urban sprawl," an important aspect of which is aggressively spreading low-density land uses. This type of growth is increasingly causing environmental, social, and economical problems around the world. The microdynamics of our model, or its "first principles," can be mapped to human decisions and motivations and thus potentially also to policies and regulations. We measure statistical properties of macrostates generated by the urban growth mechanism that we propose, and we compare these to empirical measurements as well as to results from other models. To showcase the open-endedness of the model and to thereby relate our work to applied urban planning we have also included a simulated city consisting of a large number of land use classes in which also topographical data have been used.
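The abstract does not reproduce the model's update rule; as a purely illustrative stand-in, the sketch below grows a city on a lattice with a Markov-random-field-style local rule in which the development probability rises with the number of developed neighbors (grid size, coupling, and baseline are hypothetical parameters, not the authors' calibration):

```python
import numpy as np

rng = np.random.default_rng(1)
N, steps, beta, theta = 128, 400_000, 1.5, 2.0
city = np.zeros((N, N), dtype=np.int8)
city[N // 2, N // 2] = 1                        # seed settlement

for _ in range(steps):
    i, j = rng.integers(1, N - 1, size=2)
    if city[i, j]:
        continue
    nbrs = city[i-1, j] + city[i+1, j] + city[i, j-1] + city[i, j+1]
    # Logistic MRF-style rule: development is likelier next to development,
    # but a small baseline allows the scattered growth typical of sprawl.
    p = 1.0 / (1.0 + np.exp(-beta * (nbrs - theta)))
    if rng.random() < 1e-2 * p:
        city[i, j] = 1
```

Varying the coupling between purely neighbor-driven clumping and near-random scattering reproduces, qualitatively, the range from compact to sprawl-like morphologies the paper discusses.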
NASA Astrophysics Data System (ADS)
Bai, Xian-Xu; Zhong, Wei-Min; Zou, Qi; Zhu, An-Ding; Sun, Jun
2018-07-01
Based on the structural design concept of ‘functional integration’, this paper proposes the principle of a power-generated magnetorheological energy absorber with velocity self-sensing capability (PGMREA), which integrates, within a single structure, a controllable damping mechanism and a mechanical-to-electrical energy conversion mechanism, providing the functions of controllable damping, power generation, and velocity self-sensing. The controllable damping mechanism consists of an annular gap and a ball screw. The annular gap is filled with MR fluid that operates in pure shear mode under a controllable electromagnetic field. The rotational damping torque generated by the controllable damping mechanism is translated into a linear damping force via the ball screw. The mechanical-to-electrical energy conversion mechanism is realized by the ball screw and a generator composed of a permanent magnet rotor and a generator stator; it converts the mechanical energy of excitations into electrical energy for storage or to power the controllable damping mechanism of the PGMREA directly. The velocity self-sensing capability of the PGMREA is achieved via signal processing of the energy conversion information. Based on the principle of the proposed PGMREA, a mathematical model is established, including the damping force, generated power, and self-sensing velocity. The electromagnetic circuit of the PGMREA is simulated and verified using the finite element analysis software ANSYS. The developed PGMREA prototype is experimentally tested on a servo-hydraulic testing system, and the model-based predictions are compared with the experimental results.
Meijer, Rob R; Niessen, A Susan M; Tendeiro, Jorge N
2016-02-01
Although there are many studies devoted to person-fit statistics for detecting inconsistent item score patterns, most are difficult to understand for nonspecialists. The aim of this tutorial is to explain the principles of these statistics for researchers and clinicians who are interested in applying them. In particular, we first explain how invalid test scores can be detected using person-fit statistics; second, we provide the reader with practical examples of existing studies that used person-fit statistics to detect and interpret inconsistent item score patterns; and third, we discuss a new R package that can be used to identify and interpret inconsistent score patterns. © The Author(s) 2015.
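The tutorial's R package is not named in the abstract; as a language-neutral illustration of the underlying idea, here is a sketch of one classic nonparametric person-fit statistic, the count of Guttman errors (passing a hard item while failing an easier one):

```python
import numpy as np

def guttman_errors(scores):
    """Count Guttman errors per person for dichotomous (0/1) data:
    item pairs where the easier item is failed but the harder one passed.
    scores: array of shape (n_persons, n_items)."""
    X = np.asarray(scores)
    X = X[:, np.argsort(-X.mean(axis=0))]      # order items easiest-first
    n_items = X.shape[1]
    errors = np.zeros(X.shape[0], dtype=int)
    for easy in range(n_items):
        for hard in range(easy + 1, n_items):
            errors += (X[:, easy] == 0) & (X[:, hard] == 1)
    return errors

# A person who fails easy items but passes hard ones accumulates errors.
X = np.array([[1, 1, 1, 0, 0],      # consistent pattern
              [0, 0, 1, 1, 1]])     # suspicious pattern
print(guttman_errors(X))
```

Large error counts flag response patterns (careless responding, cheating, misunderstanding) whose test scores may be invalid, which is the detection task the tutorial addresses.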
Farewell, Vern; Johnson, Tony; Gear, Rosemary
2012-01-01
We have previously described the content of a text by Woods and Russell, An Introduction to Medical Statistics, compared it with Principles of Medical Statistics by Hill and set both volumes against the background of vital statistics up until 1937. The two books mark a watershed in the history of medical statistics. Very little has been recorded about the life and career of the first author of the earlier textbook, who was a Fellow of the Royal Statistical Society for at least 25 years, an omission which we can now rectify with this paper. We describe her education, entry into medical statistics, relationship with Major Greenwood and her subsequent career and life in Ceylon, Kenya, Australia, England and South Africa. PMID:22973076
Analysis of hydraulic steering system of tracked all-terrain vehicles' articulated mechanism
NASA Astrophysics Data System (ADS)
Meng, Zhongliang; Zang, Hao
2018-04-01
In research on the dynamic characteristics of tracked all-terrain vehicles' articulated mechanisms, the hydraulic features of their steering systems require further study beyond the mechanical models. According to the maximum pressure required by the steering system of a tracked all-terrain vehicle and the principle of the steering system, this paper conducts an analysis of the hydraulic steering system of the articulated mechanism. Based on the structural principle of the steering gear, a simulation model of the tracked all-terrain vehicle turning left is built, and a simulation analysis is carried out taking the left turn as an example.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freitag, Mark A.
2001-12-31
The major title of this dissertation, 'From first principles,' is a phrase often heard in the study of thermodynamics and quantum mechanics. These words embody a powerful idea in the physical sciences; namely, that it is possible to distill the complexities of nature into a set of simple, well-defined mathematical laws from which specific relations can then be derived. In thermodynamics, these fundamental laws are immediately familiar to the physical scientist by their numerical order: the First, Second and Third Laws. However, the subject of the present volume is quantum mechanics -- specifically, non-relativistic quantum mechanics, which is appropriate for most systems of chemical interest.
NASA Astrophysics Data System (ADS)
Gurjanov, A. V.; Zakoldaev, D. A.; Shukalov, A. V.; Zharinov, I. O.
2018-03-01
The task of developing principles for the constitution of cyber-physical systems in an Industry 4.0 company for designing components of mechanical assembly production is studied. The task is solved by analyzing the components and technologies that have practical application in the organization of digital production. The list of components is defined, and the authors propose a scheme for interconnecting the components and technologies in Industry 4.0 mechanical assembly production so as to create an uninterrupted manufacturing route for the designed components with the application of cyber-physical systems.
NASA Astrophysics Data System (ADS)
Gavrus, Adinel
2017-10-01
This paper proposes to prove that the maximum work principle used by the theory of continuum plasticity can be regarded as a consequence of an optimization problem based on constructal theory (Prof. Adrian Bejan). It is known that thermodynamics defines the conservation of energy and the irreversibility of the evolution of natural systems. From a mechanical point of view, the first permits the definition of the momentum balance equation, and hence the virtual power principle, while the second explains the tendency of all currents to flow from high to low values. According to the constructal law, every finite-size system evolves toward configurations that flow more and more easily over time, distributing imperfections so as to maximize entropy and minimize losses or dissipation. During a material forming process, the application of constructal theory principles leads to the conclusion that, under external loads, the material flow is that for which the total dissipated mechanical power (deformation and friction) becomes minimal. From a mechanical point of view, it is then possible to characterize the real state of the mechanical variables (stress, strain, strain rate) as the one that minimizes the total dissipated power: among all virtual non-equilibrium states, the real state minimizes the total dissipated power. A variational minimization problem is thus obtained, and this paper proves in a mathematical sense that, starting from this formulation, the maximum work principle can be recovered in a more general form, together with an equivalent form for the friction term. An application to the plane compression of a plastic material shows the feasibility of the proposed minimization formulation for finding analytical solutions in two cases: one without friction and a second taking into account the Tresca friction law. To validate the proposed formulation, a comparison with a classical analytical analysis based on the slice and upper/lower bound methods and with a numerical finite element simulation is also presented.
Time Series Model Identification by Estimating Information.
1982-11-01
ERIC Educational Resources Information Center
de Vries, John
This paper addresses the issue of measuring the integration of various ethnocultural communities into Canadian society by means of statistical or social indicators. The overall philosophy of the study is based on the following principles: (1) indicators should have a clear meaning with respect to the underlying concept of integration; (2)…
Alanna Conners and the Origins of Principled Data Analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
2013-01-01
Alanna was one of the most important pioneers in the development of not just sophisticated algorithms for analyzing astronomical data but more importantly an overall viewpoint emphasizing the use of statistically sound principles in place of blind application of cook-book recipes, or black boxes. I will outline some of the threads of this viewpoint, emphasizing time series data, with a focus on the importance of these developments for the Age of Digital Astronomy that we are entering.
Statistical properties of Chinese phonemic networks
NASA Astrophysics Data System (ADS)
Yu, Shuiyuan; Liu, Haitao; Xu, Chunshan
2011-04-01
The study of properties of speech sound systems is of great significance in understanding the human cognitive mechanism and the working principles of speech sound systems. Some properties of speech sound systems, such as the listener-oriented feature and the talker-oriented feature, have been unveiled with the statistical study of phonemes in human languages and the research of the interrelations between human articulatory gestures and the corresponding acoustic parameters. With all the phonemes of speech sound systems treated as a coherent whole, our research, which focuses on the dynamic properties of speech sound systems in operation, investigates some statistical parameters of Chinese phoneme networks based on real text and dictionaries. The findings are as follows: phonemic networks have high connectivity degrees and short average distances; the degrees obey normal distribution and the weighted degrees obey power law distribution; vowels enjoy higher priority than consonants in the actual operation of speech sound systems; the phonemic networks have high robustness against targeted attacks and random errors. In addition, for investigating the structural properties of a speech sound system, a statistical study of dictionaries is conducted, which shows the higher frequency of shorter words and syllables and the tendency that the longer a word is, the shorter the syllables composing it are. From these structural properties and dynamic properties one can derive the following conclusion: the static structure of a speech sound system tends to promote communication efficiency and save articulation effort while the dynamic operation of this system gives preference to reliable transmission and easy recognition. In short, a speech sound system is an effective, efficient and reliable communication system optimized in many aspects.
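As a toy illustration of how such network statistics are computed (the paper's corpus-derived construction is not specified in the abstract; the syllable list and linking rule below are assumptions):

```python
import networkx as nx
import numpy as np

# Hypothetical rule: phonemes are nodes; an edge (weighted by frequency)
# links phonemes that co-occur within a syllable of the corpus.
syllables = [("b", "a"), ("m", "a"), ("m", "ang"), ("zh", "ang"), ("zh", "a")]

G = nx.Graph()
for onset, rime in syllables:
    if G.has_edge(onset, rime):
        G[onset][rime]["weight"] += 1
    else:
        G.add_edge(onset, rime, weight=1)

degrees = np.array([d for _, d in G.degree()])
wdegrees = np.array([d for _, d in G.degree(weight="weight")])
mean_dist = (nx.average_shortest_path_length(G)
             if nx.is_connected(G) else float("nan"))
print(degrees.mean(), wdegrees.mean(), mean_dist)
```

On a real corpus, the unweighted degrees would be checked against a normal distribution and the weighted degrees against a power law, as the abstract reports.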
Approaches to Foster Transfer of Formal Principles: Which Route to Take?
Schalk, Lennart; Saalbach, Henrik; Stern, Elsbeth
2016-01-01
Enabling learners to transfer knowledge about formal principles to new problems is a major aim of science and mathematics education, which, however, is notoriously difficult to reach. Previous research advocates different approaches to introducing principles so as to foster the transfer of knowledge about them. One approach suggests teaching a generic formalism of the principles. Another suggests presenting (at least) two concrete cases instantiating the principle. A third suggests presenting a generic formalism accompanied by a case. As yet, though, empirical results regarding the transfer potential of these approaches are mixed and difficult to integrate, as the three approaches have rarely been tested competitively. Furthermore, the approaches have been evaluated against different control conditions and assessed using varying transfer measures. In the present experiment, we introduced undergraduates to the formal principles of propositional logic with the aim of systematically comparing the transfer potential of the different approaches, in relation to each other and to a common control condition, using various learning and transfer tasks. All approaches supported successful learning and transfer of the principles, but they also caused systematic differences in the magnitude of transfer: the combination of a generic formalism with a case was surprisingly unsuccessful, while learners who compared two cases outperformed the control condition. We discuss how the simultaneous assessment of the different approaches makes it possible to capture the underlying learning mechanisms more precisely and to advance theory on how these mechanisms contribute to transfer performance.
Uncertainty principle in loop quantum cosmology by Moyal formalism
NASA Astrophysics Data System (ADS)
Perlov, Leonid
2018-03-01
In this paper, we derive the uncertainty principle for the loop quantum cosmology homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker model with the holonomy-flux algebra. The uncertainty principle relates the variable c, with the meaning of connection, and the variable μ, with the meaning of the physical cell volume to the power 2/3, i.e., v^{2/3}, or a plaquette area. Since μ and c are not operators but rather random variables, the Robertson derivation of the uncertainty principle, which works for Hermitian operators, cannot be used. Instead we use the Wigner-Moyal-Groenewold phase space formalism. The Wigner-Moyal-Groenewold formalism was originally applied to the Heisenberg algebra of quantum mechanics; one can derive it, along with the uncertainty principle, from both canonical and path integral quantum mechanics. In this paper, we apply it to the holonomy-flux algebra in the case of homogeneous and isotropic space. Another result is an expression for the Wigner function on the space of cylindrical wave functions defined on R_b in the c variables rather than in the dual space μ variables.
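For reference, the standard quantum-mechanical Wigner transform that the phase-space formalism generalizes is (textbook conventions; the paper's holonomy-flux version differs in its configuration space):

```latex
W(x,p) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty} dy\;
  \psi^{*}(x+y)\,\psi(x-y)\,e^{\,2ipy/\hbar},
\qquad
\langle A \rangle \;=\; \int dx\,dp\; W(x,p)\,A_{W}(x,p),
```

where A_W is the Weyl symbol of the operator A; variances, and hence the uncertainty relation, are then computed as phase-space moments of W rather than as operator expectation values.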
Phase separation driven by density-dependent movement: A novel mechanism for ecological patterns.
Liu, Quan-Xing; Rietkerk, Max; Herman, Peter M J; Piersma, Theunis; Fryxell, John M; van de Koppel, Johan
2016-12-01
Many ecosystems develop strikingly regular spatial patterns because of small-scale interactions between organisms, a process generally referred to as spatial self-organization. Self-organized spatial patterns are important determinants of the functioning of ecosystems, promoting the growth and survival of the involved organisms, and affecting the capacity of the organisms to cope with changing environmental conditions. The predominant explanation for self-organized pattern formation is spatial heterogeneity in establishment, growth and mortality, resulting from the self-organization processes. A number of recent studies, however, have revealed that movement of organisms can be an important driving process creating extensive spatial patterning in many ecosystems. Here, we review studies that detail movement-based pattern formation in contrasting ecological settings. Our review highlights that a common principle, where movement of organisms is density-dependent, explains observed spatial regular patterns in all of these studies. This principle, well known to physics as the Cahn-Hilliard principle of phase separation, has so far remained unrecognized as a general mechanism for self-organized complexity in ecology. Using the examples presented in this paper, we explain how this movement principle can be discerned in ecological settings, and clarify how to test this mechanism experimentally. Our study highlights that animal movement, both in isolation and in unison with other processes, is an important mechanism for regular pattern formation in ecosystems. Copyright © 2016 Elsevier B.V. All rights reserved.
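To see the mechanism in its simplest mathematical form, one can integrate the one-dimensional Cahn-Hilliard equation, whose near-uniform states spontaneously separate into high- and low-density patches; a minimal explicit-Euler sketch (grid size, time step, and interface width are illustrative choices, not values from the reviewed studies):

```python
import numpy as np

def lap(u, dx):
    """Periodic 1-D Laplacian."""
    return (np.roll(u, 1) + np.roll(u, -1) - 2.0 * u) / dx**2

# Cahn-Hilliard: du/dt = Lap( u**3 - u - eps**2 * Lap(u) )
n, dx, eps, dt = 256, 1.0, 1.0, 0.01
rng = np.random.default_rng(2)
u = 0.05 * rng.standard_normal(n)          # near-uniform initial density

for _ in range(100_000):
    mu = u**3 - u - eps**2 * lap(u, dx)    # chemical potential
    u += dt * lap(mu, dx)                  # conserved (mass-preserving) flow

# u now alternates between ~+1 and ~-1 plateaus: phase separation.
```

The conserved form of the flow is the essential point for the ecological analogy: total "density" (number of organisms) is preserved while density-dependent movement redistributes it into patches.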
Morrison, Geoffrey Stewart
2014-05-01
In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle. © 2013.
NASA Astrophysics Data System (ADS)
Carlson, Shawn
2016-01-01
Energy conservation is a deep principle that is obeyed by all of the fundamental forces of nature. It puts stringent constraints on all systems, particularly systems that are ‘isolated,’ meaning that no energy can enter or escape. Notwithstanding the success of the principle of stationary action, it is fair to wonder to what extent physics can be formulated from the principle of stationary energy. We show that if one interprets mechanical energy as a state function, then its stationarity leads to a novel formulation of classical mechanics. However, unlike Lagrangian and Hamiltonian mechanics, which deliver their state functions via algebraic proscriptions (i.e., the Lagrangian is always the difference between a system’s kinetic and potential energies), this new formalism identifies its state functions as the solutions to a differential equation. This is an important difference because differential equations can generate more general solutions than algebraic recipes. When applied to Newtonian systems for which the energy function is separable, these state functions are always the mechanical energy. However, while the stationary state function for a charged particle moving in an electromagnetic field proves not to be energy, the function nevertheless correctly encodes the dynamics of the system. Moreover, the stationary state function for a free relativistic particle proves not to be the energy either. Rather, our differential equation yields the relativistic free-particle Lagrangian (plus a non-dynamical constant) in its correct dynamical context. To explain how this new formalism can consistently deliver stationary state functions that give the correct dynamics but that are not always the mechanical energy, we propose that energy conservation is a specific realization of a deeper principle of stationarity that governs both relativistic and non-relativistic mechanics.
Coupled Structural, Thermal, Phase-change and Electromagnetic Analysis for Superconductors, Volume 2
NASA Technical Reports Server (NTRS)
Felippa, C. A.; Farhat, C.; Park, K. C.; Militello, C.; Schuler, J. J.
1996-01-01
Described are the theoretical development and computer implementation of reliable and efficient methods for the analysis of coupled mechanical problems that involve the interaction of mechanical, thermal, phase-change and electromagnetic subproblems. The focus application has been the modeling of superconductivity and associated quantum-state phase change phenomena. In support of this objective the work has addressed the following issues: (1) development of variational principles for finite elements, (2) finite element modeling of the electromagnetic problem, (3) coupling of thermal and mechanical effects, and (4) computer implementation and solution of the superconductivity transition problem. The main accomplishments have been: (1) the development of the theory of parametrized and gauged variational principles, (2) the application of those principles to the construction of electromagnetic, thermal and mechanical finite elements, (3) the coupling of electromagnetic finite elements with thermal and superconducting effects, and (4) the first detailed finite element simulations of bulk superconductors, in particular the Meissner effect and the nature of the normal conducting boundary layer. The theoretical development is described in two volumes. Volume 1 describes mostly formulation-specific problems. Volume 2 describes generalizations of those formulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hourdequin, Marion, E-mail: Marion.Hourdequin@ColoradoCollege.edu; Department of Philosophy, Colorado College, 14 E. Cache La Poudre St., Colorado Springs, CO 80903; Landres, Peter
Traditional mechanisms for public participation in environmental impact assessment under U.S. federal law have been criticized as ineffective and unable to resolve conflict. As these mechanisms are modified and new approaches developed, we argue that participation should be designed and evaluated not only on practical grounds of cost-effectiveness and efficiency, but also on ethical grounds based on democratic ideals. In this paper, we review and synthesize modern democratic theory to develop and justify four ethical principles for public participation: equal opportunity to participate, equal access to information, genuine deliberation, and shared commitment. We then explore several tensions that are inherent in applying these ethical principles to public participation in EIA. We next examine traditional NEPA processes and newer collaborative approaches in light of these principles. Finally, we explore the circumstances that argue for more in-depth participatory processes. While improved EIA participatory processes do not guarantee improved outcomes in environmental management, processes informed by these four ethical principles derived from democratic theory may lead to increased public engagement and satisfaction with government agency decisions. Highlights: Four ethical principles based on democratic theory for public participation in EIA. NEPA and collaboration offer different strengths in meeting these principles. We explore tensions inherent in applying these principles. Improved participatory processes may improve public acceptance of agency decisions.
NASA Astrophysics Data System (ADS)
Momida, Hiroyoshi; Oguchi, Tamio
2018-04-01
Longitudinal piezoelectric constant (e33) values of wurtzite materials listed in a structure database are calculated and analyzed by using first-principles and statistical learning methods. It is theoretically shown that wurtzite materials with high e33 generally have small lattice constant ratios (c/a), almost independent of the constituent elements, and that e33 is approximately expressed as e33 ∝ c/a - (c/a)0, where (c/a)0 is the ideal lattice constant ratio. This relation also holds for highly piezoelectric ternary materials such as ScxAl1-xN. We conducted a search for highly piezoelectric wurtzite materials by identifying materials with smaller c/a values. It is proposed that the piezoelectricity of ZnO can be significantly enhanced by substitution of Zn with Ca.
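The reported linear trend suggests a simple screening recipe: regress computed e33 values against c/a and read (c/a)0 off the intercept. A sketch with made-up numbers (the actual dataset is the paper's; the values below are purely illustrative):

```python
import numpy as np

# Hypothetical (c/a, e33) pairs for a few wurtzite compounds.
ca  = np.array([1.600, 1.580, 1.555, 1.510])
e33 = np.array([0.70, 1.10, 1.50, 2.30])     # C/m^2, illustrative only

# Fit e33 = k * (c/a) + b; then e33 = k * (c/a - ca0) with ca0 = -b/k.
A = np.vstack([ca, np.ones_like(ca)]).T
k, b = np.linalg.lstsq(A, e33, rcond=None)[0]
ca0 = -b / k
print(f"slope k = {k:.2f}, (c/a)_0 = {ca0:.3f}")  # k < 0: smaller c/a, larger e33
```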
Statistical Limits to Super Resolution
NASA Astrophysics Data System (ADS)
Lucy, L. B.
1992-08-01
The limits imposed by photon statistics on the degree to which Rayleigh's resolution limit for diffraction-limited images can be surpassed by applying image restoration techniques are investigated. An approximate statistical theory is given for the number of detected photons required in the image of an unresolved pair of equal point sources in order that its information content allows in principle resolution by restoration. This theory is confirmed by numerical restoration experiments on synthetic images, and quantitative limits are presented for restoration of diffraction-limited images formed by slit and circular apertures.
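The information-content argument can be made concrete with a toy likelihood-ratio test: as the photon count grows, the data eventually discriminate a sub-Rayleigh pair from a single source. The sketch below uses a Gaussian stand-in for the diffraction PSF; the separation, grid, and threshold reasoning are illustrative assumptions, not the paper's theory:

```python
import numpy as np
from scipy.stats import poisson

def psf(x, x0, sigma=1.0):
    """Gaussian stand-in for the diffraction-limited PSF."""
    return np.exp(-0.5 * ((x - x0) / sigma) ** 2)

def log_like(counts, model):
    lam = model / model.sum() * counts.sum()   # renormalize to total photons
    return poisson.logpmf(counts, np.maximum(lam, 1e-12)).sum()

x, sep = np.linspace(-5, 5, 101), 0.5          # separation below Rayleigh
rng = np.random.default_rng(3)
pair = psf(x, -sep / 2) + psf(x, +sep / 2)

for n_photons in (1e2, 1e4, 1e6):
    counts = rng.poisson(pair / pair.sum() * n_photons)
    d = 2.0 * (log_like(counts, pair) - log_like(counts, psf(x, 0.0)))
    print(f"N = {n_photons:.0e}: 2*log LR = {d:.1f}")  # grows roughly with N
```

Only once the log likelihood ratio is large compared with its noise level does the image "contain" the pair, which is the sense in which photon statistics, not optics alone, set the restoration limit.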
A Note on the Conservation of Mechanical Energy and the Galilean Principle of Relativity
ERIC Educational Resources Information Center
Santos, F. C.; Soares, V.; Tort, A. C.
2010-01-01
A reexamination of simple examples that we usually teach to our students in introductory courses is the starting point for a discussion about the principle of conservation of energy and Galilean invariance. (Contains 5 figures.)
A Mechanical Lattice Aid for Crystallography Teaching.
ERIC Educational Resources Information Center
Amezcua-Lopez, J.; Cordero-Borboa, A. E.
1988-01-01
Introduces a 3-dimensional mechanical lattice with adjustable telescoping mechanisms. Discusses the crystalline state, the 14 Bravais lattices, operational principles of the mechanical lattice, construction methods, and demonstrations in classroom. Provides lattice diagrams, schemes of the lattice, and various pictures of the lattice. (YP)
Wen, Quan; Stepanyants, Armen; Elston, Guy N.; Grosberg, Alexander Y.; Chklovskii, Dmitri B.
2009-01-01
The shapes of dendritic arbors are fascinating and important, yet the principles underlying these complex and diverse structures remain unclear. Here, we analyzed basal dendritic arbors of 2,171 pyramidal neurons sampled from mammalian brains and discovered 3 statistical properties: the dendritic arbor size scales with the total dendritic length, the spatial correlation of dendritic branches within an arbor has a universal functional form, and small parts of an arbor are self-similar. We proposed that these properties result from maximizing the repertoire of possible connectivity patterns between dendrites and surrounding axons while keeping the cost of dendrites low. We solved this optimization problem by drawing an analogy with maximization of the entropy for a given energy in statistical physics. The solution is consistent with the above observations and predicts scaling relations that can be tested experimentally. In addition, our theory explains why dendritic branches of pyramidal cells are distributed more sparsely than those of Purkinje cells. Our results represent a step toward a unifying view of the relationship between neuronal morphology and function. PMID:19622738
The principles of teratology: are they still true?
Friedman, Jan M
2010-10-01
James Wilson originally proposed a set of "Principles of Teratology" in 1959, the year before he helped to found the Teratology Society. By 1977, when these Principles were presented in a more definitive form in Wilson and Fraser's Handbook of Teratology, they had become a standard formulation of the basic tenets of the field. Wilson's Principles have continued to guide scientific research in teratology, and they are widely used in teaching. Recent advances in our knowledge of the molecular and cellular bases of embryogenesis serve only to provide a deeper understanding of the fundamental developmental mechanisms that underlie Wilson's Principles of Teratology. © 2010 Wiley-Liss, Inc.
John, S D
2007-04-01
In this paper the coherence of the precautionary principle as a guide to public health policy is considered. Two conditions that any account of the principle must meet are outlined, a condition of practicality and a condition of publicity. The principle is interpreted in terms of a tripartite division of the outcomes of action (good outcomes, normal bad outcomes and special bad outcomes). Such a division of outcomes can be justified on either "consequentialist" or "deontological" grounds. In the second half of the paper, it is argued that the precautionary principle is not necessarily opposed to risk-cost-benefit analysis, but, rather, should be interpreted as suggesting a lowering of our epistemic standards for assessing evidence that there is a link between some policy and "special bad" outcomes. This suggestion is defended against the claim that it mistakes the nature of statistical testing and against the charge that it is unscientific or antiscientific, and therefore irrational.
Temperature equilibration rate with Fermi-Dirac statistics.
Brown, Lowell S; Singleton, Robert L
2007-12-01
We calculate analytically the electron-ion temperature equilibration rate in a fully ionized, weakly to moderately coupled plasma, using an exact treatment of the Fermi-Dirac electrons. The temperature is sufficiently high so that the quantum-mechanical Born approximation to the scattering is valid. It should be emphasized that we do not build a model of the energy exchange mechanism, but rather, we perform a systematic first principles calculation of the energy exchange. At the heart of this calculation lies the method of dimensional continuation, a technique that we borrow from quantum field theory and use in a different fashion to regulate the kinetic equations in a consistent manner. We can then perform a systematic perturbation expansion and thereby obtain a finite first-principles result to leading and next-to-leading order. Unlike model building, this systematic calculation yields an estimate of its own error and thus prescribes its domain of applicability. The calculational error is small for a weakly to moderately coupled plasma, for which our result is nearly exact. It should also be emphasized that our calculation becomes unreliable for a strongly coupled plasma, where the perturbative expansion that we employ breaks down, and one must then utilize model building and computer simulations. Besides providing different and potentially useful results, we use this calculation as an opportunity to explain the method of dimensional continuation in a pedagogical fashion. Interestingly, in the regime of relevance for many inertial confinement fusion experiments, the degeneracy corrections are comparable in size to the subleading quantum correction below the Born approximation. For consistency, we therefore present this subleading quantum-to-classical transition correction in addition to the degeneracy correction.
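The degeneracy corrections discussed above matter when the electron temperature approaches the Fermi temperature. A minimal sketch using the standard free-electron-gas formula (the density and temperature below are illustrative stand-ins, not the paper's calculation):

```python
import numpy as np

# Physical constants (SI)
hbar = 1.054571817e-34   # J s
m_e  = 9.1093837015e-31  # kg
k_B  = 1.380649e-23      # J/K

def fermi_temperature(n_e):
    """Fermi temperature T_F for a free electron gas of density n_e [m^-3]."""
    E_F = hbar**2 / (2.0 * m_e) * (3.0 * np.pi**2 * n_e) ** (2.0 / 3.0)
    return E_F / k_B

# Illustrative ICF-like conditions (assumed numbers for demonstration only)
n_e = 1e31   # electrons per m^3
T   = 1e7    # plasma temperature in K

theta = T / fermi_temperature(n_e)
print(f"T_F = {fermi_temperature(n_e):.3e} K, degeneracy parameter T/T_F = {theta:.2f}")
# theta >> 1: classical (Boltzmann) electrons; theta of order 1 or below:
# Fermi-Dirac degeneracy corrections to the equilibration rate become relevant.
```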
Dynamic analysis of multirigid-body system based on the Gauss principle
NASA Astrophysics Data System (ADS)
Lilov, L.; Lorer, M.
Two different approaches can be used for solving the basic dynamic problem in the case of a multirigid body system. The first approach is based on the derivation of the nonlinear equations of motion of the mechanical system, while the second approach is concerned with the direct derivation of the unknown accelerations. Using the Gauss principle, the accelerations can be determined by using the condition for the minimum of a functional. The present investigation is concerned with an algorithm for a dynamical study of a multibody system on the basis of the Gauss principle. The system may contain an arbitrary number of closed loops. The main purpose of the proposed algorithm is the investigation of the dynamics of industrial manipulators, robots, and similar mechanisms.
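As a concrete illustration of determining accelerations from a minimum condition, Gauss's principle of least constraint selects the accelerations closest, in a mass-weighted norm, to the unconstrained ones while satisfying the constraints. A minimal sketch for acceleration-level constraints A a = b (a generic toy setup, not the authors' multibody algorithm):

```python
import numpy as np

def gauss_accelerations(M, F, A, b):
    """Minimize Z(a) = 1/2 (a - M^{-1}F)^T M (a - M^{-1}F) subject to A a = b.
    Solved via the KKT system of the equality-constrained quadratic program."""
    n, m = M.shape[0], A.shape[0]
    KKT = np.block([[M, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([F, b])
    sol = np.linalg.solve(KKT, rhs)
    return sol[:n]  # constrained accelerations; sol[n:] are Lagrange multipliers

# Toy example: two unit masses constrained to share the same acceleration.
M = np.eye(2)
F = np.array([1.0, 0.0])     # external forces
A = np.array([[1.0, -1.0]])  # constraint a1 - a2 = 0
b = np.array([0.0])
print(gauss_accelerations(M, F, A, b))  # -> [0.5, 0.5]
```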
Peripheral neuropathic pain: a mechanism-related organizing principle based on sensory profiles.
Baron, Ralf; Maier, Christoph; Attal, Nadine; Binder, Andreas; Bouhassira, Didier; Cruccu, Giorgio; Finnerup, Nanna B; Haanpää, Maija; Hansson, Per; Hüllemann, Philipp; Jensen, Troels S; Freynhagen, Rainer; Kennedy, Jeffrey D; Magerl, Walter; Mainka, Tina; Reimer, Maren; Rice, Andrew S C; Segerdahl, Märta; Serra, Jordi; Sindrup, Sören; Sommer, Claudia; Tölle, Thomas; Vollert, Jan; Treede, Rolf-Detlef
2017-02-01
Patients with neuropathic pain are heterogeneous in etiology, pathophysiology, and clinical appearance. They exhibit a variety of pain-related sensory symptoms and signs (sensory profile). Different sensory profiles might indicate different classes of neurobiological mechanisms, and hence subgroups with different sensory profiles might respond differently to treatment. The aim of the investigation was to identify subgroups in a large sample of patients with neuropathic pain using hypothesis-free statistical methods on the database of 3 large multinational research networks (German Research Network on Neuropathic Pain (DFNS), IMI-Europain, and Neuropain). Standardized quantitative sensory testing was used in 902 (test cohort) and 233 (validation cohort) patients with peripheral neuropathic pain of different etiologies. For subgrouping, we performed a cluster analysis using 13 quantitative sensory testing parameters. Three distinct subgroups with characteristic sensory profiles were identified and replicated. Cluster 1 (sensory loss, 42%) showed a loss of small and large fiber function in combination with paradoxical heat sensations. Cluster 2 (thermal hyperalgesia, 33%) was characterized by preserved sensory functions in combination with heat and cold hyperalgesia and mild dynamic mechanical allodynia. Cluster 3 (mechanical hyperalgesia, 24%) was characterized by a loss of small fiber function in combination with pinprick hyperalgesia and dynamic mechanical allodynia. All clusters occurred across etiologies but frequencies differed. We present a new approach of subgrouping patients with peripheral neuropathic pain of different etiologies according to intrinsic sensory profiles. These 3 profiles may be related to pathophysiological mechanisms and may be useful in clinical trial design to enrich the study population for treatment responders.
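A minimal sketch of the kind of hypothesis-free subgrouping described here; the data matrix, scaling choice, and k-means with k = 3 are assumptions for illustration, not the study's actual pipeline:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Placeholder matrix: 902 patients x 13 quantitative sensory testing parameters.
X = rng.normal(size=(902, 13))

# Standardize each QST parameter, then cluster into three sensory profiles.
Xz = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xz)

# Relative cluster sizes (cf. the reported 42% / 33% / 24% split).
print(np.bincount(labels) / len(labels))
```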
NASA Astrophysics Data System (ADS)
Zharinov, I. O.; Zharinov, O. O.
2017-12-01
This research concerns the quantitative analysis of the influence of technological variation in screen color profile parameters on the chromaticity coordinates of the displayed image. Mathematical expressions are proposed that approximate the two-dimensional distribution of chromaticity coordinates of an image displayed on a screen with a three-component color formation principle. The proposed expressions point the way to correction techniques that improve the reproducibility of the colorimetric features of displays.
A New Principle in Physics: the Principle of "Finiteness", and Some Consequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham Sternlieb
2010-06-25
In this paper I propose a new principle in physics: the principle of "finiteness". It stems from the definition of physics as a science that deals (among other things) with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, the principle of finiteness postulates that the mathematical formulation of "legitimate" laws of physics should prevent exactly zero or infinite solutions. Some consequences of the principle of finiteness are discussed, in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The consequences are derived independently of any other theory or principle in physics. I propose "finiteness" as a postulate (like the constancy of the speed of light in vacuum, "c"), as opposed to a notion whose validity has to be corroborated by, or derived theoretically or experimentally from, other facts, theories, or principles.
On information, negentropy and H-theorem
NASA Astrophysics Data System (ADS)
Chakrabarti, C. G.; Sarker, N. G.
1983-09-01
The paper deals with the importance of the Kullback discrimination information in the statistical characterization of the negentropy of a non-equilibrium state and the irreversibility of a classical dynamical system. The theory, based on the Kullback discrimination information as the H-function, gives new insight into the interrelation between the concepts of coarse-graining and the principle of sufficiency, leading to an important statistical characterization of the thermal equilibrium of a closed system.
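A minimal numerical sketch of the Kullback discrimination information (relative entropy) between a non-equilibrium distribution p and an equilibrium distribution p_eq, the quantity used here as an H-function; the distributions below are illustrative:

```python
import numpy as np

def kullback_information(p, p_eq):
    """D(p || p_eq) = sum_i p_i ln(p_i / p_eq_i) >= 0, vanishing iff p = p_eq."""
    p, p_eq = np.asarray(p, float), np.asarray(p_eq, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / p_eq[mask])))

p_eq = np.array([0.25, 0.25, 0.25, 0.25])  # equilibrium (uniform) state
p    = np.array([0.70, 0.10, 0.10, 0.10])  # non-equilibrium state
print(kullback_information(p, p_eq))  # positive "negentropy"; -> 0 as p relaxes to p_eq
```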
Where statistics and molecular biology meet: microarray experiments.
Kelmansky, Diana M
2013-01-01
This review chapter presents a statistical point of view to microarray experiments with the purpose of understanding the apparent contradictions that often appear in relation to their results. We give a brief introduction of molecular biology for nonspecialists. We describe microarray experiments from their construction and the biological principles the experiments rely on, to data acquisition and analysis. The role of epidemiological approaches and sample size considerations are also discussed.
Molecular dynamics simulations on networks of heparin and collagen.
Kulke, Martin; Geist, Norman; Friedrichs, Wenke; Langel, Walter
2017-06-01
Synthetic scaffolds containing collagen (Type I) are of increasing interest for bone tissue engineering, especially highly porous biomaterials in combination with glycosaminoglycans. In experiments, the integration of heparin during fibrillogenesis resulted in different types of collagen fibrils, but models for this aggregation on a molecular scale were only tentative. We conducted molecular dynamics simulations investigating the binding of heparin to collagen and the influence of the telopeptides during collagen aggregation, aiming to explain the experimental findings on a molecular level. Novel structures for the N- and C-telopeptides were developed with the TIGER2 replica exchange algorithm and dihedral principal component analysis. We present an extended statistical analysis of the mainly electrostatic interaction between heparin and collagen and identify several binding sites. Finally, we propose a molecular mechanism for the influence of glycosaminoglycans on the morphology of collagen fibrils. Proteins 2017; 85:1119-1130. © 2017 Wiley Periodicals, Inc.
Comment on 'Nonlocality, Counterfactuals and Quantum Mechanics'
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapp, H.P.
A recent proof [H. P. Stapp, Am. J. Phys. 65, 300 (1997)], formulated in the symbolic language of modal logic, claims to show that contemporary quantum theory, viewed as a set of rules that allow us to calculate statistical predictions among certain kinds of observations, cannot be imbedded in any rational framework that conforms to the principles that (1) the experimenters' choices of which experiments they will perform can be considered to be free choices, (2) outcomes of measurements are unique, and (3) the free choices just mentioned have no backward-in-time effects of any kind. This claim is similar to Bell's theorem, but much stronger, because no reality assumption alien to quantum philosophy is used. The paper being commented on [W. Unruh, Phys. Rev. A 59, 126 (1999)] argues that some such reality assumption has been "smuggled" in. That argument is examined here and shown, I believe, to be defective.
Large Fluctuations for Spatial Diffusion of Cold Atoms
NASA Astrophysics Data System (ADS)
Aghion, Erez; Kessler, David A.; Barkai, Eli
2017-06-01
We use a new approach to study the large fluctuations of a heavy-tailed system, where the standard large-deviations principle does not apply. Large-deviations theory deals with tails of probability distributions and the rare events of random processes, for example, spreading packets of particles. Mathematically, it concerns the exponential falloff of the density of thin-tailed systems. Here we investigate the spatial density P_t(x) of laser-cooled atoms, where at intermediate length scales the shape is fat tailed. We focus on the rare events beyond this range, which dominate important statistical properties of the system. Through a novel friction mechanism induced by the laser fields, the density is explored with the recently proposed non-normalized infinite-covariant density approach. The small and large fluctuations give rise to a bifractal nature of the spreading packet. We derive general relations which extend our theory to a class of systems with multifractal moments.
Continuity equation for probability as a requirement of inference over paths
NASA Astrophysics Data System (ADS)
González, Diego; Díaz, Daniela; Davis, Sergio
2016-09-01
Local conservation of probability, expressed as the continuity equation, is a central feature of non-equilibrium Statistical Mechanics. In the existing literature, the continuity equation is always motivated by heuristic arguments with no derivation from first principles. In this work we show that the continuity equation is a logical consequence of the laws of probability and the application of the formalism of inference over paths for dynamical systems. That is, the simple postulate that a system moves continuously through time following paths implies the continuity equation. The translation between the language of dynamical paths to the usual representation in terms of probability densities of states is performed by means of an identity derived from Bayes' theorem. The formalism presented here is valid independently of the nature of the system studied: it is applicable to physical systems and also to more abstract dynamics such as financial indicators, population dynamics in ecology among others.
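For reference, the local conservation law derived in the paper is the standard continuity equation for the probability density ρ(x, t) with velocity field v(x, t):

```latex
\frac{\partial \rho(x,t)}{\partial t} + \nabla \cdot \big( \rho(x,t)\, v(x,t) \big) = 0
```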
Extracting Models in Single Molecule Experiments
NASA Astrophysics Data System (ADS)
Presse, Steve
2013-03-01
Single molecule experiments can now monitor the journey of a protein from its assembly near a ribosome to its proteolytic demise. Ideally, all single molecule data should be self-explanatory. However, data originating from single molecule experiments are particularly challenging to interpret on account of fluctuations and noise at such small scales. Realistically, basic understanding comes from models carefully extracted from the noisy data. Statistical mechanics, and maximum entropy in particular, provide a powerful framework for accomplishing this task in a principled fashion. Here I will discuss our work in extracting conformational memory from single molecule force spectroscopy experiments on large biomolecules. One clear advantage of this method is that we let the data tend towards the correct model; we do not fit the data. I will show that the dynamical model of the single molecule dynamics which emerges from this analysis is often more textured and complex than could otherwise come from fitting the data to a preconceived model.
Defects in codoped NiO with gigantic dielectric response
NASA Astrophysics Data System (ADS)
Wu, Ping; Ligatchev, Valeri; Yu, Zhi Gen; Zheng, Jianwei; Sullivan, Michael B.; Zeng, Yingzhi
2009-06-01
We combine first-principles, statistical, and phenomenological methods to investigate the electronic and dielectric properties of NiO and clarify the nature of the gigantic dielectric response in codoped NiO. Unlike previous models, which depend on grain-boundary effects, our model based on small polaron hopping in homogeneous material predicts the dielectric permittivity (10^4-10^5) for heavily Li- and M_D-codoped NiO (M_D = Ti, Al, Si). Furthermore, we reproduce the experimental trends in dielectric properties as a function of the dopants' nature and their concentrations, as well as the reported activation energies for the relaxation in Li- and Ti-codoped NiO (0.308 eV or 0.153 eV, depending on the Fermi-level position). In this study, we demonstrate that small polaron hopping on dopant levels is the dominant mechanism for the gigantic dielectric response in these codoped NiO systems.
NASA Technical Reports Server (NTRS)
Knaub, D.; Yerazunis, S. W.
1978-01-01
Vertical wheel loads, wheel speeds, and torque relationships are considered in the design of a propulsion system capable of responding to steering, slope climbing, and irregular local terrain. The system developed is applied to the RPI Mars roving vehicle. The design and construction of the mechanical system required to implement the elevation laser scanning/multidetector principle are also discussed.
Primer of statistics in dental research: part I.
Shintani, Ayumi
2014-01-01
Statistics play essential roles in evidence-based dentistry (EBD) practice and research, ranging widely from formulating scientific questions, designing studies, and collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal among many dental researchers, in part because statistical authorities struggle to explain statistical principles to health researchers without elaborate mathematical machinery. This series of 2 articles aims to introduce dental researchers to 9 essential topics in statistics for conducting EBD, with intuitive examples. Part I of the series covers the first 5 topics: (1) statistical graphs, (2) how to deal with outliers, (3) p-values and confidence intervals, (4) testing equivalence, and (5) multiplicity adjustment. Part II will cover the remaining topics: (6) selecting the proper statistical tests, (7) repeated measures analysis, (8) epidemiological considerations for causal association, and (9) analysis of agreement. Copyright © 2014. Published by Elsevier Ltd.
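A minimal sketch of two of the listed topics, a confidence interval for a mean and a Bonferroni multiplicity adjustment; the data and numbers are synthetic, for illustration only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.0, size=30)  # e.g., 30 hypothetical measurements

# 95% confidence interval for the mean (t distribution).
m, se = x.mean(), stats.sem(x)
ci = stats.t.interval(0.95, len(x) - 1, loc=m, scale=se)
print(f"mean = {m:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")

# Bonferroni adjustment: testing k hypotheses at family-wise alpha = 0.05.
k, alpha = 5, 0.05
print(f"per-test significance threshold = {alpha / k:.3f}")
```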
Rationally designed synthetic protein hydrogels with predictable mechanical properties.
Wu, Junhua; Li, Pengfei; Dong, Chenling; Jiang, Heting; Bin Xue; Gao, Xiang; Qin, Meng; Wang, Wei; Bin Chen; Cao, Yi
2018-02-12
Designing synthetic protein hydrogels with tailored mechanical properties similar to naturally occurring tissues is an eternal pursuit in tissue engineering and stem cell and cancer research. However, it remains challenging to correlate the mechanical properties of protein hydrogels with the nanomechanics of individual building blocks. Here we use single-molecule force spectroscopy, protein engineering and theoretical modeling to prove that the mechanical properties of protein hydrogels are predictable based on the mechanical hierarchy of the cross-linkers and the load-bearing modules at the molecular level. These findings provide a framework for rationally designing protein hydrogels with independently tunable elasticity, extensibility, toughness and self-healing. Using this principle, we demonstrate the engineering of self-healable muscle-mimicking hydrogels that can significantly dissipate energy through protein unfolding. We expect that this principle can be generalized for the construction of protein hydrogels with customized mechanical properties for biomedical applications.
Biomechanical concepts applicable to minimally invasive fracture repair in small animals.
Chao, Peini; Lewis, Daniel D; Kowaleski, Michael P; Pozzi, Antonio
2012-09-01
Understanding the basic biomechanical principles of surgical stabilization of fractures is essential for developing an appropriate preoperative plan as well as making prudent intraoperative decisions. This article aims to provide basic biomechanical knowledge essential to the understanding of the complex interaction between the mechanics and biology of fracture healing. The type of healing and the outcome can be influenced by several mechanical factors, which depend on the interaction between bone and implant. The surgeon should understand the mechanical principles of fracture fixation and be able to choose the best type of fixation for each specific fracture. Copyright © 2012 Elsevier Inc. All rights reserved.
Mechanisms of developmental neurite pruning.
Schuldiner, Oren; Yaron, Avraham
2015-01-01
The precise wiring of the nervous system is a combined outcome of progressive and regressive events during development. Axon guidance and synapse formation, intertwined with cell death and neurite pruning, sculpt the mature circuitry. It is now well recognized that pruning of dendrites and axons as a means to refine neuronal networks is a widespread phenomenon required for the normal development of vertebrate and invertebrate nervous systems. Here we review the emerging principles of the cellular and molecular mechanisms of neurite pruning. We discuss these principles in light of studies in multiple neuronal systems, and speculate on potential explanations for the emergence of neurite pruning as a mechanism to sculpt the nervous system.
Conceptual Models and Theory-Embedded Principles on Effective Schooling.
ERIC Educational Resources Information Center
Scheerens, Jaap
1997-01-01
Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…
NASA Astrophysics Data System (ADS)
Eisenbach, Markus
The Locally Self-consistent Multiple Scattering (LSMS) code solves the first-principles density functional theory Kohn-Sham equation for a wide range of materials, with a special focus on metals, alloys, and metallic nanostructures. It has traditionally exhibited near-perfect scalability on massively parallel high performance computer architectures. We present our efforts to exploit GPUs to accelerate the LSMS code, enabling first-principles calculations of O(100,000) atoms and statistical physics sampling of finite temperature properties. Using the Cray XK7 system Titan at the Oak Ridge Leadership Computing Facility, we achieve a sustained performance of 14.5 PFlop/s and a speedup of 8.6 compared to the CPU-only code. This work has been sponsored by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Material Sciences and Engineering Division and by the Office of Advanced Scientific Computing. This work used resources of the Oak Ridge Leadership Computing Facility, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725.
NASA Astrophysics Data System (ADS)
Lytvynenko, D. M.; Slyusarenko, Yu V.
2017-08-01
A theory of quasi-neutral equilibrium states of charges above a liquid dielectric surface is developed. The theory is based on the first principles of quantum statistics for systems of many identical particles. The proposed approach applies the variational principle, modified for the systems considered, and the Thomas-Fermi model. In terms of the developed theory, self-consistency equations are obtained that relate the main parameters describing the system: the potential of the static electric field, the distribution function of the charges, and the surface profile of the liquid dielectric. The equations are used to study the phase transition of the system to a spatially periodic state. The proposed method can be applied to analyze the properties of this phase transition with respect to spatially periodic states of wave type. Using analytical and numerical methods, we perform a detailed study of the dependence of the critical parameters of the phase transition on the thickness of the liquid dielectric film. Some stability criteria for the new asymmetric phase of the studied system are discussed.
NASA Astrophysics Data System (ADS)
Tanona, Scott Daniel
I develop a new analysis of Niels Bohr's Copenhagen interpretation of quantum mechanics by examining the development of his views from his earlier use of the correspondence principle in the so-called 'old quantum theory' to his articulation of the idea of complementarity in the context of the novel mathematical formalism of quantum mechanics. I argue that Bohr was motivated not by controversial and perhaps dispensable epistemological ideas (positivism or neo-Kantianism, for example) but by his own unique perspective on the difficulties of creating a new working physics of the internal structure of the atom. Bohr's use of the correspondence principle in the old quantum theory was associated with an empirical methodology that used this principle as an epistemological bridge to connect empirical phenomena with quantum models. The application of the correspondence principle required that one determine the validity of the idealizations and approximations necessary for the judicious use of classical physics within quantum theory. Bohr's interpretation of the new quantum mechanics then focused on the largely unexamined ways in which the developing abstract mathematical formalism is given empirical content by precisely this process of approximation. Significant consistency between his later interpretive framework and his forms of argument with the correspondence principle indicates that complementarity is best understood as a relationship among the various approximations and idealizations that must be made when one connects otherwise meaningless quantum mechanical symbols to empirical situations or 'experimental arrangements' described using concepts from classical physics. We discover that this relationship is unavoidable not through any sort of a priori analysis of the priority of classical concepts, but because quantum mechanics incorporates the correspondence approach in the way in which it represents quantum properties with matrices of transition probabilities, whose empirical meaning depends on the situation but in general is tied to the correspondence connection to the spectra. For Bohr, it is then the commutation relations, which arise from the formalism, that inform us of the complementary nature of this approximate representation of quantum properties via the classical equations through which we connect them to experiments.
Bag-breakup control of surface drag in hurricanes
NASA Astrophysics Data System (ADS)
Troitskaya, Yuliya; Zilitinkevich, Sergej; Kandaurov, Alexander; Ermakova, Olga; Kozlov, Dmitry; Sergeev, Daniil
2016-04-01
Air-sea interaction at extreme winds is of special interest now in connection with the problem of sea surface drag reduction at wind speeds exceeding 30-35 m/s. This phenomenon, predicted by Emanuel (1995) and confirmed by a number of field (e.g., Powell et al., 2003) and laboratory (Donelan et al., 2004) experiments, still awaits a physical explanation. Several papers attributed the drag reduction to spume droplets - spray torn off the crests of breaking waves (e.g., Kudryavtsev and Makin, 2011; Bao et al., 2011). The fluxes associated with the spray are determined by the rate of droplet production at the surface, quantified by the sea spray generation function (SSGF), defined as the number of spray particles of radius r produced from a unit area of water surface in unit time. However, the mechanism of spume droplet formation is unknown, and empirical estimates of the SSGF vary over six orders of magnitude; the production rate of large sea spray droplets is therefore not adequately described, and there are significant uncertainties in estimates of exchange processes in hurricanes. It is likewise unknown what the air-sea interface looks like and how water is fragmented into spray at hurricane winds. Using high-speed video filming, we observed the mechanisms of spume droplet production at strong winds, investigated their statistics, and compared their efficiency. The experiments showed that spume droplets are generated near the wave crest by three types of events: bursting of submerged bubbles, generation and breakup of "projections", and "bag breakup", the last being the inflation and subsequent bursting of short-lived, sail-like pieces of the water-surface film ("bags"). Statistical analysis of the experiments showed that bag breakup is the main spray-generation mechanism at hurricane winds. On the basis of general principles of statistical physics (the model of a canonical ensemble), we developed statistics of the bag-breakup events: their number and the statistical distribution of their geometrical parameters as functions of wind speed. Based on these statistics, we estimated the surface stress caused by bags as the average sum of the stresses caused by individual bags, depending on their geometrical parameters. The resulting stress is subject to counteracting effects of increasing wind speed - an increasing number of bags, but decreasing sizes and lifetimes - and the balance yields a peaked dependence of the bag resistance on wind speed: the share of bag stress peaks at U10 of about 35 m/s and then decreases. This peaking of the surface stress associated with bag breakup explains the seemingly paradoxical non-monotonic wind dependence of the surface drag coefficient, which peaks at winds of about 35 m/s. This work was supported by the Russian Foundation for Basic Research (14-05-91767, 13-05-12093, 16-05-00839, 16-55-52025, 15-35-20953); the experiment and equipment were supported by the Russian Science Foundation (Agreements 14-17-00667 and 15-17-20009, respectively). Yu. Troitskaya, A. Kandaurov and D. Sergeev were partially supported by FP7 Collaborative Project No. 612610.
Non-equilibrium statistical mechanics theory for the large scales of geophysical flows
NASA Astrophysics Data System (ADS)
Eric, S.; Bouchet, F.
2010-12-01
The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations is discussed. We also present recent results for non-equilibrium situations, in which forcing and dissipation are in statistical balance; as an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, Random changes of flow topology in two-dimensional and geophysical turbulence, Physical Review Letters 102 (2009), no. 9, 094504; F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207; A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. [Figure: non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forcing - time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows, as predicted by statistical mechanics.]
Ionospheric scintillation studies
NASA Technical Reports Server (NTRS)
Rino, C. L.; Freemouw, E. J.
1973-01-01
The diffracted field of a monochromatic plane wave was characterized by two complex correlation functions. For a Gaussian complex field, these quantities suffice to completely define the statistics of the field. Thus, one can in principle calculate the statistics of any measurable quantity in terms of the model parameters. The best data fits were achieved for intensity statistics derived under the Gaussian statistics hypothesis. The signal structure that achieved the best fit was nearly invariant with scintillation level and irregularity source (ionosphere or solar wind). It was characterized by the fact that more than 80% of the scattered signal power is in phase quadrature with the undeviated or coherent signal component. Thus, the Gaussian-statistics hypothesis is both convenient and accurate for channel modeling work.
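A minimal simulation of the Gaussian-field model described here: a constant coherent component plus a complex Gaussian scattered component with most of its power in phase quadrature. The 80% figure is taken from the abstract; the scattered power level and sample size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
coherent = 1.0                 # undeviated (coherent) signal component
P_scat = 0.2                   # total scattered power (illustrative level)

# >80% of the scattered power is in phase quadrature with the coherent component.
sig_q = np.sqrt(0.8 * P_scat)  # quadrature (imaginary) standard deviation
sig_i = np.sqrt(0.2 * P_scat)  # in-phase (real) standard deviation
field = (coherent + sig_i * rng.normal(size=N)) + 1j * sig_q * rng.normal(size=N)

# Intensity statistics under the Gaussian-statistics hypothesis.
I = np.abs(field) ** 2
S4 = np.sqrt(I.var() / I.mean() ** 2)  # scintillation index
print(f"S4 = {S4:.3f}")
```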
Statistical mechanics based on fractional classical and quantum mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamic properties of the classical ideal gas and a system of N classical oscillators. In both cases, the Hamiltonian contains fractional exponents of the phase-space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamic properties of black-body radiation, study Bose-Einstein statistics together with the related condensation problem, and treat Fermi-Dirac statistics.
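Schematically, the fractional Hamiltonians in question have the form below, with α and β the fractional exponents and A, B dimensional constants; this is a sketch of the structure only, and the paper's precise normalizations may differ:

```latex
H = A\,|p|^{\alpha} + B\,|q|^{\beta}, \qquad 0 < \alpha,\ \beta \le 2
```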
A mechanical design principle for tissue structure and function in the airway tree.
LaPrad, Adam S; Lutchen, Kenneth R; Suki, Béla
2013-01-01
With every breath, the dynamically changing mechanical pressures must work in unison with the cells and soft tissue structures of the lung to permit air to efficiently traverse the airway tree and undergo gas exchange in the alveoli. The influence of mechanics on cell and tissue function is becoming apparent, raising the question: how does the airway tree co-exist within its mechanical environment to maintain normal cell function throughout its branching structure of diminishing dimensions? We introduce a new mechanical design principle for the conducting airway tree in which mechanotransduction at the level of cells is driven to orchestrate airway wall structural changes that can best maintain a preferred mechanical microenvironment. To support this principle, we report in vitro radius-transmural pressure relations for a range of airway radii obtained from healthy bovine lungs and model the data using a strain energy function together with a thick-walled cylinder description. From this framework, we estimate circumferential stresses and incremental Young's moduli throughout the airway tree. Our results indicate that the conducting airways consistently operate within a preferred mechanical homeostatic state, termed mechanical homeostasis, that is characterized by a narrow range of circumferential stresses and Young's moduli. This mechanical homeostatic state is maintained for all airways throughout the tree via airway wall dimensional and mechanical relationships. As a consequence, cells within the airway walls throughout the airway tree experience similar oscillatory strains during breathing that are much smaller than previously thought. Finally, we discuss the potential implications of how the maintenance of mechanical homeostasis, while facilitating healthy tissue-level alterations necessary for maturation, may lead to airway wall structural changes capable of chronic asthma.
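A minimal sketch of the thick-walled cylinder ingredient of this framework, using the classical Lamé solution for the circumferential (hoop) stress under internal (transmural) pressure; the paper couples such a description to a measured strain energy function, which is not reproduced here, and the numbers below are assumed for illustration:

```python
import numpy as np

def hoop_stress(p_i, r_i, r_o, r):
    """Lame solution for a thick-walled cylinder with internal pressure p_i,
    inner radius r_i, outer radius r_o; circumferential stress at radius r."""
    return p_i * r_i**2 / (r_o**2 - r_i**2) * (1.0 + r_o**2 / r**2)

# Illustrative airway-like dimensions (assumed, not the paper's data).
p_i, r_i, r_o = 1.0e3, 2.0e-3, 2.4e-3   # Pa, m, m
r = np.linspace(r_i, r_o, 5)
print(hoop_stress(p_i, r_i, r_o, r))     # stress decreases from inner to outer wall
```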
Many-Body Localization and Thermalization in Quantum Statistical Mechanics
NASA Astrophysics Data System (ADS)
Nandkishore, Rahul; Huse, David A.
2015-03-01
We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.
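A standard numerical diagnostic for the thermal-versus-localized distinction (a widely used tool, not a method named in this review's text) is the mean ratio of adjacent energy-level spacings: GOE random-matrix statistics give roughly 0.53 for thermalizing systems, while Poisson statistics give roughly 0.39 in the localized phase. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_gap_ratio(levels):
    """Mean r = <min(s_n, s_{n+1}) / max(s_n, s_{n+1})> over adjacent spacings."""
    s = np.diff(np.sort(levels))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

n = 2000
H = rng.normal(size=(n, n))
goe = np.linalg.eigvalsh((H + H.T) / 2)   # GOE: level repulsion, "thermal"
poisson = np.sort(rng.uniform(size=n))    # uncorrelated levels, "localized"

print(mean_gap_ratio(goe), mean_gap_ratio(poisson))  # ~0.53 vs ~0.39
```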
LeChâtelier's Principle in the Sciences
NASA Astrophysics Data System (ADS)
Thomsen, Volker B. E.
2000-02-01
LeChâtelier's principle of chemical equilibrium is actually a very general statement about systems in equilibrium and their behavior when subjected to external force or stress. Although one almost never finds mention of his name or law in other sciences, analogous principles and concepts do exist. In this note we examine some of the similar forms taken by this chemical principle in the fields of physics, geology, biology, and economics. Lenz's law in physics is an example of electromagnetic equilibrium and the geological principle of isostatic uplift concerns mechanical equilibrium. Both are strictly consequences of conservation of energy. LeChâtelier's principle deals with thermodynamic equilibrium and involves both the first and second laws of thermodynamics. The concept of homeostasis in biology and the economic law of supply and demand are both equilibrium-like principles, but involve systems in the steady state. However, all these principles involve the stability of the system under consideration and the analogies presented may be useful in the teaching of LeChâtelier's principle.
Approaches to Foster Transfer of Formal Principles: Which Route to Take?
Schalk, Lennart; Saalbach, Henrik; Stern, Elsbeth
2016-01-01
Enabling learners to transfer knowledge about formal principles to new problems is a major aim of science and mathematics education, which, however, is notoriously difficult to reach. Previous research advocates different approaches for introducing principles to foster the transfer of knowledge about them. One approach suggests teaching a generic formalism of the principles. Another suggests presenting (at least) two concrete cases instantiating the principle. A third suggests presenting a generic formalism accompanied by a case. As yet, though, empirical results regarding the transfer potential of these approaches are mixed and difficult to integrate, as the three approaches have rarely been tested competitively. Furthermore, the approaches have been evaluated in relation to different control conditions, and they have been assessed using varying transfer measures. In the present experiment, we introduced undergraduates to the formal principles of propositional logic with the aim of systematically comparing the transfer potential of the different approaches in relation to each other and to a common control condition, using various learning and transfer tasks. Results indicate that all approaches supported successful learning and transfer of the principles, but also caused systematic differences in the magnitude of transfer. In particular, the combination of a generic formalism with a case was surprisingly unsuccessful, while learners who compared two cases outperformed the control condition. We discuss how the simultaneous assessment of the different approaches allows us to capture the underlying learning mechanisms more precisely and to advance theory on how these mechanisms contribute to transfer performance. PMID:26871902
NASA Astrophysics Data System (ADS)
Lu, Haibao; Yu, Kai; Huang, Wei Min; Leng, Jinsong
2016-12-01
We present an explicit model to study the mechanics and physics of the shape memory effect (SME) in polymers based on the Takayanagi principle. The molecular structural characteristics and elastic behavior of shape memory polymers (SMPs) with multiple phases are investigated in terms of the thermomechanical properties of the individual components, whose contributions are combined using Takayanagi's series-parallel model and parallel-series model, respectively. The Boltzmann superposition principle is then employed to couple the multi-SME, the elastic modulus parameter (E), and the temperature parameter (T) in SMPs. Furthermore, the extended Takayanagi model is proposed to separate the plasticizing effect and the physical swelling effect on the thermo-/chemo-responsive SME in polymers, and the results are compared with the available experimental data reported in the literature. This study is expected to provide a powerful simulation tool for modeling and experimental substantiation of the mechanics and working mechanism of the SME in polymers.
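A minimal sketch of the two coupling rules named here: the parallel (Voigt) and series (Reuss) combinations of two phases, which are the building blocks of Takayanagi's series-parallel and parallel-series models. The moduli and volume fraction below are illustrative assumptions, not values from the paper:

```python
def modulus_parallel(E1, E2, phi):
    """Parallel (Voigt) coupling: phases share strain; phi = fraction of phase 1."""
    return phi * E1 + (1.0 - phi) * E2

def modulus_series(E1, E2, phi):
    """Series (Reuss) coupling: phases share stress."""
    return 1.0 / (phi / E1 + (1.0 - phi) / E2)

# Illustrative values: glassy (frozen) phase vs rubbery (active) phase of an SMP.
E_glassy, E_rubbery, phi = 1.0e9, 1.0e6, 0.5   # Pa, Pa, volume fraction
print(modulus_parallel(E_glassy, E_rubbery, phi))  # ~5.0e8 Pa: stiff phase dominates
print(modulus_series(E_glassy, E_rubbery, phi))    # ~2.0e6 Pa: soft phase dominates
```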
NASA Astrophysics Data System (ADS)
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics.
“Stringy” coherent states inspired by generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Ghosh, Subir; Roy, Pinaki
2012-05-01
Coherent states with the fractional revival property that explicitly satisfy the generalized uncertainty principle (GUP) have been constructed in the context of the generalized harmonic oscillator. The existence of such states is essential in motivating the GUP-based phenomenological results present in the literature, which otherwise would be of purely academic interest. The effective phase space is non-canonical (non-commutative in popular terminology). Our results have a smooth commutative limit, equivalent to the Heisenberg uncertainty principle. The fractional revival time analysis yields an independent bound on the GUP parameter. Using this and similar bounds obtained here, we derive the largest possible value of the (GUP-induced) minimum length scale. Mandel parameter analysis shows that the statistics are sub-Poissonian. The correspondence principle is deformed in an interesting way. Our computational scheme is very simple, as it requires only first-order corrected energy values and undeformed basis states.
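For reference, the Mandel parameter used here is defined from the number statistics, with Q < 0 signaling sub-Poissonian statistics. A minimal numerical sketch for an arbitrary number distribution (the binomial example is illustrative, not the paper's GUP coherent state):

```python
import numpy as np
from scipy.stats import binom

def mandel_Q(p_n):
    """Q = (<n^2> - <n>^2)/<n> - 1; Q = 0 Poissonian, Q < 0 sub-Poissonian."""
    n = np.arange(len(p_n))
    mean = np.sum(n * p_n)
    var = np.sum(n**2 * p_n) - mean**2
    return var / mean - 1.0

# A binomial number distribution is sub-Poissonian: Q = -p.
p_n = binom.pmf(np.arange(21), n=20, p=0.3)
print(mandel_Q(p_n))  # approximately -0.3
```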
Similarity principles for the biology of pelagic animals
Barenblatt, G. I.; Monin, A. S.
1983-01-01
A similarity principle is formulated according to which the statistical pattern of the pelagic population is identical in all scales sufficiently large in comparison with the molecular one. From this principle, a power law is obtained analytically for the pelagic animal biomass distribution over the animal sizes. A hypothesis is presented according to which, under fixed external conditions, the oxygen exchange intensity of an animal is governed only by its mass and density and by the specific absorbing capacity of the animal's respiratory organ. From this hypothesis a power law is obtained by the method of dimensional analysis for the exchange intensity mass dependence. The known empirical values of the exponent of this power law are interpreted as an indication that the oxygen-absorbing organs of the animals can be represented as so-called fractal surfaces. In conclusion the biological principle of the decrease in specific exchange intensity with increase in animal mass is discussed. PMID:16593327
Quantum Mechanical Earth: Where Orbitals Become Orbits
ERIC Educational Resources Information Center
Keeports, David
2012-01-01
Macroscopic objects, although quantum mechanical by nature, conform to Newtonian mechanics under normal observation. According to the quantum mechanical correspondence principle, quantum behavior is indistinguishable from classical behavior in the limit of very large quantum numbers. The purpose of this paper is to provide an example of the…
The Architect's Guide to Mechanical Systems.
ERIC Educational Resources Information Center
Andrews, F. T.
The principles and problems of designing new building mechanical systems are discussed in this reference source in the light of data on the functions and operation of mechanical systems. As a practical guide to understanding mechanical systems it describes system types, functions, space requirements, weights, installation, maintenance and…
High-Performance First-Principles Molecular Dynamics for Predictive Theory and Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gygi, Francois; Galli, Giulia; Schwegler, Eric
This project focused on developing high-performance software tools for First-Principles Molecular Dynamics (FPMD) simulations, and applying them in investigations of materials relevant to energy conversion processes. FPMD is an atomistic simulation method that combines a quantum-mechanical description of electronic structure with the statistical description provided by molecular dynamics (MD) simulations. This reliance on fundamental principles allows FPMD simulations to provide a consistent description of structural, dynamical and electronic properties of a material. This is particularly useful in systems for which reliable empirical models are lacking. FPMD simulations are increasingly used as a predictive tool for applications such as batteries, solar energy conversion, light-emitting devices, electro-chemical energy conversion devices and other materials. During the course of the project, several new features were developed and added to the open-source Qbox FPMD code. The code was further optimized for scalable operation on large-scale, Leadership-Class DOE computers. When combined with Many-Body Perturbation Theory (MBPT) calculations, this infrastructure was used to investigate structural and electronic properties of liquid water, ice, aqueous solutions, nanoparticles and solid-liquid interfaces. Computing both ionic trajectories and electronic structure in a consistent manner enabled the simulation of several spectroscopic properties, such as Raman spectra, infrared spectra, and sum-frequency generation spectra. The accuracy of the approximations used allowed for direct comparisons of results with experimental data such as optical spectra, X-ray and neutron diffraction spectra. The software infrastructure developed in this project, as applied to various investigations of solids, liquids and interfaces, demonstrates that FPMD simulations can provide a detailed, atomic-scale picture of structural, vibrational and electronic properties of complex systems relevant to energy conversion devices.
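Schematically, an FPMD step advances the ions with a standard MD integrator while the forces come from a converged electronic-structure solve at each ionic configuration. A minimal velocity-Verlet sketch with a placeholder force callback standing in for the Kohn-Sham solve; this illustrates the coupling only and is not Qbox code:

```python
import numpy as np

def velocity_verlet(positions, velocities, masses, force_fn, dt, n_steps):
    """Generic MD driver: in an FPMD code, force_fn(positions) would be the
    first-principles (Kohn-Sham) force evaluation; here it is a placeholder."""
    f = force_fn(positions)
    for _ in range(n_steps):
        velocities += 0.5 * dt * f / masses[:, None]
        positions  += dt * velocities
        f = force_fn(positions)            # electronic structure re-solved here
        velocities += 0.5 * dt * f / masses[:, None]
    return positions, velocities

# Toy stand-in force: harmonic wells (hypothetical, replacing the DFT forces).
force_fn = lambda x: -x
x0 = np.array([[0.1, 0.0, 0.0], [0.0, 0.2, 0.0]])
v0 = np.zeros_like(x0)
m  = np.array([1.0, 1.0])
print(velocity_verlet(x0, v0, m, force_fn, dt=0.01, n_steps=100)[0])
```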
The growth of language: Universal Grammar, experience, and principles of computation.
Yang, Charles; Crain, Stephen; Berwick, Robert C; Chomsky, Noam; Bolhuis, Johan J
2017-10-01
Human infants develop language remarkably rapidly and without overt instruction. We argue that the distinctive ontogenesis of child language arises from the interplay of three factors: domain-specific principles of language (Universal Grammar), external experience, and properties of non-linguistic domains of cognition including general learning mechanisms and principles of efficient computation. We review developmental evidence that children make use of hierarchically composed structures ('Merge') from the earliest stages and at all levels of linguistic organization. At the same time, longitudinal trajectories of development show sensitivity to the quantity of specific patterns in the input, which suggests the use of probabilistic processes as well as inductive learning mechanisms that are suitable for the psychological constraints on language acquisition. By considering the place of language in human biology and evolution, we propose an approach that integrates principles from Universal Grammar and constraints from other domains of cognition. We outline some initial results of this approach as well as challenges for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.
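As a concrete illustration of 'Merge' as a structure-building operation: it combines two syntactic objects into a new hierarchical object, which can itself be merged again. A minimal sketch using the common set-based textbook formalization (an illustration only, not the authors' notation):

```python
def merge(a, b):
    """Merge two syntactic objects into an unordered, hierarchical set."""
    return frozenset({a, b})

# Build "read books", then embed it under "will", by repeated application.
vp = merge("read", "books")
tp = merge("will", vp)
print(tp)  # frozenset({'will', frozenset({'read', 'books'})})
```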
Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.
1983-09-01
Research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly... The compendium covers probability, statistics, soil mechanics, rock mechanics, and engineering geology. The purpose of this report is to make available to the U.S. ... Keywords: Deformation; Dynamic Response Analysis; Seepage, Soil Permeability and Piping; Earthquake Engineering, Seismology; Settlement and Heave; Seismic Risk Analysis.
Principle of Maximum Fisher Information from Hardy’s Axioms Applied to Statistical Systems
Frieden, B. Roy; Gatenby, Robert A.
2014-01-01
Consider a finite-sized, multidimensional system in a parameter state a. The system is in either a state of equilibrium or general non-equilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N = max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. What happens, then, if the scope of the axiom N = max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I = Imax. This is important because many physical laws have been derived assuming, as a working hypothesis, that I = Imax; these derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations are the De Broglie wave hypothesis, quantum wave equations, Maxwell's equations, new laws of biology (e.g., of Coulomb force-directed cell development, and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I = Imax itself derives from suitably extended Hardy axioms eliminates the need to assume it in these derivations. Thus, uses of I = Imax and EPI express physics at its most fundamental level - its axiomatic basis in mathematics. PMID:24229152
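For reference, the Fisher information for a location parameter, with a minimal Monte Carlo check against the known Gaussian value I = 1/sigma^2; the distribution and sample size are illustrative, since the paper works at the level of the axioms rather than specific families:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 2.0
x = rng.normal(loc=0.0, scale=sigma, size=1_000_000)

# For a location family p(x - theta), I = E[(d ln p / dx)^2].
# Gaussian: d ln p / dx = -(x - mu)/sigma^2, hence I = 1/sigma^2.
score = -(x - 0.0) / sigma**2
I_mc = np.mean(score**2)
print(I_mc, 1.0 / sigma**2)  # Monte Carlo estimate ~0.25 vs exact 0.25
```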
ENVIRONMENTAL SAMPLING: A BRIEF REVIEW
Proper application of statistical principles at the outset of an environmental study can make the difference between an effective, efficient study and wasted resources. This review distills some of the thoughts current among environmental scientists from a variety of backgrounds ...
Visual aftereffects and sensory nonlinearities from a single statistical framework
Laparra, Valero; Malo, Jesús
2015-01-01
When adapted to a particular scenery our senses may fool us: colors are misinterpreted, certain spatial patterns seem to fade out, and static objects appear to move in reverse. A mere empirical description of the mechanisms tuned to color, texture, and motion may tell us where these visual illusions come from. However, such empirical models of gain control do not explain why these mechanisms work in this apparently dysfunctional manner. Current normative explanations of aftereffects based on scene statistics derive gain changes by (1) invoking decorrelation and linear manifold matching/equalization, or (2) using nonlinear divisive normalization obtained from parametric scene models. These principled approaches have different drawbacks: the first is not compatible with the known saturation nonlinearities in the sensors and it cannot fully accomplish information maximization due to its linear nature. In the second, gain change is almost determined a priori by the assumed parametric image model linked to divisive normalization. In this study we show that both the response changes that lead to aftereffects and the nonlinear behavior can be simultaneously derived from a single statistical framework: the Sequential Principal Curves Analysis (SPCA). As opposed to mechanistic models, SPCA is not intended to describe how physiological sensors work, but it is focused on explaining why they behave as they do. Nonparametric SPCA has two key advantages as a normative model of adaptation: (i) it is better than linear techniques as it is a flexible equalization that can be tuned for more sensible criteria other than plain decorrelation (either full information maximization or error minimization); and (ii) it makes no a priori functional assumption regarding the nonlinearity, so the saturations emerge directly from the scene data and the goal (and not from the assumed function). It turns out that the optimal responses derived from these more sensible criteria and SPCA are consistent with dysfunctional behaviors such as aftereffects. PMID:26528165
NASA Astrophysics Data System (ADS)
Rau, Uwe; Brendel, Rolf
1998-12-01
It is shown that a recently described general relationship between the local collection efficiency of solar cells and the dark carrier concentration (reciprocity theorem) follows directly from the principle of detailed balance. We derive the relationship for situations where transport of charge carriers occurs between discrete states, as well as for the situation where electronic transport is described in terms of continuous functions. Combining both situations makes it possible to extend the range of applicability of the reciprocity theorem to all types of solar cells, including, e.g., metal-insulator-semiconductor-type and electrochemical solar cells, and to include the impurity photovoltaic effect. We generalize the theorem further to situations where the occupation probability of electronic states is governed by Fermi-Dirac statistics instead of the Boltzmann statistics underlying preceding work. In such a situation the reciprocity theorem is restricted to small departures from equilibrium.
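As a small numerical aside on the statistics distinction the abstract draws (textbook occupation factors, not the paper's derivation), a sketch comparing Fermi-Dirac and Boltzmann occupations, which agree only when E - mu >> kT:

```python
import numpy as np

kT = 0.0259                      # thermal energy near 300 K, in eV
mu = 0.0                         # reference chemical potential, in eV
E = np.linspace(-0.1, 0.4, 6)    # state energies, in eV

f_fd = 1.0 / (np.exp((E - mu) / kT) + 1.0)   # Fermi-Dirac occupation
f_b = np.exp(-(E - mu) / kT)                 # Boltzmann approximation

for e, fd, fb in zip(E, f_fd, f_b):
    print(f"E-mu = {e:+.2f} eV   FD = {fd:.3e}   Boltzmann = {fb:.3e}")
# Near and below mu the Boltzmann factor overshoots (it can exceed 1),
# which is why degenerate conditions require the Fermi-Dirac treatment.
```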
A.N. Kolmogorov’s defence of Mendelism
Stark, Alan; Seneta, Eugene
2011-01-01
In 1939 N.I. Ermolaeva published the results of an experiment which repeated parts of Mendel’s classical experiments. On the basis of her experiment she concluded that Mendel’s principle that self-pollination of hybrid plants gave rise to segregation proportions 3:1 was false. The great probability theorist A.N. Kolmogorov reviewed Ermolaeva’s data using a test, now referred to as Kolmogorov’s, or Kolmogorov-Smirnov, test, which he had proposed in 1933. He found, contrary to Ermolaeva, that her results clearly confirmed Mendel’s principle. This paper shows that there were methodological flaws in Kolmogorov’s statistical analysis and presents a substantially adjusted approach, which confirms his conclusions. Some historical commentary on the Lysenko-era background is given, to illuminate the relationship of the disciplines of genetics and statistics in the struggle against the prevailing politically-correct pseudoscience in the Soviet Union. There is a Brazilian connection through the person of Th. Dobzhansky. PMID:21734813
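As an illustrative companion to the statistical question at issue (a sketch, not Kolmogorov's actual analysis, which pooled many families via his 1933 test), a single family's counts can be checked against the 3:1 expectation with a chi-square goodness-of-fit test; the counts below are hypothetical:

```python
from scipy.stats import chisquare

# Hypothetical counts from one F2 family: dominant vs. recessive phenotype.
observed = [705, 224]
n = sum(observed)
expected = [0.75 * n, 0.25 * n]   # Mendel's 3:1 segregation ratio

stat, p = chisquare(observed, f_exp=expected)
print(f"chi2 = {stat:.3f}, p = {p:.3f}")
# A large p-value means the counts are consistent with 3:1 segregation.
```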
NASA Astrophysics Data System (ADS)
Eisenthal, Joshua
2018-05-01
At the time of Heinrich Hertz's premature death in 1894, he was regarded as one of the leading scientists of his generation. However, the posthumous publication of his treatise in the foundations of physics, Principles of Mechanics, presents a curious historical situation. Although Hertz's book was widely praised and admired, it was also met with a general sense of dissatisfaction. Almost all of Hertz's contemporaries criticized Principles for the lack of any plausible way to construct a mechanism from the "hidden masses" that are particularly characteristic of Hertz's framework. This issue seemed especially glaring given the expectation that Hertz's work might lead to a model of the underlying workings of the ether. In this paper I seek an explanation for why Hertz seemed so unperturbed by the difficulties of constructing such a mechanism. In arriving at this explanation, I explore how the development of Hertz's image-theory of representation framed the project of Principles. The image-theory brings with it an austere view of the "essential content" of mechanics, only requiring a kind of structural isomorphism between symbolic representations and target phenomena. I argue that bringing this into view makes clear why Hertz felt no need to work out the kinds of mechanisms that many of his readers looked for. Furthermore, I argue that a crucial role of Hertz's hypothesis of hidden masses has been widely overlooked. Far from acting as a proposal for the underlying structure of the ether, I show that Hertz's hypothesis ruled out knowledge of such underlying structure.
Difference to Inference: teaching logical and statistical reasoning through on-line interactivity.
Malloy, T E
2001-05-01
Difference to Inference is an on-line JAVA program that simulates theory testing and falsification through research design and data collection in a game format. The program, based on cognitive and epistemological principles, is designed to support learning of the thinking skills underlying deductive and inductive logic and statistical reasoning. Difference to Inference has database connectivity so that game scores can be counted as part of course grades.
Statistical aspects of the Klein-Gordon oscillator in the framework of GUP
NASA Astrophysics Data System (ADS)
Khosropour, B.
2018-01-01
Investigations in perturbative string theory and quantum gravity suggest that there is a measurable minimal length in nature. In this work, according to the generalized uncertainty principle, we study the statistical characteristics of the Klein-Gordon oscillator (KLO). The modified energy spectrum of the KLO is obtained. The generalized thermodynamical quantities of the KLO, such as the partition function, mean energy and entropy, are calculated using the modified energy spectrum.
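The abstract does not give the modified spectrum, so here is only a generic sketch: given any spectrum E_n, the canonical quantities follow from the partition function. The spectrum below is a harmonic-like one with a hypothetical quadratic correction of strength `alpha`, purely for illustration, not the paper's GUP result:

```python
import numpy as np

def thermo_from_spectrum(E, beta):
    """Partition function, mean energy and entropy from a spectrum E_n."""
    w = np.exp(-beta * E)
    Z = w.sum()
    p = w / Z
    U = np.sum(p * E)            # mean energy
    S = -np.sum(p * np.log(p[p > 0].min() * 0 + p.clip(1e-300)))  # entropy, units of k_B
    return Z, U, S

n = np.arange(2000)
alpha = 1e-4                                 # hypothetical correction strength
E = (n + 0.5) + alpha * (n + 0.5)**2         # illustrative modified spectrum, hbar*omega = 1

Z, U, S = thermo_from_spectrum(E, beta=0.5)
print(f"Z = {Z:.4f}, U = {U:.4f}, S = {S:.4f}")
```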
A Non-Intrusive Algorithm for Sensitivity Analysis of Chaotic Flow Simulations
NASA Technical Reports Server (NTRS)
Blonigan, Patrick J.; Wang, Qiqi; Nielsen, Eric J.; Diskin, Boris
2017-01-01
We demonstrate a novel algorithm for computing the sensitivity of statistics in chaotic flow simulations to parameter perturbations. The algorithm is non-intrusive, but requires the simulation to expose an interface. Based on the principle of shadowing in dynamical systems, this algorithm is designed to reduce the effect of sampling error when computing the sensitivity of statistics in chaotic simulations. We compare the effectiveness of this method to that of the conventional finite difference method.
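For context on the sampling-error issue (a sketch of the conventional finite-difference baseline the paper compares against, not the shadowing algorithm itself), using the time-averaged z of the Lorenz system as the chaotic statistic; shadowing studies report a sensitivity near 1 for this quantity:

```python
import numpy as np

def mean_z(rho, steps=200_000, dt=0.002, burn=20_000):
    """Time-averaged z of the Lorenz system for parameter rho (forward Euler)."""
    sigma, beta = 10.0, 8.0 / 3.0
    x, y, z = 1.0, 1.0, 1.0
    acc = 0.0
    for i in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if i >= burn:
            acc += z
    return acc / (steps - burn)

# Finite-difference sensitivity d<z>/drho. For chaotic systems this estimate
# is polluted by sampling error that decays only slowly with averaging time,
# which is the problem shadowing-based methods are designed to mitigate.
rho, drho = 28.0, 1.0
sens = (mean_z(rho + drho) - mean_z(rho - drho)) / (2 * drho)
print(f"finite-difference d<z>/drho ~ {sens:.3f}")
```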
MECHANICAL POWER TRANSFER SYSTEMS. AGRICULTURAL MACHINERY-SERVICE OCCUPATIONS, MODULE NUMBER 8.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Center for Vocational and Technical Education.
One of a series designed to help teachers prepare postsecondary-level students for the agricultural machinery service occupations as parts men, mechanics, mechanic's helpers, and service supervisors, this guide aims to develop student competency in understanding and applying the principles of mechanical power transmission in agricultural…
MODELING THE INTERACTION THRESHOLD: THE BREAK-POINT BETWEEN ADDITIVITY AND NON-ADDITIVITY
Dose-dependent changes in toxicity mechanisms of single chemicals may take place along the full dose-response spectrum. At high doses, the possibility exists for some steps in the principal mechanism of toxicity to shift to other mechanisms. The possibility of mechanism shifts fo...
Teaching Classical Statistical Mechanics: A Simulation Approach.
ERIC Educational Resources Information Center
Sauer, G.
1981-01-01
Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
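In the spirit of the simulation approach described (a generic sketch, not Sauer's actual procedure), repeated random pairwise energy exchanges at fixed total energy drive a gas toward the exponential, Boltzmann-like distribution of per-particle energies:

```python
import numpy as np

rng = np.random.default_rng(1)
N, E_total = 10_000, 10_000.0
E = np.full(N, E_total / N)              # start: every particle has equal energy

for _ in range(200_000):
    i, j = rng.choice(N, size=2, replace=False)   # pick a random pair
    pool = E[i] + E[j]
    E[i] = rng.uniform(0.0, pool)        # redistribute their combined energy
    E[j] = pool - E[i]

# Total energy is conserved exactly; the microstate becomes disordered and
# per-particle energies approach an exponential distribution with mean 1.
print(f"mean = {E.mean():.3f} (exact 1.0), std = {E.std():.3f} (exponential -> ~1.0)")
```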
Particles, Waves, and the Interpretation of Quantum Mechanics
ERIC Educational Resources Information Center
Christoudouleas, N. D.
1975-01-01
Presents an explanation, without mathematical equations, of the basic principles of quantum mechanics. Includes wave-particle duality, the probability character of the wavefunction, and the uncertainty relations. (MLH)
Techniques in teaching statistics : linking research production and research use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinez-Moyano, I.; Smith, A. (Univ. of Massachusetts at Boston)
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
MICROSCOPE Mission: First Results of a Space Test of the Equivalence Principle.
Touboul, Pierre; Métris, Gilles; Rodrigues, Manuel; André, Yves; Baghi, Quentin; Bergé, Joël; Boulanger, Damien; Bremer, Stefanie; Carle, Patrice; Chhun, Ratana; Christophe, Bruno; Cipolla, Valerio; Damour, Thibault; Danto, Pascale; Dittus, Hansjoerg; Fayet, Pierre; Foulon, Bernard; Gageant, Claude; Guidotti, Pierre-Yves; Hagedorn, Daniel; Hardy, Emilie; Huynh, Phuong-Anh; Inchauspe, Henri; Kayser, Patrick; Lala, Stéphanie; Lämmerzahl, Claus; Lebat, Vincent; Leseur, Pierre; Liorzou, Françoise; List, Meike; Löffler, Frank; Panet, Isabelle; Pouilloux, Benjamin; Prieur, Pascal; Rebray, Alexandre; Reynaud, Serge; Rievers, Benny; Robert, Alain; Selig, Hanns; Serron, Laura; Sumner, Timothy; Tanguy, Nicolas; Visser, Pieter
2017-12-08
According to the weak equivalence principle, all bodies should fall at the same rate in a gravitational field. The MICROSCOPE satellite, launched in April 2016, aims to test its validity at the 10^{-15} precision level, by measuring the force required to maintain two test masses (of titanium and platinum alloys) exactly in the same orbit. A nonvanishing result would correspond to a violation of the equivalence principle, or to the discovery of a new long-range force. Analysis of the first data gives δ(Ti,Pt)=[-1±9(stat)±9(syst)]×10^{-15} (1σ statistical uncertainty) for the titanium-platinum Eötvös parameter characterizing the relative difference in their free-fall accelerations.
ERIC Educational Resources Information Center
Morris, Richard; Stuckey, Mary E.
1997-01-01
Sketches a key mechanism called "Substitute Vocabularies" whereby practitioners of democracies seek to reconcile contradictions between democratic political principle and repressive political action. Illustrates this by identifying the Richard Nixon administration's political principles regarding Native Americans as articulated in…
Tested Demonstrations: Thermodynamic Changes, Kinetics, Equilibrium, and LeChatelier's Principle.
ERIC Educational Resources Information Center
Gilbert, George L., Ed.
1984-01-01
Procedures for demonstrating thermodynamic changes, kinetics and reaction mechanisms, equilibrium, and LeChatelier's principle are presented. The only materials needed for these demonstrations are beakers, water, assorted wooden blocks of varying thickness, assorted rubber tubing, and a sponge. The concepts illustrated in each demonstration are…
What Makes the Foucault Pendulum Move among the Stars?
NASA Astrophysics Data System (ADS)
Phillips, Norman
2004-11-01
Foucault's pendulum exhibition in 1851 occurred in an era now known for the development of the theorems of Coriolis and the formulation of dynamical meteorology by Ferrel. Yet even today the behavior of the pendulum is often misunderstood. The existence of a horizontal component of Newtonian gravitation is essential for understanding its behavior with respect to the stars. Two simple mechanical principles describe why the path of oscillation is fixed only at the poles: the principle of centripetal acceleration and the principle of conservation of angular momentum. A sky map is used to describe the elegant path among the stars produced by these principles.
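As a small numerical companion (the standard textbook result, not taken from the paper): the pendulum's plane precesses at Omega*sin(latitude), so the precession period lengthens from one sidereal day at the poles toward infinity at the equator:

```python
import math

SIDEREAL_DAY_H = 23.934          # Earth's rotation period, in hours

def precession_period_h(latitude_deg):
    """Foucault precession period at a given latitude (infinite at the equator)."""
    s = math.sin(math.radians(latitude_deg))
    return math.inf if s == 0 else SIDEREAL_DAY_H / abs(s)

for lat in (90.0, 48.85, 30.0, 0.0):   # pole, Paris (Foucault's 1851 demo), 30 deg, equator
    print(f"latitude {lat:5.2f} deg -> period {precession_period_h(lat):7.2f} h")
```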
NASA Astrophysics Data System (ADS)
Nanson, Gerald C.; Huang, He Qing
2018-02-01
Until recently, no universally agreed philosophical or scientific methodological framework had been proposed to guide the study of fluvial geomorphology. An understanding of river form and process requires an understanding of the principles that govern the behaviour and evolution of alluvial rivers at the most fundamental level. To date, investigations of such principles have followed four approaches: develop qualitative unifying theories that are usually untested; collect and examine data visually and statistically to define semi-quantitative relationships among variables; apply Newtonian theoretical and empirical mechanics in a reductionist manner; resolve the primary flow equations theoretically by assuming maximum or minimum outputs. Here we recommend not a fifth but an overarching philosophy to embrace all four: clarifying and formalising an understanding of the evolution of river channels and iterative directional changes in the context of the least action principle (LAP), the theoretical basis of variational mechanics. LAP is exemplified in rivers in the form of maximum flow efficiency (MFE). A sophisticated understanding of evolution in its broadest sense is essential to understand how rivers adjust towards an optimum state rather than towards some other. Because rivers, as dynamic contemporary systems, flow in valleys that are commonly historical landforms and often tectonically determined, we propose that most of the world's alluvial rivers are over-powered for the work they must do. To remain stable they commonly evolve to expend surplus energy via a variety of dynamic equilibrium forms that will further adjust, where possible, to maximise their stability as much less common MFE forms in stationary equilibrium. This paper: 1. shows that the theory of evolution is derived from, and applicable to, both the physical and biological sciences; 2. focusses the development of theory in geomorphology on the development of equilibrium theory; 3. proposes that river channels, like organisms, evolve teleomatically (progression towards an end-state by following natural laws) and iteratively (one stage forming the basis for the next) towards an optimal end-state; 4. describes LAP as the methodological basis for understanding the self-adjustment of alluvial channels towards MFE; 5. acknowledges that, whereas river channels that form within their unmodified alluvium evolve into optimal minimum-energy systems, exogenic variables, such as riparian or aquatic vegetation, can cause significant variations in resultant river-styles. We specifically attempt to address Luna Leopold's lament in 1994 that no clearly expressed philosophy explains the remarkable self-adjustment of alluvial channels.
Detector noise statistics in the non-linear regime
NASA Technical Reports Server (NTRS)
Shopbell, P. L.; Bland-Hawthorn, J.
1992-01-01
The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
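A minimal simulation of the clipping effect described (illustrative, not the paper's model): Poisson photon counts truncated at a saturation level show reduced variance and distorted higher moments, exactly the quantities the abstract discusses:

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(2)
mean_counts = 1000.0
saturation = 1040                # full-well / ADC limit, ~1.3 sigma above the mean

raw = rng.poisson(mean_counts, size=500_000)
clipped = np.minimum(raw, saturation)    # saturation removes the upper tail

for name, x in (("raw", raw), ("clipped", clipped)):
    print(f"{name:8s} var = {x.var():7.1f}  skew = {skew(x):+.3f}  "
          f"kurtosis = {kurtosis(x):+.3f}")
# Clipping shrinks the variance below the Poisson value (var = mean) and
# skews the distribution, distorting the detector noise characteristics.
```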
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
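As a toy illustration of the maximum-likelihood versus Bayesian distinction the chapter covers (generic, not taken from the text): estimating a binomial success probability, where a conjugate Beta prior makes the posterior available in closed form:

```python
from scipy.stats import beta

k, n = 7, 20                      # 7 successes in 20 trials
p_ml = k / n                      # maximum likelihood estimate

# Bayesian inference with a Beta(2, 2) prior: by conjugacy the posterior
# is Beta(2 + k, 2 + n - k).
post = beta(2 + k, 2 + n - k)
lo, hi = post.ppf([0.025, 0.975])
print(f"ML estimate: {p_ml:.3f}")
print(f"posterior mean: {post.mean():.3f}, 95% credible interval: [{lo:.3f}, {hi:.3f}]")
```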
New Optical Transforms For Statistical Image Recognition
NASA Astrophysics Data System (ADS)
Lee, Sing H.
1983-12-01
In optical implementation of statistical image recognition, new optical transforms on large images for real-time recognition are of special interest. Several important linear transformations frequently used in statistical pattern recognition have now been optically implemented, including the Karhunen-Loeve transform (KLT), the Fukunaga-Koontz transform (FKT) and the least-squares linear mapping technique (LSLMT) [1-3]. The KLT performs principal components analysis on one class of patterns for feature extraction. The FKT performs feature extraction for separating two classes of patterns. The LSLMT separates multiple classes of patterns by maximizing the interclass differences and minimizing the intraclass variations.
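A digital sketch of the KLT step (the optical implementations are the paper's contribution; this shows only the underlying linear algebra): principal components from the eigendecomposition of a class's covariance matrix, on synthetic patterns:

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy "image" patterns: 500 samples of 16 correlated features.
A = rng.normal(size=(16, 16))
X = rng.normal(size=(500, 16)) @ A        # one class of patterns

C = np.cov(X, rowvar=False)               # class covariance matrix
evals, evecs = np.linalg.eigh(C)          # ascending eigenvalues
order = np.argsort(evals)[::-1]
klt_basis = evecs[:, order[:4]]           # keep 4 principal components

Y = (X - X.mean(axis=0)) @ klt_basis      # KLT feature extraction
print("retained variance fraction:", evals[order[:4]].sum() / evals.sum())
```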
Coupled Structural, Thermal, Phase-Change and Electromagnetic Analysis for Superconductors. Volume 1
NASA Technical Reports Server (NTRS)
Felippa, C. A.; Farhat, C.; Park, K. C.; Militello, C.; Schuler, J. J.
1996-01-01
Described are the theoretical development and computer implementation of reliable and efficient methods for the analysis of coupled mechanical problems that involve the interaction of mechanical, thermal, phase-change and electromagnetic subproblems. The focus application has been the modeling of superconductivity and associated quantum-state phase-change phenomena. In support of this objective the work has addressed the following issues: (1) development of variational principles for finite elements, (2) finite element modeling of the electromagnetic problem, (3) coupling of thermal and mechanical effects, and (4) computer implementation and solution of the superconductivity transition problem. The main accomplishments have been: (1) the development of the theory of parametrized and gauged variational principles, (2) the application of those principles to the construction of electromagnetic, thermal and mechanical finite elements, (3) the coupling of electromagnetic finite elements with thermal and superconducting effects, and (4) the first detailed finite element simulations of bulk superconductors, in particular the Meissner effect and the nature of the normal conducting boundary layer. The theoretical development is described in two volumes. This volume, Volume 1, describes mostly formulations for specific problems. Volume 2 describes generalizations of those formulations.
On spectroscopy for a whole Abelian model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chauca, J.; Doria, R.; Aprendanet, Petropolis, 25600
A whole abelian gauge symmetry, postulated on the notion of wholeness, is introduced. Various physical areas, such as complexity, statistical mechanics and quantum mechanics, partially support this approach, in which the whole is taken as the origin. The reductionist crisis posed by quark confinement further sustains this insight: fundamental parts cannot be observed in isolation. Consequently, there is an experimental situation in which the parts should be supplemented by something more, which leads us to express a wholeness principle within gauge theory. To this end, the gauge parameter is reinterpreted: instead of compensating fields, it organizes a systemic gauge symmetry. This introduces a set of fields {A_{μI}} rotating under a common gauge symmetry. Given such a field collection {A_{μI}} as the origin, the effort in this work is to investigate its spectroscopy: to analyze, for the abelian case, the corresponding quanta; to understand that in a whole model diversity replaces elementarity; to derive the associated quantum numbers, such as spin, mass, charge and discrete symmetries, in terms of the systemic symmetry; and to observe how particle diversity is manifested in terms of wholeness.
Random Evolution of Idiotypic Networks: Dynamics and Architecture
NASA Astrophysics Data System (ADS)
Brede, Markus; Behn, Ulrich
The paper deals with modelling a subsystem of the immune system, the so-called idiotypic network (INW). INWs, conceived by N.K. Jerne in 1974, are functional networks of interacting antibodies and B cells. In principle, Jerne's framework provides solutions to many issues in immunology, such as immunological memory, mechanisms for antigen recognition and self/non-self discrimination. Explaining the interconnection between the elementary components, local dynamics, network formation and architecture, and possible modes of global system function appears to be an ideal playground for statistical mechanics. We present a simple cellular automaton model, based on a graph representation of the system. From a simplified description of idiotypic interactions, rules for the random evolution of networks of occupied and empty sites on these graphs are derived. In certain biologically relevant parameter ranges the resultant dynamics leads to stationary states. A stationary state is found to correspond to a specific pattern of network organization. It turns out that even these very simple rules give rise to a multitude of different kinds of patterns. We characterize these networks by classifying 'static' and 'dynamic' network-patterns. A type of 'dynamic' network is found to display many features of real INWs.
Stability of direct band gap under mechanical strains for monolayer MoS2, MoSe2, WS2 and WSe2
NASA Astrophysics Data System (ADS)
Deng, Shuo; Li, Lijie; Li, Min
2018-07-01
Single-layer transition-metal dichalcogenide materials (MoS2, MoSe2, WS2 and WSe2) are investigated using the first-principles method, with emphasis on their responses to mechanical strains. All these materials display a direct band gap over a certain range of strains from compressive to tensile (the stable range). We find that this stable range differs among the materials. By studying their mechanical properties, again with the first-principles approach, we show that the stable strain range is determined by the Young's modulus. Further analysis of strain-induced electronic band gap properties has also been conducted.
Biological Implications of Dynamical Phases in Non-equilibrium Networks
NASA Astrophysics Data System (ADS)
Murugan, Arvind; Vaikuntanathan, Suriyanarayanan
2016-03-01
Biology achieves novel functions like error correction, ultra-sensitivity and accurate concentration measurement at the expense of free energy through Maxwell Demon-like mechanisms. The design principles and free energy trade-offs have been studied for a variety of such mechanisms. In this review, we emphasize a perspective based on dynamical phases that can explain commonalities shared by these mechanisms. Dynamical phases are defined by typical trajectories executed by non-equilibrium systems in the space of internal states. We find that coexistence of dynamical phases can have dramatic consequences for function vs free energy cost trade-offs. Dynamical phases can also provide an intuitive picture of the design principles behind such biological Maxwell Demons.
Mechanisms of developmental neurite pruning
Schuldiner, Oren; Yaron, Avraham
2016-01-01
The precise wiring of the nervous system is a combined outcome of progressive and regressive events during development. Axon guidance and synapse formation, intertwined with cell death and neurite pruning, sculpt the mature circuitry. It is now well recognized that pruning of dendrites and axons as a means to refine neuronal networks is a widespread phenomenon required for the normal development of vertebrate and invertebrate nervous systems. Here we review the emerging principles of the cellular and molecular mechanisms of neurite pruning. We discuss these principles in light of studies in multiple neuronal systems, and speculate on potential explanations for the emergence of neurite pruning as a mechanism to sculpt the nervous system. PMID:25213356
NASA Astrophysics Data System (ADS)
Saidi, F.; Sebaa, N.; Mahmoudi, A.; Aourag, H.; Merad, G.; Dergal, M.
2018-06-01
We performed first-principles calculations to investigate the structural, phase-stability, electronic and mechanical properties of the Laves phases YM2 (M = Mn, Fe, Co) in the C15, C14 and C36 structures. We used density functional theory within the framework of both pseudo-potentials and a plane-wave basis, using VASP (Vienna Ab initio Simulation Package). The calculated equilibrium structural parameters are in accordance with available theoretical values. Mechanical properties were calculated, discussed, and analyzed with a data-mining approach in terms of structure stability. The results reveal that YCo2 is harder than YFe2 and YMn2.
Fuzzy classifier based support vector regression framework for Poisson ratio determination
NASA Astrophysics Data System (ADS)
Asoodeh, Mojtaba; Bagheripour, Parisa
2013-09-01
Poisson ratio is considered one of the most important rock mechanical properties of hydrocarbon reservoirs. Determination of this parameter through laboratory measurement is time-, cost-, and labor-intensive. Furthermore, laboratory measurements do not provide continuous data along the reservoir intervals. Hence, a fast, accurate, and inexpensive way of determining Poisson ratio which produces continuous data over the whole reservoir interval is desirable. For this purpose, the support vector regression (SVR) method based on statistical learning theory (SLT) was employed as a supervised learning algorithm to estimate Poisson ratio from conventional well log data. SVR is capable of accurately extracting the implicit knowledge contained in conventional well logs and converting the gained knowledge into Poisson ratio data. The structural risk minimization (SRM) principle, which is embedded in the SVR structure in addition to the empirical risk minimization (ERM) principle, provides a robust model for finding a quantitative formulation between conventional well log data and Poisson ratio. Although satisfying results were obtained from an individual SVR model, it had flaws of overestimation in low Poisson ratios and underestimation in high Poisson ratios. These errors were eliminated through implementation of a fuzzy classifier based SVR (FCBSVR). The FCBSVR significantly improved accuracy of the final prediction. This strategy was successfully applied to data from carbonate reservoir rocks of an Iranian Oil Field. Results indicated that SVR-predicted Poisson ratio values are in good agreement with measured values.
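A minimal sketch of the regression step only (the actual feature set, fuzzy classifier stage, and field data are the paper's): fitting an SVR to synthetic "well log" features, here using scikit-learn, which is an assumption rather than the authors' toolchain:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# Hypothetical conventional well-log inputs (e.g. sonic, density, porosity).
X = rng.normal(size=(400, 3))
y = 0.25 + 0.05 * X[:, 0] - 0.03 * X[:, 1] + 0.02 * rng.normal(size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X_tr, y_tr)
print(f"R^2 on held-out samples: {model.score(X_te, y_te):.3f}")
```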
Modelling Trial-by-Trial Changes in the Mismatch Negativity
Lieder, Falk; Daunizeau, Jean; Garrido, Marta I.; Friston, Karl J.; Stephan, Klaas E.
2013-01-01
The mismatch negativity (MMN) is a differential brain response to violations of learned regularities. It has been used to demonstrate that the brain learns the statistical structure of its environment and predicts future sensory inputs. However, the algorithmic nature of these computations and the underlying neurobiological implementation remain controversial. This article introduces a mathematical framework with which competing ideas about the computational quantities indexed by MMN responses can be formalized and tested against single-trial EEG data. This framework was applied to five major theories of the MMN, comparing their ability to explain trial-by-trial changes in MMN amplitude. Three of these theories (predictive coding, model adjustment, and novelty detection) were formalized by linking the MMN to different manifestations of the same computational mechanism: approximate Bayesian inference according to the free-energy principle. We thereby propose a unifying view on three distinct theories of the MMN. The relative plausibility of each theory was assessed against empirical single-trial MMN amplitudes acquired from eight healthy volunteers in a roving oddball experiment. Models based on the free-energy principle provided more plausible explanations of trial-by-trial changes in MMN amplitude than models representing the two more traditional theories (change detection and adaptation). Our results suggest that the MMN reflects approximate Bayesian learning of sensory regularities, and that the MMN-generating process adjusts a probabilistic model of the environment according to prediction errors. PMID:23436989
ERIC Educational Resources Information Center
Woodward, Robert L.; Myers, Norman L.
The instructional units and related materials in this guide are designed to assist in the preparation of courses of study/instruction in (1) power mechanics specifically, (2) power mechanics which serve as introductory courses in other areas of industrial arts, and (3) automotive mechanics which also cover the broader aspects of power mechanics.…
NASA Astrophysics Data System (ADS)
Guzzardi, Luca
2014-06-01
This paper discusses Ernst Mach's interpretation of the principle of energy conservation (EC) in the context of the development of energy concepts and ideas about causality in nineteenth-century physics and theory of science. In doing this, it focuses on the close relationship between causality, energy conservation and space in Mach's antireductionist view of science. Mach expounds his thesis about EC in his first historical-epistemological essay, Die Geschichte und die Wurzel des Satzes von der Erhaltung der Arbeit (1872): far from being a new principle, EC has been used from the early beginnings of mechanics, independently of other principles; in fact, it is a pre-mechanical principle generally applied in investigating nature: indeed, nothing but a form of the principle of causality. The paper focuses on the scientific-historical premises and philosophical underpinnings of Mach's thesis, beginning with the classic debate on the validity and limits of the notion of cause by Hume, Kant, and Helmholtz. Such reference also implies a discussion of the relationship between causality on the one hand and space and time on the other. This connection plays a major role for Mach, and in the final paragraphs its importance for understanding his antireductionist perspective is argued, i.e. the rejection of any attempt to give an ultimate explanation of the world via reduction of nature to one fundamental set of phenomena.
Dynamical basis sets for algebraic variational calculations in quantum-mechanical scattering theory
NASA Technical Reports Server (NTRS)
Sun, Yan; Kouri, Donald J.; Truhlar, Donald G.; Schwenke, David W.
1990-01-01
New basis sets are proposed for linear algebraic variational calculations of transition amplitudes in quantum-mechanical scattering problems. These basis sets are hybrids of those that yield the Kohn variational principle (KVP) and those that yield the generalized Newton variational principle (GNVP) when substituted in Schlessinger's stationary expression for the T operator. Trial calculations show that efficiencies almost as great as those of the GNVP, and much greater than those of the KVP, can be obtained even for basis sets with the majority of the members independent of energy.
NASA Astrophysics Data System (ADS)
Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.
2018-01-01
We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles, using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with spatial and temporal fractality taken into account. We also present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.
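For reference (a generic numerical note, not the paper's formalism): the Renyi entropy H_q = ln(sum_i p_i^q)/(1 - q) reduces to the Shannon/Gibbs entropy as q approaches 1, which a few lines of code make concrete:

```python
import numpy as np

def renyi_entropy(p, q):
    """Renyi entropy of order q for a discrete distribution p (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):                 # q -> 1 limit: Shannon entropy
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p**q)) / (1.0 - q)

p = [0.5, 0.25, 0.125, 0.125]
for q in (0.5, 0.999, 1.0, 2.0):
    print(f"H_{q} = {renyi_entropy(p, q):.4f}")
# H_0.999 is already close to the q = 1 (Shannon) value, recovering the
# Gibbs case as a limit of the Renyi family.
```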
Feedback in Action--The Mechanism of the Iris.
ERIC Educational Resources Information Center
Pingnet, B.; And Others
1988-01-01
Describes two demonstration experiments. Outlines a demonstration of the general principle of positive and negative feedback and the influence of time delays in feedback circuits. Elucidates the principle of negative feedback with a model of the iris of the eye. Emphasizes the importance of feedback in biological systems. (CW)
77 FR 27015 - Announcement of Grant and Loan Application Deadlines and Funding Levels
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-08
... principles and general administrative requirements for grants pertaining to their organizational type in..., interest rates, terms, and collateral requirements. (5) Provide a marketing plan. (6) Explain the mechanics... figures to the nearest dollar. Applicants should consult OMB Circular A-122: ``Cost Principles for Non...
Development of Canonical Transformations from Hamilton's Principle.
ERIC Educational Resources Information Center
Quade, C. Richard
1979-01-01
The theory of canonical transformations and its development are discussed with regard to its application to Hamilton's principle. Included are the derivation of the equations of motion, a lack of symmetry in the formulation with respect to the Lagrangian, and the fundamental commutator relations of quantum mechanics. (Author/SA)
The Readability of Principles of Macroeconomics Textbooks
ERIC Educational Resources Information Center
Tinkler, Sarah; Woods, James
2013-01-01
The authors evaluated principles of macroeconomics textbooks for readability using Coh-Metrix, a computational linguistics tool. Additionally, they conducted an experiment on Amazon's Mechanical Turk Web site in which participants ranked the readability of text samples. There was a wide range of scores on readability indexes both among…
Optically nonlinear energy transfer in light-harvesting dendrimers.
Andrews, David L; Bradshaw, David S
2004-08-01
Dendrimeric polymers are the subject of intense research activity geared towards their implementation in nanodevice applications such as energy harvesting systems, organic light-emitting diodes, photosensitizers, low-threshold lasers, and quantum logic elements, etc. A recent development in this area has been the construction of dendrimers specifically designed to exhibit novel forms of optical nonlinearity, exploiting the unique properties of these materials at high levels of photon flux. Starting from a thorough treatment of the underlying theory based on the principles of molecular quantum electrodynamics, it is possible to identify and characterize several optically nonlinear mechanisms for directed energy transfer and energy pooling in multichromophore dendrimers. Such mechanisms fall into two classes: first, those where two-photon absorption by individual donors is followed by transfer of the net energy to an acceptor; second, those where the excitation of two electronically distinct but neighboring donor groups is followed by a collective migration of their energy to a suitable acceptor. Each transfer process is subject to minor dissipative losses. In this paper we describe in detail the balance of factors and the constraints that determines the favored mechanism, which include the excitation statistics, structure of the energy levels, laser coherence factors, chromophore selection rules and architecture, possibilities for the formation of delocalized excitons, spectral overlap, and the overall distribution of donors and acceptors. Furthermore, it transpires that quantum interference between different mechanisms can play an important role. Thus, as the relative importance of each mechanism determines the relevant nanophotonic characteristics, the results reported here afford the means for optimizing highly efficient light-harvesting dendrimer devices. (c) 2004 American Institute of Physics.
Speck, Olga; Schlechtendahl, Mark; Borm, Florian; Kampowski, Tim; Speck, Thomas
2018-01-01
During evolution, plants have evolved various reactions to wounding. Fast wound sealing and subsequent healing represent a selective advantage of particular importance for plants growing in arid habitats. An effective self-sealing function by internal deformation has been found in the succulent leaves of Delosperma cooperi. After a transversal incision, the entire leaf bends until the wound is closed. Our results indicate that the underlying sealing principle is a combination of hydraulic shrinking and swelling as the main driving forces and growth-induced mechanical pre-stresses in the tissues. Hydraulic effects were measured in terms of the relative bending angle over 55 minutes under various humidity conditions. The higher the relative air humidity, the lower the bending angle. Negative bending angles were found when a droplet of liquid water was applied to the wound. The statistical analysis revealed highly significant differences for the single main effects "humidity conditions in the wound region" and "time after wounding" and for their interaction effect. The centripetal arrangement of five tissue layers with various thicknesses and significantly different mechanical properties might play an additional role with regard to mechanically driven effects. Injury disturbs the mechanical equilibrium, with pre-stresses leading to internal deformation until a new equilibrium is reached. In the context of self-sealing by internal deformation, the highly flexible wide-band tracheids, which form a net of vascular bundles, are regarded as paedomorphic tracheids, which are specialised to prevent cell collapse under drought stress and allow growth-induced mechanical pre-stresses to build up.
Matsushima, Takashi; Blumenfeld, Raphael
2017-03-01
The microstructural organization of a granular system is the most important determinant of its macroscopic behavior. Here we identify the fundamental factors that determine the statistics of such microstructures, using numerical experiments to gain a general understanding. The experiments consist of preparing and compacting isotropically two-dimensional granular assemblies of polydisperse frictional disks and analyzing the emergent statistical properties of quadrons, the basic structural elements of granular solids. The focus on quadrons is because the statistics of their volumes have been found to display intriguing universal-like features [T. Matsushima and R. Blumenfeld, Phys. Rev. Lett. 112, 098003 (2014)]. The dependence of the structures and of the packing fraction on the intergranular friction and the initial state is analyzed, and a number of significant results are found. (i) An analytical formula is derived for the mean quadron volume in terms of three macroscopic quantities: the mean coordination number, the packing fraction, and the rattlers fraction. (ii) We derive a unique, initial-state-independent relation between the mean coordination number and the rattler-free packing fraction. The relation is supported numerically for a range of different systems. (iii) We collapse the quadron volume distributions from all systems onto one curve, and we verify that they all have an exponential tail. (iv) The nature of the quadron volume distribution is investigated by decomposition into conditional distributions of volumes given the cell order, and we find that each of these also collapses onto a single curve. (v) We find that the mean quadron volume decreases with increasing intergranular friction coefficients, an effect that is prominent in high-order cells. We argue that this phenomenon is due to an increased probability of stable irregularly shaped cells, and we test this using a herewith developed free cell analytical model. We conclude that, in principle, the microstructural characteristics are governed mainly by the packing procedure, while the effects of intergranular friction and initial states are details that can be scaled away. However, mechanical stability constraints suppress slightly the occurrence of small quadron volumes in cells of order ≥6, and the magnitude of this effect does depend on friction. We quantify in detail this dependence and the deviation it causes from an exact collapse for these cells. (vi) We argue that our results support strongly the view that ensemble granular statistical mechanics does not satisfy the uniform measure assumption of conventional statistical mechanics. Results (i)-(iv) have been reported in the aforementioned reference, and they are reviewed and elaborated on here.
Classroom Demonstrations of Polymer Principles.
ERIC Educational Resources Information Center
Rodriguez, F.
1990-01-01
Classroom demonstrations of selected mechanical properties of polymers are described that can be used to make quantitative measurements. Stiffness, strength, and extensibility are mechanical properties used to distinguish one polymer from another. (KR)
Seeking parsimony in hydrology and water resources technology
NASA Astrophysics Data System (ADS)
Koutsoyiannis, D.
2009-04-01
The principle of parsimony, also known as the principle of simplicity, the principle of economy and Ockham's razor, advises scientists to prefer the simplest theory among those that fit the data equally well. In this, it is an epistemic principle, but it reflects an ontological characterization that the universe is ultimately parsimonious. Is this principle useful, and can it really be reconciled with, and implemented in, our modelling approaches to complex hydrological systems, whose elements and events are extraordinarily numerous, different and unique? The answer underlying the mainstream hydrological research of the last two decades seems to be negative. Hopes were invested in the power of computers that would enable faithful and detailed representation of the diverse system elements and the hydrological processes, based on mere "first principles" and resulting in "physically-based" models that tend to approach in complexity the real-world systems. Today the account of such research endeavour seems not positive, as it improved neither model predictive capacity nor the comprehension of processes. A return to parsimonious modelling seems, again, to be the promising route. The experience from recent research and from comparisons of parsimonious and complicated models indicates that the former can facilitate insight and comprehension, improve accuracy and predictive capacity, and increase efficiency. In addition, and despite the aspiration that "physically based" models would have lower data requirements and even, ultimately, become "data-free", parsimonious models require fewer data to achieve the same accuracy as more complicated models. Naturally, the concepts that reconcile the simplicity of parsimonious models with the complexity of hydrological systems are probability theory and statistics. Probability theory provides the theoretical basis for moving from a microscopic to a macroscopic view of phenomena, by mapping sets of diverse elements and events of hydrological systems to single numbers (a probability or an expected value), and statistics provides the empirical basis for summarizing data, making inference from them, and supporting decision making in water resource management. Unfortunately, the current state of the art in probability, statistics and their union, often called stochastics, is not fully satisfactory for the needs of modelling hydrological and water resource systems. A first problem is that stochastic modelling has traditionally relied on classical statistics, which is based on the independent "coin-tossing" prototype, rather than on the study of real-world systems whose behaviour is very different from the classical prototype. A second problem is that stochastic models (particularly multivariate ones) are often not parsimonious themselves. Therefore, substantial advancement of stochastics is necessary in a new paradigm of parsimonious hydrological modelling.
These ideas are illustrated using several examples, namely: (a) hydrological modelling of a karst system in Bosnia and Herzegovina using three different approaches ranging from parsimonious to detailed "physically-based"; (b) parsimonious modelling of a peculiar modified catchment in Greece; (c) a stochastic approach that can replace parameter-excessive ARMA-type models with a generalized algorithm that produces any shape of autocorrelation function (consistent with the accuracy provided by the data) using a couple of parameters; (d) a multivariate stochastic approach which replaces a huge number of parameters estimated from data with coefficients estimated by the principle of maximum entropy; and (e) a parsimonious approach for decision making in multi-reservoir systems using a handful of parameters instead of thousands of decision variables.
Bostick, David L.; Brooks, Charles L.
2009-01-01
To provide utility in understanding the molecular evolution of ion-selective biomembrane channels/transporters, globular proteins, and ionophoric compounds, as well as in guiding their modification and design, we present a statistical mechanical basis for deconstructing the impact of the coordination structure and chemistry of selective multidentate ionic complexes. The deconstruction augments familiar ideas in liquid structure theory to realize the ionic complex as an open ion-ligated system acting under the influence of an “external field” provided by the host (or surrounding medium). Using considerations derived from this basis, we show that selective complexation arises from exploitation of a particular ion's coordination preferences. These preferences derive from a balance of interactions much like that which dictates the Hofmeister effect. By analyzing the coordination-state space of small family IA and VIIA ions in simulated fluid media, we derive domains of coordinated states that confer selectivity for a given ion upon isolating and constraining particular attributes (order parameters) of a complex comprised of a given type of ligand. We demonstrate that such domains may be used to rationalize the ion-coordinated environments provided by selective ionophores and biological ion channels/transporters of known structure, and that they can serve as a means toward deriving rational design principles for ion-selective hosts. PMID:19486671
Nourhashemi, Mina; Kongolo, Guy; Mahmoudzadeh, Mahdi; Goudjil, Sabrina; Wallois, Fabrice
2017-04-01
The mechanisms responsible for coupling between relative cerebral blood flow (rCBF), relative cerebral blood volume (rCBV), and relative cerebral metabolic rate of oxygen (rCMRO2), an important function of the microcirculation in preterm infants, remain unclear. Identification of a causal relationship between rCBF-rCBV and rCMRO2 in preterms may, therefore, help to elucidate the principles of cortical hemodynamics during development. We simultaneously recorded rCBF and rCBV and estimated rCMRO2 by two independent acquisition systems: diffuse correlation spectroscopy and near-infrared spectroscopy, respectively, in 10 preterms aged between 28 and 35 weeks of gestational age. Transfer entropy was calculated in order to determine the directionality between rCBF-rCBV and rCMRO2. The surrogate method was applied to determine statistical significance. The results show that rCBV and rCMRO2 have a predominant driving influence on rCBF at the resting state in the preterm neonatal brain. Statistical analysis robustly detected the correct directionality of rCBV on rCBF and rCMRO2 on rCBF. This study helps to clarify the early organization of the rCBV-rCBF and rCMRO2 inter-relationship in the immature cortex.
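A compact sketch of the directionality measure used (binned transfer entropy with history length 1; the paper's estimator and surrogate testing are more involved), on synthetic series where x drives y:

```python
import numpy as np

def transfer_entropy(x, y, bins=4):
    """TE from x to y (history length 1), via joint histograms, in nats."""
    # Discretize both series into roughly equal-probability bins.
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    trips = np.stack([yd[1:], yd[:-1], xd[:-1]], axis=1)  # (y_next, y_now, x_now)

    def H(cols):
        _, counts = np.unique(trips[:, cols], axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    # TE = H(y_next, y_now) + H(y_now, x_now) - H(y_now) - H(y_next, y_now, x_now)
    return H([0, 1]) + H([1, 2]) - H([1]) - H([0, 1, 2])

rng = np.random.default_rng(5)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)   # y is driven by past x
print(f"TE x->y = {transfer_entropy(x, y):.3f}, TE y->x = {transfer_entropy(y, x):.3f}")
# TE x->y comes out clearly larger, recovering the true driving direction.
```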
Single-Atom Demonstration of the Quantum Landauer Principle
NASA Astrophysics Data System (ADS)
Yan, L. L.; Xiong, T. P.; Rehan, K.; Zhou, F.; Liang, D. F.; Chen, L.; Zhang, J. Q.; Yang, W. L.; Ma, Z. H.; Feng, M.
2018-05-01
One of the outstanding challenges to information processing is the eloquent suppression of energy consumption in the execution of logic operations. The Landauer principle sets an energy constraint in deletion of a classical bit of information. Although some attempts have been made to experimentally approach the fundamental limit restricted by this principle, exploring the Landauer principle in a purely quantum mechanical fashion is still an open question. Employing a trapped ultracold ion, we experimentally demonstrate a quantum version of the Landauer principle, i.e., an equality associated with the energy cost of information erasure in conjunction with the entropy change of the associated quantized environment. Our experimental investigation substantiates an intimate link between information thermodynamics and quantum candidate systems for information processing.
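For scale (a standard textbook evaluation, not the experiment's analysis): the classical Landauer bound of kT ln 2 per erased bit, at room temperature and at a cryogenic temperature:

```python
import math
from scipy.constants import k   # Boltzmann constant, J/K

for T in (300.0, 4.2):          # room temperature vs. liquid-helium temperature
    E = k * T * math.log(2)     # minimum energy cost of erasing one bit
    print(f"T = {T:6.1f} K -> Landauer bound = {E:.3e} J")
# At 300 K this is about 2.9e-21 J, far below what conventional logic dissipates.
```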
A crash course on data analysis in asteroseismology
NASA Astrophysics Data System (ADS)
Appourchaux, Thierry
2014-02-01
In this course, I try to provide a few of the basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, either in a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
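A small sketch of the Fourier step (generic signal processing, not the course's material): recovering an oscillation frequency from an evenly sampled time series with an FFT power spectrum; the cadence and frequency below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
dt, n = 60.0, 4096                       # 60 s cadence, 4096 samples
t = np.arange(n) * dt
nu0 = 3.0e-3                             # a 3 mHz oscillation, solar-like in scale
signal = np.sin(2 * np.pi * nu0 * t) + rng.normal(0.0, 2.0, n)

power = np.abs(np.fft.rfft(signal))**2
freqs = np.fft.rfftfreq(n, d=dt)
peak = freqs[np.argmax(power[1:]) + 1]   # skip the zero-frequency bin
print(f"injected {nu0 * 1e3:.3f} mHz, recovered {peak * 1e3:.3f} mHz")
```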
Cortical Surround Interactions and Perceptual Salience via Natural Scene Statistics
Coen-Cagli, Ruben; Dayan, Peter; Schwartz, Odelia
2012-01-01
Spatial context in images induces perceptual phenomena associated with salience and modulates the responses of neurons in primary visual cortex (V1). However, the computational and ecological principles underlying contextual effects are incompletely understood. We introduce a model of natural images that includes grouping and segmentation of neighboring features based on their joint statistics, and we interpret the firing rates of V1 neurons as performing optimal recognition in this model. We show that this leads to a substantial generalization of divisive normalization, a computation that is ubiquitous in many neural areas and systems. A main novelty in our model is that the influence of the context on a target stimulus is determined by their degree of statistical dependence. We optimized the parameters of the model on natural image patches, and then simulated neural and perceptual responses on stimuli used in classical experiments. The model reproduces some rich and complex response patterns observed in V1, such as the contrast dependence, orientation tuning and spatial asymmetry of surround suppression, while also allowing for surround facilitation under conditions of weak stimulation. It also mimics the perceptual salience produced by simple displays, and leads to readily testable predictions. Our results provide a principled account of orientation-based contextual modulation in early vision and its sensitivity to the homogeneity and spatial arrangement of inputs, and lends statistical support to the theory that V1 computes visual salience. PMID:22396635
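As a reference point for the computation being generalized (the canonical divisive normalization form, not the paper's statistics-dependent extension):

```python
import numpy as np

def divisive_normalization(drive, weights, sigma=1.0):
    """Canonical normalization: r_i = d_i^2 / (sigma^2 + sum_j w_ij * d_j^2)."""
    d2 = drive**2
    return d2 / (sigma**2 + weights @ d2)

drive = np.array([3.0, 1.0, 0.5, 0.2])   # linear filter responses
weights = np.full((4, 4), 0.25)          # fixed, uniform surround pool
print(divisive_normalization(drive, weights))
# In the model described above, the effective pooling instead depends on the
# statistical dependence between target and surround, rather than being fixed.
```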
Large Deviations for Stochastic Models of Two-Dimensional Second Grade Fluids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhai, Jianliang, E-mail: zhaijl@ustc.edu.cn; Zhang, Tusheng, E-mail: Tusheng.Zhang@manchester.ac.uk
2017-06-15
In this paper, we establish a large deviation principle for stochastic models of incompressible second grade fluids. The weak convergence method introduced by Budhiraja and Dupuis (Probab Math Statist 20:39–61, 2000) plays an important role.
Code of Federal Regulations, 2014 CFR
2014-01-01
... security for the credit or collateral). The creditor shall exercise reasonable diligence in obtaining such... creditor utilizing the system (including, but not limited to, minimizing bad debt losses and operating... statistical principles and methodology and adjusted as necessary to maintain predictive ability. (2) A...
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Ultrasound-enhanced bioscouring process factors for greige cotton fabric are examined using custom experimental design utilizing statistical principles. An equation is presented which predicts bioscouring performance based upon percent reflectance values obtained from UV-Vis measurements of rutheniu...
Areal Control Using Generalized Least Squares As An Alternative to Stratification
Raymond L. Czaplewski
2001-01-01
Stratification for both variance reduction and areal control proliferates the number of strata, which causes small sample sizes in many strata. This might compromise statistical efficiency. Generalized least squares can, in principle, replace stratification for areal control.
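A bare-bones sketch of the estimator in question (generic GLS, not the paper's areal-control formulation), with a heteroscedastic error covariance standing in for the survey structure:

```python
import numpy as np

def gls(X, y, V):
    """Generalized least squares: beta = (X' V^-1 X)^-1 X' V^-1 y."""
    Vi = np.linalg.inv(V)
    return np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)

rng = np.random.default_rng(7)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
V = np.diag(rng.uniform(0.5, 3.0, n))     # heteroscedastic error covariance
beta_true = np.array([2.0, -1.0])
y = X @ beta_true + rng.multivariate_normal(np.zeros(n), V)
print("GLS estimate:", gls(X, y, V))      # close to [2, -1]
```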
Spiers Memorial Lecture. Quantum chemistry: the first seventy years.
McWeeny, Roy
2007-01-01
Present-day theoretical chemistry is rooted in Quantum Mechanics. The aim of the opening lecture is to trace the evolution of Quantum Chemistry from the Heitler-London paper of 1927 up to the end of the last century, emphasizing concepts rather than calculations. The importance of symmetry concepts became evident in the early years: one thinks of the necessary anti-symmetry of the wave function under electron permutations, the Pauli principle, the aufbau scheme, and the classification of spectroscopic states. But for chemists perhaps the key concept is embodied in the Hellmann-Feynman theorem, which provides a pictorial interpretation of chemical bonding in terms of classical electrostatic forces exerted on the nuclei by the electron distribution. Much of the lecture is concerned with various electron distribution functions--the electron density, the current density, the spin density, and other 'property densities'--and with their use in interpreting both molecular structure and molecular properties. Other topics touched upon include Response theory and propagators; Chemical groups in molecules and the group function approach; Atoms in molecules and Bader's theory; Electron correlation and the 'pair function'. Finally, some long-standing controversies, in particular the EPR paradox, are re-examined in the context of molecular dissociation. By admitting the concept of symmetry breaking, along with the use of the von Neumann-Dirac statistical ensemble, orthodox quantum mechanics can lead to a convincing picture of the dissociation mechanism.
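The Hellmann-Feynman theorem singled out above has a compact standard statement (the textbook form, not a quotation from the lecture): for a parameter $\lambda$ such as a nuclear coordinate,

$$ \frac{\partial E}{\partial \lambda} = \left\langle \psi_{\lambda} \,\middle|\, \frac{\partial \hat{H}}{\partial \lambda} \,\middle|\, \psi_{\lambda} \right\rangle, $$

so the force on a nucleus reduces to the classical electrostatic force exerted on it by the electron distribution and the other nuclei.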
[Application of finite element method in spinal biomechanics].
Liu, Qiang; Zhang, Jun; Sun, Shu-Chun; Wang, Fei
2017-02-25
The finite element model is one of the most important methods in modern spinal biomechanics. It is used to simulate various states of the spine, to calculate the stress and strain distributions of the component structures in each state, and to explore the underlying mechanics, the mechanisms of injury, and treatment effectiveness. In the study of pathological states of the spine, the finite element method is mainly used to understand the mechanism at the lesion site, to evaluate the effects of different therapeutic tools, and to assist in the selection and improvement of those tools, in order to provide a theoretical basis for the rehabilitation of spinal lesions. The finite element method can also serve patients in spinal correction, surgery, and individualized implant design; the design and performance evaluation of implants must account for individual differences and requires a sound evaluation system. At present, how to establish a model that is closer to the real situation remains the focus and difficulty of human-body finite element studies. Although the finite element method can simulate complex working conditions well, the fidelity of models and the sharing of models across research groups still need to be improved by combining methods from imaging science, statistics, kinematics, and other fields. Copyright© 2017 by the China Journal of Orthopaedics and Traumatology Press.
Computational fluid mechanics utilizing the variational principle of modeling damping seals
NASA Technical Reports Server (NTRS)
Abernathy, J. M.; Farmer, R.
1985-01-01
An analysis for modeling damping seals for use in Space Shuttle main engine turbomachinery is being produced. Development of a computational fluid mechanics code for turbulent, incompressible flow is required.
NASA Astrophysics Data System (ADS)
Aiyoshi, Eitaro; Masuda, Kazuaki
On the basis of market fundamentalism, new types of social systems built on market mechanisms, such as electricity trading markets and carbon dioxide (CO2) emission trading markets, have been developed. However, few textbooks in science and technology explain that Lagrange multipliers can be interpreted as market prices. This tutorial paper explains how (1) the steepest descent method for dual problems in optimization and (2) the Gauss-Seidel method for solving the stationarity conditions of Lagrangian problems can formulate the mechanism of market pricing, which works even in the information-oriented modern society. The authors expect readers to acquire basic knowledge of optimization theory and algorithms related to economics and to utilize them for designing the mechanisms of more complicated markets.
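The multiplier-as-price idea can be made concrete with a toy dual-ascent loop; the problem below (two agents with logarithmic utilities sharing a fixed supply) is an illustrative example of the general technique, not the paper's own formulation:

```python
# Toy sketch of "Lagrange multiplier = market price" via dual ascent.
# Two agents with utilities a_i * log(x_i) share a fixed supply S.
# Given a price lam, each agent demands x_i = a_i / lam; the price
# then moves with excess demand, which is exactly a steepest-ascent
# step on the dual function. All values are made up.

a = [2.0, 3.0]      # agents' utility weights (assumed)
S = 10.0            # total supply (assumed)
lam = 1.0           # initial price guess
step = 0.05

for _ in range(200):
    demand = [ai / lam for ai in a]    # each agent's best response to the price
    excess = sum(demand) - S           # excess demand at this price
    lam += step * excess               # price rises if demand exceeds supply

print(f"market-clearing price ~ {lam:.3f}")   # analytic optimum: (2+3)/10 = 0.5
print("allocations:", [round(ai / lam, 3) for ai in a])
```

At convergence the multiplier clears the market, which is the interpretation the tutorial develops: the dual update is the price-adjustment mechanism.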
Woods and Russell, Hill, and the emergence of medical statistics
Farewell, Vern; Johnson, Tony
2010-01-01
In 1937, Austin Bradford Hill wrote Principles of Medical Statistics (Lancet: London, 1937) that became renowned throughout the world and is widely associated with the birth of modern medical statistics. Some 6 years earlier Hilda Mary Woods and William Thomas Russell, colleagues of Hill at the London School of Hygiene and Tropical Medicine, wrote a similar book An Introduction to Medical Statistics (PS King and Son: London, 1931) that is little known today. We trace the origins of these two books from the foundations of early demography and vital statistics, and make a detailed examination of some of their chapters. It is clear that these texts mark a watershed in the history of medical statistics that demarcates the vital statistics of the nineteenth and early twentieth centuries from the modern discipline. Moreover, we consider that the book by Woods and Russell is of some importance in the development of medical statistics and we describe and acknowledge their place in the history of this discipline. Copyright © 2010 John Wiley & Sons, Ltd. PMID:20535761
Fienup, Daniel M; Critchfield, Thomas S
2010-01-01
Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904
Standardized Curriculum for Diesel Engine Mechanics.
ERIC Educational Resources Information Center
Mississippi State Dept. of Education, Jackson. Office of Vocational, Technical and Adult Education.
Standardized curricula are provided for two courses for the secondary vocational education program in Mississippi: diesel engine mechanics I and II. The eight units in diesel engine mechanics I are as follows: orientation; shop safety; basic shop tools; fasteners; measurement; engine operating principles; engine components; and basic auxiliary…
The Principle of General Tovariance
NASA Astrophysics Data System (ADS)
Heunen, C.; Landsman, N. P.; Spitters, B.
2008-06-01
We tentatively propose two guiding principles for the construction of theories of physics, which should be satisfied by a possible future theory of quantum gravity. These principles are inspired by those that led Einstein to his theory of general relativity, viz. his principle of general covariance and his equivalence principle, as well as by the two mysterious dogmas of Bohr's interpretation of quantum mechanics, i.e. his doctrine of classical concepts and his principle of complementarity. An appropriate mathematical language for combining these ideas is topos theory, a framework earlier proposed for physics by Isham and collaborators. Our principle of general tovariance states that any mathematical structure appearing in the laws of physics must be definable in an arbitrary topos (with natural numbers object) and must be preserved under so-called geometric morphisms. This principle identifies geometric logic as the mathematical language of physics and restricts the constructions and theorems to those valid in intuitionism: neither Aristotle's principle of the excluded third nor Zermelo's Axiom of Choice may be invoked. Subsequently, our equivalence principle states that any algebra of observables (initially defined in the topos Sets) is empirically equivalent to a commutative one in some other topos.
VOCATIONAL TALENT EXERCISES, PART B.
ERIC Educational Resources Information Center
George Washington Univ., Washington, DC. School of Education.
This workbook was developed in a curriculum project described in VT 004 454, to help young people learn basic principles and concepts of mechanics and technology by providing exercises similar to those in aptitude tests, explanations of the underlying principles, and the correct answers. It is the second of four books which present 30 exercises…
Perceptual Organization of Visual Structure Requires a Flexible Learning Mechanism
ERIC Educational Resources Information Center
Aslin, Richard N.
2011-01-01
Bhatt and Quinn (2011) provide a compelling and comprehensive review of empirical evidence that supports the operation of principles of perceptual organization in young infants. They also have provided a comprehensive list of experiences that could serve to trigger the learning of at least some of these principles of perceptual organization, and…
Power Product Equipment Technician: Equipment Systems. Teacher Edition. Student Edition.
ERIC Educational Resources Information Center
Hilley, Robert
This packet contains teacher and student editions on the topic of equipment systems, intended for the preparation of power product equipment technicians. This publication contains seven units: (1) principles of power transmission; (2) mechanical drive systems; (3) principles of fluid power; (4) hydraulic and pneumatic drive systems; (5) wheel and…
Mathematical foundations of biomechanics.
Niederer, Peter F
2010-01-01
The aim of biomechanics is the analysis of the structure and function of humans, animals, and plants by means of the methods of mechanics. Its foundations are in particular embedded in mathematics, physics, and informatics. Due to the inherent multidisciplinary character deriving from its aim, biomechanics has numerous connections and overlapping areas with biology, biochemistry, physiology, and pathophysiology, along with clinical medicine, so its range is enormously wide. This treatise is mainly meant to serve as an introduction and overview for readers and students who intend to acquire a basic understanding of the mathematical principles and mechanics that constitute the foundation of biomechanics; accordingly, its contents are limited to basic theoretical principles of general validity and long-range significance. Selected examples are included that are representative for the problems treated in biomechanics. Although ultimate mathematical generality is not in the foreground, an attempt is made to derive the theory from basic principles. A concise and systematic formulation is thereby intended with the aim that the reader is provided with a working knowledge. It is assumed that he or she is familiar with the principles of calculus, vector analysis, and linear algebra.
Dodd, Jonathan D; MacEneaney, Peter M; Malone, Dermot E
2004-05-01
The aim of this study was to show how evidence-based medicine (EBM) techniques can be applied to the appraisal of diagnostic radiology publications. A clinical scenario is described: a gastroenterologist has questioned the diagnostic performance of magnetic resonance cholangiopancreatography (MRCP) in a patient who may have common bile duct (CBD) stones. His opinion was based on an article on MRCP published in "Gut." The principles of EBM are described and then applied to the critical appraisal of this paper. Another paper on the same subject was obtained from the radiology literature and was also critically appraised using explicit EBM criteria. The principles for assessing the validity and strength of both studies are outlined. All statistical parameters were generated quickly using a spreadsheet in Excel format. The results of EBM assessment of both papers are presented. The calculation and application of confidence intervals (CIs) and likelihood ratios (LRs) for both studies are described. These statistical results are applied to individual patient scenarios using graphs of conditional probability (GCP). Basic EBM principles are described and additional points relevant to radiologists discussed. Online resources for EBR practice are identified. The principles of EBM and their application to radiology are discussed. It is emphasized that sensitivity and specificity are point estimates of the "true" characteristics of a test in clinical practice. A spreadsheet can be used to quickly calculate CIs, LRs and GCPs. These give the radiologist a better understanding of the meaning of diagnostic test results in any patient or population of patients.
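The likelihood-ratio arithmetic described above is simple enough to reproduce directly; the sketch below uses the generic EBM formulas with a made-up 2x2 table, not the counts from either MRCP study:

```python
# Generic EBM test-performance arithmetic (standard formulas).
# The 2x2 counts below are hypothetical, for illustration only.

tp, fn, fp, tn = 45, 5, 8, 92

sens = tp / (tp + fn)
spec = tn / (tn + fp)
lr_pos = sens / (1 - spec)             # likelihood ratio of a positive result
lr_neg = (1 - sens) / spec             # likelihood ratio of a negative result

def post_test_prob(pretest, lr):
    """Convert pre-test probability to post-test probability via odds * LR."""
    odds = pretest / (1 - pretest) * lr
    return odds / (1 + odds)

print(f"sens={sens:.2f} spec={spec:.2f} LR+={lr_pos:.1f} LR-={lr_neg:.2f}")
print(f"post-test prob at 30% pre-test: {post_test_prob(0.30, lr_pos):.2f}")
```

Applying the likelihood ratio to an individual patient's pre-test probability, as in the last line, is the step the graphs of conditional probability make visual.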
Individuation in Quantum Mechanics and Space-Time
NASA Astrophysics Data System (ADS)
Jaeger, Gregg
2010-10-01
Two physical approaches—as distinct, under the classification of Mittelstaedt, from formal approaches—to the problem of individuation of quantum objects are considered, one formulated in spatiotemporal terms and one in quantum mechanical terms. The spatiotemporal approach itself has two forms: one attributed to Einstein and based on the ontology of space-time points, and the other proposed by Howard and based on intersections of world lines. The quantum mechanical approach is also provided here in two forms, one based on interference and another based on a new Quantum Principle of Individuation (QPI). It is argued that the space-time approach to individuation fails and that the quantum approach offers several advantages over it, including consistency with Leibniz’s Principle of Identity of Indiscernibles.
The Operating Principle of a Fully Solid State Active Magnetic Regenerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdelaziz, Omar
As an alternative refrigeration technology, magnetocaloric refrigeration has the potential to be safer, quieter, more efficient, and more environmentally friendly than conventional vapor compression refrigeration technology. Most of the reported active magnetic regenerator (AMR) systems that operate based on the magnetocaloric effect use a heat transfer fluid to exchange heat, which results in complicated mechanical subsystems and components such as rotating valves and hydraulic pumps. This paper presents an operating principle for a fully solid state AMR, in which an alternative mechanism for heat transfer between the AMR and the heat source/sink is proposed. The operating principle of the fully solid state AMR is based on moving rods/sheets (e.g. copper, brass, iron or aluminum), which are employed to replace the heat transfer fluid. Such a fully solid state AMR would provide a significantly higher heat transfer rate than a conventional AMR because the conductivity of the moving solid rods/plates is high, which enables an increase in the machine operating frequency and hence the cooling capacity. The details of the operating principle are presented and discussed here. One of the key enabling features for this technology is the contact between the moving rods/sheets and the magnetocaloric material, and the heat exchange mechanism at the heat source/sink. This paper provides an overview of the design of a fully solid state magnetocaloric refrigeration system along with guidelines for its optimal design.
Possible dynamical explanations for Paltridge's principle of maximum entropy production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Virgo, Nathaniel, E-mail: nathanielvirgo@gmail.com; Ikegami, Takashi, E-mail: nathanielvirgo@gmail.com
2014-12-05
Throughout the history of non-equilibrium thermodynamics a number of theories have been proposed in which complex, far from equilibrium flow systems are hypothesised to reach a steady state that maximises some quantity. Perhaps the most celebrated is Paltridge's principle of maximum entropy production for the horizontal heat flux in Earth's atmosphere, for which there is some empirical support. There have been a number of attempts to derive such a principle from maximum entropy considerations. However, we currently lack a more mechanistic explanation of how any particular system might self-organise into a state that maximises some quantity. This is in contrast to equilibrium thermodynamics, in which models such as the Ising model have been a great help in understanding the relationship between the predictions of MaxEnt and the dynamics of physical systems. In this paper we show that, unlike in the equilibrium case, Paltridge-type maximisation in non-equilibrium systems cannot be achieved by a simple dynamical feedback mechanism. Nevertheless, we propose several possible mechanisms by which maximisation could occur. Showing that these occur in any real system is a task for future work. The possibilities presented here may not be the only ones. We hope that by presenting them we can provoke further discussion about the possible dynamical mechanisms behind extremum principles for non-equilibrium systems, and their relationship to predictions obtained through MaxEnt.