Sample records for basic statistical mechanics

  1. ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prigogine, I.; Balescu, R.; Henin, F.

    1960-12-01

    Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)

  2. Quantum Social Science

    NASA Astrophysics Data System (ADS)

    Haven, Emmanuel; Khrennikov, Andrei

    2013-01-01

    Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.

  3. A statistical mechanics approach to autopoietic immune networks

    NASA Astrophysics Data System (ADS)

    Barra, Adriano; Agliari, Elena

    2010-07-01

    In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.

  4. Mechanics: Statics; A Syllabus.

    ERIC Educational Resources Information Center

    Compo, Louis

    The instructor's guide presents material for structuring an engineering fundamentals course covering the basic laws of statics as part of a mechanical technology program. Detailed behavioral objectives are described for the following five areas of course content: principles of mechanics, two-dimensional equilibrium, equilibrium of internal…

  5. Phase Transitions in Combinatorial Optimization Problems: Basics, Algorithms and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2005-10-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.

  6. Thermodynamics and statistical mechanics. [thermodynamic properties of gases

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.
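The chain summarized above, from partition function to thermodynamic properties, can be sketched numerically. The two-level system and all parameter values below are illustrative assumptions, not taken from the report.

```python
import math

# Hypothetical two-level system with energies {0, eps}; units with k_B = 1.
def partition_function(beta, eps):
    """Z = sum_i exp(-beta * E_i)."""
    return 1.0 + math.exp(-beta * eps)

def internal_energy(beta, eps):
    """U = -d(ln Z)/d(beta), written out analytically for this system."""
    return eps * math.exp(-beta * eps) / (1.0 + math.exp(-beta * eps))

# Verify the identity U = -d(ln Z)/d(beta) by central finite difference.
beta, eps, h = 2.0, 1.5, 1e-6
numeric = -(math.log(partition_function(beta + h, eps))
            - math.log(partition_function(beta - h, eps))) / (2 * h)
print(abs(numeric - internal_energy(beta, eps)) < 1e-8)  # True
```

The same pattern, differentiating ln Z, yields all the thermodynamic functions the review derives.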

  7. Generalized Models for Rock Joint Surface Shapes

    PubMed Central

    Du, Shigui; Hu, Yunjin; Hu, Xiaofei

    2014-01-01

    Generalized models of joint surface shapes are the foundation for mechanism studies on the mechanical effects of rock joint surface shapes. Based on extensive field investigations of rock joint surface shapes, generalized models for three level shapes named macroscopic outline, surface undulating shape, and microcosmic roughness were established through statistical analyses of 20,078 rock joint surface profiles. The relative amplitude of profile curves was used as a borderline for the division of different level shapes. The study results show that the macroscopic outline has three basic features: planar, arc-shaped, and stepped; the surface undulating shape has three basic features: planar, undulating, and stepped; and the microcosmic roughness has two basic features: smooth and rough. PMID:25152901

  8. Statistical-mechanics theory of active mode locking with noise.

    PubMed

    Gordon, Ariel; Fischer, Baruch

    2004-05-01

    Actively mode-locked lasers with noise are studied employing statistical mechanics. A mapping of the system to the spherical model (related to the Ising model) of ferromagnets in one dimension that has an exact solution is established. It gives basic features, such as analytical expressions for the correlation function between modes, and the widths and shapes of the pulses [different from the Kuizenga-Siegman expression; IEEE J. Quantum Electron. QE-6, 803 (1970)] and reveals the susceptibility to noise of mode ordering compared with passive mode locking.

  9. Retrocausal Effects As A Consequence of Orthodox Quantum Mechanics Refined To Accommodate The Principle Of Sufficient Reason

    NASA Astrophysics Data System (ADS)

    Stapp, Henry P.

    2011-11-01

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  10. Statistical-mechanical predictions and Navier-Stokes dynamics of two-dimensional flows on a bounded domain.

    PubMed

    Brands, H; Maassen, S R; Clercx, H J

    1999-09-01

    In this paper the applicability of a statistical-mechanical theory to freely decaying two-dimensional (2D) turbulence on a bounded domain is investigated. We consider an ensemble of direct numerical simulations in a square box with stress-free boundaries, with a Reynolds number that is of the same order as in experiments on 2D decaying Navier-Stokes turbulence. The results of these simulations are compared with the corresponding statistical equilibria, calculated from different stages of the evolution. It is shown that the statistical equilibria calculated from early times of the Navier-Stokes evolution do not correspond to the dynamical quasistationary states. At best, the global topological structure is correctly predicted from a relatively late time in the Navier-Stokes evolution, when the quasistationary state has almost been reached. This failure of the (basically inviscid) statistical-mechanical theory is related to viscous dissipation and net leakage of vorticity in the Navier-Stokes dynamics at moderate values of the Reynolds number.

  11. The maximum entropy production principle: two basic questions.

    PubMed

    Martyushev, Leonid M

    2010-05-12

    The overwhelming majority of maximum entropy production applications to ecological and environmental systems are based on thermodynamics and statistical physics. Here, we briefly discuss the maximum entropy production principle and raise two questions: (i) can this principle be used as the basis for non-equilibrium thermodynamics and statistical mechanics, and (ii) is it possible to 'prove' the principle? We adduce one more proof, which is the most concise available today.

  12. A κ-generalized statistical mechanics approach to income analysis

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
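The κ-generalized framework the paper builds on can be sketched as follows. The functional forms follow the Kaniadakis κ-statistics literature; the parameter names and values are illustrative assumptions, not taken from the paper.

```python
import math

# exp_kappa and an income survival function in the kappa-generalized style;
# alpha, beta_, kappa values are illustrative, not fitted.
def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

def survival(x, alpha, beta_, kappa):
    """P(X > x) = exp_kappa(-(x / beta_) ** alpha): exponential-like at
    low incomes, Pareto power-law in the upper tail."""
    return exp_kappa(-((x / beta_) ** alpha), kappa)

# Small kappa recovers the ordinary exponential.
print(abs(exp_kappa(1.0, 1e-6) - math.e) < 1e-4)  # True
```

For large x the κ-exponential decays as a power law, so the survival function crosses over from an exponential-like body to a Pareto tail, which is what makes the distribution suitable for the whole income spectrum.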

  13. Reinventing Biostatistics Education for Basic Scientists

    PubMed Central

    Weissgerber, Tracey L.; Garovic, Vesna D.; Milin-Lazovic, Jelena S.; Winham, Stacey J.; Obradovic, Zoran; Trzeciakowski, Jerome P.; Milic, Natasa M.

    2016-01-01

    Numerous studies demonstrating that statistical errors are common in basic science publications have led to calls to improve statistical training for basic scientists. In this article, we sought to evaluate statistical requirements for PhD training and to identify opportunities for improving biostatistics education in the basic sciences. We provide recommendations for improving statistics training for basic biomedical scientists, including: 1. Encouraging departments to require statistics training, 2. Tailoring coursework to the students’ fields of research, and 3. Developing tools and strategies to promote education and dissemination of statistical knowledge. We also provide a list of statistical considerations that should be addressed in statistics education for basic scientists. PMID:27058055

  14. Mechanics, Waves and Thermodynamics

    NASA Astrophysics Data System (ADS)

    Ranjan Jain, Sudhir

    2016-05-01

    Figures; Preface; Acknowledgement; 1. Energy, mass, momentum; 2. Kinematics, Newton's laws of motion; 3. Circular motion; 4. The principle of least action; 5. Work and energy; 6. Mechanics of a system of particles; 7. Friction; 8. Impulse and collisions; 9. Central forces; 10. Dimensional analysis; 11. Oscillations; 12. Waves; 13. Sound of music; 14. Fluid mechanics; 15. Water waves; 16. The kinetic theory of gases; 17. Concepts and laws of thermodynamics; 18. Some applications of thermodynamics; 19. Basic ideas of statistical mechanics; Bibliography; Index.

  15. Retrocausal Effects as a Consequence of Quantum Mechanics Refined to Accommodate the Principle of Sufficient Reason

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stapp, Henry P.

    2011-05-10

    The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.

  16. Statistical Mechanics of Disordered Systems - Series: Cambridge Series in Statistical and Probabilistic Mathematics (No. 18)

    NASA Astrophysics Data System (ADS)

    Bovier, Anton

    2006-06-01

    Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. A comprehensive introduction to an active and fascinating area of research; a clear exposition that builds to the state of the art in the mathematics of spin glasses; written by a well-known and active researcher in the field.

  17. Theory of Financial Risk and Derivative Pricing

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2009-01-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  18. Theory of Financial Risk and Derivative Pricing - 2nd Edition

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe; Potters, Marc

    2003-12-01

    Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.

  19. Granular statistical mechanics - a personal perspective

    NASA Astrophysics Data System (ADS)

    Blumenfeld, R.; Edwards, S. F.

    2014-10-01

    The science of granular matter has expanded from an activity for specialised engineering applications to a fundamental field in its own right. This has been accompanied by an explosion of research and literature, which cannot be reviewed in one paper. A key to progress in this field is the formulation of a statistical mechanical formalism that could help develop equations of state and constitutive relations. This paper aims at reviewing some milestones in this direction. An essential basic step toward the development of any static and quasi-static theory of granular matter is a systematic and useful method to quantify the grain-scale structure and we start with a review of such a method. We then review and discuss the ongoing attempt to construct a statistical mechanical theory of granular systems. Along the way, we will clarify a number of misconceptions in the field, as well as highlight several outstanding problems.

  20. The actual content of quantum theoretical kinematics and mechanics

    NASA Technical Reports Server (NTRS)

    Heisenberg, W.

    1983-01-01

    First, exact definitions are supplied for the terms: position, velocity, energy, etc. (of the electron, for instance), such that they are valid also in quantum mechanics. Canonically conjugated variables are determined simultaneously only with a characteristic uncertainty. This uncertainty is the intrinsic reason for the occurrence of statistical relations in quantum mechanics. Mathematical formulation is made possible by the Dirac-Jordan theory. Beginning from the basic principles thus obtained, macroscopic processes are understood from the viewpoint of quantum mechanics. Several imaginary experiments are discussed to elucidate the theory.

  1. Senior Computational Scientist | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes of infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing bio-statistical design, analysis and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have: five or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training; considerable experience with statistical software, such as SAS, R and S-Plus; sound knowledge and demonstrated experience of theoretical and applied statistics; the ability to write program code to analyze data using statistical analysis software; and the ability to contribute to the interpretation and publication of research results.

  2. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (complex of physical conditions, that is to say: specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are already present in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.

  3. Statistical foundations of liquid-crystal theory

    PubMed Central

    Seguin, Brian; Fried, Eliot

    2013-01-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals. PMID:23772091

  4. Measuring Boltzmann's Constant with Carbon Dioxide

    ERIC Educational Resources Information Center

    Ivanov, Dragia; Nikolov, Stefan

    2013-01-01

    In this paper we present two experiments to measure Boltzmann's constant--one of the fundamental constants of modern-day physics, which lies at the base of statistical mechanics and thermodynamics. The experiments use very basic theory, simple equipment and cheap and safe materials yet provide very precise results. They are very easy and…
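One common classroom route to k, not necessarily the method of the paper, is the ideal gas law PV = NkT; the numbers below are illustrative round values, not the paper's CO2 data.

```python
# Illustrative round numbers: estimate k from the ideal gas law PV = N*k*T.
P = 101_325.0   # pressure, Pa (1 atm)
V = 0.0224      # volume occupied by ~1 mol at STP, m^3
T = 273.15      # temperature, K
N = 6.022e23    # number of molecules (~1 mol)

k_estimate = P * V / (N * T)
print(round(k_estimate / 1.380649e-23, 2))  # 1.0: close to the accepted k_B
```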

  5. Have Basic Mathematical Skills Grown Obsolete in the Computer Age: Assessing Basic Mathematical Skills and Forecasting Performance in a Business Statistics Course

    ERIC Educational Resources Information Center

    Noser, Thomas C.; Tanner, John R.; Shah, Situl

    2008-01-01

    The purpose of this study was to measure the comprehension of basic mathematical skills of students enrolled in statistics classes at a large regional university, and to determine if the scores earned on a basic math skills test are useful in forecasting student performance in these statistics classes, and to determine if students' basic math…

  6. Examination of the relationship between preservice science teachers' scientific reasoning and problem solving skills on basic mechanics

    NASA Astrophysics Data System (ADS)

    Yuksel, Ibrahim; Ates, Salih

    2018-02-01

    The purpose of this study is to determine the relationship between scientific reasoning and mechanics problem solving skills of students in a science education program. The Scientific Reasoning Skills Test (SRST) and the Basic Mechanics Knowledge Test (BMKT) were applied to 90 second-, third- and fourth-year students who took the Scientific Reasoning Skills course in the science teaching program of Gazi Faculty of Education during three successive fall semesters of the 2014, 2015 and 2016 academic years. A statistically significant (p = 0.038 < 0.05) but low positive correlation (r = 0.219) was found between the SRST and the BMKT. There were no significant relationships between the Conservation Laws, Proportional Thinking, Combinational Thinking, Correlational Thinking, and Probabilistic Thinking subskills of reasoning and the BMKT. There were significant positive correlations between the Hypothetical Thinking and Identifying and Controlling Variables subskills of reasoning and the BMKT. The findings of the study were compared with other studies in the field and discussed.
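For readers unfamiliar with the statistics quoted above, a minimal sketch of the Pearson correlation and the shared variance implied by r = 0.219 (toy data, not the study's):

```python
import math

# Pearson correlation coefficient on toy data; r**2 gives shared variance.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# r = 0.219, as reported, implies only ~4.8% shared variance:
print(round(0.219 ** 2, 3))  # 0.048
```

A correlation of this size is statistically detectable yet explains little of the variation in mechanics scores, which is why the study calls it low.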

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoilova, N. I.

    Generalized quantum statistics, such as paraboson and parafermion statistics, are characterized by triple relations which are related to Lie (super)algebras of type B. The correspondence of the Fock spaces of parabosons and parafermions, as well as the Fock space of a system of parafermions and parabosons, to irreducible representations of (super)algebras of type B will be pointed out. An example of generalized quantum statistics connected to the basic classical Lie superalgebra B(1|1) ≡ osp(3|2), with interesting physical properties such as noncommutative coordinates, will be given. The article therefore focuses on the question, addressed already in 1950 by Wigner: do the equations of motion determine the quantum mechanical commutation relations?

  8. Commentary: Decaying Numerical Skills. "I Can't Divide by 60 in My Head!"

    ERIC Educational Resources Information Center

    Parslow, Graham R.

    2010-01-01

    As an undergraduate in the 1960s, the author mostly used a slide rule for calculations and a Marchant-brand motor-operated mechanical calculator for statistics. This was after an elementary education replete with learning multiplication tables and taking speed and accuracy tests in arithmetic. Times have changed and assuming even basic calculation…

  9. Statistical learning and language acquisition

    PubMed Central

    Romberg, Alexa R.; Saffran, Jenny R.

    2011-01-01

    Human learners, including infants, are highly sensitive to structure in their environment. Statistical learning refers to the process of extracting this structure. A major question in language acquisition in the past few decades has been the extent to which infants use statistical learning mechanisms to acquire their native language. There have been many demonstrations showing infants’ ability to extract structures in linguistic input, such as the transitional probability between adjacent elements. This paper reviews current research on how statistical learning contributes to language acquisition. Current research is extending the initial findings of infants’ sensitivity to basic statistical information in many different directions, including investigating how infants represent regularities, learn about different levels of language, and integrate information across situations. These current directions emphasize studying statistical language learning in context: within language, within the infant learner, and within the environment as a whole. PMID:21666883
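The transitional-probability statistic mentioned above can be computed in a few lines. The syllable stream below is a hypothetical toy example in the style of artificial-language experiments, not data from the review.

```python
from collections import Counter

# Hypothetical stream built from two three-syllable "words".
def transitional_probabilities(stream):
    """TP(y | x) = count(x followed by y) / count(x as a first element)."""
    pair_counts = Counter(zip(stream, stream[1:]))
    first_counts = Counter(stream[:-1])
    return {(x, y): c / first_counts[x] for (x, y), c in pair_counts.items()}

w1, w2 = ["bi", "da", "ku"], ["pa", "go", "la"]
stream = w1 + w2 + w1 + w1 + w2 + w1

tps = transitional_probabilities(stream)
# Within-word TPs are high; TPs across word boundaries are lower.
print(tps[("bi", "da")], round(tps[("ku", "pa")], 3))  # 1.0 0.667
```

The dip in transitional probability at word boundaries is the cue infants are hypothesized to exploit when segmenting continuous speech.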

  10. Statistical foundations of liquid-crystal theory: I. Discrete systems of rod-like molecules.

    PubMed

    Seguin, Brian; Fried, Eliot

    2012-12-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals.

  11. Mass action at the single-molecule level.

    PubMed

    Shon, Min Ju; Cohen, Adam E

    2012-09-05

    We developed a system to reversibly encapsulate small numbers of molecules in an array of nanofabricated "dimples". This system enables highly parallel, long-term, and attachment-free studies of molecular dynamics via single-molecule fluorescence. In studies of bimolecular reactions of small numbers of confined molecules, we see phenomena that, while expected from basic statistical mechanics, are not observed in bulk chemistry. Statistical fluctuations in the occupancy of sealed reaction chambers lead to steady-state fluctuations in reaction equilibria and rates. These phenomena are likely to be important whenever reactions happen in confined geometries.
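The occupancy fluctuations the authors describe follow from Poisson statistics of sealed chambers; a minimal simulation sketch (illustrative parameters, not the paper's data):

```python
import math
import random

# Sealing chambers freezes each chamber's occupancy at a Poisson-distributed
# count, so a small mean occupancy lam implies large relative fluctuations
# (sigma / mean = 1 / sqrt(lam)).
random.seed(0)

def poisson_sample(lam):
    """Knuth's algorithm; adequate for small lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

lam = 2.0
occupancies = [poisson_sample(lam) for _ in range(100_000)]
mean = sum(occupancies) / len(occupancies)
var = sum((n - mean) ** 2 for n in occupancies) / len(occupancies)
# For a Poisson distribution, mean and variance both equal lam.
print(abs(mean - lam) < 0.05, abs(var - lam) < 0.1)  # True True
```

With only a few molecules per chamber, these order-unity relative fluctuations shift local equilibria and rates, which is invisible in bulk experiments.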

  12. Finite Element Analysis of Reverberation Chambers

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Nguyen, Duc T.

    2000-01-01

    The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved there were four primary focus areas: 1. The eigenvalue problem for the source-free problem. 2. The development of a complex efficient eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.

  13. Using Bayes' theorem for free energy calculations

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory, and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner shell and hard sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
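The multinomial counting step can be illustrated with standard Dirichlet-multinomial conjugacy. The occupancy counts below are made-up numbers in the spirit of an inner-shell analysis, not data from the dissertation:

```python
import numpy as np

# Hypothetical frame counts with n = 0..4 solvent neighbors in the inner shell.
counts = np.array([3, 12, 40, 30, 5])
alpha = np.ones_like(counts)            # flat Dirichlet prior

# Posterior mean of each occupancy probability p_n (Dirichlet-multinomial conjugacy).
post_mean = (counts + alpha) / (counts.sum() + alpha.sum())

# A free energy component then follows as -log p_n (in units of kT).
free_energy = -np.log(post_mean)
```

Unlike the raw frequency estimate, the posterior mean stays finite and well-defined even for occupancies never observed in the sample.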

  14. Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation

    NASA Astrophysics Data System (ADS)

    Tchiguirinskaia, Ioulia; Scherzer, Daniel

    2016-04-01

    Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating the extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. Alternative approaches to nonlinear variability are based on a fundamental property of the nonlinear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but several more of their symmetries. Euler's equation has been not only the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become a basic equation of fluid mechanics. It is therefore no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations.
In a way, this approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate a cyclone with the help of a 3D ordinary differential system. Well supported by extensive numerical results, the cascade generalisation of Euler's gyroscope equation opens new horizons for the predictability and prediction of processes having long-range dependences.

  15. Random bursts determine dynamics of active filaments.

    PubMed

    Weber, Christoph A; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S; Bausch, Andreas R; Frey, Erwin

    2015-08-25

    Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system's dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model.
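The burst mechanism described above can be caricatured in a few lines. The model below is a minimal sketch, not the authors' gliding-assay model, and all parameters are illustrative: deterministic relaxation punctuated by rare random kicks yields a leptokurtic (non-Gaussian) stationary distribution, while frequent small kicks recover near-Gaussian statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(kick_rate, n_steps=200_000, dt=0.01, tau=1.0, kick_size=1.0):
    """Toy dynamics for a curvature-like variable x: exponential relaxation
    toward zero plus Poisson-distributed sudden jump-like kicks."""
    x = 0.0
    out = np.empty(n_steps)
    for i in range(n_steps):
        x += -x / tau * dt                  # relaxation toward the straight state
        if rng.random() < kick_rate * dt:   # random, localized energy injection
            x += kick_size * rng.standard_normal()
        out[i] = x
    return out

def excess_kurtosis(x):
    """Excess kurtosis: 0 for a Gaussian, positive for heavy-tailed statistics."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2)**2 - 3.0
```

For shot noise of this type the stationary excess kurtosis scales like 1/(rate * tau), so rare bursts give strongly non-Gaussian tails and frequent bursts approach the thermal-bath (Gaussian) limit.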

  16. Random bursts determine dynamics of active filaments

    PubMed Central

    Weber, Christoph A.; Suzuki, Ryo; Schaller, Volker; Aranson, Igor S.; Bausch, Andreas R.; Frey, Erwin

    2015-01-01

    Constituents of living or synthetic active matter have access to a local energy supply that serves to keep the system out of thermal equilibrium. The statistical properties of such fluctuating active systems differ from those of their equilibrium counterparts. Using the actin filament gliding assay as a model, we studied how nonthermal distributions emerge in active matter. We found that the basic mechanism involves the interplay between local and random injection of energy, acting as an analog of a thermal heat bath, and nonequilibrium energy dissipation processes associated with sudden jump-like changes in the system’s dynamic variables. We show here how such a mechanism leads to a nonthermal distribution of filament curvatures with a non-Gaussian shape. The experimental curvature statistics and filament relaxation dynamics are reproduced quantitatively by stochastic computer simulations and a simple kinetic model. PMID:26261319

  17. A Self-Contained Mapping Closure Approximation for Scalar Mixing

    DTIC Science & Technology

    2003-12-01

    hierarchy in statistical mechanics (Balescu 1975), where the correlations are specified a priori and then fixed. The MCA approach does not invoke...and thus the scalar fields. Unlike usual treatments in the BBGKY hierarchy (Balescu 1975), where the representations are specified a priori, the...discussions. This work was supported by the Special Funds for Major Basic Research Project G. 2000077305, P. R. China. REFERENCES BALESCU, R. 1975

  18. The einstein equivalence principle, intrinsic spin and the invariance of constitutive equations in continuum mechanics

    NASA Technical Reports Server (NTRS)

    Speziale, Charles G.

    1988-01-01

    The invariance of constitutive equations in continuum mechanics is examined from a basic theoretical standpoint. It is demonstrated that constitutive equations which are not form invariant under arbitrary translational accelerations of the reference frame are in violation of the Einstein equivalence principle. Furthermore, by making use of an analysis based on statistical mechanics, it is argued that any frame-dependent terms in constitutive equations must arise from the intrinsic spin tensor and are negligible provided that the ratio of microscopic to macroscopic time scales is extremely small. The consistency of these results with existing constitutive theories is discussed in detail along with possible avenues of future research.

  19. Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan

    NASA Astrophysics Data System (ADS)

    Ichiyanagi, Masakazu

    1995-11-01

    This paper reviews the research in nonequilibrium statistical mechanics made in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, the methods and results of which are quickly grasped by anyone, its rationale was pushed aside and even at the stage where the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed it of most of its interest for the average physicist, who would approach an understanding of some basic concept, not through abstract and logical analysis but by simply increasing his technical experiences with the concept. The purpose of this review is to rescue the linear response theory from being labeled a mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.

  20. Teaching Basic Probability in Undergraduate Statistics or Management Science Courses

    ERIC Educational Resources Information Center

    Naidu, Jaideep T.; Sanford, John F.

    2017-01-01

    Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…

  1. The United Nations Basic Space Science Initiative

    NASA Astrophysics Data System (ADS)

    Haubold, H. J.

    2006-08-01

    Pursuant to recommendations of the United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) and deliberations of the United Nations Committee on the Peaceful Uses of Outer Space (UNCOPUOS), annual UN/European Space Agency workshops on basic space science have been held around the world since 1991. These workshops contribute to the development of astrophysics and space science, particularly in developing nations. Following a process of prioritization, the workshops identified the following elements as particularly important for international cooperation in the field: (i) operation of astronomical telescope facilities implementing TRIPOD, (ii) virtual observatories, (iii) astrophysical data systems, (iv) concurrent design capabilities for the development of international space missions, and (v) theoretical astrophysics such as applications of nonextensive statistical mechanics. Beginning in 2005, the workshops focus on preparations for the International Heliophysical Year 2007 (IHY2007). The workshops continue to facilitate the establishment of astronomical telescope facilities as pursued by Japan and the development of low-cost, ground-based, world-wide instrument arrays as led by the IHY secretariat. Wamsteker, W., Albrecht, R. and Haubold, H.J.: Developing Basic Space Science World-Wide: A Decade of UN/ESA Workshops. Kluwer Academic Publishers, Dordrecht 2004. http://ihy2007.org http://www.unoosa.org/oosa/en/SAP/bss/ihy2007/index.html http://www.cbpf.br/GrupPesq/StatisticalPhys/biblio.htm

  2. Theory of the Decoherence Effect in Finite and Infinite Open Quantum Systems Using the Algebraic Approach

    NASA Astrophysics Data System (ADS)

    Blanchard, Philippe; Hellmich, Mario; Ługiewicz, Piotr; Olkiewicz, Robert

    Quantum mechanics is the greatest revision of our conception of the character of the physical world since Newton. Consequently, David Hilbert was very interested in quantum mechanics. He and John von Neumann discussed it frequently during von Neumann's residence in Göttingen. In 1932 von Neumann published his book Mathematical Foundations of Quantum Mechanics. In Hilbert's opinion it was the first exposition of quantum mechanics in a mathematically rigorous way. The pioneers of quantum mechanics, Heisenberg and Dirac, neither had use for rigorous mathematics nor much interest in it. Conceptually, quantum theory as developed by Bohr and Heisenberg is based on the positivism of Mach, as it describes only observable quantities. It first emerged as a result of experimental data in the form of statistical observations of quantum noise, the basic concept of quantum probability.

  3. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  4. Is quantum theory a form of statistical mechanics?

    NASA Astrophysics Data System (ADS)

    Adler, S. L.

    2007-05-01

    We give a review of the basic themes of my recent book: Adler S L 2004 Quantum Theory as an Emergent Phenomenon (Cambridge: Cambridge University Press). We first give motivations for considering the possibility that quantum mechanics is not exact, but is instead an accurate asymptotic approximation to a deeper level theory. For this deeper level, we propose a non-commutative generalization of classical mechanics, that we call "trace dynamics", and we give a brief survey of how it works, considering for simplicity only the bosonic case. We then discuss the statistical mechanics of trace dynamics and give our argument that with suitable approximations, the Ward identities for trace dynamics imply that ensemble averages in the canonical ensemble correspond to Wightman functions in quantum field theory. Thus, quantum theory emerges as the statistical thermodynamics of trace dynamics. Finally, we argue that Brownian motion corrections to this thermodynamics lead to stochastic corrections to the Schrödinger equation, of the type that have been much studied in the "continuous spontaneous localization" model of objective state vector reduction. In appendices to the talk, we give details of the existence of a conserved operator in trace dynamics that encodes the structure of the canonical algebra, of the derivation of the Ward identities, and of the proof that the stochastically-modified Schrödinger equation leads to state vector reduction with Born rule probabilities.

  5. What are hierarchical models and how do we analyze them?

    USGS Publications Warehouse

    Royle, Andy

    2016-01-01

    In this chapter we provide a basic definition of hierarchical models and introduce the two canonical hierarchical models in this book: site occupancy and N-mixture models. The former is a hierarchical extension of logistic regression and the latter is a hierarchical extension of Poisson regression. We introduce basic concepts of probability modeling and statistical inference including likelihood and Bayesian perspectives. We go through the mechanics of maximizing the likelihood and characterizing the posterior distribution by Markov chain Monte Carlo (MCMC) methods. We give a general perspective on topics such as model selection and assessment of model fit, although we demonstrate these topics in practice in later chapters (especially Chapters 5, 6, 7, and 10).
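The mechanics of maximizing a site-occupancy likelihood can be sketched as follows. The data are simulated with illustrative parameter values, and the crude grid search stands in for the numerical optimizers and MCMC samplers the chapter covers:

```python
import numpy as np

# Simulated detection histories for the basic occupancy model: latent occupancy
# z_i ~ Bernoulli(psi); detection on each of J visits with probability p where z_i = 1.
rng = np.random.default_rng(1)
psi_true, p_true, n_sites, n_visits = 0.6, 0.4, 200, 5
z = rng.random(n_sites) < psi_true
y = (rng.random((n_sites, n_visits)) < p_true) & z[:, None]

def negloglik(psi, p):
    """Marginal negative log-likelihood: the latent z is summed out, so a site with
    d detections contributes psi * p^d * (1-p)^(J-d), plus (1-psi) when d == 0."""
    d = y.sum(axis=1)
    lik = psi * p**d * (1 - p)**(n_visits - d) + (1 - psi) * (d == 0)
    return -np.log(lik).sum()

# Maximum-likelihood fit by grid search over (psi, p).
grid = np.linspace(0.01, 0.99, 99)
_, psi_hat, p_hat = min((negloglik(psi, p), psi, p) for psi in grid for p in grid)
```

Summing out z is what makes the model hierarchical: the observed likelihood mixes an occupancy layer and a detection layer, and naive logistic regression on raw detections would confound the two.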

  6. Determining significant material properties: A discovery approach

    NASA Technical Reports Server (NTRS)

    Karplus, Alan K.

    1992-01-01

    The following is a laboratory experiment designed to further understanding of materials science. The experiment itself can be informative for persons of any age past elementary school, and even for some in elementary school. The preparation of the plastic samples is readily accomplished by persons with reasonable dexterity in the cutting of paper designs. The completion of the statistical Design of Experiments, which uses Yates' Method, requires basic math (addition and subtraction). Interpretive work requires plotting of data and making observations. Knowledge of statistical methods would be helpful. The purpose of this experiment is to acquaint students with the seven classes of recyclable plastics, and provide hands-on learning about the response of these plastics to mechanical tensile loading.
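The Yates' Method computation mentioned above really is just repeated addition and subtraction of adjacent pairs. A minimal sketch, with illustrative response values for a 2^2 factorial design:

```python
def yates(responses):
    """Yates' method for a 2^k full factorial experiment.
    `responses` must be in standard (Yates) order: (1), a, b, ab, c, ac, ...
    Returns [grand total, contrast totals...]; effect estimates are the
    contrast totals divided by 2**(k-1)."""
    col = list(responses)
    k = len(col).bit_length() - 1
    assert len(col) == 2**k, "length must be a power of two"
    for _ in range(k):
        sums = [col[i] + col[i + 1] for i in range(0, len(col), 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, len(col), 2)]
        col = sums + diffs   # pairwise sums, then pairwise differences
    return col

# Illustrative 2^2 example with responses (1)=1, a=3, b=2, ab=6:
totals = yates([1, 3, 2, 6])   # [grand total, A, B, AB contrasts]
```

Here the grand total is 12 and the A, B, and AB contrast totals are 6, 4, and 2, matching the hand calculation ((a+ab)-((1)+b)=6, and so on).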

  7. A Mediation Model to Explain the Role of Mathematics Skills and Probabilistic Reasoning on Statistics Achievement

    ERIC Educational Resources Information Center

    Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca

    2016-01-01

    Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…

  8. A mechanism producing power law etc. distributions

    NASA Astrophysics Data System (ADS)

    Li, Heling; Shen, Hongjun; Yang, Bin

    2017-07-01

    Power-law distributions play an increasingly important role in the study of complex systems. Motivated by the insolvability of complex systems, the idea of incomplete statistics is utilized and expanded: three different exponential factors are introduced into the equations for the normalization condition, the statistical average, and the Shannon entropy, and probability distribution functions of exponential form, power-law form, and the product form of a power function and an exponential function are derived from the Shannon entropy and the maximum entropy principle. It is thus shown that the maximum entropy principle can fully replace the equal-probability hypothesis. Since power-law distributions and distributions of the product form, which cannot be derived via the equal-probability hypothesis, can be derived with the aid of the maximum entropy principle, it can also be concluded that the maximum entropy principle is a more basic principle, one that embodies concepts more extensively and reveals the fundamental laws governing the motion of objects. At the same time, this principle reveals the intrinsic link between Nature and the different objects of human society and the principles they all obey.
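The maximum entropy principle invoked above can be made concrete with the classic die example (an illustrative case, not taken from the paper): normalization plus a prescribed mean yields an exponential-form distribution through a single Lagrange multiplier.

```python
import numpy as np

def maxent_mean(x, mean_target, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution on support x with a prescribed mean.
    The Lagrange-multiplier solution is p_i proportional to exp(-lam * x_i);
    lam is found by bisection, since the mean is monotone decreasing in lam."""
    x = np.asarray(x, dtype=float)

    def mean_of(lam):
        w = np.exp(-lam * (x - x.mean()))   # centering avoids overflow
        return (w / w.sum()) @ x

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > mean_target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * (x - x.mean()))
    return w / w.sum()

# Jaynes' die: faces 1..6 constrained to mean 4.5 instead of the fair 3.5.
faces = np.arange(1.0, 7.0)
p = maxent_mean(faces, 4.5)
```

With the mean pushed above 3.5 the multiplier comes out negative, so the probabilities tilt geometrically toward the higher faces; a logarithmic constraint in place of the mean would produce a power-law form instead.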

  9. The United Nations Basic Space Science Initiative (UNBSSI): A Historical Introduction

    NASA Astrophysics Data System (ADS)

    Haubold, H. J.

    2006-11-01

    Pursuant to recommendations of the Third United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) and deliberations of the United Nations Committee on the Peaceful Uses of Outer Space (UNCOPUOS), annual UN/European Space Agency workshops on basic space science have been held around the world since 1991. These workshops contributed to the development of astrophysics and space science, particularly in developing nations. Following a process of prioritization, the workshops identified the following elements as particularly important for international cooperation in the field: (i) operation of astronomical telescope facilities implementing TRIPOD, (ii) virtual observatories, (iii) astrophysical data systems, (iv) concurrent design capabilities for the development of international space missions, and (v) theoretical astrophysics such as applications of non-extensive statistical mechanics. Beginning in 2005, the workshops are focusing on preparations for the International Heliophysical Year 2007 (IHY2007). The workshops continue to facilitate the establishment of astronomical telescope facilities as pursued by Japan and the development of low-cost, ground-based, world-wide instrument arrays as led by the IHY secretariat. Wamsteker, W., Albrecht, R. and Haubold, H.J.: Developing Basic Space Science World-Wide: A Decade of UN/ESA Workshops. Kluwer Academic Publishers, Dordrecht 2004. http://ihy2007.org http://www.unoosa.org/oosa/en/SAP/bss/ihy2007/index.html http://www.cbpf.br/GrupPesq/StatisticalPhys/biblio.htm

  10. The United Nations Basic Space Science Initiative

    NASA Astrophysics Data System (ADS)

    Haubold, H. J.

    Pursuant to recommendations of the United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) and deliberations of the United Nations Committee on the Peaceful Uses of Outer Space (UNCOPUOS), annual UN/European Space Agency workshops on basic space science have been held around the world since 1991. These workshops contribute to the development of astrophysics and space science, particularly in developing nations. Following a process of prioritization, the workshops identified the following elements as particularly important for international cooperation in the field: (i) operation of astronomical telescope facilities implementing TRIPOD, (ii) virtual observatories, (iii) astrophysical data systems, (iv) concurrent design capabilities for the development of international space missions, and (v) theoretical astrophysics such as applications of nonextensive statistical mechanics. Beginning in 2005, the workshops focus on preparations for the International Heliophysical Year 2007 (IHY2007). The workshops continue to facilitate the establishment of astronomical telescope facilities as pursued by Japan and the development of low-cost, ground-based, world-wide instrument arrays as led by the IHY secretariat. Further information: Wamsteker, W., Albrecht, R. and Haubold, H.J.: Developing Basic Space Science World-Wide: A Decade of UN/ESA Workshops. Kluwer Academic Publishers, Dordrecht 2004. http://ihy2007.org http://www.oosa.unvienna.org/SAP/bss/ihy2007/index.html http://www.cbpf.br/GrupPesq/StatisticalPhys/biblio.htm

  11. Intuitive statistics by 8-month-old infants

    PubMed Central

    Xu, Fei; Garcia, Vashti

    2008-01-01

    Human learners make inductive inferences based on small amounts of data: we generalize from samples to populations and vice versa. The academic discipline of statistics formalizes these intuitive statistical inferences. What is the origin of this ability? We report six experiments investigating whether 8-month-old infants are “intuitive statisticians.” Our results showed that, given a sample, the infants were able to make inferences about the population from which the sample had been drawn. Conversely, given information about the entire population of relatively small size, the infants were able to make predictions about the sample. Our findings provide evidence that infants possess a powerful mechanism for inductive learning, either using heuristics or basic principles of probability. This ability to make inferences based on samples or information about the population develops early and in the absence of schooling or explicit teaching. Human infants may be rational learners from very early in development. PMID:18378901
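The inference the infants perform has a simple probabilistic counterpart: how likely a given sample is under random draws from the box. The box composition and sample sizes below are made-up numbers in the spirit of the experiments, not the actual stimuli.

```python
from math import comb

def sample_prob(red_in_box, white_in_box, red_drawn, white_drawn):
    """Hypergeometric probability of drawing this sample without replacement."""
    total = red_in_box + white_in_box
    n = red_drawn + white_drawn
    return (comb(red_in_box, red_drawn) * comb(white_in_box, white_drawn)
            / comb(total, n))

# From a box of 70 red and 5 white balls, a 4-red/1-white sample is vastly
# more probable than a 1-red/4-white sample.
p_expected = sample_prob(70, 5, 4, 1)
p_surprising = sample_prob(70, 5, 1, 4)
```

Looking longer at the improbable outcome is consistent with sensitivity to exactly this asymmetry, whether implemented via heuristics or via basic probability.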

  12. Analysis of basic clustering algorithms for numerical estimation of statistical averages in biomolecules.

    PubMed

    Anandakrishnan, Ramu; Onufriev, Alexey

    2008-03-01

    In statistical mechanics, the equilibrium properties of a physical system of particles can be calculated as the statistical average over accessible microstates of the system. In general, these calculations are computationally intractable since they involve summations over an exponentially large number of microstates. Clustering algorithms are one of the methods used to numerically approximate these sums. The most basic clustering algorithms first sub-divide the system into a set of smaller subsets (clusters). Then, interactions between particles within each cluster are treated exactly, while all interactions between different clusters are ignored. These smaller clusters have far fewer microstates, making the summation over these microstates tractable. These algorithms have been previously used for biomolecular computations, but remain relatively unexplored in this context. Presented here is a theoretical analysis of the error and computational complexity for the two most basic clustering algorithms that were previously applied in the context of biomolecular electrostatics. We derive a tight, computationally inexpensive, error bound for the equilibrium state of a particle computed via these clustering algorithms. For some practical applications, it is the root mean square error, which can be significantly lower than the error bound, that may be more important. We show that there is a strong empirical relationship between error bound and root mean square error, suggesting that the error bound could be used as a computationally inexpensive metric for predicting the accuracy of clustering algorithms for practical applications. An example of error analysis for one such application, the computation of the average charge of ionizable amino acids in proteins, is given, demonstrating that the clustering algorithm can be accurate enough for practical purposes.
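The basic clustering idea can be sketched on a toy system of two-state sites small enough that the exact 2^N sum is available for comparison; all couplings below are illustrative, not from the paper.

```python
import itertools
import numpy as np

# Toy system of N two-state sites (e.g. protonation states of ionizable residues)
# with weak pairwise couplings; all numbers are made up for illustration.
rng = np.random.default_rng(2)
N = 8
h = rng.normal(size=N)                            # intrinsic site energies
J = np.triu(0.05 * rng.normal(size=(N, N)), 1)    # weak pair couplings

def energy(s, mask=None):
    s = np.asarray(s, dtype=float)
    Jm = J if mask is None else J * mask
    return h @ s + s @ Jm @ s

def average_occupancy(mask=None, beta=1.0):
    """Boltzmann average <s_i> by exact summation over all 2^N microstates."""
    Z, acc = 0.0, np.zeros(N)
    for s in itertools.product([0, 1], repeat=N):
        w = np.exp(-beta * energy(s, mask))
        Z += w
        acc += w * np.array(s)
    return acc / Z

exact = average_occupancy()
# Two-cluster approximation: zero every interaction between sites 0-3 and 4-7.
# With the clusters decoupled the distribution factorizes, so this equals the
# product of two independent 2^(N/2)-state sums -- the source of the speedup.
mask = np.zeros((N, N))
mask[:4, :4] = mask[4:, 4:] = 1.0
clustered = average_occupancy(mask)
```

For weak inter-cluster couplings the approximation error in the site occupancies stays small, which is the regime the paper's error bound quantifies.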

  13. Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    2010-08-15

    One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward real physical theory.

  14. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters for a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  15. The classification of wind shears from the point of view of aerodynamics and flight mechanics

    NASA Technical Reports Server (NTRS)

    Seidler, Fritz; Hensel, Gunter

    1987-01-01

    A study of international statistical data shows that in about three quarters of all serious accidents which occurred with jet-propelled airliners, wind shear was either one of the main causes of the accident or represented a major contributory cause. Wind shear related problems are examined. The necessity of using different concepts, definitions, and divisions is explained, and the concepts and definitions required for the division of wind and wind shear into different categories are discussed. A description of the relationship between meteorological and aerodynamics-flight mechanics concepts, definitions, and divisions is also provided. Attention is given to wind and wind components, general characteristics of wind shear and the meteorological terms, the basic types of wind shear for aerodynamics-flight mechanics investigations, special types of wind shear for aerodynamics-flight mechanics investigations, and possibilities regarding a change of the wind component.

  16. Health financing and integration of urban and rural residents' basic medical insurance systems in China.

    PubMed

    Zhu, Kun; Zhang, Luying; Yuan, Shasha; Zhang, Xiaojuan; Zhang, Zhiruo

    2017-11-07

    China is in the process of integrating the new cooperative medical scheme (NCMS) and the urban residents' basic medical insurance system (URBMI) into the urban and rural residents' basic medical insurance system (URRBMI). However, how to integrate the financing policies of NCMS and URBMI has not been described in detail. This paper attempts to illustrate the differences between the financing mechanisms of NCMS and URBMI, to analyze financing inequity between urban and rural residents and to identify financing mechanisms for integrating urban and rural residents' medical insurance systems. Financing data for NCMS and URBMI (from 2008 to 2015) were collected from the China health statistics yearbook, the China health and family planning statistics yearbook, the National Handbook of NCMS Information, the China human resources and social security statistics yearbook, and the China social security yearbook. "Ability to pay" was introduced to measure inequity in health financing. Individual contributions to NCMS and URBMI as a function of per capita disposable income were used to analyze equity in health financing between rural and urban residents. URBMI had a financing mechanism that was similar to that used by NCMS in that public finance accounted for more than three quarters of the pooling funds. The scale of financing for NCMS was less than 5% of the per capita net income of rural residents and less than 2% of the per capita disposable income of urban residents for URBMI. Individual contributions to the NCMS and URBMI funds were less than 1% of their disposable and net incomes. Inequity in health financing between urban and rural residents in China was not improved as expected with the introduction of NCMS and URBMI. The role of the central government and local governments in financing NCMS and URBMI oscillated over the past decade. The scale of financing for URRBMI is insufficient for the increasing demands for medical services from the insured.
The pooling fund should be increased so that it can better adjust to China's rapidly aging population and epidemiological transitions as well as protect the insured from poverty due to illness. Individual contributions to the URBMI and NCMS funds were small in terms of contributors' incomes. The role of the central government and local governments in financing URRBMI was not clearly identified. Individual contributions to the URRBMI fund should be increased to ensure the sustainable development of URRBMI. Compulsory enrollment should be required so that URRBMI improves the social medical insurance system in China.

  17. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    ERIC Educational Resources Information Center

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  18. A basic introduction to statistics for the orthopaedic surgeon.

    PubMed

    Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef

    2012-02-01

    Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values.
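
    A quick way to see what a Type I error means in practice is to simulate experiments in which the null hypothesis is true and count false rejections. The sketch below (illustrative Python, not from the paper; it uses a large-sample z-test as a stand-in for the tests discussed) should reject at roughly the nominal 5% rate:

```python
import random
from statistics import NormalDist, mean, stdev

def two_sample_p(a, b):
    # Two-sided large-sample z-test for equal means (normal approximation).
    se = (stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b)) ** 0.5
    z = (mean(a) - mean(b)) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

random.seed(0)
alpha, rejections, trials = 0.05, 0, 2000
for _ in range(trials):
    # Both groups drawn from the same distribution, so the null is true
    # and every rejection below is a Type I error.
    a = [random.gauss(0, 1) for _ in range(50)]
    b = [random.gauss(0, 1) for _ in range(50)]
    if two_sample_p(a, b) < alpha:
        rejections += 1

print(round(rejections / trials, 3))  # close to alpha = 0.05 by construction
```

Raising the sample size does not change this false-positive rate; it changes power, i.e. the Type II error rate when the null is false.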

  19. Strength/Brittleness Classification of Igneous Intact Rocks Based on Basic Physical and Dynamic Properties

    NASA Astrophysics Data System (ADS)

    Aligholi, Saeed; Lashkaripour, Gholam Reza; Ghafoori, Mohammad

    2017-01-01

    This paper sheds further light on the fundamental relationships between simple methods, rock strength, and brittleness of igneous rocks. In particular, the relationship between mechanical (point load strength index I s(50) and brittleness value S 20), basic physical (dry density and porosity), and dynamic properties (P-wave velocity and Schmidt rebound values) for a wide range of Iranian igneous rocks is investigated. First, 30 statistical models (including simple and multiple linear regression analyses) were built to identify the relationships between mechanical properties and simple methods. The results imply that rocks with different Schmidt hardness (SH) rebound values have different physicomechanical properties or relations. Second, using these results, it was proved that dry density, P-wave velocity, and SH rebound value provide a fine complement to mechanical properties classification of rock materials. Further, a detailed investigation was conducted on the relationships between mechanical and simple tests, which are established with limited ranges of P-wave velocity and dry density. The results show that strength values decrease with the SH rebound value. In addition, there is a systematic trend between dry density, P-wave velocity, rebound hardness, and brittleness value of the studied rocks, and rocks with medium hardness have a higher brittleness value. Finally, a strength classification chart and a brittleness classification table are presented, providing reliable and low-cost methods for the classification of igneous rocks.
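
    The simple linear regression models used above to relate mechanical properties to index tests can be sketched in a few lines of Python; the velocity-strength pairs below are hypothetical illustrations, not the paper's measured Iranian rock data:

```python
def linreg(xs, ys):
    # Ordinary least squares for y = a + b*x, with R^2 goodness of fit.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical pairs: P-wave velocity (km/s) vs point load index Is(50) (MPa)
vp   = [3.1, 3.8, 4.4, 5.0, 5.6, 6.1]
is50 = [2.0, 3.4, 4.1, 5.2, 6.3, 6.9]
a, b, r2 = linreg(vp, is50)
print(round(b, 2), round(r2, 3))
```

In the paper's approach, such fits are built separately for subsets with limited ranges of dry density and P-wave velocity, which is why 30 models are compared rather than one.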

  20. Using Data Mining to Teach Applied Statistics and Correlation

    ERIC Educational Resources Information Center

    Hartnett, Jessica L.

    2016-01-01

    This article describes two class activities that introduce the concept of data mining and very basic data mining analyses. Assessment data suggest that students learned some of the conceptual basics of data mining, understood some of the ethical concerns related to the practice, and were able to perform correlations via the Statistical Package for…

  1. Simple Data Sets for Distinct Basic Summary Statistics

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.

    2011-01-01

    It is important to avoid ambiguity with numbers because unfortunate choices of numbers can inadvertently make it possible for students to form misconceptions or make it difficult for teachers to tell if students obtained the right answer for the right reason. Therefore, it is important to make sure when introducing basic summary statistics that…

  2. How do energetic ions damage metallic surfaces?

    DOE PAGES

    Osetskiy, Yury N.; Calder, Andrew F.; Stoller, Roger E.

    2015-02-20

    Surface modification under bombardment by energetic ions is observed under a variety of conditions in structural and functional materials, and can be either an unavoidable effect of those conditions or a targeted modification to enhance material properties. Understanding the basic mechanisms is necessary for predicting property changes. The mechanisms activated during ion irradiation operate at the atomic scale, and atomic-scale modeling is the most suitable tool to study these processes. In this paper we present results of an extensive simulation program aimed at developing an understanding of primary surface damage in iron by energetic particles. We simulated 25 keV self-ion bombardment of Fe thin films with (100) and (110) surfaces at room temperature. A large number of simulations, ~400, were carried out to allow a statistically significant treatment of the results. The particular mechanism of surface damage depends on how the destructive supersonic shock wave generated by the displacement cascade interacts with the free surface. Three basic scenarios were observed, with the limiting cases being damage created far below the surface with little or no impact on the surface itself, and extensive direct surface damage on the timescale of a few picoseconds. In some instances, formation of large <100> vacancy loops beneath the free surface was observed, which may explain some earlier experimental observations.

  3. Variations in intensity statistics for representational and abstract art, and for art from the Eastern and Western hemispheres.

    PubMed

    Graham, Daniel J; Field, David J

    2008-01-01

    Two recent studies suggest that natural scenes and paintings show similar statistical properties. But does the content or region of origin of an artwork affect its statistical properties? We addressed this question by having judges place paintings from a large, diverse collection into one of three subject-matter categories using a forced-choice paradigm. Basic statistics for images whose categorization was agreed on by all judges showed no significant differences between those judged to be 'landscape' and 'portrait/still-life', but these two classes differed from paintings judged to be 'abstract'. All categories showed basic spatial statistical regularities similar to those typical of natural scenes. A test of the full painting collection (140 images) with respect to the works' place of origin (provenance) showed significant differences between Eastern works and Western ones, differences which we find are likely related to the materials and the choice of background color. Although artists deviate slightly from reproducing natural statistics in abstract art (compared to representational art), the great majority of human art likely shares basic statistical limitations. We argue that statistical regularities in art are rooted in the need to make art visible to the eye, not in the inherent aesthetic value of natural-scene statistics, and we suggest that variability in spatial statistics may be generally imposed by manufacture.

  4. Basic statistics (the fundamental concepts).

    PubMed

    Lim, Eric

    2014-12-01

    An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers. This is because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless clinicians have a basic understanding of statistics, they will never be in a position to question healthcare management decisions that have been handed down from generation to generation, and will not be able to conduct research effectively or evaluate the validity of published evidence (usually making the assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage are highlighted. However, it is not meant to be a substitute for formal training, or for consultation with a qualified and experienced medical statistician before starting any research project.

  5. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  6. From Research to Practice: Basic Mathematics Skills and Success in Introductory Statistics

    ERIC Educational Resources Information Center

    Lunsford, M. Leigh; Poplin, Phillip

    2011-01-01

    Based on previous research of Johnson and Kuennen (2006), we conducted a study to determine factors that would possibly predict student success in an introductory statistics course. Our results were similar to Johnson and Kuennen in that we found students' basic mathematical skills, as measured on a test created by Johnson and Kuennen, were a…

  7. Words, rules, and mechanisms of language acquisition.

    PubMed

    Endress, Ansgar D; Bonatti, Luca L

    2016-01-01

    We review recent artificial language learning studies, especially those following Endress and Bonatti (Endress AD, Bonatti LL. Rapid learning of syllable classes from a perceptually continuous speech stream. Cognition 2007, 105:247-299), suggesting that humans can deploy a variety of learning mechanisms to acquire artificial languages. Several experiments provide evidence for multiple learning mechanisms that can be deployed in fluent speech: one mechanism encodes the positions of syllables within words and can be used to extract generalization, while the other registers co-occurrence statistics of syllables and can be used to break a continuum into its components. We review dissociations between these mechanisms and their potential role in language acquisition. We then turn to recent criticisms of the multiple mechanisms hypothesis and show that they are inconsistent with the available data. Our results suggest that artificial and natural language learning is best understood by dissecting the underlying specialized learning abilities, and that these data provide a rare opportunity to link important language phenomena to basic psychological mechanisms. For further resources related to this article, please visit the WIREs website. © 2015 Wiley Periodicals, Inc.

  8. Bench to bedside: the quest for quality in experimental stroke research.

    PubMed

    Dirnagl, Ulrich

    2006-12-01

    Over the past decades, great progress has been made in clinical as well as experimental stroke research. Disappointingly, however, hundreds of clinical trials testing neuroprotective agents have failed despite efficacy in experimental models. Recently, several systematic reviews have exposed a number of important deficits in the quality of preclinical stroke research. Many of the issues raised in these reviews are not specific to experimental stroke research, but apply to studies of animal models of disease in general. It is the aim of this article to review some quality-related sources of bias with a particular focus on experimental stroke research. Weaknesses discussed include, among others, low statistical power (and hence poor reproducibility), defects in statistical analysis, lack of blinding and randomization, lack of quality-control mechanisms, deficiencies in reporting, and negative publication bias. Although quantitative evidence for quality problems at present is restricted to preclinical stroke research, to spur discussion and in the hope that they will be exposed to meta-analysis in the near future, I have also included some quality-related sources of bias which have not been systematically studied. Importantly, these may also be relevant to mechanism-driven basic stroke research. I propose that by a number of rather simple measures the reproducibility of experimental results, as well as the step from bench to bedside in stroke research, may be made more successful. However, the ultimate proof of this has to await successful phase III stroke trials built on basic research conforming to the criteria put forward in this article.

  9. Universal Algorithm for Identification of Fractional Brownian Motion. A Case of Telomere Subdiffusion

    PubMed Central

    Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander

    2012-01-01

    We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics over six orders of magnitude in time, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even more chaotic - mixing. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement, reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion, without invoking other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily applied to other data sets to enable quick and accurate analysis of their statistical characteristics. PMID:23199912
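
    The mean-square-displacement analysis behind such subdiffusion claims can be illustrated on a synthetic trajectory. The Python sketch below is a stand-in under stated assumptions: it generates ordinary Brownian motion, for which the estimated scaling exponent should be close to 1, whereas subdiffusive telomere data would give an exponent below 1:

```python
import math
import random

random.seed(1)
# Synthetic 1-D Brownian trajectory (not telomere data).
x = [0.0]
for _ in range(20000):
    x.append(x[-1] + random.gauss(0, 1))

def tamsd(traj, lag):
    # Time-averaged mean square displacement at a given lag.
    n = len(traj) - lag
    return sum((traj[i + lag] - traj[i]) ** 2 for i in range(n)) / n

lags = [1, 2, 4, 8, 16, 32]
msd = [tamsd(x, lag) for lag in lags]

# Anomalous diffusion exponent = slope of log(MSD) vs log(lag);
# ~1 for Brownian motion, <1 for subdiffusion such as fBm with H < 1/2.
lx = [math.log(lag) for lag in lags]
ly = [math.log(m) for m in msd]
mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
alpha = sum((a - mx) * (b - my) for a, b in zip(lx, ly)) / \
        sum((a - mx) ** 2 for a in lx)
print(round(alpha, 2))  # expected near 1 for this Brownian trajectory
```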

  10. Foundations of radiation hydrodynamics

    NASA Astrophysics Data System (ADS)

    Mihalas, D.; Mihalas, B. W.

    This book is the result of an attempt, over the past few years, to gather the basic tools required to do research on radiating flows in astrophysics. The microphysics of gases is discussed, taking into account the equation of state of a perfect gas, the first and second law of thermodynamics, the thermal properties of a perfect gas, the distribution function and Boltzmann's equation, the collision integral, the Maxwellian velocity distribution, Boltzmann's H-theorem, the time of relaxation, and aspects of classical statistical mechanics. Other subjects explored are related to the dynamics of ideal fluids, the dynamics of viscous and heat-conducting fluids, relativistic fluid flow, waves, shocks, winds, radiation and radiative transfer, the equations of radiation hydrodynamics, and radiating flows. Attention is given to small-amplitude disturbances, nonlinear flows, the interaction of radiation and matter, the solution of the transfer equation, acoustic waves, acoustic-gravity waves, basic concepts of special relativity, and equations of motion and energy.

  11. Statistical thermodynamics unveils the dissolution mechanism of cellobiose.

    PubMed

    Nicol, Thomas W J; Isobe, Noriyuki; Clark, James H; Shimizu, Seishi

    2017-08-30

    In the study of the cellulose dissolution mechanism, opinion is still divided. Here, the solution interaction components of the most prominent hypotheses for the driving force of cellulose dissolution were evaluated quantitatively. Combining a rigorous statistical thermodynamic theory and cellobiose solubility data in the presence of chloride salts whose cations progress in the Hofmeister series (KCl, NaCl, LiCl and ZnCl2), we have shown that cellobiose solubilization is driven by the preferential accumulation of salts around the solutes, which is stronger than cellobiose hydration. Yet contrary to the classical chaotropy hypothesis, increasing salt concentration leads to cellobiose dehydration in the presence of the strongest solubilizer, ZnCl2. However, thanks to cellobiose dehydration, the cellobiose-salt interaction still remains preferential despite weakening salt accumulation. Based on such insights, the previous hypotheses based on hydrophobicity and polymer charging have also been evaluated quantitatively. Thus, our present study successfully paves the way towards identifying the basic driving forces for cellulose solubilization in a quantitative manner for the first time. When combined with unit additivity methods, this quantitative information could lead to a full understanding of cellulose solubility.

  12. Self-Organization: Complex Dynamical Systems in the Evolution of Speech

    NASA Astrophysics Data System (ADS)

    Oudeyer, Pierre-Yves

    Human vocalization systems are characterized by complex structural properties. They are combinatorial, based on the systematic reuse of phonemes, and the set of repertoires in human languages is characterized by both strong statistical regularities—universals—and a great diversity. Besides, they are conventional codes culturally shared in each community of speakers. What are the origins of the forms of speech? What are the mechanisms that permitted their evolution in the course of phylogenesis and cultural evolution? How can a shared speech code be formed in a community of individuals? This chapter focuses on the way the concept of self-organization, and its interaction with natural selection, can throw light on these three questions. In particular, a computational model is presented which shows that a basic neural equipment for adaptive holistic vocal imitation, coupling directly motor and perceptual representations in the brain, can generate spontaneously shared combinatorial systems of vocalizations in a society of babbling individuals. Furthermore, we show how morphological and physiological innate constraints can interact with these self-organized mechanisms to account for both the formation of statistical regularities and diversity in vocalization systems.

  13. Evolution of the use of noninvasive mechanical ventilation in chronic obstructive pulmonary disease in a Spanish region, 1997-2010.

    PubMed

    Carpe-Carpe, Bienvenida; Hernando-Arizaleta, Lauro; Ibáñez-Pérez, M Carmen; Palomar-Rodríguez, Joaquín A; Esquinas-Rodríguez, Antonio M

    2013-08-01

    Noninvasive mechanical ventilation (NIV) appeared in the 1980s as an alternative to invasive mechanical ventilation (IMV) in patients with acute respiratory failure. We evaluated the introduction of NIV and the results in patients with acute exacerbation of chronic obstructive pulmonary disease in the Region of Murcia (Spain). A retrospective observational study based on the minimum basic hospital discharge data of all patients hospitalised for this pathology in all public hospitals in the region between 1997 and 2010. We performed a time trend analysis on hospital attendance, the use of each ventilatory intervention and hospital mortality through joinpoint regression. We identified 30,027 hospital discharges. Joinpoint analysis: downward trend in attendance (annual percentage change [APC]=-3.4, 95% CI: -4.8; -2.0, P<.05) and in the group without ventilatory intervention (APC=-4.2, -5.6; -2.8, P<.05); upward trend in the use of NIV (APC=16.4, 12.0; 20.9, P<.05), and downward trend that was not statistically significant in IMV (APC=-4.5, -10.3; 1.7). We observed an upward trend without statistical significance in overall mortality (APC=0.5, -1.3; 2.4) and in the group without intervention (APC=0.1, -1.6; 1.9); a downward trend with statistical significance in the NIV group (APC=-7.1, -11.7; -2.2, P<.05) and a not statistically significant one in the IMV group (APC=-0.8, -6.1; 4.8). The mean stay did not change substantially. The introduction of NIV has reduced the group of patients not receiving assisted ventilation. No improvement in results was found in terms of mortality or length of stay. Copyright © 2012 SEPAR. Published by Elsevier España. All rights reserved.
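
    An annual percentage change (APC) of the kind reported above is obtained from a log-linear trend fit within each joinpoint segment; a minimal single-segment sketch (synthetic rates, not the Murcia data) is:

```python
import math

def apc(years, rates):
    # Fit log(rate) = a + b*year by least squares; APC = (e^b - 1) * 100.
    n = len(years)
    my = sum(years) / n
    ml = sum(math.log(r) for r in rates) / n
    b = sum((y - my) * (math.log(r) - ml) for y, r in zip(years, rates)) / \
        sum((y - my) ** 2 for y in years)
    return (math.exp(b) - 1) * 100

years = list(range(1997, 2011))
rates = [100 * 1.05 ** (y - 1997) for y in years]  # exact 5% annual growth
print(round(apc(years, rates), 1))  # → 5.0
```

Full joinpoint software additionally searches for the change-points between such segments and tests whether each segment's APC differs from zero.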

  14. Descriptive data analysis.

    PubMed

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  15. Water Interfaces, Solvation, and Spectroscopy

    NASA Astrophysics Data System (ADS)

    Geissler, Phillip L.

    2013-04-01

    Liquid water consistently expands our appreciation of the rich statistical mechanics that can emerge from simple molecular constituents. Here I review several interrelated areas of recent work on aqueous systems that aim to explore and explain this richness by revealing molecular arrangements, their thermodynamic origins, and the timescales on which they change. Vibrational spectroscopy of OH stretching features prominently in these discussions, with an emphasis on efforts to establish connections between spectroscopic signals and statistics of intermolecular structure. For bulk solutions, the results of these efforts largely verify and enrich existing physical pictures of hydrogen-bond network connectivity, dynamics, and response. For water at interfaces, such pictures are still emerging. As an important example I discuss the solvation of small ions at the air-water interface, whose surface propensities challenge a basic understanding of how aqueous fluctuations accommodate solutes in heterogeneous environments.

  16. Like Beauty, Complexity is Hard to Define

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    Like beauty, complexity is hard to define and rather easy to identify: nonlinear dynamics, strongly interconnected simple elements, some sort of divisoria aquarum between order and disorder. Before focusing on complexity, let us remember that the theoretical pillars of contemporary physics are mechanics (Newtonian, relativistic, quantum), Maxwell electromagnetism, and (Boltzmann-Gibbs, BG) statistical mechanics - obligatory basic disciplines in any advanced course in physics. The first-principle statistical-mechanical approach starts from (microscopic) electro-mechanics and the theory of probabilities, and, through a variety of possible mesoscopic descriptions, arrives at (macroscopic) thermodynamics. In the middle of this trip, we cross energy and entropy. Energy is related to the possible microscopic configurations of the system, whereas entropy is related to the corresponding probabilities. Therefore, in some sense, entropy represents a concept which, epistemologically speaking, is one step further with regard to energy. The fact that energy is not parameter-independent is very familiar: the kinetic energy of a truck is very different from that of a fly, the relativistic energy of a fast electron is very different from its classical value, and so on. What about entropy? One hundred and forty years of tradition, and hundreds - we may even say thousands - of impressive theoretical successes of the parameter-free BG entropy have sedimented, in the mind of many scientists, the conviction that it is unique. However, it can be straightforwardly argued that, in general, this is not the case...

  17. A Multidisciplinary Approach for Teaching Statistics and Probability

    ERIC Educational Resources Information Center

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  18. Applications of statistics to medical science (1) Fundamental concepts.

    PubMed

    Watanabe, Hiroshi

    2011-01-01

    The conceptual framework of statistical tests and statistical inference is discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and the practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.

  19. Fracture mechanics concepts in reliability analysis of monolithic ceramics

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.; Gyekenyesi, John P.

    1987-01-01

    Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
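
    The combination of a Weibull strength distribution with a threshold (third) parameter described above can be written compactly; the stress and modulus values below are illustrative, not material data from the report:

```python
import math

def failure_probability(sigma, m, sigma0, sigma_u=0.0):
    # Weibull failure probability: Pf = 1 - exp(-((sigma - sigma_u)/sigma0)^m)
    # m: Weibull modulus, sigma0: scale (characteristic strength),
    # sigma_u: threshold strength (the third parameter, often taken as 0).
    if sigma <= sigma_u:
        return 0.0  # below the threshold, no failure is predicted
    return 1.0 - math.exp(-((sigma - sigma_u) / sigma0) ** m)

# At the characteristic strength (sigma = sigma0, zero threshold),
# Pf = 1 - 1/e ≈ 0.632 regardless of the modulus m.
print(round(failure_probability(350.0, m=10, sigma0=350.0), 3))  # → 0.632
```

A larger modulus m means a narrower strength scatter, which is why m is the key statistical parameter estimated from verification testing.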

  20. A Comparison of Computer-Assisted Instruction and the Traditional Method of Teaching Basic Statistics

    ERIC Educational Resources Information Center

    Ragasa, Carmelita Y.

    2008-01-01

    The objective of the study is to determine if there is a significant difference in the effects of the treatment and control groups on achievement as well as on attitude, as measured by the posttest. A class of 38 sophomore college students in basic statistics was taught with the use of computer-assisted instruction, and another class of 15 students…

  1. Back to basics: an introduction to statistics.

    PubMed

    Halfens, R J G; Meijers, J M M

    2013-05-01

    In the second in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.

  2. Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Sugiyama, Masaru

    Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated, and accordingly the additivity postulate for thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions, like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state.
The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader about one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one, by Tsallis and Brigatti, presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might seem completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can uniquely be characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness under generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems. Finally, Beck presents a novel idea, the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection are also discussed.
Nonextensive statistical mechanics is already a well-studied field, and a number of works are available in the literature. It is recommended that the interested reader visit the URL http://tsallis.cat.cbpf.br/TEMUCO.pdf. There, one can find a comprehensive list of references to more than one thousand papers, including important results that, due to lack of space, have not been mentioned in the present issue. Though there are so many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities not only in the very field of nonextensive statistical mechanics but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who contributed to making this issue. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.

  3. Coupling functions: Universal insights into dynamical interaction mechanisms

    NASA Astrophysics Data System (ADS)

    Stankovski, Tomislav; Pereira, Tiago; McClintock, Peter V. E.; Stefanovska, Aneta

    2017-10-01

    The dynamical systems found in nature are rarely isolated. Instead they interact and influence each other. The coupling functions that connect them contain detailed information about the functional mechanisms underlying the interactions and prescribe the physical rule specifying how an interaction occurs. A coherent and comprehensive review is presented encompassing the rapid progress made recently in the analysis, understanding, and applications of coupling functions. The basic concepts and characteristics of coupling functions are presented through demonstrative examples of different domains, revealing the mechanisms and emphasizing their multivariate nature. The theory of coupling functions is discussed through gradually increasing complexity from strong and weak interactions to globally coupled systems and networks. A variety of methods that have been developed for the detection and reconstruction of coupling functions from measured data is described. These methods are based on different statistical techniques for dynamical inference. Stemming from physics, such methods are being applied in diverse areas of science and technology, including chemistry, biology, physiology, neuroscience, social sciences, mechanics, and secure communications. This breadth of application illustrates the universality of coupling functions for studying the interaction mechanisms of coupled dynamical systems.

  4. Bayesian models: A statistical primer for ecologists

    USGS Publications Warehouse

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. It presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
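The Markov chain Monte Carlo machinery mentioned above can be illustrated with a minimal Metropolis sampler. The following Python sketch is illustrative only (not the book's code; the binomial example and all names are invented): it samples the posterior of a binomial success probability under a uniform prior, where the exact answer is a Beta distribution.

```python
import math
import random

def log_posterior(p, successes, trials):
    """Log of the (unnormalized) posterior: binomial likelihood x uniform prior."""
    if not 0.0 < p < 1.0:
        return float("-inf")
    return successes * math.log(p) + (trials - successes) * math.log(1.0 - p)

def metropolis(successes, trials, steps=20000, step_size=0.1, seed=1):
    rng = random.Random(seed)
    p = 0.5                      # arbitrary starting value inside (0, 1)
    samples = []
    for _ in range(steps):
        proposal = p + rng.uniform(-step_size, step_size)
        delta = log_posterior(proposal, successes, trials) - log_posterior(p, successes, trials)
        # Metropolis rule: accept with probability min(1, posterior ratio)
        if delta >= 0 or rng.random() < math.exp(delta):
            p = proposal
        samples.append(p)
    return samples[steps // 2:]  # discard the first half as burn-in

samples = metropolis(successes=30, trials=100)
posterior_mean = sum(samples) / len(samples)
# The exact posterior is Beta(31, 71), whose mean is 31/102 ≈ 0.304.
```

The same accept/reject step generalizes unchanged to the hierarchical models the book emphasizes; only `log_posterior` grows.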

  5. Understanding Statistical Concepts and Terms in Context: The GovStat Ontology and the Statistical Interactive Glossary.

    ERIC Educational Resources Information Center

    Haas, Stephanie W.; Pattuelli, Maria Cristina; Brown, Ron T.

    2003-01-01

    Describes the Statistical Interactive Glossary (SIG), an enhanced glossary of statistical terms supported by the GovStat ontology of statistical concepts. Presents a conceptual framework whose components articulate different aspects of a term's basic explanation that can be manipulated to produce a variety of presentations. The overarching…

  6. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
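The hidden Markov models introduced in the chapter can be made concrete with the forward algorithm, which computes the likelihood of an observation sequence by summing over all hidden state paths. The two-state model below is a hedged illustration with invented parameters, not an example taken from the chapter:

```python
def forward_likelihood(obs, init, trans, emit):
    """P(obs) for an HMM with initial probs, transition and emission matrices."""
    n_states = len(init)
    # Initialization: alpha[s] = P(state s) * P(first observation | s)
    alpha = [init[s] * emit[s][obs[0]] for s in range(n_states)]
    for o in obs[1:]:
        # Recursion: sum over previous states, then emit the next symbol
        alpha = [
            sum(alpha[sp] * trans[sp][s] for sp in range(n_states)) * emit[s][o]
            for s in range(n_states)
        ]
    return sum(alpha)  # termination: marginalize over the final state

init = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit = [[0.9, 0.1], [0.2, 0.8]]   # rows: hidden states; cols: symbols 0/1
lik = forward_likelihood([0, 1, 0], init, trans, emit)
```

For this tiny model the likelihood can be checked by hand (it is 0.10893); in genomics applications the same recursion is run in log space over much longer sequences.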

  7. Universal algorithm for identification of fractional Brownian motion. A case of telomere subdiffusion.

    PubMed

    Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander

    2012-11-07

    We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics over six orders of magnitude in time, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even mixing, a property stronger than ergodicity. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement, reveal subdiffusive behavior at all time spans. All these findings statistically support fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion, without the need to invoke other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can easily be applied to other data sets to enable quick and accurate analysis of their statistical characteristics. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
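The subdiffusion analysis described above rests on the time-averaged mean square displacement and a log-log fit of MSD ~ t^alpha, with alpha < 1 indicating subdiffusion. The sketch below is a hedged illustration of that estimator, not the authors' pipeline: for simplicity it is tested on an ordinary random walk, for which alpha ≈ 1, whereas real telomere trajectories would give alpha < 1.

```python
import math
import random

def ta_msd(x, lag):
    """Time-averaged mean square displacement of trajectory x at a given lag."""
    return sum((x[i + lag] - x[i]) ** 2 for i in range(len(x) - lag)) / (len(x) - lag)

def anomalous_exponent(x, lags):
    """Slope alpha of log(MSD) vs log(lag), via ordinary least squares."""
    pts = [(math.log(l), math.log(ta_msd(x, l))) for l in lags]
    n = len(pts)
    mx = sum(a for a, _ in pts) / n
    my = sum(b for _, b in pts) / n
    return sum((a - mx) * (b - my) for a, b in pts) / sum((a - mx) ** 2 for a, _ in pts)

# Test trajectory: a simple +/-1 random walk (ordinary diffusion, alpha = 1).
rng = random.Random(7)
x = [0]
for _ in range(20000):
    x.append(x[-1] + rng.choice((-1, 1)))
alpha = anomalous_exponent(x, lags=[1, 2, 4, 8, 16, 32])
```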

  8. Tradeoffs between hydraulic and mechanical stress responses of mature Norway spruce trunk wood.

    PubMed

    Rosner, Sabine; Klein, Andrea; Müller, Ulrich; Karlsson, Bo

    2008-08-01

    We tested the effects of growth characteristics and basic density on hydraulic and mechanical properties of mature Norway spruce (Picea abies (L.) Karst.) wood from six 24-year-old clones, grown on two sites in southern Sweden differing in water availability. Hydraulic parameters assessed were specific hydraulic conductivity at full saturation (ks100) and vulnerability to cavitation (Psi50), mechanical parameters included bending strength (sigma b), modulus of elasticity (MOE), compression strength (sigma a) and Young's modulus (E). Basic density, diameter at breast height, tree height, and hydraulic and mechanical parameters varied considerably among clones. Clonal means of hydraulic and mechanical properties were strongly related to basic density and to growth parameters across sites, especially to diameter at breast height. Compared with stem wood of slower growing clones, stem wood of rapidly growing clones had significantly lower basic density, lower sigma b, MOE, sigma a and E, was more vulnerable to cavitation, but had higher ks100. Basic density was negatively correlated to Psi50 and ks100. We therefore found a tradeoff between Psi50 and ks100. Clones with high basic density had significantly lower hydraulic vulnerability, but also lower hydraulic conductivity at full saturation and thus less rapid growth than clones with low basic density. This tradeoff involved a negative relationship between Psi50 and sigma b as well as MOE, and between ks100 and sigma b, MOE and sigma a. Basic density and Psi50 showed no site-specific differences, but tree height, diameter at breast height, ks100 and mechanical strength and stiffness were significantly lower at the drier site. Basic density had no influence on the site-dependent differences in hydraulic and mechanical properties, but was strongly negatively related to diameter at breast height. 
Selecting for growth may thus lead not only to a reduction in mechanical strength and stiffness but also to a reduction in hydraulic safety.

  9. The energetic cost of walking: a comparison of predictive methods.

    PubMed

    Kramer, Patricia Ann; Sylvester, Adam D

    2011-01-01

    The energy that animals devote to locomotion has been of intense interest to biologists for decades, and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic energy and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is "best", but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches was assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs.
Although we used modern humans as our model organism, these results can be extended to other species.

  10. Fish: A New Computer Program for Friendly Introductory Statistics Help

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Raffle, Holly

    2005-01-01

    All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
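The standard-error and central-limit ideas that a tool like FISH is designed to teach can be demonstrated with a short simulation. The following sketch is illustrative only (it is not part of FISH): it checks that the observed spread of sample means matches the theoretical standard error sigma/sqrt(n).

```python
import math
import random

rng = random.Random(42)
n = 25
means = []
for _ in range(4000):
    # Draw repeated samples from Uniform(0, 1), whose variance is 1/12.
    sample = [rng.uniform(0, 1) for _ in range(n)]
    means.append(sum(sample) / n)

grand_mean = sum(means) / len(means)
sd_of_means = math.sqrt(
    sum((m - grand_mean) ** 2 for m in means) / (len(means) - 1)
)
theoretical_se = math.sqrt(1 / 12) / math.sqrt(n)  # sigma / sqrt(n) ≈ 0.0577
```

The empirical standard deviation of the 4000 sample means lands close to the theoretical standard error, which is exactly the insight the central limit theorem packages for students.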

  11. Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces.

    PubMed

    Spezia, Riccardo; Martínez-Nuñez, Emilio; Vazquez, Saulo; Hase, William L

    2017-04-28

    In this Introduction, we show the basic problems of non-statistical and non-equilibrium phenomena related to the papers collected in this themed issue. Over the past few years, significant advances in both computing power and the development of theories have allowed the study of larger systems, increasing the time length of simulations and improving the quality of potential energy surfaces. In particular, the possibility of using quantum chemistry to calculate energies and forces 'on the fly' has paved the way to directly studying chemical reactions. This has provided a valuable tool to explore molecular mechanisms at given temperatures and energies and to see whether these reactive trajectories follow statistical laws and/or minimum energy pathways. This themed issue collects different aspects of the problem and gives an overview of recent works and developments in different contexts, from the gas phase to the condensed phase to excited states. This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'. © 2017 The Author(s).

  12. Properties of J^P = 1/2^+ baryon octets at low energy

    NASA Astrophysics Data System (ADS)

    Kaur, Amanpreet; Gupta, Pallavi; Upadhyay, Alka

    2017-06-01

    The statistical model in combination with the detailed balance principle is able to phenomenologically calculate and analyze spin- and flavor-dependent properties like magnetic moments (with effective masses, with effective charge, or with both effective mass and effective charge), quark spin polarization and distribution, the strangeness suppression factor, and the \overline{d}-\overline{u} asymmetry, incorporating the strange sea. The s\overline{s} pairs in the sea are said to be generated via the basic quark mechanism but suppressed by the strange-quark mass factor m_s > m_{u,d}. The magnetic moments of the octet baryons are analyzed within the statistical model, with emphasis on the SU(3) symmetry-breaking effects generated by the mass difference between the strange and non-strange quarks. The work presented here assumes hadrons with a sea having an admixture of quark-gluon Fock states. The results obtained have been compared with theoretical models and experimental data.

  13. Quantitative analysis of spatial variability of geotechnical parameters

    NASA Astrophysics Data System (ADS)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic inputs of geotechnical engineering design, and they show strong regional characteristics. At the same time, the spatial variability of geotechnical parameters has come to be recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is quantitatively analyzed, the parameters are evaluated, and the correlation coefficients between them are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 layers of mechanical stratification. The parameters are water content, natural gravity, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and SP index. According to the principle of statistical correlation, the correlation coefficients of the geotechnical parameters are calculated, and from these coefficients the statistical regularities of the parameters are obtained.
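For each pair of parameters, the correlation analysis described above reduces to the Pearson correlation coefficient. A minimal sketch, using invented water-content and void-ratio values rather than the Tianjin borehole data:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative values only, not measured data.
water_content = [22.1, 25.4, 28.0, 30.2, 33.5]   # percent
void_ratio = [0.62, 0.68, 0.71, 0.75, 0.81]
r = pearson(water_content, void_ratio)
```

Applying this to every pair of the eleven listed parameters yields the correlation matrix from which the statistical regularities are read off.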

  14. CADDIS Volume 4. Data Analysis: Basic Principles & Issues

    EPA Pesticide Factsheets

    Use of inferential statistics in causal analysis, an introduction to data independence and autocorrelation, methods for identifying and controlling for confounding variables, and references for the Basic Principles section of Data Analysis.

  15. Are We Able to Pass the Mission of Statistics to Students?

    ERIC Educational Resources Information Center

    Hindls, Richard; Hronová, Stanislava

    2015-01-01

    The article illustrates our long-term experience in teaching statistics to non-statisticians, especially students of economics and the humanities. It focuses on some problems of the basic course that can weaken interest in statistics or lead to misuse of statistical methods.

  16. Achievement Test Program.

    ERIC Educational Resources Information Center

    Ohio State Dept. of Education, Columbus. Trade and Industrial Education Service.

    The Ohio Trade and Industrial Education Achievement Test battery is comprised of seven basic achievement tests: Machine Trades, Automotive Mechanics, Basic Electricity, Basic Electronics, Mechanical Drafting, Printing, and Sheet Metal. The tests were developed by subject matter committees and specialists in testing and research. The Ohio Trade and…

  17. [Effects of Self-directed Feedback Practice using Smartphone Videos on Basic Nursing Skills, Confidence in Performance and Learning Satisfaction].

    PubMed

    Lee, Seul Gi; Shin, Yun Hee

    2016-04-01

    This study was done to verify the effects of self-directed feedback practice using smartphone videos on nursing students' basic nursing skills, confidence in performance, and learning satisfaction. An experimental post-test-only control-group design was used. Twenty-nine students were assigned to the experimental group and 29 to the control group. The experimental treatment consisted of exchanging feedback on deficiencies through smartphone videos of the nursing practice process recorded by peers during self-directed practice. Basic nursing skills scores were higher for all items in the experimental group compared to the control group, and the differences were statistically significant ["Measuring vital signs" (t=-2.10, p=.039); "Wearing protective equipment when entering and exiting the quarantine room and the management of waste materials" (t=-4.74, p<.001); "Gavage tube feeding" (t=-2.70, p=.009)]. Confidence in performance was higher in the experimental group compared to the control group, but the differences were not statistically significant. However, after the complete practice, there was a statistically significant difference in overall performance confidence (t=-3.07, p=.003). Learning satisfaction was higher in the experimental group compared to the control group, but the difference was not statistically significant (t=-1.67, p=.100). Results of this study indicate that self-directed feedback practice using smartphone videos can improve basic nursing skills. The significance is that it can help nursing students gain confidence in their nursing skills for the future through improvement of basic nursing skills and performance of quality care, thus providing patients with safer care.
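The group comparisons reported above are two-sample t tests. A minimal sketch of the pooled t statistic, using invented skill scores rather than the study's data:

```python
import math

def pooled_t(a, b):
    """Pooled two-sample t statistic for the difference of means of a and b."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical skill scores (0-100); not the study's measurements.
control = [70, 72, 68, 74, 71, 69]
experimental = [78, 80, 75, 82, 79, 77]
t = pooled_t(control, experimental)   # negative: experimental mean is higher
```

In practice one would use a library routine (e.g. a Welch test when variances differ) and read the p-value from the t distribution with na + nb - 2 degrees of freedom.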

  18. Provision of Pre-Primary Education as a Basic Right in Tanzania: Reflections from Policy Documents

    ERIC Educational Resources Information Center

    Mtahabwa, Lyabwene

    2010-01-01

    This study sought to assess provision of pre-primary education in Tanzania as a basic right through analyses of relevant policy documents. Documents which were published over the past decade were considered, including educational policies, action plans, national papers, the "Basic Education Statistics in Tanzania" documents, strategy…

  19. Test particle propagation in magnetostatic turbulence. 2: The local approximation method

    NASA Technical Reports Server (NTRS)

    Klimas, A. J.; Sandri, G.; Scudder, J. D.; Howell, D. R.

    1976-01-01

    An approximation method for statistical mechanics is presented and applied to a class of problems which contains a test particle propagation problem. All of the available basic equations used in statistical mechanics are cast in the form of a single equation which is integrodifferential in time and which is then used as the starting point for the construction of the local approximation method. Simplification of the integrodifferential equation is achieved through approximation to the Laplace transform of its kernel. The approximation is valid near the origin in the Laplace space and is based on the assumption of small Laplace variable. No other small parameter is necessary for the construction of this approximation method. The n'th level of approximation is constructed formally, and the first five levels of approximation are calculated explicitly. It is shown that each level of approximation is governed by an inhomogeneous partial differential equation in time with time independent operator coefficients. The order in time of these partial differential equations is found to increase as n does. At n = 0 the most local first order partial differential equation which governs the Markovian limit is regained.

  20. Nonequilibrium statistical mechanics Brussels-Austin style

    NASA Astrophysics Data System (ADS)

    Bishop, Robert C.

    The fundamental problem on which Ilya Prigogine and the Brussels-Austin Group have focused can be stated briefly as follows. Our observations indicate that there is an arrow of time in our experience of the world (e.g., decay of unstable radioactive atoms like uranium, or the mixing of cream in coffee). Most of the fundamental equations of physics are time reversible, however, presenting an apparent conflict between our theoretical descriptions and experimental observations. Many have thought that the observed arrow of time was either an artifact of our observations or due to very special initial conditions. An alternative approach, followed by the Brussels-Austin Group, is to consider the observed direction of time to be a basic physical phenomenon due to the dynamics of physical systems. This essay focuses mainly on recent developments in the Brussels-Austin Group after the mid-1980s. The fundamental concerns are the same as in their earlier approaches (subdynamics, similarity transformations), but the contemporary approach utilizes rigged Hilbert space (whereas the older approaches used Hilbert space). While the emphasis on nonequilibrium statistical mechanics remains the same, their more recent approach addresses the physical features of large Poincaré systems, nonlinear dynamics and the mathematical tools necessary to analyze them.

  1. Exploration in free word association networks: models and experiment.

    PubMed

    Ludueña, Guillermo A; Behzad, Mehran Djalali; Gros, Claudius

    2014-05-01

    Free association is a task that requires a subject to express the first word to come to mind when presented with a certain cue. It is a task that can be used to expose the basic mechanisms by which humans connect memories. In this work, we have made use of a publicly available database of free associations to model the exploration of the averaged network of associations using both a statistical model and the adaptive control of thought-rational (ACT-R) model. In addition, we performed an online experiment asking participants to navigate the averaged network using their individual preferences for word associations. We have investigated the statistics of word repetitions in this guided association task. We find that the considered models mimic several statistical properties of the experiment, viz. the probability of word repetitions, the distance between repetitions, and the distribution of association chain lengths, with the ACT-R model showing a particularly good fit to the experimental data for the more intricate properties such as, for instance, the ratio of repetitions per length of association chains.
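The exploration of an association network can be mimicked by a weighted random walk. The sketch below uses a tiny invented network (not the published free-association database) and computes a simple repetition statistic of the resulting chain:

```python
import random

# Invented toy association network: word -> list of (associate, weight).
network = {
    "dog":    [("cat", 0.6), ("bone", 0.4)],
    "cat":    [("dog", 0.5), ("mouse", 0.5)],
    "bone":   [("dog", 1.0)],
    "mouse":  [("cat", 0.7), ("cheese", 0.3)],
    "cheese": [("mouse", 1.0)],
}

def walk(start, steps, seed=0):
    """Weighted random walk on the association network, returning the chain."""
    rng = random.Random(seed)
    word, chain = start, [start]
    for _ in range(steps):
        associates, weights = zip(*network[word])
        word = rng.choices(associates, weights=weights)[0]
        chain.append(word)
    return chain

chain = walk("dog", 200)
# Fraction of visits that revisit an already-seen word.
repetition_rate = 1 - len(set(chain)) / len(chain)
```

On the real averaged network one would additionally record the distances between repetitions and the chain-length distribution, the quantities compared against the experiment above.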

  2. Spontaneous ultraweak photon emission from biological systems and the endogenous light field.

    PubMed

    Schwabl, Herbert; Klima, Herbert

    2005-04-01

    Still one of the most astonishing biological electromagnetic phenomena is the ultraweak photon emission (UPE) from living systems. Organisms and tissues spontaneously emit measurable intensities of light, i.e. photons in the visible part of the electromagnetic spectrum (380-780 nm), in the range from 1 to 1,000 photons s^-1 cm^-2, depending on their condition and vitality. It is important not to confuse UPE from living systems with other biogenic light-emitting processes such as bioluminescence or chemiluminescence. This article examines the empirical phenomenon of UPE using basic physical considerations on the quantum nature of photons. This leads to a description of the non-thermal origin of this radiation, in good correspondence with the modern understanding of life phenomena as dissipative processes far from thermodynamic equilibrium. UPE also supports the understanding of life-sustaining processes as basically driven by electromagnetic fields. The basic features of UPE, like intensity and spectral distribution, are known in principle for many experimental situations. The UPE of human leukocytes contributes to an endogenous light field of about 10^11 photons s^-1, which can be influenced by certain factors. Further research is needed to reveal the statistical properties of UPE and, in consequence, to answer questions about the underlying mechanisms of the biological system. In principle, the statistical properties of UPE allow reconstruction of the phase-space dynamics of the light-emitting structures. Many open questions remain before a proper understanding of the electromagnetic interaction of the human organism can be achieved: which structures act as receptors and emitters for electromagnetic radiation? How is electromagnetic information received and processed within cells?

  3. A crash course on data analysis in asteroseismology

    NASA Astrophysics Data System (ADS)

    Appourchaux, Thierry

    2014-02-01

    In this course, I try to provide a few basics required for performing data analysis in asteroseismology. First, I address how one can properly treat time series: the sampling, the filtering effect, the use of the Fourier transform, and the associated statistics. Second, I address how one can apply statistics for decision making and for parameter estimation, either in a frequentist or a Bayesian framework. Last, I review how these basic principles have been applied (or not) in asteroseismology.
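The Fourier-transform step of such an analysis amounts to computing a periodogram of the evenly sampled time series and locating its peaks. A minimal sketch via a direct discrete Fourier transform (in practice one would use an FFT routine; the sine test signal is invented):

```python
import cmath
import math

def periodogram(x):
    """Power at each positive Fourier frequency of an evenly sampled series."""
    n = len(x)
    power = []
    for k in range(n // 2):
        coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        power.append(abs(coeff) ** 2 / n)
    return power

n = 128
# Test signal: a pure sine completing 10 cycles over the observation window.
signal = [math.sin(2 * math.pi * 10 * t / n) for t in range(n)]
power = periodogram(signal)
peak_bin = max(range(len(power)), key=power.__getitem__)   # expected: bin 10
```

The decision-making step then asks whether a given peak stands out against the chi-squared statistics of the noise background, in either a frequentist or a Bayesian framework.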

  4. The Prediction Properties of Inverse and Reverse Regression for the Simple Linear Calibration Problem

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.

    2010-01-01

    The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
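The two calibration strategies can be compared directly in code. The sketch below fits both the classical (forward-then-invert) approach and reverse regression on a small invented standards data set; it illustrates the two approaches and is not the paper's analysis:

```python
def ols(x, y):
    """Return (intercept, slope) of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

standards = [1.0, 2.0, 3.0, 4.0, 5.0]    # known reference values (invented)
readings = [1.1, 2.0, 3.2, 3.9, 5.1]     # instrument responses (invented)

# Classical approach: regress readings on standards, then invert the fit.
a_f, b_f = ols(standards, readings)
classical = lambda y: (y - a_f) / b_f

# Reverse approach: regress standards directly on readings.
a_r, b_r = ols(readings, standards)
reverse = lambda y: a_r + b_r * y

y_new = 2.5                               # a new instrument reading
est_classical = classical(y_new)
est_reverse = reverse(y_new)
```

With well-behaved data the two estimates nearly coincide; the debate in the paper concerns their statistical properties, since reverse regression treats the fixed standards as if they were a random response.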

  5. On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Quadt, Ralf

    1990-10-01

    Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized so as to apply to a large representative class of classical statistical systems.

  6. Investigations on colour dependent photo induced microactuation effect of FSMA and proposing suitable mechanisms to control the effect

    NASA Astrophysics Data System (ADS)

    Bagchi, A.; Sarkar, S.; Mukhopadhyay, P. K.

    2018-02-01

    Three different coloured focused laser beams were used to study the photo induced microactuation effect found in some ferromagnetic shape memory alloys. Besides trying to uncover the basic causes of this unique and as yet unexplained effect, these studies help to find further conditions for characterizing the effect for practical use. In this study, mechanisms are proposed to control the amplitude of actuation of the sample. Actuation of the FSMA sample was controlled both linearly, with a continuously variable neutral density filter, and periodically, with a linear polarizer. Statistical analysis of the experimental data was also performed by applying ANOVA to conclusively provide evidence for the relationship between the actuation of the sample and the various controlling factors. This study is expected to pave the way to exploiting this property of the sample in fabricating and operating useful micro-mechanical systems in the near future.

  8. The physics of lipid droplet nucleation, growth and budding.

    PubMed

    Thiam, Abdou Rachid; Forêt, Lionel

    2016-08-01

    Lipid droplets (LDs) are intracellular oil-in-water emulsion droplets, covered by a phospholipid monolayer and mainly present in the cytosol. Despite their important role in cellular metabolism and a growing number of newly identified functions, the mechanism of LD formation from the endoplasmic reticulum (ER) remains poorly understood. To form an LD, the oil molecules synthesized in the ER accumulate between the monolayer leaflets and induce deformation of the membrane. This formation process works through three steps: nucleation, growth and budding, exactly as in phase separation and dewetting phenomena. These steps involve sequential biophysical membrane remodeling mechanisms, for which we present the underlying basic tools of statistical physics, membrane biophysics, and soft matter science. We aim to highlight relevant factors that could control LD formation size, site and number through this physics description. An emphasis is given to a currently underestimated contribution of the molecular interactions between lipids, which favor an energetically costless mechanism of LD formation. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Basic Engineer Equipment Mechanic.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the skills needed by basic engineer equipment mechanics. Addressed in the four individual units of the course are the following topics: mechanics and their tools (mechanics, hand tools, and power…

  10. 14 CFR 147.23 - Instructor requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... mechanic certificates and ratings that the Administrator determines necessary to provide adequate... mechanics, to teach mathematics, physics, basic electricity, basic hydraulics, drawing, and similar subjects...

  11. 14 CFR 147.23 - Instructor requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... mechanic certificates and ratings that the Administrator determines necessary to provide adequate... mechanics, to teach mathematics, physics, basic electricity, basic hydraulics, drawing, and similar subjects...

  12. 14 CFR 147.23 - Instructor requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... mechanic certificates and ratings that the Administrator determines necessary to provide adequate... mechanics, to teach mathematics, physics, basic electricity, basic hydraulics, drawing, and similar subjects...

  13. 14 CFR 147.23 - Instructor requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... mechanic certificates and ratings that the Administrator determines necessary to provide adequate... mechanics, to teach mathematics, physics, basic electricity, basic hydraulics, drawing, and similar subjects...

  14. 14 CFR 147.23 - Instructor requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... mechanic certificates and ratings that the Administrator determines necessary to provide adequate... mechanics, to teach mathematics, physics, basic electricity, basic hydraulics, drawing, and similar subjects...

  15. Quantifying the transmission potential of pandemic influenza

    NASA Astrophysics Data System (ADS)

    Chowell, Gerardo; Nishiura, Hiroshi

    2008-03-01

    This article reviews quantitative methods to estimate the basic reproduction number of pandemic influenza, a key threshold quantity that helps determine the intensity of interventions required to control the disease. Although it is difficult to assess the transmission potential of a probable future pandemic, historical epidemiologic data are readily available from previous pandemics, and, as a reference for future pandemic planning, mathematical and statistical analyses of historical data are crucial. In particular, because many historical records tend to document only the temporal distribution of cases or deaths (i.e. the epidemic curve), our review focuses on methods to maximize the utility of time-evolution data and to clarify the detailed mechanisms of the spread of influenza. First, we highlight structured epidemic models and their parameter estimation methods, which can quantify detailed disease dynamics, including those we cannot observe directly. Duration-structured epidemic systems are subsequently presented, offering a firm understanding of the definitions of the basic and effective reproduction numbers. When the initial growth phase of an epidemic is investigated, the distribution of the generation time is the key statistical information needed to appropriately estimate the transmission potential from the intrinsic growth rate. Applications of stochastic processes are also highlighted to estimate the transmission potential from similar data. Critically important characteristics of influenza data are subsequently summarized, followed by our conclusions suggesting potential future methodological improvements.
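    The growth-rate approach mentioned above can be made concrete. As an illustration only (not the authors' method), assuming a gamma-distributed generation time with known mean and standard deviation, the Euler-Lotka equation 1/R0 = E[exp(-r*T)] gives a closed form for the basic reproduction number from the intrinsic growth rate r:

```python
def r0_from_growth_rate(r, mean_gt, sd_gt):
    """Basic reproduction number from the intrinsic (exponential) growth
    rate r, assuming (illustratively) a gamma-distributed generation
    time T with the given mean and standard deviation.

    For gamma(shape, scale) the moment generating function yields
        R0 = (1 + r * scale) ** shape.
    """
    shape = (mean_gt / sd_gt) ** 2
    scale = sd_gt ** 2 / mean_gt
    return (1.0 + r * scale) ** shape
```

When sd_gt equals mean_gt the generation time is exponential and the formula collapses to the familiar special case R0 = 1 + r * mean_gt.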

  16. Auto Mechanics. Performance Objectives. Basic Course.

    ERIC Educational Resources Information Center

    Carter, Thomas G., Sr.

    Several intermediate performance objectives and corresponding criterion measures are listed for each of 14 terminal objectives for a basic automotive mechanics course. The materials were developed for a two-semester course (2 hours daily) designed to provide training in the basic fundamentals in diagnosis and repair including cooling system and…

  17. Center for Prostate Disease Research

    MedlinePlus


  18. Basic Aerospace Education Library

    ERIC Educational Resources Information Center

    Journal of Aerospace Education, 1975

    1975-01-01

    Lists the most significant resource items on aerospace education which are presently available. Includes source books, bibliographies, directories, encyclopedias, dictionaries, audiovisuals, curriculum/planning guides, aerospace statistics, aerospace education statistics and newsletters. (BR)

  19. Multiple-solution problems in a statistics classroom: an example

    NASA Astrophysics Data System (ADS)

    Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing

    2017-11-01

    The mathematics education literature shows that encouraging students to develop multiple solutions to given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of a multiple-solution problem in statistics involving a set of non-traditional dice. In particular, we consider the exact probability mass function for the sum of face values. Four different ways of solving the problem are discussed. The solutions span basic concepts in different mathematical disciplines (sample space in probability theory, the probability generating function in statistics, integer partition in basic combinatorics, and the individual risk model in actuarial science) and thus promote upper undergraduate students' awareness of knowledge connections between their courses. All solutions of the example are implemented using the R statistical software package.
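    The generating-function solution can be sketched in a few lines. The paper implements its solutions in R; Python is used here purely for illustration, and the dice below are standard six-sided stand-ins since the paper's non-traditional dice values are not reproduced in this record. The routine convolves the face distributions, which is equivalent to multiplying probability generating functions:

```python
def pmf_of_sum(*dice):
    """Exact probability mass function of the sum of face values.

    Each die is a sequence of equally likely face values. The pmf of
    the sum is built by successive convolution, i.e. multiplication of
    the dice's probability generating functions.
    """
    pmf = {0: 1.0}
    for die in dice:
        p_face = 1.0 / len(die)
        new = {}
        for total, p in pmf.items():
            for face in die:
                new[total + face] = new.get(total + face, 0.0) + p * p_face
        pmf = new
    return pmf
```

For two ordinary dice, pmf_of_sum(range(1, 7), range(1, 7)) recovers the triangular distribution with peak 6/36 at a sum of 7; swapping in non-standard face values requires no change to the code.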

  20. The Statistical Power of Planned Comparisons.

    ERIC Educational Resources Information Center

    Benton, Roberta L.

    Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…
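    The interplay of effect size, sample size, and significance level described above can be made concrete with a minimal sketch (illustrative only; the document's own hypothetical examples are not reproduced here): approximate power of a two-sided one-sample z test for a standardized effect size d.

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def z_quantile(p):
    """Standard normal quantile by bisection (ample accuracy here)."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def power_z(d, n, alpha=0.05):
    """Approximate power of a two-sided one-sample z test with
    standardized effect size d = (mu1 - mu0) / sigma and sample size n."""
    z_crit = z_quantile(1.0 - alpha / 2.0)
    shift = abs(d) * sqrt(n)
    # Rejections in either tail; the second term is usually negligible.
    return (1.0 - norm_cdf(z_crit - shift)) + norm_cdf(-z_crit - shift)
```

With d = 0 the "power" reduces to alpha, as it should; increasing either d or n drives power toward 1, which is the trade-off the hypothetical examples in the document illustrate.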

  1. 14 CFR 147.36 - Maintenance of instructor requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... holding appropriate mechanic certificates and ratings that the Administrator determines necessary to... certificated mechanics to teach mathematics, physics, drawing, basic electricity, basic hydraulics, and similar...

  2. 14 CFR 147.36 - Maintenance of instructor requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... holding appropriate mechanic certificates and ratings that the Administrator determines necessary to... certificated mechanics to teach mathematics, physics, drawing, basic electricity, basic hydraulics, and similar...

  3. 14 CFR 147.36 - Maintenance of instructor requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... holding appropriate mechanic certificates and ratings that the Administrator determines necessary to... certificated mechanics to teach mathematics, physics, drawing, basic electricity, basic hydraulics, and similar...

  4. 14 CFR 147.36 - Maintenance of instructor requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... holding appropriate mechanic certificates and ratings that the Administrator determines necessary to... certificated mechanics to teach mathematics, physics, drawing, basic electricity, basic hydraulics, and similar...

  5. 14 CFR 147.36 - Maintenance of instructor requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... holding appropriate mechanic certificates and ratings that the Administrator determines necessary to... certificated mechanics to teach mathematics, physics, drawing, basic electricity, basic hydraulics, and similar...

  6. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 5 2010-07-01 2010-07-01 false Requests from the Bureau of Labor Statistics for data. 1904... Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses Form from the Bureau of Labor Statistics (BLS), or a BLS designee, you must promptly complete the form...

  7. 78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-06

    ... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...

  8. Hidden Statistics Approach to Quantum Simulations

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2010-01-01

    Recent advances in quantum information theory have inspired an explosion of interest in new quantum algorithms for solving hard computational (quantum and non-quantum) problems. The basic principle of quantum computation is that quantum properties can be used to represent structured data, and that quantum mechanisms can be devised and built to perform operations on these data. Three basic non-classical properties of quantum mechanics (superposition, entanglement, and direct-product decomposability) were the main reasons for optimism about the capabilities of quantum computers, which promised simultaneous processing of large volumes of highly correlated data. Unfortunately, these advantages of quantum mechanics came at a high price. One major problem is keeping the components of the computer in a coherent state, as the slightest interaction with the external world causes the system to decohere. That is why the hardware implementation of a quantum computer remains unsolved. The basic idea of this work is to create a new kind of dynamical system that preserves the three main properties of quantum physics (superposition, entanglement, and direct-product decomposability) while allowing one to measure its state variables using classical methods. In other words, such a system would reinforce the advantages and minimize the limitations of both the quantum and classical aspects. Based upon a concept of hidden statistics, a new kind of dynamical system for simulation of the Schroedinger equation is proposed. The system represents a modified Madelung version of the Schroedinger equation. It preserves superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods. Such an optimal combination of characteristics makes it a perfect match for simulating quantum systems. The model includes a transitional component of the quantum potential (which has been overlooked in previous treatments of the Madelung equation).
The role of the transitional potential is to provide a jump from a deterministic state to a random state with a prescribed probability density. This jump is triggered by a blow-up instability due to violation of the Lipschitz condition generated by the quantum potential. As a result, the dynamics attains quantum properties on a classical scale. The model can be implemented physically as an analog VLSI-based (very-large-scale-integration-based) computer, or numerically on a digital computer. This work opens a way to develop fundamentally new algorithms for quantum simulations of exponentially complex problems, expanding NASA's capabilities for conducting space activities. It has been illustrated that the complexity of simulating particle interactions can be reduced from exponential to polynomial.

  9. Understanding Summary Statistics and Graphical Techniques to Compare Michael Jordan versus LeBron James

    ERIC Educational Resources Information Center

    Williams, Immanuel James; Williams, Kelley Kim

    2016-01-01

    Understanding summary statistics and graphical techniques is a building block to comprehending concepts beyond basic statistics. It is known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.

  10. Financial statistics for public health dispensary decisions in Nigeria: insights on standard presentation typologies.

    PubMed

    Agundu, Prince Umor C

    2003-01-01

    Public health dispensaries in Nigeria have in recent times demonstrated the poise to boost corporate productivity in the new millennium and to drive the nation closer to concretising the lofty goal of health-for-all. This is very pronounced considering the face-lift given to the physical environment, the increase in the recruitment and development of professionals, and the upward review of financial subventions. However, there is little or no emphasis on basic statistical appreciation/application, which enhances the decision-making ability of corporate executives. This study used the responses of 120 senior public health officials in Nigeria and analyzed them with the chi-square statistical technique. The results established low statistical aptitude, inadequate statistical training programmes, and little or no emphasis on statistical literacy compared with computer literacy, among others. Consequently, it was recommended that these lapses be promptly addressed to enhance executive performance in the establishments. Basic statistical data presentation typologies have been articulated in this study to serve as first-aid instructions to the target group, as they represent the contributions of eminent scholars in this area of intellectualism.
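    The chi-square technique applied to the survey responses can be sketched for a simple case. The table below is hypothetical (the study's data are not reproduced in this record); the routine computes the Pearson chi-square statistic for a 2x2 contingency table from its margins:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], with expected counts from the row/column margins.
    Compare against the critical value 3.841 (df = 1, alpha = 0.05)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n
            stat += (obs - exp) ** 2 / exp
    return stat
```

For example, chi_square_2x2([[10, 20], [20, 10]]) exceeds 3.841, so at the 5% level one would reject independence of the two classifications for that hypothetical table.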

  11. Energy transfer mechanism and probability analysis of submarine pipe laterally impacted by dropped objects

    NASA Astrophysics Data System (ADS)

    Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui

    2016-06-01

    The energy transfer ratio is the basic factor affecting the level of pipe damage during an impact between a dropped object and a submarine pipe. For the purpose of studying the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and the normal distribution of the energy transfer ratio caused by lateral impacts on pipes is obtained by statistical analysis of the experimental results based on the effective yield strength, which provides an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulation. In addition, impact contact area and impact time are shown by a sensitivity analysis of the finite element simulation to be the major factors influencing energy transfer.

  12. ONR Ocean Wave Dynamics Workshop

    NASA Astrophysics Data System (ADS)

    In anticipation of the start (in Fiscal Year 1988) of a new Office of Naval Research (ONR) Accelerated Research Initiative (ARI) on Ocean Surface Wave Dynamics, a workshop was held August 5-7, 1986, at Woods Hole, Mass., to discuss new ideas and directions of research. This new ARI on Ocean Surface Wave Dynamics is a 5-year effort that is organized by the ONR Physical Oceanography Program in cooperation with the ONR Fluid Mechanics Program and the Physical Oceanography Branch at the Naval Ocean Research and Development Activity (NORDA). The central theme is improvement of our understanding of the basic physics and dynamics of surface wave phenomena, with emphasis on the following areas: precise air-sea coupling mechanisms; dynamics of nonlinear wave-wave interaction under realistic environmental conditions; wave breaking and dissipation of energy; interaction between surface waves and upper ocean boundary layer dynamics; and surface statistical and boundary layer coherent structures.

  13. Strategy for Promoting the Equitable Development of Basic Education in Underdeveloped Counties as Seen from Cili County

    ERIC Educational Resources Information Center

    Shihua, Peng; Rihui, Tan

    2009-01-01

    Employing statistical analysis, this study has made a preliminary exploration of promoting the equitable development of basic education in underdeveloped counties through the case study of Cili county. The unequally developed basic education in the county has been made clear, the reasons for the inequitable education have been analyzed, and,…

  14. Application of statistical mechanical methods to the modeling of social networks

    NASA Astrophysics Data System (ADS)

    Strathman, Anthony Robert

    With the recent availability of large-scale social data sets, social networks have become open to quantitative analysis via the methods of statistical physics. We examine the statistical properties of a real large-scale social network, generated from cellular phone call-trace logs. We find this network, like many other social networks, to be assortative (r = 0.31) and clustered (i.e., strongly transitive, C = 0.21). We measure fluctuation scaling to identify the presence of internal structure in the network and find that structural inhomogeneity effectively disappears at the scale of a few hundred nodes, though there is no sharp cutoff. We introduce an agent-based model of social behavior, designed to model the formation and dissolution of social ties. The model is a modified Metropolis algorithm containing agents operating under the basic sociological constraints of reciprocity, communication need, and transitivity, and it introduces the concept of a social temperature. We go on to show that this simple model reproduces the global statistical features of the real network data (including assortativity, connected fraction, mean degree, clustering, and mean shortest path length) and undergoes two phase transitions as a function of this social temperature: one from a "gas" to a "liquid" state and a second from a liquid to a glassy state.
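    The assortativity coefficient r quoted above has a compact definition that can be sketched directly (an illustrative stand-alone implementation, not the authors' code): it is the Pearson correlation between the degrees found at the two ends of an edge, taken over both orientations of every undirected edge. Correlation is shift-invariant, so using full degrees rather than the conventional remaining degrees (degree minus one) gives the same value.

```python
def degree_assortativity(edges):
    """Degree assortativity r of an undirected graph given as a list of
    (u, v) edges: the Pearson correlation of end-point degrees over
    both orientations of each edge."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mean = sum(xs) / n  # xs and ys have identical means by symmetry
    cov = sum((x - mean) * (y - mean) for x, y in zip(xs, ys)) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return cov / var
```

A star graph is maximally disassortative (r = -1): every edge joins the high-degree hub to a degree-one leaf. Social networks such as the call network above instead show r > 0, with well-connected people tending to link to each other.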

  15. Educating the Educator: U.S. Government Statistical Sources for Geographic Research and Teaching.

    ERIC Educational Resources Information Center

    Fryman, James F.; Wilkinson, Patrick J.

    Appropriate for college geography students and researchers, this paper briefly introduces basic federal statistical publications and corresponding finding aids. General references include "Statistical Abstract of the United States," and three complementary publications: "County and City Data Book,""State and Metropolitan Area Data Book," and…

  16. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs is also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  17. Statistical Significance Testing in Second Language Research: Basic Problems and Suggestions for Reform

    ERIC Educational Resources Information Center

    Norris, John M.

    2015-01-01

    Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…

  18. Ethical Statistics and Statistical Ethics: Making an Interdisciplinary Module

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.; Nordenhaug, Erik

    2004-01-01

    This article describes an innovative curriculum module the first author created on the two-way exchange between statistics and applied ethics. The module, having no particular mathematical prerequisites beyond high school algebra, is part of an undergraduate interdisciplinary ethics course which begins with a 3-week introduction to basic applied…

  19. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    NASA Astrophysics Data System (ADS)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

    One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process's past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
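    The preserved quantity can be checked directly on small discrete examples. A minimal sketch (illustrative, not the authors' code): mutual information in bits from a joint pmf, plus the coarse-graining that merges x-values sharing the same conditional distribution P(Y|x); merging such values is what a minimal sufficient statistic does, and it leaves I(X;Y) unchanged.

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def merge_x(joint, grouping):
    """Coarse-grain X by the map grouping(x). When grouping merges only
    x-values with identical P(Y|x), it acts as a sufficient statistic
    and the mutual information with Y is preserved."""
    out = {}
    for (x, y), p in joint.items():
        key = (grouping(x), y)
        out[key] = out.get(key, 0.0) + p
    return out
```

In the test below, x = 0 and x = 1 induce the same conditional over Y, so collapsing them changes nothing about I(X;Y), while a perfectly correlated bit pair carries exactly one bit.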

  20. Statistics Canada's Definition and Classification of Postsecondary and Adult Education Providers in Canada. Culture, Tourism and the Centre for Education Statistics. Research Paper. Catalogue no. 81-595-M No. 071

    ERIC Educational Resources Information Center

    Orton, Larry

    2009-01-01

    This document outlines the definitions and the typology now used by Statistics Canada's Centre for Education Statistics to identify, classify and delineate the universities, colleges and other providers of postsecondary and adult education in Canada for which basic enrollments, graduates, professors and finance statistics are produced. These new…

  1. Standardized Curriculum for Diesel Engine Mechanics.

    ERIC Educational Resources Information Center

    Mississippi State Dept. of Education, Jackson. Office of Vocational, Technical and Adult Education.

    Standardized curricula are provided for two courses for the secondary vocational education program in Mississippi: diesel engine mechanics I and II. The eight units in diesel engine mechanics I are as follows: orientation; shop safety; basic shop tools; fasteners; measurement; engine operating principles; engine components; and basic auxiliary…

  2. Building Capacity for Developing Statistical Literacy in a Developing Country: Lessons Learned from an Intervention

    ERIC Educational Resources Information Center

    North, Delia; Gal, Iddo; Zewotir, Temesgen

    2014-01-01

    This paper aims to contribute to the emerging literature on capacity-building in statistics education by examining issues pertaining to the readiness of teachers in a developing country to teach basic statistical topics. The paper reflects on challenges and barriers to building statistics capacity at grass-roots level in a developing country,…

  3. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    PubMed

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  4. EDITORIAL: Fracture: from the atomic to the geophysical scale Fracture: from the atomic to the geophysical scale

    NASA Astrophysics Data System (ADS)

    Bouchaud, Elisabeth; Soukiassian, Patrick

    2009-11-01

    Although fracture is a very common experience in every day life, it still harbours many unanswered questions. New avenues of investigation arise concerning the basic mechanisms leading to deformation and failure in heterogeneous materials, particularly in non-metals. The processes involved are even more complex when plasticity, thermal fluctuations or chemical interactions between the material and its environment introduce a specific time scale. Sub-critical failure, which may be reached at unexpectedly low loads, is particularly important for silicate glasses. Another source of complications originates from dynamic fracture, when loading rates become so high that the acoustic waves produced by the crack interact with the material heterogeneities, in turn producing new waves that modify the propagation. Recent progress in experimental techniques, allowing one to test and probe materials at sufficiently small length or time scales or in three dimensions, has led to a quantitative understanding of the physical processes involved. In parallel, simulations have also progressed, by extending the time and length scales they are able to reach, and thus attaining experimentally accessible conditions. However, one central question remains the inclusion of these basic mechanisms into a statistical description. This is not an easy task, mostly because of the strong stress gradients present at the tip of a crack, and because the averaging of fracture properties over a heterogeneous material, containing more or less brittle phases, requires rare event statistics. Substantial progress has been made in models and simulations based on accurate experiments. From these models, scaling laws have been derived, linking the behaviour at a micro- or even nano-scale to the macroscopic and even to geophysical scales. 
The reviews in this Cluster Issue of Journal of Physics D: Applied Physics cover several of these important topics, including the physical processes in fracture mechanisms, the sub-critical failure issue, the dynamical fracture propagation, and the scaling laws from the micro- to the geophysical scales. Achievements and progress are reported, and the many open questions are discussed, which should provide a sound basis for present and future prospects.

  5. The value of basic research insights into atrial fibrillation mechanisms as a guide to therapeutic innovation: a critical analysis.

    PubMed

    Heijman, Jordi; Algalarrondo, Vincent; Voigt, Niels; Melka, Jonathan; Wehrens, Xander H T; Dobrev, Dobromir; Nattel, Stanley

    2016-04-01

    Atrial fibrillation (AF) is an extremely common clinical problem associated with increased morbidity and mortality. Current antiarrhythmic options include pharmacological, ablation, and surgical therapies, and have significantly improved clinical outcomes. However, their efficacy remains suboptimal, and their use is limited by a variety of potentially serious adverse effects. There is a clear need for improved therapeutic options. Several decades of research have substantially expanded our understanding of the basic mechanisms of AF. Ectopic firing and re-entrant activity have been identified as the predominant mechanisms for arrhythmia initiation and maintenance. However, it has become clear that the clinical factors predisposing to AF and the cellular and molecular mechanisms involved are extremely complex. Moreover, all AF-promoting and maintaining mechanisms are dynamically regulated and subject to remodelling caused by both AF and cardiovascular disease. Accordingly, the initial presentation and clinical progression of AF patients are enormously heterogeneous. An understanding of arrhythmia mechanisms is widely assumed to be the basis of therapeutic innovation, but while this assumption seems self-evident, we are not aware of any papers that have critically examined the practical contributions of basic research into AF mechanisms to arrhythmia management. Here, we review recent insights into the basic mechanisms of AF, critically analyse the role of basic research insights in the development of presently used anti-AF therapeutic options and assess the potential value of contemporary experimental discoveries for future therapeutic innovation. Finally, we highlight some of the important challenges to the translation of basic science findings to clinical application. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  6. Statistical mechanics of neocortical interactions: Stability and duration of the 7±2 rule of short-term-memory capacity

    NASA Astrophysics Data System (ADS)

    Ingber, Lester

    1985-02-01

    This paper is an essential addendum to a previous paper [L. Ingber, Phys. Rev. A 29, 3346 (1984)]. Calculations are presented here to support the claim made in the previous paper that there exists an approximate one-dimensional solution to the two-dimensional neocortical Fokker-Planck equation. This solution is extremely useful, not only for obtaining a closed algebraic expression for the time of first passage, but also for establishing that minima of the associated path-integral stationary Lagrangian are indeed stable points of the transient dynamic system. Also, a relatively nontechnical summary is given of the basic theory.

  7. Wave turbulence

    NASA Astrophysics Data System (ADS)

    Nazarenko, Sergey

    2015-07-01

    Wave turbulence is the statistical mechanics of random waves with a broadband spectrum interacting via non-linearity. To appreciate its difference from non-random, well-tuned coherent waves, one could compare the sound of thunder to a piece of classical music. Wave turbulence is surprisingly common and important in a great variety of physical settings, ranging from the most familiar ocean waves to waves at quantum scales or to much longer waves in astrophysics. We will provide a basic overview of the ideas, approaches and main results of wave turbulence, emphasising the physics of the phenomena and using qualitative descriptions while avoiding, whenever possible, involved mathematical derivations. In particular, dimensional analysis will be used to obtain the key scaling solutions in wave turbulence, the Kolmogorov-Zakharov (KZ) spectra.
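
    As a sketch of the dimensional-analysis route mentioned above (standard textbook reasoning, not a derivation specific to this book), the KZ spectrum for deep-water gravity waves can be recovered as follows. Normalizing energy per unit surface area by the fluid density gives the dimensional assignments used below:

    ```latex
    % Assume the spectrum depends only on the energy flux \varepsilon and gravity g;
    % for the four-wave gravity-wave system the flux enters as \varepsilon^{1/3}:
    E(k) = C\,\varepsilon^{1/3}\, g^{a}\, k^{b},
    \qquad
    [E(k)] = L^{4}T^{-2},\quad [\varepsilon] = L^{3}T^{-3},\quad
    [g] = LT^{-2},\quad [k] = L^{-1}.
    % Matching time powers: -1 - 2a = -2, hence a = 1/2.
    % Matching length powers: 1 + a - b = 4, hence b = -5/2, giving the
    % Zakharov--Filonenko spectrum:
    E(k) = C\,\varepsilon^{1/3}\, g^{1/2}\, k^{-5/2}.
    ```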

  8. The uniform quantized electron gas revisited

    NASA Astrophysics Data System (ADS)

    Lomba, Enrique; Høye, Johan S.

    2017-11-01

    In this article we continue and extend our recent work on the correlation energy of the quantized electron gas of uniform density at temperature T=0. As before, we utilize the methods, properties, and results obtained by means of classical statistical mechanics, which were extended to quantized systems via the Feynman path-integral formalism. The latter translates the quantum problem into a classical polymer problem in four dimensions. Again, the well-known RPA (random phase approximation) is recovered as a basic result, which we then modify and improve upon. Here we analyze the condition of thermodynamic self-consistency. Our numerical calculations exhibit remarkable agreement with the well-known results of a standard parameterization of Monte Carlo correlation energies.

  9. The accuracy of the ATLAS muon X-ray tomograph

    NASA Astrophysics Data System (ADS)

    Avramidou, R.; Berbiers, J.; Boudineau, C.; Dechelette, C.; Drakoulakos, D.; Fabjan, C.; Grau, S.; Gschwendtner, E.; Maugain, J.-M.; Rieder, H.; Rangod, S.; Rohrbach, F.; Sbrissa, E.; Sedykh, E.; Sedykh, I.; Smirnov, Y.; Vertogradov, L.; Vichou, I.

    2003-01-01

    A gigantic detector for the ATLAS project is under construction at CERN for particle physics research at the Large Hadron Collider, which is to be ready by 2006. An X-ray tomograph has been designed, developed and constructed at CERN in order to control the mechanical quality of the ATLAS muon chambers. We reached a measurement accuracy of 2 μm systematic and 2 μm statistical uncertainty in the horizontal and vertical directions over a working area of 220 cm (horizontal) × 60 cm (vertical). Here we describe in detail the basic principle chosen to achieve such good accuracy, and we present key measurement results as a cross-check of this precision.

  10. Linear canonical transformations of coherent and squeezed states in the Wigner phase space. III - Two-mode states

    NASA Technical Reports Server (NTRS)

    Han, D.; Kim, Y. S.; Noz, Marilyn E.

    1990-01-01

    It is shown that the basic symmetry of two-mode squeezed states is governed by the group SP(4) in the Wigner phase space which is locally isomorphic to the (3 + 2)-dimensional Lorentz group. This symmetry, in the Schroedinger picture, appears as Dirac's two-oscillator representation of O(3,2). It is shown that the SU(2) and SU(1,1) interferometers exhibit the symmetry of this higher-dimensional Lorentz group. The mathematics of two-mode squeezed states is shown to be applicable to other branches of physics including thermally excited states in statistical mechanics and relativistic extended hadrons in the quark model.

  11. Auto-Mechanics Course. Bilingual Vocational Instructional Materials.

    ERIC Educational Resources Information Center

    Lopez-Cox, Guadalupe

    This auto-mechanics course, one of a series of bilingual English-Spanish vocational education courses, is designed to introduce the basic skills that an automotive mechanics student should know. It is geared to teach the student basic manipulative skills, safety judgments, proper work habits, desirable attitudes, and proper behavior for initial…

  12. County-by-County Financial and Staffing I-M-P-A-C-T. FY 1994-95 Basic Education Program.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh.

    This publication provides the basic statistics needed to illustrate the impact of North Carolina's Basic Education Program (BEP), an educational reform effort begun in 1985. Over 85% of the positions in the BEP are directly related to teaching and student-related activities. The new BEP programs result in smaller class sizes in kindergartens and…

  13. Grand canonical validation of the bipartite international trade network.

    PubMed

    Straka, Mika J; Caldarelli, Guido; Saracco, Fabio

    2017-08-01

    Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.
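
    The validated-projection idea described above can be sketched in a few lines of Python. Note that this toy uses a crude independence null that preserves only the number of products each country exports; the paper's grand canonical (bipartite configuration model) null also constrains product degrees and is analytically exact. All country and product data here are hypothetical.

    ```python
    import math
    from itertools import combinations

    # Toy binary export data: country -> set of product codes (all hypothetical).
    exports = {
        "A": {1, 2, 3, 4},
        "B": {1, 2, 3},
        "C": {4, 5},
        "D": {5, 6},
    }
    n_prod = len(set().union(*exports.values()))

    def null_link_prob(c1, c2):
        # Crude independence null preserving only row sums; the paper's
        # bipartite configuration model is far more constrained.
        return (len(exports[c1]) / n_prod) * (len(exports[c2]) / n_prod)

    def pvalue_shared(c1, c2):
        # Under the null, the co-export count is Binomial(n_prod, p);
        # report the survival probability P(X >= observed overlap).
        k = len(exports[c1] & exports[c2])
        p = null_link_prob(c1, c2)
        return sum(math.comb(n_prod, j) * p**j * (1 - p) ** (n_prod - j)
                   for j in range(k, n_prod + 1))

    # Keep only country-country links whose overlap is unlikely under the null.
    validated = {pair: pvalue_shared(*pair)
                 for pair in combinations(sorted(exports), 2)
                 if pvalue_shared(*pair) < 0.05}
    ```

    Only statistically significant overlaps survive into the monopartite projection; everything else is treated as noise compatible with the null model.
    
    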

  14. Grand canonical validation of the bipartite international trade network

    NASA Astrophysics Data System (ADS)

    Straka, Mika J.; Caldarelli, Guido; Saracco, Fabio

    2017-08-01

    Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.

  15. Climate Change Conceptual Change: Scientific Information Can Transform Attitudes.

    PubMed

    Ranney, Michael Andrew; Clark, Dav

    2016-01-01

    Of this article's seven experiments, the first five demonstrate that virtually no Americans know the basic global warming mechanism. Fortunately, Experiments 2-5 found that 2-45 min of physical-chemical climate instruction durably increased such understandings. This mechanistic learning, or merely receiving seven highly germane statistical facts (Experiment 6), also increased climate-change acceptance across the liberal-conservative spectrum. However, Experiment 7's misleading statistics decreased such acceptance (and, dramatically, knowledge-confidence). These readily available attitudinal and conceptual changes through scientific information disconfirm what we term "stasis theory"--which some researchers and many laypeople varyingly maintain. Stasis theory subsumes the claim that informing people (particularly Americans) about climate science may be largely futile or even counterproductive--a view that appears historically naïve, suffers from range restrictions (e.g., near-zero mechanistic knowledge), and/or misinterprets some polarization and (noncausal) correlational data. Our studies evidenced no polarizations. Finally, we introduce HowGlobalWarmingWorks.org--a website designed to directly enhance public "climate-change cognition." Copyright © 2016 Cognitive Science Society, Inc.

  16. Universal biology and the statistical mechanics of early life.

    PubMed

    Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid

    2017-12-28

    All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These examples illustrate concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media.This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).

  17. Universal biology and the statistical mechanics of early life

    NASA Astrophysics Data System (ADS)

    Goldenfeld, Nigel; Biancalani, Tommaso; Jafarpour, Farshid

    2017-11-01

    All known life on the Earth exhibits at least two non-trivial common features: the canonical genetic code and biological homochirality, both of which emerged prior to the Last Universal Common Ancestor state. This article describes recent efforts to provide a narrative of this epoch using tools from statistical mechanics. During the emergence of self-replicating life far from equilibrium in a period of chemical evolution, minimal models of autocatalysis show that homochirality would have necessarily co-evolved along with the efficiency of early-life self-replicators. Dynamical system models of the evolution of the genetic code must explain its universality and its highly refined error-minimization properties. These have both been accounted for in a scenario where life arose from a collective, networked phase where there was no notion of species and perhaps even individuality itself. We show how this phase ultimately terminated during an event sometimes known as the Darwinian transition, leading to the present epoch of tree-like vertical descent of organismal lineages. These examples illustrate concrete examples of universal biology: the quest for a fundamental understanding of the basic properties of living systems, independent of precise instantiation in chemistry or other media. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  18. Financial Statistics. Higher Education General Information Survey (HEGIS) [machine-readable data file].

    ERIC Educational Resources Information Center

    Center for Education Statistics (ED/OERI), Washington, DC.

    The Financial Statistics machine-readable data file (MRDF) is a subfile of the larger Higher Education General Information Survey (HEGIS). It contains basic financial statistics for over 3,000 institutions of higher education in the United States and its territories. The data are arranged sequentially by institution, with institutional…

  19. The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.

    ERIC Educational Resources Information Center

    Shatz, Mark A.

    1985-01-01

    A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)
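
    A minimal sketch of the kind of exercise described, with hypothetical wage data: both sides summarize the same numbers, but the mean and the median support different narratives.

    ```python
    # Hypothetical weekly wages for the simulated labor-management dispute.
    wages = [290, 300, 305, 310, 315, 320, 330, 340, 520, 610]

    n = len(wages)
    mean_wage = sum(wages) / n  # 364.0: the figure management might cite
    ordered = sorted(wages)
    median_wage = (ordered[n // 2 - 1] + ordered[n // 2]) / 2  # 317.5: labor's figure
    # Two high earners pull the mean well above the median; each side can
    # truthfully cite a different "typical wage" from the same data set.
    ```
    
    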

  20. Fundamentals of Counting Statistics in Digital PCR: I Just Measured Two Target Copies-What Does It Mean?

    PubMed

    Tzonev, Svilen

    2018-01-01

    Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of detecting and reporting on single-molecule detection. We cover the basics of target quantification and the sources of imprecision. We describe the basic test concepts of sensitivity, specificity, limit of blank, limit of detection, and limit of quantification in the context of dPCR. We provide basic guidelines on how to determine these, how to choose and interpret the operating point, and which factors may influence overall test performance in practice.
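
    The core quantification step in dPCR rests on the standard Poisson relation between the fraction of negative partitions and the mean number of copies per partition. A minimal sketch follows; the partition counts and per-partition volume are hypothetical, and real instruments use their own calibrated values:

    ```python
    import math

    def dpcr_concentration(n_total, n_negative, partition_nl=0.85):
        """Poisson estimate of mean copies per partition and concentration.

        partition_nl (per-partition volume in nanolitres) is a hypothetical
        value chosen for illustration.
        """
        if n_negative == 0:
            raise ValueError("every partition positive: above the dynamic range")
        p_neg = n_negative / n_total
        lam = -math.log(p_neg)               # Poisson: P(0 copies) = exp(-lambda)
        copies_total = lam * n_total         # expected copies loaded into the run
        copies_per_nl = lam / partition_nl   # concentration in the reaction
        return lam, copies_total, copies_per_nl

    # Example: 20,000 partitions, 2,000 positive -> lambda = -ln(0.9) ≈ 0.105
    lam, total_copies, conc = dpcr_concentration(20000, 18000)
    ```

    The same relation underlies the imprecision discussion: at very low lambda (e.g., "I just measured two target copies"), counting noise dominates the estimate.
    
    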

  1. The Energetic Cost of Walking: A Comparison of Predictive Methods

    PubMed Central

    Kramer, Patricia Ann; Sylvester, Adam D.

    2011-01-01

    Background: The energy that animals devote to locomotion has been of intense interest to biologists for decades, and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is “best”, but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are (1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and (2) to investigate to what degree the prediction methods explain the variation in energy expenditure. Methodology/Principal Findings: We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches was assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Conclusion: Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. Although we used modern humans as our model organism, these results can be extended to other species. PMID:21731693
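
    The ordinary-least-squares step of such an analysis can be sketched with hypothetical data: regress measured metabolic cost on a mechanical-energy proxy and ask how much of the variation the proxy explains. All numbers and units below are invented for illustration, not taken from the study.

    ```python
    # Hypothetical inputs: a mechanical-energy proxy (W/kg) and the measured
    # metabolic cost (ml O2/kg/min) for one walker at five speeds.
    mech = [1.2, 1.8, 2.4, 3.1, 3.9]
    vo2 = [9.5, 11.0, 12.8, 14.9, 17.2]

    n = len(mech)
    mean_x = sum(mech) / n
    mean_y = sum(vo2) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(mech, vo2))
    sxx = sum((x - mean_x) ** 2 for x in mech)
    slope = sxy / sxx                    # metabolic cost per unit of proxy
    intercept = mean_y - slope * mean_x  # predicted cost at zero mechanical work
    # R^2: the share of metabolic variation the mechanical proxy explains.
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(mech, vo2))
    ss_tot = sum((y - mean_y) ** 2 for y in vo2)
    r_squared = 1 - ss_res / ss_tot
    ```

    Within-individual fits like this one tend to be tight; the paper's finding is that such models generalize poorly across individuals.
    
    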

  2. Studies of elasticity, sound propagation and attenuation of acoustic modes in granular media: final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makse, Hernan A.; Johnson, David L.

    2014-09-03

    This is the final report describing the results of DOE Grant # DE-FG02-03ER15458, with an original termination date of April 31, 2013, which was extended to April 31, 2014. The goal of this project is to develop a theoretical and experimental understanding of sound propagation, elasticity and dissipation in granular materials. The topic is relevant to the efficient production of hydrocarbons and to identifying and characterizing underground formations for storage of either CO2 or nuclear waste material. Furthermore, understanding the basic properties of acoustic propagation in granular media is of importance not only to the energy industry, but also to the pharmaceutical, chemical and agricultural industries. We employ a set of experimental, theoretical and computational tools to develop a study of acoustics and dissipation in granular media. These include the concept of the effective mass of granular media, normal-mode analysis, statistical mechanics frameworks and numerical simulations based on Discrete Element Methods. Effective mass measurements allow us to study the mechanisms of the elastic response and attenuation of acoustic modes in granular media. We perform experiments and simulations under varying conditions, including humidity and vacuum, and different interparticle force laws to develop a fundamental understanding of the mechanisms of damping and acoustic propagation in granular media. A theoretical statistical approach studies the necessary phase space of configurations in pressure and volume fraction to classify granular materials.

  3. Nurses' foot care activities in home health care.

    PubMed

    Stolt, Minna; Suhonen, Riitta; Puukka, Pauli; Viitanen, Matti; Voutilainen, Päivi; Leino-Kilpi, Helena

    2013-01-01

    This study described the basic foot care activities performed by nurses and factors associated with these in the home care of older people. Data were collected from nurses (n=322) working in nine public home care agencies in Finland using the Nurses' Foot Care Activities Questionnaire (NFAQ). Data were analyzed statistically using descriptive statistics and multivariate linear models. Although some of the basic foot care activities nurses reported using were outdated, the majority of foot care activities were consistent with recommendations in the foot care literature. Longer working experience, referring patients with foot problems to a podiatrist and physiotherapist, and patient education in wart and nail care were associated with a high score for adequate foot care activities. Continuing education should focus on updating basic foot care activities and increasing the use of evidence-based foot care methods. Also, geriatric nursing research should focus on intervention research to improve the use of evidence-based basic foot care activities. Copyright © 2013 Mosby, Inc. All rights reserved.

  4. 75 FR 61219 - Entergy Operations, Inc.; River Bend Station, Unit 1; Environmental Assessment and Finding of No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-04

    ... Emergencies,'' for repair and corrective actions states that two individuals, one Mechanical Maintenance... actions will be taken to ensure basic electrical/I&C tasks can be performed by Mechanical Maintenance personnel. Mechanical Maintenance personnel will receive training in basic electrical and I&C tasks to...

  5. Kuhn's Paradigm and Example-Based Teaching of Newtonian Mechanics.

    ERIC Educational Resources Information Center

    Whitaker, M. A. B.

    1980-01-01

    Makes a recommendation for more direct teaching of the basic principles of mechanics. Contends that students currently learn mechanics in terms of standard examples. This causes difficulty when the student is confronted with a problem that can be solved from basic principles, but which does not fit a standard category. (GS)

  6. Diesel Mechanics. Performance Objectives. Basic Course.

    ERIC Educational Resources Information Center

    Tidwell, Joseph

    Several intermediate performance objectives and corresponding criterion measures are listed for each of 12 terminal objectives for a basic diesel mechanics course. The course is designed as a two-semester (2 hour daily) course for 10th graders interested in being diesel service and repair mechanics; it would serve as the first year of a 3-year…

  7. Insight into others' minds: spatio-temporal representations by intrinsic frame of reference.

    PubMed

    Sun, Yanlong; Wang, Hongbin

    2014-01-01

    Recent research has seen a growing interest in connections between the domains of spatial and social cognition. Much evidence indicates that processes of representing space in distinct frames of reference (FOR) contribute to basic spatial abilities as well as to sophisticated social abilities such as tracking others' intentions and beliefs. It has been argued, however, that belief reasoning in the social domain requires an innately dedicated system and cannot be reduced to low-level encoding of spatial relationships. Here we offer an integrated account advocating the critical roles of spatial representations in an intrinsic frame of reference. By re-examining the results from a spatial task (Tamborello et al., 2012) and a false-belief task (Onishi and Baillargeon, 2005), we argue that spatial and social abilities share a common origin at the level of spatio-temporal association and predictive learning, where multiple FOR-based representations provide the basic building blocks for efficient and flexible partitioning of the environmental statistics. We also discuss neuroscience evidence supporting these mechanisms. We conclude that FOR-based representations may bridge the conceptual as well as the implementation gaps between the burgeoning fields of social and spatial cognition.

  8. An Informed Approach to Improving Quantitative Literacy and Mitigating Math Anxiety in Undergraduates Through Introductory Science Courses

    NASA Astrophysics Data System (ADS)

    Follette, K.; McCarthy, D.

    2012-08-01

    Current trends in the teaching of high school and college science avoid numerical engagement because nearly all students lack basic arithmetic skills and experience anxiety when encountering numbers. Nevertheless, such skills are essential to science and vital to becoming savvy consumers, citizens capable of recognizing pseudoscience, and discerning interpreters of statistics in ever-present polls, studies, and surveys in which our society is awash. Can a general-education collegiate course motivate students to value numeracy and to improve their quantitative skills in what may well be their final opportunity in formal education? We present a tool to assess whether skills in numeracy/quantitative literacy can be fostered and improved in college students through the vehicle of non-major introductory courses in astronomy. Initial classroom applications define the magnitude of this problem and indicate that significant improvements are possible. Based on these initial results we offer this tool online and hope to collaborate with other educators, both formal and informal, to develop effective mechanisms for encouraging all students to value and improve their skills in basic numeracy.

  9. Developing Competency of Teachers in Basic Education Schools

    ERIC Educational Resources Information Center

    Yuayai, Rerngrit; Chansirisira, Pacharawit; Numnaphol, Kochaporn

    2015-01-01

    This study aims to develop competency of teachers in basic education schools. The research instruments included the semi-structured in-depth interview form, questionnaire, program developing competency, and evaluation competency form. The statistics used for data analysis were percentage, mean, and standard deviation. The research found that…

  10. [Influence of geomagnetic storms on the balance of autonomic regulatory mechanisms].

    PubMed

    Chichinadze, G; Tvildiani, L; Kvachadze, I; Tarkhan-Mouravi, I

    2005-09-01

    The investigation aimed to evaluate autonomic regulatory mechanisms in practically healthy persons during geomagnetically quiet periods and during geomagnetic storms. The examinations were conducted among volunteer young men (n=64) 18-22 years of age. Autonomic function was studied on the basis of heart rate variability. Periods were considered geomagnetically quiet when the value of the K-index was no more than 2, and a geomagnetic storm was considered when the value of the index was 5 or more. It was ascertained that in both cases the basic statistical indices of the heart rate were identical. Analysis of the spectral power of R-R intervals made it possible to sort the persons examined into three different groups. The data obtained suggest that geomagnetic storms influence the human organism through excitation of the vagal centers. This phenomenon may be considered a self-regulatory physiologic mechanism of an adaptive character. Analysis of the spectral power of R-R intervals may be considered a sensitive method for the detection of magnetolabile persons.

  11. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Cordoba (Teaching Personnel in Primary Schools. Basic Statistics Series , Level of Education: Cordoba).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cordoba, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  12. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Narino (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Narino).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Narino, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  13. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Cauca (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Cauca).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Cauca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  14. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Caldas (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Caldas).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Caldas, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  15. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Boyaca (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Boyaca).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Boyaca, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  16. Personal Docente del Nivel Primario. Series Estadisticas Basicas, Nivel Educativo: Huila (Teaching Personnel in Primary Schools. Basic Statistics Series, Level of Education: Huila).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teaching personnel working in the elementary schools of Huila, Colombia, between 1958 and 1967. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of the teachers. For overall statistics in…

  17. Health Resources Statistics; Health Manpower and Health Facilities, 1968. Public Health Service Publication No. 1509.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.

    This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…

  18. Personal Docente del Nivel Primario. Series Estadisticas Basicas: Colombia (Teaching Personnel in Primary Schools. Basic Statistics Series: Colombia).

    ERIC Educational Resources Information Center

    Ministerio de Educacion Nacional, Bogota (Colombia). Instituto Colombiano de Pedagogia.

    This document provides statistical data on the distribution and education of teacher personnel working in Colombian elementary schools between 1940 and 1968. The statistics cover the number of men and women, public and private schools, urban and rural location, and the amount of education of teachers. (VM)

  19. Explorations in Statistics: Standard Deviations and Standard Errors

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2008-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This series in "Advances in Physiology Education" provides an opportunity to do just that: we will investigate basic concepts in statistics using the free software package R. Because this series uses R solely as a vehicle…
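
    The series itself uses R, but the central distinction it builds toward can be sketched in Python as well, with hypothetical simulated data: the standard deviation describes the spread of observations, while the standard error of the mean describes the precision of the estimated mean and shrinks like 1/sqrt(n).

    ```python
    import random
    import statistics

    random.seed(0)
    # Hypothetical population: roughly normal, mean 100, SD 15.
    population = [random.gauss(100, 15) for _ in range(100_000)]

    def sample_stats(n):
        s = random.sample(population, n)
        sd = statistics.stdev(s)   # spread of individual observations
        sem = sd / n ** 0.5        # precision of the estimated mean
        return sd, sem

    sd_small, sem_small = sample_stats(25)
    sd_big, sem_big = sample_stats(2500)
    # Growing n leaves the SD near 15 but shrinks the SEM toward zero.
    ```
    
    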

  20. The Precision-Power-Gradient Theory for Teaching Basic Research Statistical Tools to Graduate Students.

    ERIC Educational Resources Information Center

    Cassel, Russell N.

This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs, and to this end they are ordered on a continuum scale in terms of individual…

  1. Estimates of School Statistics, 1971-72.

    ERIC Educational Resources Information Center

    Flanigan, Jean M.

    This report presents public school statistics for the 50 States, the District of Columbia, and the regions and outlying areas of the United States. The text presents national data for each of the past 10 years and defines the basic series of statistics. Tables present the revised estimates by State and region for 1970-71 and the preliminary…

  2. Combining statistical inference and decisions in ecology

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.

    2016-01-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
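The abstract's central object, the loss function, can be sketched numerically. The toy example below is not from the paper; the grid posterior and the loss functions are hypothetical. It picks the action that minimizes posterior expected loss, recovering the standard facts that squared-error loss is minimized near the posterior mean and absolute loss near the posterior median.

```python
import numpy as np

# Hypothetical discrete posterior for a parameter theta on a grid:
# a (truncated) normal shape centered at 0.3 with sd 0.1.
theta = np.linspace(0.0, 1.0, 101)
posterior = np.exp(-0.5 * ((theta - 0.3) / 0.1) ** 2)
posterior /= posterior.sum()

def bayes_estimate(loss):
    """Return the action on the grid minimizing posterior expected loss."""
    risks = [np.sum(posterior * loss(theta, a)) for a in theta]
    return float(theta[int(np.argmin(risks))])

squared = lambda t, a: (t - a) ** 2    # minimized near the posterior mean
absolute = lambda t, a: np.abs(t - a)  # minimized near the posterior median

print(bayes_estimate(squared))   # close to 0.3
print(bayes_estimate(absolute))  # close to 0.3 (symmetric posterior)
```

Swapping in an asymmetric loss (say, one that penalizes overestimating a prescribed fire rotation more heavily than underestimating it) shifts the optimal decision away from the mean, which is the sense in which SDT couples statistical inference to management consequences.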

  3. Illustrating the Basic Functioning of Mass Analyzers in Mass Spectrometers with Ball-Rolling Mechanisms

    ERIC Educational Resources Information Center

    Horikoshi, Ryo; Takeiri, Fumitaka; Mikita, Riho; Kobayashi, Yoji; Kageyama, Hiroshi

    2017-01-01

A unique demonstration with ball-rolling mechanisms has been developed to illustrate the basic principles of mass analyzers as components of mass spectrometers. Three ball-rolling mechanisms mimicking the currently used mass analyzers (i.e., a quadrupole mass filter, a magnetic sector, and a time-of-flight) have been constructed. Each mechanism…
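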

  4. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876

  5. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.

  6. The synaptic maintenance problem: membrane recycling, Ca2+ homeostasis and late onset degeneration

    PubMed Central

    2013-01-01

Most neurons are born with the potential to live for the entire lifespan of the organism. In addition, neurons are highly polarized cells with often long axons, extensively branched dendritic trees and many synaptic contacts. Longevity together with morphological complexity results in a formidable challenge to maintain synapses healthy and functional. This challenge is often evoked to explain adult-onset degeneration in numerous neurodegenerative disorders that result from otherwise divergent causes. However, comparably little is known about the basic cell biological mechanisms that keep normal synapses alive and functional in the first place. How the basic maintenance mechanisms are related to slow adult-onset degeneration in different diseases is largely unclear. In this review we focus on two basic and interconnected cell biological mechanisms that are required for synaptic maintenance: endomembrane recycling and calcium (Ca2+) homeostasis. We propose that subtle defects in these homeostatic processes can lead to late onset synaptic degeneration. Moreover, the same basic mechanisms are hijacked, impaired or overstimulated in numerous neurodegenerative disorders. Understanding the pathogenesis of these disorders requires an understanding of both the initial cause of the disease and the on-going changes in basic maintenance mechanisms. Here we discuss the mechanisms that keep synapses functional over long periods of time with the emphasis on their role in slow adult-onset neurodegeneration. PMID:23829673

  7. Stata companion.

    PubMed

    Brennan, Jennifer Sousa

    2010-01-01

    This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.

  8. System analysis for the Huntsville Operational Support Center distributed computer system

    NASA Technical Reports Server (NTRS)

    Ingels, E. M.

    1983-01-01

A simulation model was developed and programmed in three languages: BASIC, PASCAL, and SLAM. Two of the programs, the BASIC and PASCAL versions, are included in this report; SLAM is not supported by NASA/MSFC facilities and hence was not included. The statistical comparisons of simulations of the same HOSC system configurations are in good agreement with each other and with the operational statistics obtained from HOSC. Three variations of the most recent HOSC configuration were run, and some conclusions were drawn as to system performance under these variations.

  9. GENASIS Basics: Object-oriented utilitarian functionality for large-scale physics simulations (Version 2)

    NASA Astrophysics Data System (ADS)

    Cardall, Christian Y.; Budiardja, Reuben D.

    2017-05-01

    GenASiS Basics provides Fortran 2003 classes furnishing extensible object-oriented utilitarian functionality for large-scale physics simulations on distributed memory supercomputers. This functionality includes physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. This revision -Version 2 of Basics - makes mostly minor additions to functionality and includes some simplifying name changes.

  10. Basic Facts and Figures about the Educational System in Japan.

    ERIC Educational Resources Information Center

    National Inst. for Educational Research, Tokyo (Japan).

    Tables, charts, and graphs convey supporting data that accompany text on various aspects of the Japanese educational system presented in this booklet. There are seven chapters: (1) Fundamental principles of education; (2) Organization of the educational system; (3) Basic statistics of education; (4) Curricula, textbooks, and instructional aids;…

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leppik, P.A.

This paper presents results of a study designed to confirm that the interaction of the neutron flux and the coolant flow plays an important role in the mechanism of high-frequency (HF) resonant instability of the VK-50 boiling water reactor. To do this and to check the working model, signals from probes measuring the flow rate of the coolant and the neutron flux were recorded simultaneously (with the help of a magnetograph) in experiments performed in 1981 on driving the VK-50 reactor into the HF resonant instability regimes. Estimates were then obtained for the statistical characteristics of the pulsations of the flow rate and of the neutron flux, including the cross-correlation functions and coherence functions. The basic results of these studies are reported here.

  12. Rogue Waves in Multi-Ion Cometary Plasmas

    NASA Astrophysics Data System (ADS)

    Sreekala, G.; Manesh, M.; Neethu, T. W.; Anu, V.; Sijo, S.; Venugopal, C.

    2018-01-01

The effect of pair ions on the formation of rogue waves in a six-component plasma composed of two hot and one colder electron component, hot ions, and pair ions is studied. The kappa distribution, which provides an unambiguous replacement for a Maxwellian distribution in space plasmas, is connected with nonextensive statistical mechanics and provides a continuous energy spectrum. Hence, the colder electrons and one component of the hotter electrons are modeled by kappa distributions, and the other hot electron component by a q-nonextensive distribution. It is found that the rogue wave amplitude is different for various pair-ion components. The magnitude, however, increases with increasing spectral index and nonextensive parameter q. These results may be useful in understanding the basic characteristics of rogue waves in cometary plasmas.

  13. Complexity transitions in global algorithms for sparse linear systems over finite fields

    NASA Astrophysics Data System (ADS)

    Braunstein, A.; Leone, M.; Ricci-Tersenghi, F.; Zecchina, R.

    2002-09-01

    We study the computational complexity of a very basic problem, namely that of finding solutions to a very large set of random linear equations in a finite Galois field modulo q. Using tools from statistical mechanics we are able to identify phase transitions in the structure of the solution space and to connect them to the changes in the performance of a global algorithm, namely Gaussian elimination. Crossing phase boundaries produces a dramatic increase in memory and CPU requirements necessary for the algorithms. In turn, this causes the saturation of the upper bounds for the running time. We illustrate the results on the specific problem of integer factorization, which is of central interest for deciphering messages encrypted with the RSA cryptosystem.
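The global algorithm studied in this abstract is ordinary Gaussian elimination, carried out with modular arithmetic instead of floating point. A minimal sketch (assuming a prime modulus q, so every nonzero element has an inverse; the example system is made up):

```python
def solve_mod(A, b, q):
    """Solve A x = b over GF(q) for prime q; return one solution or None."""
    n, m = len(A), len(A[0])
    # Augmented matrix, reduced mod q.
    M = [[x % q for x in row] + [bi % q] for row, bi in zip(A, b)]
    pivots, r = [], 0
    for c in range(m):
        # Find a row with a nonzero entry in column c.
        pr = next((i for i in range(r, n) if M[i][c]), None)
        if pr is None:
            continue
        M[r], M[pr] = M[pr], M[r]
        inv = pow(M[r][c], q - 2, q)          # modular inverse (Fermat)
        M[r] = [x * inv % q for x in M[r]]    # normalize pivot row
        for i in range(n):                    # eliminate column c elsewhere
            if i != r and M[i][c]:
                f = M[i][c]
                M[i] = [(a - f * p) % q for a, p in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    # Inconsistent if a zero row is left with a nonzero right-hand side.
    if any(M[i][m] for i in range(r, n)):
        return None
    x = [0] * m
    for i, c in enumerate(pivots):
        x[c] = M[i][m]
    return x

# 2x + y = 1 and x + y = 2 over GF(5):
print(solve_mod([[2, 1], [1, 1]], [1, 2], 5))  # → [4, 3]
```

Over a prime field every pivot step is exact, so the cost behavior discussed in the abstract comes from fill-in of the sparse system as elimination proceeds, not from numerical error.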

  14. A Meinardus Theorem with Multiple Singularities

    NASA Astrophysics Data System (ADS)

    Granovsky, Boris L.; Stark, Dudley

    2012-09-01

    Meinardus proved a general theorem about the asymptotics of the number of weighted partitions, when the Dirichlet generating function for weights has a single pole on the positive real axis. Continuing (Granovsky et al., Adv. Appl. Math. 41:307-328, 2008), we derive asymptotics for the numbers of three basic types of decomposable combinatorial structures (or, equivalently, ideal gas models in statistical mechanics) of size n, when their Dirichlet generating functions have multiple simple poles on the positive real axis. Examples to which our theorem applies include ones related to vector partitions and quantum field theory. Our asymptotic formula for the number of weighted partitions disproves the belief accepted in the physics literature that the main term in the asymptotics is determined by the rightmost pole.

  15. Single-digit arithmetic processing—anatomical evidence from statistical voxel-based lesion analysis

    PubMed Central

    Mihulowicz, Urszula; Willmes, Klaus; Karnath, Hans-Otto; Klein, Elise

    2014-01-01

    Different specific mechanisms have been suggested for solving single-digit arithmetic operations. However, the neural correlates underlying basic arithmetic (multiplication, addition, subtraction) are still under debate. In the present study, we systematically assessed single-digit arithmetic in a group of acute stroke patients (n = 45) with circumscribed left- or right-hemispheric brain lesions. Lesion sites significantly related to impaired performance were found only in the left-hemisphere damaged (LHD) group. Deficits in multiplication and addition were related to subcortical/white matter brain regions differing from those for subtraction tasks, corroborating the notion of distinct processing pathways for different arithmetic tasks. Additionally, our results further point to the importance of investigating fiber pathways in numerical cognition. PMID:24847238

  16. Annual statistical report 2008 : based on data from CARE/EC

    DOT National Transportation Integrated Search

    2008-10-31

This Annual Statistical Report provides the basic characteristics of road accidents in 19 member states of the European Union for the period 1997-2006, on the basis of data collected and processed in the CARE database, the Community Road Accident...

  17. Country Education Profiles: Algeria.

    ERIC Educational Resources Information Center

    International Bureau of Education, Geneva (Switzerland).

    One of a series of profiles prepared by the Cooperative Educational Abstracting Service, this brief outline provides basic background information on educational principles, system of administration, structure and organization, curricula, and teacher training in Algeria. Statistics provided by the Unesco Office of Statistics show enrollment at all…

  18. 78 FR 23158 - Organization and Delegation of Duties

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-18

    ... management actions of major significance, such as those relating to changes in basic organization pattern... regard to rulemaking, enforcement, vehicle safety research and statistics and data analysis, provides... Administrator for the National Center for Statistics and Analysis, and the Associate Administrator for Vehicle...

  19. English: Basic Mechanics Modules 4 and 5.

    ERIC Educational Resources Information Center

    Pipeline, 1983

    1983-01-01

    "English: Basic Mechanics" is a series of computer-based lessons for the Apple II that allow students to practice applying the fundamentals of English grammar and punctuation. The two newest modules covering use of subordinate clauses and use of subordinate phrases are described. (JN)

  20. Basic mechanisms governing solar-cell efficiency

    NASA Technical Reports Server (NTRS)

    Lindholm, F. A.; Neugroschel, A.; Sah, C. T.

    1976-01-01

    The efficiency of a solar cell depends on the material parameters appearing in the set of differential equations that describe the transport, recombination, and generation of electrons and holes. This paper describes the many basic mechanisms occurring in semiconductors that can control these material parameters.

  1. When Statistical Literacy Really Matters: Understanding Published Information about the HIV/AIDS Epidemic in South Africa

    ERIC Educational Resources Information Center

    Hobden, Sally

    2014-01-01

    Information on the HIV/AIDS epidemic in Southern Africa is often interpreted through a veil of secrecy and shame and, I argue, with flawed understanding of basic statistics. This research determined the levels of statistical literacy evident in 316 future Mathematical Literacy teachers' explanations of the median in the context of HIV/AIDS…

  2. Introduction to Statistics. Learning Packages in the Policy Sciences Series, PS-26. Revised Edition.

    ERIC Educational Resources Information Center

    Policy Studies Associates, Croton-on-Hudson, NY.

    The primary objective of this booklet is to introduce students to basic statistical skills that are useful in the analysis of public policy data. A few, selected statistical methods are presented, and theory is not emphasized. Chapter 1 provides instruction for using tables, bar graphs, bar graphs with grouped data, trend lines, pie diagrams,…

  3. Virtual laboratory learning media development to improve science literacy skills of mechanical engineering students on basic physics concept of material measurement

    NASA Astrophysics Data System (ADS)

    Jannati, E. D.; Setiawan, A.; Siahaan, P.; Rochman, C.

    2018-05-01

This study aims to describe the development of virtual laboratory learning media to improve the science literacy skills of Mechanical Engineering students on basic Physics concepts. A quasi-experimental method was employed in this research. The participants were first-semester mechanical engineering students at Majalengka University. The research instrument was a readability test of the instructional media. The results of the virtual laboratory learning media readability test show an average score of 78.5%, indicating that the developed media are feasible for use in improving the science literacy skills of Mechanical Engineering students at Majalengka University, specifically on the basic Physics concept of material measurement.

  4. Basic Automotive Mechanics. Florida Vocational Program Guide.

    ERIC Educational Resources Information Center

    University of South Florida, Tampa. Dept. of Adult and Vocational Education.

    This program guide identifies primary concerns in the organization, operation, and evaluation of a basic automotive mechanics program. It is designed for local school district and community college administrators, instructors, program advisory committees, and regional coordinating councils. The guide begins with the Dictionary of Occupational…

  5. Mechanical design of deformation compensated flexural pivots structured for linear nanopositioning stages

    DOEpatents

    Shu, Deming; Kearney, Steven P.; Preissner, Curt A.

    2015-02-17

A method and deformation-compensated flexural pivots structured for precision linear nanopositioning stages are provided. A deformation-compensated flexural linear guiding mechanism includes a basic parallel mechanism comprising a U-shaped member and a pair of parallel bars linked to respective pairs of I-link bars, with each of the I-bars coupled by a respective pair of flexural pivots. The basic parallel mechanism includes substantially evenly distributed flexural pivots, minimizing center-shift dynamic errors.

  6. 75 FR 33203 - Funding Formula for Grants to States

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-11

    ... as Social Security numbers, birth dates, and medical data. Docket: To read or download submissions or... Local Area Unemployment Statistics (LAUS), both of which are compiled by DOL's Bureau of Labor Statistics. Specifies how each State's basic JVSG allocation is calculated. Identifies the procedures...

  7. Statistical Considerations for Establishing CBTE Cut-Off Scores.

    ERIC Educational Resources Information Center

    Trzasko, Joseph A.

This report gives the basic definition and purpose of competency-based teacher education (CBTE) cut-off scores. It describes the basic characteristics of CBTE as a yes-no dichotomous decision regarding the presence of a specific ability or knowledge, which necessitates the establishment of a cut-off point to designate competency vs. incompetency on…

  8. ADULT BASIC EDUCATION. PROGRAM SUMMARY.

    ERIC Educational Resources Information Center

    Office of Education (DHEW), Washington, DC.

A brief description is given of the federal Adult Basic Education program, under the Adult Education Act of 1966, at the national and state levels (including Puerto Rico, Guam, American Samoa, and the Virgin Islands) as provided by state education agencies. Statistics for fiscal years 1965 and 1966, and estimates for fiscal year 1967, indicate…

  9. Action Research of Computer-Assisted-Remediation of Basic Research Concepts.

    ERIC Educational Resources Information Center

    Packard, Abbot L.; And Others

    This study investigated the possibility of creating a computer-assisted remediation program to assist students having difficulties in basic college research and statistics courses. A team approach involving instructors and students drove the research into and creation of the computer program. The effect of student use was reviewed by looking at…

  10. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    ERIC Educational Resources Information Center

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  11. 77 FR 37059 - Draft Guidance for Industry on Active Controls in Studies To Demonstrate Effectiveness of a New...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-D-0419... who conduct studies using active controls and have a basic understanding of statistical principles... clinical investigators who conduct studies using active controls and have a basic understanding of...

  12. Development of high temperature nickel-base alloys for jet engine turbine bucket applications

    NASA Technical Reports Server (NTRS)

    Quigg, R. J.; Scheirer, S. T.

    1965-01-01

A program has been initiated to develop a material with superior properties at elevated temperatures for utilization in turbine blade applications. A nickel-base superalloy can provide the necessary high temperature strength by using the maximum capability of the three available strengthening mechanisms - intermetallic gamma prime precipitation (Ni3Al), solid solution strengthening with refractory and precious metals, and stable carbide formations through the addition of strong carbide forming elements. A stress rupture test at 2000 deg F and 15,000 psi was formulated to approximate the desired properties. By adding varying amounts of refractory metals (Mo, W and Ta) it was possible to statistically analyze the effects of each in a basic superalloy composition containing fixed amounts of Co, Cr, C, B, Sr, and Ni at three separate levels of Al and Ta. Metallographic analysis correlated with the mechanical properties of the alloys; those with few strengthening phases were weak and ductile and those with excessive amounts of intermetallic phases present in undesirable morphologies were brittle.

  13. Chromatic stability of acrylic resins of artificial eyes submitted to accelerated aging and polishing.

    PubMed

    Goiato, Marcelo Coelho; Santos, Daniela Micheline dos; Souza, Josiene Firmino; Moreno, Amália; Pesqueira, Aldiéris Alves

    2010-12-01

Esthetics and durability of materials used to fabricate artificial eyes have been an important issue, since artificial eyes are essential to restore esthetics and function, protect the remaining tissues and help with patients' psychological therapy. However, these materials are subject to the degrading effects of environmental agents on the physical properties of the acrylic resin. This study assessed the color stability of acrylic resins used to fabricate sclerae in three basic shades (N1, N2 and N3) when subjected to accelerated aging and to mechanical and chemical polishing. Specimens of each resin were fabricated and submitted to mechanical and chemical polishing. Chromatic analysis was performed before and after accelerated aging through ultraviolet reflection spectrophotometry. All specimens revealed color alteration following polishing and accelerated aging. The resins presented statistically significant chromatic alteration (p<0.01) between the periods of 252 and 1008 h. The two polishing methods showed no significant difference in the color values of the resins.

  14. The Six Core Theories of Modern Physics

    NASA Astrophysics Data System (ADS)

    Stevens, Charles F.

    1996-09-01

    Charles Stevens, a prominent neurobiologist who originally trained as a biophysicist (with George Uhlenbeck and Mark Kac), wrote this book almost by accident. Each summer he found himself reviewing key areas of physics that he had once known and understood well, for use in his present biological research. Since there was no book, he created his own set of notes, which formed the basis for this brief, clear, and self-contained summary of the basic theoretical structures of classical mechanics, electricity and magnetism, quantum mechanics, statistical physics, special relativity, and quantum field theory. The Six Core Theories of Modern Physics can be used by advanced undergraduates or beginning graduate students as a supplement to the standard texts or for an uncluttered, succinct review of the key areas. Professionals in such quantitative sciences as chemistry, engineering, computer science, applied mathematics, and biophysics who need to brush up on the essentials of a particular area will find most of the required background material, including the mathematics.

  15. Investigation of endocrine and immunological response in fat tissue to hyperbaric oxygen administration in rats.

    PubMed

    Şen, H; Erbağ, G; Ovali, M A; Öztopuz, R Ö; Uzun, M

    2016-04-30

Though HBO treatment is becoming more common, its mechanism of action is not fully known. A positive effect of HBO administration on the inflammatory response is thought to be a possible basic mechanism. We therefore aimed to investigate whether the endocrine and immunological response of fat tissue changes in rats given HBO treatment. This research was carried out on Wistar albino rats treated with hyperbaric oxygen therapy. Fat tissue was taken from the abdomen, and gene expression of cytokines and adipokines was analyzed with the real-time PCR method. When the expression of hormones and cytokines in fat tissue was examined, the leptin, visfatin, TNF-α, IL-1β and IL-10 levels in the HBO treatment group were statistically significantly increased compared to the control group (p=0.0313, p=0.0156, p=0.0156, p=0.0156, p=0.0313). In conclusion, our study identified that HBO administration affected the endocrinological functions of fat tissue.

  16. Process, mechanism, and explanation related to externalizing behavior in developmental psychopathology.

    PubMed

    Hinshaw, Stephen P

    2002-10-01

    Advances in conceptualization and statistical modeling, on the one hand, and enhanced appreciation of transactional pathways, gene-environment correlations and interactions, and moderator and mediator variables, on the other, have heightened awareness of the need to consider factors and processes that explain the development and maintenance of psychopathology. With a focus on attentional problems, impulsivity, and disruptive behavior patterns, I address the kinds of conceptual approaches most likely to lead to advances regarding explanatory models in the field. Findings from my own research program on processes and mechanisms reveal both promise and limitations. Progress will emanate from use of genetically informative designs, blends of variable and person-centered research, explicit testing of developmental processes, systematic approaches to moderation and mediation, exploitation of "natural experiments," and the conduct of prevention and intervention trials designed to accentuate explanation as well as outcome. In all, breakthroughs will occur only with advances in translational research-linking basic and applied science-and with the further development of transactional, systemic approaches to explanation.

  17. Combining statistical inference and decisions in ecology.

    PubMed

    Williams, Perry J; Hooten, Mevin B

    2016-09-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
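    Central to SDT is pairing a loss function with posterior beliefs: the Bayes estimator under squared-error loss is the posterior mean, and under absolute-error loss the posterior median. A minimal sketch of Bayesian point estimation in this spirit (the Beta posterior and the loss functions are illustrative assumptions, not taken from the paper):

```python
import random
import statistics

random.seed(42)

# Hypothetical posterior samples for some parameter; a Beta(8, 4)
# posterior is assumed purely for illustration.
posterior = [random.betavariate(8, 4) for _ in range(10_000)]

def bayes_risk(estimate, samples, loss):
    """Posterior expected loss of a candidate point estimate."""
    return sum(loss(estimate, theta) for theta in samples) / len(samples)

squared = lambda a, t: (a - t) ** 2
absolute = lambda a, t: abs(a - t)

# Under squared-error loss the posterior mean minimizes expected loss;
# under absolute-error loss the posterior median does.
mean_est = statistics.fmean(posterior)
median_est = statistics.median(posterior)
```

    Comparing `bayes_risk` for the two candidate estimates under each loss confirms the ordering, which is the sense in which the choice of loss function drives the decision.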

  18. Peers versus professional training of basic life support in Syria: a randomized controlled trial.

    PubMed

    Abbas, Fatima; Sawaf, Bisher; Hanafi, Ibrahem; Hajeer, Mohammad Younis; Zakaria, Mhd Ismael; Abbas, Wafaa; Alabdeh, Fadi; Ibrahim, Nazir

    2018-06-18

    Peer training has been identified as a useful tool for delivering undergraduate training in basic life support (BLS), which is fundamental as an initial response in cases of emergency. This study aimed (1) to evaluate the efficacy of a peer-led model of basic life support training among medical students in their first three years of study, compared to professional-led training, and (2) to assess the efficacy of the course program and students' satisfaction with peer-led training. A randomized controlled trial with blinded assessors was conducted on 72 medical students from the pre-clinical years (1st to 3rd years in Syria) at the Syrian Private University. Students were randomly assigned to a peer-led or a professional-led training group for a one-day course of basic life support skills. Sixty-four students who underwent checklist-based assessment using an objective structured clinical examination (OSCE) design (practical assessment of BLS skills) and answered a BLS knowledge checkpoint questionnaire were included in the analysis. There was no statistically significant difference between the two groups in delivering BLS skills to medical students in practical (P = 0.850) or BLS knowledge questionnaire outcomes (P = 0.900). Both groups showed statistically significant improvement from pre- to post-course assessment in both practical skills and theoretical knowledge (P < 0.001). Students were satisfied with the peer model of training. Peer-led training of basic life support for medical students was beneficial, and it provided a quality of education as effective as training conducted by professionals. This method is applicable and desirable, especially in poor-resource countries and in crisis situations.

  19. An adaptive approach to the dynamic allocation of buffer storage. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Crooke, S. C.

    1970-01-01

    Several strategies for the dynamic allocation of buffer storage are simulated and compared. The basic algorithms investigated, using actual statistics observed in the Univac 1108 EXEC 8 System, include the buddy method and the first-fit method. Modifications are made to the basic methods in an effort to improve and to measure allocation performance. A simulation model of an adaptive strategy is developed which permits interchanging the two different methods, the buddy and the first-fit methods with some modifications. Using an adaptive strategy, each method may be employed in the statistical environment in which its performance is superior to the other method.
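    The two basic strategies compared above differ in how a request maps onto free storage: first-fit scans a free list in address order and splits the first sufficiently large hole, while the buddy method rounds each request up to a power-of-two block size so blocks can later be split and recombined with their "buddies". A minimal sketch of both ideas (data structures simplified; not the EXEC 8 implementation):

```python
def first_fit(free_blocks, request):
    """First-fit: take the first hole big enough, splitting off the remainder.
    free_blocks is a list of (offset, size) pairs kept in address order."""
    for i, (offset, size) in enumerate(free_blocks):
        if size >= request:
            if size == request:
                free_blocks.pop(i)          # hole consumed exactly
            else:
                free_blocks[i] = (offset + request, size - request)
            return offset
    return None                             # no hole large enough

def buddy_block_size(request, min_block=1):
    """Buddy method: round the request up to the next power-of-two size."""
    size = min_block
    while size < request:
        size *= 2
    return size
```

    From a single 100-word hole, for example, `first_fit(free, 30)` returns offset 0 and leaves a (30, 70) hole, while the buddy method would serve the same 30-word request from a 32-word block, trading internal fragmentation for fast recombination.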

  20. Ultrasound Dopplerography of abdomen pathology using statistical computer programs

    NASA Astrophysics Data System (ADS)

    Dmitrieva, Irina V.; Arakelian, Sergei M.; Wapota, Alberto R. W.

    1998-04-01

    Modern ultrasound Dopplerography offers great possibilities for investigating hemodynamic changes at all stages of abdominal pathology. Many studies have been devoted to the use of noninvasive methods in practical medicine, and ultrasound Dopplerography is now one of the basic ones. We investigated 250 patients aged 30 to 77, including 149 men and 101 women. The basic diagnosis of all patients was ischemic pancreatitis. Secondary diagnoses included ischemic heart disease, hypertension, atherosclerosis, diabetes, and vascular disease of the extremities. We examined the abdominal aorta and its branches: arteria mesenterica superior (AMS), truncus coeliacus (TC), arteria hepatica communis (AHC), and arteria lienalis (AL). For the investigation we used the following equipment: ACUSON 128 XP/10c, BIOMEDIC, GENERAL ELECTRIC (USA, Japan). We analyzed the following components of hemodynamic change in the abdominal vessels: index of pulsation, index of resistance, systolic-diastolic ratio, and speed of blood circulation. The statistical programs included 'basic statistics' and an analytic program. In conclusion, we determined that all hemodynamic components of the abdominal vessels showed considerably greater changes in abdominal ischemia than in the normal situation. Using the computer program to define the degree of hemodynamic change, we can recommend an individual plan of diagnostics and treatment.
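    The vessel indices named in this abstract have standard textbook definitions in terms of peak systolic velocity (PSV), end-diastolic velocity (EDV), and time-averaged mean velocity, all read off the Doppler waveform. A sketch (the velocity values in the tests are invented for illustration):

```python
def pulsatility_index(psv, edv, v_mean):
    """Gosling pulsatility index: (PSV - EDV) / time-averaged mean velocity."""
    return (psv - edv) / v_mean

def resistive_index(psv, edv):
    """Pourcelot resistive index: (PSV - EDV) / PSV."""
    return (psv - edv) / psv

def systolic_diastolic_ratio(psv, edv):
    """Simple ratio of peak systolic to end-diastolic velocity."""
    return psv / edv
```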

  1. Resilience Among Students at the Basic Enlisted Submarine School

    DTIC Science & Technology

    2016-12-01

    reported resilience. The Hayes' macro in the Statistical Package for the Social Sciences (SPSS) was used to uncover factors relevant to mediation analysis. Findings suggest that the encouragement of...to Stressful Experiences Scale RTC Recruit Training Command SPSS Statistical Package for the Social Sciences SS Social Support SWB Subjective Well

  2. Multi-level modeling of total ionizing dose in a-silicon dioxide: First principles to circuits

    NASA Astrophysics Data System (ADS)

    Nicklaw, Christopher J.

    Oxygen vacancies have long been known to be the dominant intrinsic defect in amorphous SiO2. They exist in thermal oxides as neutral defects, in concentrations dependent on processing conditions and with some spatial and energy distribution, without usually causing significant deleterious effects. During irradiation they can capture holes and become positively charged E'-centers, contributing to device degradation. Over the years, a considerable database has been amassed on the dynamics of E'-centers in bulk SiO2 films and near the interface under different irradiation and annealing conditions. Theoretical calculations so far have revealed the basic properties of prototype oxygen vacancies, primarily as they behave either in a crystalline quartz environment or in small clusters that serve as a substitute for a real amorphous structure. To date at least three categories of E'-centers, existing at or above room temperature, have been observed in SiO2. The unifying feature is an unpaired electron on a threefold-coordinated silicon atom, having the form O3 ≡ Si·. Feigl et al. identified the E'1-center in crystalline quartz as a trapped hole on an oxygen vacancy, which causes an asymmetrical relaxation, resulting in a paramagnetic center. The unpaired electron in the E'1-center is localized on the threefold-coordinated Si atom, while the hole is localized on the other Si atom. Results from an ab initio statistical simulation of the behavior of oxygen vacancies within amorphous structures identify a new form of the E'-center, the E'g5, and help in understanding the underlying physical mechanisms involved in switched-bias annealing and electron paramagnetic resonance (EPR) studies. The results also suggest that a common border trap, induced by trapped holes in SiO2, is a hole trapped at an oxygen vacancy defect, which can be compensated by an electron, as originally proposed by Lelis and co-workers at Harry Diamond Laboratories.
This dissertation provides new insights into the basic mechanisms of a-SiO2 defects, and provides a link between basic mechanisms and Electronic Design Automation (EDA) tools, providing an enhanced design flow for radiation-resistant electronics.

  3. A Simple Statistical Thermodynamics Experiment

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2010-01-01

    Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
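    The two-dice version of this exercise is easy to reproduce in code: each ordered pair of faces is a microstate, each total a macrostate, and a macrostate's multiplicity counts its microstates. A sketch:

```python
from collections import Counter
from itertools import product
from math import log

# Microstate: an ordered pair of faces. Macrostate: the total shown.
multiplicity = Counter(sum(roll) for roll in product(range(1, 7), repeat=2))

# Boltzmann-style entropy per macrostate, in units of k: S = ln(multiplicity).
entropy = {total: log(omega) for total, omega in multiplicity.items()}
```

    The total 7 has the largest multiplicity (6 of the 36 microstates) and hence the highest entropy, while 2 and 12 each correspond to a single microstate; the most probable macrostate is the most "disordered" one.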

  4. 76 FR 41756 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... materials and supplies used in production. The economic census will produce basic statistics by kind of business on number of establishments, sales, payroll, employment, inventories, and operating expenses. It also will yield a variety of subject statistics, including sales by product line; sales by class of...

  5. Descriptive Statistics: Reporting the Answers to the 5 Basic Questions of Who, What, Why, When, Where, and a Sixth, So What?

    PubMed

    Vetter, Thomas R

    2017-11-01

    Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. 
In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
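    The pairings the tutorial describes (standard deviation reported with the mean, interquartile range with the median) and a large-sample confidence interval can be computed with Python's standard library alone; the data values here are invented for illustration:

```python
import math
import statistics

data = [2.1, 3.4, 2.8, 4.0, 3.1, 2.5, 3.9, 3.3]   # hypothetical scores

mean = statistics.fmean(data)
median = statistics.median(data)
sd = statistics.stdev(data)                  # reported alongside the mean
q1, _, q3 = statistics.quantiles(data, n=4)
iqr = q3 - q1                                # reported alongside the median

# Approximate 95% CI for the mean (large-sample z = 1.96; a small sample
# like this one would properly call for a t critical value instead).
se = sd / math.sqrt(len(data))
ci = (mean - 1.96 * se, mean + 1.96 * se)
```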

  6. `New insight into statistical hydrology' preface to the special issue

    NASA Astrophysics Data System (ADS)

    Kochanek, Krzysztof

    2018-04-01

    Statistical methods are still the basic tool for investigating random, extreme events occurring in the hydrosphere. On 21-22 September 2017, the international workshop on Statistical Hydrology (StaHy) 2017 took place in Warsaw (Poland) under the auspices of the International Association of Hydrological Sciences. The authors of the presentations proposed to publish their research results in the Special Issue of Acta Geophysica, 'New Insight into Statistical Hydrology'. Five papers were selected for publication, touching on the most crucial issues of statistical methodology in hydrology.

  7. An engineering, multiscale constitutive model for fiber-forming collagen in tension.

    PubMed

    Annovazzi, Lorella; Genna, Francesco

    2010-01-01

    This work proposes a nonlinear constitutive model for a single collagen fiber. Fiber-forming collagen can exhibit different hierarchies of basic units, called fascicles, bundles, fibrils, microfibrils, and so forth, down to the molecular (tropocollagen) level. Exploiting the fact that at each hierarchy level the microstructure can be seen, at least approximately, as that of a wavy, or crimped, extensible cable, the proposed stress-strain model considers a given number of levels, each of which contributes to the overall mechanical behavior according to its own geometrical features (crimp, or waviness), as well as to the basic mechanical properties of the tropocollagen. The crimp features at all levels are assumed to be random variables, whose statistical integration furnishes a stress-strain curve for a collagen fiber. The soundness of this model-the first, to the Authors' knowledge, to treat a single collagen fiber as a microstructured nonlinear structural element-is checked by its application to collagen fibers for which experimental results are available: rat tail tendon, periodontal ligament, and engineered ones. Here, no attempt is made to obtain a stress-strain law for generic collagenous tissues, which exhibit specific features, often much more complex than those of a single fiber. However, it is trivial to observe that the availability of a sound, microstructurally based constitutive law for a single collagen fiber (but applicable at any sub-level, or to any other material with a similar microstructure) is essential for assembling complex constitutive models for any collagenous fibrous tissue.
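    The recruitment idea behind such crimped-cable models can be illustrated with a toy Monte Carlo: each sub-unit carries linear-elastic load only once the applied strain exceeds its random crimp slack, and averaging over sub-units produces the characteristic nonlinear toe region. The sketch below is illustrative only, with an assumed uniform crimp distribution, and is not the constitutive law proposed in the paper:

```python
import random

def fiber_stress(strain, crimp_strains, k=1.0):
    """Population-averaged stress of a bundle of crimped sub-units:
    a sub-unit with crimp slack c bears k*(strain - c) once strain > c,
    and nothing before. k is an arbitrary sub-unit stiffness."""
    total = sum(k * (strain - c) for c in crimp_strains if strain > c)
    return total / len(crimp_strains)

random.seed(1)
# Random crimp slacks, uniform on [0, 5%] strain (an assumed distribution).
crimps = [random.uniform(0.0, 0.05) for _ in range(1000)]
```

    Plotting `fiber_stress` over increasing strain shows the tangent stiffness rising as more sub-units are recruited, then becoming constant once every sub-unit is engaged.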

  8. Diesel Mechanics: Fundamentals.

    ERIC Educational Resources Information Center

    Foutes, William; And Others

    This publication is the first in a series of three texts for a diesel mechanics curriculum. Its purpose is to teach the basic concepts related to employment in a diesel trade. Six sections contain 29 units. Each instructional unit includes some or all of these basic components: unit and specific (performance) objectives, suggested activities for…

  9. Basic Gasoline Engine Mechanics. Florida Vocational Program Guide.

    ERIC Educational Resources Information Center

    University of South Florida, Tampa. Dept. of Adult and Vocational Education.

    This packet contains a program guide and Career Merit Achievement Plan (Career MAP) for the implementation of a basic gasoline engine mechanics program in Florida secondary and postsecondary schools. The program guide describes the program content and structure, provides a program description, lists job titles under the program, and includes a…

  10. Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method

    ERIC Educational Resources Information Center

    Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo

    2012-01-01

    This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)

  11. Air Pollution.

    ERIC Educational Resources Information Center

    Scorer, Richard S.

    The purpose of this book is to describe the basic mechanisms whereby pollution is transported and diffused in the atmosphere. It is designed to give practitioners an understanding of basic mechanics and physics so they may have a correct basis on which to formulate their decisions related to practical air pollution control problems. Since many…

  12. Tendency to occupy a statistically dominant spatial state of the flow as a driving force for turbulent transition.

    PubMed

    Chekmarev, Sergei F

    2013-03-01

    The transition from laminar to turbulent fluid motion occurring at large Reynolds numbers is generally associated with the instability of the laminar flow. On the other hand, since the turbulent flow characteristically appears in the form of spatially localized structures (e.g., eddies) filling the flow field, a tendency to occupy such a structured state of the flow cannot be ruled out as a driving force for turbulent transition. To examine this possibility, we propose a simple analytical model that treats the flow as a collection of localized spatial structures, each of which consists of elementary cells in which the behavior of the particles (atoms or molecules) is uncorrelated. This allows us to introduce the Reynolds number, associating it with the ratio between the total phase volume for the system and that for the elementary cell. Using the principle of maximum entropy to calculate the most probable size distribution of the localized structures, we show that as the Reynolds number increases, the elementary cells group into the localized structures, which successfully explains turbulent transition and some other general properties of turbulent flows. An important feature of the present model is that a bridge between the spatial-statistical description of the flow and hydrodynamic equations is established. We show that the basic assumptions underlying the model, i.e., that the particles are indistinguishable and elementary volumes of phase space exist in which the state of the particles is uncertain, are involved in the derivation of the Navier-Stokes equation. Taking into account that the model captures essential features of turbulent flows, this suggests that the driving force for the turbulent transition is basically the same as in the present model, i.e., the tendency of the system to occupy a statistically dominant state plays a key role. 
The instability of the flow at high Reynolds numbers can then be a mechanism to initiate structural rearrangement of the flow to find this state.

  13. Statistical mechanics of human resource allocation

    NASA Astrophysics Data System (ADS)

    Inoue, Jun-Ichi; Chen, He

    2014-03-01

    We provide a mathematical platform to investigate the network topology of agents, say, university graduates who are looking for their positions in labor markets. The basic model is described by the so-called Potts spin glass which is well-known in the research field of statistical physics. In the model, each Potts spin (a tiny magnet in atomic scale length) represents the action of each student, and it takes a discrete variable corresponding to the company he/she applies for. We construct the energy to include three distinct effects on the students' behavior, namely, collective effect, market history and international ranking of companies. In this model system, the correlations (the adjacent matrix) between students are taken into account through the pairwise spin-spin interactions. We carry out computer simulations to examine the efficiency of the model. We also show that some chiral representation of the Potts spin enables us to obtain some analytical insights into our labor markets. This work was financially supported by Grant-in-Aid for Scientific Research (C) of Japan Society for the Promotion of Science No. 25330278.
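    The energy minimized in such a model can be sketched for a toy case: each "spin" is the index of the company a student applies to, a ferromagnetic pairwise term captures the collective effect (students imitating each other), and a field term rewards highly ranked companies. The form and coefficients below are illustrative and are not the paper's exact Hamiltonian:

```python
def labor_market_energy(choices, ranking, J=1.0, h=1.0):
    """Toy Potts-style energy: energy is lowered when students agree with
    each other (collective effect) and when they pick well-ranked companies.
    choices[i] is the company index chosen by student i; ranking[c] is a
    score for company c (higher = better)."""
    n = len(choices)
    pair_term = -J * sum(1 for i in range(n) for j in range(i + 1, n)
                         if choices[i] == choices[j])
    field_term = -h * sum(ranking[c] for c in choices)
    return pair_term + field_term
```

    A Monte Carlo simulation over such an energy (the approach the abstract describes) would then sample low-energy configurations, i.e. likely application patterns.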

  14. Health care research by degrees N Reid Health care research by degrees Blackwell Scientific 162pp £13.99 0632-03466-1 [Formula: see text].

    PubMed

    1993-03-03

    Inadequately understood statistics so often cloud both the argument of the researcher and the judgement of the reader. Norma Reid brings a refreshing clarity to a complex topic; she takes the mystification and mystique out of statistics. Her basic premise that theory ought to be based on practical utility and relevance shines through her text and helps to make the subject accessible to clinicians who want to understand the underpinnings of their practice. Research methods, particularly qualitative approaches, are sketchily dealt with when compared with the wealth of detail on the mechanics of computing. Also, it is awkward to find methods and analysis not clearly separated in places (eg, Delphi studies), but ample references direct the reader to more expansive sources. Any attempt to steer the uninitiated through the minefields of computing is fraught with difficulties, and some will be disappointed to find one system used exclusively, but, perhaps, it serves as an illustration rather than a course to be slavishly followed.

  15. Compressor seal rub energetics study

    NASA Technical Reports Server (NTRS)

    Laverty, W. F.

    1978-01-01

    The rub mechanics of compressor abradable blade tip seals at simulated engine conditions were investigated. Twelve statistically planned, instrumented rub tests were conducted with titanium blades and Feltmetal fibermetal rubstrips. The tests were conducted with single stationary blades rubbing against seal material bonded to rotating test disks. The instantaneous rub torque, speed, incursion rate and blade temperatures were continuously measured and recorded. Basic rub parameters (incursion rate, rub depth, abradable density, blade thickness and rub velocity) were varied to determine the effects on rub energy and heat split between the blade, rubstrip surface and rub debris. The test data was reduced, energies were determined and statistical analyses were completed to determine the primary and interactive effects. Wear surface morphology, profile measurements and metallographic analysis were used to determine wear, glazing, melting and material transfer. The rub energies for these tests were most significantly affected by the incursion rate while rub velocity and blade thickness were of secondary importance. The ratios of blade wear to seal wear were representative of those experienced in engine operation of these seal system materials.

  16. The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis

    ERIC Educational Resources Information Center

    Buri, Olga Elizabeth Minchala; Stefos, Efstathios

    2017-01-01

    The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and multidimensional statistical analysis was carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…

  17. Improving Attendance and Punctuality of FE Basic Skill Students through an Innovative Scheme

    ERIC Educational Resources Information Center

    Ade-Ojo, Gordon O.

    2005-01-01

    This paper reports the findings of a study set up to establish the impact of a particular scheme on the attendance and punctuality performance of a group of Basic Skills learners against the backdrop of various theoretical postulations on managing undesirable behavior. Data collected on learners' performance was subjected to statistical analysis…

  18. Statistical Match of the VA 1979-1980 Recipient File against the 1979-1980 Basic Grant Recipient File. Revised.

    ERIC Educational Resources Information Center

    Applied Management Sciences, Inc., Silver Spring, MD.

    The amount of misreporting of Veterans Administration (VA) benefits was assessed, along with the impact of misreporting on the Basic Educational Opportunity Grant (BEOG) program. Accurate financial information is needed to determine appropriate awards. The analysis revealed: over 97% of VA beneficiaries misreported benefits; the total net loss to…

  19. An Inspection on the Gini Coefficient of the Budget Educational Public Expenditure per Student for China's Basic Education

    ERIC Educational Resources Information Center

    Yingxiu, Yang

    2006-01-01

    Using statistical data on the implementing conditions of China's educational expenditure published by the state, this paper studies the Gini coefficient of the budget educational public expenditure per student in order to examine the concentration degree of the educational expenditure for China's basic education and analyze its balanced…

  20. Trees for Ohio

    Treesearch

    Ernest J. Gebhart

    1980-01-01

    Other members of this panel are going to reveal the basic statistics about the coal strip mining industry in Ohio so I will confine my remarks to the revegetation of the spoil banks. So it doesn't appear that Ohio confined its tree planting efforts to spoil banks alone, I will rely on a few statistics.

  1. Idaho State University Statistical Portrait, Academic Year 1998-1999.

    ERIC Educational Resources Information Center

    Idaho State Univ., Pocatello. Office of Institutional Research.

    This report provides basic statistical data for Idaho State University, and includes both point-of-time data as well as trend data. The information is divided into sections emphasizing students, programs, faculty and staff, finances, and physical facilities. Student data includes enrollment, geographical distribution, student/faculty ratios,…

  2. Statistical Report. Fiscal Year 1995: September 1, 1994 - August 31, 1995.

    ERIC Educational Resources Information Center

    Texas Higher Education Coordinating Board, Austin.

    This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1995. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1990-94 headcount data; headcount by classification, ethnic origin,…

  3. Statistical Report. Fiscal Year 1994: September 1, 1993 - August 31, 1994.

    ERIC Educational Resources Information Center

    Texas Higher Education Coordinating Board, Austin.

    This report provides statistical data on Texas public and independent higher education institutions for fiscal year 1994. An introductory section provides basic information on Texas higher education institutions, while nine major sections cover: (1) student enrollment, including 1989-93 headcount data; headcount by classification, ethnic origin,…

  4. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...

  5. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...

  6. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...

  7. 29 CFR 1904.42 - Requests from the Bureau of Labor Statistics for data.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR RECORDING AND REPORTING OCCUPATIONAL INJURIES AND ILLNESSES Reporting Fatality, Injury and Illness Information to the Government § 1904.42 Requests from the Bureau of Labor Statistics for data. (a) Basic requirement. If you receive a Survey of Occupational Injuries and Illnesses...

  8. Theoretical Frameworks for Math Fact Fluency

    ERIC Educational Resources Information Center

    Arnold, Katherine

    2012-01-01

    Recent education statistics indicate persistent low math scores for our nation's students. This drop in math proficiency includes deficits in basic number sense and automaticity of math facts. The decrease has been recorded across all grade levels with the elementary levels showing the greatest loss (National Center for Education Statistics,…

  9. Basic Statistical Concepts and Methods for Earth Scientists

    USGS Publications Warehouse

    Olea, Ricardo A.

    2008-01-01

    INTRODUCTION Statistics is the science of collecting, analyzing, interpreting, modeling, and displaying masses of numerical data primarily for the characterization and understanding of incompletely known systems. Over the years, these objectives have led to a fair amount of analytical work to achieve, substantiate, and guide descriptions and inferences.

  10. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    NASA Astrophysics Data System (ADS)

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

    In this paper we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as a measure of statistical literacy. We also introduce a new teaching method in the elementary statistics class: unlike the traditional elementary statistics course, we use a simulation-based inference method to conduct hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
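    Simulation-based inference replaces distribution-table lookups with resampling, and the two-sample permutation test is the canonical classroom example: shuffle the group labels, recompute the statistic, and report the fraction of shuffles at least as extreme as the observed one. A minimal sketch (the group data in the tests are invented):

```python
import random

def permutation_test(a, b, n_perm=10_000, seed=0):
    """Two-sample permutation test for a difference in means.
    Returns the fraction of label shuffles whose absolute mean
    difference is at least as large as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / n_perm
```

    With identical groups the observed difference is zero and every shuffle is "at least as extreme", so the p-value is 1; with well-separated groups only the rare shuffles reproducing the exact split qualify, giving a small p-value.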

  11. Badminton--Teaching Concepts.

    ERIC Educational Resources Information Center

    Gibbs, Marilyn J.

    1988-01-01

    Teaching four basic badminton concepts along with the usual basic skill shots allows players to develop game strategy awareness as well as mechanical skills. These four basic concepts are: (1) ready position, (2) flight trajectory, (3) early shuttle contact, and (4) camouflage. (IAH)

  12. Rebuilding Government Legitimacy in Post-conflict Societies: Case Studies of Nepal and Afghanistan

    DTIC Science & Technology

    2015-09-09

    administered via the verbal scales due to reduced time spent explaining the visual show cards. Statistical results corresponded with observations from...a three-step strategy for dealing with item non-response. First, basic descriptive statistics are calculated to determine the extent of item...descriptive statistics for all items in the survey), however this section of the report highlights just some of the findings. Thus, the results

  13. Biostatistical and medical statistics graduate education

    PubMed Central

    2014-01-01

    The development of graduate education in biostatistics and medical statistics is discussed in the context of training within a medical center setting. The need for medical researchers to employ a wide variety of statistical designs in clinical, genetic, basic science and translational settings justifies the ongoing integration of biostatistical training into medical center educational settings and informs its content. The integration of large data issues is a challenge. PMID:24472088

  14. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R

    PubMed Central

    Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, no efficient and easy-to-use toolkit is yet available that focuses exclusively on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which can be categorized into three classes: (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to existing computer tools, PopSc was designed to directly accept intermediate metadata, such as allele frequencies, rather than raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we provide a Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages for population genetics analysis. PMID:27792763

  15. PopSc: Computing Toolkit for Basic Statistics of Molecular Population Genetics Simultaneously Implemented in Web-Based Calculator, Python and R.

    PubMed

    Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia

    2016-01-01

    Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, no efficient and easy-to-use toolkit is yet available that focuses exclusively on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which can be categorized into three classes: (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to existing computer tools, PopSc was designed to directly accept intermediate metadata, such as allele frequencies, rather than raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we provide a Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages for population genetics analysis.
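    The kind of calculation PopSc automates, working directly from allele frequencies rather than raw sequences, can be illustrated with two textbook statistics. The function names below are our own illustrations, not PopSc's actual API: Nei's expected heterozygosity (He = 1 - Σp²) and Wright's Fst = (Ht - Hs)/Ht for a single locus with equally weighted populations.

```python
def expected_heterozygosity(freqs):
    """Nei's gene diversity at one locus: He = 1 - sum(p_i^2)."""
    return 1.0 - sum(p * p for p in freqs)

def fst_from_frequencies(pops):
    """Wright's Fst = (Ht - Hs) / Ht at one locus.

    `pops` is a list of per-population allele-frequency lists (equal
    population weights assumed); Ht is computed from the mean allele
    frequencies, Hs is the mean within-population heterozygosity.
    """
    n_alleles = len(pops[0])
    mean_freqs = [sum(pop[i] for pop in pops) / len(pops)
                  for i in range(n_alleles)]
    h_t = expected_heterozygosity(mean_freqs)
    h_s = sum(expected_heterozygosity(pop) for pop in pops) / len(pops)
    return (h_t - h_s) / h_t
```

    Two populations fixed for alternate alleles give Fst = 1, the maximum possible differentiation.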

  16. Orientation, Sketching, Mechanical Drawing, Drafting--Basic: 9253.01.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    The course introduces the student to the drafting trade, freehand sketching, and basic mechanical drawing. The course has no prerequisites and will guide the student into drafting concepts and serve as a foundation for further study in vocational drafting. Requiring a total of 45 class hours, eight hours are utilized in orientation, 15 hours are…

  17. Gasoline Engine Mechanics. Performance Objectives. Basic Course.

    ERIC Educational Resources Information Center

    Jones, Marion

    Several intermediate performance objectives and corresponding criterion measures are listed for each of five terminal objectives presented in this curriculum guide for a basic gasoline engine mechanics course at the secondary level. (For the intermediate course guide see CE 010 946.) The materials were developed for a two semester (2 hours daily)…

  18. Cloud tracing: Visualization of the mixing of fluid elements in convection-diffusion systems

    NASA Technical Reports Server (NTRS)

    Ma, Kwan-Liu; Smith, Philip J.

    1993-01-01

    This paper describes a highly interactive method for computer visualization of the basic physical process of dispersion and mixing of fluid elements in convection-diffusion systems. It is based on transforming the vector field from a traditional Eulerian reference frame into a Lagrangian reference frame. Fluid elements are traced through the vector field to obtain both the mean path and the statistical dispersion of the fluid elements about the mean position, using added scalar information about the root mean square value of the vector field and its Lagrangian time scale. In this way, clouds of fluid elements are traced, not just mean paths. We have used this method to visualize the simulation of an industrial incinerator to help identify mechanisms for poor mixing.

  19. Fluctuations of conserved charges in relativistic heavy ion collisions: An introduction

    NASA Astrophysics Data System (ADS)

    Asakawa, Masayuki; Kitazawa, Masakiyo

    2016-09-01

    Bulk fluctuations of conserved charges measured by event-by-event analysis in relativistic heavy ion collisions are observables which are believed to carry a significant amount of information on the hot medium created by the collisions. They have recently been studied actively in experiment, in theory, and on the lattice. In particular, non-Gaussianity of the fluctuations has acquired much attention recently. In this review, we give a pedagogical introduction to these issues and survey recent developments in this field of research. Starting from the definition of cumulants, basic concepts in fluctuation physics, such as thermal fluctuations in statistical mechanics and the time evolution of fluctuations in diffusive systems, are described. Phenomena which are expected to occur in finite temperature and/or density QCD matter and their measurement by event-by-event analyses are also elucidated.
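    The cumulants underlying such event-by-event analyses can be computed from sample moments. A minimal sketch of the first four (population-style, no bias corrections; purely illustrative, not the review's notation): the third and fourth cumulants vanish for a Gaussian, which is why they quantify non-Gaussianity.

```python
def cumulants(xs):
    """First four cumulants of a sample.

    kappa1 = mean, kappa2 = variance (2nd central moment),
    kappa3 = 3rd central moment, kappa4 = m4 - 3*m2^2.
    kappa3 and kappa4 are zero for Gaussian fluctuations.
    """
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return mean, m2, m3, m4 - 3 * m2 * m2
```

    Experimental analyses typically report ratios such as kappa3/kappa2 and kappa4/kappa2, in which trivial volume factors cancel.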

  20. Review of guidelines and literature for handling missing data in longitudinal clinical trials with a case study.

    PubMed

    Liu, M; Wei, L; Zhang, J

    2006-01-01

    Missing data in clinical trials are inevitable. We highlight the ICH guidelines and CPMP points to consider on missing data. Specifically, we outline how we should consider missing data issues when designing, planning and conducting studies to minimize missing data impact. We also go beyond the coverage of the above two documents, provide a more detailed review of the basic concepts of missing data and frequently used terminologies, and examples of the typical missing data mechanism, and discuss technical details and literature for several frequently used statistical methods and associated software. Finally, we provide a case study where the principles outlined in this paper are applied to one clinical program at protocol design, data analysis plan and other stages of a clinical trial.

  1. Auger recombination in sodium iodide

    NASA Astrophysics Data System (ADS)

    McAllister, Andrew; Kioupakis, Emmanouil; Åberg, Daniel; Schleife, André

    2014-03-01

    Scintillators are an important tool used to detect high energy radiation, both in the interest of national security and in medicine. However, scintillator detectors currently suffer from lower energy resolutions than expected from basic counting statistics. This has been attributed to non-proportional light yield compared to incoming radiation, but the specific mechanism for this non-proportionality has not been identified. Auger recombination is a non-radiative process that could be contributing to the non-proportionality of scintillating materials. Auger recombination comes in two types: direct and phonon-assisted. We have used first-principles calculations to study Auger recombination in sodium iodide, a well characterized scintillating material. Our findings indicate that phonon-assisted Auger recombination is stronger in sodium iodide than direct Auger recombination. Computational resources provided by LLNL and NERSC. Funding provided by NA-22.

  2. Many roads to synchrony: natural time scales and their algorithms.

    PubMed

    James, Ryan G; Mahoney, John R; Ellison, Christopher J; Crutchfield, James P

    2014-04-01

    We consider two important time scales, the Markov and cryptic orders, that monitor how an observer synchronizes to a finitary stochastic process. We show how to compute these orders exactly and that they are most efficiently calculated from the ε-machine, a process's minimal unifilar model. Surprisingly, though the Markov order is a basic concept from stochastic process theory, it is not a probabilistic property of a process. Rather, it is a topological property and, moreover, it is not computable from any finite-state model other than the ε-machine. Via an exhaustive survey, we close by demonstrating that infinite Markov and infinite cryptic orders are a dominant feature in the space of finite-memory processes. We draw out the roles played in statistical mechanical spin systems by these two complementary length scales.

  3. The Statistical Basis of Chemical Equilibria.

    ERIC Educational Resources Information Center

    Hauptmann, Siegfried; Menger, Eva

    1978-01-01

    Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)

  4. Analysis of Meniscus Fluctuation in a Continuous Casting Slab Mold

    NASA Astrophysics Data System (ADS)

    Zhang, Kaitian; Liu, Jianhua; Cui, Heng; Xiao, Chao

    2018-06-01

    A water model of a slab mold was established to analyze the microscopic and macroscopic fluctuation of the meniscus. The fast Fourier transform and wavelet entropy were adopted to analyze the wave amplitude, frequency, and components of the fluctuation. The flow patterns under the meniscus were measured using particle image velocimetry, and the mechanisms of meniscus fluctuation were then discussed. The results showed that wavelet entropy has multi-scale and statistical properties, making it suitable for studying the details of meniscus fluctuation in both the time and frequency domains. The basic wave, whose frequency exceeds 1 Hz in the absence of mold oscillation, was demonstrated in this work. In fact, three basic waves were found: a long-wave with low frequency, a middle-wave with middle frequency, and a short-wave with high frequency. In addition, the upper roll flow in the mold had a significant effect on meniscus fluctuation. When the flow impingement position was far from the meniscus, the long-wave dominated the fluctuation and the stability of the meniscus was enhanced. However, when the flow velocity was increased, the short-wave dominated the meniscus fluctuation and the meniscus stability decreased.

  5. Stochastic geometry in disordered systems, applications to quantum Hall transitions

    NASA Astrophysics Data System (ADS)

    Gruzberg, Ilya

    2012-02-01

    A spectacular success in the study of random fractal clusters and their boundaries in statistical mechanics systems at or near criticality using Schramm-Loewner Evolutions (SLE) naturally calls for extensions in various directions. Can this success be repeated for disordered and/or non-equilibrium systems? Naively, when one thinks about disordered systems and their average correlation functions, one of the very basic assumptions of SLE, the so-called domain Markov property, is lost. Also, in some lattice models of Anderson transitions (the network models) there are no natural clusters to consider. Nevertheless, in this talk I will argue that one can apply so-called conformal restriction, a notion of stochastic conformal geometry closely related to SLE, to study the integer quantum Hall transition and its variants. I will focus on the Chalker-Coddington network model and will demonstrate that its average transport properties can be mapped to a classical problem where the basic objects are geometric shapes (loosely speaking, the current paths) that obey an important restriction property. At the transition point this allows us to use the theory of conformal restriction to derive exact expressions for point contact conductances in the presence of various non-trivial boundary conditions.

  6. Entropy-based separation of yeast cells using a microfluidic system of conjoined spheres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Kai-Jian; Qin, S.-J., E-mail: shuijie.qin@gmail.com; Bai, Zhong-Chen

    2013-11-21

    A physical model is derived to create a biological cell separator that is based on controlling the entropy in a microfluidic system having conjoined spherical structures. A one-dimensional simplified model of this three-dimensional problem, in terms of the corresponding effects of entropy on the Brownian motion of particles, is presented. This dynamic mechanism is based on the Langevin equation from statistical thermodynamics and takes advantage of the characteristics of the Fokker-Planck equation. The mechanism can be applied to manipulate biological particles inside a microfluidic system with identical, conjoined, spherical compartments. The theoretical analysis is verified by a rapid and simple technique for separating yeast cells in these conjoined, spherical microfluidic structures. The experimental results basically match our theoretical model, and we further analyze the parameters that can be used to control this separation mechanism. Both numerical simulations and experimental results show that the motion of the particles depends on the geometrical boundary conditions of the microfluidic system and the initial concentration of the diffusing material. This theoretical model can be implemented in future biophysics devices for the optimized design of passive cell sorters.
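    The Langevin-equation mechanism described above can be sketched as an overdamped Brownian particle in a one-dimensional effective entropic potential A(x) = -kT·ln w(x), where w(x) stands in for the varying channel width of the conjoined spheres. Everything below (the cosine width profile, parameter values, function name) is an illustrative assumption, not the paper's model.

```python
import math
import random

def simulate_langevin(steps=10_000, dt=1e-3, kT=1.0, gamma=1.0, seed=1):
    """Euler-Maruyama integration of overdamped Langevin dynamics in an
    entropic potential A(x) = -kT*ln w(x).

    Illustrative width profile: w(x) = 1.2 + cos(2*pi*x), a period-1
    chain of wide 'spheres' joined by narrow 'necks' (w stays > 0).
    Returns the final particle position.
    """
    rng = random.Random(seed)

    def force(x):
        # F = -dA/dx = kT * w'(x) / w(x)
        w = 1.2 + math.cos(2 * math.pi * x)
        dw = -2 * math.pi * math.sin(2 * math.pi * x)
        return kT * dw / w

    sigma = math.sqrt(2 * kT * dt / gamma)  # thermal noise amplitude
    x = 0.0
    for _ in range(steps):
        x += force(x) * dt / gamma + sigma * rng.gauss(0.0, 1.0)
    return x
```

    Particles spend most of their time in the wide compartments, where the entropic potential is lowest; the separation idea exploits how differently sized particles hop between compartments.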

  7. Views of medical students: what, when and how do they want statistics taught?

    PubMed

    Fielding, S; Poobalan, A; Prescott, G J; Marais, D; Aucott, L

    2015-11-01

    A key skill for a practising clinician is being able to do research, understand statistical analyses, and interpret results in the medical literature. Basic statistics has become essential within medical education, but when, what and in which format it should be taught is uncertain. To inform curriculum design/development we undertook a quantitative survey of fifth-year medical students and followed it up with a series of focus groups to obtain their opinions as to what statistics teaching they want, when and how. A total of 145 students undertook the survey and five focus groups were held, with between 3 and 9 participants each. Previous statistical training varied; students recognised that their knowledge was inadequate and were keen to see additional training implemented. Students were aware of the importance of statistics to their future careers, but were apprehensive about learning it. Face-to-face teaching supported by online resources was popular. The focus groups indicated the need for statistical training early in the degree and highlighted students' lack of confidence and inconsistencies in support. The study found that the students see the importance of statistics training in the medical curriculum but that timing and mode of delivery are key. The findings have informed the design of a new course to be implemented in the third undergraduate year. Teaching will be based around published studies, aiming to equip students with the basics required, with additional resources available through a virtual learning environment. © The Author(s) 2015.

  8. A generalized concept for cost-effective structural design. [Statistical Decision Theory applied to aerospace systems

    NASA Technical Reports Server (NTRS)

    Thomas, J. M.; Hawk, J. D.

    1975-01-01

    A generalized concept for cost-effective structural design is introduced. It is assumed that decisions affecting the cost effectiveness of aerospace structures fall into three basic categories: design, verification, and operation. Within these basic categories, certain decisions concerning items such as design configuration, safety factors, testing methods, and operational constraints are to be made. All or some of the variables affecting these decisions may be treated probabilistically. Bayesian statistical decision theory is used as the tool for determining the cost optimum decisions. A special case of the general problem is derived herein, and some very useful parametric curves are developed and applied to several sample structures.
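    The core computation in Bayesian statistical decision theory, as invoked above, is choosing the action that minimizes expected cost under prior probabilities over the uncertain states. A minimal sketch (the cost matrix and function name are hypothetical, not the paper's parametric curves):

```python
def optimal_action(costs, prior):
    """Bayes decision rule: pick the action with minimal expected cost.

    costs[a][s] is the cost of taking action a when state s holds;
    prior[s] is the probability of state s. Returns the action index.
    """
    expected = [sum(p * c for p, c in zip(prior, row)) for row in costs]
    return min(range(len(expected)), key=expected.__getitem__)
```

    For example, with actions "skip test" (cheap, costly if the structure is flawed) and "run test" (fixed cost either way), the optimal choice flips as the prior probability of a flaw changes, which is exactly the kind of trade-off the design/verification/operation framework formalizes.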

  9. On a Quantum Model of Brain Activities

    NASA Astrophysics Data System (ADS)

    Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.

    2010-01-01

    One of the main activities of the brain is the recognition of signals. A first attempt to explain the process of recognition in terms of quantum statistics was given in [6]. Subsequently, details of the mathematical model were presented in a (still incomplete) series of papers (cf. [7, 2, 5, 10]). In the present note we want to give a general view of the principal ideas of this approach. We will introduce the basic spaces and justify the choice of spaces and operations. Further, we bring the model face to face with basic postulates any statistical model of the recognition process should fulfill. These postulates are in accordance with the opinion widely accepted in psychology and neurology.

  10. Statistical and Detailed Analysis on Fiber Reinforced Self-Compacting Concrete Containing Admixtures- A State of Art of Review

    NASA Astrophysics Data System (ADS)

    Athiyamaan, V.; Mohan Ganesh, G.

    2017-11-01

    Self-compacting concrete is one of the special concretes that can flow and consolidate under its own weight, completely filling the formwork even in the presence of dense reinforcement, whilst maintaining its homogeneity throughout the formwork without any requirement for vibration. Researchers all over the world are developing high-performance concrete by adding various fibers and admixtures in different proportions. Fibers of many kinds, such as glass, steel, carbon, polypropylene and aramid, improve concrete properties such as tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability [6]. This review includes a fundamental study of fiber-reinforced self-compacting concrete with admixtures, covering its rheological and mechanical properties, together with an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organized into basic chapters: introduction; study of material properties; review of self-compacting concrete; overview of fiber-reinforced self-compacting concrete containing admixtures; review of design and analysis of experiments as a statistical approach; summary of existing works on FRSCC and statistical modeling; literature review; and conclusion. Knowing the recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber-reinforced concrete and SCC is essential for effective research on fiber-reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain complete knowledge of polymer-based self-compacting fiber-reinforced concrete.

  11. Beyond δ: Tailoring marked statistics to reveal modified gravity

    NASA Astrophysics Data System (ADS)

    Valogiannis, Georgios; Bean, Rachel

    2018-01-01

    Models that seek to explain cosmic acceleration through modifications to general relativity (GR) evade stringent Solar System constraints through a restoring screening mechanism. Down-weighting the high-density, screened regions in favor of the low-density, unscreened ones offers the potential to enhance the amount of information carried in such modified gravity models. In this work, we assess the performance of a new "marked" transformation and perform a systematic comparison with the clipping and logarithmic transformations, in the context of ΛCDM and the symmetron and f(R) modified gravity models. Performance is measured in terms of the fractional boost in the Fisher information and the signal-to-noise ratio (SNR) for these models relative to the statistics derived from the standard density distribution. We find that all three statistics provide improved Fisher boosts over the basic density statistics. The model parameters for the marked and clipped transformations that best enhance signals and the Fisher boosts are determined. We also show that the mark is useful both as a Fourier- and real-space transformation; a marked correlation function also enhances the SNR relative to the standard correlation function, and on mildly nonlinear scales it can show a significant difference between the ΛCDM and modified gravity models. Our results demonstrate how a series of simple analytical transformations could dramatically increase the information extracted on deviations from GR from large-scale surveys, and give the prospect of a much more feasible potential detection.

  12. Innovations in curriculum design: A multi-disciplinary approach to teaching statistics to undergraduate medical students

    PubMed Central

    Freeman, Jenny V; Collier, Steve; Staniforth, David; Smith, Kevin J

    2008-01-01

    Background Statistics is relevant to students and practitioners in medicine and health sciences and is increasingly taught as part of the medical curriculum. However, it is common for students to dislike and under-perform in statistics. We sought to address these issues by redesigning the way that statistics is taught. Methods The project brought together a statistician, clinician and educational experts to re-conceptualize the syllabus, and focused on developing different methods of delivery. New teaching materials, including videos, animations and contextualized workbooks were designed and produced, placing greater emphasis on applying statistics and interpreting data. Results Two cohorts of students were evaluated, one with old style and one with new style teaching. Both were similar with respect to age, gender and previous level of statistics. Students who were taught using the new approach could better define the key concepts of p-value and confidence interval (p < 0.001 for both). They were more likely to regard statistics as integral to medical practice (p = 0.03), and to expect to use it in their medical career (p = 0.003). There was no significant difference in the numbers who thought that statistics was essential to understand the literature (p = 0.28) and those who felt comfortable with the basics of statistics (p = 0.06). More than half the students in both cohorts felt that they were comfortable with the basics of medical statistics. Conclusion Using a variety of media, and placing emphasis on interpretation can help make teaching, learning and understanding of statistics more people-centred and relevant, resulting in better outcomes for students. PMID:18452599

  13. A Discussion of the Measurement and Statistical Manipulation of Selected Key Variables in an Adult Basic Education Program.

    ERIC Educational Resources Information Center

    Cunningham, Phyllis M.

    Intending to explore the interaction effects of self-esteem level and perceived program utility on the retention and cognitive achievement of adult basic education students, a self-esteem instrument, to be administered verbally, was constructed with content relevant items developed from and tested on a working class, undereducated, black, adult…

  14. A Quantile Regression Approach to Understanding the Relations among Morphological Awareness, Vocabulary, and Reading Comprehension in Adult Basic Education Students

    ERIC Educational Resources Information Center

    Tighe, Elizabeth L.; Schatschneider, Christopher

    2016-01-01

    The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological…

  15. Summary Statistics of CPB-Qualified Public Radio Stations: Fiscal Year 1971.

    ERIC Educational Resources Information Center

    Lee, S. Young; Pedone, Ronald J.

    Basic statistics on finance, employment, and broadcast and production activities of 103 Corporation for Public Broadcasting (CPB)--qualified radio stations in the United States and Puerto Rico for Fiscal Year 1971 are collected. The first section of the report deals with total funds, income, direct operating costs, capital expenditures, and other…

  16. Using Statistics to Lie, Distort, and Abuse Data

    ERIC Educational Resources Information Center

    Bintz, William; Moore, Sara; Adams, Cheryll; Pierce, Rebecca

    2009-01-01

    Statistics is a branch of mathematics that involves organization, presentation, and interpretation of data, both quantitative and qualitative. Data do not lie, but people do. On the surface, quantitative data are basically inanimate objects, nothing more than lifeless and meaningless symbols that appear on a page, calculator, computer, or in one's…

  17. What Software to Use in the Teaching of Mathematical Subjects?

    ERIC Educational Resources Information Center

    Berežný, Štefan

    2015-01-01

    We can consider two basic views when using mathematical software in the teaching of mathematical subjects. First: How to learn to use specific software for specific tasks, e.g., the software Statistica for the subjects of applied statistics, probability and mathematical statistics, or financial mathematics. Second: How to learn to use the…

  18. Intrex Subject/Title Inverted-File Characteristics.

    ERIC Educational Resources Information Center

    Uemura, Syunsuke

    The characteristics of the Intrex subject/title inverted file are analyzed. Basic statistics of the inverted file are presented including various distributions of the index words and terms from which the file was derived, and statistics on stems, the file growth process, and redundancy measurements. A study of stems both with extremely high and…

  19. The Robustness of the Studentized Range Statistic to Violations of the Normality and Homogeneity of Variance Assumptions.

    ERIC Educational Resources Information Center

    Ramseyer, Gary C.; Tcheng, Tse-Kia

    The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)

  20. Application of an Online Reference for Reviewing Basic Statistical Principles of Operating Room Management

    ERIC Educational Resources Information Center

    Dexter, Franklin; Masursky, Danielle; Wachtel, Ruth E.; Nussmeier, Nancy A.

    2010-01-01

    Operating room (OR) management differs from clinical anesthesia in that statistical literacy is needed daily to make good decisions. Two of the authors teach a course in operations research for surgical services to anesthesiologists, anesthesia residents, OR nursing directors, hospital administration students, and analysts to provide them with the…

  1. Statistics and Data Interpretation for Social Work

    ERIC Educational Resources Information Center

    Rosenthal, James A.

    2011-01-01

    Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes numerous examples, data sets, and issues that students will encounter in social work practice. The first section introduces basic concepts and terms to…

  2. Using Excel in Teacher Education for Sustainability

    ERIC Educational Resources Information Center

    Aydin, Serhat

    2016-01-01

    In this study, the feasibility of using Excel software in teaching whole Basic Statistics Course and its influence on the attitudes of pre-service science teachers towards statistics were investigated. One hundred and two pre-service science teachers in their second year participated in the study. The data were collected from the prospective…

  3. Basic Math Skills and Performance in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Johnson, Marianne; Kuennen, Eric

    2006-01-01

    We identify the student characteristics most associated with success in an introductory business statistics class, placing special focus on the relationship between student math skills and course performance, as measured by student grade in the course. To determine which math skills are important for student success, we examine (1) whether the…

  4. An Online Course of Business Statistics: The Proportion of Successful Students

    ERIC Educational Resources Information Center

    Pena-Sanchez, Rolando

    2009-01-01

    This article describes the students' academic progress in an online course of business statistics through interactive software assignments and diverse educational homework, which helps these students to build their own e-learning through basic competences; i.e. interpreting results and solving problems. Cross-tables were built for the categorical…

  5. Health Literacy Impact on National Healthcare Utilization and Expenditure.

    PubMed

    Rasu, Rafia S; Bawa, Walter Agbor; Suminski, Richard; Snella, Kathleen; Warady, Bradley

    2015-08-17

    Health literacy presents an enormous challenge in the delivery of effective healthcare and quality outcomes. We evaluated the impact of low health literacy (LHL) on healthcare utilization and healthcare expenditure. Database analysis used Medical Expenditure Panel Survey (MEPS) from 2005-2008 which provides nationally representative estimates of healthcare utilization and expenditure. Health literacy scores (HLSs) were calculated based on a validated, predictive model and were scored according to the National Assessment of Adult Literacy (NAAL). HLS ranged from 0-500. Health literacy level (HLL) and categorized in 2 groups: Below basic or basic (HLS <226) and above basic (HLS ≥226). Healthcare utilization expressed as a physician, nonphysician, or emergency room (ER) visits and healthcare spending. Expenditures were adjusted to 2010 rates using the Consumer Price Index (CPI). A P value of 0.05 or less was the criterion for statistical significance in all analyses. Multivariate regression models assessed the impact of the predicted HLLs on outpatient healthcare utilization and expenditures. All analyses were performed with SAS and STATA® 11.0 statistical software. The study evaluated 22 599 samples representing 503 374 648 weighted individuals nationally from 2005-2008. The cohort had an average age of 49 years and included more females (57%). Caucasian were the predominant racial ethnic group (83%) and 37% of the cohort were from the South region of the United States of America. The proportion of the cohort with basic or below basic health literacy was 22.4%. Annual predicted values of physician visits, nonphysician visits, and ER visits were 6.6, 4.8, and 0.2, respectively, for basic or below basic compared to 4.4, 2.6, and 0.1 for above basic. Predicted values of office and ER visits expenditures were $1284 and $151, respectively, for basic or below basic and $719 and $100 for above basic (P < .05). 
    The extrapolated national estimates show that the annual prescription costs alone for adults with basic or below basic health literacy could potentially reach about $172 billion. Health literacy is inversely associated with healthcare utilization and expenditure. Individuals with below basic or basic HLL have greater healthcare utilization and expenditures, spending more on prescriptions, compared to individuals with above basic HLL. Public health strategies promoting appropriate education among individuals with LHL may help to improve health outcomes and reduce unnecessary healthcare visits and costs. © 2015 by Kerman University of Medical Sciences.

  6. Knowledge, attitude and anxiety pertaining to basic life support and medical emergencies among dental interns in Mangalore City, India.

    PubMed

    Somaraj, Vinej; Shenoy, Rekha P; Panchmal, Ganesh Shenoy; Jodalli, Praveen S; Sonde, Laxminarayan; Karkal, Ravichandra

    2017-01-01

    This cross-sectional study aimed to assess the knowledge, attitude and anxiety pertaining to basic life support (BLS) and medical emergencies among interns in dental colleges of Mangalore city, Karnataka, India. The study subjects comprised interns who volunteered from the four dental colleges. The knowledge and attitude of interns were assessed using a 30-item questionnaire prepared based on the Basic Life Support Manual from the American Heart Association, and the anxiety of interns pertaining to BLS and medical emergencies was assessed using the State-Trait Anxiety Inventory (STAI) questionnaire. Chi-square test was performed on SPSS 21.0 (IBM Statistics, 2012) to determine statistically significant differences ( P <0.05) between assessed knowledge and anxiety. Out of 183 interns, 39.89% had below average knowledge. A total of 123 (67.21%) reported unavailability of professional training. The majority (180, 98.36%) felt an urgent need for training in basic life support procedures. Assessment of stress showed that 27.1% of participants were above the high-stress level. The comparison of assessed knowledge and stress was statistically insignificant ( P =0.983). There was an evident lack of knowledge pertaining to the management of medical emergencies among the interns. As oral health care providers moving out into society, dental interns should receive focused training in Basic Life Support procedures.
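
    The chi-square comparison between knowledge and anxiety reported in this record can be sketched with a hand-computed Pearson test on a 2x2 table; the counts below are hypothetical, not the study's data, and for one degree of freedom the p-value reduces to a complementary error function:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic and p-value for a 2x2 contingency table.

    chi2 = N*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)); for 1 degree of
    freedom the upper-tail probability is P(X > x) = erfc(sqrt(x/2)).
    """
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical counts (rows: knowledge below/above average; columns: high/low anxiety).
chi2, p = chi2_2x2(30, 20, 20, 30)
```

    A result like P = 0.983, as in the study, simply means the observed counts are almost exactly what independence of the two variables would predict.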

  7. Statistical generation of training sets for measuring NO3(-), NH4(+) and major ions in natural waters using an ion selective electrode array.

    PubMed

    Mueller, Amy V; Hemond, Harold F

    2016-05-18

    Knowledge of ionic concentrations in natural waters is essential to understand watershed processes. Inorganic nitrogen, in the form of nitrate and ammonium ions, is a key nutrient as well as a participant in redox, acid-base, and photochemical processes of natural waters, leading to spatiotemporal patterns of ion concentrations at scales as small as meters or hours. Current options for measurement in situ are costly, relying primarily on instruments adapted from laboratory methods (e.g., colorimetric, UV absorption); free-standing and inexpensive ISE sensors for NO3(-) and NH4(+) could be attractive alternatives if interferences from other constituents were overcome. Multi-sensor arrays, coupled with appropriate non-linear signal processing, offer promise in this capacity but have not yet successfully achieved signal separation for NO3(-) and NH4(+) in situ at naturally occurring levels in unprocessed water samples. A novel signal processor, underpinned by an appropriate sensor array, is proposed that overcomes previous limitations by explicitly integrating basic chemical constraints (e.g., charge balance). This work further presents a rationalized process for the development of such in situ instrumentation for NO3(-) and NH4(+), including a statistical-modeling strategy for instrument design, training/calibration, and validation. Statistical analysis reveals that historical concentrations of major ionic constituents in natural waters across New England strongly covary and are multi-modal. This informs the design of a statistically appropriate training set, suggesting that the strong covariance of constituents across environmental samples can be exploited through appropriate signal processing mechanisms to further improve estimates of minor constituents.
Two artificial neural network architectures, one expanded to incorporate knowledge of basic chemical constraints, were tested to process outputs of a multi-sensor array, trained using datasets of varying degrees of statistical representativeness to natural water samples. The accuracy of ANN results improves monotonically with the statistical representativeness of the training set (error decreases by ∼5×), while the expanded neural network architecture contributes a further factor of 2-3.5 decrease in error when trained with the most representative sample set. Results using the most statistically accurate set of training samples (which retain environmentally relevant ion concentrations but avoid the potential interference of humic acids) demonstrated accurate, unbiased quantification of nitrate and ammonium at natural environmental levels (±20% down to <10 μM), as well as the major ions Na(+), K(+), Ca(2+), Mg(2+), Cl(-), and SO4(2-), in unprocessed samples. These results show promise for the development of new in situ instrumentation for the support of scientific field work.
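
    The charge-balance constraint mentioned above can be illustrated with a minimal sketch: the net charge of an estimated ionic composition should be zero, and a simple rescaling of cations and anions toward the geometric mean of their equivalent sums enforces this. The ion list and concentrations below are hypothetical, and this is only one plausible way to impose the constraint, not the paper's network architecture:

```python
import math

# Ion charges for the major constituents measured by the array.
CHARGES = {"Na+": 1, "K+": 1, "Ca2+": 2, "Mg2+": 2, "NH4+": 1,
           "Cl-": -1, "SO42-": -2, "NO3-": -1}

def charge_imbalance(conc):
    """Net charge (in charge-equivalents) of an estimated composition; 0 = balanced."""
    return sum(CHARGES[ion] * c for ion, c in conc.items())

def rescale_to_balance(conc):
    """Scale cations and anions so both equivalent sums meet at their geometric mean."""
    cations = sum(CHARGES[i] * c for i, c in conc.items() if CHARGES[i] > 0)
    anions = -sum(CHARGES[i] * c for i, c in conc.items() if CHARGES[i] < 0)
    target = math.sqrt(cations * anions)
    return {i: c * (target / cations if CHARGES[i] > 0 else target / anions)
            for i, c in conc.items()}

# Hypothetical raw estimates (umol/L), deliberately unbalanced.
raw = {"Na+": 200.0, "K+": 30.0, "Ca2+": 120.0, "Mg2+": 50.0, "NH4+": 5.0,
       "Cl-": 250.0, "SO42-": 80.0, "NO3-": 20.0}
balanced = rescale_to_balance(raw)
```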

  8. Are emergency medical technician-basics able to use a selective immobilization of the cervical spine protocol?: a preliminary report.

    PubMed

    Dunn, Thomas M; Dalton, Alice; Dorfman, Todd; Dunn, William W

    2004-01-01

    The study's objective was to take a first step in determining whether emergency medical technician (EMT)-Basics are capable of using a protocol that allows for selective immobilization of the cervical spine. Such protocols are coming into use at an advanced life support level and could be beneficial when used by basic life support providers. A convenience sample of participants (n=95) from 11 emergency medical services agencies and one college class participated in the study. All participants evaluated six patients in written scenarios and decided which should be placed into spinal precautions according to a selective spinal immobilization protocol. Systems without an existing selective spinal immobilization protocol received a one-hour continuing education lecture regarding the topic. College students received a similar lecture written so laypersons could understand the protocol. All participants showed proficiency when applying a selective immobilization protocol to patients in paper-based scenarios. Furthermore, EMT-Basics performed at the same level as paramedics when following the protocol. Statistical analysis revealed no significant differences between EMT-Basics and paramedics. A follow-up group of college students (added to have a non-EMS comparison group) also performed as well as paramedics when making decisions to use spinal precautions. Differences between college students and paramedics were also statistically insignificant. The results suggest that EMT-Basics are as accurate as paramedics when making decisions regarding selective immobilization of the cervical spine during paper-based scenarios. That laypersons are also proficient when using the protocol could indicate that it is extremely simple to follow. This study is a first step toward the necessary additional studies evaluating the efficacy of EMT-Basics using selective immobilization as a regular practice.

  9. 5 CFR 772.101 - Basic authority.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Basic authority. 772.101 Section 772.101 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) INTERIM RELIEF General § 772.101 Basic authority. This part establishes a mechanism for agencies to provide...

  10. 5 CFR 772.101 - Basic authority.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Basic authority. 772.101 Section 772.101 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) INTERIM RELIEF General § 772.101 Basic authority. This part establishes a mechanism for agencies to provide...

  11. Inferring epidemiological dynamics of infectious diseases using Tajima's D statistic on nucleotide sequences of pathogens.

    PubMed

    Kim, Kiyeon; Omori, Ryosuke; Ito, Kimihito

    2017-12-01

    The estimation of the basic reproduction number is essential to understand epidemic dynamics, and time series data of infected individuals are usually used for the estimation. However, such data are not always available. Methods to estimate the basic reproduction number using genealogies constructed from nucleotide sequences of pathogens have been proposed. Here, we propose a new method to estimate epidemiological parameters of outbreaks using the time series change of Tajima's D statistic on the nucleotide sequences of pathogens. To relate the time evolution of Tajima's D to the number of infected individuals, we constructed a parsimonious mathematical model describing both the transmission process of pathogens among hosts and the evolutionary process of the pathogens. As a case study we applied this method to the field data of nucleotide sequences of pandemic influenza A (H1N1) 2009 viruses collected in Argentina. The Tajima's D-based method estimated the basic reproduction number to be 1.55 with 95% highest posterior density (HPD) between 1.31 and 2.05, and the date of the epidemic peak to be 10th July with 95% HPD between 22nd June and 9th August. The estimated basic reproduction number was consistent with estimates from the birth-death skyline plot and from the time series of the number of infected individuals. These results suggested that Tajima's D statistic on nucleotide sequences of pathogens could be useful to estimate epidemiological parameters of outbreaks. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
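
    The Tajima's D statistic at the center of this method is computable directly from an alignment; a minimal sketch using the standard formulas, applied to a four-sequence toy alignment rather than the study's H1N1 data:

```python
import math
from itertools import combinations

def tajimas_d(seqs):
    """Tajima's D from equal-length aligned sequences (standard formulas)."""
    n = len(seqs)
    sites = list(zip(*seqs))
    s = sum(1 for col in sites if len(set(col)) > 1)      # segregating sites
    if s == 0:
        return 0.0
    pairs = list(combinations(seqs, 2))
    # pi: mean number of pairwise differences across all sequence pairs
    pi = sum(sum(x != y for x, y in zip(a, b)) for a, b in pairs) / len(pairs)
    a1 = sum(1 / i for i in range(1, n))
    a2 = sum(1 / i ** 2 for i in range(1, n))
    b1 = (n + 1) / (3 * (n - 1))
    b2 = 2 * (n ** 2 + n + 3) / (9 * n * (n - 1))
    c1 = b1 - 1 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1 = c1 / a1
    e2 = c2 / (a1 ** 2 + a2)
    # D = (pi - Watterson's theta) / sqrt(estimated variance)
    return (pi - s / a1) / math.sqrt(e1 * s + e2 * s * (s - 1))

# Toy alignment (not the study's data).
d = tajimas_d(["AATT", "AATT", "AACT", "GACT"])
```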

  12. Compliant leg behaviour explains basic dynamics of walking and running

    PubMed Central

    Geyer, Hartmut; Seyfarth, Andre; Blickhan, Reinhard

    2006-01-01

    The basic mechanics of human locomotion are associated with vaulting over stiff legs in walking and rebounding on compliant legs in running. However, while rebounding legs well explain the stance dynamics of running, stiff legs cannot reproduce that of walking. With a simple bipedal spring–mass model, we show that not stiff but compliant legs are essential to obtain the basic walking mechanics; incorporating the double support as an essential part of the walking motion, the model reproduces the characteristic stance dynamics that result in the observed small vertical oscillation of the body and the observed out-of-phase changes in forward kinetic and gravitational potential energies. Exploring the parameter space of this model, we further show that it not only combines the basic dynamics of walking and running in one mechanical system, but also reveals these gaits to be just two out of the many solutions to legged locomotion offered by compliant leg behaviour and accessed by energy or speed. PMID:17015312
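
    The compliant-leg rebound at the heart of the model can be sketched in one dimension: a point mass on a linear leg spring, integrated through a stance phase with symplectic Euler. Parameters are illustrative, not fitted to human data, and the full model is planar with two legs:

```python
# 1D stance phase of a spring-mass ("compliant leg") hopper: the body
# compresses the leg spring after touchdown and rebounds to takeoff.
m, k, L0, g = 80.0, 20000.0, 1.0, 9.81   # mass (kg), leg stiffness (N/m), rest length (m), gravity
y, v, dt = L0, -1.0, 1e-4                # touchdown height (m), velocity (m/s), time step (s)

while True:
    a = -g + k * (L0 - y) / m            # gravity plus spring force while the leg is compressed
    v += a * dt                          # symplectic Euler: update velocity first,
    y += v * dt                          # then position with the new velocity
    if y >= L0 and v > 0:                # leg back at rest length and moving upward: takeoff
        break

takeoff_speed = v                        # ~ touchdown speed: the spring stores and returns energy
```

    The near-equality of touchdown and takeoff speeds is the elastic rebound the abstract describes; walking dynamics additionally require the second, compliant leg during double support.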

  13. Income inequality and depression: a systematic review and meta-analysis of the association and a scoping review of mechanisms.

    PubMed

    Patel, Vikram; Burns, Jonathan K; Dhingra, Monisha; Tarver, Leslie; Kohrt, Brandon A; Lund, Crick

    2018-02-01

    Most countries have witnessed a dramatic increase of income inequality in the past three decades. This paper addresses the question of whether income inequality is associated with the population prevalence of depression and, if so, the potential mechanisms and pathways which may explain this association. Our systematic review included 26 studies, mostly from high-income countries. Nearly two-thirds of all studies and five out of six longitudinal studies reported a statistically significant positive relationship between income inequality and risk of depression; only one study reported a statistically significant negative relationship. Twelve studies were included in a meta-analysis with dichotomized inequality groupings. The pooled risk ratio was 1.19 (95% CI: 1.07-1.31), demonstrating greater risk of depression in populations with higher income inequality relative to populations with lower inequality. Multiple studies reported subgroup effects, including greater impacts of income inequality among women and low-income populations. We propose an ecological framework, with mechanisms operating at the national level (the neo-material hypothesis), neighbourhood level (the social capital and the social comparison hypotheses) and individual level (psychological stress and social defeat hypotheses) to explain this association. We conclude that policy makers should actively promote actions to reduce income inequality, such as progressive taxation policies and a basic universal income. Mental health professionals should champion such policies, as well as promote the delivery of interventions which target the pathways and proximal determinants, such as building life skills in adolescents and provision of psychological therapies and packages of care with demonstrated effectiveness for settings of poverty and high income inequality. © 2018 World Psychiatric Association.
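
    The pooled risk ratio of 1.19 (95% CI: 1.07-1.31) is the kind of estimate produced by inverse-variance weighting of log risk ratios; a minimal fixed-effect sketch, with hypothetical per-study inputs rather than the review's data:

```python
import math

def pool_risk_ratios(rrs, ses):
    """Fixed-effect inverse-variance pooling of log risk ratios.

    Each study contributes weight 1/se^2 on the log scale; returns the
    pooled RR with a 95% confidence interval.
    """
    weights = [1 / se ** 2 for se in ses]
    log_pooled = sum(w * math.log(r) for w, r in zip(weights, rrs)) / sum(weights)
    se_pooled = 1 / math.sqrt(sum(weights))
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

# Hypothetical study-level risk ratios and standard errors of log(RR).
rr, lo, hi = pool_risk_ratios([1.30, 1.10, 1.20], [0.10, 0.15, 0.12])
```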

  14. Income inequality and depression: a systematic review and meta‐analysis of the association and a scoping review of mechanisms

    PubMed Central

    Patel, Vikram; Burns, Jonathan K.; Dhingra, Monisha; Tarver, Leslie; Kohrt, Brandon A.; Lund, Crick

    2018-01-01

    Most countries have witnessed a dramatic increase of income inequality in the past three decades. This paper addresses the question of whether income inequality is associated with the population prevalence of depression and, if so, the potential mechanisms and pathways which may explain this association. Our systematic review included 26 studies, mostly from high‐income countries. Nearly two‐thirds of all studies and five out of six longitudinal studies reported a statistically significant positive relationship between income inequality and risk of depression; only one study reported a statistically significant negative relationship. Twelve studies were included in a meta‐analysis with dichotomized inequality groupings. The pooled risk ratio was 1.19 (95% CI: 1.07‐1.31), demonstrating greater risk of depression in populations with higher income inequality relative to populations with lower inequality. Multiple studies reported subgroup effects, including greater impacts of income inequality among women and low‐income populations. We propose an ecological framework, with mechanisms operating at the national level (the neo‐material hypothesis), neighbourhood level (the social capital and the social comparison hypotheses) and individual level (psychological stress and social defeat hypotheses) to explain this association. We conclude that policy makers should actively promote actions to reduce income inequality, such as progressive taxation policies and a basic universal income. Mental health professionals should champion such policies, as well as promote the delivery of interventions which target the pathways and proximal determinants, such as building life skills in adolescents and provision of psychological therapies and packages of care with demonstrated effectiveness for settings of poverty and high income inequality. PMID:29352539

  15. Collective Behaviors in Spatially Extended Systems with Local Interactions and Synchronous Updating

    NASA Astrophysics Data System (ADS)

    ChatÉ, H.; Manneville, P.

    1992-01-01

    Assessing the extent to which dynamical systems with many degrees of freedom can be described within a thermodynamic formalism is a problem that currently attracts much attention. In this context, synchronously updated regular lattices of identical, chaotic elements with local interactions are promising models for which statistical mechanics may be hoped to provide some insights. This article presents a large class of cellular automata rules and coupled map lattices of the above type in space dimensions d = 2 to 6. Such simple models can be approached by a mean-field approximation which usually reduces the dynamics to that of a map governing the evolution of some extensive density. While this approximation is exact in the d = ∞ limit, where macroscopic variables must display the time-dependent behavior of the mean-field map, basic intuition from equilibrium statistical mechanics rules out any such behavior in low-dimensional systems, since it would involve the collective motion of locally disordered elements. The models studied are chosen to be as close as possible to mean-field conditions, i.e., rather high space dimension, large connectivity, and equal-weight coupling between sites. While the mean-field evolution is never observed, a new type of non-trivial collective behavior is found, at odds with the predictions of equilibrium statistical mechanics. Both in the cellular automata models and in the coupled map lattices, macroscopic variables frequently display a non-transient, time-dependent, low-dimensional dynamics emerging out of local disorder. Striking examples are period 3 cycles in two-state cellular automata and a Hopf bifurcation for a d = 5 lattice of coupled logistic maps.
    An extensive account of the phenomenology is given, including a catalog of behaviors, classification tables for the cellular automata rules, and bifurcation diagrams for the coupled map lattices. The observed underlying dynamics is accompanied by an intrinsic quasi-Gaussian noise (stemming from the local disorder) which disappears in the infinite-size limit. The collective behaviors constitute a robust phenomenon, resisting external noise, small changes in the local dynamics, and modifications of the initial and boundary conditions. Synchronous updating, high space dimension and the regularity of connections are shown to be crucial ingredients in the subtle build-up of correlations giving rise to the collective motion. The discussion stresses the need for a theoretical understanding that neither equilibrium statistical mechanics nor higher-order mean-field approximations are able to provide.
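
    The synchronous-updating scheme behind these models can be sketched with a one-dimensional ring of diffusively coupled logistic maps; the paper's lattices are in dimensions d = 2 to 6, so this only illustrates the update rule and the macroscopic mean field:

```python
import random

def cml_step(x, r=4.0, eps=0.3):
    """One synchronous update of a ring of diffusively coupled logistic maps."""
    n = len(x)
    f = [r * xi * (1 - xi) for xi in x]                  # apply the local chaotic map
    return [(1 - eps) * f[i] + eps / 2 * (f[i - 1] + f[(i + 1) % n])
            for i in range(n)]                           # couple all sites at once

random.seed(1)
x = [random.random() for _ in range(256)]                # locally disordered initial state
mean_field = []                                          # the extensive macroscopic variable
for _ in range(200):
    x = cml_step(x)
    mean_field.append(sum(x) / len(x))
```

    Tracking `mean_field` over time is how one would look for the non-trivial collective behavior the article reports, as opposed to the constant value equilibrium intuition predicts.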

  16. A Correlational Study of the Relationships between Music Aptitude and Phonemic Awareness of Kindergarten Children

    ERIC Educational Resources Information Center

    Rubinson, Laura E.

    2010-01-01

    More than one third of American children cannot read at a basic level by fourth grade (Lee, Grigg, & Donahue, 2007) and those numbers are even higher for African American, Hispanic and poor White students (Boorman et al., 2007). These are alarming statistics given that the ability to read is the most basic and fundamental skill for academic…

  17. Availability of Instructional Materials at the Basic Education Level in Enugu Educational Zone of Enugu State, Nigeria

    ERIC Educational Resources Information Center

    Chukwu, Leo C.; Eze, Thecla A. Y.; Agada, Fidelia Chinyelugo

    2016-01-01

    The study examined the availability of instructional materials at the basic education level in Enugu Education Zone of Enugu State, Nigeria. One research question and one hypothesis guided the study. The research question was answered using mean and grand mean ratings, while the hypothesis was tested using t-test statistics at 0.05 level of…

  18. Basic statistics with Microsoft Excel: a review.

    PubMed

    Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-06-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. The mathematical functions explain the statistical concepts, particularly those of mean, median and mode, along with those of frequency and frequency distribution associated with histograms and graphical representations, determining elaborative processes on the basis of the spreadsheet operations. The aim of the study is to highlight the mathematical basis of statistical models that regulate the operation of spreadsheets in Microsoft Excel.
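
    The spreadsheet functions the review builds on (AVERAGE, MEDIAN, MODE, FREQUENCY) have direct standard-library counterparts; a small sketch with made-up worksheet values:

```python
import statistics
from collections import Counter

data = [4, 2, 7, 2, 9, 4, 2, 6]          # one worksheet column of sample values

mean = statistics.mean(data)              # Excel AVERAGE
median = statistics.median(data)          # Excel MEDIAN
mode = statistics.mode(data)              # Excel MODE (most frequent value)
freq = Counter(data)                      # Excel FREQUENCY: value -> count, as for a histogram
```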

  19. Basic statistics with Microsoft Excel: a review

    PubMed Central

    Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-01-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. The mathematical functions explain the statistical concepts, particularly those of mean, median and mode, along with those of frequency and frequency distribution associated with histograms and graphical representations, determining elaborative processes on the basis of the spreadsheet operations. The aim of the study is to highlight the mathematical basis of statistical models that regulate the operation of spreadsheets in Microsoft Excel. PMID:28740690

  20. On the Acquisition of Some Basic Word Spelling Mechanisms in a Deep (French) and a Shallow (Spanish) System

    ERIC Educational Resources Information Center

    Carrillo, Maria Soledad; Alegria, Jesus; Marin, Javier

    2013-01-01

    An experiment was carried out to compare the time course of the acquisition of two basic spelling mechanisms in Spanish, a shallow system, and French, a deep system. The first was lexical. It relies on the orthographic lexicon, a hypothetical structure containing the orthographic representations of words accessible for word spelling. To evaluate…

  1. A. C. C. Fact Book: A Statistical Profile of Allegany Community College and the Community It Serves.

    ERIC Educational Resources Information Center

    Andersen, Roger C.

    This document is intended to be an authoritative compilation of frequently referenced basic facts concerning Allegany Community College (ACC) in Maryland. It is a statistical profile of ACC and the community it serves, divided into six sections: enrollment, students, faculty, community, support services, and general college related information.…

  2. Basic Mathematics Test Predicts Statistics Achievement and Overall First Year Academic Success

    ERIC Educational Resources Information Center

    Fonteyne, Lot; De Fruyt, Filip; Dewulf, Nele; Duyck, Wouter; Erauw, Kris; Goeminne, Katy; Lammertyn, Jan; Marchant, Thierry; Moerkerke, Beatrijs; Oosterlinck, Tom; Rosseel, Yves

    2015-01-01

    In the psychology and educational science programs at Ghent University, only 36.1% of the new incoming students in 2011 and 2012 passed all exams. Despite availability of information, many students underestimate the scientific character of social science programs. Statistics courses are a major obstacle in this matter. Not all enrolling students…

  3. The Structure of Research Methodology Competency in Higher Education and the Role of Teaching Teams and Course Temporal Distance

    ERIC Educational Resources Information Center

    Schweizer, Karl; Steinwascher, Merle; Moosbrugger, Helfried; Reiss, Siegbert

    2011-01-01

    The development of research methodology competency is a major aim of the psychology curriculum at universities. Usually, three courses concentrating on basic statistics, advanced statistics and experimental methods, respectively, serve the achievement of this aim. However, this traditional curriculum-based course structure gives rise to the…

  4. Ten Ways to Improve the Use of Statistical Mediation Analysis in the Practice of Child and Adolescent Treatment Research

    ERIC Educational Resources Information Center

    Maric, Marija; Wiers, Reinout W.; Prins, Pier J. M.

    2012-01-01

    Despite guidelines and repeated calls from the literature, statistical mediation analysis in youth treatment outcome research is rare. Even more concerning is that many studies that "have" reported mediation analyses do not fulfill basic requirements for mediation analysis, providing inconclusive data and clinical implications. As a result, after…

  5. Statistical estimators for monitoring spotted owls in Oregon and Washington in 1987.

    Treesearch

    Tlmothy A. Max; Ray A. Souter; Kathleen A. O' Halloran

    1990-01-01

    Spotted owls (Strix occidentalis) were monitored on 11 National Forests in the Pacific Northwest Region of the USDA Forest Service between March and August of 1987. The basic intent of monitoring was to provide estimates of occupancy and reproduction rates for pairs of spotted owls. This paper documents the technical details of the statistical...

  6. Statistical techniques for sampling and monitoring natural resources

    Treesearch

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  7. Peer-Assisted Learning in Research Methods and Statistics

    ERIC Educational Resources Information Center

    Stone, Anna; Meade, Claire; Watling, Rosamond

    2012-01-01

    Feedback from students on a Level 1 Research Methods and Statistics module, studied as a core part of a BSc Psychology programme, highlighted demand for additional tutorials to help them to understand basic concepts. Students in their final year of study commonly request work experience to enhance their employability. All students on the Level 1…

  8. Adult Basic and Secondary Education Program Statistics. Fiscal Year 1976.

    ERIC Educational Resources Information Center

    Cain, Sylvester H.; Whalen, Barbara A.

    Reports submitted to the National Center for Education Statistics provided data for this compilation and tabulation of data on adult participants in U.S. educational programs in fiscal year 1976. In the summary section introducing the charts, it is noted that adult education programs funded under P.L. 91-230 served over 1.6 million persons--an…

  9. The Education Almanac, 1987-1988. Facts and Figures about Our Nation's System of Education. Third Edition.

    ERIC Educational Resources Information Center

    Goodman, Leroy V., Ed.

    This is the third edition of the Education Almanac, an assemblage of statistics, facts, commentary, and basic background information about the conduct of schools in the United States. Features of this variegated volume include an introductory section on "Education's Newsiest Developments," followed by some vital educational statistics, a set of…

  10. 29 CFR 5.24 - The basic hourly rate of pay.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 1 2012-07-01 2012-07-01 false The basic hourly rate of pay. 5.24 Section 5.24 Labor... Provisions of the Davis-Bacon Act § 5.24 The basic hourly rate of pay. “The basic hourly rate of pay” is that part of a laborer's or mechanic's wages which the Secretary of Labor would have found and included in...

  11. 29 CFR 5.24 - The basic hourly rate of pay.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true The basic hourly rate of pay. 5.24 Section 5.24 Labor Office... Provisions of the Davis-Bacon Act § 5.24 The basic hourly rate of pay. “The basic hourly rate of pay” is that part of a laborer's or mechanic's wages which the Secretary of Labor would have found and included in...

  12. 29 CFR 5.24 - The basic hourly rate of pay.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 1 2011-07-01 2011-07-01 false The basic hourly rate of pay. 5.24 Section 5.24 Labor... Provisions of the Davis-Bacon Act § 5.24 The basic hourly rate of pay. “The basic hourly rate of pay” is that part of a laborer's or mechanic's wages which the Secretary of Labor would have found and included in...

  13. 29 CFR 5.24 - The basic hourly rate of pay.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 1 2014-07-01 2013-07-01 true The basic hourly rate of pay. 5.24 Section 5.24 Labor Office... Provisions of the Davis-Bacon Act § 5.24 The basic hourly rate of pay. “The basic hourly rate of pay” is that part of a laborer's or mechanic's wages which the Secretary of Labor would have found and included in...

  14. 29 CFR 5.24 - The basic hourly rate of pay.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 1 2013-07-01 2013-07-01 false The basic hourly rate of pay. 5.24 Section 5.24 Labor... Provisions of the Davis-Bacon Act § 5.24 The basic hourly rate of pay. “The basic hourly rate of pay” is that part of a laborer's or mechanic's wages which the Secretary of Labor would have found and included in...

  15. Development of a Mechanism and Standards for the Assessment of Adult Basic Education Students as They Relate to Post-Secondary Vocational Education Programs. Final Report.

    ERIC Educational Resources Information Center

    Grosskoph, Arlys; And Others

    The purpose of this project was to develop a process that would reduce the attrition rate of adult basic education students entering occupational programs. To accomplish this goal, adult basic education students in occupational programs, adult basic education students who had dropped out of occupational programs, and their instructors were…

  16. Theory and computation of non-RRKM lifetime distributions and rates in chemical systems with three or more degrees of freedom

    NASA Astrophysics Data System (ADS)

    Gabern, Frederic; Koon, Wang S.; Marsden, Jerrold E.; Ross, Shane D.

    2005-11-01

    The computation, starting from basic principles, of chemical reaction rates in realistic systems (with three or more degrees of freedom) has been a longstanding goal of the chemistry community. Our current work, which merges tube dynamics with Monte Carlo methods, provides some key theoretical and computational tools for achieving this goal. We use basic tools of dynamical systems theory, merging the ideas of Koon et al. [W.S. Koon, M.W. Lo, J.E. Marsden, S.D. Ross, Heteroclinic connections between periodic orbits and resonance transitions in celestial mechanics, Chaos 10 (2000) 427-469] and De Leon et al. [N. De Leon, M.A. Mehta, R.Q. Topper, Cylindrical manifolds in phase space as mediators of chemical reaction dynamics and kinetics. I. Theory, J. Chem. Phys. 94 (1991) 8310-8328], particularly the use of invariant manifold tubes that mediate the reaction, into a tool for the computation of lifetime distributions and rates of chemical reactions and scattering phenomena, even in systems that exhibit non-statistical behavior. Previously, the main problem with the application of tube dynamics has been the computation of volumes in phase spaces of high dimension. The present work provides a starting point for overcoming this hurdle with some new ideas and implements them numerically. Specifically, an algorithm that uses tube dynamics to provide the initial bounding box for a Monte Carlo volume determination is used. The combination of a fine-scale method for determining the phase space structure (invariant manifold theory) with statistical methods for volume computations (Monte Carlo) is the main contribution of this paper. The methodology is applied here to a three-degree-of-freedom model problem and may be useful for higher-degree-of-freedom systems as well.
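
    The Monte Carlo volume step can be illustrated on a region whose volume is known in closed form: sample uniformly inside a bounding box and count hits. Here the unit ball in R^3 stands in for the phase-space region, and the box plays the role of the initial bounding box that tube dynamics would supply:

```python
import random

def mc_volume(inside, bounds, n, seed=0):
    """Estimate the volume of a region by rejection sampling in a bounding box."""
    rng = random.Random(seed)
    box_vol = 1.0
    for lo, hi in bounds:
        box_vol *= hi - lo                               # volume of the bounding box
    hits = sum(inside([rng.uniform(lo, hi) for lo, hi in bounds])
               for _ in range(n))                        # points landing inside the region
    return box_vol * hits / n

# Unit ball in R^3 (true volume 4*pi/3 ~ 4.18879) inside the box [-1, 1]^3.
vol = mc_volume(lambda p: sum(c * c for c in p) <= 1.0,
                [(-1.0, 1.0)] * 3, 200_000)
```

    The statistical error shrinks as 1/sqrt(n) regardless of dimension, which is why Monte Carlo is attractive for the high-dimensional phase-space volumes the paper targets.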

  17. Normal saline solution nasal-pharyngeal irrigation improves chronic cough associated with allergic rhinitis.

    PubMed

    Lin, Lin; Chen, Zhongchun; Cao, Yitan; Sun, Guangbin

    2017-03-01

    Upper airway inflammation is one of the most commonly identified causes of chronic cough, although the underlying mechanism is not clear. This study compared normal saline solution nasal-pharyngeal irrigation (NSNPI) and fluticasone propionate nasal spray (FPNS) treatment for chronic cough associated with allergic rhinitis (AR). Patients with suspected AR to house-dust mite were enrolled; the symptom of cough was assessed by a cough symptom score and the Leicester Cough Questionnaire, and cough response to capsaicin was evaluated. AR was assessed by using the visual analog scale (VAS) and the Mini Juniper Rhinoconjunctivitis Quality of Life Questionnaire (MiniRQLQ). Mediators, including histamine, leukotriene C4, and prostaglandin D2, and the major basic protein from nasal lavage fluid (NLF) were examined. The patients were treated with NSNPI (the NSNPI group) or FPNS (the FPNS group) for 30 days, after which they were reassessed. Forty-five of 50 patients completed this study. The cough symptom score, the Leicester Cough Questionnaire score, and the capsaicin cough threshold all improved significantly after NSNPI but did not change after FPNS. There were statistically significant changes in the evaluations of the MiniRQLQ and the mediators, including histamine and leukotriene C4, in the NLF in the NSNPI group. However, significant changes were found in the assessments of VAS, MiniRQLQ, and all of the above mediators, including histamine, leukotriene C4, prostaglandin D2, and the major basic protein, in the NLF of the FPNS group. Furthermore, the assessments of VAS and all the mediators were reduced more in the FPNS group than in the NSNPI group. The patients with suspected AR to house-dust mite reported better relief of cough symptoms after 30 days of treatment with NSNPI than after the nasal corticosteroid.

  18. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal of enhancing the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined, a governing structure was organized, and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.
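
    As a flavor of the basic estimators such articles cover, here is a sketch of the standard Aki/Utsu maximum-likelihood b-value estimator for a magnitude catalog. The function name and defaults are my assumptions; this is not CORSSA code.

```python
import math

def b_value_mle(mags, m_c, dm=0.0):
    """Aki/Utsu maximum-likelihood b-value for magnitudes >= m_c.
    For a catalog binned at magnitude width dm, pass dm to apply
    the usual m_c - dm/2 correction."""
    m = [x for x in mags if x >= m_c]
    mean_excess = sum(m) / len(m) - (m_c - dm / 2.0)
    return math.log10(math.e) / mean_excess
```

    For a Gutenberg-Richter catalog with b = 1, magnitudes above the completeness level are exponentially distributed with mean excess log10(e)/b, roughly 0.434, so the estimator recovers b near 1 on such data.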

  19. Predicting Success in Psychological Statistics Courses.

    PubMed

    Lester, David

    2016-06-01

    Many students perform poorly in courses on psychological statistics, and it is useful to be able to predict which students will have difficulties. In a study of 93 undergraduates enrolled in Statistical Methods (18 men, 75 women; M age = 22.0 years, SD = 5.1), performance was significantly associated with sex (female students performed better) and with proficiency in algebra in a linear regression analysis. Anxiety about statistics was not associated with course performance, indicating that basic mathematical skills are the best correlate of performance in statistics courses and could be used to stream students into classes by ability. © The Author(s) 2016.
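
    A regression of this shape can be sketched with ordinary least squares. The numbers below are invented synthetic stand-ins, not the study's data; only the sample size of 93 comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 93                                   # sample size from the study
female = rng.integers(0, 2, n)           # 1 = female (synthetic)
algebra = rng.normal(70.0, 10.0, n)      # algebra proficiency score (synthetic)
grade = 20 + 5 * female + 0.6 * algebra + rng.normal(0.0, 5.0, n)

# Course grade ~ intercept + sex + algebra proficiency
X = np.column_stack([np.ones(n), female, algebra])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)
# beta[1] and beta[2] estimate the sex and algebra effects
```

    Significance of each coefficient would then be judged from its standard error, as in the study's analysis.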

  20. Special Features of Galactic Dynamics

    NASA Astrophysics Data System (ADS)

    Efthymiopoulos, Christos; Voglis, Nikos; Kalapotharakos, Constantinos

    This is an introductory article to some basic notions and currently open problems of galactic dynamics. The focus is on topics mostly relevant to the so-called `new methods' of celestial mechanics or Hamiltonian dynamics, as applied to the ellipsoidal components of galaxies, i.e., to the elliptical galaxies and to the dark halos and bulges of disk galaxies. Traditional topics such as Jeans theorem, the role of a `third integral' of motion, Nekhoroshev theory, violent relaxation, and the statistical mechanics of collisionless stellar systems are first discussed. The emphasis is on modern extrapolations of these old topics. Recent results from orbital and global dynamical studies of galaxies are then briefly reviewed. The role of various families of orbits in supporting self-consistency, as well as the role of chaos in galaxies, are stressed. A description is then given of the main numerical techniques of integration of the N-body problem in the framework of stellar dynamics and of the results obtained via N-body experiments. A final topic is the secular evolution and self-organization of galactic systems.

  1. Moderate point: Balanced entropy and enthalpy contributions in soft matter

    NASA Astrophysics Data System (ADS)

    He, Baoji; Wang, Yanting

    2017-03-01

    Various soft materials share some common features, such as significant entropic effect, large fluctuations, sensitivity to thermodynamic conditions, and mesoscopic characteristic spatial and temporal scales. However, no quantitative definitions have yet been provided for soft matter, and the intrinsic mechanisms leading to their common features are unclear. In this work, from the viewpoint of statistical mechanics, we show that soft matter works in the vicinity of a specific thermodynamic state named moderate point, at which entropy and enthalpy contributions among substates along a certain order parameter are well balanced or have a minimal difference. Around the moderate point, the order parameter fluctuation, the associated response function, and the spatial correlation length maximize, which explains the large fluctuation, the sensitivity to thermodynamic conditions, and mesoscopic spatial and temporal scales of soft matter, respectively. Possible applications to switching chemical bonds or allosteric biomachines determining their best working temperatures are also briefly discussed. Project supported by the National Basic Research Program of China (Grant No. 2013CB932804) and the National Natural Science Foundation of China (Grant Nos. 11274319 and 11421063).

  2. Effect of mechanical denaturation on surface free energy of protein powders.

    PubMed

    Mohammad, Mohammad Amin; Grimsey, Ian M; Forbes, Robert T; Blagbrough, Ian S; Conway, Barbara R

    2016-10-01

    Globular proteins are important both as therapeutic agents and excipients. However, their fragile native conformations can be denatured during pharmaceutical processing, which leads to modification of the surface energy of their powders and hence their performance. Lyophilized powders of hen egg-white lysozyme and β-galactosidase from Aspergillus oryzae were used as models to study the effects of mechanical denaturation on the surface energies of basic and acidic protein powders, respectively. Their mechanical denaturation upon milling was confirmed by the absence of their thermal unfolding transition phases and by the changes in their secondary and tertiary structures. Inverse gas chromatography detected differences between both unprocessed protein powders and the changes induced by their mechanical denaturation. The surfaces of the acidic and basic protein powders were relatively basic; however, the surface acidity of β-galactosidase was higher than that of lysozyme. Also, the surface of β-galactosidase powder had a higher dispersive energy than that of lysozyme. The mechanical denaturation decreased the dispersive energy and the basicity of the surfaces of both protein powders. The amino acid composition and molecular conformation of the proteins explained the surface energy data measured by inverse gas chromatography. The biological activity of mechanically denatured protein powders can either be reversible (lysozyme) or irreversible (β-galactosidase) upon hydration. Our surface data can be exploited to understand and predict the performance of protein powders within pharmaceutical dosage forms. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Non-equilibrium statistical mechanics theory for the large scales of geophysical flows

    NASA Astrophysics Data System (ADS)

    Eric, S.; Bouchet, F.

    2010-12-01

    The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, Random changes of flow topology in two-dimensional and geophysical turbulence, Physical Review Letters 102 (2009), no. 9, 094504; F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207; A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. [Figure: time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component for the 2D Navier-Stokes equations with stochastic forcing, showing bistability between dipole and unidirectional flows; this non-equilibrium phase transition is predicted by statistical mechanics.]

  4. Facts about Folic Acid

    MedlinePlus


  5. Osteoarthritis year in review 2014: mechanics--basic and clinical studies in osteoarthritis.

    PubMed

    Moyer, R F; Ratneswaran, A; Beier, F; Birmingham, T B

    2014-12-01

    The purpose of this review was to highlight recent research in mechanics and osteoarthritis (OA) by summarizing results from selected studies spanning basic and clinical research methods. Databases were searched from January 2013 through to March 2014. Working in pairs, reviewers selected 67 studies categorized into four themes--mechanobiology, ambulatory mechanics, biomechanical interventions and mechanical risk factors. Novel developments in mechanobiology included the identification of cell signaling pathways that mediated cellular responses to loading of articular cartilage. Studies in ambulatory mechanics included an increased focus on instrumented knee implants and progress in computational models, both emphasizing the importance of muscular contributions to load. Several proposed biomechanical interventions (e.g., shoe insoles and knee braces) produced variable changes in external knee joint moments during walking, while meta-analysis of randomized clinical trials did not support the use of lateral wedge insoles for decreasing pain. Results from high quality randomized trials suggested diet with or without exercise decreased indicators of knee joint load during walking, whereas similar effects from exercise alone were not detected with the measures used. Data from longitudinal cohorts suggested mechanical alignment was a risk factor for incidence and progression of OA, with the mechanism involving damage to the meniscus. In combination, the basic and clinical studies highlight the importance of considering multiple contributors to joint loading that can evoke both protective and damaging responses. Although challenges clearly exist, future studies should strive to integrate basic and clinical research methods to gain a greater understanding of the interactions among mechanical factors in OA and to develop improved preventive and therapeutic strategies. Copyright © 2014 Osteoarthritis Research Society International. Published by Elsevier Ltd. 
All rights reserved.

  6. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamical properties of the classical ideal gas and of a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase-space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamical properties of black-body radiation, study Bose-Einstein statistics with the related problem of condensation, and study Fermi-Dirac statistics.

  7. A Comparison of Techniques for Camera Selection and Hand-Off in a Video Network

    NASA Astrophysics Data System (ADS)

    Li, Yiming; Bhanu, Bir

    Video networks are becoming increasingly important for solving many real-world problems. Multiple video sensors require collaboration when performing various tasks. One of the most basic tasks is the tracking of objects, which requires mechanisms to select a camera for a certain object and hand-off this object from one camera to another so as to accomplish seamless tracking. In this chapter, we provide a comprehensive comparison of current and emerging camera selection and hand-off techniques. We consider geometry-, statistics-, and game theory-based approaches and provide both theoretical and experimental comparison using centralized and distributed computational models. We provide simulation and experimental results using real data for various scenarios of a large number of cameras and objects for in-depth understanding of strengths and weaknesses of these techniques.

  8. Sorting of Streptomyces Cell Pellets Using a Complex Object Parametric Analyzer and Sorter

    PubMed Central

    Petrus, Marloes L. C.; van Veluw, G. Jerre; Wösten, Han A. B.; Claessen, Dennis

    2014-01-01

    Streptomycetes are filamentous soil bacteria that are used in industry for the production of enzymes and antibiotics. When grown in bioreactors, these organisms form networks of interconnected hyphae, known as pellets, which are heterogeneous in size. Here we describe a method to analyze and sort mycelial pellets using a Complex Object Parametric Analyzer and Sorter (COPAS). Detailed instructions are given for the use of the instrument and the basic statistical analysis of the data. We furthermore describe how pellets can be sorted according to user-defined settings, which enables downstream processing such as the analysis of the RNA or protein content. Using this methodology, the mechanisms underlying heterogeneous growth can be tackled. This will be instrumental for improving streptomycetes as a cell factory, given that productivity correlates with pellet size. PMID:24561666

  9. Seeing is believing: Watching entangled sculpted branched DNA in real time

    NASA Astrophysics Data System (ADS)

    Jee, Ah-Young; Guan, Juan; Chen, Kejia; Granick, Steve

    2015-03-01

    The importance of branching in polymer physics is universally accepted, but the details are disputed. We have sculpted DNA to various degrees of branching and used single-molecule tracking to image its diffusion in real time when entangled. By ligating three identical or varying-length DNA segments, we construct symmetric and asymmetric 'Y' branches from elements of lambda-DNA with 16 μm contour length, allowing for single-molecule visualization of equilibrium dynamics. Using home-written software, a full statistical distribution based on at least hundreds of trajectories is quantified, with a focus on discriminating arm retraction from branch-point motion. Some of our observations are consistent with the anticipated 'relaxation through arm retraction' mechanism, but other observations do not appear to be anticipated theoretically.

  10. Vibration and Noise in Magnetic Resonance Imaging of the Vocal Tract: Differences between Whole-Body and Open-Air Devices.

    PubMed

    Přibil, Jiří; Přibilová, Anna; Frollo, Ivan

    2018-04-05

    This article compares open-air and whole-body magnetic resonance imaging (MRI) equipment working with a weak magnetic field as regards the methods of its generation, the spectral properties of the mechanical vibration and acoustic noise produced by gradient coils during the scanning process, and the measured noise intensity. These devices are used for non-invasive MRI reconstruction of the human vocal tract during phonation with simultaneous speech recording. In this case, the vibration and noise have a negative influence on the quality of the speech signal. Two basic measurement experiments were performed in this study: mapping sound pressure levels in the MRI device vicinity and picking up vibration and noise signals in the MRI scanning area. Spectral characteristics of these signals are then analyzed statistically and compared visually and numerically.
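
    The spectral-characterization step can be sketched with a plain FFT periodogram. The sampling rate and the 1 kHz tone below are invented stand-ins for a gradient-coil noise recording, not values from the article.

```python
import numpy as np

fs = 8000.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic noise recording: a 1 kHz tonal component plus broadband noise
x = np.sin(2 * np.pi * 1000.0 * t) + 0.1 * rng.normal(size=t.size)

window = np.hanning(x.size)                  # taper to reduce spectral leakage
power = np.abs(np.fft.rfft(x * window))**2   # one-sided power spectrum
freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
peak_hz = freqs[np.argmax(power)]            # dominant spectral component
```

    Comparing such spectra between the open-air and whole-body devices is the kind of numerical comparison the abstract describes.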

  11. Social Ecology, Genomics, and African American Health: A Nonlinear Dynamical Perspective

    PubMed Central

    Madhere, Serge; Harrell, Jules; Royal, Charmaine D. M.

    2009-01-01

    This article offers a model that clarifies the degree of interdependence between social ecology and genomic processes. Drawing on principles from nonlinear dynamics, the model delineates major lines of bifurcation involving people's habitat, their family health history, and collective catastrophes experienced by their community. It shows how mechanisms of resource acquisition, depletion, and preservation can lead to disruptions in basic metabolism and in the activity of cytokines, neurotransmitters, and protein kinases, thus giving impetus to epigenetic changes. The hypotheses generated from the model are discussed throughout the article for their relevance to health problems among African Americans. Where appropriate, they are examined in light of data from the National Vital Statistics System. Multiple health outcomes are considered. For any one of them, the model makes clear the unique and converging contributions of multiple antecedent factors. PMID:19672481

  12. Hotspots for allosteric regulation on protein surfaces

    PubMed Central

    Reynolds, Kimberly A.; McLaughlin, Richard N.; Ranganathan, Rama

    2012-01-01

    Recent work indicates a general architecture for proteins in which sparse networks of physically contiguous and co-evolving amino acids underlie basic aspects of structure and function. These networks, termed sectors, are spatially organized such that active sites are linked to many surface sites distributed throughout the structure. Using the metabolic enzyme dihydrofolate reductase as a model system, we show that (1) the sector is strongly correlated to a network of residues undergoing millisecond conformational fluctuations associated with enzyme catalysis and (2) sector-connected surface sites are statistically preferred locations for the emergence of allosteric control in vivo. Thus, sectors represent an evolutionarily conserved “wiring” mechanism that can enable perturbations at specific surface positions to rapidly initiate conformational control over protein function. These findings suggest that sectors enable the evolution of intermolecular communication and regulation. PMID:22196731

  13. Bimanual coordination of bowing and fingering in violinists--effects of position changes and string changes.

    PubMed

    Kazennikov, Oleg; Wiesendanger, Mario

    2009-07-01

    Music performance is based on demanding motor control with much practice from young age onward. We have chosen to investigate basic bimanual movements played by violin amateurs and professionals. We posed the question whether position and string changes, two frequent mechanisms, may influence the time interval bowing (right)-fingering (left) coordination. The objective was to measure bimanual coordination, i.e., with or without position changes and string changes. The tendency was that the bimanual coordination was statistically only slightly increased or even unchanged but not perceptible. We conclude that the coordination index is limited up to 100 ms intervals, without any erroneous perception. Although the mentioned position changes and string changes are movements with their timing, they are executed in parallel rather than in series with the bow-fingering coordination.

  14. SEDIDAT: A BASIC program for the collection and statistical analysis of particle settling velocity data

    NASA Astrophysics Data System (ADS)

    Wright, Robyn; Thornberg, Steven M.

    SEDIDAT is a series of compiled IBM-BASIC (version 2.0) programs that direct the collection, statistical calculation, and graphic presentation of particle settling velocity and equivalent spherical diameter for samples analyzed using the settling tube technique. The programs follow a menu-driven format that is easily understood by students and scientists with little previous computer experience. Settling velocity is measured directly (cm/sec) and also converted into Chi units. Equivalent spherical diameter (reported in Phi units) is calculated using a modified Gibbs equation for different particle densities. Input parameters, such as water temperature, settling distance, particle density, run time, and Phi/Chi interval, are changed easily at operator discretion. Optional output to a dot-matrix printer includes a summary of moment and graphic statistical parameters, a tabulation of individual and cumulative weight percents, a listing of major distribution modes, and cumulative and histogram plots of raw time, settling-velocity, Chi, and Phi data.
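
    The two unit conversions in the abstract can be written out directly. The Phi scale is standard (phi = -log2 of diameter in mm); treating Chi as the analogous -log2 of settling velocity in cm/s is my assumption about SEDIDAT's convention, and the function names are invented.

```python
import math

def phi_from_mm(d_mm):
    """Equivalent spherical diameter in Phi units: phi = -log2(d / 1 mm)."""
    return -math.log2(d_mm)

def chi_from_velocity(w_cm_s):
    """Settling velocity in Chi units, assumed to be chi = -log2(w / 1 cm/s)."""
    return -math.log2(w_cm_s)

phi_from_mm(0.25)   # a 0.25 mm grain (fine sand) is 2.0 phi
```
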

  15. [Comment on] Statistical discrimination

    NASA Astrophysics Data System (ADS)

    Chinn, Douglas

    In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.

  16. Many-Body Localization and Thermalization in Quantum Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Nandkishore, Rahul; Huse, David A.

    2015-03-01

    We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.

  17. Exploring cognitive integration of basic science and its effect on diagnostic reasoning in novices.

    PubMed

    Lisk, Kristina; Agur, Anne M R; Woods, Nicole N

    2016-06-01

    Integration of basic and clinical science knowledge is increasingly being recognized as important for practice in the health professions. The concept of 'cognitive integration' places emphasis on the value of basic science in providing critical connections to clinical signs and symptoms while accounting for the fact that clinicians may not spontaneously articulate their use of basic science knowledge in clinical reasoning. In this study we used a diagnostic justification test to explore the impact of integrated basic science instruction on novices' diagnostic reasoning process. Participants were allocated to an integrated basic science or clinical science training group. The integrated basic science group was taught the clinical features along with the underlying causal mechanisms of four musculoskeletal pathologies while the clinical science group was taught only the clinical features. Participants completed a diagnostic accuracy test immediately after initial learning, and one week later a diagnostic accuracy and justification test. The results showed that novices who learned the integrated causal mechanisms had superior diagnostic accuracy and better understanding of the relative importance of key clinical features. These findings further our understanding of cognitive integration by providing evidence of the specific changes in clinical reasoning when basic and clinical sciences are integrated during learning.

  18. Proceedings of the NASTRAN (Tradename) Users’ Colloquium (15th) Held in Kansas City, Missouri on 4-8 May 1987

    DTIC Science & Technology

    1987-08-01

    HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are...800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These...resonances may be obtained by using a finer frequency increment. Statistical Energy Analysis The basic assumption used in SEA analysis is that within each band

  19. Columbia/Willamette Skill Builders Consortium. Final Performance Report. Appendix 5B Anodizing Inc. (Aluminum Extrusion Manufacturing). Basic Measurement Math. Instructors' Reports and Sample Curriculum Materials.

    ERIC Educational Resources Information Center

    Taylor, Marjorie; And Others

    Anodizing, Inc., Teamsters Local 162, and Mt. Hood Community College (Oregon) developed a workplace literacy program for workers at Anodizing. These workers did not have the basic skill competencies to benefit from company training efforts in statistical process control and quality assurance and were not able to advance to lead and supervisory…

  20. Opportunities Unlimited: Minnesota Indians Adult Basic Education; Narrative and Statistical Evaluation Third Year 1971-72, with a Review of the First and Second Years.

    ERIC Educational Resources Information Center

    Vizenor, Gerald

    Opportunities Unlimited is a State-wide program to provide adult basic education (ABE) and training for Indians on Minnesota reservations and in Indian communities. An administrative center in Bemidji serves communities on the Red Lake, White Earth, and Leech Lake Reservations, and a Duluth center provides ABE and training for communities on the…

  1. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  2. The Relationships between the Iowa Test of Basic Skills and the Washington Assessment of Student Learning in the State of Washington. Technical Report.

    ERIC Educational Resources Information Center

    Joireman, Jeff; Abbott, Martin L.

    This report examines the overlap between student test results on the Iowa Test of Basic Skills (ITBS) and the Washington Assessment of Student Learning (WASL). The two tests were compared and contrasted in terms of content and measurement philosophy, and analyses studied the statistical relationship between the ITBS and the WASL. The ITBS assesses…

  3. Fundamentals in Biostatistics for Research in Pediatric Dentistry: Part I - Basic Concepts.

    PubMed

    Garrocho-Rangel, J A; Ruiz-Rodríguez, M S; Pozos-Guillén, A J

    The purpose of this report was to provide the reader with some basic concepts in order to better understand the significance and reliability of the results of any article on Pediatric Dentistry. Currently, Pediatric Dentists need the best evidence available in the literature on which to base their diagnoses and treatment decisions for the children's oral care. Basic understanding of Biostatistics plays an important role during the entire Evidence-Based Dentistry (EBD) process. This report describes Biostatistics fundamentals in order to introduce the basic concepts used in statistics, such as summary measures, estimation, hypothesis testing, effect size, level of significance, p value, confidence intervals, etc., which are available to Pediatric Dentists interested in reading or designing original clinical or epidemiological studies.

  4. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    Research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly... probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S... Deformation, Dynamic Response Analysis, Seepage, Soil Permeability and Piping, Earthquake Engineering, Seismology, Settlement and Heave, Seismic Risk Analysis

  5. Computer programs for computing particle-size statistics of fluvial sediments

    USGS Publications Warehouse

    Stevens, H.H.; Hubbell, D.W.

    1986-01-01

    Two versions of computer programs for inputting data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 versions are for use on the Prime computer, and the BASIC versions are for use on microcomputers. The size-statistics program computes Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from the input size and percent-finer data. The program also determines the percentages of gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
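As a hedged illustration of the graphic statistics the abstract names, the Inman and Folk parameters can be computed from phi values at the standard percent-finer percentiles. The formulas below are the usual textbook definitions (Inman 1952; Folk and Ward 1957), not the programs' actual source, and the percentile keys and sample values are assumptions; the Trask quartile measure is omitted for brevity.

```python
def inman_folk_stats(phi):
    """Graphic size statistics from phi values at the standard
    percent-finer percentiles (keys assumed: 5, 16, 50, 84, 95)."""
    p5, p16, p50, p84, p95 = (phi[k] for k in (5, 16, 50, 84, 95))
    stats = {
        "median": p50,                          # Inman median, Md_phi
        "mean_inman": (p16 + p84) / 2,          # Inman graphic mean
        "sorting_inman": (p84 - p16) / 2,       # Inman phi deviation measure
        "mean_folk": (p16 + p50 + p84) / 3,     # Folk & Ward graphic mean
        # Folk's inclusive graphic standard deviation:
        "sorting_folk": (p84 - p16) / 4 + (p95 - p5) / 6.6,
    }
    stats["skew_inman"] = (stats["mean_inman"] - stats["median"]) / stats["sorting_inman"]
    return stats

# Hypothetical percent-finer phi values for a symmetric size distribution:
sample = {5: 0.0, 16: 1.0, 50: 2.0, 84: 3.0, 95: 4.0}
stats = inman_folk_stats(sample)
```

For a symmetric distribution like this one, the Inman skewness comes out zero and the Inman and Folk means coincide with the median.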

  6. Learning Genetics with Paper Pets

    ERIC Educational Resources Information Center

    Finnerty, Valerie Raunig

    2006-01-01

    By the end of the eighth grade, students are expected to have a basic understanding of the mechanisms of genetic inheritance. However, these concepts can be difficult to teach. In this article, the author introduces a new learning tool that will help facilitate student learning of, and enthusiasm for, the basic concepts of genetic inheritance. This…

  7. Statistical/Documentary Report, 1974 and 1975 Assessments of 17-Year-Old Students, Summary Volume; Functional Literacy Basic Reading Performance.

    ERIC Educational Resources Information Center

    Gadway, Charles J.; Wilson, H.A.

    This document provides statistical data on the 1974 and 1975 Mini-Assessment of Functional Literacy, which was designed to determine the extent of functional literacy among seventeen year olds in America. Also presented are data from comparable test items from the 1971 assessment. Three standards are presented, to allow different methods of…

  8. Effects of an Instructional Gaming Characteristic on Learning Effectiveness, Efficiency, and Engagement: Using a Storyline for Teaching Basic Statistical Skills

    ERIC Educational Resources Information Center

    Novak, Elena; Johnson, Tristan E.; Tenenbaum, Gershon; Shute, Valerie J.

    2016-01-01

    The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. A storyline is a game-design element that connects scenes with the educational content. In order to…

  9. Examining Agreement and Longitudinal Stability among Traditional and RTI-Based Definitions of Reading Disability Using the Affected-Status Agreement Statistic

    ERIC Educational Resources Information Center

    Waesche, Jessica S. Brown; Schatschneider, Christopher; Maner, Jon K.; Ahmed, Yusra; Wagner, Richard K.

    2011-01-01

    Rates of agreement among alternative definitions of reading disability and their 1- and 2-year stabilities were examined using a new measure of agreement, the affected-status agreement statistic. Participants were 288,114 first through third grade students. Reading measures were "Dynamic Indicators of Basic Early Literacy Skills" Oral…

  10. Elementary Preservice Teachers' Reasoning about Modeling a "Family Factory" with TinkerPlots--A Pilot Study

    ERIC Educational Resources Information Center

    Biehler, Rolf; Frischemeier, Daniel; Podworny, Susanne

    2017-01-01

    Connecting data and chance is fundamental in statistics curricula. The use of software like TinkerPlots can bridge both worlds because the TinkerPlots Sampler supports learners in expressive modeling. We conducted a study with elementary preservice teachers with a basic university education in statistics. They were asked to set up and evaluate…

  11. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  12. Effects of an Instructional Gaming Characteristic on Learning Effectiveness, Efficiency, and Engagement: Using a Storyline to Teach Basic Statistical Analytical Skills

    ERIC Educational Resources Information Center

    Novak, Elena

    2012-01-01

    The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. In addition, the study focused on examining the effects of a storyline GC on specific learning…

  13. Biological and mechanical interplay at the Macro- and Microscales Modulates the Cell-Niche Fate.

    PubMed

    Sarig, Udi; Sarig, Hadar; Gora, Aleksander; Krishnamoorthi, Muthu Kumar; Au-Yeung, Gigi Chi Ting; de-Berardinis, Elio; Chaw, Su Yin; Mhaisalkar, Priyadarshini; Bogireddi, Hanumakumar; Ramakrishna, Seeram; Boey, Freddy Yin Chiang; Venkatraman, Subbu S; Machluf, Marcelle

    2018-03-02

    Tissue development, regeneration, or de novo tissue engineering in vitro are based on reciprocal cell-niche interactions. Early tissue formation mechanisms, however, remain largely unknown, given the complex in vivo multifactoriality and the limited tools available to effectively characterize and correlate specific micro-scale bio-mechanical interplay. We developed a unique model system, based on decellularized porcine cardiac extracellular matrices (pcECMs) - as a representative natural soft-tissue biomaterial - to study a spectrum of common cell-niche interactions. Model monocultures and 1:1 co-cultures on the pcECM of human umbilical vein endothelial cells (HUVECs) and human mesenchymal stem cells (hMSCs) were mechano-biologically characterized using macro-scale (Instron) and micro-scale (AFM) mechanical testing, histology, SEM, and molecular biology (RT-PCR arrays). The obtained data were analyzed using purpose-built statistics, principal component, and gene-set analysis tools. Our results indicated biomechanical cell-type dependency, bi-modal elasticity distributions at the micron-scale cell-ECM interaction level, and correspondingly differing gene expression profiles. We further show that hMSCs remodel the ECM, HUVECs enable ECM tissue-specific recognition, and their co-cultures synergistically contribute to tissue integration - mimicking conserved developmental pathways. We also suggest novel quantifiable measures as indicators of tissue assembly and integration. This work may benefit basic and translational research in materials science, developmental biology, tissue engineering, regenerative medicine, and cancer biomechanics.

  14. Avalanches and generalized memory associativity in a network model for conscious and unconscious mental functioning

    NASA Astrophysics Data System (ADS)

    Siddiqui, Maheen; Wedemann, Roseli S.; Jensen, Henrik Jeldtoft

    2018-01-01

    We explore statistical characteristics of avalanches associated with the dynamics of a complex-network model, where two modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's ideas regarding the neuroses and that consciousness is related to symbolic and linguistic memory activity in the brain. It incorporates the Stariolo-Tsallis generalization of the Boltzmann Machine in order to model memory retrieval and associativity. In the present work, we define and measure avalanche size distributions during memory retrieval, in order to gain insight regarding basic aspects of the functioning of these complex networks. The avalanche sizes defined for our model should be related to the time consumed and also to the size of the neuronal region which is activated during memory retrieval. This allows the qualitative comparison of the behaviour of the distribution of cluster sizes, obtained during fMRI measurements of the propagation of signals in the brain, with the distribution of avalanche sizes obtained in our simulation experiments. This comparison corroborates the indication that the Nonextensive Statistical Mechanics formalism may indeed be better suited to model the complex networks which constitute brain and mental structure.

  15. The evolution of continuous learning of the structure of the environment

    PubMed Central

    Kolodny, Oren; Edelman, Shimon; Lotem, Arnon

    2014-01-01

    Continuous, ‘always on’, learning of structure from a stream of data is studied mainly in the fields of machine learning or language acquisition, but its evolutionary roots may go back to the first organisms that were internally motivated to learn and represent their environment. Here, we study under what conditions such continuous learning (CL) may be more adaptive than simple reinforcement learning and examine how it could have evolved from the same basic associative elements. We use agent-based computer simulations to compare three learning strategies: simple reinforcement learning; reinforcement learning with chaining (RL-chain) and CL that applies the same associative mechanisms used by the other strategies, but also seeks statistical regularities in the relations among all items in the environment, regardless of the initial association with food. We show that a sufficiently structured environment favours the evolution of both RL-chain and CL and that CL outperforms the other strategies when food is relatively rare and the time for learning is limited. This advantage of internally motivated CL stems from its ability to capture statistical patterns in the environment even before they are associated with food, at which point they immediately become useful for planning. PMID:24402920

  16. Numerical investigation of kinetic turbulence in relativistic pair plasmas - I. Turbulence statistics

    NASA Astrophysics Data System (ADS)

    Zhdankin, Vladimir; Uzdensky, Dmitri A.; Werner, Gregory R.; Begelman, Mitchell C.

    2018-02-01

    We describe results from particle-in-cell simulations of driven turbulence in collisionless, magnetized, relativistic pair plasma. This physical regime provides a simple setting for investigating the basic properties of kinetic turbulence and is relevant for high-energy astrophysical systems such as pulsar wind nebulae and astrophysical jets. In this paper, we investigate the statistics of turbulent fluctuations in simulations on lattices of up to 1024³ cells and containing up to 2 × 10¹¹ particles. Due to the absence of a cooling mechanism in our simulations, turbulent energy dissipation reduces the magnetization parameter to order unity within a few dynamical times, causing turbulent motions to become sub-relativistic. In the developed stage, our results agree with predictions from magnetohydrodynamic turbulence phenomenology at inertial-range scales, including a power-law magnetic energy spectrum with index near -5/3, scale-dependent anisotropy of fluctuations described by critical balance, lognormal distributions for particle density and internal energy density (related by a 4/3 adiabatic index, as predicted for an ultra-relativistic ideal gas), and the presence of intermittency. We also present possible signatures of a kinetic cascade by measuring power-law spectra for the magnetic, electric and density fluctuations at sub-Larmor scales.

  17. Generalization of symmetric α-stable Lévy distributions for q >1

    NASA Astrophysics Data System (ADS)

    Umarov, Sabir; Tsallis, Constantino; Gell-Mann, Murray; Steinberg, Stanly

    2010-03-01

    The α-stable distributions introduced by Lévy play an important role in probabilistic theoretical studies and their various applications, e.g., in statistical physics, life sciences, and economics. In the present paper we study sequences of long-range dependent random variables whose distributions have asymptotic power-law decay, and which are called (q,α)-stable distributions. These sequences are generalizations of independent and identically distributed α-stable distributions and have not been previously studied. Long-range dependent (q,α)-stable distributions might arise in the description of anomalous processes in nonextensive statistical mechanics, cell biology, and finance. The parameter q controls dependence. If q = 1, they reduce to classical independent and identically distributed α-stable Lévy distributions. In the present paper we establish basic properties of (q,α)-stable distributions and generalize the result of Umarov et al. [Milan J. Math. 76, 307 (2008)], where the particular case α = 2, q ∈ [1,3) was considered, to the whole range of stability and nonextensivity parameters α ∈ (0,2] and q ∈ [1,3), respectively. We also discuss possible further extensions of the results that we obtain and formulate some conjectures.
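For orientation, the nonextensivity parameter q is conventionally introduced through the q-exponential, whose q → 1 limit recovers the ordinary exponential; the q-Gaussian (the α = 2 case studied earlier by Umarov et al.) is then built from it. These are the standard definitions from nonextensive statistical mechanics, not reproduced from the paper; [·]₊ denotes max(·, 0).

```latex
e_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}},
\qquad
\lim_{q \to 1} e_q(x) = e^{x},
\qquad
G_q(\beta; x) \propto e_q\!\left(-\beta x^{2}\right).
```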

  18. Computational Prediction of Shock Ignition Thresholds and Ignition Probability of Polymer-Bonded Explosives

    NASA Astrophysics Data System (ADS)

    Wei, Yaochi; Kim, Seokpum; Horie, Yasuyuki; Zhou, Min

    2017-06-01

    A computational approach is developed to predict the probabilistic ignition thresholds of polymer-bonded explosives (PBXs). The simulations explicitly account for microstructure, constituent properties, and interfacial responses and capture processes responsible for the development of hotspots and damage. The specific damage mechanisms considered include viscoelasticity, viscoplasticity, fracture, post-fracture contact, frictional heating, and heat conduction. The probabilistic analysis uses sets of statistically similar microstructure samples to mimic relevant experiments for statistical variations of material behavior due to inherent material heterogeneities. The ignition thresholds and corresponding ignition probability maps are predicted for PBX 9404 and PBX 9501 for the impact loading regime of Up = 200-1200 m/s. James and Walker-Wasley relations are utilized to establish explicit analytical expressions for the ignition probability as a function of load intensities. The predicted results are in good agreement with available experimental measurements. The capability to computationally predict the macroscopic response from material microstructures and basic constituent properties lends itself to the design of new materials and the analysis of existing materials. The authors gratefully acknowledge the support from the Air Force Office of Scientific Research (AFOSR) and the Defense Threat Reduction Agency (DTRA).

  19. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation vs. quiescent type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.

  20. First principles statistical mechanics of alloys and magnetism

    NASA Astrophysics Data System (ADS)

    Eisenbach, Markus; Khan, Suffian N.; Li, Ying Wai

    Modern high performance computing resources are enabling the exploration of the statistical physics of phase spaces with increasing size and higher fidelity of the Hamiltonian of the systems. For selected systems, this now allows the combination of Density Functional based first principles calculations with classical Monte Carlo methods for parameter-free, predictive thermodynamics of materials. We combine our locally self-consistent real space multiple scattering method for solving the Kohn-Sham equation with Wang-Landau Monte Carlo calculations (WL-LSMS). In the past we have applied this method to the calculation of Curie temperatures in magnetic materials. Here we will present direct calculations of the chemical order-disorder transitions in alloys. We present our calculated transition temperature for the chemical ordering in CuZn and the temperature dependence of the short-range order parameter and specific heat. Finally, we will present the extension of the WL-LSMS method to magnetic alloys, thus allowing the investigation of the interplay of magnetism, structure, and chemical order in ferrous alloys. This research was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Science and Engineering Division, and used Oak Ridge Leadership Computing Facility resources at Oak Ridge National Laboratory.

  1. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with parametric statistical data analysis. The output of the system is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application was developed in the Java programming language. The server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and make it easier for students to understand statistical analysis on mobile devices.

  2. Teaching Classical Statistical Mechanics: A Simulation Approach.

    ERIC Educational Resources Information Center

    Sauer, G.

    1981-01-01

    Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
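A minimal sketch of such a fixed-total-energy (microcanonical) Monte Carlo sampler for a one-dimensional ideal gas of unit-mass particles: drawing Gaussian velocities and rescaling them places the velocity vector uniformly on the constant-energy sphere. This is an illustrative reconstruction under those assumptions, not the article's exact procedure.

```python
import math
import random

def microcanonical_velocities(n, total_energy, rng):
    """Sample n unit-mass particle velocities uniformly on the surface
    sum(v_i^2) / 2 == total_energy (the microcanonical shell)."""
    # A Gaussian vector has a uniformly distributed direction, so
    # rescaling it to the right radius gives a uniform point on the sphere.
    v = [rng.gauss(0.0, 1.0) for _ in range(n)]
    scale = math.sqrt(2.0 * total_energy / sum(x * x for x in v))
    return [x * scale for x in v]

rng = random.Random(42)
n, e_tot = 1000, 500.0            # mean energy per particle = 0.5
v = microcanonical_velocities(n, e_tot, rng)
mean_ke = sum(x * x for x in v) / (2 * n)
```

For large n, the marginal distribution of any single particle's velocity approaches the Maxwell-Boltzmann form, which is the kind of equivalence between the dynamical and ensemble pictures the article compares.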

  3. Facts about Congenital Heart Defects

    MedlinePlus


  4. Symptoms of Uterine Cancer

    MedlinePlus


  5. Critical appraisal of scientific articles: part 1 of a series on evaluation of scientific publications.

    PubMed

    du Prel, Jean-Baptist; Röhrig, Bernd; Blettner, Maria

    2009-02-01

    In the era of evidence-based medicine, one of the most important skills a physician needs is the ability to analyze scientific literature critically. This is necessary to keep medical knowledge up to date and to ensure optimal patient care. The aim of this paper is to present an accessible introduction to the critical appraisal of scientific articles. Using a selection of international literature, the reader is introduced to the principles of critical reading of scientific articles in medicine. For the sake of conciseness, detailed description of statistical methods is omitted. Widely accepted principles for critically appraising scientific articles are outlined, covering basic knowledge of study design, the structure of an article, the role of its different sections and statistical presentations, and sources of error and limitation. The reader does not require extensive methodological knowledge. As far as necessary for the critical appraisal of scientific articles, differences among research areas such as epidemiology, clinical research, and basic research are outlined. Further useful references are provided. Basic methodological knowledge is required to select and interpret scientific articles correctly.

  6. Development and validation of a machine learning algorithm and hybrid system to predict the need for life-saving interventions in trauma patients.

    PubMed

    Liu, Nehemiah T; Holcomb, John B; Wade, Charles E; Batchinsky, Andriy I; Cancio, Leopoldo C; Darrah, Mark I; Salinas, José

    2014-02-01

    Accurate and effective diagnosis of actual injury severity can be problematic in trauma patients. Inherent physiologic compensatory mechanisms may prevent accurate diagnosis and mask true severity in many circumstances. The objective of this project was the development and validation of a multiparameter machine learning algorithm and system capable of predicting the need for life-saving interventions (LSIs) in trauma patients. Statistics based on means, slopes, and maxima of various vital sign measurements corresponding to 79 trauma patient records generated over 110,000 feature sets, which were used to develop, train, and implement the system. Comparisons among several machine learning models showed that a multilayer perceptron would best implement the algorithm in a hybrid system consisting of a machine learning component and basic detection rules. Additionally, 295,994 feature sets from 82 hours of trauma patient data showed that the system can obtain 89.8% accuracy within 5 min of recorded LSIs. Use of machine learning technologies combined with basic detection rules provides a potential approach for accurately assessing the need for LSIs in trauma patients. The performance of this system demonstrates that machine learning technology can be implemented in a real-time fashion and potentially used in a critical care environment.
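The feature extraction the abstract describes (means, slopes, and maxima over vital-sign windows) can be sketched as follows. The window contents, feature names, and helper function here are hypothetical illustrations, not taken from the paper:

```python
def window_features(samples):
    """Mean, least-squares slope, and maximum of one vital-sign window.

    Illustrative only; the paper's exact feature definitions are not given.
    """
    n = len(samples)
    mean = sum(samples) / n
    xs = range(n)
    x_mean = (n - 1) / 2
    # Ordinary least-squares slope of samples against sample index:
    denom = sum((x - x_mean) ** 2 for x in xs)
    slope = sum((x - x_mean) * (y - mean) for x, y in zip(xs, samples)) / denom
    return {"mean": mean, "slope": slope, "max": max(samples)}

hr = [80, 84, 88, 92, 96]   # hypothetical heart-rate window (beats/min)
feats = window_features(hr)
```

In a hybrid system of the kind described, such features would feed a trained model (here, a multilayer perceptron), while simple threshold rules on the raw signals act as a safety net.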

  7. Bag-breakup control of surface drag in hurricanes

    NASA Astrophysics Data System (ADS)

    Troitskaya, Yuliya; Zilitinkevich, Sergej; Kandaurov, Alexander; Ermakova, Olga; Kozlov, Dmitry; Sergeev, Daniil

    2016-04-01

    Air-sea interaction at extreme winds is of special interest now in connection with the problem of sea surface drag reduction at wind speeds exceeding 30-35 m/s. This phenomenon, predicted by Emanuel (1995) and confirmed by a number of field (e.g., Powell et al., 2003) and laboratory (Donelan et al., 2004) experiments, still awaits a physical explanation. Several papers attributed the drag reduction to spume droplets - spray torn off the crests of breaking waves (e.g., Kudryavtsev and Makin, 2011; Bao et al., 2011). The fluxes associated with the spray are determined by the rate of droplet production at the surface, quantified by the sea spray generation function (SSGF), defined as the number of spray particles of radius r produced from a unit area of water surface in unit time. However, the mechanism of spume droplet formation is unknown, and empirical estimates of the SSGF vary over six orders of magnitude; therefore, the production rate of large sea spray droplets is not adequately described, and there are significant uncertainties in estimates of exchange processes in hurricanes. It is likewise unclear what the air-sea interface looks like and how water is fragmented into spray at hurricane winds. Using high-speed video filming, we observed the mechanisms of spume droplet production at strong winds, investigated their statistics, and compared their efficiency. Experiments showed that the generation of spume droplets near the wave crest is caused by the following events: bursting of submerged bubbles, generation and breakup of "projections", and "bag breakup". Statistical analysis of these experiments showed that the main mechanism of spray generation is "bag breakup", namely the inflation and consequent bursting of short-lived, sail-like pieces of the water-surface film - "bags". On the basis of general principles of statistical physics (the model of a canonical ensemble), we developed statistics of the "bag-breakup" events: their number and the statistical distribution of their geometrical parameters depending on wind speed. Based on these statistics, we estimated the surface stress caused by bags as the average sum of the stresses caused by individual bags depending on their geometrical parameters. The resulting stress is subject to counteracting impacts of increasing wind speed: the increasing number of bags versus their decreasing sizes and lifetimes; the balance yields a peaking dependence of the bag resistance on wind speed, with the share of bag stress peaking at U10 ≈ 35 m/s and then decreasing. This peaking of the surface stress associated with "bag breakup" explains the seemingly paradoxical non-monotonic wind dependence of the surface drag coefficient, which peaks at winds of about 35 m/s. This work was supported by the Russian Foundation for Basic Research (14-05-91767, 13-05-12093, 16-05-00839, 14-05-91767, 16-55-52025, 15-35-20953); experiments and equipment were supported by the Russian Science Foundation (Agreements 14-17-00667 and 15-17-20009, respectively). Yu. Troitskaya, A. Kandaurov, and D. Sergeev were partially supported by FP7 Collaborative Project No. 612610.

  8. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
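The kind of alpha inflation the study quantifies (running uncorrected pairwise tests instead of a single omnibus analysis) can be demonstrated with a small simulation. This is a generic sketch using a large-sample z approximation with a nominal 0.05 criterion, not the authors' reanalysis:

```python
import random
import statistics

def any_pairwise_significant(groups, crit=1.96):
    """Run uncorrected pairwise two-sample z-tests (large-sample
    approximation) and report whether any pair is 'significant'."""
    for i in range(len(groups)):
        for j in range(i + 1, len(groups)):
            a, b = groups[i], groups[j]
            se = (statistics.variance(a) / len(a)
                  + statistics.variance(b) / len(b)) ** 0.5
            z = (statistics.fmean(a) - statistics.fmean(b)) / se
            if abs(z) > crit:
                return True
    return False

# All three groups come from the SAME distribution, so every
# "significant" result is a false positive.
rng = random.Random(0)
reps, hits = 2000, 0
for _ in range(reps):
    groups = [[rng.gauss(0.0, 1.0) for _ in range(50)] for _ in range(3)]
    hits += any_pairwise_significant(groups)
fwer = hits / reps   # family-wise error rate, well above the nominal 0.05
```

The family-wise error rate lands near 0.12 rather than the nominal 0.05, which is the same direction of bias as the reanalysis result reported above: uncorrected multiple t-tests manufacture extra "significant" effects.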

  9. Interpretation of correlations in clinical research.

    PubMed

    Hung, Man; Bounsanga, Jerry; Voss, Maren Wright

    2017-11-01

    Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
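One of the pitfalls discussed, over-reliance on significance without regard to effect size, can be illustrated with Fisher's z-transformation for testing a correlation against zero. The numbers are hypothetical and the approximation is a textbook one, not drawn from the paper:

```python
import math

def correlation_z(r, n):
    """Large-sample test statistic for H0: rho = 0 via Fisher's
    z-transformation (a standard approximation for n > ~10)."""
    return math.atanh(r) * math.sqrt(n - 3)

# A tiny correlation becomes "significant" with a huge sample ...
z_big = correlation_z(0.05, 10_000)   # comfortably past the 1.96 cutoff
# ... while the same r in a small sample is not.
z_small = correlation_z(0.05, 100)
# Yet the effect size is identical and trivial in both cases:
r2 = 0.05 ** 2                        # shared variance of only 0.25%
```

Reporting r² (or another effect-size measure) alongside the p value makes clear that the "significant" large-sample result explains almost none of the variance, which is the clinical-meaningfulness point the authors urge.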

  10. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. 
A special article will compare and review available statistical seismology software packages.
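    As an illustration of the kind of code snippet such articles might include, here is a minimal sketch of one of the most basic statistical-seismology calculations, the maximum-likelihood Gutenberg-Richter b-value (Aki's estimator). The function name and the synthetic catalog below are illustrative assumptions of ours, not CORSSA material.

```python
import math
import random

def aki_b_value(magnitudes, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki's estimator):
    b = log10(e) / (mean(M) - m_min), assuming catalog completeness
    above the magnitude of completeness m_min."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

# Synthetic catalog: magnitudes above m_min = 2.0 drawn from an
# exponential distribution corresponding to a true b-value of 1.0.
random.seed(42)
b_true = 1.0
beta = b_true * math.log(10)
catalog = [2.0 + random.expovariate(beta) for _ in range(10000)]
b_hat = aki_b_value(catalog, 2.0)
```

With 10,000 events the estimate recovers the true b-value to within a few percent; real catalogs additionally require careful estimation of the completeness magnitude itself.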

  11. Conceptual versus Algorithmic Learning in High School Chemistry: The Case of Basic Quantum Chemical Concepts--Part 1. Statistical Analysis of a Quantitative Study

    ERIC Educational Resources Information Center

    Papaphotis, Georgios; Tsaparlis, Georgios

    2008-01-01

    Presented here is Part 1 of the findings of a quantitative study (n = 125) on basic quantum chemical concepts taught in the twelfth grade (age 17-18 years) in Greece. A paper-and-pencil test of fourteen questions was used. The study compared performance in five questions that tested recall of knowledge or application of algorithmic procedures (type-A…

  12. 29 CFR 5.32 - Overtime payments.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....25 an hour to a mechanic as his basic cash wage plus 50 cents an hour as a contribution to a welfare... prevailing wage statutes. It is clear from the legislative history that in no event can the regular or basic... less than the amount determined by the Secretary of Labor as the basic hourly rate (i.e. cash rate...

  13. 29 CFR 5.32 - Overtime payments.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....25 an hour to a mechanic as his basic cash wage plus 50 cents an hour as a contribution to a welfare... prevailing wage statutes. It is clear from the legislative history that in no event can the regular or basic... less than the amount determined by the Secretary of Labor as the basic hourly rate (i.e. cash rate...

  14. 29 CFR 5.32 - Overtime payments.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ....25 an hour to a mechanic as his basic cash wage plus 50 cents an hour as a contribution to a welfare... prevailing wage statutes. It is clear from the legislative history that in no event can the regular or basic... less than the amount determined by the Secretary of Labor as the basic hourly rate (i.e. cash rate...

  15. Object Categorization: Reversals and Explanations of the Basic-Level Advantage

    ERIC Educational Resources Information Center

    Rogers, Timothy T.; Patterson, Karalyn

    2007-01-01

    People are generally faster and more accurate to name or categorize objects at the basic level (e.g., dog) relative to more general (animal) or specific (collie) levels, an effect replicated in Experiment 1 for categorization of object pictures. To some, this pattern suggests a dual-process mechanism, in which objects first activate basic-level…

  16. Statistical regularities of art images and natural scenes: spectra, sparseness and nonlinearities.

    PubMed

    Graham, Daniel J; Field, David J

    2007-01-01

    Paintings are the product of a process that begins with ordinary vision in the natural world and ends with manipulation of pigments on canvas. Because artists must produce images that can be seen by a visual system that is thought to take advantage of statistical regularities in natural scenes, artists are likely to replicate many of these regularities in their painted art. We have tested this notion by computing basic statistical properties and modeled cell response properties for a large set of digitized paintings and natural scenes. We find that both representational and non-representational (abstract) paintings from our sample (124 images) show basic similarities to a sample of natural scenes in terms of their spatial frequency amplitude spectra, but the paintings and natural scenes show significantly different mean amplitude spectrum slopes. We also find that the intensity distributions of paintings show a lower skewness and sparseness than natural scenes. We account for this by considering the range of luminances found in the environment compared to the range available in the medium of paint. A painting's range is limited by the reflective properties of its materials. We argue that artists do not simply scale the intensity range down but use a compressive nonlinearity. In our studies, modeled retinal and cortical filter responses to the images were less sparse for the paintings than for the natural scenes. But when a compressive nonlinearity was applied to the images, both the paintings themselves and the modeled responses to them showed sparseness equal to or greater than that of the natural scenes. This suggests that artists achieve some degree of nonlinear compression in their paintings. Because paintings have captivated humans for millennia, finding basic statistical regularities in paintings' spatial structure could grant insights into the range of spatial patterns that humans find compelling.
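    The skewness reduction under a compressive nonlinearity described above can be sketched numerically. The log-normal "scene" and the power-law exponent below are illustrative assumptions, not the paper's actual images or model.

```python
import math
import random

def skewness(xs):
    """Sample skewness: third central moment over the cubed standard deviation."""
    n = len(xs)
    mu = sum(xs) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / n)
    return sum((x - mu) ** 3 for x in xs) / (n * sd ** 3)

random.seed(0)
# Synthetic "natural scene" luminances: a heavy right tail (log-normal),
# a stand-in for the high dynamic range of real scenes.
scene = [random.lognormvariate(0.0, 1.0) for _ in range(20000)]

# Compressive nonlinearity: a power law with exponent < 1, a stand-in for
# whatever compression a painter effectively applies to fit the medium.
compressed = [x ** 0.3 for x in scene]

skew_raw = skewness(scene)
skew_comp = skewness(compressed)
```

Compression pulls in the bright tail, so the compressed distribution is markedly less skewed than the raw one, mirroring the lower skewness the authors measure in paintings.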

  17. Urological research in sub-Saharan Africa: a retrospective cohort study of abstracts presented at the Nigerian Association of Urological Surgeons conferences.

    PubMed

    Bello, Jibril Oyekunle

    2013-11-14

    Nigeria is one of the top three countries in Africa in terms of science research output and Nigerian urologists' biomedical research output contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not thoroughly vetted as are full length manuscripts published in peer reviewed journals but the information they disseminate may affect clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual conferences of NAUS, the quality of the abstracts as determined by the subsequent publication of full length manuscripts in peer-review indexed journals and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstracts books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts characteristics were analyzed and their quality judged by subsequent successful publishing of full length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts was subsequently published as full length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with 'beyond basic' statistics of frequencies and averages were more likely to be published than those with basic or no statistics. Quality of the abstracts and thus subsequent publication success is influenced by the use of 'beyond basic' statistics in analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.

  18. The effects of load carriage and muscle fatigue on lower-extremity joint mechanics.

    PubMed

    Wang, He; Frame, Jeff; Ozimek, Elicia; Leib, Daniel; Dugan, Eric L

    2013-09-01

    Military personnel are commonly afflicted by lower-extremity overuse injuries. Load carriage and muscular fatigue are major stressors during military basic training. To examine effects of load carriage and muscular fatigue on lower-extremity joint mechanics during walking. Eighteen men performed the following tasks: unloaded walking, walking with a 32-kg load, fatigued walking with a 32-kg load, and fatigued walking. After the second walking task, muscle fatigue was elicited through a fatiguing protocol consisting of metered step-ups and heel raises with a 16-kg load. Each walking task was performed at 1.67 m/s for 5 min. Walking movement was tracked by a VICON motion capture system at 120 Hz. Ground reaction forces were collected by a tandem force-instrumented treadmill (AMTI) at 2,400 Hz. Lower-extremity joint mechanics were calculated in Visual 3D. There was no interaction between load carriage and fatigue on lower-extremity joint mechanics (p > .05). Both load carriage and fatigue led to pronounced alterations of lower-extremity joint mechanics (p < .05). Load carriage resulted in increases of pelvis anterior tilt, hip and knee flexion at heel contact, and increases of hip, knee, and ankle joint moments and powers during weight acceptance. Muscle fatigue led to decreases of ankle dorsiflexion at heel contact, dorsiflexor moment, and joint power at weight acceptance. In addition, muscle fatigue increased demand for hip extensor moment and power at weight acceptance. Statistically significant changes in lower-extremity joint mechanics during loaded and fatigued walking may expose military personnel to increased risk for overuse injuries.

  19. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.

    PubMed

    Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun

    2016-09-14

    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs.
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.

  20. Federal Funds for Research and Development: Fiscal Years 1981, 1982, and 1983. Volume XXXI. Detailed Statistical Tables. Surveys of Science Resources Series.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    Detailed statistical tables on federal funds for research and development (R&D) activities are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total research--agency, performer, and field of science; basic research--agency,…

  1. Federal Funds for Research and Development: Fiscal Years 1983, 1984, and 1985. Volume XXXIII. Detailed Statistical Tables. Surveys of Science Resources Series.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    Detailed statistical tables showing the funding levels of 92 federal agencies for research and development (R&D) are provided in this document. These tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total basic and applied research--agency,…

  2. WASP (Write a Scientific Paper) using Excel -5: Quartiles and standard deviation.

    PubMed

    Grech, Victor

    2018-03-01

    The almost inevitable descriptive statistics exercise that is undergone once data collection is complete, prior to inferential statistics, requires the acquisition of basic descriptors which may include standard deviation and quartiles. This paper provides pointers as to how to do this in Microsoft Excel™ and explains the relationship between the two. Copyright © 2018 Elsevier B.V. All rights reserved.
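    The same two descriptors can be reproduced outside Excel. Below is a sketch using Python's standard library, where `statistics.quantiles(..., method='inclusive')` follows the same linear-interpolation convention as Excel's QUARTILE.INC, and `statistics.stdev` matches STDEV.S; the worked data set is our own example, not the paper's.

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]

# Standard deviation: sample (divides by n-1, Excel STDEV.S) versus
# population (divides by n, Excel STDEV.P).
sd_sample = statistics.stdev(data)
sd_pop = statistics.pstdev(data)

# Quartiles with the QUARTILE.INC convention: linear interpolation
# over positions 0..n-1 of the sorted data.
q1, q2, q3 = statistics.quantiles(data, n=4, method='inclusive')
```

For this data set the population standard deviation is exactly 2, the sample standard deviation is slightly larger, and the quartiles fall at 4, 4.5, and 5.5, so the interquartile range is 1.5.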

  3. Federal Funds for Research and Development. Fiscal Years 1982, 1983, and 1984. Volume XXXII. Detailed Statistical Tables. Surveys of Science Resources Series.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    Detailed statistical tables on federal funds for research and development (R&D) are provided in this document. Tables are organized into the following sections: research, development, and R&D plant; R&D--agency, character of work, and performer; total research--agency, performer, and field of science; basic research--agency, performer,…

  4. The Answer Is in the Question: A Guide for Describing and Investigating the Conceptual Foundations and Statistical Properties of Cognitive Psychometric Models

    ERIC Educational Resources Information Center

    Rupp, Andre A.

    2007-01-01

    One of the most revolutionary advances in psychometric research during the last decades has been the systematic development of statistical models that allow for cognitive psychometric research (CPR) to be conducted. Many of the models currently available for such purposes are extensions of basic latent variable models in item response theory…

  5. Current state of the art for statistical modeling of species distributions [Chapter 16

    Treesearch

    Troy M. Hegel; Samuel A. Cushman; Jeffrey Evans; Falk Huettmann

    2010-01-01

    Over the past decade the number of statistical modelling tools available to ecologists to model species' distributions has increased at a rapid pace (e.g. Elith et al. 2006; Austin 2007), as have the number of species distribution models (SDM) published in the literature (e.g. Scott et al. 2002). Ten years ago, basic logistic regression (Hosmer and Lemeshow 2000)...
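    As a reminder of what that logistic-regression baseline looks like, here is a minimal presence/absence model fitted by gradient ascent on synthetic data. The single covariate, the true coefficients, and the fitting details are illustrative assumptions, not from the chapter.

```python
import math
import random

def fit_logistic(xs, ys, lr=0.3, epochs=4000):
    """Fit P(presence) = sigmoid(b0 + b1*x) by gradient ascent on the
    average log-likelihood -- the 'basic logistic regression' of a
    species distribution model with one environmental covariate."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

random.seed(1)
# Synthetic presence/absence data: presence probability rises with one
# covariate (say, temperature); true intercept -2, true slope 1.
xs = [random.uniform(-4, 8) for _ in range(500)]
ys = [1 if random.random() < 1 / (1 + math.exp(-(-2 + x))) else 0
      for x in xs]
b0, b1 = fit_logistic(xs, ys)
```

The fitted coefficients land near the generating values, and the predicted presence probability rises toward 1 at the favorable end of the gradient.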

  6. Data-Base for Communication Planning. The Basic and Statistical Data Required for the Elaboration of a Plan for a National Communication System.

    ERIC Educational Resources Information Center

    Rahim, Syed A.

    Based in part on a list developed by the United Nations Educational, Scientific, and Cultural Organization (UNESCO) for use in Afghanistan, this document presents a comprehensive checklist of items of statistical and descriptive data required for planning a national communication system. It is noted that such a system provides the vital…

  7. Ratio

    NASA Astrophysics Data System (ADS)

    Webster, Nathan A. S.; Pownceby, Mark I.; Madsen, Ian C.; Studer, Andrew J.; Manuel, James R.; Kimpton, Justin A.

    2014-12-01

    Effects of basicity, B (CaO:SiO2 ratio), on the thermal range, concentration, and formation mechanisms of silico-ferrite of calcium and aluminum (SFCA) and SFCA-I iron ore sinter bonding phases have been investigated using an in situ synchrotron X-ray diffraction-based methodology with subsequent Rietveld refinement-based quantitative phase analysis. SFCA and SFCA-I phases are the key bonding materials in iron ore sinter, and improved understanding of the effects of processing parameters such as basicity on their formation and decomposition may assist in improving the efficiency of industrial iron ore sintering operations. Increasing basicity significantly increased the thermal range of SFCA-I, from 1363 K to 1533 K (1090 °C to 1260 °C) for a mixture with B = 2.48, to ~1339 K to 1535 K (1066 °C to 1262 °C) for a mixture with B = 3.96, and to ~1323 K to 1593 K (1050 °C to 1320 °C) at B = 4.94. Increasing basicity also increased the amount of SFCA-I formed, from 18 wt pct for the mixture with B = 2.48 to 25 wt pct for the B = 4.94 mixture. Higher basicity of the starting sinter mixture will, therefore, increase the amount of SFCA-I, considered the more desirable of the two phases. Basicity did not appear to significantly influence the formation mechanism of SFCA-I. It did, however, affect the formation mechanism of SFCA: the decomposition of SFCA-I coincided with the formation of a significant amount of additional SFCA in the B = 2.48 and 3.96 mixtures but only a minor amount in the highest-basicity mixture. In situ neutron diffraction enabled characterization of the behavior of magnetite after melting of SFCA had produced a magnetite plus melt phase assemblage.

  8. SIGPI. Fault Tree Cut Set System Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patenaude, C.J.

    1992-01-13

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.
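    For the independent-component case, the arithmetic behind such a tool reduces to products and sums: each disjoint product's probability is a product of basic-event probabilities (or their complements), and because the products are mutually exclusive the system probability is their sum. The sketch below illustrates only that calculation and is not SIGPI's actual input format or API.

```python
# A disjoint product is a list of (event_index, occurs) pairs: occurs=True
# means the basic event happens, False means its complement.

def product_prob(product, p):
    """Probability of one disjoint product, given independent basic-event
    probabilities p[i]."""
    prob = 1.0
    for idx, occurs in product:
        prob *= p[idx] if occurs else (1.0 - p[idx])
    return prob

def system_prob(disjoint_products, p):
    """Disjoint products are mutually exclusive, so probabilities add."""
    return sum(product_prob(dp, p) for dp in disjoint_products)

p = [0.1, 0.2]  # failure probabilities of basic events A (0) and B (1)

# "A and B": already a single product.
both_fail = system_prob([[(0, True), (1, True)]], p)

# "A or B" rewritten in disjoint normal form: {A} + {not A, B}.
either_fails = system_prob([[(0, True)], [(0, False), (1, True)]], p)
```

Here "A and B" gives 0.1 x 0.2 = 0.02, and the disjoint form of "A or B" gives 0.1 + 0.9 x 0.2 = 0.28, matching 1 - (1 - 0.1)(1 - 0.2).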

  9. SIGPI. Fault Tree Cut Set System Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patenaude, C.J.

    1992-01-14

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  10. Probability sampling in legal cases: Kansas cellphone users

    NASA Astrophysics Data System (ADS)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.
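    A minimal sketch of the basic calculation: a simple random sample without replacement, with the usual standard error including the finite population correction. The population and sample size below are invented for illustration and are not from the case discussed.

```python
import math
import random

def srs_estimate(population, n, seed=0):
    """Estimate a population mean from a simple random sample (without
    replacement), returning the estimate and its standard error with
    the finite population correction (1 - n/N)."""
    rng = random.Random(seed)
    sample = rng.sample(population, n)
    N = len(population)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    se = math.sqrt((var / n) * (1 - n / N))
    return mean, se

population = list(range(10000))   # true mean 4999.5
mean, se = srs_estimate(population, 400)
```

With 400 of 10,000 units sampled, the correction factor is small (about 0.98), and the estimate typically lands within a couple of standard errors of the true mean.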

  11. Basic HIV/AIDS Statistics

    MedlinePlus

    ... infections—down from 41,800 in 2010. a Gay, bisexual, and other men who have sex with ... HIV infections by transmission category , we see that gay, bisexual, and other men who have sex with ...

  12. Understanding your cancer prognosis

    MedlinePlus

    ... about: Treatment Palliative care Personal matters such as finances Knowing what to expect may make it easier ... treatment. www.cancer.net/navigating-cancer-care/cancer-basics/understanding-statistics-used-guide-prognosis-and-evaluate-treatment . ...

  13. Mechanisms of flame stabilisation at low lifted height in a turbulent lifted slot-jet flame

    DOE PAGES

    Karami, Shahram; Hawkes, Evatt R.; Talei, Mohsen; ...

    2015-07-23

    A turbulent lifted slot-jet flame is studied using direct numerical simulation (DNS). A one-step chemistry model is employed with a mixture-fraction-dependent activation energy which can reproduce qualitatively the dependence of the laminar burning rate on the equivalence ratio that is typical of hydrocarbon fuels. The basic structure of the flame base is first examined and discussed in the context of earlier experimental studies of lifted flames. Several features previously observed in experiments are noted and clarified. Some other unobserved features are also noted. Comparison with previous DNS modelling of hydrogen flames reveals significant structural differences. The statistics of flow and relative edge-flame propagation velocity components conditioned on the leading edge locations are then examined. The results show that, on average, the streamwise flame propagation and streamwise flow balance, thus demonstrating that edge-flame propagation is the basic stabilisation mechanism. Fluctuations of the edge locations and net edge velocities are, however, significant. It is demonstrated that the edges tend to move in an essentially two-dimensional (2D) elliptical pattern (laterally outwards towards the oxidiser, then upstream, then inwards towards the fuel, then downstream again). It is proposed that this is due to the passage of large eddies, as outlined in Su et al. (Combust. Flame, vol. 144 (3), 2006, pp. 494-512). However, the mechanism is not entirely 2D, and out-of-plane motion is needed to explain how flames escape the high-velocity inner region of the jet. Finally, the time-averaged structure is examined. A budget of terms in the transport equation for the product mass fraction is used to understand the stabilisation from a time-averaged perspective. The result of this analysis is found to be consistent with the instantaneous perspective. The budget reveals a fundamentally 2D structure, involving transport in both the streamwise and transverse directions, as opposed to possible mechanisms involving a dominance of either one direction of transport. Furthermore, it features upstream transport balanced by entrainment into richer conditions, while on the rich side, upstream turbulent transport and entrainment from leaner conditions balance the streamwise convection.

  14. Preloaded joint analysis methodology for space flight systems

    NASA Technical Reports Server (NTRS)

    Chambers, Jeffrey A.

    1995-01-01

    This report contains a compilation of some of the most basic equations governing simple preloaded joint systems and discusses the more common modes of failure associated with such hardware. It is intended to provide the mechanical designer with the tools necessary for designing a basic bolted joint. Although the information presented is intended to aid in the engineering of space flight structures, the fundamentals are equally applicable to other forms of mechanical design.

  15. Some important considerations in the development of stress corrosion cracking test methods.

    NASA Technical Reports Server (NTRS)

    Wei, R. P.; Novak, S. R.; Williams, D. P.

    1972-01-01

    Discussion of some of the precautions involved in the development of fracture-mechanics-based test methods for studying stress corrosion cracking. Following a review of pertinent analytical fracture mechanics considerations and of basic test methods, the implications for stress corrosion cracking studies of the kinetics of crack growth, which determine time-to-failure and life, are examined. It is shown that the basic assumptions of the linear-elastic fracture mechanics analyses must be clearly recognized and satisfied in experimentation, and that the effects of incubation and non-steady-state crack growth must also be properly taken into account in determining the crack growth kinetics, if valid data are to be obtained from fracture-mechanics-based test methods.

  16. Mechanical modeling for magnetorheological elastomer isolators based on constitutive equations and electromagnetic analysis

    NASA Astrophysics Data System (ADS)

    Wang, Qi; Dong, Xufeng; Li, Luyu; Ou, Jinping

    2018-06-01

    Because constitutive models are overly complicated and existing mechanical models lack universality, neither is satisfactory for magnetorheological elastomer (MRE) devices. In this article, a novel universal method is proposed to build concise mechanical models. A constitutive model and electromagnetic analysis were applied in this method to ensure universality, while a series of derivations and simplifications were carried out to obtain a concise formulation. To illustrate the proposed modeling method, a conical MRE isolator was introduced. Its basic mechanical equations were built based on equilibrium, deformation compatibility, constitutive equations and electromagnetic analysis. An iteration model and a highly efficient differential-equation-editor-based model were then derived to solve the basic mechanical equations. The final simplified mechanical equations were obtained by re-fitting the simulations with a novel optimal algorithm. Finally, a verification test of the isolator confirmed the accuracy of the derived mechanical model and the modeling method.

  17. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    PubMed

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
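    The core of such fitting is the continuous maximum-likelihood estimator for the exponent (in the style of Clauset and colleagues), which the powerlaw package wraps together with xmin selection and distribution-comparison tests. The stand-alone sketch below shows only the estimator itself on synthetic data of our own; it is not the package's API.

```python
import math
import random

def powerlaw_mle_alpha(data, xmin):
    """Continuous power-law MLE for the exponent:
        alpha = 1 + n / sum(ln(x / xmin))  over x >= xmin."""
    tail = [x for x in data if x >= xmin]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / xmin) for x in tail)

# Synthetic sample from a pure power law with alpha = 2.5, xmin = 1,
# via inverse-CDF sampling: x = xmin * (1 - u) ** (-1 / (alpha - 1)).
random.seed(7)
alpha_true = 2.5
data = [(1 - random.random()) ** (-1 / (alpha_true - 1))
        for _ in range(50000)]
alpha_hat = powerlaw_mle_alpha(data, 1.0)
```

With 50,000 points the estimator recovers the exponent to within about one percent; the hard part in practice, which dedicated packages automate, is choosing xmin and testing the power law against alternatives such as the log-normal.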

  18. Statistics and Discoveries at the LHC (1/4)

    ScienceCinema

    Cowan, Glen

    2018-02-09

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.
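    As a concrete example of the p-value and significance machinery such lectures cover, here is a sketch of the simplest counting-experiment case: a Poisson background with no systematic uncertainties, and the one-sided Gaussian significance Z defined by p = 1 - Phi(Z). The event counts are invented for illustration.

```python
import math
from statistics import NormalDist

def poisson_pvalue(n_obs, b):
    """p-value for observing >= n_obs events given expected background b:
    P(N >= n_obs) = 1 - sum_{k < n_obs} e^-b b^k / k!"""
    cdf = sum(math.exp(-b) * b ** k / math.factorial(k)
              for k in range(n_obs))
    return 1.0 - cdf

def significance(p):
    """One-sided Gaussian significance Z such that p = 1 - Phi(Z)."""
    return NormalDist().inv_cdf(1.0 - p)

# Example: 12 events observed over an expected background of 4.2.
p = poisson_pvalue(12, 4.2)
z = significance(p)
```

This toy excess comes out near the 3-sigma level; an actual search must also fold in systematic uncertainties on the background and, when many channels are scanned, the look-elsewhere effect.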

  19. Statistics and Discoveries at the LHC (3/4)

    ScienceCinema

    Cowan, Glen

    2018-02-19

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  20. Statistics and Discoveries at the LHC (4/4)

    ScienceCinema

    Cowan, Glen

    2018-05-22

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  1. Statistics and Discoveries at the LHC (2/4)

    ScienceCinema

    Cowan, Glen

    2018-04-26

    The lectures will give an introduction to statistics as applied in particle physics and will provide all the necessary basics for data analysis at the LHC. Special emphasis will be placed on the problems and questions that arise when searching for new phenomena, including p-values, discovery significance, limit setting procedures, treatment of small signals in the presence of large backgrounds. Specific issues that will be addressed include the advantages and drawbacks of different statistical test procedures (cut-based, likelihood-ratio, etc.), the look-elsewhere effect and treatment of systematic uncertainties.

  2. Understanding quantitative research: part 1.

    PubMed

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  3. Strong activation of bile acid-sensitive ion channel (BASIC) by ursodeoxycholic acid

    PubMed Central

    Wiemuth, Dominik; Sahin, Hacer; Lefèvre, Cathérine M.T.; Wasmuth, Hermann E.; Gründer, Stefan

    2013-01-01

    Bile acid-sensitive ion channel (BASIC) is a member of the DEG/ENaC gene family of unknown function. Rat BASIC (rBASIC) is inactive at rest. We have recently shown that cholangiocytes, the epithelial cells lining the bile ducts, are the main site of BASIC expression in the liver and identified bile acids, in particular hyo- and chenodeoxycholic acid, as agonists of rBASIC. Moreover, it seems that extracellular divalent cations stabilize the resting state of rBASIC, because removal of extracellular divalent cations opens the channel. In this addendum, we demonstrate that removal of extracellular divalent cations potentiates the activation of rBASIC by bile acids, suggesting an allosteric mechanism. Furthermore, we show that rBASIC is strongly activated by the anticholestatic bile acid ursodeoxycholic acid (UDCA), suggesting that BASIC might mediate part of the therapeutic effects of UDCA. PMID:23064163

  4. Narrative Review of Statistical Reporting Checklists, Mandatory Statistical Editing, and Rectifying Common Problems in the Reporting of Scientific Articles.

    PubMed

    Dexter, Franklin; Shafer, Steven L

    2017-03-01

    Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally poorly assess statistical quality. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
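    For reference, the basic power analysis the authors find frequently missing can be as simple as the normal-approximation sample-size formula for a two-sample, two-sided comparison of means. The sketch below is a generic illustration under that approximation, not the review's prescribed method; the exact t-test answer is slightly larger (e.g. 64 rather than 63 per group at d = 0.5).

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Sample size per group for a two-sample, two-sided test of means
    (normal approximation): n = 2 * ((z_{1-a/2} + z_{1-b}) / d)^2,
    where d is the standardized effect size (Cohen's d)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_beta = z.inv_cdf(power)            # power requirement
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

n = n_per_group(0.5)   # medium effect, alpha = 0.05, power = 0.80
```

Halving the detectable effect size roughly quadruples the required sample, which is why underpowered designs are so common when this calculation is skipped.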

  5. Local quantum thermal susceptibility

    PubMed Central

    De Pasquale, Antonella; Rossini, Davide; Fazio, Rosario; Giovannetti, Vittorio

    2016-01-01

Thermodynamics relies on the possibility of describing systems composed of a large number of constituents in terms of a few macroscopic variables. Its foundations are rooted in the paradigm of statistical mechanics, where thermal properties originate from averaging procedures that smooth out local details. While undoubtedly successful, elegant and formally correct, this approach carries an operational problem, namely determining the precision with which such variables are inferred when technical or practical limitations restrict our capabilities to local probing. Here we introduce the local quantum thermal susceptibility, a quantifier of the best achievable accuracy for temperature estimation via local measurements. Our method relies on basic concepts of quantum estimation theory, providing an operative strategy to address the local thermal response of arbitrary quantum systems at equilibrium. At low temperatures, it highlights the local distinguishability of the ground state from the excited sub-manifolds, thus providing a method to locate quantum phase transitions. PMID:27681458

  6. Local quantum thermal susceptibility

    NASA Astrophysics Data System (ADS)

    de Pasquale, Antonella; Rossini, Davide; Fazio, Rosario; Giovannetti, Vittorio

    2016-09-01

Thermodynamics relies on the possibility of describing systems composed of a large number of constituents in terms of a few macroscopic variables. Its foundations are rooted in the paradigm of statistical mechanics, where thermal properties originate from averaging procedures that smooth out local details. While undoubtedly successful, elegant and formally correct, this approach carries an operational problem, namely determining the precision with which such variables are inferred when technical or practical limitations restrict our capabilities to local probing. Here we introduce the local quantum thermal susceptibility, a quantifier of the best achievable accuracy for temperature estimation via local measurements. Our method relies on basic concepts of quantum estimation theory, providing an operative strategy to address the local thermal response of arbitrary quantum systems at equilibrium. At low temperatures, it highlights the local distinguishability of the ground state from the excited sub-manifolds, thus providing a method to locate quantum phase transitions.

  7. Distinct sets of locomotor modules control the speed and modes of human locomotion

    PubMed Central

    Yokoyama, Hikaru; Ogawa, Tetsuya; Kawashima, Noritaka; Shinya, Masahiro; Nakazawa, Kimitaka

    2016-01-01

Although recent vertebrate studies have revealed that different spinal networks are recruited in locomotor mode- and speed-dependent manners, it is unknown whether humans share similar neural mechanisms. Here, we tested whether the recruitment of human locomotor networks is speed- and mode-dependent by statistically extracting locomotor networks. From electromyographic activity during walking and running over a wide speed range, locomotor modules generating basic patterns of muscle activities were extracted using non-negative matrix factorization. The results showed that the number of modules changed depending on the modes and speeds. Different combinations of modules were extracted during walking and running, and at different speeds even during the same locomotor mode. These results strongly suggest that, in humans, different spinal locomotor networks are recruited while walking and running, and even in the same locomotor mode different networks are probably recruited at different speeds. PMID:27805015
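The module-extraction step named in the abstract above, non-negative matrix factorization, can be sketched in a few lines. The following is a hypothetical, stdlib-plus-numpy illustration using the classic multiplicative-update rules on a synthetic muscles-by-time matrix; it is not the authors' EMG pipeline, and the matrix sizes and rank are assumptions for the demo.

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Non-negative matrix factorization V ~ W @ H via multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 1e-6   # "modules": non-negative muscle weightings
    H = rng.random((k, n)) + 1e-6   # non-negative activation coefficients over time
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# Synthetic "EMG": 8 channels, 100 samples, generated by 2 underlying modules.
rng = np.random.default_rng(1)
true_W = rng.random((8, 2))
true_H = rng.random((2, 100))
V = true_W @ true_H

W, H = nmf(V, k=2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative reconstruction error: {rel_err:.4f}")
```

Because the synthetic matrix has an exact rank-2 non-negative factorization, the reconstruction error should become small; on real EMG the chosen number of modules `k` is the quantity the study varies across speeds and modes.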

  8. DQE analysis for CCD imaging arrays

    NASA Astrophysics Data System (ADS)

    Shaw, Rodney

    1997-05-01

By consideration of the statistical interaction between exposure quanta and the mechanisms of image detection, the signal-to-noise limitations of a variety of image acquisition technologies are now well understood. However, in spite of the growing fields of application for CCD imaging arrays and the obvious advantages of their multi-level mode of quantum detection, only limited and largely empirical approaches have been made to quantify these advantages on an absolute basis. Here an extension is made of a previous model for noise-free sequential photon counting to the more general case involving both count noise and arbitrary separation functions between count levels. This allows a basic model to be developed for the DQE associated with devices which approximate to the CCD mode of operation, and conclusions to be made concerning the roles of the separation function and count noise in defining the departure from the ideal photon counter.

  9. Stochastic Dynamics of Lexicon Learning in an Uncertain and Nonuniform World

    NASA Astrophysics Data System (ADS)

    Reisenauer, Rainer; Smith, Kenny; Blythe, Richard A.

    2013-06-01

    We study the time taken by a language learner to correctly identify the meaning of all words in a lexicon under conditions where many plausible meanings can be inferred whenever a word is uttered. We show that the most basic form of cross-situational learning—whereby information from multiple episodes is combined to eliminate incorrect meanings—can perform badly when words are learned independently and meanings are drawn from a nonuniform distribution. If learners further assume that no two words share a common meaning, we find a phase transition between a maximally efficient learning regime, where the learning time is reduced to the shortest it can possibly be, and a partially efficient regime where incorrect candidate meanings for words persist at late times. We obtain exact results for the word-learning process through an equivalence to a statistical mechanical problem of enumerating loops in the space of word-meaning mappings.
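The "most basic form of cross-situational learning" the abstract describes, combining information across episodes to eliminate incorrect meanings, can be sketched as a toy simulation. Everything here is an assumption for illustration (a uniform meaning distribution and a fixed number of distractors per episode), whereas the paper's interesting regime involves nonuniform meaning distributions and the no-shared-meaning constraint.

```python
import random

random.seed(0)
M = 50                                     # meanings, one true meaning per word
meanings = list(range(M))
true_meaning = {w: w for w in range(M)}
candidates = {w: set(meanings) for w in range(M)}  # initially, anything is plausible

for _ in range(2000):                      # learning episodes
    w = random.choice(range(M))            # a word is uttered...
    # ...in a context containing its true meaning plus 5 plausible distractors
    context = {true_meaning[w]} | set(random.sample(meanings, 5))
    candidates[w] &= context               # intersect: keep only co-present meanings

learned = sum(1 for w in range(M) if candidates[w] == {true_meaning[w]})
print(f"{learned} of {M} words uniquely identified")
```

The true meaning is present in every episode and so always survives the intersection; spurious candidates survive an exposure only by chance, which is why repetition drives the candidate sets toward singletons.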

  10. Marshall N. Rosenbluth Outstanding Doctoral Thesis Award: Magnetorotational turbulence and dynamo

    NASA Astrophysics Data System (ADS)

    Squire, Jonathan

    2017-10-01

Accretion disks are ubiquitous in astrophysics and power some of the most luminous sources in the universe. In many disks, the transport of angular momentum, and thus the mass accretion itself, is thought to be caused by the magnetorotational instability (MRI). As the MRI saturates into strong turbulence, it also generates ordered magnetic fields, acting as a magnetic dynamo powered by the background shear flow. However, despite its importance for astrophysical accretion processes, basic aspects of MRI turbulence, including its saturation amplitude, remain poorly understood. In this talk, I will outline progress towards improving this situation, focusing in particular on the nonlinear shear dynamo and how this controls the turbulence. I will discuss how novel statistical simulation methods can be used to better understand this shear dynamo, in particular the distinct mechanisms that may play a role in MRI turbulence and how these depend on important physical parameters.

  11. Establishing linear solvation energy relationships between VOCs and monolayer-protected gold nanoclusters using quartz crystal microbalance.

    PubMed

    Li, Chi-Lin; Lu, Chia-Jung

    2009-08-15

Linear solvation energy relationships (LSERs) have been recognized as a useful model for investigating the chemical forces behind the partition coefficients between vapor molecules and absorbents. This study is the first to determine the solvation properties of monolayer-protected gold nanoclusters (MPCs) with different surface ligands. The ratios of partition coefficient to MPC density (K/rho) of 18 volatile organic compounds (VOCs) for four different MPCs, obtained through quartz crystal microbalance (QCM) experiments, were used for the LSER model calculations. LSER modeling results indicate that all MPC surfaces showed a statistically significant (p<0.05) preference for hydrogen-bond acidic molecules. Through dipole-dipole attraction, 4-methoxythiophenol-capped MPCs can also interact with polar organics (s=1.04). Showing a unique preference for the hydrogen-bond basicity of vapors (b=1.11), 2-benzothiazolethiol-capped MPCs provide evidence of an intra-molecular, proton-shift mechanism on the surface of nano-gold.

  12. Zipf's word frequency law in natural language: a critical review and future directions.

    PubMed

    Piantadosi, Steven T

    2014-10-01

    The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data.
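The law under review above says that word frequency falls off approximately as the inverse of frequency rank. A minimal sketch of the standard empirical check, assuming an idealized Zipfian vocabulary rather than real text, is to sample tokens and recover the exponent from a log-log regression of frequency against rank:

```python
import math
import random

random.seed(0)
V = 1000                                    # assumed vocabulary size
weights = [1.0 / r for r in range(1, V + 1)]  # ideal Zipf: f(r) proportional to 1/r
tokens = random.choices(range(1, V + 1), weights=weights, k=200_000)

counts = {}
for t in tokens:
    counts[t] = counts.get(t, 0) + 1
freqs = sorted(counts.values(), reverse=True)  # empirical rank-frequency curve

# Least-squares fit of log f = a - s * log r over the top 100 ranks.
xs = [math.log(r) for r in range(1, 101)]
ys = [math.log(f) for f in freqs[:100]]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
s = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print(f"estimated Zipf exponent: {s:.2f}")   # close to 1 for an ideal Zipf sample
```

On real corpora the fitted exponent is typically near, but not exactly, 1, and the article's point is precisely that the residual structure around this fit is informative.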

  13. Extracting Models in Single Molecule Experiments

    NASA Astrophysics Data System (ADS)

    Presse, Steve

    2013-03-01

Single molecule experiments can now monitor the journey of a protein from its assembly near a ribosome to its proteolytic demise. Ideally, all single molecule data should be self-explanatory. However, data originating from single molecule experiments are particularly challenging to interpret on account of fluctuations and noise at such small scales. Realistically, basic understanding comes from models carefully extracted from the noisy data. Statistical mechanics, and maximum entropy in particular, provide a powerful framework for accomplishing this task in a principled fashion. Here I will discuss our work in extracting conformational memory from single molecule force spectroscopy experiments on large biomolecules. One clear advantage of this method is that we let the data tend towards the correct model; we do not fit the data. I will show that the dynamical model of single molecule dynamics which emerges from this analysis is often more textured and complex than could otherwise come from fitting the data to a preconceived model.

  14. Intrinsic optimization using stochastic nanomagnets

    PubMed Central

    Sutton, Brian; Camsari, Kerem Yunus; Behin-Aein, Behtash; Datta, Supriyo

    2017-01-01

    This paper draws attention to a hardware system which can be engineered so that its intrinsic physics is described by the generalized Ising model and can encode the solution to many important NP-hard problems as its ground state. The basic constituents are stochastic nanomagnets which switch randomly between the ±1 Ising states and can be monitored continuously with standard electronics. Their mutual interactions can be short or long range, and their strengths can be reconfigured as needed to solve specific problems and to anneal the system at room temperature. The natural laws of statistical mechanics guide the network of stochastic nanomagnets at GHz speeds through the collective states with an emphasis on the low energy states that represent optimal solutions. As proof-of-concept, we present simulation results for standard NP-complete examples including a 16-city traveling salesman problem using experimentally benchmarked models for spin-transfer torque driven stochastic nanomagnets. PMID:28295053
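The physics described above, stochastic ±1 spins settling preferentially into low-energy states of an Ising cost function, has a simple software analogue: simulated annealing with Boltzmann-weighted flips. The sketch below uses a small random coupling matrix as a stand-in problem; the couplings, sizes and cooling schedule are illustrative assumptions, not the paper's nanomagnet model.

```python
import math
import random

random.seed(0)
N = 12
J = [[0.0] * N for _ in range(N)]          # symmetric random Ising couplings
for i in range(N):
    for j in range(i + 1, N):
        J[i][j] = J[j][i] = random.uniform(-1, 1)

def energy(s):
    return -sum(J[i][j] * s[i] * s[j] for i in range(N) for j in range(i + 1, N))

s = [random.choice((-1, 1)) for _ in range(N)]
best = energy(s)
T = 2.0
for step in range(20_000):
    i = random.randrange(N)
    dE = 2 * s[i] * sum(J[i][j] * s[j] for j in range(N))  # energy cost of flipping spin i
    if dE <= 0 or random.random() < math.exp(-dE / T):     # Boltzmann acceptance
        s[i] = -s[i]
        best = min(best, energy(s))
    T = max(0.01, T * 0.9997)              # slowly anneal toward low temperature
print("best energy found:", best)
```

In the hardware version the "flips" happen intrinsically at GHz rates; here the Metropolis rule plays the role of the thermal noise that biases the network toward ground states encoding the problem's solution.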

  15. Intrinsic optimization using stochastic nanomagnets

    NASA Astrophysics Data System (ADS)

    Sutton, Brian; Camsari, Kerem Yunus; Behin-Aein, Behtash; Datta, Supriyo

    2017-03-01

    This paper draws attention to a hardware system which can be engineered so that its intrinsic physics is described by the generalized Ising model and can encode the solution to many important NP-hard problems as its ground state. The basic constituents are stochastic nanomagnets which switch randomly between the ±1 Ising states and can be monitored continuously with standard electronics. Their mutual interactions can be short or long range, and their strengths can be reconfigured as needed to solve specific problems and to anneal the system at room temperature. The natural laws of statistical mechanics guide the network of stochastic nanomagnets at GHz speeds through the collective states with an emphasis on the low energy states that represent optimal solutions. As proof-of-concept, we present simulation results for standard NP-complete examples including a 16-city traveling salesman problem using experimentally benchmarked models for spin-transfer torque driven stochastic nanomagnets.

  16. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, so that data analysis results can be obtained in real time. Descriptive statistical methods, time-series analysis and multivariate regression analysis were implemented online on top of the database software using SQL and visualization tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connections to the database; and generates export sheets that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring implements the statistical analysis function online and can provide real-time analysis results to its users.

  17. Multi-scale mechanics from molecules to morphogenesis

    PubMed Central

    Davidson, Lance; von Dassow, Michelangelo; Zhou, Jian

    2009-01-01

    Dynamic mechanical processes shape the embryo and organs during development. Little is understood about the basic physics of these processes, what forces are generated, or how tissues resist or guide those forces during morphogenesis. This review offers an outline of some of the basic principles of biomechanics, provides working examples of biomechanical analyses of developing embryos, and reviews the role of structural proteins in establishing and maintaining the mechanical properties of embryonic tissues. Drawing on examples we highlight the importance of investigating mechanics at multiple scales from milliseconds to hours and from individual molecules to whole embryos. Lastly, we pose a series of questions that will need to be addressed if we are to understand the larger integration of molecular and physical mechanical processes during morphogenesis and organogenesis. PMID:19394436

  18. Katapultos: Teaching Basic Statistics with Ballistics.

    ERIC Educational Resources Information Center

    Fitzgerald, Mike

    2001-01-01

    Describes the use of catapults as a way to increase math, science, and technology correlations within the classroom. Includes detailed instructions, a list of materials for building a catapult, and print and Internet resources. (JOW)

  19. Genetics of Hearing Loss

    MedlinePlus


  20. Hearing Loss in Children

    MedlinePlus


  1. 77 FR 61791 - System of Records; Presidential Management Fellows Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-11

    ... program personnel for the following reasons: a. To determine basic program eligibility and to evaluate... descriptive statistics and analytical studies in support of the function for which the records are collected...

  2. Oral Medication

    MedlinePlus


  3. Weight Loss

    MedlinePlus


  4. Understanding Carbohydrates

    MedlinePlus


  5. Using basic statistics on the individual patient's own numeric data.

    PubMed

    Hart, John

    2012-12-01

This theoretical report gives an example of how the coefficient of variation (CV) and quartile analysis (QA) for assessing outliers might be used to analyze numeric data for an individual patient in practice. A patient was examined over 8 visits using infrared instrumentation to measure mastoid fossa temperature differential (MFTD) readings. CV and QA were applied to the readings. The participant also completed the Short Form-12 health perception survey on each visit, and these findings were correlated with CV to determine whether CV had outcomes support (clinical significance). According to QA, an outlier MFTD reading was observed on the eighth visit, coinciding with the largest CV value for the MFTDs. Correlations between the Short Form-12 and CV were low to negligible, positive, and statistically nonsignificant. This case provides an example of how basic statistical analyses could be applied to numerical data in chiropractic practice for an individual patient. This might add objectivity to analyzing an individual patient's data in practice, particularly when the clinical significance of a numerical finding is unknown.
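The two tools the report describes, the coefficient of variation and quartile-based outlier flagging, can be sketched with the standard library alone. The 8 readings below are fabricated example values, not the study's MFTD data.

```python
import statistics

readings = [0.4, 0.5, 0.3, 0.6, 0.4, 0.5, 0.4, 1.6]   # hypothetical per-visit readings

# Coefficient of variation: relative spread, as a percentage of the mean.
cv = statistics.stdev(readings) / statistics.mean(readings) * 100

# Quartile analysis: flag values beyond the classic 1.5 * IQR fences.
q = statistics.quantiles(readings, n=4, method="inclusive")
q1, q3 = q[0], q[2]
iqr = q3 - q1
outliers = [x for x in readings if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]

print(f"CV = {cv:.1f}%, outliers by the 1.5*IQR rule: {outliers}")
```

Here the eighth value is flagged as an outlier by the IQR fences, mirroring the report's observation that the outlier visit coincided with the largest CV.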

  6. Basic biostatistics for post-graduate students

    PubMed Central

    Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.

    2012-01-01

Statistical methods are important for drawing valid conclusions from the obtained data. This article provides background information on fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measures of central tendency and variation, and basic tests that are useful for the analysis of different types of observations. Parameters such as the normal distribution, calculation of sample size, level of significance, the null hypothesis, indices of variability, and the different tests are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to classify the distribution of data and apply the proper test. Information is also given regarding various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academia or industry. PMID:23087501
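One of the topics listed above, sample-size calculation, has a standard closed form worth showing. This is the common normal-approximation formula for comparing two means; the z-values are hard-coded for a two-sided 5% significance level and 80% power, and the SD and detectable difference are assumed example inputs.

```python
# Two-group sample size: n per group = 2 * (z_alpha + z_beta)^2 * sigma^2 / delta^2
z_alpha = 1.96   # two-sided alpha = 0.05
z_beta = 0.84    # power = 0.80
sigma = 10.0     # assumed common standard deviation of the outcome
delta = 5.0      # clinically meaningful difference to detect

n_per_group = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
print(f"required sample size per group: {round(n_per_group)}")  # 63 per group
```

The classic textbook result for these inputs is about 63 subjects per group; halving the detectable difference quadruples the required sample size, which is why underpowered studies are so common.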

  7. An Analysis of a Nationwide Study on Curricular Emphasis in Basic Mechanics

    ERIC Educational Resources Information Center

    Raville, M. E.; Lnenicka, W. J.

    1976-01-01

    Discusses a survey of curricular allocations to mechanics in departments and schools of engineering. Tables show trends of coverage of mechanics topics and faculty perceptions of teaching and learning trends. (MLH)

  8. Calculation of streamflow statistics for Ontario and the Great Lakes states

    USGS Publications Warehouse

    Piggott, Andrew R.; Neff, Brian P.

    2005-01-01

    Basic, flow-duration, and n-day frequency statistics were calculated for 779 current and historical streamflow gages in Ontario and 3,157 streamflow gages in the Great Lakes states with length-of-record daily mean streamflow data ending on December 31, 2000 and September 30, 2001, respectively. The statistics were determined using the U.S. Geological Survey’s SWSTAT and IOWDM, ANNIE, and LIBANNE software and Linux shell and PERL programming that enabled the mass processing of the data and calculation of the statistics. Verification exercises were performed to assess the accuracy of the processing and calculations. The statistics and descriptions, longitudes and latitudes, and drainage areas for each of the streamflow gages are summarized in ASCII text files and ESRI shapefiles.
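One of the statistics named above, the flow-duration curve, is simply the streamflow exceeded a given percentage of the time. The sketch below computes exceedance flows from a fabricated lognormal daily record; it is an illustration of the statistic, not the USGS SWSTAT implementation.

```python
import random

random.seed(0)
# Fabricated ~10 years of daily mean streamflow (lognormal is a common rough model).
daily_flow = [random.lognormvariate(2.0, 0.8) for _ in range(3650)]

def exceedance(flows, pct):
    """Flow exceeded pct% of the time (linear interpolation on descending sorted data)."""
    s = sorted(flows, reverse=True)
    pos = (len(s) - 1) * pct / 100.0
    lo, frac = int(pos), pos - int(pos)
    return s[lo] + frac * (s[min(lo + 1, len(s) - 1)] - s[lo])

for pct in (10, 50, 90):
    print(f"Q{pct} = {exceedance(daily_flow, pct):.2f}")
```

By construction Q10 > Q50 > Q90: high flows are exceeded rarely, low flows most of the time, and the full curve of these values over all percentages is the flow-duration curve.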

  9. Comparing early signs and basic symptoms as methods for predicting psychotic relapse in clinical practice.

    PubMed

    Eisner, Emily; Drake, Richard; Lobban, Fiona; Bucci, Sandra; Emsley, Richard; Barrowclough, Christine

    2018-02-01

    Early signs interventions show promise but could be further developed. A recent review suggested that 'basic symptoms' should be added to conventional early signs to improve relapse prediction. This study builds on preliminary evidence that basic symptoms predict relapse and aimed to: 1. examine which phenomena participants report prior to relapse and how they describe them; 2. determine the best way of identifying pre-relapse basic symptoms; 3. assess current practice by comparing self- and casenote-reported pre-relapse experiences. Participants with non-affective psychosis were recruited from UK mental health services. In-depth interviews (n=23), verbal checklists of basic symptoms (n=23) and casenote extracts (n=208) were analysed using directed content analysis and non-parametric statistical tests. Three-quarters of interviewees reported basic symptoms and all reported conventional early signs and 'other' pre-relapse experiences. Interviewees provided rich descriptions of basic symptoms. Verbal checklist interviews asking specifically about basic symptoms identified these experiences more readily than open questions during in-depth interviews. Only 5% of casenotes recorded basic symptoms; interviewees were 16 times more likely to report basic symptoms than their casenotes did. The majority of interviewees self-reported pre-relapse basic symptoms when asked specifically about these experiences but very few casenotes reported these symptoms. Basic symptoms may be potent predictors of relapse that clinicians miss. A self-report measure would aid monitoring of basic symptoms in routine clinical practice and would facilitate a prospective investigation comparing basic symptoms and conventional early signs as predictors of relapse. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Background Information and User’s Guide for MIL-F-9490

    DTIC Science & Technology

    1975-01-01

requirements, although different analysis results will apply to each requirement. Basic differences between the two reliability requirements are: MIL-F-8785B...provides the rationale for establishing such limits. The specific risk analysis comprises the same data which formed the average risk analysis, except...statistical analysis will be based on statistical data taken using limited exposure times of components and equipment. The exposure times and resulting

  11. Transparency, Accountability, and Engagement: A Recipe for Building Trust in Policing

    DTIC Science & Technology

    2017-06-01

Toward Community-orientated Policing: Potential, Basic Requirements, and Threshold Questions,” Crime and Delinquency 33 (1987): 6–30. 49 More, Current...States,” in Sourcebook of Criminal Justice Statistics Online, accessed June 4, 2017, http://www.albany.edu/sourcebook/csv/t2332011.csv. 89 Gary...to-date crime statistics, and empowered them to think creatively to develop individualized plans to address crime trends and conditions. His focus

  12. The development of ensemble theory. A new glimpse at the history of statistical mechanics

    NASA Astrophysics Data System (ADS)

    Inaba, Hajime

    2015-12-01

    This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.

  13. Are P values and statistical assessments in poster abstracts presented at annual meetings of Taiwan Society of Anesthesiologists relative to the characteristics of hospitals?

    PubMed

    Lee, Fu-Jung; Wu, Chih-Cheng; Peng, Shih-Yen; Fan, Kuo-Tung

    2007-09-01

Many anesthesiologists in medical centers (MC) or in anesthesiologist-training hospitals (ATH) are accustomed to presenting their research data as poster abstracts at the annual meetings of the Taiwan Society of Anesthesiologists (TSA) to represent their academic gains over a designated period of time. However, an orphaned P value without mention of the related statistical test has frequently been found in these articles. This article explores the difference between MC/ATH and non-MC/non-ATH institutions in naming the statistical test behind a reported P value in three recent consecutive TSA annual meetings. We collected the proceedings handbooks of the TSA annual meetings over a period spanning 3 years (2003 to 2005) and analyzed the hospital characteristics of the first institute-byliner of each poster abstract. Data were analyzed with Fisher's exact test, and statistical significance was assumed if P < 0.05. Included were 101 poster abstracts with byliners from 20 hospitals. Only 2 of the 20 hospitals were accredited as non-ATH and 4 as non-MC. There were 64 (63%) abstracts without a specified statistical test after the P value, and no significant difference was found between categories (P = 0.47 for ATH vs. non-ATH and P = 0.07 for MC vs. non-MC). The basic concept of reporting a P value together with its specified statistical test was not applied comprehensively in poster abstracts of the annual conferences. We suggest that the anesthesia administrators and senior anesthesiologists at ATH or MC, and the members of the committee responsible for academic affairs in the TSA, pay attention to this phenomenon and work together to improve the basic statistics in poster presentations.

  14. Development of polytoxicomania in function of defence from psychoticism.

    PubMed

    Nenadović, Milutin M; Sapić, Rosa

    2011-01-01

Polytoxicomania in youth subpopulations has been growing steadily in recent decades, and this trend is pan-continental. Psychoticism is a psychological construct that assumes special basic dimensions of personality disintegration and cognitive functions. Psychoticism may, in general, be the basis of pathological functioning of youth and influence the patterns of thought, feelings and actions that cause dysfunction. The aim of this study was to determine the distribution of basic dimensions of psychoticism in the commitment of youth to abuse of psychoactive substances (PAS) in order to reduce disturbing intrapsychic experiences or manifestations of psychotic symptoms. For the purpose of this study, two groups of respondents were formed, balanced by age, gender and family structure of origin (at least one parent alive). The study applied the DELTA-9 instrument for assessment of cognitive disintegration in establishing psychoticism and its operationalization. The obtained results were statistically analyzed. From the parameters of descriptive statistics, the arithmetic mean was calculated with measures of dispersion. A cross-tabular analysis of the tested variables was performed, as well as tests of statistical significance with Pearson's chi-square test and analysis of variance. Age structure and gender were approximately equally represented in the polytoxicomaniac group and the control group; testing did not confirm a statistically significant difference (p > 0.5). Statistical methodology established that polytoxicomaniacs differed significantly from the control group on most variables of psychoticism. Testing confirmed a highly statistically significant difference in the variables of psychoticism between the groups of respondents (p < 0.001 to p < 0.01). A statistically significant representation of the dimension of psychoticism in the polytoxicomaniac group was established. The presence of factors concerning common executive dysfunction was emphasized.

  15. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11(22%) were randomised controlled trials, 18(35%) were cohort studies, 11(22%) were case control studies and 11(22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49(96%) articles. Data dispersion was calculated by standard deviation in 30(59%). Standard error of the mean was quoted in 19(37%). The statistical software product was named in 33(65%). Of the 49 articles that used inferential statistics, the tests were named in 47(96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-square test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43(88%) and the exact significance levels were reported in 28(57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. 
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
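The handful of tests enumerated in the abstract above all have standard implementations; a minimal sketch (not from the article, with made-up group values and counts) of running them with SciPy:

```python
# Hedged sketch: the five most frequently reported tests in Burns,
# applied to synthetic data. Group names and numbers are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(10.0, 2.0, size=30)   # e.g. hypothetical outcome, group A
b = rng.normal(11.5, 2.0, size=30)   # group B

t_p = stats.ttest_ind(a, b).pvalue            # Student's t-test
f_p = stats.f_oneway(a, b, a + 1.0).pvalue    # one-way ANOVA (3 groups)
u_p = stats.mannwhitneyu(a, b).pvalue         # Mann-Whitney U (unpaired)
w_p = stats.wilcoxon(a, b).pvalue             # Wilcoxon signed-rank (paired)

table = np.array([[20, 10], [12, 18]])        # hypothetical 2x2 counts
chi2_stat, chi_p, dof, expected = stats.chi2_contingency(table)  # chi-squared
odds, fisher_p = stats.fisher_exact(table)    # Fisher's exact test

print(f"t={t_p:.3f} anova={f_p:.3f} U={u_p:.3f} chi2={chi_p:.3f}")
```

Choosing between the parametric pair (t-test/ANOVA) and the rank-based pair (Mann-Whitney/Wilcoxon) depends on whether the data plausibly meet normality assumptions, which is exactly the kind of judgment the abstract argues burn care professionals need.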

  16. Fundamental structural characteristics of planar granular assemblies: Self-organization and scaling away friction and initial state.

    PubMed

    Matsushima, Takashi; Blumenfeld, Raphael

    2017-03-01

The microstructural organization of a granular system is the most important determinant of its macroscopic behavior. Here we identify the fundamental factors that determine the statistics of such microstructures, using numerical experiments to gain a general understanding. The experiments consist of preparing and compacting isotropically two-dimensional granular assemblies of polydisperse frictional disks and analyzing the emergent statistical properties of quadrons, the basic structural elements of granular solids. The focus on quadrons is because the statistics of their volumes have been found to display intriguing universal-like features [T. Matsushima and R. Blumenfeld, Phys. Rev. Lett. 112, 098003 (2014)]. The dependence of the structures and of the packing fraction on the intergranular friction and the initial state is analyzed, and a number of significant results are found. (i) An analytical formula is derived for the mean quadron volume in terms of three macroscopic quantities: the mean coordination number, the packing fraction, and the fraction of rattlers. (ii) We derive a unique, initial-state-independent relation between the mean coordination number and the rattler-free packing fraction. The relation is supported numerically for a range of different systems. (iii) We collapse the quadron volume distributions from all systems onto one curve, and we verify that they all have an exponential tail. (iv) The nature of the quadron volume distribution is investigated by decomposition into conditional distributions of volumes given the cell order, and we find that each of these also collapses onto a single curve. (v) We find that the mean quadron volume decreases with increasing intergranular friction coefficient, an effect that is prominent in high-order cells. 
We argue that this phenomenon is due to an increased probability of stable irregularly shaped cells, and we test this using a free-cell analytical model developed herein. We conclude that, in principle, the microstructural characteristics are governed mainly by the packing procedure, while the effects of intergranular friction and initial state are details that can be scaled away. However, mechanical stability constraints slightly suppress the occurrence of small quadron volumes in cells of order ≥6, and the magnitude of this effect does depend on friction. We quantify this dependence in detail, along with the deviation it causes from an exact collapse for these cells. (vi) We argue that our results strongly support the view that ensemble granular statistical mechanics does not satisfy the uniform-measure assumption of conventional statistical mechanics. Results (i)-(iv) were reported in the aforementioned reference, and they are reviewed and elaborated on here.

  17. The mirror mechanism in the parietal lobe.

    PubMed

    Rizzolatti, Giacomo; Rozzi, Stefano

    2018-01-01

    The mirror mechanism is a basic mechanism that transforms sensory representations of others' actions into motor representations of the same actions in the brain of the observer. The mirror mechanism plays an important role in understanding the actions of others. In the present chapter we first discuss the basic organization of the posterior parietal lobe in the monkey, stressing that it is best characterized as a motor scaffold, on top of which sensory information is organized. We then describe the location of the mirror mechanism in the posterior parietal cortex of the monkey, and its functional role in area PFG and in the anterior, ventral, and lateral intraparietal areas. We then present evidence that a similar functional organization is present in humans, and conclude by discussing the role of the mirror mechanism in the recognition of actions performed with tools. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing.

    PubMed

    Zackay, Arie; Steinhoff, Christine

    2010-12-15

    Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is a growing need for tools to process and analyse the data together with statistical investigation and visualisation. MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them, and performs quality-control comparisons. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering, and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite-sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data, so any data analysis within R that allows the integration of various data types as provided by different technological platforms is convenient. MethVisual is the first and, so far, the only package in the R/Bioconductor environment specifically for DNA methylation analysis, in particular of bisulfite-sequenced data. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org.
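MethVisual itself is an R/Bioconductor package; as a language-neutral illustration of the kind of summary it computes, here is a hedged pandas sketch (data and column names invented) of per-site methylation frequencies and co-occurrence on a binarized read matrix:

```python
# Hedged sketch, not MethVisual: rows = bisulfite reads, columns = CpG
# sites; 1 = methylated, 0 = unmethylated (the binary coding the
# abstract refers to).
import pandas as pd

reads = pd.DataFrame(
    [[1, 1, 0, 0],
     [1, 1, 0, 1],
     [0, 0, 0, 0],
     [1, 1, 1, 0]],
    columns=["CpG1", "CpG2", "CpG3", "CpG4"],
)

# Summary statistics on methylation status: fraction methylated per site.
site_freq = reads.mean()

# Co-occurrence of methylation at neighbouring/distant sites: pairwise
# correlation of the binary calls (the phi coefficient for 0/1 data).
cooc = reads.corr()

print(site_freq.to_dict())
print(round(cooc.loc["CpG1", "CpG2"], 3))
```

In this toy matrix CpG1 and CpG2 are methylated on exactly the same reads, so their co-occurrence correlation is 1.0.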

  19. MethVisual - visualization and exploratory statistical analysis of DNA methylation profiles from bisulfite sequencing

    PubMed Central

    2010-01-01

    Background: Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is a growing need for tools to process and analyse the data together with statistical investigation and visualisation. Findings: MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them, and performs quality-control comparisons. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering, and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. Conclusions: The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite-sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data, so any data analysis within R that allows the integration of various data types as provided by different technological platforms is convenient. MethVisual is the first and, so far, the only package in the R/Bioconductor environment specifically for DNA methylation analysis, in particular of bisulfite-sequenced data. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org. PMID:21159174

  20. Statistical mechanics explanation for the structure of ocean eddies and currents

    NASA Astrophysics Data System (ADS)

    Venaille, A.; Bouchet, F.

    2010-12-01

    The equilibrium statistical mechanics of two-dimensional and geostrophic flows predicts the outcome for the large scales of the flow resulting from turbulent mixing. This theory has been successfully applied to describe detailed properties of Jupiter's Great Red Spot. We discuss the range of applicability of this theory to ocean dynamics. It is able to reproduce mesoscale structures like ocean rings. It explains, from statistical mechanics, the westward drift of rings at the speed of non-dispersive baroclinic waves, and the recently observed (Chelton et al.) slower northward drift of cyclonic eddies and southward drift of anticyclonic eddies. We also uncover relations between strong eastward mid-basin inertial jets, like the Kuroshio Extension and the Gulf Stream, and statistical equilibria, and we explain under which conditions such strong mid-basin jets can be understood as statistical equilibria. We claim that these results are complementary to the classical Sverdrup-Munk theory: they explain the inertial part of the basin dynamics, and the structure and location of the jets, using very simple theoretical arguments. References: A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to J. Phys. Oceanogr.; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, arXiv ..., submitted to Phys. Rep.; P. Berloff, A. M. Hogg and W. Dewar, The turbulent oscillator: a mechanism of low-frequency variability of the wind-driven ocean gyres, J. Phys. Oceanogr. 37 (2007) 2363; D. B. Chelton, M. G. Schlax, R. M. Samelson and R. A. de Szoeke, Global observations of large oceanic eddies, Geophys. Res. Lett. 34 (2007).
    [Figure: (b) and (c) are snapshots of the streamfunction and the potential vorticity (red: positive values; blue: negative values) in the upper layer of a three-layer quasi-geostrophic model of a mid-latitude ocean basin (from Berloff et al.); (a) is the streamfunction predicted by statistical mechanics.] Even in an out-of-equilibrium situation like this one, equilibrium statistical mechanics predicts the overall qualitative flow structure remarkably well. Chelton et al. observed the westward drift of ocean eddies, with a slower northward drift of cyclones and southward drift of anticyclones; we explain these observations from statistical mechanics.

  1. Forest fires in Pennsylvania.

    Treesearch

    Donald A. Haines; William A. Main; Eugene F. McNamara

    1978-01-01

    Describes factors that contribute to forest fires in Pennsylvania, including an analysis of basic statistics; the distribution of fires during normal, drought, and wet years; fire causes; fire activity by day of week; multiple-fire days; and fire climatology.

  2. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Describes the use of statistical tests to determine whether an observation is outside the normal range of expected values, with details of CART and regression analysis, the use of quantile regression analysis, CART in causal analysis, and simplifying or pruning the resulting trees.

  3. 77 FR 55475 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-10

    ... collected will be analyzed to produce estimates and basic descriptive statistics on the quantity and type of... mode of data collection by event types, and conduct correlations, cross tabulations of responses and...

  4. Geothermal Systems for School.

    ERIC Educational Resources Information Center

    Dinse, David H.

    1998-01-01

    Describes an award-winning school heating and cooling system in which two energy-efficient technologies, variable-flow pumping and geothermal heat pumps, were combined. The basic system schematic and annual energy use and cost savings statistics are provided. (GR)

  5. S.P.S.S. User's Manual #1-#4. Basic Program Construction in S.P.S.S.; S.P.S.S. Non-Procedural Statements and Procedural Commands; System Control Language and S.P.S.S.; Quick File Equate Statement Reference.

    ERIC Educational Resources Information Center

    Earl, Lorna L.

    This series of manuals describing and illustrating the Statistical Package for the Social Sciences (SPSS) was planned as a self-teaching instrument, beginning with the basics and progressing to an advanced level. Information on what the searcher must know to define the data and write a program for preliminary analysis is contained in manual 1,…

  6. Improving the Quality of Basic Education, Volume 6. Country Papers: Antigua, Bermuda, India, St. Kitts, Nevis, Turks & Caicos Islands. Conference of Commonwealth Education Ministers (11th, Barbados, October 29-November 2, 1990).

    ERIC Educational Resources Information Center

    Commonwealth Inst., London (England).

    Commonwealth Ministries of Education were asked to report on how they are undertaking the improvement of the quality of basic education in their respective countries. The papers in this volume focus on: (1) Antigua; (2) Bermuda; (3) India; (4) St. Kitts and Nevis; and (5) Turks and Caicos Islands. Charts and statistical data support each country's…

  7. Biomechanical concepts applicable to minimally invasive fracture repair in small animals.

    PubMed

    Chao, Peini; Lewis, Daniel D; Kowaleski, Michael P; Pozzi, Antonio

    2012-09-01

    Understanding the basic biomechanical principles of surgical stabilization of fractures is essential for developing an appropriate preoperative plan as well as making prudent intraoperative decisions. This article aims to provide basic biomechanical knowledge essential to the understanding of the complex interaction between the mechanics and biology of fracture healing. The type of healing and the outcome can be influenced by several mechanical factors, which depend on the interaction between bone and implant. The surgeon should understand the mechanical principles of fracture fixation and be able to choose the best type of fixation for each specific fracture. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. Fractional vector calculus and fluid mechanics

    NASA Astrophysics Data System (ADS)

    Lazopoulos, Konstantinos A.; Lazopoulos, Anastasios K.

    2017-04-01

    Basic fluid mechanics equations are studied and revised under the prism of fractional continuum mechanics (FCM), a very promising research field that satisfies both experimental and theoretical demands. The geometry of the fractional differential has been clarified and corrected, and the geometry of the fractional tangent spaces of a manifold has been studied in Lazopoulos and Lazopoulos (Progr. Fract. Differ. Appl. 2016, 2, 85-104), providing the basis for the missing fractional differential geometry. On this basis, much can be contributed to fractional hydrodynamics: the basic fractional fluid equations (Navier-Stokes, Euler, and Bernoulli) are derived, and fractional Darcy flow in porous media is studied.

  9. Apes are intuitive statisticians.

    PubMed

    Rakoczy, Hannes; Clüver, Annette; Saucke, Liane; Stoffregen, Nicole; Gräbener, Alice; Migura, Judith; Call, Josep

    2014-04-01

    Inductive learning and reasoning, as we use it both in everyday life and in science, is characterized by flexible inferences based on statistical information: inferences from populations to samples and vice versa. Many forms of such statistical reasoning have been found to develop late in human ontogeny, to depend on formal education and language, and to be fragile even in adults. Recent research, however, suggests that even preverbal human infants make use of intuitive statistics. Here, we conducted the first investigation of such intuitive statistical reasoning with non-human primates. In a series of seven experiments, bonobos, chimpanzees, gorillas, and orangutans drew flexible statistical inferences from populations to samples. These inferences, furthermore, were truly based on statistical information regarding the relative frequency distributions in a population, and not on absolute frequencies. Intuitive statistics in its most basic form is thus an evolutionarily ancient capacity rather than a uniquely human one. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Relationship between organizational structure and creativity in teaching hospitals.

    PubMed

    Rezaee, Rita; Marhamati, Saadat; Nabeiei, Parisa; Marhamati, Raheleh

    2014-07-01

    Organizational structure and manpower constitute two basic components of an organization, and both are necessary for establishing an organization. The aim of this survey was to investigate the type of organizational structure (mechanistic or organic) from the viewpoint of senior and junior managers in Shiraz teaching hospitals, and the creativity present in each of these two structures. In this cross-sectional, descriptive-analytic study, organizational structure and organizational creativity questionnaires were filled out by hospital managers. On statistical consultation, and because of the limited target population, the entire study population was taken as the sample; the sample size was thus 84 (12 hospitals, n = 7 per hospital). Data were analysed in SPSS 14 using the Spearman correlation coefficient and the t-test. Results showed a negative association of centralization and complexity with organizational creativity and its dimensions. There was also a negative association between formalization and four organizational creativity dimensions: accepting change, tolerating ambiguity, encouraging new ideas, and reduced external control (p=0.001). The results of this study showed that creativity in hospitals with an organic structure is greater than in hospitals with a mechanistic structure.

  11. Developing QSPR model of gas/particle partition coefficients of neutral poly-/perfluoroalkyl substances

    NASA Astrophysics Data System (ADS)

    Yuan, Quan; Ma, Guangcai; Xu, Ting; Serge, Bakire; Yu, Haiying; Chen, Jianrong; Lin, Hongjun

    2016-10-01

    Poly-/perfluoroalkyl substances (PFASs) are a class of synthetic fluorinated organic substances that raise increasing concern because of their environmental persistence, bioaccumulation, and widespread presence in various environmental media and organisms. PFASs can be released into the atmosphere through both direct and indirect sources, and the gas/particle partition coefficient (KP) is an important parameter that helps us to understand their atmospheric behavior. In this study, we developed a temperature-dependent predictive model for log KP of PFASs and analyzed the molecular mechanism that governs their partitioning equilibrium between the gas phase and the particle phase. All theoretical computations were carried out at the B3LYP/6-31G(d,p) level on neutral molecular structures using the Gaussian 09 program package. The regression model has good statistical performance and robustness, and its application domain has been defined according to OECD guidance. The mechanism analysis shows that electrostatic and dispersion interactions play the most important roles in the partitioning equilibrium. The developed model can be used to predict log KP values of neutral fluorotelomer alcohols and perfluoroalkane sulfonamides/sulfonamidoethanols with different substituents at the nitrogen atom, providing basic data for their ecological risk assessment.

  12. Relationship between organizational structure and creativity in teaching hospitals

    PubMed Central

    REZAEE, RITA; MARHAMATI, SAADAT; NABEIEI, PARISA; MARHAMATI, RAHELEH

    2014-01-01

    Introduction: Organizational structure and manpower constitute two basic components of an organization, and both are necessary for establishing an organization. The aim of this survey was to investigate the type of organizational structure (mechanistic or organic) from the viewpoint of senior and junior managers in Shiraz teaching hospitals, and the creativity present in each of these two structures. Methods: In this cross-sectional, descriptive-analytic study, organizational structure and organizational creativity questionnaires were filled out by hospital managers. On statistical consultation, and because of the limited target population, the entire study population was taken as the sample; the sample size was thus 84 (12 hospitals, n = 7 per hospital). Data were analysed in SPSS 14 using the Spearman correlation coefficient and the t-test. Results: Results showed a negative association of centralization and complexity with organizational creativity and its dimensions. There was also a negative association between formalization and four organizational creativity dimensions: accepting change, tolerating ambiguity, encouraging new ideas, and reduced external control (p=0.001). Conclusion: The results of this study showed that creativity in hospitals with an organic structure is greater than in hospitals with a mechanistic structure. PMID:25512934

  13. Multi-Agent Market Modeling of Foreign Exchange Rates

    NASA Astrophysics Data System (ADS)

    Zimmermann, Georg; Neuneier, Ralph; Grothmann, Ralph

    A market mechanism is basically driven by a superposition of the decisions of many agents optimizing their profit. The economic price dynamic is a consequence of the cumulative excess demand/supply created at this micro level. The behavior of a small number of agents is well understood through game theory. For a large number of agents, one may use the limiting case in which an individual agent has no influence on the market, which allows agents to be aggregated by statistical methods. In contrast to this restriction, we can drop the assumption of an atomistic market structure by modeling the market with a multi-agent approach. The contribution of the mathematical theory of neural networks to market price formation is mostly seen on the econometric side: neural networks allow the fitting of high-dimensional nonlinear dynamic models. Furthermore, in our opinion, there is a close relationship between economics and the modeling ability of neural networks, because a neuron can be interpreted as a simple model of decision making. With this in mind, a neural network models the interaction of many decisions and can hence be interpreted as the price-formation mechanism of a market.

  14. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.

  15. Accuracy analysis of pointing control system of solar power station

    NASA Technical Reports Server (NTRS)

    Hung, J. C.; Peebles, P. Z., Jr.

    1978-01-01

    The first-phase effort concentrated on defining the minimum basic functions that the retrodirective array must perform, identifying circuits that are capable of satisfying the basic functions, and looking at some of the error sources in the system and how they affect accuracy. The initial effort also examined three methods for generating torques for mechanical antenna control, performed a rough analysis of the flexible body characteristics of the solar collector, and defined a control system configuration for mechanical pointing control of the array.

  16. High cumulants of conserved charges and their statistical uncertainties

    NASA Astrophysics Data System (ADS)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. For a given number of events, the measured cumulants fluctuate randomly with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with the corresponding values of the obtained cumulants: in general, the larger the cumulants we measure, the larger the estimated statistical uncertainties. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, it is found that the three-sigma rule of thumb is still applicable when the statistics are above one million events. Supported by NSFC (11405088, 11521064, 11647093), the Major State Basic Research Development Program of China (2014CB845402) and the Ministry of Science and Technology (MoST) (2016YFE0104800)
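Statistical uncertainties of high cumulants are commonly estimated by resampling; a hedged NumPy sketch (not the paper's method; toy Poisson "events" invented for illustration) of a bootstrap error on a fourth-order cumulant:

```python
# Hedged sketch: bootstrap standard error of the fourth cumulant C4 of an
# event-by-event multiplicity distribution. For a Poisson distribution all
# cumulants equal the mean, so C4 should come out near 10 here.
import numpy as np

rng = np.random.default_rng(1)
events = rng.poisson(10.0, size=20000)   # toy event-by-event multiplicities

def c4(x):
    """Fourth cumulant: <d^4> - 3<d^2>^2 with d = x - <x>."""
    d = x - x.mean()
    return np.mean(d**4) - 3.0 * np.mean(d**2) ** 2

value = c4(events)
boot = np.array([c4(rng.choice(events, size=events.size, replace=True))
                 for _ in range(200)])
err = boot.std(ddof=1)                   # bootstrap statistical uncertainty

print(f"C4 = {value:.2f} +/- {err:.2f}")
```

Repeating this at different sample sizes shows the effect the abstract describes: with fewer events both the scatter of the measured C4 and its estimated uncertainty grow.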

  17. Detector noise statistics in the non-linear regime

    NASA Technical Reports Server (NTRS)

    Shopbell, P. L.; Bland-Hawthorn, J.

    1992-01-01

    The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
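The clipping effect described above is easy to reproduce numerically; a hedged sketch (not from the paper; the count rate and saturation level are invented) of how a saturation level shrinks the variance of Poisson photon noise and distorts the higher moments:

```python
# Hedged sketch: an idealized linear detector with a saturation level.
# Clipping removes part of the upper tail of the Poisson noise
# distribution, reducing the variance and driving the skewness negative.
import numpy as np

rng = np.random.default_rng(2)
photons = rng.poisson(1000.0, size=100_000)   # ideal linear response
saturated = np.minimum(photons, 1020)         # saturation clips high counts

def skew_kurt(x):
    """Sample skewness and excess kurtosis."""
    d = x - x.mean()
    s2 = np.mean(d**2)
    return np.mean(d**3) / s2**1.5, np.mean(d**4) / s2**2 - 3.0

var_lin, var_clip = photons.var(), saturated.var()
skew_clip, kurt_clip = skew_kurt(saturated)

print(var_lin, var_clip, skew_clip, kurt_clip)
```

The clipped variance falls below the linear-regime value, which is exactly the bias a matched-count-rate flat-field calibration, as suggested in the abstract, would need to account for.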

  18. Awareness, Attitude, and Knowledge of Basic Life Support among Medical, Dental, and Nursing Faculties and Students in the University Hospital.

    PubMed

    Sangamesh, N C; Vidya, K C; Pathi, Jugajyoti; Singh, Arpita

    2017-01-01

    To assess the awareness, attitude, and knowledge of basic life support (BLS) among medical, dental, and nursing students and faculty, and to propose BLS skills for the academic curriculum of the undergraduate (UG) course. Recognition, prevention, and effective management of life-threatening emergencies are the responsibility of health-care professionals, and such situations can be successfully managed with proper knowledge and training in BLS skills. These life-saving maneuvers can be taught through structured resuscitation programs, which are lacking in the academic curriculum. A questionnaire study consisting of 20 questions was conducted among 659 participants at the Kalinga Institute of Dental Sciences and the Kalinga Institute of Medical Sciences, KIIT University. Medical junior residents, BDS faculty, interns, nursing faculty, and 3rd-year and final-year UG students from both medical and dental colleges were chosen. Statistical analysis was carried out using SPSS software version 20.0 (Armonk, NY: IBM Corp.). After data collection, the values were statistically analyzed and tabulated; statistical analysis was performed using the Mann-Whitney U-test, and results with P < 0.05 were considered statistically significant. Participants were aware of BLS and showed a positive attitude toward it, whereas knowledge about BLS was lacking, with statistically significant P values. By introducing BLS regularly into the academic curriculum and through routine hands-on workshops, all health-care providers should become well versed in BLS skills for effectively managing life-threatening emergencies.

  19. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for the pressure and develops all of thermodynamics from it. It is a general formalism, consistent with our earlier studies [V. M. Bannur, Phys. Lett. B 647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)], in which one starts from the expression for the energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are incorrectly called thermodynamically consistent relations, we recover other formalisms of quasi-particle systems, such as that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D 52, 5206 (1995), widely studied for the quark-gluon plasma.
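The pressure-based route the abstract describes can be made concrete with the standard grand-canonical relations; a minimal illustration (standard thermodynamics, not the paper's specific quasi-particle expressions) for one conserved charge with chemical potential $\mu$:

```latex
% Landau route: start from the pressure p(T,\mu) and derive the rest.
s = \left(\frac{\partial p}{\partial T}\right)_{\mu}, \qquad
n = \left(\frac{\partial p}{\partial \mu}\right)_{T}, \qquad
\varepsilon = -p + Ts + \mu n .
% Pathria route: start instead from \varepsilon(T,\mu) and integrate
% the same relations to recover p.
```

The two routes agree whenever $p$ and $\varepsilon$ are connected by the Legendre-type relation in the last equation, which is the consistency the abstract refers to.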

  20. A Role for Weak Electrostatic Interactions in Peripheral Membrane Protein Binding

    PubMed Central

    Khan, Hanif M.; He, Tao; Fuglebakk, Edvin; Grauffel, Cédric; Yang, Boqian; Roberts, Mary F.; Gershenson, Anne; Reuter, Nathalie

    2016-01-01

    Bacillus thuringiensis phosphatidylinositol-specific phospholipase C (BtPI-PLC) is a secreted virulence factor that binds specifically to phosphatidylcholine (PC) bilayers containing negatively charged phospholipids. BtPI-PLC carries a negative net charge and its interfacial binding site has no obvious cluster of basic residues. Continuum electrostatic calculations show that, as expected, nonspecific electrostatic interactions between BtPI-PLC and membranes vary as a function of the fraction of anionic lipids present in the bilayers. Yet they are strikingly weak, with a calculated ΔG_el below 1 kcal/mol, largely due to a single lysine (K44). When K44 is mutated to alanine, the equilibrium dissociation constant for small unilamellar vesicles increases more than 50 times (∼2.4 kcal/mol), suggesting that interactions between K44 and lipids are not merely electrostatic. Comparisons of molecular-dynamics simulations performed using different lipid compositions reveal that the bilayer composition does not affect either hydrogen bonds or hydrophobic contacts between the protein interfacial binding site and bilayers. However, the occupancies of cation-π interactions between PC choline headgroups and protein tyrosines vary as a function of PC content. The overall contribution of basic residues to binding affinity is also context dependent and cannot be approximated by a rule-of-thumb value because these residues can contribute to both nonspecific electrostatic and short-range protein-lipid interactions. Additionally, statistics on the distribution of basic amino acids in a data set of membrane-binding domains reveal that weak electrostatics, as observed for BtPI-PLC, might be a less unusual mechanism for peripheral membrane binding than is generally thought. PMID:27028646

  1. Ergodic theorem, ergodic theory, and statistical mechanics

    PubMed Central

    Moore, Calvin C.

    2015-01-01

    This perspective highlights the mean ergodic theorem established by John von Neumann and the pointwise ergodic theorem established by George Birkhoff, proofs of which were published nearly simultaneously in PNAS in 1931 and 1932. These theorems were of great significance both in mathematics and in statistical mechanics. In statistical mechanics they provided a key insight into a 60-y-old fundamental problem of the subject—namely, the rationale for the hypothesis that time averages can be set equal to phase averages. The evolution of this problem is traced from the origins of statistical mechanics and Boltzmann's ergodic hypothesis to the Ehrenfests' quasi-ergodic hypothesis, and then to the ergodic theorems. We discuss communications between von Neumann and Birkhoff in the Fall of 1931 leading up to the publication of these papers and related issues of priority. These ergodic theorems initiated a new field of mathematical research called ergodic theory that has thrived ever since, and we discuss some of the recent developments in ergodic theory that are relevant for statistical mechanics. PMID:25691697
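
    The "time averages equal phase averages" rationale can be illustrated with a toy dynamical system. The sketch below uses an irrational rotation of the circle, a standard ergodic example chosen here purely for illustration (it is not taken from the paper), and checks that the Birkhoff time average of an observable approaches its phase-space average:

```python
import math

# Toy illustration of "time average = phase average": an irrational
# rotation of the circle is ergodic, so the Birkhoff time average of an
# observable converges to its integral over the phase space [0, 1).
alpha = math.sqrt(2)                      # irrational rotation number
f = lambda x: math.cos(2 * math.pi * x)   # observable with phase average 0
x, total, n = 0.0, 0.0, 100_000
for _ in range(n):
    total += f(x)
    x = (x + alpha) % 1.0
time_avg = total / n
phase_avg = 0.0                           # integral of cos(2*pi*x) over [0, 1)
print(abs(time_avg - phase_avg) < 1e-3)   # prints True
```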

  2. Adaptive statistical pattern classifiers for remotely sensed data

    NASA Technical Reports Server (NTRS)

    Gonzalez, R. C.; Pace, M. O.; Raulston, H. S.

    1975-01-01

    A technique for the adaptive estimation of nonstationary statistics necessary for Bayesian classification is developed. The basic approach to the adaptive estimation procedure consists of two steps: (1) an optimal stochastic approximation of the parameters of interest and (2) a projection of the parameters in time or position. A divergence criterion is developed to monitor algorithm performance. Comparative results of adaptive and nonadaptive classifier tests are presented for simulated four-dimensional spectral scan data.

  3. Increasing Effectiveness and Efficiency Through Risk-Based Deployments

    DTIC Science & Technology

    2015-12-01

    Shaw and Henry McKay, both University of Chicago professors, began using maps to understand juvenile delinquency better in Chicago, IL.36 In the...André-Michel Guerry’s Ordonnateur Statistique: The First Statistical Calculator?,” The American Statistician 66, no. 3 (August 1, 2012): 195–200...micro or macro levels using basic inferential statistics.”91 5. Protecting Civil Rights and Liberties It is also important to note that a risk

  4. IDA (Institute for Defense Analyses) GAMMA-Ray Laser Annual Summary Report (1986). Investigation of the Feasibility of Developing a Laser Using Nuclear Transitions

    DTIC Science & Technology

    1988-12-01

    a computer simulation for a small value of r (p. 25). Figure 5. A typical pulse shape for r = 8192 (p. 26). Figure 6. Pulse duration as a function of r from the statistical simulations, assuming a spontaneous lifetime of 1 s...scaling factor from the statistical simulations (p. 29). Figure 10. Basic pulse characteristics and associated Bloch vector angles for the

  5. Transfer of SIMNET Training in the Armor Officer Basic Course

    DTIC Science & Technology

    1991-01-01

    group correctly performed more tasks in the posttest, but the difference was not statistically significant for these small samples. Gains from pretest...to posttest were not compared statistically, but the field-trained group showed little average gain. Based on these results and other supporting data...that serve as a control group, and (b) SIMNET classes after the change that serve as a treatment group. The comparison is termed quasi-experimental

  6. Statistical Characteristics of Single Sort of Grape Bulgarian Wines

    NASA Astrophysics Data System (ADS)

    Boyadzhiev, D.

    2008-10-01

    The aim of this paper is to evaluate the differences in the values of the eight basic physicochemical indices of single-sort Bulgarian grape wines (white and red), which are obligatory for the standardization of finished production in the winery. Statistically significant differences in the values across sorts and vintages are established, and possibilities for identifying the sort and the vintage on the basis of these indices by applying discriminant analysis are discussed.

  7. Assembly of Ultra-Dense Nanowire-Based Computing Systems

    DTIC Science & Technology

    2006-06-30

    Characterized basic device element properties and statistics; demonstrated product of sums (POS) validating assembled 2-bit adder structures; demonstrated...linear region (Vds = 10 mV) from the peak g = 3 μS at |Vg − VT| = 0.13 V using the charge control model, represents more than a factor of 10 improvement over...disrupted by ionizing particles or thermal fluctuation. Further, when working with such small charges, it is statistically possible that logic

  8. [Bayesian statistics in medicine -- part II: main applications and inference].

    PubMed

    Montomoli, C; Nichelatti, M

    2008-01-01

    Bayesian statistics is not only used when one is dealing with 2-way tables; it can also be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing their foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary steps of the analysis are compared to those of frequentist (classical) statistical analysis. Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to the diagnostic process.
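
    The diagnostic-process analogy described above can be sketched numerically. The prevalence, sensitivity, and specificity values below are illustrative assumptions, not data from the paper:

```python
# Bayes' theorem applied to a diagnostic test, echoing the diagnostic-
# process analogy. The prevalence, sensitivity, and specificity values
# below are illustrative assumptions, not data from the paper.

def posterior_positive(prevalence, sensitivity, specificity):
    """P(disease | positive test) by Bayes' theorem."""
    p_pos_given_disease = sensitivity
    p_pos_given_healthy = 1.0 - specificity
    p_pos = (p_pos_given_disease * prevalence
             + p_pos_given_healthy * (1.0 - prevalence))
    return p_pos_given_disease * prevalence / p_pos

# A rare disease keeps the posterior modest even with a good test.
print(round(posterior_positive(0.01, 0.95, 0.90), 3))  # prints 0.088
```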

  9. A statistical mechanical approach to restricted integer partition functions

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-05-01

    The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions, which count the number of partitions—ways of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition function corresponding to general statistics, a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of a restricted integer partition function is just a symmetric function, i.e., a function invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
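
    The generating-function construction can be made concrete with a short dynamic-programming sketch. Multiplying out the product over "modes" term by term mirrors building a canonical partition function: the boson-like product Π 1/(1 − x^k) counts unrestricted partitions, while the fermion-like product Π (1 + x^k) counts partitions into distinct parts. This is a generic illustration, not the paper's full Gentile-statistics construction:

```python
# Build the generating function as a product over "modes" k = 1..n,
# keeping polynomial coefficients up to x^n. Bosons (no restriction):
# prod 1/(1 - x^k); fermions (distinct parts): prod (1 + x^k).

def partitions(n, distinct=False):
    """Number of partitions of n (distinct=True: parts must be distinct)."""
    coeffs = [1] + [0] * n                  # generating-function coefficients
    for k in range(1, n + 1):
        if distinct:
            for i in range(n, k - 1, -1):   # multiply by (1 + x^k)
                coeffs[i] += coeffs[i - k]
        else:
            for i in range(k, n + 1):       # multiply by 1/(1 - x^k)
                coeffs[i] += coeffs[i - k]
    return coeffs[n]

print(partitions(5))                 # prints 7  (5, 4+1, 3+2, 3+1+1, ...)
print(partitions(5, distinct=True))  # prints 3  (5, 4+1, 3+2)
```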

  10. Mutual information and phase dependencies: measures of reduced nonlinear cardiorespiratory interactions after myocardial infarction.

    PubMed

    Hoyer, Dirk; Leder, Uwe; Hoyer, Heike; Pompe, Bernd; Sommer, Michael; Zwiener, Ulrich

    2002-01-01

    The heart rate variability (HRV) is related to several mechanisms of complex autonomic functioning, such as respiratory heart rate modulation and phase dependencies between heart beat cycles and breathing cycles. The underlying processes are basically nonlinear. In order to understand and quantitatively assess those physiological interactions, an adequate coupling analysis is necessary. We hypothesized that nonlinear measures of HRV and cardiorespiratory interdependencies are superior to the standard HRV measures in classifying patients after acute myocardial infarction. We introduced mutual information measures, which provide access to nonlinear interdependencies, as a counterpart to the classical linear correlation analysis. The nonlinear statistical autodependencies of HRV were quantified by auto mutual information, and the respiratory heart rate modulation by cardiorespiratory cross mutual information. The phase interdependencies between heart beat cycles and breathing cycles were assessed based on the histograms of the frequency ratios of the instantaneous heart beat and respiratory cycles. Furthermore, the relative duration of phase-synchronized intervals was acquired. We investigated 39 patients after acute myocardial infarction versus 24 controls. The discrimination of these groups was improved by cardiorespiratory cross mutual information measures and phase interdependencies measures in comparison to the linear standard HRV measures. This result was statistically confirmed by means of logistic regression models of particular variable subsets and their receiver operating characteristics.
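
    The core quantity, mutual information between two signals, can be estimated with a simple histogram approach. The sketch below is a minimal illustration of the idea with synthetic stand-in signals; the study's exact estimator and preprocessing may differ:

```python
import math
import random
from collections import Counter

def mutual_information(x, y, bins=8):
    """Histogram-based mutual information estimate in bits. A minimal
    sketch of the idea; the study's exact estimator may differ."""
    def discretize(s):
        lo, hi = min(s), max(s)
        width = (hi - lo) / bins or 1.0   # guard against constant signals
        return [min(int((v - lo) / width), bins - 1) for v in s]
    xd, yd = discretize(x), discretize(y)
    n = len(xd)
    px, py, pxy = Counter(xd), Counter(yd), Counter(zip(xd, yd))
    mi = 0.0
    for (a, b), count in pxy.items():
        p_ab = count / n
        mi += p_ab * math.log2(p_ab * n * n / (px[a] * py[b]))
    return mi

sig = [math.sin(0.1 * i) for i in range(2000)]   # stand-in periodic signal
noise = [random.random() for _ in range(2000)]   # independent noise
# A signal shares far more information with itself than with noise.
print(mutual_information(sig, sig), mutual_information(sig, noise))
```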

  11. Granularity refined by knowledge: contingency tables and rough sets as tools of discovery

    NASA Astrophysics Data System (ADS)

    Zytkow, Jan M.

    2000-04-01

    Contingency tables represent data in a granular way and are a well-established tool for inductive generalization of knowledge from data. We show that the basic concepts of rough sets, such as concept approximation, indiscernibility, and reduct, can be expressed in the language of contingency tables. We further demonstrate the relevance to rough sets theory of additional probabilistic information available in contingency tables, in particular of statistical tests of significance and predictive strength applied to contingency tables. Tests of both types can help the evaluation mechanisms used in inductive generalization based on rough sets. Granularity of attributes can be improved in feedback with knowledge discovered in data. We demonstrate how 49er's facilities for (1) contingency table refinement, (2) column and row grouping based on correspondence analysis, and (3) the search for equivalence relations between attributes improve both granularization of attributes and the quality of knowledge. Finally, we demonstrate the limitations of knowledge viewed as concept approximation, which is the focus of rough sets. Transcending that focus and reorienting towards predictive knowledge and towards the related distinction between possible and impossible (or statistically improbable) situations will be very useful in expanding the rough sets approach to more expressive forms of knowledge.
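
    The significance tests mentioned above can be illustrated with a Pearson chi-square statistic computed directly from a contingency table. This is a generic sketch of the standard test, not 49er's actual implementation:

```python
# Pearson chi-square statistic for a contingency table (list of rows) --
# the kind of significance test applied to contingency tables above.

def chi_square(table):
    """Chi-square statistic comparing observed counts to independence."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

print(chi_square([[10, 20], [20, 10]]))  # prints 6.666... (dependence)
print(chi_square([[10, 10], [10, 10]]))  # prints 0.0 (perfect independence)
```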

  12. Epigenetics and cancer: towards an evaluation of the impact of environmental and dietary factors.

    PubMed

    Herceg, Zdenko

    2007-03-01

    While the field of cancer genetics has enjoyed a great deal of attention among cancer researchers in the last few decades, the appreciation of cancer epigenetics is more recent, owing to the fact that epigenetic mechanisms have emerged as key mechanisms in cancer development. All critical changes in cancer cells, such as silencing of tumour-suppressor genes, activation of oncogenes and defects in DNA repair, are caused not only by genetic but also by epigenetic mechanisms. Epigenetic events can affect many steps in tumour development; therefore, better understanding of epigenetic mechanisms is fundamental to our ability to successfully prevent, diagnose and treat cancer. Various environmental and dietary agents and lifestyles are suspected to be implicated in the development of a wide range of human cancers by eliciting epigenetic changes, though the contribution of epigenetic mechanisms to a given human cancer type and the precise targets of epigenetic alterations during cancer development are largely unknown. The major obstacle in establishing a relationship between epigenetic changes and exposure to dietary, lifestyle and environmental factors and cancer is the fact that studies are typically too small and lack statistical power to identify the interactions between epigenetic changes and exposures. Tremendous advances in our understanding of basic epigenetic mechanisms and rapid progress that is being made in developing new powerful technologies, such as those for sensitive and quantitative detection of epigenetic changes as well as for genome-wide analysis (epigenomics), hold great promise that these issues may be addressed in the near future. Therefore, experimental evidence on the precise role of epigenetic changes induced by environment, diet and lifestyle is eagerly awaited.

  13. Calculating Student Grades.

    ERIC Educational Resources Information Center

    Allswang, John M.

    1986-01-01

    This article provides two short microcomputer gradebook programs. The programs, written in BASIC for the IBM-PC and Apple II, provide statistical information about class performance and calculate grades either on a normal distribution or based on teacher-defined break points. (JDH)

  14. Urban Data Book : Volume 1. Urban Data - Atlanta-Miami

    DOT National Transportation Integrated Search

    1975-11-01

    A quick reference compilation of certain population, socio-economic, employment, and modal split characteristics of the 35 largest Standard Metropolitan Statistical Areas (SMSA) in the United States is presented. The three basic groups of urban data ...

  15. Internet starter kit update 1997

    DOT National Transportation Integrated Search

    1997-01-01

    The Bureau of Transportation Statistics (BTS) established an Internet site in 1995, and also produced an Internet Starter Kit not only to assist transportation professionals in accessing the new Internet site but also to give them a basic overview of...

  16. SEDPAK—A comprehensive operational system and data-processing package in APPLESOFT BASIC for a settling tube, sediment analyzer

    NASA Astrophysics Data System (ADS)

    Goldbery, R.; Tehori, O.

    SEDPAK provides a comprehensive software package for operation of a settling tube and sand analyzer (2-0.063 mm) and includes data-processing programs for statistical and graphic output of results. The programs are menu-driven and written in APPLESOFT BASIC, conforming with APPLE 3.3 DOS. Data storage and retrieval from disc is an important feature of SEDPAK. Additional features of SEDPAK include condensation of raw settling data via standard size-calibration curves to yield statistical grain-size parameters, plots of grain-size frequency distributions and cumulative log/probability curves. The program also has a module for processing of grain-size frequency data from sieved samples. A further feature of SEDPAK is the option for automatic data processing and graphic output of a sequential or nonsequential array of samples on one side of a disc.

  17. Endpoints and cutpoints in head and neck oncology trials: methodical background, challenges, current practice and perspectives.

    PubMed

    Hezel, Marcus; von Usslar, Kathrin; Kurzweg, Thiemo; Lörincz, Balazs B; Knecht, Rainald

    2016-04-01

    This article reviews the methodical and statistical basics of designing a trial, with a special focus on the process of defining and choosing endpoints and cutpoints as the foundations of clinical research, and ultimately that of evidence-based medicine. There has been a significant progress in the treatment of head and neck cancer in the past few decades. Currently available treatment options can have a variety of different goals, depending e.g. on tumor stage, among other factors. The outcome of a specific treatment in clinical trials is measured using endpoints. Besides classical endpoints, such as overall survival or organ preservation, other endpoints like quality of life are becoming increasingly important in designing and conducting a trial. The present work is based on electronic research and focuses on the solid methodical and statistical basics of a clinical trial, on the structure of study designs and on the presentation of various endpoints.

  18. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    NASA Astrophysics Data System (ADS)

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

    We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.

  19. A new paradigm for the molecular basis of rubber elasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, David E.; Barber, John L.

    The molecular basis for rubber elasticity is arguably the oldest and one of the most important questions in the field of polymer physics. The theoretical investigation of rubber elasticity began in earnest almost a century ago with the development of analytic thermodynamic models, based on simple, highly-symmetric configurations of so-called Gaussian chains, i.e. polymer chains that obey Markov statistics. Numerous theories have been proposed over the past 90 years based on the ansatz that the elastic force for individual network chains arises from the entropy change associated with the distribution of end-to-end distances of a free polymer chain. There are serious philosophical objections to this assumption and others, such as the assumption that all network nodes undergo affine motion and that all of the network chains have the same length. Recently, a new paradigm for elasticity in rubber networks has been proposed that is based on mechanisms that originate at the molecular level. Using conventional statistical mechanics analyses, quantum chemistry, and molecular dynamics simulations, the fundamental entropic and enthalpic chain extension forces for polyisoprene (natural rubber) have been determined, along with estimates for the basic force constants. Concurrently, the complex morphology of natural rubber networks (the joint probability density distributions that relate the chain end-to-end distance to its contour length) has also been captured in a numerical model. When molecular chain forces are merged with the network structure in this model, it is possible to study the mechanical response to tensile and compressive strains of a representative volume element of a polymer network. As strain is imposed on a network, pathways of connected taut chains, which completely span the network along the strain axis, emerge. Although these chains represent only a few percent of the total, they account for nearly all of the elastic stress at high strain. Here we provide a brief review of previous elasticity theories and their deficiencies, and present a new paradigm with an emphasis on experimental comparisons.

  20. A new paradigm for the molecular basis of rubber elasticity

    DOE PAGES

    Hanson, David E.; Barber, John L.

    2015-02-19

    The molecular basis for rubber elasticity is arguably the oldest and one of the most important questions in the field of polymer physics. The theoretical investigation of rubber elasticity began in earnest almost a century ago with the development of analytic thermodynamic models, based on simple, highly-symmetric configurations of so-called Gaussian chains, i.e. polymer chains that obey Markov statistics. Numerous theories have been proposed over the past 90 years based on the ansatz that the elastic force for individual network chains arises from the entropy change associated with the distribution of end-to-end distances of a free polymer chain. There are serious philosophical objections to this assumption and others, such as the assumption that all network nodes undergo affine motion and that all of the network chains have the same length. Recently, a new paradigm for elasticity in rubber networks has been proposed that is based on mechanisms that originate at the molecular level. Using conventional statistical mechanics analyses, quantum chemistry, and molecular dynamics simulations, the fundamental entropic and enthalpic chain extension forces for polyisoprene (natural rubber) have been determined, along with estimates for the basic force constants. Concurrently, the complex morphology of natural rubber networks (the joint probability density distributions that relate the chain end-to-end distance to its contour length) has also been captured in a numerical model. When molecular chain forces are merged with the network structure in this model, it is possible to study the mechanical response to tensile and compressive strains of a representative volume element of a polymer network. As strain is imposed on a network, pathways of connected taut chains, which completely span the network along the strain axis, emerge. Although these chains represent only a few percent of the total, they account for nearly all of the elastic stress at high strain. Here we provide a brief review of previous elasticity theories and their deficiencies, and present a new paradigm with an emphasis on experimental comparisons.

  1. Physics Education Research at the Upper Division at the University of Maine

    NASA Astrophysics Data System (ADS)

    Thompson, John

    2013-04-01

    Researchers from the University of Maine Physics Education Research Laboratory are conducting several investigations of the learning and teaching of physics beyond the introductory level. Content topics include intermediate mechanics, electronics, thermodynamics and statistical mechanics. One focus of our work is the identification and addressing of specific student difficulties with topics such as damped harmonic motion, bipolar junction transistor (BJT) circuits, work, entropy, and the Boltzmann factor. Student understanding and use of the underlying mathematics has been one important emerging theme, including definite integrals, partial derivatives, and linear differential equations. Recent work in mechanics has focused on understanding the interplay of mathematical and physical reasoning when describing damped harmonic motion, including framing and representational issues. In electronics, there has been an ongoing investigation of student understanding of the behavior of basic BJT follower and amplifier circuits as well as related issues of signal and bias. In thermal physics, student understanding of state functions, heat engines and the Carnot cycle, the First and Second Laws of thermodynamics, and the macroscopic and microscopic perspectives on entropy have been investigated. The greater content sophistication in these courses has drawn attention to the specific needs, constraints, and advantages of instructional materials tailored to the upper division. Future directions include more attention to interdisciplinary topics across mathematics, physics, and engineering in particular, as well as metacognition in the laboratory.

  2. The central engine of quasars and AGNs: A relativistic proton radiative shock

    NASA Astrophysics Data System (ADS)

    Kazanas, D.; Ellison, D. C.

    1985-08-01

    Active galactic nuclei (AGNs) and quasars (QSOs) appear to emit roughly equal energy per decade from radio to gamma-ray energies (e.g. Ramaty and Lingenfelter 1982). This argues strongly for a nonthermal radiation mechanism (see Rees 1984). In addition, statistical studies have indicated that the spectra of these objects in the IR-UV and 2 to 50 keV X-ray bands can be fitted very well with power laws of specific indices. These spectral indices do not seem to depend on the luminosity or morphology of the objects (Rothschild et al. 1983; Malkan 1984), and any theory should account for them in a basic and model-independent way. If shocks accelerate relativistic protons via the first-order Fermi mechanism (e.g. Axford 1981), the radiating electrons can be produced as secondaries throughout the source by proton-proton (p-p) collisions and pion decay, thus eliminating Compton losses (Protheroe and Kazanas 1983). As shown by Kazanas (1984), if relativistic electrons are injected at high energies, e+e- pair production results in a steady-state electron distribution that is very similar to that observed in AGNs, independent of the details of injection and the dynamics of the source. The conditions required by this mechanism are met in the shock model of Eichler (1984) and Ellison and Eichler (1984), which allows the self-consistent calculation of the shock acceleration efficiency.

  3. Role of basic biological sciences in clinical orthodontics: a case series.

    PubMed

    Davidovitch, Ze'ev; Krishnan, Vinod

    2009-02-01

    Orthodontic therapy is based on interaction between mechanics and biology. Basic biologic research aims at developing a better understanding of the mechanism of transformation of mechanical energy into biologic reactions, and exposing the reasons for iatrogenic tissue damage in orthodontics. Previous research has shown that inflammation is a major part of the biologic response to orthodontic forces. In inflammation, signal molecules that originate in remote diseased organs can reach strained paradental tissues and exacerbate the inflammatory process, leading to tissue damage. Our case series includes 3 patients, each having had systemic diseases and malocclusion. One had diabetes mellitus, Hashimoto's thyroiditis, and depression. Concern about the possible effect of these conditions on the well-being of the teeth and their surrounding tissues compelled the orthodontist to choose not to treat this patient. The other 2 patients had allergies, and 1 also had bronchial asthma and bruises. Although these conditions are thought to be risk factors for root resorption, these patients received orthodontic treatment for 2 and 3.5 years, respectively. At the end of treatment, both had excessive root resorption of many teeth. In 1 patient, this damage led to the loss of most maxillary teeth. Basic research should continue to address questions related to the biologic mechanisms of tooth movement on tissue, cellular, and molecular levels. Moreover, this research should continue to identify risk factors that might jeopardize the longevity of treated teeth. Such basic research should promote the development of new tissue-friendly and patient-friendly therapeutic methods.

  4. Radiation therapy and internet - what can patients expect? homepage analysis of german radiotherapy institutions.

    PubMed

    Janssen, Stefan; Meyer, Andreas; Vordermark, Dirk; Steinmann, Diana

    2010-12-01

    The internet as a source of medical information has emerged during the last years. There is a confusing amount of medical websites with a great diversity of quality. Websites of radiotherapy institutions could offer a safe and easy-to-control way to assist patients' requests. 205 internet appearances of German radiotherapy institutions were analyzed in June 2009 (nonuniversity hospitals n = 108, medical practices n = 62, university hospitals n = 35). For the evaluation of each homepage, verifiable criteria concerning basic information, service, and medical issues were used. The quality of information published via the internet by different radiotherapy institutions showed a large variety. Basic information such as telephone numbers, operating hours, and direction guidance was provided in 96.7%, 40%, and 50.7% of cases, respectively. 85% of the websites introduced the staff, 50.2% supplied photos, and 14% further information on the attending physicians. The mean number of continuative links to other websites was 5.4; the mean number of articles supplying medical information for patients was 4.6. Medical practices and university hospitals had statistically significantly more informative articles and links to other websites than nonuniversity hospitals. No statistically significant differences could be found in most other categories, such as service issues and basic information. Internet presences of radiotherapy institutions hold the chance to supply patients with professional and individualized medical information. While some websites are already using this opportunity, others show a lack of basic information or of user-friendliness.

  5. Vertical integration of basic science in final year of medical education.

    PubMed

    Rajan, Sudha Jasmine; Jacob, Tripti Meriel; Sathyendra, Sowmya

    2016-01-01

    Development of health professionals with the ability to integrate, synthesize, and apply knowledge gained through medical college is greatly hampered by a system of delivery that is compartmentalized and piecemeal. There is a need to integrate basic sciences with clinical teaching to enable application in clinical care. To study the benefit and acceptance of vertical integration of basic science in the final year MBBS undergraduate curriculum. After Institutional Ethics Clearance, neuroanatomy refresher classes with clinical application to neurological diseases were held as part of the final year posting in two medical units. Feedback was collected. Pre- and post-tests which tested application and synthesis were conducted. Summative assessment was compared with the control group of students who had standard teaching in the other two medical units. In-depth interviews were conducted with 2 willing participants and 2 teachers who did neurology bedside teaching. The majority (>80%) found the classes useful and interesting. There was statistically significant improvement in the post-test scores. There was a statistically significant difference between the intervention and control groups' scores during summative assessment (76.2 vs. 61.8, P < 0.01). Students felt that it reinforced learning, motivated self-directed learning, enabled correlations, improved understanding, put things in perspective, gave confidence, aided application, and enabled them to follow discussions during clinical teaching. Vertical integration of basic science in the final year was beneficial and resulted in knowledge gain and improved summative scores. The classes were found to be useful and interesting, and were thought by the majority of students to help in clinical care and application.

  6. Random Fields

    NASA Astrophysics Data System (ADS)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  7. Visual saliency detection based on modeling the spatial Gaussianity

    NASA Astrophysics Data System (ADS)

    Ju, Hongbin

    2015-04-01

    In this paper, a novel salient object detection method based on modeling spatial anomalies is presented. The proposed framework is inspired by the biological mechanism by which human eyes are sensitive to unusual and anomalous objects against a complex background. It is supposed that a natural image can be seen as a combination of similar or dissimilar basic patches, and that there is a direct relationship between an image's saliency and its anomalies. Some patches share a high degree of similarity and occur in great numbers; they usually make up the background of an image. Other patches exhibit strong rarity and specificity. We name these patches "anomalies". Generally, an anomalous patch reflects an edge or a special color or texture in an image, and these patterns cannot be well "explained" by their surroundings. Human eyes show great interest in these anomalous patterns and will automatically pick out the anomalous parts of an image as the salient regions. To better evaluate the anomaly degree of the basic patches and exploit their nonlinear statistical characteristics, a multivariate Gaussian distribution saliency evaluation model is proposed. In this way, objects with anomalous patterns appear as outliers of the Gaussian distribution, and we identify these anomalous objects as salient. Experiments are conducted on the well-known MSRA saliency detection dataset. Compared with other recently developed visual saliency detection methods, our method shows significant advantages.
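
    The outlier-scoring idea can be sketched by fitting a multivariate Gaussian to patch feature vectors and using the squared Mahalanobis distance as the anomaly (saliency) score. This is a minimal illustration of the principle, not the authors' implementation:

```python
import numpy as np

def saliency_scores(patches):
    """Fit a multivariate Gaussian to patch feature vectors (one per row)
    and return each patch's squared Mahalanobis distance from the fitted
    distribution; anomalous (salient) patches score as outliers."""
    patches = np.asarray(patches, float)
    mu = patches.mean(axis=0)
    # Small ridge term keeps the covariance invertible
    cov = np.cov(patches, rowvar=False) + 1e-6 * np.eye(patches.shape[1])
    inv = np.linalg.inv(cov)
    d = patches - mu
    return np.einsum('ij,jk,ik->i', d, inv, d)
```

    A patch whose features deviate strongly from the bulk of the patches receives the largest score and would be flagged as part of a salient region.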

  8. Particles, Waves, and the Interpretation of Quantum Mechanics

    ERIC Educational Resources Information Center

    Christoudouleas, N. D.

    1975-01-01

    Presents an explanation, without mathematical equations, of the basic principles of quantum mechanics. Includes wave-particle duality, the probability character of the wavefunction, and the uncertainty relations. (MLH)

  9. Basic Mechanisms of the Epilepsies.

    ERIC Educational Resources Information Center

    Jasper, Herbert H., Ed.; And Others

    A collection of highly technical scientific articles by international basic and clinical neuroscientists constitutes a review of their knowledge of the brain and nervous system, particularly the aspects related to loss of brain function control and its explosive discharges which cause epileptic seizures. Anatomy, biophysics, biochemistry, and…

  10. Lessons for Young Scholars Seeking to Publish.

    ERIC Educational Resources Information Center

    Natriello, Gary

    1996-01-01

    The lessons offered are in three groups: basic lessons related to social behaviors; lessons related to the basic mechanics of putting together a paper to submit to a journal; and lessons regarding matching the study to the substantive and stylistic preferences of a particular journal. (SM)

  11. Influence of platelet aspect ratio on the mechanical behaviour of bio-inspired nanocomposites using molecular dynamics.

    PubMed

    Mathiazhagan, S; Anup, S

    2016-06-01

    Superior mechanical properties of biocomposites such as nacre and bone are attributed to their basic building blocks. These building blocks have nanoscale features and play a major role in achieving combined stiffening, strengthening and toughening mechanisms. Bioinspired nanocomposites based on these building blocks, with regularly and stairwise staggered arrangements of hard platelets in a soft matrix, have huge potential for developing advanced materials. Studying how far the mechanical principles of biological materials apply to engineered materials will guide the design of advanced materials. To probe the generic mechanical characteristics of these bioinspired nanocomposites, the model material concept in molecular dynamics (MD) is used. In this paper, the effect of platelet aspect ratio (AR) on the mechanical behaviour of bioinspired nanocomposites is investigated. The obtained Young's moduli of both models and the strengths of the regularly staggered models agree with the available theories; however, the strengths of the stairwise staggered models show a significant difference. For the stairwise staggered model, we demonstrate the existence of two critical ARs: a smaller critical AR above which platelet fracture occurs, and a higher critical AR above which composite strength remains constant. Our MD study also shows the existence of platelet pull-out and breakage mechanisms at lower and higher ARs, respectively. The pull-out mechanism acts as a major source of plasticity. Further, we find that the regularly staggered model can achieve an optimal combination of high Young's modulus, flow strength and toughness, while the stairwise staggered model is efficient in obtaining high Young's modulus and tensile strength. Copyright © 2015 Elsevier Ltd. All rights reserved.
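
    The dependence of stiffness on platelet aspect ratio can be illustrated with the classic tension-shear chain estimate for regularly staggered platelet composites, one of the standard analytical models the MD results are compared against. The formula and the parameter values below are illustrative and are not taken from the paper:

```python
def tsc_modulus(E_p, G_m, phi, rho):
    """Tension-shear chain estimate of the effective Young's modulus of a
    regularly staggered platelet composite: stiff platelets (modulus E_p,
    volume fraction phi) carry tension while the soft matrix (shear
    modulus G_m) transfers load in shear; rho is the platelet aspect
    ratio.  Stiffness rises with rho and saturates at phi * E_p."""
    return 1.0 / (4.0 * (1.0 - phi) / (G_m * phi**2 * rho**2)
                  + 1.0 / (phi * E_p))

# Illustrative parameters: platelets 100x stiffer than the matrix shear
# modulus, 90% platelet volume fraction, increasing aspect ratios
moduli = [tsc_modulus(100.0, 1.0, 0.9, r) for r in (5, 10, 50, 1000)]
```

    The monotone rise toward the saturation value phi * E_p is the reason a sufficiently high aspect ratio stops improving stiffness, consistent with the existence of a critical AR above which composite properties plateau.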

  12. Counselling by primary care physicians may help patients with heartburn-predominant uninvestigated dyspepsia.

    PubMed

    Paré, Pierre; Lee, Joanna; Hawes, Ian A

    2010-03-01

    To determine whether strategies to counsel and empower patients with heartburn-predominant dyspepsia could improve health-related quality of life. Using a cluster randomized, parallel group, multicentre design, nine centres were assigned to provide either basic or comprehensive counselling to patients (age range 18 to 50 years) presenting with heartburn-predominant upper gastrointestinal symptoms, who would be considered for drug therapy without further investigation. Patients were treated for four weeks with esomeprazole 40 mg once daily, followed by six months of treatment that was at the physician's discretion. The primary end point was the baseline change in Quality of Life in Reflux and Dyspepsia (QOLRAD) questionnaire score. A total of 135 patients from nine centres were included in the intention-to-treat analysis. There was a statistically significant baseline improvement in all domains of the QOLRAD questionnaire in both study arms at four and seven months (P<0.0001). After four months, the overall mean change in QOLRAD score appeared greater in the comprehensive counselling group than in the basic counselling group (1.77 versus 1.47, respectively); however, this difference was not statistically significant (P=0.07). After seven months, the overall mean baseline change in QOLRAD score between the comprehensive and basic counselling groups was not statistically significant (1.69 versus 1.56, respectively; P=0.63). A standardized, comprehensive counselling intervention showed a positive initial trend in improving quality of life in patients with heartburn-predominant uninvestigated dyspepsia. Further investigation is needed to confirm the potential benefits of providing patients with comprehensive counselling regarding disease management.
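
    The between-group comparison of mean QOLRAD changes is the kind of contrast a two-sample test addresses. A minimal sketch of Welch's t statistic follows; it is illustrative only and does not reproduce the trial's actual analysis or data:

```python
import numpy as np

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom
    (Welch-Satterthwaite) for comparing group means without assuming
    equal variances -- e.g. mean QOLRAD change per counselling arm."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    t = (a.mean() - b.mean()) / np.sqrt(va + vb)
    df = (va + vb) ** 2 / (va**2 / (a.size - 1) + vb**2 / (b.size - 1))
    return t, df
```

    The t statistic and degrees of freedom would then be referred to a t distribution to obtain a P value such as the P=0.07 reported above.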

  13. Counselling by primary care physicians may help patients with heartburn-predominant uninvestigated dyspepsia

    PubMed Central

    Paré, Pierre; Math, Joanna Lee M; Hawes, Ian A

    2010-01-01

    OBJECTIVE: To determine whether strategies to counsel and empower patients with heartburn-predominant dyspepsia could improve health-related quality of life. METHODS: Using a cluster randomized, parallel group, multicentre design, nine centres were assigned to provide either basic or comprehensive counselling to patients (age range 18 to 50 years) presenting with heartburn-predominant upper gastrointestinal symptoms, who would be considered for drug therapy without further investigation. Patients were treated for four weeks with esomeprazole 40 mg once daily, followed by six months of treatment that was at the physician’s discretion. The primary end point was the baseline change in Quality of Life in Reflux and Dyspepsia (QOLRAD) questionnaire score. RESULTS: A total of 135 patients from nine centres were included in the intention-to-treat analysis. There was a statistically significant baseline improvement in all domains of the QOLRAD questionnaire in both study arms at four and seven months (P<0.0001). After four months, the overall mean change in QOLRAD score appeared greater in the comprehensive counselling group than in the basic counselling group (1.77 versus 1.47, respectively); however, this difference was not statistically significant (P=0.07). After seven months, the overall mean baseline change in QOLRAD score between the comprehensive and basic counselling groups was not statistically significant (1.69 versus 1.56, respectively; P=0.63). CONCLUSIONS: A standardized, comprehensive counselling intervention showed a positive initial trend in improving quality of life in patients with heartburn-predominant uninvestigated dyspepsia. Further investigation is needed to confirm the potential benefits of providing patients with comprehensive counselling regarding disease management. PMID:20352148

  14. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
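
    The basic observed statistic, the proportion of replicates identified, is a binomial proportion. A minimal sketch of POI with a Wilson score confidence interval follows; the Wilson interval is a standard choice for binomial proportions, though the validation protocol's actual interval method is not specified in the abstract:

```python
from math import sqrt

def poi_wilson(k, n, z=1.96):
    """Probability of identification (POI): the proportion k/n of
    replicates identified, with a Wilson score confidence interval
    (default z = 1.96 for ~95% coverage)."""
    p = k / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, max(0.0, centre - half), min(1.0, centre + half)
```

    For example, 9 identifications in 10 replicates gives POI = 0.9 with an interval that, unlike the naive normal approximation, stays inside [0, 1] even near the boundaries.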

  15. Genetic dissection of main and epistatic effects of QTL based on augmented triple test cross design

    PubMed Central

    Zhang, Zheng; Dai, Zhijun; Chen, Yuan; Yuan, Xiong; Yuan, Zheming; Tang, Wenbang; Li, Lanzhi; Hu, Zhongli

    2017-01-01

    The use of heterosis has considerably increased the productivity of many crops; however, the biological mechanism underpinning the technique remains elusive. The North Carolina design III (NCIII) and the triple test cross (TTC) are powerful and popular genetic mating designs that can be used to decipher the genetic basis of heterosis. However, when using the NCIII design with present quantitative trait locus (QTL) mapping methods, if epistasis exists, the estimated additive or dominant effects are confounded with epistatic effects. Here, we propose a two-step approach to dissect all genetic effects of QTL and digenic interactions on a whole genome, without sacrificing statistical power, based on an augmented TTC (aTTC) design. Because the aTTC design has more transformation combinations than the NCIII and TTC designs, it greatly enriches QTL mapping for studying heterosis. When the basic population comprises recombinant inbred lines (RIL), we can use the same materials as in the NCIII design for aTTC-design QTL mapping with transformation combinations Z1, Z2, and Z4 to obtain the genetic effects of QTL and digenic interactions. Compared with the RIL-based TTC design, the RIL-based aTTC design saves the time, money, and labor of crossing the basic population with F1. Several Monte Carlo simulation studies were carried out to confirm the proposed approach; the genetic parameters could be identified with high statistical power, precision, and calculation speed, even at small sample sizes or low heritability. Additionally, two elite rice hybrid datasets covering nine agronomic traits were analyzed as real data. We dissected the genetic effects and calculated the dominance degree of each QTL and digenic interaction. The mapping results suggested that the dominance degree in Z2, which mainly characterizes heterosis, showed overdominance and dominance for QTL and digenic interactions. Dominance and overdominance were the major genetic foundations of heterosis in rice. PMID:29240818

  16. Tendon basic science: Development, repair, regeneration, and healing.

    PubMed

    Andarawis-Puri, Nelly; Flatow, Evan L; Soslowsky, Louis J

    2015-06-01

    Tendinopathy and tendon rupture are common and disabling musculoskeletal conditions. Despite the prevalence of these injuries, a limited number of investigators are conducting fundamental, basic science studies focused on understanding processes governing tendinopathies and tendon healing. Development of effective therapeutics is hindered by the lack of fundamental guiding data on the biology of tendon development, signal transduction, mechanotransduction, and basic mechanisms underlying tendon pathogenesis and healing. To propel much needed progress, the New Frontiers in Tendon Research Conference, co-sponsored by NIAMS/NIH, the Orthopaedic Research Society, and the Icahn School of Medicine at Mount Sinai, was held to promote exchange of ideas between tendon researchers and basic science experts from outside the tendon field. Discussed research areas that are underdeveloped and represent major hurdles to the progress of the field will be presented in this review. To address some of these outstanding questions, conference discussions and breakout sessions focused on six topic areas (Cell Biology and Mechanics, Functional Extracellular Matrix, Development, Mechano-biology, Scarless Healing, and Mechanisms of Injury and Repair), which are reviewed in this special issue and briefly presented in this review. Review articles in this special issue summarize the progress in the field and identify essential new research directions. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  17. Unit mechanisms of fission gas release: Current understanding and future needs

    DOE PAGES

    Tonks, Michael; Andersson, David; Devanathan, Ram; ...

    2018-03-01

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model for how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.

  18. Unit mechanisms of fission gas release: Current understanding and future needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonks, Michael; Andersson, David; Devanathan, Ram

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model for how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.

  19. Unit mechanisms of fission gas release: Current understanding and future needs

    NASA Astrophysics Data System (ADS)

    Tonks, Michael; Andersson, David; Devanathan, Ram; Dubourg, Roland; El-Azab, Anter; Freyss, Michel; Iglesias, Fernando; Kulacsy, Katalin; Pastore, Giovanni; Phillpot, Simon R.; Welland, Michael

    2018-06-01

    Gaseous fission product transport and release has a large impact on fuel performance, degrading fuel and gap properties. While gaseous fission product behavior has been investigated with bulk reactor experiments and simplified analytical models, recent improvements in experimental and modeling approaches at the atomistic and mesoscales are beginning to reveal new understanding of the unit mechanisms that define fission product behavior. Here, existing research on the basic mechanisms of fission gas release during normal reactor operation is summarized and critical areas where work is needed are identified. This basic understanding of the fission gas behavior mechanisms has the potential to revolutionize our ability to predict fission product behavior and to design fuels with improved performance. In addition, this work can serve as a model for how a coupled experimental and modeling approach can be applied to understand the unit mechanisms behind other critical behaviors in reactor materials.

  20. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    ERIC Educational Resources Information Center

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
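
    The statistical view of heating that the abstract describes can be illustrated by computing the ensemble-average energy from a Boltzmann distribution over discrete quantum levels. This is a minimal sketch in arbitrary units; the article's actual model details are not reproduced:

```python
import numpy as np

def mean_energy(levels, T, k_B=1.0):
    """Ensemble-average energy of a system with the given discrete energy
    levels at temperature T, using Boltzmann weights exp(-E / k_B T).
    Heating at constant volume shifts population to higher levels,
    raising the average energy (levels, T in arbitrary units)."""
    levels = np.asarray(levels, float)
    w = np.exp(-levels / (k_B * T))          # Boltzmann weights
    return float((levels * w).sum() / w.sum())
```

    For a two-level system with energies 0 and 1, the average energy climbs from 0 at low temperature toward the equal-population limit of 0.5 at high temperature, mirroring the macroscopic statement that heating at constant volume raises internal energy.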
