Sample records for nonextensive statistical effects

  1. Nonextensive statistical mechanics approach to electron trapping in degenerate plasmas

    NASA Astrophysics Data System (ADS)

    Mebrouk, Khireddine; Gougam, Leila Ait; Tribeche, Mouloud

    2016-06-01

    The electron trapping in a weakly nondegenerate plasma is reformulated and re-examined by incorporating the nonextensive entropy prescription. Using the q-deformed Fermi-Dirac distribution function including the quantum as well as the nonextensive statistical effects, we derive a new generalized electron density with a new contribution proportional to the electron temperature T, which may dominate the usual thermal correction (∼T²) at very low temperatures. To make the physics behind the effect of this new contribution more transparent, we analyze the modifications arising in the propagation of ion-acoustic solitary waves. Interestingly, we find that due to the nonextensive correction, our plasma model allows the possibility of existence of quantum ion-acoustic solitons with velocity higher than the Fermi ion-sound velocity. Moreover, as the nonextensive parameter q increases, the critical temperature Tc beyond which coexistence of compressive and rarefactive solitons sets in, is shifted towards higher values.

  2. Perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases.

    PubMed

    Mohammadzadeh, Hosein; Adli, Fereshteh; Nouri, Sahereh

    2016-12-01

    We investigate the perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases. We show that the intrinsic statistical interaction of a nonextensive Bose (Fermi) gas is attractive (repulsive), similar to the extensive case, but the value of the thermodynamic curvature is changed by the nonextensive parameter. In contrast to the extensive ideal classical gas, the nonextensive one may be divided into two different regimes. Depending on the deviation of the system from the extensive case, one can find a special value of the fugacity, z^{*}, where the sign of the thermodynamic curvature changes. Therefore, we argue that the nonextensive parameter induces an attractive (repulsive) statistical interaction for z < z^{*} (z > z^{*}) for an ideal classical gas. Also, from the singular point of the thermodynamic curvature, we consider the condensation of the nonextensive Bose gas.

  3. Examining nonextensive statistics in relativistic heavy-ion collisions

    NASA Astrophysics Data System (ADS)

    Simon, A.; Wolschin, G.

    2018-04-01

    We show in detailed numerical solutions of the nonlinear Fokker-Planck equation (FPE), which has been associated with nonextensive q statistics, that the available data on rapidity distributions for stopping in relativistic heavy-ion collisions cannot be reproduced with any permitted value of the nonextensivity parameter (1
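For orientation, the nonlinear Fokker-Planck equation usually associated with Tsallis q-statistics is of the Plastino-Plastino type; the form below is a hedged reminder of that standard equation, not one transcribed from the paper.

```latex
% Plastino-Plastino nonlinear FPE; its solutions are q-Gaussians, and the ordinary
% linear Fokker-Planck equation is recovered in the limit q -> 1.
\[
  \frac{\partial P(y,t)}{\partial t}
    = -\frac{\partial}{\partial y}\bigl[F(y)\,P(y,t)\bigr]
      + D\,\frac{\partial^{2}}{\partial y^{2}}\bigl[P(y,t)\bigr]^{2-q}
\]
```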

  4. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2015-04-01

    Despite the extreme complexity that characterizes earthquake generation process, simple phenomenology seems to apply in the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder, how can the collective properties of a set formed by all earthquakes in a given region, be derived and how does the structure of seismicity depend on its elementary constituents - the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake making the use of statistical physics necessary to understand the collective properties of earthquakes. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the microscale and crack opening level to the level of large earthquakes? An answer to the previous question could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi) fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon introducing the new topic of non-extensive statistical seismology. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project. References F. Vallianatos, "A non-extensive approach to risk assessment", Nat. Hazards Earth Syst. Sci., 9, 211-216, 2009 F. Vallianatos and P. Sammonds "Is plate tectonics a case of non-extensive thermodynamics?" Physica A: Statistical Mechanics and its Applications, 389 (21), 4989-4993, 2010, F. Vallianatos, G. Michas, G. Papadakis and P. Sammonds " A non extensive statistical physics view to the spatiotemporal properties of the June 1995, Aigion earthquake (M6.2) aftershock sequence (West Corinth rift, Greece)", Acta Geophysica, 60(3), 758-768, 2012 F. Vallianatos and L. Telesca, Statistical mechanics in earth physics and natural hazards (editorial), Acta Geophysica, 60, 3, 499-501, 2012 F. Vallianatos, G. Michas, G. Papadakis and A. Tzanis "Evidence of non-extensivity in the seismicity observed during the 2011-2012 unrest at the Santorini volcanic complex, Greece" Nat. Hazards Earth Syst. Sci.,13,177-185, 2013 F. Vallianatos and P. Sammonds, "Evidence of non-extensive statistical physics of the lithospheric instability approaching the 2004 Sumatran-Andaman and 2011 Honshu mega-earthquakes" Tectonophysics, 590 , 52-58, 2013 G. Papadakis, F. Vallianatos, P. Sammonds, " Evidence of Nonextensive Statistical Physics behavior of the Hellenic Subduction Zone seismicity" Tectonophysics, 608, 1037 -1048, 2013 G. Michas, F. Vallianatos, and P. Sammonds, Non-extensivity and long-range correlations in the earthquake activity at the West Corinth rift (Greece) Nonlin. Processes Geophys., 20, 713-724, 2013

  5. ΛCDM model with dissipative nonextensive viscous dark matter

    NASA Astrophysics Data System (ADS)

    Gimenes, H. S.; Viswanathan, G. M.; Silva, R.

    2018-03-01

    Many models in cosmology typically assume the standard bulk viscosity. We study an alternative interpretation for the origin of the bulk viscosity. Using nonadditive statistics proposed by Tsallis, we propose a bulk viscosity component that can only exist by a nonextensive effect through the nonextensive/dissipative correspondence (NexDC). In this paper, we consider a ΛCDM model for a flat universe with a dissipative nonextensive viscous dark matter component, following the Eckart theory of bulk viscosity, without any perturbative approach. In order to analyze cosmological constraints, we use one of the most recent observations of Type Ia Supernova, baryon acoustic oscillations and cosmic microwave background data.

  6. Chemical freezeout parameters within generic nonextensive statistics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel; Yassin, Hayam; Abo Elyazeed, Eman R.

    2018-06-01

    Particle production in relativistic heavy-ion collisions seems to take place in a dynamically disordered system which can be best described by an extended exponential entropy. To distinguish between the applicability of this entropy and of Boltzmann-Gibbs (BG) statistics in generating various particle ratios, generic (non)extensive statistics is introduced into the hadron resonance gas model. Accordingly, the degree of (non)extensivity is determined by the possible modifications in the phase space. Both BG extensivity and Tsallis nonextensivity are included as very special cases defined by specific values of the equivalence classes (c, d). We found that the particle ratios at energies ranging between 3.8 and 2760 GeV are best reproduced by nonextensive statistics, where c and d range between ~0.9 and ~1. The present work aims at illustrating that the proposed approach is well capable of manifesting the statistical nature of the system of interest. We don't aim at highlighting deeper physical insights. In other words, while the resulting nonextensivity is neither BG nor Tsallis, the freezeout parameters are found to be very compatible with BG and accordingly with the well-known freezeout phase-diagram, which is in excellent agreement with recent lattice calculations. We conclude that the particle production is nonextensive but should not necessarily be accompanied by a radical change in the intensive or extensive thermodynamic quantities, such as internal energy and temperature. Only the two critical exponents defining the equivalence classes (c, d) are the physical parameters characterizing the (non)extensivity.

  7. BIG BANG NUCLEOSYNTHESIS WITH A NON-MAXWELLIAN DISTRIBUTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertulani, C. A.; Fuqua, J.; Hussein, M. S.

    The abundances of light elements based on the big bang nucleosynthesis model are calculated using the Tsallis non-extensive statistics. The impact of the variation of the non-extensive parameter q from unity is compared to observations and to the abundance yields from the standard big bang model. We find large differences between the reaction rates and the abundance of light elements calculated with the extensive and the non-extensive statistics. We found that the observations are consistent with a non-extensive parameter q = 1^{+0.05}_{-0.12}, indicating that a large deviation from the Boltzmann-Gibbs statistics (q = 1) is highly unlikely.
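As a hedged numerical sketch of how a non-extensive q deforms the Maxwell-Boltzmann factor entering such reaction-rate calculations (not the authors' code; the exponent convention of the q-Maxwellian is an assumption, since conventions differ in the literature):

```python
import numpy as np

def q_exp(x, q):
    """q-exponential: exp_q(x) = [1 + (1-q) x]_+^(1/(1-q)); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

def q_maxwellian(E, kT, q):
    """Unnormalized q-deformed Maxwell-Boltzmann energy distribution,
    f_q(E) ~ sqrt(E) * exp_q(-E/kT)  (assumed convention, varies by author)."""
    return np.sqrt(E) * q_exp(-E / kT, q)

E = np.linspace(0.0, 2.0, 2001)   # illustrative energy grid (MeV)
kT = 0.1                          # illustrative temperature scale (MeV)
for q in (0.95, 1.0, 1.05):
    f = q_maxwellian(E, kT, q)
    f /= np.trapz(f, E)           # normalize numerically
    tail = np.trapz(f[E > 0.5], E[E > 0.5])
    print(f"q = {q:4.2f}: relative weight of the E > 0.5 MeV tail = {tail:.2e}")
```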

  8. Phase transition for the system of finite volume in the ϕ4 theory in the Tsallis nonextensive statistics

    NASA Astrophysics Data System (ADS)

    Ishihara, Masamichi

    2018-04-01

    We studied the effects of nonextensivity on the phase transition for the system of finite volume V in the ϕ4 theory in the Tsallis nonextensive statistics of entropic parameter q and temperature T, when the deviation from the Boltzmann-Gibbs (BG) statistics, |q - 1|, is small. We calculated the condensate and the effective mass to the order q - 1 with the normalized q-expectation value under the free particle approximation with zero bare mass. The following facts were found. The condensate Φ divided by v, Φ/v, at q (v is the value of the condensate at T = 0) is smaller than that at q′ for q > q′ as a function of Tph/v, which is the physical temperature Tph divided by v. The physical temperature Tph is related to the variation of the Tsallis entropy and the variation of the internal energies, and Tph at q = 1 coincides with T. The effective mass decreases, reaches a minimum, and increases after that, as Tph increases. The effective mass at q > 1 is lighter than the effective mass at q = 1 at low physical temperature and heavier than the effective mass at q = 1 at high physical temperature. The effects of the nonextensivity on the physical quantity as a function of Tph become strong as |q - 1| increases. The results indicate the significance of the definition of the expectation value, the definition of the physical temperature, and the constraints for the density operator, when the terms including the volume of the system are not negligible.

  9. Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Sugiyama, Masaru

    Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated and accordingly the additivity postulate for the thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state. The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader of one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one by Tsallis and Brigatti presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might be completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can uniquely be characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness for generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems.
Finally, Beck presents a novel idea of the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed. Nonextensive statistical mechanics is already a well-studied field, and a number of works are available in the literature. It is recommended that the interested reader visit the URL http://tsallis.cat.cbpf.br/TEMUCO.pdf. There, one can find a comprehensive list of references to more than one thousand papers including important results that, due to lack of space, have not been mentioned in the present issue. Though there are so many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities not only in the very field of nonextensive statistical mechanics but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who joined up to make this issue. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.

  10. Dark energy models through nonextensive Tsallis' statistics

    NASA Astrophysics Data System (ADS)

    Barboza, Edésio M.; Nunes, Rafael da C.; Abreu, Everton M. C.; Ananias Neto, Jorge

    2015-10-01

    The accelerated expansion of the Universe is one of the greatest challenges of modern physics. One candidate to explain this phenomenon is a new field called dark energy. In this work we have used the Tsallis nonextensive statistical formulation of the Friedmann equation to explore the Barboza-Alcaniz and Chevallier-Polarski-Linder parametric dark energy models and the Wang-Meng and Dalal vacuum decay models. After that, we have discussed the observational tests and the constraints concerning the Tsallis nonextensive parameter. Finally, we have described the dark energy physics through the role of the q-parameter.

  11. Generalized ensemble theory with non-extensive statistics

    NASA Astrophysics Data System (ADS)

    Shen, Ke-Ming; Zhang, Ben-Wei; Wang, En-Ke

    2017-12-01

    The non-extensive canonical ensemble theory is reconsidered with the method of Lagrange multipliers by maximizing Tsallis entropy, with the constraint that the normalized term of Tsallis' q-average of physical quantities, the sum ∑ p_j^q, is independent of the probability p_i for Tsallis parameter q. The self-referential problem in the deduced probability and thermal quantities in non-extensive statistics is thus avoided, and thermodynamical relationships are obtained in a consistent and natural way. We also extend the study to the non-extensive grand canonical ensemble theory and obtain the q-deformed Bose-Einstein distribution as well as the q-deformed Fermi-Dirac distribution. The theory is further applied to the generalized Planck law to demonstrate the distinct behaviors of the various generalized q-distribution functions discussed in literature.
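For reference, the q-deformed quantum distributions obtained in such derivations are usually quoted in a form like the following (a hedged reminder of the commonly used expressions, not a transcription from the paper):

```latex
% q-deformed Bose-Einstein (upper sign, -) and Fermi-Dirac (lower sign, +) occupation numbers;
% the ordinary quantum distributions are recovered in the limit q -> 1.
\[
  \bar{n}_q(\varepsilon) =
    \frac{1}{\bigl[1 + (q-1)\,\beta(\varepsilon-\mu)\bigr]^{\frac{1}{q-1}} \mp 1},
  \qquad \beta = \frac{1}{k_B T}
\]
```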

  12. Rogue Waves in Multi-Ion Cometary Plasmas

    NASA Astrophysics Data System (ADS)

    Sreekala, G.; Manesh, M.; Neethu, T. W.; Anu, V.; Sijo, S.; Venugopal, C.

    2018-01-01

    The effect of pair ions on the formation of rogue waves in a six-component plasma composed of two hot and one colder electron component, hot ions, and pair ions is studied. The kappa distribution, which provides an unambiguous replacement for a Maxwellian distribution in space plasmas, is connected with nonextensive statistical mechanics and provides a continuous energy spectrum. Hence, the colder and one component of the hotter electrons are modeled by kappa distributions, and the other hot electron component by a q-nonextensive distribution. It is found that the rogue wave amplitude is different for various pair-ion components. The magnitude, however, increases with increasing spectral index and nonextensive parameter q. These results may be useful in understanding the basic characteristics of rogue waves in cometary plasmas.
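The link between the kappa distribution and the nonextensive parameter q invoked here is usually expressed through the identification below (hedged; offset conventions for kappa vary between authors):

```latex
% Isotropic kappa distribution and its commonly used mapping to the Tsallis index q
% (theta is a thermal speed).
\[
  f_\kappa(v) \propto \left[1 + \frac{v^{2}}{\kappa\,\theta^{2}}\right]^{-(\kappa+1)},
  \qquad \kappa = \frac{1}{q-1} \;\Longleftrightarrow\; q = 1 + \frac{1}{\kappa}
\]
```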

  13. Study of pre-seismic kHz EM emissions by means of complex systems

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Papadimitriou, Constantinos; Eftaxias, Konstantinos

    2010-05-01

    The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to economies of societies. A corollary is that transferring ideas and results from investigators in hitherto disparate areas will cross-fertilize and lead to important new results. It is well-known that Boltzmann-Gibbs statistical mechanics works best in dealing with systems composed of subsystems that are either independent or interact via short-range forces, and whose subsystems can access all the available phase space. For systems exhibiting long-range correlations, memory, or fractal properties, non-extensive Tsallis statistical mechanics becomes the most appropriate mathematical framework. As has been mentioned, a central property of the magnetic storm, solar flare, and earthquake preparation process is the possible occurrence of coherent large-scale collective behavior with a very rich structure, resulting from the repeated nonlinear interactions among its constituents. Consequently, non-extensive statistical mechanics is an appropriate framework to investigate universality, if any, in magnetic storm, solar flare, earthquake and pre-failure EM emission occurrence. A model for earthquake dynamics coming from a non-extensive Tsallis formulation, starting from first principles, has been recently introduced. This approach leads to a Gutenberg-Richter type law for the magnitude distribution of earthquakes which provides an excellent fit to seismicities generated in various large geographic areas usually identified as "seismic regions". We examine whether the Gutenberg-Richter law corresponding to a non-extensive Tsallis statistics is able to describe the distribution of amplitude of earthquakes, pre-seismic kHz EM emissions (electromagnetic earthquakes), solar flares, and magnetic storms. The analysis shows that the introduced non-extensive model provides an excellent fit to the experimental data, incorporating the characteristics of universality by means of non-extensive statistics into the extreme events under study.

  14. Nonextensivity in a Dark Maximum Entropy Landscape

    NASA Astrophysics Data System (ADS)

    Leubner, M. P.

    2011-03-01

    Nonextensive statistics along with network science, an emerging branch of graph theory, are increasingly recognized as potential interdisciplinary frameworks whenever systems are subject to long-range interactions and memory. Such settings are characterized by non-local interactions evolving in a non-Euclidean fractal/multi-fractal space-time making their behavior nonextensive. After summarizing the theoretical foundations from first principles, along with a discussion of entropy bifurcation and duality in nonextensive systems, we focus on selected significant astrophysical consequences. Those include the gravitational equilibria of dark matter (DM) and hot gas in clustered structures, the dark energy (DE) negative pressure landscape governed by the highest degree of mutual correlations, and the hierarchy of discrete cosmic structure scales, available upon extremizing the generalized nonextensive link entropy in a homogeneous growing network.

  15. Tsallis non-extensive statistics and solar wind plasma complexity

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.

    2015-03-01

    This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26th September 2011. Solar wind plasma is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields (B → , E →) and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy tails probability distribution functions, which are related to the q-extension of CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).

  16. Evidence of the non-extensive character of Earth's ambient noise.

    NASA Astrophysics Data System (ADS)

    Koutalonis, Ioannis; Vallianatos, Filippos

    2017-04-01

    Investigation of the dynamical features of ambient seismic noise is one of the important scientific and practical research challenges. At the same time there is growing interest in an approach to studying Earth Physics based on the science of complex systems and non-extensive statistical mechanics, which is a generalization of Boltzmann-Gibbs statistical physics (Vallianatos et al., 2016). This seems to be a promising framework for studying complex systems exhibiting phenomena such as long-range interactions and memory effects. In this work we use non-extensive statistical mechanics and signal analysis methods to explore the nature of ambient noise as measured in the stations of the HSNC in the South Aegean (Chatzopoulos et al., 2016). In the present work we analyzed the de-trended increments time series of ambient seismic noise X(t), in time windows of 20 minutes to 10 seconds within "calm time zones" where the human-induced noise presents a minimum. Following the non-extensive statistical physics approach, the probability distribution function of the increments of ambient noise is investigated. Analysis of the probability density function (PDF) p(X), normalized to zero mean and unit variance, shows that the fluctuations of Earth's ambient noise follow a q-Gaussian distribution as defined in the frame of non-extensive statistical mechanics, indicating the possible existence of memory effects in Earth's ambient noise. References: F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016. G. Chatzopoulos, I. Papadopoulos, F. Vallianatos, The Hellenic Seismological Network of Crete (HSNC): Validation and results of the 2013 aftershock, Advances in Geosciences, 41, 65-72, 2016.
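A minimal sketch of the kind of analysis described above, i.e. normalizing the detrended increments to zero mean and unit variance and fitting a q-Gaussian to their PDF; the function names, binning and toy data are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, q, beta, A):
    """Unnormalized q-Gaussian A*[1 - (1-q) beta x^2]_+^(1/(1-q)); ordinary Gaussian as q -> 1."""
    base = 1.0 - (1.0 - q) * beta * x**2
    return A * np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

def fit_q_stat(signal, bins=81):
    """Fit q to the PDF of the normalized increments of an (already detrended) signal."""
    dx = np.diff(signal)
    dx = (dx - dx.mean()) / dx.std()          # zero mean, unit variance
    counts, edges = np.histogram(dx, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    (q, beta, A), _ = curve_fit(q_gaussian, centers, counts, p0=(1.2, 1.0, 0.4), maxfev=20000)
    return q

# toy "noise": increments drawn from a Student-t with 5 dof, which is itself a
# q-Gaussian with q = (3 + 5)/(1 + 5) ~ 1.33
rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_t(df=5, size=200000))
print("fitted q_stat ~", round(fit_q_stat(series), 2))
```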

  17. Generalising the logistic map through the q-product

    NASA Astrophysics Data System (ADS)

    Pessoa, R. W. S.; Borges, E. P.

    2011-03-01

    We investigate a generalisation of the logistic map as x_{n+1} = 1 - a x_n ⊗_{q_map} x_n (-1 ≤ x_n ≤ 1, 0 < a ≤ 2), where ⊗_q stands for a generalisation of the ordinary product, known as the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q → 1. The tent map is also a particular case, for q_map → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.
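A short sketch of the q-product and of iterating the generalized map; how negative x_n is handled in the q-square is not specified here, so applying it to |x_n| is an assumption of this illustration:

```python
def q_product(x, y, q):
    """Borges q-product: [x^(1-q) + y^(1-q) - 1]_+^(1/(1-q)); reduces to x*y as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return x * y
    if x == 0.0 or y == 0.0:
        return 0.0                      # limiting value for q > 1
    base = x ** (1.0 - q) + y ** (1.0 - q) - 1.0
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def generalized_logistic(a, q_map, x0=0.1, n_iter=12):
    """x_{n+1} = 1 - a * (x_n q-squared); applying the q-square to |x_n| is an
    assumption of this sketch, chosen so the map stays real for negative x_n."""
    x, traj = x0, [x0]
    for _ in range(n_iter):
        x = 1.0 - a * q_product(abs(x), abs(x), q_map)
        traj.append(x)
    return traj

print(generalized_logistic(a=1.4, q_map=1.5))   # q_map > 1, the regime analysed in the paper
print(generalized_logistic(a=1.4, q_map=1.0))   # ordinary logistic map x_{n+1} = 1 - a x_n^2
```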

  18. Keyword extraction by nonextensivity measure.

    PubMed

    Mehri, Ali; Darooneh, Amir H

    2011-05-01

    The presence of a long-range correlation in the spatial distribution of a relevant word type, in spite of random occurrences of an irrelevant word type, is an important feature of human-written texts. We classify the correlation between the occurrences of words by nonextensive statistical mechanics for the word-ranking process. In particular, we look at the nonextensivity parameter as an alternative metric to measure the spatial correlation in the text, from which the words may be ranked in terms of this measure. Finally, we compare different methods for keyword extraction. © 2011 American Physical Society
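One plausible, purely illustrative realization of ranking words by a nonextensivity measure is to fit a q-exponential to each word's inter-occurrence spacing distribution and rank by the fitted q; this sketches the general idea only and is not the specific estimator used in the paper:

```python
import re
import numpy as np
from scipy.optimize import curve_fit

def q_exponential(x, q, lam, A):
    """A * exp_q(-lam x) = A * [1 - (1-q) lam x]_+^(1/(1-q)); plain exponential as q -> 1."""
    base = 1.0 - (1.0 - q) * lam * x
    return A * np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

def word_nonextensivity(text, word, bins=30):
    """Fit q to the distribution of normalized gaps between occurrences of `word`;
    assumes the word occurs many times in `text`."""
    tokens = re.findall(r"[a-z]+", text.lower())
    positions = np.array([i for i, t in enumerate(tokens) if t == word])
    gaps = np.diff(positions)
    gaps = gaps / gaps.mean()                     # normalized spacings
    counts, edges = np.histogram(gaps, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    (q, lam, A), _ = curve_fit(q_exponential, centers, counts,
                               p0=(1.1, 1.0, 1.0), maxfev=20000)
    return q

# ranking sketch: words whose fitted q deviates most from 1 cluster the most,
# which is the hallmark of "relevant" (keyword-like) terms described above
# ranking = sorted(candidate_words, key=lambda w: abs(word_nonextensivity(document, w) - 1), reverse=True)
```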

  19. Lattice QCD Thermodynamics and RHIC-BES Particle Production within Generic Nonextensive Statistics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser

    2018-05-01

    The current status of implementing Tsallis (nonextensive) statistics in high-energy physics is briefly reviewed. The remarkably low freezeout temperature, which apparently fails to reproduce the first-principle lattice QCD thermodynamics and the measured particle ratios, etc., is discussed. The present work suggests a novel interpretation for the so-called "Tsallis temperature". It is proposed that the low Tsallis temperature is due to an incomplete implementation of Tsallis algebra, through exponential and logarithmic functions, in high-energy particle production. Substituting Tsallis algebra into the grand-canonical partition function of the hadron resonance gas model does not seem to assure full incorporation of nonextensivity or correlations in that model. The statistics describing the phase-space volume, the number of states and the possible changes in the elementary cells should rather be modified due to the interacting correlated subsystems of which the phase space consists. Alternatively, two asymptotic properties, each associated with a scaling function, are utilized to classify a generalized entropy for such a system with a large ensemble (produced particles) and strong correlations. Both scaling exponents define equivalence classes for all interacting and noninteracting systems and unambiguously characterize any statistical system in its thermodynamic limit. We conclude that the nature of lattice QCD simulations is apparently extensive and accordingly the Boltzmann-Gibbs statistics is fully fulfilled. Furthermore, we found that the ratios of various particle yields at the extremely high and extremely low energies of RHIC-BES are likely nonextensive but not necessarily of the Tsallis type.

  20. Toward a Parastatistics in Quantum Nonextensive Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Zaripov, R. G.

    2018-05-01

    On the basis of Bose quantum states in parastatistics the equations for the equilibrium distribution of quantum additive and nonextensive systems are determined. The fluctuations and variances of physical quantities for the equilibrium system are found. The Abelian group of microscopic entropies is determined for the composition law with a quadratic nonlinearity.

  1. The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics

    NASA Astrophysics Data System (ADS)

    Pavlos, George

    2015-04-01

    As the solar plasma lives far from equilibrium it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non extensive statistical mechanics as concerns their applications at solar plasma dynamics, especially at sunspot, solar flare and solar wind phenomena. Generally, when a physical system is driven far from equilibrium states some novel characteristics can be observed related to the nonlinear character of dynamics. Generally, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004) the complex character of the space plasma system includes the existence of non-equilibrium (quasi)-stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann - Gibbs (BG) entropy, to describe far from equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general, was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015). Our study includes the analysis of solar plasma time series at three cases: sunspot index, solar flare and solar wind data. The non-linear analysis of the sunspot index is embedded in the non-extensive statistical theory of Tsallis (1988; 2004; 2009). The q-triplet of Tsallis, as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the SVD components of the sunspot index timeseries. Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000, 2001). 
Our analysis showed clearly the following: (a) a phase transition process in the solar dynamics from high dimensional non-Gaussian SOC state to a low dimensional non-Gaussian chaotic state, (b) strong intermittent solar turbulence and anomalous (multifractal) diffusion solar process, which is strengthened as the solar dynamics makes a phase transition to low dimensional chaos in accordance to Ruzmaikin, Zelenyi and Milovanov's studies (Zelenyi and Milovanov, 1991; Milovanov and Zelenyi, 1993; Ruzmakin et al., 1996), (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of: (i) non-Gaussian probability distribution function P(x), (ii) multifractal scaling exponent spectrum f(a) and generalized Renyi dimension spectrum Dq, (iii) exponent spectrum J(p) of the structure functions estimated for the sunspot index and its underlying non equilibrium solar dynamics. Also, the q-triplet of Tsallis as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the singular value decomposition (SVD) components of the solar flares timeseries. Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000). Our analysis showed clearly the following: (a) a phase transition process in the solar flare dynamics from a high dimensional non-Gaussian self-organized critical (SOC) state to a low dimensional also non-Gaussian chaotic state, (b) strong intermittent solar corona turbulence and an anomalous (multifractal) diffusion solar corona process, which is strengthened as the solar corona dynamics makes a phase transition to low dimensional chaos, (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of the functions: (i) non-Gaussian probability distribution function P(x), (ii) f(a) and D(q), and (iii) J(p) for the solar flares timeseries and its underlying non-equilibrium solar dynamics, and (d) the solar flare dynamical profile is revealed similar to the dynamical profile of the solar corona zone as far as the phase transition process from self-organized criticality (SOC) to chaos state. However the solar low corona (solar flare) dynamical characteristics can be clearly discriminated from the dynamical characteristics of the solar convection zone. At last we present novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which can take place in Solar wind plasma system. The solar wind plasma as well as the entire solar plasma system is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields ( ) and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. 
It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy tails probability distribution functions, which are related to the q-extension of CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992). References 1. T. Arimitsu, N. Arimitsu, Tsallis statistics and fully developed turbulence, J. Phys. A: Math. Gen. 33 (2000) L235. 2. T. Arimitsu, N. Arimitsu, Analysis of turbulence by statistics based on generalized entropies, Physica A 295 (2001) 177-194. 3. T. Chang, Low-dimensional behavior and symmetry braking of stochastic systems near criticality can these effects be observed in space and in the laboratory, IEEE 20 (6) (1992) 691-694. 4. U. Frisch, Turbulence, Cambridge University Press, Cambridge, UK, 1996, p. 310. 5. L.P. Karakatsanis, G.P. Pavlos, M.N. Xenakis, Tsallis non-extensive statistics, intermittent turbulence, SOC and chaos in the solar plasma. Part two: Solar flares dynamics, Physica A 392 (2013) 3920-3944. 6. A.V. Milovanov, Topological proof for the Alexander-Orbach conjecture, Phys. Rev. E 56 (3) (1997) 2437-2446. 7. A.V. Milovanov, L.M. Zelenyi, Fracton excitations as a driving mechanism for the self-organized dynamical structuring in the solar wind, Astrophys. Space Sci. 264 (1-4) (1999) 317-345. 8. A.V. Milovanov, Stochastic dynamics from the fractional Fokker-Planck-Kolmogorov equation: large-scale behavior of the turbulent transport coefficient, Phys. Rev. E 63 (2001) 047301. 9. G.P. Pavlos, et al., Universality of non-extensive Tsallis statistics and time series analysis: Theory and applications, Physica A 395 (2014) 58-95. 10. G.P. Pavlos, et al., Tsallis non-extensive statistics and solar wind plasma complexity, Physica A 422 (2015) 113-135. 11. A.A. Ruzmaikin, et al., Spectral properties of solar convection and diffusion, ApJ 471 (1996) 1022. 12. V.E. Tarasov, Review of some promising fractional physical models, Internat. J. Modern Phys. B 27 (9) (2013) 1330005. 13. C. Tsallis, Possible generalization of BG statistics, J. Stat. Phys. J 52 (1-2) (1988) 479-487. 14. C. Tsallis, Nonextensive statistical mechanics: construction and physical interpretation, in: G.M. Murray, C. Tsallis (Eds.), Nonextensive Entropy-Interdisciplinary Applications, Oxford Univ. Press, 2004, pp. 1-53. 15. C. Tsallis, Introduction to Non-Extensive Statistical Mechanics, Springer, 2009. 16. G.M. Zaslavsky, Chaos, fractional kinetics, and anomalous transport, Physics Reports 371 (2002) 461-580. 17. L.M. Zelenyi, A.V. Milovanov, Fractal properties of sunspots, Sov. Astron. Lett. 17 (6) (1991) 425. 18. L.M. Zelenyi, A.V. 
Milovanov, Fractal topology and strange kinetics: from percolation theory to problems in cosmic electrodynamics, Phys.-Usp. 47 (8), (2004) 749-788.
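For readers unfamiliar with the Tsallis q-triplet estimated throughout this record, the three indices are commonly extracted along the following lines (a hedged summary of the standard prescriptions, not of the authors' specific algorithms):

```latex
% q_stat: from a q-Gaussian fit to the stationary PDF of the normalized signal
\[
  P(x) \propto \bigl[1 - (1 - q_{\mathrm{stat}})\,\beta x^{2}\bigr]^{\frac{1}{1-q_{\mathrm{stat}}}}
\]
% q_sen: from the end points of the multifractal spectrum f(alpha)
\[
  \frac{1}{1-q_{\mathrm{sen}}} = \frac{1}{\alpha_{\min}} - \frac{1}{\alpha_{\max}}
\]
% q_rel: from a q-exponential decay of a relaxation (e.g. autocorrelation) function
\[
  C(\tau) \propto \bigl[1 + (q_{\mathrm{rel}}-1)\,\tau/\tau_{0}\bigr]^{\frac{1}{1-q_{\mathrm{rel}}}}
\]
```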

  2. Implication of Tsallis entropy in the Thomas–Fermi model for self-gravitating fermions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ourabah, Kamel; Tribeche, Mouloud, E-mail: mouloudtribeche@yahoo.fr

    The Thomas–Fermi approach for self-gravitating fermions is revisited within the theoretical framework of the q-statistics. Starting from the q-deformation of the Fermi–Dirac distribution function, a generalized Thomas–Fermi equation is derived. It is shown that the Tsallis entropy preserves a scaling property of this equation. The q-statistical approach to Jeans’ instability in a system of self-gravitating fermions is also addressed. The dependence of the Jeans’ wavenumber (or the Jeans length) on the parameter q is traced. It is found that the q-statistics makes the Fermionic system unstable at scales shorter than the standard Jeans length. -- Highlights: •Thomas–Fermi approach for self-gravitating fermions. •A generalized Thomas–Fermi equation is derived. •Nonextensivity preserves a scaling property of this equation. •Nonextensive approach to Jeans’ instability of self-gravitating fermions. •It is found that nonextensivity makes the Fermionic system unstable at shorter scales.

  3. Non-Extensive Statistical Analysis of Magnetic Field and SEPs during the March 2012 ICME event, using a multi-spacecraft approach

    NASA Astrophysics Data System (ADS)

    Pavlos, George; Malandraki, Olga; Pavlos, Evgenios; Iliopoulos, Aggelos; Karakatsanis, Leonidas

    2017-04-01

    As the solar plasma lives far from equilibrium, it is an excellent laboratory for testing non-equilibrium statistical mechanics. In this study, we present the highlights of Tsallis non-extensive statistical mechanics as concerns their applications to solar plasma dynamics, especially solar wind phenomena and the magnetosphere. In this study we present some new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as its effect in Earth's magnetosphere. The results are referred to Tsallis non-extensive statistics and in particular to the estimation of Tsallis q-triplet, (qstat, qsen, qrel) of SEPs time series observed at the interplanetary space and magnetic field time series of the ICME observed at the Earth resulting from the solar eruptive activity on March 7, 2012 at the Sun. For the magnetic field, we used a multi-spacecraft approach based on data experiments from ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis different time periods were considered, sorted as "quiet", "shock" and "aftershock", while different space domains such as the Interplanetary space (near Earth at L1 and upstream of the Earth's bowshock), the Earth's magnetosheath and magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the SEPs profile in time, and magnetic field dynamics both in time and space domains during the shock event, in terms of rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. So far, Tsallis non-extensive statistical theory and Tsallis extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal strong universality character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014, 2015, 2016; Karakatsanis et al. 2013). Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long spatio-temporal correlations, fractional acceleration and Non-Equilibrium Stationary States (NESS) or non-equilibrium self-organization process and non-equilibrium phase transition and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results reveal clearly strong self-organization and development of macroscopic ordering of the plasma system related to the strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasmas region during the CME event. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.

  4. Evidence of Non-extensivity in Earth's Ambient Noise

    NASA Astrophysics Data System (ADS)

    Koutalonis, Ioannis; Vallianatos, Filippos

    2017-12-01

    The study of ambient seismic noise is one of the important scientific and practical research challenges, due to its use in a number of geophysical applications. In this work, we describe Earth's ambient noise fluctuations in terms of non-extensive statistical physics. We found that Earth's ambient noise increments follow the q-Gaussian distribution. This indicates that the fluctuations of Earth's ambient noise are not random and present long-term memory effects that could be described in terms of Tsallis entropy. Our results suggest that q values depend on the time length used and that the non-extensive parameter, q, converges to q → 1 for short-time windows and a saturation value of q ≈ 1.33 for longer ones. The results are discussed from the point of view of superstatistics introduced by Beck [Contin Mech Thermodyn 16(3):293-304, 2004], which connects the q values with the system's degrees of freedom. Our work indicates that the converged (maximum) value is q = 1.33 and is related to 5 degrees of freedom.
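The quoted link between q and the degrees of freedom follows the standard chi-squared superstatistics relation, recalled here as a hedged reminder of Beck's result:

```latex
% chi^2 (Type-A) superstatistics with n degrees of freedom
\[
  q = 1 + \frac{2}{n+1}, \qquad n = 5 \;\Rightarrow\; q = 1 + \tfrac{2}{6} = \tfrac{4}{3} \approx 1.33
\]
```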

  5. Statistical analysis of Geopotential Height (GH) timeseries based on Tsallis non-extensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.

    2018-02-01

    In this paper, we perform a statistical analysis of time series deriving from Earth's climate. The time series are concerned with Geopotential Height (GH) and correspond to temporal and spatial components of the global distribution of monthly average values during the period 1948-2012. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of Tsallis' q-triplet, namely {qstat, qsens, qrel}, the reconstructed phase space, and the estimation of the correlation dimension and the Hurst exponent of rescaled range analysis (R/S). The deviation of the Tsallis q-triplet from unity indicates a non-Gaussian (Tsallis q-Gaussian) non-extensive character with heavy-tailed probability density functions (PDFs), multifractal behavior and long-range dependences for all time series considered. Noticeable differences in the q-triplet estimates were also found in the time series at distinct local or temporal regions. Moreover, the reconstructed phase space revealed a lower-dimensional fractal set in the GH dynamical phase space (strong self-organization), and the estimation of the Hurst exponent indicated multifractality, non-Gaussianity and persistence. The analysis gives significant information for identifying and characterizing the dynamical characteristics of Earth's climate.

  6. Evidence of non-extensivity and complexity in the seismicity observed during 2011-2012 at the Santorini volcanic complex, Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Tzanis, A.; Michas, G.; Papadakis, G.

    2012-04-01

    Since the middle of summer 2011, an increase in the seismicity rates of the volcanic complex system of Santorini Island, Greece, was observed. In the present work, the temporal distribution of seismicity, as well as the magnitude distribution of earthquakes, have been studied using the concept of Non-Extensive Statistical Physics (NESP; Tsallis, 2009) along with the evolution of Shannon entropy H (also called information entropy). The analysis is based on the earthquake catalogue of the Geodynamic Institute of the National Observatory of Athens for the period July 2011-January 2012 (http://www.gein.noa.gr/). Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, seems a suitable framework for studying complex systems. The observed distributions of seismicity rates at Santorini can be described (fitted) exceptionally well with NESP models. This implies the inherent complexity of the Santorini volcanic seismicity, the applicability of NESP concepts to volcanic earthquake activity and the usefulness of NESP in investigating phenomena exhibiting multifractality and long-range coupling effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).

  7. Characteristics of Electronegative Plasma Sheath with q-Nonextensive Electron Distribution

    NASA Astrophysics Data System (ADS)

    Borgohain, D. R.; Saharia, K.

    2018-01-01

    The characteristics of the sheath in a plasma system containing q-nonextensive electrons, cold fluid ions, and Boltzmann-distributed negative ions are investigated. A modified Bohm sheath criterion is derived by using the Sagdeev pseudopotential technique. It is found that the proposed Bohm velocity depends on the degree of nonextensivity (q), the negative ion temperature to nonextensive electron temperature ratio (σ), and the negative ion density (B). Using the modified Bohm sheath criterion, the sheath characteristics, such as the spatial distribution of the potential, positive ion velocity, and density profile, have been numerically investigated, which clearly shows the effect of negative ions, as well as the nonextensive distribution of electrons. It is found that, as the nonextensivity parameter and the electronegativity increase, the electrostatic sheath potential increases sharply and the sheath width decreases.
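For context, the q-nonextensive electron density that typically enters such sheath and wave models is of the form below (a hedged reminder of the expression commonly used in this literature, not copied from the paper):

```latex
% electron number density for q-nonextensive (Tsallis-distributed) electrons in a potential phi;
% the Boltzmann relation is recovered in the limit q -> 1.
\[
  n_e(\phi) = n_{e0}\left[1 + (q-1)\,\frac{e\phi}{k_B T_e}\right]^{\frac{q+1}{2(q-1)}},
  \qquad \lim_{q \to 1} n_e(\phi) = n_{e0}\,e^{e\phi/k_B T_e}
\]
```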

  8. From QCD-based hard-scattering to nonextensive statistical mechanical descriptions of transverse momentum spectra in high-energy pp and pp̄ collisions

    DOE PAGES

    Wong, Cheuk-Yin; Wilk, Grzegorz; Cirto, Leonardo J. L.; ...

    2015-06-22

    Transverse spectra of both jets and hadrons obtained in high-energy pp and pp̄ collisions at central rapidity exhibit power-law behavior of 1/p_T^n at high p_T. The power index n is 4-5 for jet production and is slightly greater for hadron production. Furthermore, the hadron spectra spanning over 14 orders of magnitude down to the lowest p_T region in pp collisions at the LHC can be adequately described by a single nonextensive statistical mechanical distribution that is widely used in other branches of science. This suggests indirectly the dominance of the hard-scattering process over essentially the whole p_T region at central rapidity in pp collisions at the LHC. We show here direct evidence of such a dominance of the hard-scattering process by investigating the power index of UA1 jet spectra over an extended p_T region and the two-particle correlation data of the STAR and PHENIX Collaborations in high-energy pp and pp̄ collisions at central rapidity. We then study how the showering of the hard-scattering product partons alters the power index of the hadron spectra and leads to a hadron distribution that can be cast into a single-particle non-extensive statistical mechanical distribution. Lastly, because of such a connection, the non-extensive statistical mechanical distribution can be considered as a lowest-order approximation of the hard scattering of partons followed by the subsequent process of parton showering that turns the jets into hadrons, in high-energy pp and pp̄ collisions.
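The single nonextensive distribution referred to above is usually written in a Tsallis/Hagedorn-type form such as the following (hedged; the normalization and the precise n-q mapping vary by convention):

```latex
% Tsallis/Hagedorn-type transverse-momentum spectrum; at large p_T it behaves as a
% power law ~ 1/p_T^n with n = 1/(q-1).
\[
  \left.\frac{dN}{p_T\,dp_T\,dy}\right|_{y\approx 0}
    = A\left[1 + (q-1)\,\frac{p_T}{T}\right]^{-\frac{1}{q-1}}
    \equiv A\left(1 + \frac{p_T}{nT}\right)^{-n},
  \qquad n = \frac{1}{q-1}
\]
```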

  9. Non-Extensive Statistical Analysis of Solar Wind Electric, Magnetic Fields and Solar Energetic Particle time series.

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Malandraki, O.; Khabarova, O.; Livadiotis, G.; Pavlos, E.; Karakatsanis, L. P.; Iliopoulos, A. C.; Parisis, K.

    2017-12-01

    In this work we study the non-extensivity of Solar Wind space plasma by using electric-magnetic field data obtained by in situ spacecraft observations at different dynamical states of the solar wind system, especially in interplanetary coronal mass ejections (ICMEs), interplanetary shocks, magnetic islands, or near the Earth Bow shock. Especially, we study the energetic particle non-extensive fractional acceleration mechanism producing kappa distributions as well as the intermittent turbulence mechanism producing multifractal structures related to the Tsallis q-entropy principle. We present some new and significant results concerning the dynamics of ICMEs observed in the near-Earth solar wind environment at L1, as well as their effects in Earth's magnetosphere and in magnetic islands. In-situ measurements of energetic particles at L1 are analyzed, in response to major solar eruptive events at the Sun (intense flares, fast CMEs). The statistical characteristics are obtained and compared for the Solar Energetic Particles (SEPs) originating at the Sun, the energetic particle enhancements associated with local acceleration during the CME-driven shock passage over the spacecraft (Energetic Particle Enhancements, ESPs) as well as the energetic particle signatures observed during the passage of the ICME. The results are referred to Tsallis non-extensive statistics and in particular to the estimation of Tsallis q-triplet, (qstat, qsen, qrel) of electric-magnetic field and the kappa distributions of solar energetic particles time series of the ICME, magnetic islands, resulting from the solar eruptive activity or the internal Solar Wind dynamics. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the magnetic field dynamics both in time and space domains during the shock event, in terms of rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states.

  10. Non-extensive statistical analysis of magnetic field during the March 2012 ICME event using a multi-spacecraft approach

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Malandraki, O. E.; Pavlos, E. G.; Iliopoulos, A. C.; Karakatsanis, L. P.

    2016-12-01

    In this study we present some new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as its effect in Earth's magnetosphere. The results are referred to Tsallis non-extensive statistics and in particular to the estimation of Tsallis q-triplet, (qstat, qsen, qrel) of magnetic field time series of the ICME observed at the Earth resulting from the solar eruptive activity on March 7, 2012 at the Sun. For this, we used a multi-spacecraft approach based on data experiments from ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis different time periods were considered, sorted as "quiet", "shock" and "aftershock", while different space domains such as the Interplanetary space (near Earth at L1 and upstream of the Earth's bowshock), the Earth's magnetosheath and magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the magnetic field dynamics both in time and space domains during the shock event, in terms of rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. So far, Tsallis non-extensive statistical theory and Tsallis extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal strong universality character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014a,b; Karakatsanis et al. 2013). Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long spatio-temporal correlations, fractional acceleration and Non-Equilibrium Stationary States (NESS) or non-equilibrium self-organization process and non-equilibrium phase transition and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results reveal clearly strong self-organization and development of macroscopic ordering of the plasma system related to the strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasmas region during the CME event.

  11. Nonextensive models for earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, R.; Franca, G.S.; Vilar, C.S.

    2006-02-15

    We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of the fragment, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., the energy density, differ considerably by several orders of magnitude.

  12. Nonextensive models for earthquakes.

    PubMed

    Silva, R; França, G S; Vilar, C S; Alcaniz, J S

    2006-02-01

    We have revisited the fragment-asperity interaction model recently introduced by Sotolongo-Costa and Posadas [Phys. Rev. Lett. 92, 048501 (2004)] by considering a different definition for mean values in the context of Tsallis nonextensive statistics and introducing a scale between the earthquake energy and the size of the fragment, ε ∝ r³. The energy-distribution function (EDF) deduced in our approach is considerably different from the one obtained in the above reference. We have also tested the viability of this EDF with data from two different catalogs (in three different areas), namely, the NEIC and the Bulletin Seismic of the Revista Brasileira de Geofísica. Although both approaches provide very similar values for the nonextensive parameter q, other physical quantities, e.g., the energy density, differ considerably by several orders of magnitude.
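
    For reference, the cumulative magnitude-frequency relation that follows from this fragment-asperity approach with ε ∝ r³ is usually quoted in a form such as (a transcription of the commonly cited expression, not copied from either record; the constant a absorbs the energy-magnitude scaling and proportionality factors, and conventions differ slightly between papers):

        \log_{10}\!\left(\frac{N_{>m}}{N}\right)
          = \frac{2-q}{1-q}\,
            \log_{10}\!\left[\,1-\frac{1-q}{2-q}\,\frac{10^{2m}}{a^{2/3}}\right]

    For 1 < q < 2 this decays in a Gutenberg-Richter-like way at large magnitudes, and the nonextensive parameter q is extracted by fitting this expression to catalog data.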

  13. Universal renormalization-group dynamics at the onset of chaos in logistic maps and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Baldovin, F.; Robledo, A.

    2002-10-01

    We uncover the dynamics at the chaos threshold μ∞ of the logistic map and find that it consists of trajectories made of intertwined power laws that reproduce the entire period-doubling cascade that occurs for μ<μ∞. We corroborate this structure analytically via the Feigenbaum renormalization-group (RG) transformation and find that the sensitivity to initial conditions has precisely the form of a q exponential, of which we determine the q index and the q-generalized Lyapunov coefficient λq. Our results are an unequivocal validation of the applicability of the nonextensive generalization of Boltzmann-Gibbs statistical mechanics to critical points of nonlinear maps.
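
    A minimal numerical illustration of this behavior (a sketch only, not the authors' code; the truncated value of μ∞ and the initial separation are arbitrary choices) iterates the logistic map at the chaos threshold and records the sensitivity to initial conditions, whose envelope is expected to grow as a q-exponential rather than exponentially.

        import numpy as np

        MU_INF = 3.5699456718  # approximate chaos threshold (Feigenbaum point) of the logistic map

        def logistic_orbit(x0, mu, n):
            # Iterate x_{t+1} = mu * x_t * (1 - x_t) for n steps.
            x = np.empty(n)
            x[0] = x0
            for i in range(1, n):
                x[i] = mu * x[i - 1] * (1.0 - x[i - 1])
            return x

        dx0 = 1e-9                                  # small initial separation
        a = logistic_orbit(0.5, MU_INF, 5000)
        b = logistic_orbit(0.5 + dx0, MU_INF, 5000)
        xi = np.abs(b - a) / dx0                    # sensitivity to initial conditions xi(t)

        # At mu_inf the envelope of xi(t) follows a q-exponential,
        #   xi(t) ~ [1 + (1 - q) * lambda_q * t]**(1 / (1 - q)),
        # i.e. weak, power-law sensitivity instead of exponential divergence.
        print(xi[::500])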

  14. Evidence of nonextensive statistical physics behavior in the watershed distribution in active tectonic areas: examples from Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos; Kouli, Maria

    2013-08-01

    The Digital Elevation Model (DEM) of Crete Island, with a resolution of approximately 20 meters, was used to delineate watersheds by computing the flow direction and using it in the Watershed function. The Watershed function uses a raster of flow direction to determine the contributing area. The routine Geographic Information Systems procedure was applied, and the watersheds as well as the stream network (using a threshold of 2000 cells, i.e. the minimum number of cells that constitute a stream) were extracted from the hydrologically corrected (free of sinks) DEM. A few thousand watersheds were delineated and their areal extent was calculated. From these, 300 watersheds were finally selected for further analysis, as the watersheds of extremely small area were excluded in order to avoid possible artifacts. Our analysis approach is based on the basic principles of complexity theory and the Tsallis entropy introduced in the frame of non-extensive statistical physics. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including natural hazards, where fractality and long-range interactions are important. The analysis indicates that the statistical distribution of watersheds can be successfully described with the theoretical estimations of non-extensive statistical physics, implying the complexity that characterizes their occurrence.
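
    As an indication of how such an area distribution is typically confronted with the non-extensive prediction, the sketch below (illustrative only; the synthetic areas stand in for the actual Crete watershed data, and A0 and q are free fit parameters) fits a q-exponential survival function P(>A) to ranked basin areas.

        import numpy as np
        from scipy.optimize import curve_fit

        def q_exp_survival(A, A0, q):
            # q-exponential survival function: P(>A) = [1 - (1 - q) * A / A0]**(1 / (1 - q)).
            return (1.0 - (1.0 - q) * A / A0) ** (1.0 / (1.0 - q))

        # Synthetic heavy-tailed "watershed areas" standing in for the delineated basins.
        rng = np.random.default_rng(1)
        areas = np.sort(rng.pareto(2.5, size=300) + 1.0)                  # arbitrary units
        P_emp = 1.0 - np.arange(1, areas.size + 1) / (areas.size + 1.0)   # empirical P(>A)

        popt, _ = curve_fit(q_exp_survival, areas, P_emp, p0=(1.0, 1.3),
                            bounds=([1e-6, 1.0001], [np.inf, 3.0]))
        print("fitted A0 and q:", popt)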

  15. Assessing information content and interactive relationships of subgenomic DNA sequences of the MHC using complexity theory approaches based on the non-extensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Karakatsanis, L. P.; Pavlos, G. P.; Iliopoulos, A. C.; Pavlos, E. G.; Clark, P. M.; Duke, J. L.; Monos, D. S.

    2018-09-01

    This study combines two independent domains of science, the high throughput DNA sequencing capabilities of Genomics and complexity theory from Physics, to assess the information encoded by the different genomic segments of exonic, intronic and intergenic regions of the Major Histocompatibility Complex (MHC) and identify possible interactive relationships. The dynamic and non-extensive statistical characteristics of two well characterized MHC sequences from the homozygous cell lines, PGF and COX, in addition to two other genomic regions of comparable size, used as controls, have been studied using the reconstructed phase space theorem and the non-extensive statistical theory of Tsallis. The results reveal similar non-linear dynamical behavior as far as complexity and self-organization features. In particular, the low-dimensional deterministic nonlinear chaotic and non-extensive statistical character of the DNA sequences was verified with strong multifractal characteristics and long-range correlations. The nonlinear indices repeatedly verified that MHC sequences, whether exonic, intronic or intergenic include varying levels of information and reveal an interaction of the genes with intergenic regions, whereby the lower the number of genes in a region, the less the complexity and information content of the intergenic region. Finally we showed the significance of the intergenic region in the production of the DNA dynamics. The findings reveal interesting content information in all three genomic elements and interactive relationships of the genes with the intergenic regions. The results most likely are relevant to the whole genome and not only to the MHC. These findings are consistent with the ENCODE project, which has now established that the non-coding regions of the genome remain to be of relevance, as they are functionally important and play a significant role in the regulation of expression of genes and coordination of the many biological processes of the cell.

  16. q-triplet for Brazos River discharge: The edge of chaos?

    NASA Astrophysics Data System (ADS)

    Stosic, Tatijana; Stosic, Borko; Singh, Vijay P.

    2018-04-01

    We study the daily discharge data of Brazos River in Texas, USA, from 1900 to 2017, in terms of concepts drawn from the non-extensive statistics recently introduced by Tsallis. We find that the Brazos River discharge indeed follows non-extensive statistics regarding equilibrium, relaxation and sensitivity. Besides being the first such finding of a full-fledged q-triplet in hydrological data with possible future impact on water resources management, the fact that all three Tsallis q-triplet values are remarkably close to those of the logistic map at the onset of chaos opens up new questions towards a deeper understanding of the Brazos River dynamics, that may prove relevant for hydrological research in a more general sense.

  17. Non-extensive quantum statistics with particle-hole symmetry

    NASA Astrophysics Data System (ADS)

    Biró, T. S.; Shen, K. M.; Zhang, B. W.

    2015-06-01

    Based on the Tsallis entropy (1988) and the corresponding deformed exponential function, generalized distribution functions for bosons and fermions have been in use for a while (Teweldeberhan et al., 2003; Silva et al., 2010). However, aiming at a non-extensive quantum statistics, further requirements arise from the symmetric handling of particles and holes (excitations above and below the Fermi level). Naive replacements of the exponential function or "cut and paste" solutions fail to satisfy this symmetry and to be smooth at the Fermi level at the same time. We solve this problem by a general ansatz dividing the deformed exponential into odd and even terms, and demonstrate how earlier suggestions, like the κ- and q-exponential, behave in this respect.

  18. Tsallis p⊥ distribution from statistical clusters

    NASA Astrophysics Data System (ADS)

    Bialas, A.

    2015-07-01

    It is shown that the transverse momentum distributions of particles emerging from the decay of statistical clusters, distributed according to a power law in their transverse energy, closely resemble those following from the Tsallis non-extensive statistical model. The experimental data are well reproduced with the cluster temperature T ≈ 160 MeV.
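
    For orientation, the Tsallis-type transverse momentum spectrum referred to here is commonly written in a form such as (one widespread convention, quoted for context rather than taken from this record; normalization factors and the use of m_T - m versus m_T vary between analyses):

        \frac{dN}{p_T\,dp_T} \;\propto\; \left[1+(q-1)\,\frac{m_T-m}{T}\right]^{-\frac{1}{q-1}},
        \qquad m_T=\sqrt{p_T^{2}+m^{2}},

    which reduces to the Boltzmann exponential exp[-(m_T - m)/T] as q → 1, with q slightly above 1 producing the observed power-law tails at large p_T.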

  19. Statistical similarities of pre-earthquake electromagnetic emissions to biological and economic extreme events

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos

    2014-05-01

    A phenomenon is considered "complex" when it refers to a system whose phenomenological laws, which describe the global behavior of the system, are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those of economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that all three dynamical systems' observables can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event related to each one of these different systems present striking quantitative similarities. It is also demonstrated that, for the considered systems, the nonextensive parameter q increases as the extreme event approaches, which indicates that the strength of the long-memory / long-range interactions between the constituents of the system increases, characterizing the dynamics of the system.

  20. Marginal instability threshold of magnetosonic waves in kappa distributed plasma

    NASA Astrophysics Data System (ADS)

    Bashir, M. F.; Manzoor, M. Z.; Ilie, R.; Yoon, P. H.; Miasli, M. S.

    2017-12-01

    The dispersion relation of the magnetosonic wave is studied using a non-extensive anisotropic counter-streaming distribution that follows Tsallis statistics. The effects of the non-extensivity parameter (q), the counter-streaming parameter (P) and wave-particle interactions on the growth rate and the marginal instability threshold condition of the magnetosonic (MS) mode are analyzed, to provide a possible explanation of the different regions of the Bale diagram obtained from solar wind data at 1 AU, represented as a plot of temperature anisotropy (T⊥/T∥) versus parallel plasma beta (β∥). It is shown that most of the regions of the Bale diagram are bounded by the MS instability under different conditions and are best fitted by the non-extensive distribution. The results for the bi-kappa and bi-Maxwellian distributions are also recovered in the appropriate limits.
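
    To indicate the kind of marginal-stability boundaries being referred to (a schematic sketch only; the functional form is the generic inverse power-law fit commonly used for such boundaries, and the coefficients a, b, beta0 below are placeholders rather than the thresholds derived in this work), anisotropy-versus-beta curves can be drawn as follows.

        import numpy as np

        def threshold_curve(beta_par, a, b, beta0):
            # Generic marginal-stability fit: T_perp/T_par = 1 + a / (beta_par - beta0)**b,
            # the form commonly used to bound solar wind data in anisotropy-beta diagrams.
            return 1.0 + a / (beta_par - beta0) ** b

        beta_par = np.logspace(-2, 1, 200)                           # parallel plasma beta
        upper = threshold_curve(beta_par, a=0.4, b=0.5, beta0=0.0)   # boundary on the T_perp/T_par > 1 side
        lower = threshold_curve(beta_par, a=-0.4, b=0.5, beta0=0.0)  # boundary on the T_perp/T_par < 1 side
        # Placeholder coefficients; real fits are quoted per instability and are only
        # meaningful over the beta range where the curves remain positive.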

  1. Nonextensive Thomas-Fermi model

    NASA Astrophysics Data System (ADS)

    Shivamoggi, Bhimsen; Martinenko, Evgeny

    2007-11-01

    The nonextensive Thomas-Fermi model was further investigated in the following directions: a heavy atom in a strong magnetic field. Following Shivamoggi's work on the extension of the Kadomtsev equation, we applied the nonextensive formalism to further generalize the TF model for very strong magnetic fields (of order 10^12 G). The generalized TF equation and the binding energy of the atom were calculated; both contain a new nonextensive term dominating the classical one. The binding energy of a heavy atom was also evaluated. The Thomas-Fermi equations in N dimensions are technically the same as in Shivamoggi (1998), but the behavior is different, and in the interesting 2D case nonextensivity prevents the equation from becoming a linear ODE as in the classical case. The effect of nonextensivity on dielectric screening reveals itself in the reduction of the envelope radius. It was shown that nonextensivity in each case is responsible for a new term dominating the classical thermal correction term by an order of magnitude, a term which vanishes in the limit q → 1. Therefore it appears that the nonextensive term is ubiquitous for a wide range of systems, and further work is needed to understand its origin.

  2. A family of nonlinear Schrödinger equations admitting q-plane wave solutions

    NASA Astrophysics Data System (ADS)

    Nobre, F. D.; Plastino, A. R.

    2017-08-01

    Nonlinear Schrödinger equations with power-law nonlinearities have attracted considerable attention recently. Two previous proposals for these types of equations, corresponding respectively to the Gross-Pitaevskii equation and to the one associated with nonextensive statistical mechanics, are here unified into a single, parameterized family of nonlinear Schrödinger equations. Power-law nonlinear terms characterized by exponents depending on a real index q, typical of nonextensive statistical mechanics, are considered in such a way that the Gross-Pitaevskii equation is recovered in the limit q → 1. A classical field theory shows that, due to these nonlinearities, an extra field Φ(x⃗, t) (besides the usual one Ψ(x⃗, t)) must be introduced for consistency. The new field can be identified with Ψ*(x⃗, t) only when q → 1. For q ≠ 1 one has a pair of coupled nonlinear wave equations governing the joint evolution of the complex-valued fields Ψ(x⃗, t) and Φ(x⃗, t). These equations reduce to the usual pair of complex-conjugate ones only in the q → 1 limit. Interestingly, the nonlinear equations obeyed by Ψ(x⃗, t) and Φ(x⃗, t) exhibit a common, soliton-like, traveling solution, which is expressible in terms of the q-exponential function that naturally emerges within nonextensive statistical mechanics.
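
    The q-plane waves referred to in the title are generalizations of exp[i(kx - ωt)] built from the q-exponential; in one space dimension they are usually written as (a transcription of the standard form from the nonextensive literature, quoted for context rather than excerpted from this paper):

        \Psi_q(x,t) \;\propto\; \big[\,1+(1-q)\,i\,(kx-\omega t)\,\big]^{\frac{1}{1-q}}
        \;\equiv\; \exp_q\!\big[i(kx-\omega t)\big],

    which recovers the ordinary plane wave as q → 1 and underlies the soliton-like traveling solutions mentioned above.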

  3. Nonextensivity at the Circum-Pacific subduction zones-Preliminary studies

    NASA Astrophysics Data System (ADS)

    Scherrer, T. M.; França, G. S.; Silva, R.; de Freitas, D. B.; Vilar, C. S.

    2015-05-01

    Following the fragment-asperity interaction model introduced by Sotolongo-Costa and Posadas (2004) and revised by Silva et al. (2006), we try to explain the nonextensive effect in the context of the asperity model designed by Lay and Kanamori (1981). To address this issue, we used data from the NEIC catalog for the decade between 2001 and 2010, in order to investigate the so-called Circum-Pacific subduction zones. We propose a geophysical explanation for the nonextensive parameter q. The results need further investigation; however, evidence of a correlation between the nonextensive parameter and the asperity model is shown, i.e., the q-value is higher for areas with larger asperities and stronger coupling.

  4. Modulational instability: Conservation laws and bright soliton solution of ion-acoustic waves in electron-positron-ion-dust plasmas

    NASA Astrophysics Data System (ADS)

    EL-Kalaawy, O. H.

    2018-02-01

    We consider the nonlinear propagation of non-planar (cylindrical and spherical) ion-acoustic (IA) envelope solitary waves in an unmagnetized plasma consisting of electron-positron-ion-dust plasma with two-electron temperature distributions in the context of the non-extensive statistics. The basic set of fluid equations is reduced to the modified nonlinear Schrödinger (MNLS) equation in cylindrical and spherical geometry by using the reductive perturbation method (RPM). It is found that the nature of the modulational instabilities would be significantly modified due to the effects of the non-extensive and other plasma parameters as well as cylindrical and spherical geometry. Conservation laws of the MNLS equation are obtained by Lie symmetry and multiplier method. A new exact solution (envelope bright soliton) is obtained by the extended homogeneous balance method. Finally, we study the results of this article.

  5. Dynamics of regional brain activity in epilepsy: a cross-disciplinary study on both intracranial and scalp-recorded epileptic seizures.

    PubMed

    Minadakis, George; Ventouras, Errikos; Gatzonis, Stylianos D; Siatouni, Anna; Tsekou, Hara; Kalatzis, Ioannis; Sakas, Damianos E; Stonham, John

    2014-04-01

    Recent cross-disciplinary literature suggests a dynamical analogy between earthquakes and epileptic seizures. This study extends the focus of inquiry for the applicability of models for earthquake dynamics to examine both scalp-recorded and intracranial electroencephalogram recordings related to epileptic seizures. First, we provide an updated definition of the electric event in terms of magnitude and we focus on the applicability of (i) a model for earthquake dynamics, rooted in a nonextensive Tsallis framework, (ii) the traditional Gutenberg and Richter law and (iii) an alternative method for the magnitude-frequency relation for earthquakes. Second, we apply spatiotemporal analysis in terms of nonextensive statistical physics and we further examine the behavior of the parameters included in the nonextensive formula for both types of electroencephalogram recordings under study. We confirm the previously observed power-law distribution, showing that the nonextensive formula can adequately describe the sequences of electric events included in both types of electroencephalogram recordings. We also show the intermittent behavior of the epileptic seizure cycle which is analogous to the earthquake cycles and we provide evidence of self-affinity of the regional electroencephalogram epileptic seizure activity. This study may provide a framework for the analysis and interpretation of epileptic brain activity and other biological phenomena with similar underlying dynamical mechanisms.

  6. Chandrasekhar's dynamical friction and non-extensive statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, J.M.; Lima, J.A.S.; De Souza, R.E.

    2016-05-01

    The motion of a point-like object of mass M passing through the background potential of massive collisionless particles (m << M) suffers a steady deceleration named dynamical friction. In his classical work, Chandrasekhar assumed a Maxwellian velocity distribution in the halo and neglected the self-gravity of the wake induced by the gravitational focusing of the mass M. In this paper, by relaxing the validity of the Maxwellian distribution due to the presence of long-range forces, we derive an analytical formula for the dynamical friction in the context of the q-nonextensive kinetic theory. In the extensive limiting case (q = 1), the classical Gaussian Chandrasekhar result is recovered. As an application, the dynamical friction timescale for globular clusters spiraling to the galactic center is explicitly obtained. Our results suggest that the problem concerning the large timescale as derived by numerical N-body simulations or semi-analytical models can be understood as a departure from the standard extensive Maxwellian regime as measured by the Tsallis nonextensive q-parameter.

  7. Tsallis non-extensive statistical mechanics in the ionospheric detrended total electron content during quiet and storm periods

    NASA Astrophysics Data System (ADS)

    Ogunsua, B. O.; Laoye, J. A.

    2018-05-01

    In this paper, the Tsallis non-extensive q-statistics of ionospheric dynamics is investigated using the total electron content (TEC) obtained from two Global Positioning System (GPS) receiver stations. This investigation was carried out considering geomagnetically quiet and storm periods. The micro-density variation of the ionospheric total electron content was extracted from the TEC data by a method of detrending. The detrended total electron content, which represents the variation in the internal dynamics of the system, was further analyzed within non-extensive statistical mechanics using q-Gaussian methods. Our results reveal that for all the analyzed data sets a Tsallis Gaussian probability distribution (q-Gaussian) with q > 1 was obtained. It was observed that there is no distinct difference in pattern between the values of qquiet and qstorm. However, the values of q vary with geophysical conditions and possibly with the local dynamics of the two stations. Also observed are the asymmetric pattern of the q-Gaussian and a highly significant level of correlation between the q-index values obtained at the two GPS receiver stations for the storm periods compared to the quiet periods. The factors responsible for this variation can be mostly attributed to the varying mechanisms resulting in the self-reorganization of the system dynamics during the storm periods. The results show the existence of long-range correlations for both quiet and storm periods at the two stations.

  8. Nonadditive entropy Sq and nonextensive statistical mechanics: Applications in geophysics and elsewhere

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2012-06-01

    The celebrated Boltzmann-Gibbs (BG) entropy, S_BG = -k Σ_i p_i ln p_i, and the associated statistical mechanics are essentially based on hypotheses such as ergodicity, i.e., when ensemble averages coincide with time averages. This dynamical simplification occurs in classical systems (and quantum counterparts) whose microscopic evolution is governed by a positive largest Lyapunov exponent (LLE). Under such circumstances, relevant microscopic variables behave, from the probabilistic viewpoint, as (nearly) independent. Many phenomena exist, however, in natural, artificial and social systems (geophysics, astrophysics, biophysics, economics, and others) that violate ergodicity. To cover a (possibly) wide class of such systems, a generalization (nonextensive statistical mechanics) of the BG theory was proposed in 1988. This theory is based on nonadditive entropies such as S_q = k (1 - Σ_i p_i^q)/(q - 1), with S_1 = S_BG. Here we comment on some central aspects of this theory, and briefly review typical predictions, verifications and applications in geophysics and elsewhere, as illustrated through theoretical, experimental, observational, and computational results.
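
    A small self-contained numerical check of the two entropy expressions quoted above (a sketch; the probability vector is arbitrary) confirms that S_q reduces to the Boltzmann-Gibbs entropy as q → 1.

        import numpy as np

        def tsallis_entropy(p, q, k=1.0):
            # S_q = k * (1 - sum_i p_i**q) / (q - 1); tends to -k * sum_i p_i * ln(p_i) as q -> 1.
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            if np.isclose(q, 1.0):
                return -k * np.sum(p * np.log(p))
            return k * (1.0 - np.sum(p ** q)) / (q - 1.0)

        p = [0.5, 0.3, 0.2]
        print(tsallis_entropy(p, 1.0))        # Boltzmann-Gibbs value
        print(tsallis_entropy(p, 1.000001))   # nearly identical, as expected
        print(tsallis_entropy(p, 2.0))        # S_2 = 1 - sum_i p_i**2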

  9. Two solitons oblique collision in anisotropic non-extensive dusty plasma

    NASA Astrophysics Data System (ADS)

    El-Labany, S. K.; El-Taibany, W. F.; Behery, E. E.; Fouda, S. M.

    2017-03-01

    Using an extended Poincaré-Lighthill-Kuo method, the oblique collision of two dust acoustic solitons (DASs) in a magnetized non-extensive plasma with the effect of dust pressure anisotropy is studied. The dust fluid is supposed to have an arbitrary charge. A couple of Korteweg-de Vries (KdV) equations are derived for the colliding DASs. The phase shift of each soliton is obtained. It is found that the dust pressure anisotropy and the non-extensive parameters for electrons and ions play an important role in determining the collision phase shifts. The present results show that, for the negative dust case, the phase shift of the first soliton decreases, while that of the second soliton increases, as either the dust pressure ratio increases or the ion non-extensive parameter decreases. On the other hand, for the positive dust case, the phase shift of the first soliton decreases, while the phase shift of the second soliton increases, as either the dust pressure ratio or the ion non-extensive parameter increases. The application of the present findings to some dusty plasma phenomena occurring in space and laboratory plasmas is briefly discussed.

  10. Asymmetry of price returns—Analysis and perspectives from a non-extensive statistical physics point of view

    PubMed Central

    Bil, Łukasz; Zienowicz, Magdalena

    2017-01-01

    We study how the approach grounded on non-extensive statistical physics can be applied to describe and distinguish different stages of the stock and money market development. A particular attention is given to asymmetric behavior of fat tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails in terms of the Tsallis parameters q± is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE) in Poland and—for comparison—data from the most mature money market (Forex). It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main strength is put on a description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. The particular caution in this context is addressed to the difference between intraday and interday returns. Our search is extended to study memory effects and their dependence on the quotation frequency for similar large companies—owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco)—but traded on various European stock markets of diversified economical maturity (respectively in Warsaw, Lisbon, Paris and London). The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within European Union (EU) may be a target of diversified level of speculations involved in trading independently on the true economic situation of the company. Our work thus gives indications that the statement:” where you are is more important than who you are” is true on trading markets. PMID:29190696

  11. Asymmetry of price returns-Analysis and perspectives from a non-extensive statistical physics point of view.

    PubMed

    Bil, Łukasz; Grech, Dariusz; Zienowicz, Magdalena

    2017-01-01

    We study how the approach grounded on non-extensive statistical physics can be applied to describe and distinguish different stages of the stock and money market development. A particular attention is given to asymmetric behavior of fat tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails in terms of the Tsallis parameters q± is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE) in Poland and-for comparison-data from the most mature money market (Forex). It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main strength is put on a description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. The particular caution in this context is addressed to the difference between intraday and interday returns. Our search is extended to study memory effects and their dependence on the quotation frequency for similar large companies-owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco)-but traded on various European stock markets of diversified economical maturity (respectively in Warsaw, Lisbon, Paris and London). The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within European Union (EU) may be a target of diversified level of speculations involved in trading independently on the true economic situation of the company. Our work thus gives indications that the statement:" where you are is more important than who you are" is true on trading markets.

  12. Dynamics of regional brain activity in epilepsy: a cross-disciplinary study on both intracranial and scalp-recorded epileptic seizures

    NASA Astrophysics Data System (ADS)

    Minadakis, George; Ventouras, Errikos; Gatzonis, Stylianos D.; Siatouni, Anna; Tsekou, Hara; Kalatzis, Ioannis; Sakas, Damianos E.; Stonham, John

    2014-04-01

    Objective. Recent cross-disciplinary literature suggests a dynamical analogy between earthquakes and epileptic seizures. This study extends the focus of inquiry for the applicability of models for earthquake dynamics to examine both scalp-recorded and intracranial electroencephalogram recordings related to epileptic seizures. Approach. First, we provide an updated definition of the electric event in terms of magnitude and we focus on the applicability of (i) a model for earthquake dynamics, rooted in a nonextensive Tsallis framework, (ii) the traditional Gutenberg and Richter law and (iii) an alternative method for the magnitude-frequency relation for earthquakes. Second, we apply spatiotemporal analysis in terms of nonextensive statistical physics and we further examine the behavior of the parameters included in the nonextensive formula for both types of electroencephalogram recordings under study. Main results. We confirm the previously observed power-law distribution, showing that the nonextensive formula can adequately describe the sequences of electric events included in both types of electroencephalogram recordings. We also show the intermittent behavior of the epileptic seizure cycle which is analogous to the earthquake cycles and we provide evidence of self-affinity of the regional electroencephalogram epileptic seizure activity. Significance. This study may provide a framework for the analysis and interpretation of epileptic brain activity and other biological phenomena with similar underlying dynamical mechanisms.

  13. Magnetic storms and solar flares: can be analysed within similar mathematical framework with other extreme events?

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Potirakis, Stelios M.; Papadimitriou, Constantinos; Zitis, Pavlos I.; Eftaxias, Konstantinos

    2015-04-01

    The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches of different types of natural, artificial, and social systems. We apply concepts of the nonextensive statistical physics, on time-series data of observable manifestations of the underlying complex processes ending up to different extreme events, in order to support the suggestion that a dynamical analogy characterizes the generation of a single magnetic storm, solar flare, earthquake (in terms of pre-seismic electromagnetic signals) , epileptic seizure, and economic crisis. The analysis reveals that all the above mentioned different extreme events can be analyzed within similar mathematical framework. More precisely, we show that the populations of magnitudes of fluctuations included in all the above mentioned pulse-like-type time series follow the traditional Gutenberg-Richter law as well as a nonextensive model for earthquake dynamics, with similar nonextensive q-parameter values. Moreover, based on a multidisciplinary statistical analysis we show that the extreme events are characterized by crucial common symptoms, namely: (i) high organization, high compressibility, low complexity, high information content; (ii) strong persistency; and (iii) existence of clear preferred direction of emerged activities. These symptoms clearly discriminate the appearance of the extreme events under study from the corresponding background noise.

  14. Dynamical analogy between economical crisis and earthquake dynamics within the nonextensive statistical mechanics framework

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Zitis, Pavlos I.; Eftaxias, Konstantinos

    2013-07-01

    The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches of different types of natural, artificial, and social systems. Several authors have suggested that earthquake dynamics and the dynamics of economic (financial) systems can be analyzed within similar mathematical frameworks. We apply concepts of the nonextensive statistical physics, on time-series data of observable manifestations of the underlying complex processes ending up with these different extreme events, in order to support the suggestion that a dynamical analogy exists between a financial crisis (in the form of share or index price collapse) and a single earthquake. We also investigate the existence of such an analogy by means of scale-free statistics (the Gutenberg-Richter distribution of event sizes). We show that the populations of: (i) fracto-electromagnetic events rooted in the activation of a single fault, emerging prior to a significant earthquake, (ii) the trade volume events of different shares/economic indices, prior to a collapse, and (iii) the price fluctuation (considered as the difference of maximum minus minimum price within a day) events of different shares/economic indices, prior to a collapse, follow both the traditional Gutenberg-Richter law as well as a nonextensive model for earthquake dynamics, with similar parameter values. The obtained results imply the existence of a dynamic analogy between earthquakes and economic crises, which moreover follow the dynamics of seizures, magnetic storms and solar flares.

  15. Dust-acoustic waves and stability in the permeating dusty plasma. II. Power-law distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gong Jingyu; Du Jiulin; Liu Zhipeng

    2012-08-15

    The dust-acoustic waves and the stability theory for the permeating dusty plasma with power-law distributions are studied by using nonextensive q-statistics. In two limiting physical cases, when the thermal velocity of the flowing dusty plasma is much larger than, and much smaller than, the phase velocity of the waves, we derived the dust-acoustic wave frequency, the instability growth rate, and the instability critical flowing velocity. As compared with the formulae obtained in part I [Gong et al., Phys. Plasmas 19, 043704 (2012)], all formulae of the present cases and the resulting plasma characteristics are q-dependent, and the power-law distribution of each plasma component of the permeating dusty plasma has a different q-parameter and thus has a different nonextensive effect. Further, we make numerical analyses of an example in which a cometary plasma tail is passing through the interplanetary space dusty plasma and we show that these power-law distributions have significant effects on the plasma characteristics of this kind of plasma environment.

  16. Relativity, nonextensivity, and extended power law distributions.

    PubMed

    Silva, R; Lima, J A S

    2005-11-01

    A proof of the relativistic H-theorem including nonextensive effects is given. As happens in the nonrelativistic limit, the molecular chaos hypothesis advanced by Boltzmann does not remain valid, and the second law of thermodynamics combined with a duality transformation implies that the nonextensive parameter q lies in the interval [0,2]. It is also proven that the collisional equilibrium states (null entropy source term) are described by the relativistic power-law extension of the exponential Jüttner distribution, which reduces, in the nonrelativistic domain, to the Tsallis power-law function. As a simple illustration of the basic approach, we derive the relativistic nonextensive equilibrium distribution for a dilute charged gas under the action of an electromagnetic field. Such results reduce to the standard ones in the extensive limit, thereby showing that the nonextensive entropic framework can be harmonized with the space-time ideas contained in the special relativity theory.

  17. Electron-acoustic rogue waves in a plasma with Tribeche–Tsallis–Cairns distributed electrons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merriche, Abderrzak; Tribeche, Mouloud, E-mail: mouloudtribeche@yahoo.fr; Algerian Academy of Sciences and Technologies, Algiers

    2017-01-15

    The problem of electron-acoustic (EA) rogue waves in a plasma consisting of fluid cold electrons, nonthermal nonextensive electrons and stationary ions, is addressed. A standard multiple scale method has been carried out to derive a nonlinear Schrödinger-like equation. The coefficients of dispersion and nonlinearity depend on the nonextensive and nonthermal parameters. The EA wave stability is analyzed. Interestingly, it is found that the wave number threshold, above which the EA wave modulational instability (MI) sets in, increases as the nonextensive parameter increases. As the nonthermal character of the electrons increases, the MI occurs at large wavelength. Moreover, it is shown that as the nonextensive parameter increases, the EA rogue wave pulse grows while its width is narrowed. The amplitude of the EA rogue wave decreases with an increase of the number of energetic electrons. In the absence of nonthermal electrons, the nonextensive effects are more perceptible and more noticeable. In view of the crucial importance of rogue waves, our results can contribute to the understanding of localized electrostatic envelope excitations and underlying physical processes, that may occur in space as well as in laboratory plasmas.

  18. On the nature and dynamics of the seismogenetic systems of North California, USA: An analysis based on Non-Extensive Statistical Physics

    NASA Astrophysics Data System (ADS)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2017-09-01

    We examine the nature of the seismogenetic system in North California, USA, by searching for evidence of complexity and non-extensivity in the earthquake record. We attempt to determine whether earthquakes are generated by a self-excited Poisson process, in which case they obey Boltzmann-Gibbs thermodynamics, or by a Critical process, in which long-range interactions in non-equilibrium states are expected (correlation) and the thermodynamics deviate from the Boltzmann-Gibbs formalism. Emphasis is given to background seismicity since it is generally agreed that aftershock sequences comprise correlated sets. We use the complete and homogeneous earthquake catalogue published by the North California Earthquake Data Centre, in which aftershocks are either included, or have been removed by a stochastic declustering procedure. We examine multivariate cumulative frequency distributions of earthquake magnitudes, interevent time and interevent distance in the context of Non-Extensive Statistical Physics, which is a generalization of extensive Boltzmann-Gibbs thermodynamics to non-equilibrating (non-extensive) systems. Our results indicate that the seismogenetic systems of North California are generally sub-extensive complex and non-Poissonian. The background seismicity exhibits long-range interaction as evidenced by the overall increase of correlation observed by declustering the earthquake catalogues, as well as by the high correlation observed for earthquakes separated by long interevent distances. It is also important to emphasize that two subsystems with rather different properties appear to exist. The correlation observed along the Sierra Nevada Range - Walker Lane is quasi-stationary and indicates a Self-Organized Critical fault system. Conversely, the north segment of the San Andreas Fault exhibits changes in the level of correlation with reference to the large Loma Prieta event of 1989 and thus has attributes of Critical Point behaviour albeit without acceleration of seismic release rates. SOC appears to be a likely explanation of complexity mechanisms but since there are other ways by which complexity may emerge, additional work is required before assertive conclusions can be drawn.

  19. Drift dust acoustic soliton in the presence of field-aligned sheared flow and nonextensivity effects

    NASA Astrophysics Data System (ADS)

    Shah, AttaUllah; Mushtaq, A.; Farooq, M.; Khan, Aurangzeb; Aman-ur-Rehman

    2018-05-01

    Low frequency electrostatic dust drift acoustic (DDA) waves are studied in an inhomogeneous dust magnetoplasma comprised of dust components of opposite polarity, Boltzmannian ions, and nonextensively distributed electrons. The magnetic-field-aligned dust sheared flow drives the electrostatic drift waves in the presence of ions and electrons. The sheared flow decreases or increases the frequency of the DDA wave, mostly depending on its polarity. The conditions of instability for this mode, with nonextensivity and dust streaming effects, are discussed. The nonlinear dynamics is then investigated for the DDA wave by deriving the Korteweg-de Vries (KdV) nonlinear equation. The KdV equation yields an electrostatic structure in the form of a DDA soliton. The relevance of the work to laboratory four-component dusty plasmas is illustrated.

  20. The precise time-dependent solution of the Fokker–Planck equation with anomalous diffusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Ran; Du, Jiulin, E-mail: jiulindu@aliyun.com

    2015-08-15

    We study the time behavior of the Fokker–Planck equation in Zwanzig's rule (the backward-Ito's rule) based on the Langevin equation of Brownian motion with an anomalous diffusion in a complex medium. The diffusion coefficient is a function in momentum space and follows a generalized fluctuation–dissipation relation. We obtain the precise time-dependent analytical solution of the Fokker–Planck equation, and at long time the solution approaches a stationary power-law distribution in nonextensive statistics. As a test, we have numerically demonstrated the accuracy and validity of the time-dependent solution. - Highlights: • The precise time-dependent solution of the Fokker–Planck equation with anomalous diffusion is found. • The anomalous diffusion satisfies a generalized fluctuation–dissipation relation. • At long time the time-dependent solution approaches a power-law distribution in nonextensive statistics. • Numerically we have demonstrated the accuracy and validity of the time-dependent solution.

  1. Effect of q-nonextensive parameter and saturation time on electron density steepening in electron-positron-ion plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hashemzadeh, M., E-mail: hashemzade@gmail.com

    2015-11-15

    The effect of q-nonextensive parameter and saturation time on the electron density steepening in electron-positron-ion plasmas is studied by particle in cell method. Phase space diagrams show that the size of the holes, and consequently, the number of trapped particles strongly depends on the q-parameter and saturation time. Furthermore, the mechanism of the instability and exchange of energy between electron-positron and electric field is explained by the profiles of the energy density. Moreover, it is found that the q-parameter, saturation time, and electron and positron velocities affect the nonlinear evolution of the electron density which leads to the steepening of its structure. The q-nonextensive parameter or degree of nonextensivity is the relation between temperature gradient and potential energy of the system. Therefore, the deviation of q-parameter from unity indicates the degree of inhomogeneity of temperature or deviation from equilibrium. Finally, using the kinetic theory, a generalized q-dispersion relation is presented for electron-positron-ion plasma systems. It is found that the simulation results in the linear regime are in good agreement with the growth rate results obtained by the kinetic theory.

  2. Non-extensive entropy of modified Gaussian quantum dot under polaron effects

    NASA Astrophysics Data System (ADS)

    Bahramiyan, H.; Khordad, R.; Sedehi, H. R. Rastegar

    2018-01-01

    The effect of electron-phonon (e-p) interaction on the non-extensive Tsallis entropy of a modified Gaussian quantum dot has been investigated. In this work, the LO-phonons, SO-phonons and LO + SO-phonons have been considered. It is found that the entropy increases with enhancing the confinement potential range and depth. The entropy decreases with considering the electron-phonon interaction. The electron-LO + SO-phonon interaction has the largest contribution to the entropy.

  3. Nonlinear waves in viscoelastic magnetized complex astroplasmas with polarized dust-charge variations

    NASA Astrophysics Data System (ADS)

    Das, Papari; Karmakar, Pralay Kumar

    2018-01-01

    A nonextensive nonthermal magnetized viscoelastic astrofluid, compositionally containing nonthermal electrons and ions together with massive polarized dust micro-spherical grains of variable electric charge, is allowed to endure weakly nonlinear perturbation around its equilibrium. The nonextensivity originating from the large-scale non-local effects is included via the Tsallis thermo-statistical distribution laws describing the lighter species. Assuming the equilibrium as a homogeneous hydrostatic one, the dust polarization effects are incorporated via the conventional homogeneous polarization force law. The perturbed fluid model evolves as a unique conjugate pair of coupled extended Korteweg-de Vries (e-KdV) equations. A constructed numerical tapestry shows the collective excitations of a new pair of distinct classes of nonlinear mode structures in new parametric space. The first family indicates periodic electrostatic compressive eigenmodes in the form of soliton-chains. Likewise, the second one reveals gravitational rarefactive solitary patterns. Their microphysical multi-parametric dependencies of the eigen-patterns are illustratively analyzed and bolstered. The paper ends up with some promising implications and applications in the astro-cosmo-plasmic context of wave-induced accretive triggering processes responsible for gravitationally bounded (gravito-condensed) astro-structure formation, such as stellesimals, planetsimals, etc.

  4. Effects of dust polarity and nonextensive electrons on the dust-ion acoustic solitons and double layers in earth atmosphere

    NASA Astrophysics Data System (ADS)

    Ghobakhloo, Marzieh; Zomorrodian, Mohammad Ebrahim; Javidan, Kurosh

    2018-05-01

    Propagation of dust-ion acoustic solitary waves (DIASWs) and double layers is discussed in the Earth's atmosphere, using the Sagdeev potential method. The best model for the distribution function of electrons in the Earth's atmosphere is found by fitting available data to different distribution functions. The nonextensive function with parameter q = 0.58 provides the best fit to the observations. Thus we analyze the propagation of localized waves in an unmagnetized plasma containing nonextensive electrons, inertial ions, and negatively/positively charged stationary dust. It is found that both compressive and rarefactive solitons as well as double layers exist, depending on the sign (and the value) of the dust polarity. The characteristics of the propagated waves are described using the presented model.

  5. Propagation of cylindrical ion acoustic waves in a plasma with q-nonextensive electrons with nonthermal distribution

    NASA Astrophysics Data System (ADS)

    El-Depsy, A.; Selim, M. M.

    2016-12-01

    The propagation of ion acoustic waves (IAWs) in a cylindrical collisionless unmagnetized plasma, containing ions and electrons is investigated. The electrons are considered to be nonextensive and follow nonthermal distribution. The reductive perturbation technique (RPT) is used to obtain a nonlinear cylindrical Kadomtsev-Petviashvili (CKP) evolution equation. This equation is solved analytically. The effects of plasma parameters on the IAWs characteristics are discussed in details. Both compressive and rarefactive solitons are found to be created in the proposed plasma system. The profile of IAWs is found to depend on the nonextensive and nonthermal parameters. The present study is useful for understanding IAWs in the regions where mixed electron distribution in space, or laboratory plasmas, exist.

  6. Prehensile apparatus

    DOEpatents

    Smith, C.M.

    1993-10-12

    The present invention relates to an apparatus for handling a workpiece comprising a vessel that is longitudinally extensible and pressurizable, and a nonextensible and laterally flexible member on the vessel. The member constrains one side of the vessel to be nonextensible, causing the vessel to bend in the direction of the nonextensible member when pressurized. 8 figures.

  7. Prehensile apparatus

    DOEpatents

    Smith, Christopher M.

    1993-01-01

    The present invention relates to an apparatus for handling a workpiece comprising a vessel that is longitudinally extensible and pressurizable, and a nonextensible and laterally flexible member on the vessel. The member constrains one side of the vessel to be nonextensible, causing the vessel to bend in the direction of the nonextensible member when pressurized.

  8. The characters of ion acoustic rogue waves in nonextensive plasma

    NASA Astrophysics Data System (ADS)

    Du, Hai-su; Lin, Mai-mai; Gong, Xue; Duan, Wen-shan

    2017-10-01

    Several well-known nonlinear waves arising as rational solutions of the nonlinear Schrödinger equation, such as the Kuznetsov-Ma breather (K-M), bright soliton, rogue wave (RW), Akhmediev breather (AB) and dark soliton, are studied in two-component plasmas consisting of an ion fluid and nonextensive electrons. In this paper, we have investigated the characteristics of the propagation of the K-M, AB and RW in a plasma with a nonextensive electron distribution, and the dependence of the amplitude and width of ion acoustic rogue waves in this system. It is found that the K-M triplet follows an appearance-disappearance-appearance-disappearance pattern, AB solitons appear only once, and the RW is a single wave that appears from nowhere and then disappears. It is also noted that the wave number and the nonextensive parameter of the electrons have a significant influence on the maximum envelope amplitude, but their influence on the width is not significant. At the same time, the effects of the small parameter, which represents the nonlinear strength, on the amplitude and width of the ion acoustic rogue waves are also highlighted.
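
    The rogue wave referred to here is the lowest-order rational solution of the focusing nonlinear Schrödinger equation, the Peregrine soliton; in a standard dimensionless normalization (quoted for orientation, not taken from this record) it reads:

        \psi(x,t) \;=\; \left[\,1-\frac{4\,(1+2it)}{1+4x^{2}+4t^{2}}\,\right] e^{\,it},

    whose amplitude reaches three times the background at x = t = 0 and relaxes back to the plane wave as |t| → ∞, i.e. it "appears from nowhere and disappears without a trace"; the Kuznetsov-Ma and Akhmediev breathers are its time-periodic and space-periodic counterparts.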

  9. Small amplitude two dimensional electrostatic excitations in a magnetized dusty plasma with q-distributed electrons

    NASA Astrophysics Data System (ADS)

    Khan, Shahab Ullah; Adnan, Muhammad; Qamar, Anisa; Mahmood, Shahzad

    2016-07-01

    The propagation of linear and nonlinear electrostatic waves is investigated in magnetized dusty plasma with stationary negatively or positively charged dust, cold mobile ions and non-extensive electrons. Two normal modes are predicted in the linear regime, whose characteristics are investigated parametrically, focusing on the effect of electrons non-extensivity, dust charge polarity, concentration of dust and magnetic field strength. Using the reductive perturbation technique, a Zakharov-Kuznetsov (ZK) type equation is derived which governs the dynamics of small-amplitude solitary waves in magnetized dusty plasma. The properties of the solitary wave structures are analyzed numerically with the system parameters i.e. electrons non-extensivity, concentration of dust, polarity of dust and magnetic field strength. Following Allen and Rowlands (J. Plasma Phys. 53:63, 1995), we have shown that the pulse soliton solution of the ZK equation is unstable, and have analytically traced the dependence of the instability growth rate on the nonextensive parameter q for electrons, dust charge polarity and magnetic field strength. The results should be useful for understanding the nonlinear propagation of DIA solitary waves in laboratory and space plasmas.

  10. Nonextensive Entropy Approach to Space Plasma Fluctuations and Turbulence

    NASA Astrophysics Data System (ADS)

    Leubner, M. P.; Vörös, Z.; Baumjohann, W.

    Spatial intermittency in fully developed turbulence is an established feature of astrophysical plasma fluctuations and is in particular apparent in the interplanetary medium from in situ observations. In this situation, the classical Boltzmann-Gibbs extensive thermo-statistics, applicable when microscopic interactions and memory are short-ranged and the environment is a continuous and differentiable manifold, fails. Upon generalization of the entropy function to nonextensivity, accounting for long-range interactions and thus for correlations in the system, it is demonstrated that the corresponding probability distribution functions (PDFs) are members of a family of specific power-law distributions. In particular, the resulting theoretical bi-κ functional reproduces accurately the observed global leptokurtic, non-Gaussian shape of the increment PDFs of characteristic solar wind variables on all scales, where nonlocality in turbulence is controlled via a multiscale coupling parameter. Gradual decoupling is obtained by enhancing the spatial separation scale, corresponding to increasing κ-values in the case of slow solar wind conditions, where a Gaussian is approached in the limit of large scales. On the contrary, the scaling properties in the high-speed solar wind are predominantly governed by the mean energy or variance of the distribution, appearing as a second parameter in the theory. The PDFs of solar wind scalar field differences are computed from WIND and ACE data for different time lags and bulk speeds and analyzed within the nonextensive theory, where a particular nonlinear dependence of the coupling parameter and variance on scale also arises for the best-fitting theoretical PDFs. Consequently, nonlocality in fluctuations, related to both turbulence and its large-scale driving, should be related to long-range interactions in the context of nonextensive entropy generalization, providing fundamentally the physical background of the observed scale dependence of fluctuations in intermittent space plasmas.

  11. Non-extensive Statistics to the Cosmological Lithium Problem

    NASA Astrophysics Data System (ADS)

    Hou, S. Q.; He, J. J.; Parikh, A.; Kahl, D.; Bertulani, C. A.; Kajino, T.; Mathews, G. J.; Zhao, G.

    2017-01-01

    Big Bang nucleosynthesis (BBN) theory predicts the abundances of the light elements D, 3He, 4He, and 7Li produced in the early universe. The primordial abundances of D and 4He inferred from observational data are in good agreement with predictions; however, BBN theory overestimates the primordial 7Li abundance by about a factor of three. This is the so-called “cosmological lithium problem.” Solutions to this problem using conventional astrophysics and nuclear physics have not been successful over the past few decades, probably indicating the presence of new physics during the era of BBN. We have investigated the impact on BBN predictions of adopting a generalized distribution to describe the velocities of nucleons in the framework of Tsallis non-extensive statistics. This generalized velocity distribution is characterized by a parameter q, and reduces to the usually assumed Maxwell-Boltzmann distribution for q = 1. We find excellent agreement between predicted and observed primordial abundances of D, 4He, and 7Li for 1.069 ≤ q ≤ 1.082, suggesting a possible new solution to the cosmological lithium problem.
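
    As a rough illustration of the statement above, a minimal numerical sketch of a q-generalized speed distribution is given below. It uses the standard Tsallis q-exponential, [1 + (1 - q)x]^(1/(1-q)), in place of the Boltzmann factor, with the normalization done numerically on a grid; this is an illustrative form and not necessarily the exact parametrization adopted by Hou et al.

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1 - q) x]_+^{1/(1-q)}; tends to exp(x) as q -> 1."""
    x = np.asarray(x, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    out = np.zeros_like(base)
    pos = base > 0.0
    out[pos] = base[pos] ** (1.0 / (1.0 - q))
    return out

def q_speed_pdf(v, q, m=1.0, kT=1.0):
    """Illustrative q-generalized speed distribution ~ v^2 * e_q(-m v^2 / (2 kT)),
    normalized numerically on the grid v (the exact normalization used in BBN work may differ)."""
    w = v**2 * q_exp(-m * v**2 / (2.0 * kT), q)
    return w / np.trapz(w, v)

v = np.linspace(0.0, 6.0, 3000)        # speed in units where m = kT = 1
for q in (1.0, 1.08):                  # q = 1 recovers Maxwell-Boltzmann; 1.08 is near the quoted range
    pdf = q_speed_pdf(v, q)
    print(f"q = {q:.2f}: mean speed = {np.trapz(v * pdf, v):.4f}")
```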

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nunes, Rafael C.; Abreu, Everton M.C.; Neto, Jorge Ananias

    Based on the relationship between thermodynamics and gravity we propose, with the aid of Verlinde's formalism, an alternative interpretation of the dynamical evolution of the Friedmann-Robertson-Walker Universe. This description takes into account the entropy and temperature intrinsic to the horizon of the universe due to the information holographically stored there, through the non-Gaussian statistical theories proposed by Tsallis and Kaniadakis. The effect of these non-Gaussian statistics in the cosmological context is to change the strength of the gravitational constant. In this paper, we consider the wCDM model modified by the non-Gaussian statistics and investigate the compatibility of these non-Gaussian modifications with the cosmological observations. In order to analyze to what extent the cosmological data constrain these non-extensive statistics, we use type Ia supernovae, baryon acoustic oscillations, the Hubble expansion rate function, and the linear growth of matter density perturbations data. We show that Tsallis' statistics is favored at the 1σ confidence level.

  13. The roles of non-extensivity and dust concentration as bifurcation parameters in dust-ion acoustic traveling waves in magnetized dusty plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narayan Ghosh, Uday; Kumar Mandal, Pankaj, E-mail: pankajwbmsd@gmail.com; Chatterjee, Prasanta

    Dust ion-acoustic traveling waves are studied in a magnetized dusty plasma in the presence of static dust and non-extensively distributed electrons in the framework of the Zakharov-Kuznetsov-Burgers (ZKB) equation. A system of coupled nonlinear ordinary differential equations is derived from the ZKB equation, and the equilibrium points are obtained. The nonlinear wave phenomena are studied numerically using a fourth-order Runge-Kutta method. The transition of dust ion-acoustic traveling waves from unstable to stable, and subsequently to asymptotically stable, solutions is studied through a dynamical systems approach. It is found that some dramatic features emerge when the non-extensive parameter and the dust concentration parameter are varied. The behavior of the solution of the system changes from unstable to stable and from stable to asymptotically stable depending on the value of the non-extensive parameter. It is also observed that when the dust concentration is increased the solution pattern changes from oscillatory shocks to a periodic solution. Thus, the non-extensive and dust concentration parameters play crucial roles in determining the nature of the stability behavior of the system, and they can be treated as bifurcation parameters.

  14. Compressive and rarefactive double layers in non-uniform plasma with q-nonextensive distributed electrons

    NASA Astrophysics Data System (ADS)

    Shan, S. Ali; Saleem, H.

    2018-05-01

    Electrostatic solitary waves and double layers (DLs) formed by the coupled ion acoustic (IA) and drift waves have been investigated in a non-uniform plasma using a q-nonextensive distribution function for the electrons and assuming the ions to be cold (Ti < Te). It is found that both compressive and rarefactive nonlinear structures (solitary waves and DLs) are possible in such a system. Steeper gradients are supportive of compressive solitary waves (and double layers) and destructive for rarefactive ones. The nonextensivity parameter q and the magnitudes of the gradient scale lengths of density and temperature have significant effects on the amplitude of the solitary waves (and double layers) as well as on the speed of these structures. This theoretical model is general and has been applied here to the F-region ionosphere for illustration.

  15. Tsallis q-triplet, intermittent turbulence and Portevin-Le Chatelier effect

    NASA Astrophysics Data System (ADS)

    Iliopoulos, A. C.; Aifantis, E. C.

    2018-05-01

    In this paper, we extend a previous study concerning the Portevin-Le Chatelier (PLC) effect and Tsallis statistics (Iliopoulos et al., 2015). In particular, we estimate Tsallis' q-triplet, namely {qstat, qsens, qrel}, for two sets of stress serration time series concerning the deformation of a Cu-15%Al alloy corresponding to different deformation temperatures and thus to different types (A and B) of PLC bands. The results of the stress serration analysis reveal that the Tsallis q-triplet attains values different from unity ({qstat, qsens, qrel} ≠ {1,1,1}). In particular, PLC type A band serrations were found to follow Tsallis super-q-Gaussian, non-extensive, sub-additive, multifractal statistics, indicating that the underlying dynamics are at the edge of chaos, characterized by global long-range correlations and power-law scaling. For PLC type B band serrations, the results revealed a Tsallis sub-q-Gaussian, non-extensive, super-additive, multifractal statistical profile. In addition, our results also reveal significant differences in statistical and dynamical features, indicating important variations of the stress field dynamics in terms of the rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. We also estimate parameters commonly used for characterizing fully developed turbulence, such as structure functions and the flatness coefficient (F), in order to provide further information about the underlying dynamics of the jerky flow. Finally, we use two multifractal models developed to describe turbulence, namely the Arimitsu and Arimitsu (A&A) [2000, 2001] theoretical model, which is based on Tsallis statistics, and the p-model, to estimate theoretical multifractal spectra f(α). Furthermore, we estimate the flatness coefficient (F) using a theoretical formula based on Tsallis statistics. The theoretical results are compared with the experimental ones, showing a remarkable agreement between modeling and experiment. The results of this study verify, as well as extend, previous studies which stated that the underlying dynamics of type B and type A PLC bands are connected with distinct dynamical behaviors, namely chaotic behavior for the former and self-organized critical (SOC) behavior for the latter, while they shed new light on the turbulent character of the PLC jerky flow.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niknam, A. R., E-mail: a-niknam@sbu.ac.ir; Rastbood, E.; Khorashadizadeh, S. M.

    The dielectric permittivity tensor of a magnetoactive current-driven plasma is obtained by employing the kinetic theory based on the Vlasov equation and Lorentz transformation formulas, with an emphasis on q-nonextensive statistics. By deriving the q-generalized dispersion relation of the low-frequency modes in this plasma system, the possibility and properties of filamentation and ion acoustic instabilities are then studied. It is shown that the occurrence and the growth rate of these instabilities depend strongly on the nonextensive parameters, the external magnetic field strength, and the drift velocity. It is observed that the growth rate of the ion acoustic instability is affected by the magnetic field strength much more than that of the filamentation instability in the low-frequency range. The external magnetic field facilitates the development of the ion-acoustic instability. It is also shown that the filamentation is the dominant instability only for high values of the drift velocity.

  17. Nonextensive statistics and skin depth of transverse wave in collisional plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hashemzadeh, M., E-mail: hashemzade@gmail.com

    The skin depth of a transverse wave in a collisional plasma is studied taking into account the nonextensive electron distribution function. Considering the kinetic theory for charged particles and using the Bhatnagar-Gross-Krook collision model, a generalized transverse dielectric permittivity is obtained. The transverse dispersion relation in different frequency ranges is investigated. Obtaining the imaginary part of the wave vector from the dispersion relation, the skin depth for these frequency ranges is also obtained. Profiles of the skin depth show that by increasing the q parameter, the penetration depth decreases. In addition, the skin depth increases by increasing the electron temperature. Finally, it is found that in the high-frequency range and at high electron temperature, the penetration depth decreases by increasing the collision frequency. In contrast, by increasing the collision frequency in a highly collisional frequency range, the skin depth of the transverse wave increases.
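
    For orientation, the sketch below shows how a skin depth follows from a complex permittivity in the familiar Maxwellian (q = 1) limit, using the standard collisional (Drude-type) expression eps(ω) = 1 - ωp²/[ω(ω + iν)] rather than the q-generalized permittivity derived in the paper; the skin depth is read off as 1/Im(k) with k = (ω/c)√eps.

```python
import numpy as np

c = 3.0e8   # speed of light [m/s]

def skin_depth(omega, omega_p, nu):
    """Skin depth 1/Im(k) of a transverse wave, using the standard (q = 1, Maxwellian)
    collisional permittivity eps = 1 - omega_p^2 / (omega * (omega + 1j*nu)).
    This is NOT the q-generalized permittivity of the cited paper; it only illustrates
    how the penetration depth is extracted from the imaginary part of the wave vector."""
    eps = 1.0 - omega_p**2 / (omega * (omega + 1j * nu))
    k = (omega / c) * np.sqrt(eps)        # principal branch gives Im(k) >= 0 here
    return 1.0 / k.imag

omega = 2 * np.pi * 1.0e9                 # 1 GHz wave
omega_p = 2 * np.pi * 5.0e9               # overdense plasma
for nu in (1.0e7, 1.0e9):                 # two collision frequencies [s^-1]
    print(f"nu = {nu:.0e} s^-1 -> skin depth = {skin_depth(omega, omega_p, nu) * 100:.3f} cm")
```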

  18. Head-on collision of dust acoustic solitons in a nonextensive plasma with variable size dust grains of arbitrary charge

    NASA Astrophysics Data System (ADS)

    Behery, E. E.

    2016-11-01

    The head-on collision of two dust acoustic solitons (DASs) in a nonextensive plasma with a positive or negative dust grain fluid, including the effect of the dust size distribution (DSD), is studied. The phase shifts for the two solitons due to the collision are derived by applying the extended Poincaré-Lighthill-Kuo (PLK) method. The influences of the power-law DSD and the nonextensivity of the plasma particles on the characteristic properties of the head-on collision of DASs are analyzed. It is found that the phase shifts can vanish, only for the case of positive dust grains, for certain values and ranges of the dust grain radius and the entropic index of ions (qi). Also, they undergo a cutoff in the range qi > 1 for the subextensive distribution. A brief discussion of possible applications in laboratory and space plasmas is included.

  19. Behavior of collisional sheath in electronegative plasma with q-nonextensive electron distribution

    NASA Astrophysics Data System (ADS)

    Borgohain, Dima Rani; Saharia, K.

    2018-03-01

    The electronegative plasma sheath is addressed in a collisional unmagnetized plasma consisting of q-nonextensive electrons, Boltzmann-distributed negative ions, and cold fluid positive ions. Considering positive ion-neutral collisions and ignoring the effects of ionization and of collisions between negative species and positive ions (neutrals), a modified Bohm sheath criterion and hence the floating potential are derived by using a multifluid model. Using the modified Bohm sheath criterion, the sheath characteristics, such as the spatial profiles of density, potential and net space charge density, have been numerically investigated. It is found that increasing values of q-nonextensivity, electronegativity and collisionality lead to a decrease of the sheath thickness and an increase of the sheath potential and the net space charge density. With increasing values of the electron temperature to negative ion temperature ratio, the sheath thickness increases, while the sheath potential as well as the net space charge density in the sheath region decrease.

  20. On the putative essential discreteness of q-generalized entropies

    NASA Astrophysics Data System (ADS)

    Plastino, A.; Rocca, M. C.

    2017-12-01

    It has been argued in Abe (2010), entitled Essential discreteness in generalized thermostatistics with non-logarithmic entropy, that “continuous Hamiltonian systems with long-range interactions and the so-called q-Gaussian momentum distributions are seen to be outside the scope of non-extensive statistical mechanics”. The arguments are clever and appealing. We show here, however, that some mathematical subtleties render them unconvincing.

  1. Noisy coupled logistic maps in the vicinity of chaos threshold.

    PubMed

    Tirnakli, Ugur; Tsallis, Constantino

    2016-04-01

    We focus on a linear chain of N first-neighbor-coupled logistic maps in the vicinity of their edge of chaos in the presence of a common noise. This model, characterised by the coupling strength ϵ and the noise width σmax, was recently introduced by Pluchino et al. [Phys. Rev. E 87, 022910 (2013)]. They detected, for the time-averaged returns with characteristic return time τ, possible connections with q-Gaussians, the distributions which optimise, under appropriate constraints, the nonadditive entropy Sq, the basis of nonextensive statistical mechanics. Here, we take a closer look at this model, and numerically obtain probability distributions which exhibit a slight asymmetry for some parameter values, at variance with simple q-Gaussians. Nevertheless, along many decades, the fitting with q-Gaussians turns out to be numerically very satisfactory for wide regions of the parameter values, and we illustrate how the index q evolves with (N,τ,ϵ,σmax). This is nevertheless instructive as to how careful one must be in such numerical analyses. The overall work shows that physical and/or biological systems that are correctly mimicked by this model are thermostatistically related to nonextensive statistical mechanics when time-averaged relevant quantities are studied.

  2. NON-EXTENSIVE STATISTICS TO THE COSMOLOGICAL LITHIUM PROBLEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, S. Q.; He, J. J.; Parikh, A.

    Big Bang nucleosynthesis (BBN) theory predicts the abundances of the light elements D, 3He, 4He, and 7Li produced in the early universe. The primordial abundances of D and 4He inferred from observational data are in good agreement with predictions; however, BBN theory overestimates the primordial 7Li abundance by about a factor of three. This is the so-called “cosmological lithium problem.” Solutions to this problem using conventional astrophysics and nuclear physics have not been successful over the past few decades, probably indicating the presence of new physics during the era of BBN. We have investigated the impact on BBN predictions of adopting a generalized distribution to describe the velocities of nucleons in the framework of Tsallis non-extensive statistics. This generalized velocity distribution is characterized by a parameter q, and reduces to the usually assumed Maxwell-Boltzmann distribution for q = 1. We find excellent agreement between predicted and observed primordial abundances of D, 4He, and 7Li for 1.069 ≤ q ≤ 1.082, suggesting a possible new solution to the cosmological lithium problem.

  3. Generalization of symmetric α-stable Lévy distributions for q >1

    NASA Astrophysics Data System (ADS)

    Umarov, Sabir; Tsallis, Constantino; Gell-Mann, Murray; Steinberg, Stanly

    2010-03-01

    The α-stable distributions introduced by Lévy play an important role in probabilistic theoretical studies and their various applications, e.g., in statistical physics, life sciences, and economics. In the present paper we study sequences of long-range dependent random variables whose distributions have asymptotic power-law decay, and which are called (q,α)-stable distributions. These sequences are generalizations of independent and identically distributed α-stable distributions and have not been previously studied. Long-range dependent (q,α)-stable distributions might arise in the description of anomalous processes in nonextensive statistical mechanics, cell biology, and finance. The parameter q controls dependence. If q = 1 then they are classical independent and identically distributed with α-stable Lévy distributions. In the present paper we establish basic properties of (q,α)-stable distributions and generalize the result of Umarov et al. [Milan J. Math. 76, 307 (2008)], where the particular case α = 2, q ∈ [1,3) was considered, to the whole range of stability and nonextensivity parameters α ∈ (0,2] and q ∈ [1,3), respectively. We also discuss possible further extensions of the results that we obtain and formulate some conjectures.

  4. Noisy coupled logistic maps in the vicinity of chaos threshold

    NASA Astrophysics Data System (ADS)

    Tirnakli, Ugur; Tsallis, Constantino

    2016-04-01

    We focus on a linear chain of N first-neighbor-coupled logistic maps in the vicinity of their edge of chaos in the presence of a common noise. This model, characterised by the coupling strength ɛ and the noise width σmax, was recently introduced by Pluchino et al. [Phys. Rev. E 87, 022910 (2013)]. They detected, for the time-averaged returns with characteristic return time τ, possible connections with q-Gaussians, the distributions which optimise, under appropriate constraints, the nonadditive entropy Sq, the basis of nonextensive statistical mechanics. Here, we take a closer look at this model, and numerically obtain probability distributions which exhibit a slight asymmetry for some parameter values, at variance with simple q-Gaussians. Nevertheless, along many decades, the fitting with q-Gaussians turns out to be numerically very satisfactory for wide regions of the parameter values, and we illustrate how the index q evolves with (N, τ, ɛ, σmax). This is nevertheless instructive as to how careful one must be in such numerical analyses. The overall work shows that physical and/or biological systems that are correctly mimicked by this model are thermostatistically related to nonextensive statistical mechanics when time-averaged relevant quantities are studied.

  5. Effects of the non-extensive parameter on the propagation of ion acoustic waves in five-component cometary plasma system

    NASA Astrophysics Data System (ADS)

    Mahmoud, Abeer A.

    2018-01-01

    Some important nonlinear evolution partial differential equations are derived using the reductive perturbation method for an unmagnetized collisionless five-component plasma system. This multi-ion plasma contains negatively and positively charged oxygen ions (heavy ions), positive hydrogen ions (lighter ions), hot electrons of solar origin, and colder electrons of cometary origin. The positive hydrogen ions and the two types of electrons obey q-non-extensive distributions. The derived equations support three types of ion acoustic waves: soliton waves, shock waves and kink waves. The effects of the non-extensive parameters of the hot electrons, the colder electrons and the hydrogen ions on the propagation of the envelope waves are studied. Both compressive and rarefactive shapes of the three envelope waves appear in this system at first order in the nonlinearity strength for different values of the non-extensive parameters. At second order, the nonlinearity becomes stronger and only the compressive type of envelope wave appears.

  6. The existence of electron-acoustic shock waves and their interactions in a non-Maxwellian plasma with q-nonextensive distributed electrons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, Jiu-Ning; He, Yong-Lin; Han, Zhen-Hai

    2013-07-15

    We present a theoretical investigation of the nonlinear interaction between electron-acoustic shock waves in a nonextensive two-electron plasma. The interaction is governed by a pair of Korteweg-de Vries-Burgers equations. We focus on studying the colliding effects on the propagation of shock waves; more specifically, we have studied the effects of the plasma parameters, i.e., the nonextensive parameter q, the “hot” to “cold” electron number density ratio α, and the normalized electron kinematic viscosity η0, on the trajectory changes (phase shifts) of the shock waves. It is found that there are trajectory changes (phase shifts) for both colliding shock waves in the present plasma system. We also note that the nonlinearity has no decisive effect on the trajectory changes; the occurrence of trajectory changes may be due to the combined role played by the dispersion and dissipation of the nonlinear structure. Our theoretical study may be beneficial for understanding the propagation and interaction of nonlinear electrostatic waves and may bring a possibility of developing the nonlinear theory of electron-acoustic waves in astrophysical plasma systems.

  7. On the connection between financial processes with stochastic volatility and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Queirós, S. M. D.; Tsallis, C.

    2005-11-01

    The GARCH algorithm is the most renowned generalisation of Engle's original proposal for modelling returns, the ARCH process. Both cases are characterised by presenting a time-dependent and correlated variance or volatility. Besides a memory parameter, b (present in ARCH), and an independent and identically distributed noise, ω, GARCH involves another parameter, c, such that, for c = 0, the standard ARCH process is reproduced. In this manuscript we use a generalised noise following a distribution characterised by an index qn, such that qn = 1 recovers the Gaussian distribution. Matching low statistical moments of the GARCH distribution for returns with a q-Gaussian distribution obtained through maximising the entropy Sq = (1 - ∑i pi^q)/(q - 1), basis of nonextensive statistical mechanics, we obtain a sole analytical connection between q and (b, c, qn) which turns out to be remarkably good when compared with computational simulations. With this result we also derive an analytical approximation for the stationary distribution of the (squared) volatility. Using a generalised Kullback-Leibler relative entropy form based on Sq, we also analyse the degree of dependence between successive returns, zt and zt+1, of GARCH(1,1) processes. This degree of dependence is quantified by an entropic index, qop. Our analysis points to the existence of a unique relation between the three entropic indexes qop, q and qn of the problem, independent of the value of (b,c).
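
    A minimal numerical sketch of the two central objects quoted above, the entropy Sq and the q-Gaussian that maximizes it, is given below; the q-Gaussian is normalized numerically on a grid, and the parameter values are illustrative rather than taken from the paper.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) (k = 1);
    reduces to the Shannon/BG form -sum_i p_i ln p_i as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def q_gaussian(x, q, beta=1.0):
    """q-Gaussian density ~ [1 - (1 - q) * beta * x^2]_+^{1/(1-q)}, normalized numerically;
    recovers an ordinary Gaussian for q -> 1 and develops power-law tails for 1 < q < 3."""
    if abs(q - 1.0) < 1e-12:
        w = np.exp(-beta * x**2)
    else:
        base = np.maximum(1.0 - (1.0 - q) * beta * x**2, 0.0)
        w = base ** (1.0 / (1.0 - q))
    return w / np.trapz(w, x)

print([round(tsallis_entropy([0.9, 0.1], q), 4) for q in (0.5, 1.0, 1.5)])

x = np.linspace(-20.0, 20.0, 8001)
for q in (1.0, 1.4):                       # illustrative values, not fitted GARCH indices
    pdf = q_gaussian(x, q)
    tail = 2.0 * np.trapz(pdf[x > 3.0], x[x > 3.0])   # symmetric tail weight beyond |x| > 3
    print(f"q = {q:.1f}: P(|x| > 3) = {tail:.5f}")
```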

  8. Non-extensivity and complexity in the earthquake activity at the West Corinth rift (Greece)

    NASA Astrophysics Data System (ADS)

    Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter

    2013-04-01

    Earthquakes exhibit a complex phenomenology that is revealed by their fractal structure in space, time and magnitude. For that reason, tools other than simple Poissonian statistics seem more appropriate for describing the statistical properties of the phenomenon. Here we use Non-Extensive Statistical Physics [NESP] to investigate the inter-event time distribution of the earthquake activity at the west Corinth rift (central Greece). This area is one of the most seismotectonically active areas in Europe, with an important continental N-S extension and high seismicity rates. The NESP concept refers to the non-additive Tsallis entropy Sq that includes the Boltzmann-Gibbs entropy as a particular case. This concept has been successfully used for the analysis of a variety of complex dynamic systems including earthquakes, where fractality and long-range interactions are important. The analysis indicates that the cumulative inter-event time distribution can be successfully described with NESP, implying the complexity that characterizes the temporal occurrence of earthquakes. Further on, we use the Tsallis entropy (Sq) and the Fisher Information Measure (FIM) to investigate the complexity that characterizes the inter-event time distribution through different time windows along the evolution of the seismic activity at the West Corinth rift. The results of this analysis reveal different levels of organization and clustering of the seismic activity in time. Acknowledgments. GM wishes to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).
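
    The q-exponential description of inter-event times mentioned above can be sketched in a few lines: the cumulative (survival) distribution is modeled as P(>τ) = [1 + (q - 1)τ/τ0]^(-1/(q-1)), a form commonly used in the NESP seismicity literature, and (q, τ0) are recovered by least squares from synthetic data. This illustrates the fitting idea only, not the authors' actual procedure or catalogue.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_exp_survival(tau, q, tau0):
    """q-exponential survival function P(>tau) = [1 + (q - 1) * tau / tau0]^(-1/(q-1))."""
    return (1.0 + (q - 1.0) * tau / tau0) ** (-1.0 / (q - 1.0))

rng = np.random.default_rng(0)

# Synthetic inter-event times drawn from that survival function (inverse-CDF sampling)
q_true, tau0_true, n = 1.3, 10.0, 5000
u = rng.uniform(size=n)
tau = tau0_true * (u ** (-(q_true - 1.0)) - 1.0) / (q_true - 1.0)

# Empirical survival function and a bounded least-squares fit of (q, tau0)
tau_sorted = np.sort(tau)
surv = 1.0 - np.arange(1, n + 1) / (n + 1.0)
(q_fit, tau0_fit), _ = curve_fit(q_exp_survival, tau_sorted, surv,
                                 p0=(1.5, 5.0), bounds=([1.001, 0.1], [2.5, 1e3]))
print(f"fitted q = {q_fit:.3f}, tau0 = {tau0_fit:.2f}  (true: q = {q_true}, tau0 = {tau0_true})")
```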

  9. To b or not to b? A nonextensive view of the b-value in the Gutenberg-Richter law.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2014-05-01

    The Gutenberg-Richter (GR) law (Gutenberg and Richter, 1944), one of the cornerstones of modern seismology, has been considered a paradigm of the manifestation of self-organized criticality, since the cumulative number of earthquakes as a function of energy, i.e., the number of earthquakes with energy greater than E, behaves as a power law, with the b value related to the critical exponent. A great number of seismic hazard studies have originated from this law. The Gutenberg-Richter (GR) law is an empirical relationship which recent efforts relate to general physical principles (Kagan and Knopoff, 1981; Wesnousky, 1999; Sarlis et al., 2010; Telesca, 2012; Vallianatos and Sammonds, 2013). Nonextensive statistical mechanics, pioneered by Tsallis (Tsallis, 2009), provides a consistent theoretical framework for the study of complex systems in nonequilibrium stationary states, systems with multifractal and self-similar structures, long-range interacting systems, etc. The Earth is such a system. In the present work we analyze the different pathways (originating in Sotolongo-Costa and Posadas, 2004; Silva et al., 2006) to extract the generalization of the G-R law obtained in the frame of non-extensive statistical physics. We estimate the b-value and we discuss its underlying physics. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project of the "Education & Lifelong Learning" Operational Programme. References Gutenberg, B. and C. F. Richter (1944). Bull. Seismol. Soc. Am. 34, 185-188. Kagan, Y. Y. and L. Knopoff (1981). J. Geophys. Res. 86, 2853-2862. Sarlis, N., E. Skordas and P. Varotsos (2010). Physical Review E - Statistical, Nonlinear, and Soft Matter Physics 82 (2), 021110. Silva, R., G. Franca, C. Vilar and J. Alcaniz (2006). Phys. Rev. E, 73, 026102. Sotolongo-Costa, O. and A. Posadas (2004). Phys. Rev. Lett., 92, 048501. Telesca, L. (2012). Bull. Seismol. Soc. Amer., 102, 886-891. Tsallis, C. (2009). Introduction to Nonextensive Statistical Mechanics, Approaching a Complex World, Springer, New York. Vallianatos, F. and P. Sammonds (2013). Tectonophysics 590, 52-58. Wesnousky, S. G. (1999). Bull. Seismol. Soc. Am. 89, 1131-1137.

  10. Accelerated cosmos in a nonextensive setup

    NASA Astrophysics Data System (ADS)

    Moradpour, H.; Bonilla, Alexander; Abreu, Everton M. C.; Neto, Jorge Ananias

    2017-12-01

    Here we consider a flat FRW universe whose horizon entropy meets the Rényi entropy of nonextensive systems. In our model, the ordinary energy-momentum conservation law is not always valid. By applying the Clausius relation as well as the Cai-Kim temperature to the apparent horizon of a flat FRW universe, we obtain modified Friedmann equations. Fitting the model to the observational data on the current accelerated universe, some values for the model parameters are also addressed. Our study shows that the current accelerating phase of universe expansion may be described by a geometrical fluid, originated from the nonextensive aspects of geometry, which models a varying dark energy source interacting with the matter field in the Rastall way. Moreover, our results indicate that the probable nonextensive features of spacetime may also be used to model a varying dark energy source which does not interact with the matter field and is compatible with the current accelerated phase of the Universe.

  11. Derivative pricing with non-linear Fokker-Planck dynamics

    NASA Astrophysics Data System (ADS)

    Michael, Fredrick; Johnson, M. D.

    2003-06-01

    We examine how the Black-Scholes derivative pricing formula is modified when the underlying security obeys non-extensive statistics and Fokker-Planck dynamics. An unusual feature of such securities is that the volatility in the underlying Ito-Langevin equation depends implicitly on the actual market rate of return. This complicates most approaches to valuation. Here we show that progress is possible using variations of the Cox-Ross valuation technique.

  12. The foreign exchange market: return distributions, multifractality, anomalous multifractality and the Epps effect

    NASA Astrophysics Data System (ADS)

    Drożdż, Stanisław; Kwapień, Jarosław; Oświȩcimka, Paweł; Rak, Rafał

    2010-10-01

    We present a systematic study of various statistical characteristics of high-frequency returns from the foreign exchange market. This study is based on six exchange rates forming two triangles: EUR-GBP-USD and GBP-CHF-JPY. It is shown that the exchange rate return fluctuations for all of the pairs considered are well described by the non-extensive statistics in terms of q-Gaussians. There exist some small quantitative variations in the non-extensivity q-parameter values for different exchange rates (which depend also on the time scales studied), and this can be related to the importance of a given exchange rate in the world's currency trade. Temporal correlations organize the series of returns such that they develop the multifractal characteristics for all of the exchange rates, with a varying degree of symmetry of the singularity spectrum f(α), however. The most symmetric spectrum is identified for the GBP/USD. We also form time series of triangular residual returns and find that the distributions of their fluctuations develop disproportionately heavier tails as compared to small fluctuations, which excludes description in terms of q-Gaussians. The multifractal characteristics of these residual returns reveal such anomalous properties as negative singularity exponents and even negative singularity spectra. Such anomalous multifractal measures have so far been considered in the literature in connection with diffusion-limited aggregation and with turbulence. Studying the cross-correlations among different exchange rates, we found that market inefficiency on short time scales leads to the occurrence of the Epps effect on much longer time scales, but comparable to the ones for the stock market. Although the currency market is much more liquid than the stock markets and has a much greater transaction frequency, the building up of correlations takes up to several hours—a duration that does not differ much from what is observed in the stock markets. This may suggest that non-synchronicity of transactions is not the unique source of the observed effect.

  13. Vertical sizes of 1-D and 2-D electrostatic solitons with nonextensive and trapped electrons in the upper ionosphere

    NASA Astrophysics Data System (ADS)

    Ali Shan, Shaukat; Saleem, Hamid

    2018-05-01

    The vertical sizes of one-dimensional (1-D) and two dimensional (2-D) electrostatic solitons are estimated in the oxygen-hydrogen (O - H) and pure oxygen plasmas of the upper ionosphere taking into account the effects of non-extensive and trapped electrons. The field-aligned flow of oxygen ions is also considered. It is found that both electron trapping and non-extensivity play a constructive role in the formation of 1-D and 2-D solitary structures. The vertical size of the solitons is not known through observations, but here it is pointed out that the vertical size of these structures should be of the order of a few meters at the altitude of 800 km in the 1-D case. On the other hand, in the 2-D case, the vertical size is much larger than the horizontal size and it turns out to be of the order of a few kilometers, while the width is about a few hundred meters in agreement with the observations.

  14. Thermodynamic geometry for a non-extensive ideal gas

    NASA Astrophysics Data System (ADS)

    López, J. L.; Obregón, O.; Torres-Arenas, J.

    2018-05-01

    A generalized entropy arising in the context of superstatistics is applied to an ideal gas. The curvature scalar associated to the thermodynamic space generated by this modified entropy is calculated using two formalisms of the geometric approach to thermodynamics. By means of the curvature/interaction hypothesis of the geometric approach to thermodynamic geometry it is found that as a consequence of considering a generalized statistics, an effective interaction arises but the interaction is not enough to generate a phase transition. This generalized entropy seems to be relevant in confinement or in systems with not so many degrees of freedom, so it could be interesting to use such entropies to characterize the thermodynamics of small systems.

  15. Relativistic H-theorem and nonextensive kinetic theory

    NASA Astrophysics Data System (ADS)

    Silva, R.; Lima, J. A. S.

    2003-08-01

    In 1988 Tsallis proposed a striking generalization of the Boltzmann-Gibbs entropy functional form, given by Sq = kB (1 - ∑i pi^q)/(q - 1) [1], where kB is Boltzmann's constant, pi is the probability of the i-th microstate, and the parameter q is any real number. Nowadays, the q-thermostatistics associated with Sq is being hailed as the possible basis of a theoretical framework appropriate to deal with nonextensive settings. There is a growing body of evidence suggesting that Sq provides a convenient frame for the thermostatistical analysis of many physical systems and processes ranging from the laboratory scale to the astrophysical domain [2]. However, all the basic results, including the proof of the H-theorem, have been worked out in the classical non-relativistic domain [3]. In this context we discuss the relativistic kinetic foundations of Tsallis' nonextensive approach through the full Boltzmann transport equation. Our analysis follows from a nonextensive generalization of the "molecular chaos hypothesis". For q > 0, the q-transport equation satisfies a relativistic H-theorem based on the Tsallis entropy. It is also proved that the collisional equilibrium is given by the relativistic Tsallis q-nonextensive velocity distribution. References [1] C. Tsallis, J. Stat. Phys. 52, 479 (1988). [2] J. A. S. Lima, R. Silva, and J. Santos, Astron. and Astrophys. 396, 309 (2002). [3] J. A. S. Lima, R. Silva, and A. R. Plastino, Phys. Rev. Lett. 86, 2938 (2001).

  16. Strong evidences for a nonextensive behavior of the rotation period in open clusters

    NASA Astrophysics Data System (ADS)

    de Freitas, D. B.; Nepomuceno, M. M. F.; Soares, B. B.; Silva, J. R. P.

    2014-11-01

    Time-dependent nonextensivity in a stellar astrophysical scenario combines the nonextensive entropic indices qK, derived from the modified Kawaler parametrization, and q, obtained from the rotational velocity distribution. These q's are related through a single heuristic relation given by q ≈ q0(1 - Δt/qK), where t is the cluster age. In a nonextensive scenario, these indices are quantities that measure the degree of nonextensivity present in the system. Recent studies reveal that the index q is correlated with the formation rate of high-energy tails present in the distribution of rotation velocity. On the other hand, the index qK is determined by the stellar rotation-age relationship. This depends on the magnetic-field configuration through the expression qK = 1 + 4aN/3, where a and N denote the saturation level of the star's magnetic field and its topology, respectively. In the present study, we show that the connection q-qK is also consistent with 548 rotation period data for single main-sequence stars in 11 open clusters aged less than 1 Gyr. The value of qK ≈ 2.5 from our unsaturated model shows that the mean magnetic-field topology of these stars is slightly more complex than a purely radial field. Our results also suggest that the stellar rotational braking behavior affects the degree of anti-correlation between q and the cluster age t. Finally, we suggest that stellar magnetic braking can be scaled by the entropic index q.
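
    The two relations quoted above are simple enough to evaluate directly; the sketch below does so for an illustrative (a, N) pair chosen to reproduce qK ≈ 2.5 and for placeholder values of q0, Δ and t (none of these numbers are taken from the paper).

```python
def q_kawaler(a, N):
    """Entropic index from the modified Kawaler parametrization, q_K = 1 + 4*a*N/3
    (a: saturation level of the stellar magnetic field, N: field topology)."""
    return 1.0 + 4.0 * a * N / 3.0

def q_of_age(t, q0, delta, q_K):
    """Heuristic relation q ~ q0 * (1 - delta * t / q_K) quoted above;
    q0 and delta are illustrative placeholders, not values from the paper."""
    return q0 * (1.0 - delta * t / q_K)

q_K = q_kawaler(a=1.0, N=1.125)          # one (a, N) choice giving q_K = 2.5
for t in (0.1, 0.5, 1.0):                # cluster ages in Gyr (illustrative)
    print(f"t = {t:.1f} Gyr -> q ~ {q_of_age(t, q0=1.5, delta=0.3, q_K=q_K):.3f}")
```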

  17. Text mining by Tsallis entropy

    NASA Astrophysics Data System (ADS)

    Jamaati, Maryam; Mehri, Ali

    2018-01-01

    Long-range correlations between the elements of natural languages enable them to convey very complex information. The complex structure of human language, as a manifestation of natural languages, motivates us to apply nonextensive statistical mechanics to text mining. Tsallis entropy appropriately ranks the terms' relevance to the document subject, taking advantage of their spatial correlation length. We apply this statistical concept as a new, powerful word-ranking metric in order to extract the keywords of a single document. We carry out an experimental evaluation, which shows the capability of the presented method for keyword extraction. We find that Tsallis entropy has reliable word-ranking performance, at the same level as the best previous ranking methods.
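
    As a toy illustration of entropy-based keyword ranking (not the correlation-length metric of the paper itself), the sketch below splits a document into equal segments and scores each frequent word by the negative Tsallis entropy of its occurrence distribution over segments, so that words concentrated in a few segments rank higher than uniformly spread function words.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy (k = 1): (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def rank_words(text, q=1.5, n_chunks=20, min_count=3):
    """Toy keyword scorer: lower positional Tsallis entropy -> more clustered -> higher score.
    Illustrative only; NOT the spatial-correlation-length metric of the cited paper."""
    words = text.lower().split()
    chunk_of = np.minimum((np.arange(len(words)) * n_chunks) // len(words), n_chunks - 1)
    positions = {}
    for w, c in zip(words, chunk_of):
        positions.setdefault(w, []).append(int(c))
    scores = {}
    for w, chunks in positions.items():
        if len(chunks) < min_count:
            continue
        counts = np.bincount(chunks, minlength=n_chunks)
        scores[w] = -tsallis_entropy(counts / counts.sum(), q)
    return sorted(scores, key=scores.get, reverse=True)

sample = "entropy " * 30 + "the plasma soliton wave " * 40 + "entropy keyword " * 10
print(rank_words(sample, q=1.5)[:5])
```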

  18. On statistical properties of traded volume in financial markets

    NASA Astrophysics Data System (ADS)

    de Souza, J.; Moyano, L. G.; Duarte Queirós, S. M.

    2006-03-01

    In this article we study the dependence degree of the traded volume of the Dow Jones 30 constituent equities by using a nonextensive generalised form of the Kullback-Leibler information measure. Our results show a slow decay of the dependence degree as a function of the lag. This feature is compatible with the existence of non-linearities in this type of time series. In addition, we introduce a dynamical mechanism whose associated stationary probability density function (PDF) presents good agreement with the empirical results.

  19. Solubility of gas in confined systems. Nonextensive thermodynamics approach.

    PubMed

    Letellier, Pierre; Turmine, Mireille

    2013-02-15

    The use of the concepts of nonextensive thermodynamics allows us to reconsider the equilibrium of bubble solubilization and, more generally, of gaseous aggregates in gas-supersaturated solutions. The relations introduced are general and include as particular cases the equations usually used to describe these phenomena. These equations are discussed. In particular, we specify the domain of application of Kelvin's relation, which is illustrated by the solubility of gases in fogs and clouds. Various lines of thought on the behavior of gaseous aggregates and nano-systems are proposed. Thus, the introduced relations make it possible to consider the presence of gaseous aggregates in equilibrium with the solution even for under-saturated solutions. Nonextensive thermodynamics admits the notion of negative pressure in the interior of confined phases (solid or liquid).

  20. Comparing emerging and mature markets during times of crises: A non-extensive statistical approach

    NASA Astrophysics Data System (ADS)

    Namaki, A.; Koohi Lai, Z.; Jafari, G. R.; Raei, R.; Tehrani, R.

    2013-07-01

    One of the important issues in finance and economics for both scholars and practitioners is to describe the behavior of markets, especially during times of crises. In this paper, we analyze the behavior of some mature and emerging markets within a Tsallis entropy framework, a non-extensive statistical approach based on non-linear dynamics. During the past decade, this technique has been successfully applied to a considerable number of complex systems, such as stock markets, in order to describe the non-Gaussian behavior of these systems. In this approach, there is a parameter q, which is a measure of deviation from Gaussianity and has proved to be a good index for detecting crises. We investigate the behavior of this parameter at different time scales for the market indices. It can be seen that the pattern of q for mature markets differs from that for emerging markets. The findings show the robustness of this approach for following market conditions over time. It is obvious that, in times of crises, q is much greater than at other times. In addition, the response of emerging markets to global events is delayed compared to that of mature markets, and tends towards a Gaussian profile on increasing the scale. This approach could be very useful in application to risk and portfolio management in order to detect crises by following the parameter q at different time scales.

  1. Dust Ion-Acoustic Shock Waves in a Multicomponent Magnetorotating Plasma

    NASA Astrophysics Data System (ADS)

    Kaur, Barjinder; Saini, N. S.

    2018-02-01

    The nonlinear properties of dust ion-acoustic (DIA) shock waves in a magnetorotating plasma consisting of inertial ions, nonextensive electrons and positrons, and immobile negatively charged dust are examined. The effects of dust charge fluctuations are not included in the present investigation, but the ion kinematic viscosity (collisions) is a source of dissipation, leading to the formation of stable shock structures. The Zakharov-Kuznetsov-Burgers (ZKB) equation is derived using the reductive perturbation technique, and from its solution the effects of different physical parameters, i.e. nonextensivity of electrons and positrons, kinematic viscosity, rotational frequency, and positron and dust concentrations, on the characteristics of shock waves are examined. It is observed that physical parameters play a very crucial role in the formation of DIA shocks. This study could be useful in understanding the electrostatic excitations in dusty plasmas in space (e.g. interstellar medium).

  2. Dust ion acoustic freak waves in a plasma with two temperature electrons featuring Tsallis distribution

    NASA Astrophysics Data System (ADS)

    Chahal, Balwinder Singh; Singh, Manpreet; Shalini; Saini, N. S.

    2018-02-01

    We present an investigation of the nonlinear dust ion acoustic wave modulation in a plasma composed of charged dust grains, two-temperature (cold and hot) nonextensive electrons, and ions. For this purpose, the multiscale reductive perturbation technique is used to obtain a nonlinear Schrödinger equation. The critical wave number, which indicates where the modulational instability sets in, has been determined precisely for various regimes. The influence of the plasma background nonextensivity on the growth rate of the modulational instability is discussed. Modulated wavepackets in the form of either bright- or dark-type envelope solitons may exist. The formation of rogue waves from bright envelope solitons is also discussed. The investigation indicates that the structural characteristics of these envelope excitations (width, amplitude) are significantly affected by nonextensivity, dust concentration, the cold electron-ion density ratio and the temperature ratio.

  3. Thermostatistically approaching living systems: Boltzmann Gibbs or nonextensive statistical mechanics?

    NASA Astrophysics Data System (ADS)

    Tsallis, Constantino

    2006-03-01

    Boltzmann-Gibbs (BG) statistical mechanics has, for well over a century, been successfully used for many nonlinear dynamical systems which, in one way or another, exhibit strong chaos. A typical case is a classical many-body short-range-interacting Hamiltonian system (e.g., the Lennard-Jones model for a real gas at moderately high temperature). Its Lyapunov spectrum (which characterizes the sensitivity to initial conditions) includes positive values. This leads to ergodicity, the stationary state being thermal equilibrium, hence standard applicability of the BG theory is verified. The situation appears to be of a different nature for various phenomena occurring in living organisms. Indeed, such systems exhibit a complexity which does not really accommodate this standard dynamical behavior. Life appears to emerge and evolve in a kind of delicate situation, at the frontier between large order (low adaptability and long memory; typically characterized by regular dynamics, hence only nonpositive Lyapunov exponents) and large disorder (high adaptability and short memory; typically characterized by strong chaos, hence at least one positive Lyapunov exponent). Along this frontier, the maximal relevant Lyapunov exponents are either zero or close to it, characterizing what is currently referred to as weak chaos. This type of situation is shared by a great variety of similar complex phenomena in economics and linguistics, to cite but a few. BG statistical mechanics is built upon the entropy SBG = -k ∑i pi ln pi. A generalization of this form, Sq = k(1 - ∑i pi^q)/(q - 1) (with S1 = SBG), was proposed in 1988 as a basis for formulating what is nowadays called nonextensive statistical mechanics. This theory appears to be particularly adapted to nonlinear dynamical systems exhibiting, precisely, weak chaos. Here, we briefly review the theory, its dynamical foundation, its applications in a variety of disciplines (with special emphasis on living systems), and its connections with the ubiquitous scale-free networks.
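
    A short worked check, using only the two entropy expressions quoted above together with ∑i pi = 1, shows how the generalized form recovers the BG entropy in the limit q → 1:

```latex
% Expand p_i^q = p_i e^{(q-1)\ln p_i} \approx p_i [1 + (q-1)\ln p_i] for q near 1:
\begin{aligned}
S_q &= k\,\frac{1 - \sum_i p_i^q}{q-1}
     = k\,\frac{1 - \sum_i p_i\left[1 + (q-1)\ln p_i\right] + O\big((q-1)^2\big)}{q-1} \\
    &= k\,\frac{-(q-1)\sum_i p_i \ln p_i + O\big((q-1)^2\big)}{q-1}
     \;\xrightarrow[q\to 1]{}\; -k\sum_i p_i \ln p_i = S_{BG}.
\end{aligned}
```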

  4. PREFACE: Mathematical Aspects of Generalized Entropies and their Applications

    NASA Astrophysics Data System (ADS)

    Suyari, Hiroki; Ohara, Atsumi; Wada, Tatsuaki

    2010-01-01

    Amid the recent increasing interest in power-law behaviors beyond the usual exponential ones, there have been concrete attempts in statistical physics to generalize the standard Boltzmann-Gibbs statistics. Among such generalizations, nonextensive statistical mechanics has been well studied for about the last two decades with many modifications and refinements. The generalization has provided not only a theoretical framework but also many applications such as chaos, multi-fractals, complex systems, nonequilibrium statistical mechanics, biophysics, econophysics, information theory and so on. At the same time as the developments in the generalization of statistical mechanics, the corresponding mathematical structures have also been required and uncovered. In particular, some deep connections to mathematical sciences such as q-analysis, information geometry, information theory and quantum probability theory have been revealed recently. These results obviously indicate the existence of a generalized mathematical structure including the mathematical framework for the exponential family as a special case, but the whole structure is still unclear. In order to provide an opportunity to discuss the mathematical structure induced from generalized entropies by scientists in many fields, the international workshop 'Mathematical Aspects of Generalized Entropies and their Applications' was held on 7-9 July 2009 at Kyoto TERRSA, Kyoto, Japan. This volume is the proceedings of the workshop, which consisted of 6 invited speakers, 14 oral presenters, 7 poster presenters and 63 other participants. The topics of the workshop cover nonextensive statistical mechanics, chaos, cosmology, information geometry, divergence theory, econophysics, materials engineering, molecular dynamics and entropy theory, information theory and so on. The workshop was organized as the first attempt to discuss these mathematical aspects with leading experts in each area. We would like to express special thanks to all the invited speakers, the contributors and the participants at the workshop. We are also grateful to RIMS (Research Institute for Mathematical Science) at Kyoto University and the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (B), 18300003, 2009 for their support. Organizing Committee Editors of the Proceedings Hiroki Suyari (Chiba University, Japan) Atsumi Ohara (Osaka University, Japan) Tatsuaki Wada (Ibaraki University, Japan) Conference photograph

  5. Landau damping of dust acoustic waves in the presence of hybrid nonthermal nonextensive electrons

    NASA Astrophysics Data System (ADS)

    El-Taibany, W. F.; Zedan, N. A.; Taha, R. M.

    2018-06-01

    Based on kinetic theory, the Landau damping of dust acoustic waves (DAWs) propagating in a dusty plasma composed of hybrid nonthermal nonextensive distributed electrons, Maxwellian distributed ions and negatively charged dust grains is investigated using the Vlasov-Poisson equations. The characteristics of the Landau damping of DAWs are discussed. It is found that the wave frequency increases by decreasing (increasing) the value of the nonextensive (nonthermal) parameter q (α). It is recognized that α plays a significant role in whether damped or growing DAW oscillations are observed. For small values of α, damped modes are observed until a certain value of α is reached at which ωi vanishes; beyond it, a growing mode appears in the case of superextensive electrons. However, only damped DAW modes are observed in the case of subextensive electrons. The present study is useful in space situations where such distributions exist.

  6. Nonextensive GES instability with nonlinear pressure effects

    NASA Astrophysics Data System (ADS)

    Gohain, Munmi; Karmakar, Pralay Kumar

    2018-03-01

    We herein analyze the instability dynamics associated with the nonextensive nonthermal gravito-electrostatic sheath (GES) model for the perturbed solar plasma portraiture. The usual neutral gas approximation is herewith judiciously relaxed and the laboratory plasma-wall interaction physics is procedurally incorporated amid barotropic nonlinearity. The main motivation here stems from the true nature of the solar plasma system as a set of concentric nonlocal nonthermal sub-layers, as evidenced by different multi-space satellite probes and missions. The formalism couples the solar interior plasma (SIP, bounded) and the solar wind plasma (SWP, unbounded) via the diffused solar surface boundary (SSB) formed due to an exact long-range gravito-electrostatic force equilibration. A linear normal mode ansatz reveals both dispersive and non-dispersive features of the modified GES collective wave excitations. It is seen that the thermostatistical GES stability depends solely on the electron-to-ion temperature ratio. The damping behavior on both scales is more pronounced in the acoustic domain, K → ∞, than in the gravitational domain, K → 0, where K is the Jeans-normalized angular wave number. It offers a unique quasi-linear coupling of the gravitational and acoustic fluctuations amid the GES force action. The results may be useful for examining the excitation dynamics of natural normal modes in bounded nonextensive astero-environs from a new viewpoint of the plasma-wall coupling mechanism.

  7. Super-soliton dust-acoustic waves in four-component dusty plasma using non-extensive electrons and ions distributions

    NASA Astrophysics Data System (ADS)

    El-Wakil, S. A.; Abulwafa, Essam M.; Elhanbaly, Atalla A.

    2017-07-01

    Based on the Sagdeev pseudo-potential and phase-portrait approaches, the dynamics of a four-component dusty plasma with non-extensively distributed electrons and ions are investigated. Three distinct types of nonlinear waves, namely solitons, double layers, and super-solitons, have been found. The basic features of such waves are highly sensitive to the Mach number, the non-extensive parameter, and the dust temperature ratio. It is found that a multi-component plasma is a necessary condition for the existence of super-solitons, which have a larger amplitude and a greater width than regular solitons. Super-solitons may also exist when the Sagdeev pseudo-potential curves admit at least four extrema and two roots. In our multi-component plasma system, the super-solitons can be found by increasing the Mach number and the non-extensive parameter beyond those of double layers. On the contrary, a super-soliton can be produced by decreasing the dust temperature ratio. The conditions for the onset of such nonlinear waves and their merging into regular solitons have been studied. This work shows that the obtained nonlinear waves exist only in the supersonic Mach number regime. The obtained results may be of wide relevance in the field of space plasmas and may also be helpful for better understanding the nonlinear fluctuations in the auroral zone of the Earth's magnetosphere.
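
    The extrema/roots criterion mentioned above lends itself to a simple numerical check; the sketch below counts sign-change roots and extrema of an arbitrary pseudo-potential sampled on a grid. The polynomial used here is purely illustrative and is not the Sagdeev potential of the four-component plasma model; note also that tangent (double) roots, as in double layers, are not detected by sign changes.

```python
import numpy as np

def count_roots_and_extrema(V, phi):
    """Count sign-change roots of V(phi) and of dV/dphi (extrema) on a grid --
    the kind of bookkeeping behind the 'at least four extrema and two roots'
    criterion for super-solitons quoted above."""
    roots = int(np.count_nonzero(np.diff(np.sign(V)) != 0))
    dV = np.gradient(V, phi)
    extrema = int(np.count_nonzero(np.diff(np.sign(dV)) != 0))
    return roots, extrema

# Purely illustrative sixth-order polynomial standing in for a Sagdeev pseudo-potential
phi = np.linspace(-0.05, 1.05, 20001)
V = -50.0 * phi**2 * (phi - 1.0) * (phi - 0.4)**2 * (phi - 0.8)
roots, extrema = count_roots_and_extrema(V, phi)
print(f"sign-change roots = {roots}, extrema = {extrema}")
```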

  8. A non extensive statistical physics analysis of the Hellenic subduction zone seismicity

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Papadakis, G.; Michas, G.; Sammonds, P.

    2012-04-01

    The Hellenic subduction zone is the most seismically active region in Europe [Becker & Meier, 2010]. The spatial and temporal distribution of seismicity, as well as the magnitude distribution of earthquakes, in the Hellenic subduction zone has been studied using the concept of Non-Extensive Statistical Physics (NESP) [Tsallis, 1988; Tsallis, 2009]. Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, seems a suitable framework for studying complex systems (Vallianatos, 2011). Using this concept, Abe & Suzuki (2003; 2005) investigated the spatial and temporal properties of the seismicity in California and Japan, and recently Darooneh & Dadashinia (2008) in Iran. Furthermore, Telesca (2011) calculated the thermodynamic parameter q of the magnitude distribution of earthquakes of the southern California earthquake catalogue. Using the external seismic zones of 36 seismic sources of shallow earthquakes in the Aegean and the surrounding area [Papazachos, 1990], we formed a dataset concerning the seismicity of shallow earthquakes (focal depth ≤ 60 km) of the subduction zone, which is based on the instrumental data of the Geodynamic Institute of the National Observatory of Athens (http://www.gein.noa.gr/, period 1990-2011). The catalogue consists of 12800 seismic events which correspond to 15 polygons of the aforementioned external seismic zones. These polygons define the subduction zone, as they are associated with the compressional stress field which characterizes a subducting regime. For each event, moment magnitude was calculated from ML according to the suggestions of Papazachos et al. (1997). The cumulative distribution functions of the inter-event times and the inter-event distances, as well as the magnitude distribution for each seismic zone, have been estimated, presenting a variation in the q-triplet along the Hellenic subduction zone. The models used fit the observed distributions rather well, implying the complexity of the spatiotemporal properties of seismicity and the usefulness of NESP in investigating such phenomena, which exhibit a scale-free nature and long-range memory effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).

  9. Exact solutions and phenomenological constraints from massive scalars in a gravity's rainbow spacetime

    NASA Astrophysics Data System (ADS)

    Bezerra, V. B.; Christiansen, H. R.; Cunha, M. S.; Muniz, C. R.

    2017-07-01

    We obtain the exact (confluent Heun) solutions to the massive scalar field in a gravity's rainbow Schwarzschild metric. With these solutions at hand, we study the Hawking radiation resulting from the tunneling rate through the event horizon. We show that the emission spectrum obeys nonextensive statistics and is halted when a certain mass remnant is reached. Next, we infer constraints on the rainbow parameters from recent LHC particle physics experiments and Hubble STIS astrophysics measurements. Finally, we study the low frequency limit in order to find the modified energy spectrum around the source.

  10. Dimensional crossover in fragmentation

    NASA Astrophysics Data System (ADS)

    Sotolongo-Costa, Oscar; Rodriguez, Arezky H.; Rodgers, G. J.

    2000-11-01

    Experiments in which thick clay plates and glass rods are fractured have revealed different behavior of fragment mass distribution function in the small and large fragment regions. In this paper we explain this behavior using non-extensive Tsallis statistics and show how the crossover between the two regions is caused by the change in the fragments’ dimensionality during the fracture process. We obtain a physical criterion for the position of this crossover and an expression for the change in the power-law exponent between the small and large fragment regions. These predictions are in good agreement with the experiments on thick clay plates.

  11. Driven phase space vortices in plasmas with nonextensive velocity distribution

    NASA Astrophysics Data System (ADS)

    Trivedi, Pallavi; Ganesh, Rajaraman

    2017-03-01

    The evolution of chirp-driven electrostatic waves in unmagnetized plasmas is numerically investigated by using a one-dimensional (1D) Vlasov-Poisson solver with periodic boundary conditions. The initial velocity distribution of the 1D plasma is assumed to be governed by the nonextensive q distribution [C. Tsallis, J. Stat. Phys. 52, 479 (1988)]. For an infinitesimal amplitude of an external drive, we investigate the effects of chirp-driven dynamics that lead to the formation of giant phase space vortices (PSV) for both Maxwellian (q = 1) and non-Maxwellian (q ≠ 1) plasmas. For non-Maxwellian plasmas, the formation of giant PSV with multiple extrema and phase velocities is shown to depend on the strength of "q". Novel features such as "shark"-like and transient "honeycomb"-like structures in phase space are discussed. Wherever relevant, we compare our results with previous work.
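
    For readers who want to reproduce the kind of initial condition used in such Vlasov-Poisson studies, the minimal sketch below evaluates a 1D q-nonextensive velocity distribution and checks its normalization numerically. The functional form and sign convention follow one common plasma-physics usage and are assumptions here, not necessarily the exact prescription of the authors.

      import numpy as np

      def f_q(v, q=0.8, v_t=1.0):
          """Unnormalized 1D q-nonextensive velocity distribution.

          Assumed convention: f(v) ~ [1 - (q-1) v^2 / (2 v_t^2)]^(1/(q-1));
          q -> 1 recovers the Maxwellian exp(-v^2 / (2 v_t^2)).
          For q > 1 the bracket must stay non-negative (velocity cutoff).
          """
          if abs(q - 1.0) < 1e-8:
              return np.exp(-v**2 / (2.0 * v_t**2))
          base = 1.0 - (q - 1.0) * v**2 / (2.0 * v_t**2)
          return np.where(base > 0.0, base, 0.0) ** (1.0 / (q - 1.0))

      v = np.linspace(-10.0, 10.0, 4001)
      dv = v[1] - v[0]
      for q in (0.8, 1.0, 1.2):
          fv = f_q(v, q)
          norm = fv.sum() * dv               # numerical normalization constant
          print(f"q = {q}: integral of unnormalized f = {norm:.4f}")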

  12. Ion acoustic solitons in an electronegative plasma with electron trapping and nonextensivity effects

    NASA Astrophysics Data System (ADS)

    Ali Shan, S.

    2018-03-01

    The impact of electron trapping and nonextensivity on low frequency ion acoustic solitary waves in an electronegative plasma is investigated. The energy integral equation is derived with the Sagdeev truncated approach and is then solved with the help of suitable parameters and the necessary conditions to obtain the solitary structures. The minimum Mach number (M) needed for the solitary structures is found to vary under the impact of the trapping-efficiency factor β and the entropic index q. The results have been illustrated with the help of physically acceptable parameters, and the amplitude of the nonlinear solitary structures is found to be modified significantly by the electron trapping efficiency β and the entropic index q. This study has been made with reference to laboratory observations and can also be helpful for space and astrophysical plasmas where electronegative plasmas have been reported.

  13. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    NASA Astrophysics Data System (ADS)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey, for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction, with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP, and that it has attributes of universality, as it holds for a broad range of spatial, temporal and magnitude scales. Provided that the multivariate empirical frequency distributions are based on a sufficient number of observations as an empirical lower limit, the results are stable and consistent with established knowledge, irrespective of the magnitude and spatio-temporal range of the earthquake catalogue, or of operations pertaining to re-sampling, bootstrapping or re-arrangement of the catalogue. It is also demonstrated that the expression of the regional active tectonic grain may comprise a mixture of processes significantly dependent on Δd. The analysis of the size (energy) distribution of earthquakes yielded results consistent with a correlated sub-extensive system; the results are also consistent with conventional determinations of Frequency-Magnitude distributions. The analysis of interevent times has determined the existence of sub-extensivity and near-field interaction (correlation) in the complete catalogue of Greek and western Turkish seismicity (mixed background earthquake activity and aftershock processes), as well as in the pure background process (declustered catalogue). This could be attributed to the joint effect of near-field interaction between neighbouring earthquakes or seismic areas and interaction within aftershock sequences. The background process appears to be moderately to weakly correlated in the far field. Formal random temporal processes have not been detected. 
A general inference supported by the above observations is that aftershock sequences may be an integral part of the seismogenetic process, as they appear to partake in long-range interaction. A formal explanation of such an effect is pending, but may nevertheless involve delayed remote triggering of seismic activity by (transient or static) stress transfer from the main shocks and large aftershocks and/or the cascading effects already discussed by Marsan and Lengliné (2008). In this view, the effect weakens when aftershocks are removed because aftershocks are the link between the main shocks and their remote offshoots. Overall, the above results compare well with those for Northern California seismicity, which have shown that the expression of seismicity in Northern California is generally consistent with non-extensive (sub-extensive) thermodynamics. Acknowledgments: This work was supported by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project "Integrated understanding of Seismicity, using innovative methodologies of Fracture Mechanics along with Earthquake and Non-Extensive Statistical Physics - Application to the geodynamic system of the Hellenic Arc - SEISMO FEAR HELLARC". References: Tzanis A., Vallianatos F., Efstathiou A., Multidimensional earthquake frequency distributions consistent with Non-Extensive Statistical Physics: the interdependence of magnitude, interevent time and interevent distance in North California. Bulletin of the Geological Society of Greece, vol. XLVII, 2013. Proceedings of the 13th International Congress, Chania, Sept. 2013. Tzanis A., Vallianatos F., Efstathiou A., Generalized multidimensional earthquake frequency distributions consistent with Non-Extensive Statistical Physics: An appraisal of the universality in the interdependence of magnitude, interevent time and interevent distance, Geophysical Research Abstracts, Vol. 15, EGU2013-628, 2013, EGU General Assembly 2013. Marsan, D. and Lengliné, O., 2008. Extending earthquakes' reach through cascading, Science, 319, 1076; doi: 10.1126/science.1148783. On-line Bulletin, http://www.isc.ac.uk, Internatl. Seis. Cent., Thatcham, United Kingdom, 2011.

  14. Worldwide seismicity in view of non-extensive statistical physics

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kaliopi; Vallianatos, Filippos; Michas, George

    2014-05-01

    In the present work we study the distribution of worldwide shallow seismic events that occurred from 1981 to 2011, extracted from the CMT catalog, with magnitude equal to or greater than Mw 5.0. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. To this end we use the Flinn-Engdahl regionalization (Flinn and Engdahl, 1965), which consists of 50 seismic zones, as modified by Lombardi and Marzocchi (2007), who grouped the 50 FE zones into larger tectonically homogeneous ones utilizing the cumulative moment tensor method. As a result, Lombardi and Marzocchi (2007) reduced the initial 50 regions to 39, in which we apply the non-extensive statistical physics approach. Non-extensive statistical physics seems to be the most adequate and promising methodological tool for analyzing complex systems, such as the Earth's interior. In this frame, we introduce the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy as defined by Tsallis (1988). In the present work we analyze the interevent time distribution between successive earthquakes by a q-exponential function in each of the seismic zones defined by Lombardi and Marzocchi (2007), confirming the importance of long-range interactions and the existence of a power-law approximation in the distribution of the interevent times. Our findings support the idea of universality within the Tsallis approach to describe Earth's seismicity and present strong evidence of temporal clustering of seismic activity in each of the tectonic zones analyzed. Our analysis as applied to worldwide seismicity with magnitude equal to or greater than Mw 5.5 and 6.0 is also presented, and the dependence of our results on the cut-off magnitude is discussed. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project of the "Education & Lifelong Learning" Operational Programme.

  15. q-deformed Einstein's model to describe specific heat of solid

    NASA Astrophysics Data System (ADS)

    Guha, Atanu; Das, Prasanta Kumar

    2018-04-01

    Realistic phenomena can be described more appropriately using a generalized canonical ensemble, with proper parameter sets involved. We have generalized Einstein's theory for the specific heat of solids within Tsallis statistics, where temperature fluctuations are introduced into the theory via the fluctuation parameter q. At low temperature the Einstein curve of the specific heat in the nonextensive Tsallis scenario lies exactly on the experimental data points. Consequently, this q-modified Einstein curve is found to overlap with the one predicted by Debye. Considering only the temperature-fluctuation effect (without invoking more than one mode of vibration), we find that the CV vs. T curve is as good as the one obtained by considering the different modes of vibration suggested by Debye. Generalizing Einstein's theory within Tsallis statistics, we find that a unique value of the Einstein temperature θE, along with a temperature-dependent deformation parameter q(T), can describe well the specific heat of solids, i.e. the theory is equivalent to Debye's theory with a temperature-dependent θD.
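
    As a rough illustration of how a q-deformation enters such a calculation, the sketch below computes the standard Einstein heat capacity and a naively q-deformed variant obtained by replacing the Boltzmann exponential with the q-exponential e_q(x) = [1 + (1-q)x]^(1/(1-q)) and differentiating the internal energy numerically. This naive substitution, the placeholder values of θE and q, and the restriction to q < 1 (which avoids the cutoff of the q-exponential) are assumptions for demonstration, not the authors' exact prescription.

      import numpy as np

      R = 8.314  # gas constant, J / (mol K)

      def exp_q(x, q):
          """q-exponential e_q(x) = [1 + (1-q) x]^(1/(1-q)); reduces to exp(x) as q -> 1."""
          if abs(q - 1.0) < 1e-12:
              return np.exp(x)
          return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

      def internal_energy(T, theta_E, q):
          """Einstein internal energy per mole, with exp naively replaced by e_q (illustrative only)."""
          x = theta_E / T
          return 3.0 * R * theta_E / (exp_q(x, q) - 1.0)

      def heat_capacity(T, theta_E=300.0, q=1.0, dT=1e-3):
          """C_V = dU/dT estimated by central finite differences."""
          return (internal_energy(T + dT, theta_E, q) - internal_energy(T - dT, theta_E, q)) / (2.0 * dT)

      T = np.linspace(20.0, 400.0, 5)
      print(heat_capacity(T, q=1.0))    # standard Einstein model
      print(heat_capacity(T, q=0.95))   # naive q-deformed variant (q < 1 keeps e_q cutoff-free)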

  16. Generalized statistical mechanics of cosmic rays: Application to positron-electron spectral indices.

    PubMed

    Yalcin, G Cigdem; Beck, Christian

    2018-01-29

    Cosmic ray energy spectra exhibit power law distributions over many orders of magnitude that are very well described by the predictions of q-generalized statistical mechanics, based on a q-generalized Hagedorn theory for transverse momentum spectra and hard QCD scattering processes. QCD at largest center of mass energies predicts the entropic index to be [Formula: see text]. Here we show that the escort duality of the nonextensive thermodynamic formalism predicts an energy split of effective temperature given by Δ [Formula: see text] MeV, where T_H is the Hagedorn temperature. We carefully analyse the measured data of the AMS-02 collaboration and provide evidence that the predicted temperature split is indeed observed, leading to a different energy dependence of the e+ and e- spectral indices. We also observe a distinguished energy scale E* ≈ 50 GeV where the e+ and e- spectral indices differ the most. Linear combinations of the escort and non-escort q-generalized canonical distributions yield excellent agreement with the measured AMS-02 data in the entire energy range.

  17. On q-non-extensive statistics with non-Tsallisian entropy

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2016-02-01

    We combine an axiomatics of Rényi with the q-deformed version of Khinchin axioms to obtain a measure of information (i.e., entropy) which accounts both for systems with embedded self-similarity and non-extensivity. We show that the entropy thus obtained is uniquely solved in terms of a one-parameter family of information measures. The ensuing maximal-entropy distribution is phrased in terms of a special function known as the Lambert W-function. We analyze the corresponding "high" and "low-temperature" asymptotics and reveal a non-trivial structure of the parameter space. Salient issues such as concavity and Schur concavity of the new entropy are also discussed.

  18. Nonextensive kinetic theory and H-theorem in general relativity

    NASA Astrophysics Data System (ADS)

    Santos, A. P.; Silva, R.; Alcaniz, J. S.; Lima, J. A. S.

    2017-11-01

    The nonextensive kinetic theory for degenerate quantum gases is discussed in the general relativistic framework. By incorporating nonadditive modifications in the collisional term of the relativistic Boltzmann equation and in the entropy current, it is shown that the Tsallis entropic framework satisfies an H-theorem in the presence of gravitational fields. Consistency with the second law of thermodynamics is obtained only if the entropic q-parameter lies in the interval q ∈ [0, 2]. As occurs in the absence of gravitational fields, it is also proved that the local collisional equilibrium is described by the extended Bose-Einstein (Fermi-Dirac) q-distributions.

  19. Non-extensive entropy and properties of polaron in RbCl delta quantum dot under an applied electric field and Coulombic impurity

    NASA Astrophysics Data System (ADS)

    Tiotsop, M.; Fotue, A. J.; Fotsin, H. B.; Fai, L. C.

    2017-08-01

    A bound polaron in an RbCl delta quantum dot under an applied electric field and a Coulombic impurity is considered. The ground- and first-excited-state energies were derived by employing the Pekar variational and unitary transformation methods. Applying the Fermi golden rule, the expressions for the temperature and the polaron lifetime were derived. The decoherence was studied through the Tsallis entropy. The results show that the lifetime decreases (or increases) as the temperature and the delta parameter (as well as the electric field strength and the hydrogenic impurity) increase (or decrease). This suggests that, to accelerate quantum transitions in the nanostructure, the temperature and the delta parameter have to be enhanced. The enhancement of the electric field and of the Coulomb parameter increases the lifetime of the delta quantum dot qubit. The energy spectrum of the polaron increases with increasing temperature, electric field strength, Coulomb parameter, delta parameter, and polaronic radius. The delta quantum dot energies can thus be controlled via the electric field, the Coulombic impurity, and the delta parameter. The results also show that the non-extensive entropy is an oscillatory function of time. With the enhancement of the delta parameter, the non-extensive parameter, the Coulombic parameter, and the electric field strength, the entropy shows a sinusoidally increasing behavior with time. From the study of decoherence through the Tsallis entropy, it may be advised that, to have a quantum system with efficient transmission of information, the non-extensive and delta parameters need to be significant. The study of the probability density showed an increase from the boundary to the center of the dot, where it has its maximum value, and an oscillation with period T0 = ℏ/ΔE under the tuning of the delta parameter, the electric field strength, and the Coulombic parameter. The results may be very helpful for the transmission of information in nanostructures and the control of decoherence.

  20. Origins and properties of kappa distributions in space plasmas

    NASA Astrophysics Data System (ADS)

    Livadiotis, George

    2016-07-01

    Classical particle systems reside at thermal equilibrium with their velocity distribution function stabilized into a Maxwell distribution. On the contrary, collisionless and correlated particle systems, such as the space and astrophysical plasmas, are characterized by a non-Maxwellian behavior, typically described by the so-called kappa distributions. Empirical kappa distributions have become increasingly widespread across space and plasma physics. However, a breakthrough in the field came with the connection of kappa distributions to the solid statistical framework of Tsallis non-extensive statistical mechanics. Understanding the statistical origin of kappa distributions was the cornerstone of further theoretical developments and applications, some of which will be presented in this talk: (i) The physical meaning of thermal parameters, e.g., temperature and kappa index; (ii) the multi-particle description of kappa distributions; (iii) the phase-space kappa distribution of a Hamiltonian with non-zero potential; (iv) the Sackur-Tetrode entropy for kappa distributions, and (v) the new quantization constant, h* ≈ 10^-22 J s.
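
    To make the non-Maxwellian tails concrete, the sketch below evaluates a 1D kappa velocity distribution for several kappa values and compares the suprathermal tail fraction with a Maxwellian; the exact exponent convention and the often-quoted identification q = 1 + 1/κ vary between authors, so both should be treated as assumptions.

      import numpy as np

      def kappa_dist(v, kappa=3.0, theta=1.0):
          """Unnormalized 1D kappa velocity distribution (one common convention).

          f(v) ~ [1 + v^2 / (kappa * theta^2)]^(-(kappa + 1));
          kappa -> infinity recovers a Maxwellian exp(-v^2 / theta^2).
          """
          return (1.0 + v**2 / (kappa * theta**2)) ** (-(kappa + 1.0))

      v = np.linspace(-8.0, 8.0, 2001)
      dv = v[1] - v[0]
      for kappa in (2.0, 5.0, 50.0):
          f = kappa_dist(v, kappa)
          f /= f.sum() * dv                     # normalize numerically
          print(f"kappa = {kappa:5.1f}: P(|v| > 3*theta) = {f[np.abs(v) > 3].sum() * dv:.4f}")

      maxwell = np.exp(-v**2)
      maxwell /= maxwell.sum() * dv
      print(f"Maxwellian       : P(|v| > 3*theta) = {maxwell[np.abs(v) > 3].sum() * dv:.4f}")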

  1. What can nuclear collisions teach us about the boiling of water or the formation of multi-star systems

    NASA Astrophysics Data System (ADS)

    Gross, D. H. E.

    2001-11-01

    Phase transitions in nuclei, small atomic clusters and self-gravitating systems demand the extension of thermo-statistics to "small" systems. The main obstacle is the thermodynamic limit. It is shown how the original definition of the entropy by Boltzmann, as the volume of the energy-manifold of the N-body phase space, allows a geometrical definition of the entropy as a function of the conserved quantities. Without invoking the thermodynamic limit, the whole "zoo" of phase transitions and critical points/lines can be unambiguously defined. The relation to the Yang-Lee singularities of the grand-canonical partition sum is pointed out. It is shown that it is precisely phase transitions in non-extensive systems that give the complete set of characteristic parameters of the transition, including the surface tension. Nuclear heavy-ion collisions are an experimental playground to explore this extension of thermo-statistics.

  2. A social discounting model based on Tsallis’ statistics

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2010-09-01

    Social decision making (e.g. social discounting and social preferences) has been attracting attention in economics, econophysics, social physics, behavioral psychology, and neuroeconomics. This paper proposes a novel social discounting model based on the deformed algebra developed in the Tsallis’ non-extensive thermostatistics. Furthermore, it is suggested that this model can be utilized to quantify the degree of consistency in social discounting in humans and analyze the relationships between behavioral tendencies in social discounting and other-regarding economic decision making under game-theoretic conditions. Future directions in the application of the model to studies in econophysics, neuroeconomics, and social physics, as well as real-world problems such as the supply of live organ donations, are discussed.

  3. Evidence for criticality in financial data

    NASA Astrophysics Data System (ADS)

    Ruiz, G.; de Marcos, A. F.

    2018-01-01

    We provide evidence that the cumulative distributions of absolute normalized returns for the 100 American companies with the highest market capitalization uncover a critical behavior for different time scales Δt. Such cumulative distributions, in accordance with a variety of complex - and financial - systems, can be modeled by the cumulative distribution functions of q-Gaussians, the distribution function that, in the context of nonextensive statistical mechanics, maximizes a non-Boltzmannian entropy. These q-Gaussians are characterized by two parameters, namely (q, β), that are uniquely defined by Δt. From these dependencies, we find a monotonic relationship between q and β, which can be seen as evidence of criticality. We numerically determine the various exponents which characterize this criticality.
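
    As an illustration of the functional form involved (the (q, β) pairs actually fitted in the paper are not reproduced here), the sketch below evaluates q-Gaussian densities, e_q(-βx²) = [1 + (q-1)βx²]^(-1/(q-1)) for 1 < q < 3, normalizes them numerically, and prints how the tail weight grows with q; all parameter values are placeholders.

      import numpy as np

      def q_gaussian(x, q=1.4, beta=1.0):
          """Unnormalized q-Gaussian: [1 + (q-1) beta x^2]^(-1/(q-1)), valid for 1 < q < 3."""
          return (1.0 + (q - 1.0) * beta * x**2) ** (-1.0 / (q - 1.0))

      x = np.linspace(-20.0, 20.0, 20001)
      dx = x[1] - x[0]
      for q in (1.1, 1.4, 1.7):                 # placeholder values, not the fitted ones
          p = q_gaussian(x, q)
          p /= p.sum() * dx                     # numerical normalization
          tail = p[np.abs(x) > 5].sum() * dx    # tails get heavier as q grows
          print(f"q = {q}: P(|x| > 5) = {tail:.2e}")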

  4. Tsallis statistics and neurodegenerative disorders

    NASA Astrophysics Data System (ADS)

    Iliopoulos, Aggelos C.; Tsolaki, Magdalini; Aifantis, Elias C.

    2016-08-01

    In this paper, we perform a statistical analysis of time series deriving from four neurodegenerative disorders, namely epilepsy, amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), and Huntington's disease (HD). The time series concern electroencephalograms (EEGs) of healthy and epileptic states, as well as gait dynamics (in particular stride intervals) of ALS, PD and HD patients. We study data concerning one subject for each neurodegenerative disorder and one healthy control. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of the Tsallis q-triplet, namely {qstat, qsen, qrel}. The deviation of the Tsallis q-triplet from unity indicates non-Gaussian statistics and long-range dependencies for all time series considered. In addition, the results reveal the efficiency of Tsallis statistics in capturing differences in brain dynamics between healthy and epileptic states, as well as differences between ALS, PD and HD patients and healthy control subjects. The results indicate that estimates of the Tsallis q-indices could be used as possible biomarkers, along with others, for improving the classification and prediction of epileptic seizures, as well as for studying the complex gait dynamics of various diseases, providing new insights into severity, medications and fall risk, and improving therapeutic interventions.

  5. Predicting fracture of mortar beams under three-point bending using non-extensive statistical modeling of electric emissions

    NASA Astrophysics Data System (ADS)

    Stergiopoulos, Ch.; Stavrakas, I.; Triantis, D.; Vallianatos, F.; Stonham, J.

    2015-02-01

    Weak electric signals, termed 'Pressure Stimulated Currents' (PSC), are generated and detected while cement-based materials are under mechanical load; they are related to the creation of cracks and the consequent evolution of the crack network in the bulk of the specimen. During the experiment, a set of cement mortar beams of rectangular cross-section was subjected to Three-Point Bending (3PB). For each specimen an abrupt mechanical load step was applied, increasing the load from a low level (Lo) to a high final value (Lh), where Lh was different for each specimen and was maintained constant for a long time. The temporal behavior of the recorded PSC shows that during the load increase a spike-like PSC emission was recorded, followed by a relaxation of the PSC after it reached its final value. The relaxation process of the PSC was studied using non-extensive statistical physics (NESP) based on the Tsallis entropy. The behavior of the Tsallis q parameter was studied in the relaxation PSCs in order to investigate its potential use as an index for monitoring the crack evolution process, with a potential application in non-destructive laboratory testing of cement-based specimens of unknown internal damage level. The dependence of the q-parameter on Lh (when Lh < 0.8Lf, where Lf represents the 3PB strength of the specimen) shows an increase in the q value when the specimens are subjected to gradually higher bending loads, reaching a maximum value close to 1.4 when the applied Lh becomes higher than 0.8Lf. When the applied Lh becomes higher than 0.9Lf, the value of the q-parameter gradually decreases. This analysis of the experimental data shows that the entropic index q exhibits a characteristic decrease when approaching the ultimate strength of the specimen, and thus could be used as a forerunner of the expected failure.

  6. The cosmological lithium problem revisited

    NASA Astrophysics Data System (ADS)

    Bertulani, C. A.; Mukhamedzhanov, A. M.; Shubhchintak

    2016-07-01

    After a brief review of the cosmological lithium problem, we report a few recent attempts to find theoretical solutions by our group at Texas A&M University (Commerce & College Station). We will discuss our studies on the theoretical description of electron screening, the possible existence of parallel universes of dark matter, and the use of non-extensive statistics during the Big Bang nucleosynthesis epoch. Last but not least, we discuss possible solutions within the nuclear physics realm. The impact of recent measurements of relevant nuclear reaction cross sections for Big Bang nucleosynthesis based on indirect methods is also assessed. Although our attempts may not be able to explain the observed discrepancies between theory and observations, they suggest theoretical developments that can be useful also for stellar nucleosynthesis.

  7. Nonequilibrium Probabilistic Dynamics of the Logistic Map at the Edge of Chaos

    NASA Astrophysics Data System (ADS)

    Borges, Ernesto P.; Tsallis, Constantino; Añaños, Garín F.; de Oliveira, Paulo Murilo

    2002-12-01

    We consider nonequilibrium probabilistic dynamics in logisticlike maps x_{t+1} = 1 - a|x_t|^z (z > 1) at their chaos threshold: We first introduce many initial conditions within one among W >> 1 intervals partitioning the phase space and focus on the unique value q_sen < 1 for which the entropic form S_q ≡ (1 - Σ_{i=1}^{W} p_i^q)/(q - 1) linearly increases with time. We then verify that S_{q_sen}(t) - S_{q_sen}(∞) vanishes like t^{-1/[q_rel(W) - 1]} [q_rel(W) > 1]. We finally exhibit a new finite-size scaling, q_rel(∞) - q_rel(W) ~ W^{-|q_sen|}. This establishes quantitatively, for the first time, a long pursued relation between sensitivity to the initial conditions and relaxation, concepts which play central roles in nonextensive statistical mechanics.
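
    A compact numerical illustration of the procedure described above (spread many initial conditions over one cell of a W-cell partition, iterate the map at its chaos threshold, and follow the growth of S_q) is sketched below for z = 2. The threshold value a_c ≈ 1.40115519 and q_sen ≈ 0.2445 are the commonly quoted ones and serve only as placeholders; the clean linear growth reported in the paper emerges on average over starting cells and over longer runs than this toy example.

      import numpy as np

      A_C = 1.40115518909        # chaos-threshold value of x_{t+1} = 1 - a x^2 (z = 2), commonly quoted
      Q_SEN = 0.2445             # commonly quoted q_sen for this map

      def tsallis_entropy(p, q):
          """S_q = (1 - sum_i p_i^q) / (q - 1); reduces to the Shannon form as q -> 1."""
          p = p[p > 0]
          if abs(q - 1.0) < 1e-12:
              return -np.sum(p * np.log(p))
          return (1.0 - np.sum(p**q)) / (q - 1.0)

      W = 10_000                                   # number of cells partitioning [-1, 1]
      N = 1_000_000                                # initial conditions placed inside a single cell
      rng = np.random.default_rng(0)
      i0 = W // 3                                  # arbitrary starting cell
      lo = -1.0 + 2.0 * i0 / W
      x = rng.uniform(lo, lo + 2.0 / W, size=N)

      for t in range(1, 21):
          x = 1.0 - A_C * x**2                     # iterate the map
          counts, _ = np.histogram(x, bins=W, range=(-1.0, 1.0))
          p = counts / N
          if t % 5 == 0:
              print(t, tsallis_entropy(p, Q_SEN))  # S_{q_sen} should grow, roughly linearly on average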

  8. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
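
    The paper expresses these variables through two independent gamma random variables; a closely related and widely used recipe is the superstatistical one sketched below, in which a gamma-distributed rate (or inverse variance) mixed with an exponential (or Gaussian) produces heavy-tailed q-exponential-like (or q-Gaussian, Student-t-like) samples. The parameter mappings quoted in the comments are standard but convention-dependent, so treat them as assumptions rather than the paper's exact result.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 1_000_000

      # q-exponential-like samples via a gamma-mixed exponential (superstatistical recipe)
      k, theta = 4.0, 1.0
      lam = rng.gamma(shape=k, scale=theta, size=n)   # fluctuating rate
      x_qexp = rng.exponential(1.0 / lam)             # X | lam ~ Exp(lam)
      # assumed mapping (convention-dependent): density exponent k+1  <->  q = (k + 2) / (k + 1)

      # q-Gaussian-like samples via a gamma-mixed Gaussian (Student-t construction)
      nu = 5.0
      tau = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)  # fluctuating inverse variance
      x_qgauss = rng.normal(0.0, 1.0 / np.sqrt(tau))
      # assumed mapping: Student-t with nu degrees of freedom  <->  q = (nu + 3) / (nu + 1)

      print("q-exponential sample: P(X > 5)  =", np.mean(x_qexp > 5))
      print("q-Gaussian sample   : P(|X| > 5) =", np.mean(np.abs(x_qgauss) > 5))
      print("Gaussian reference  : P(|Z| > 5) ~ 5.7e-07")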

  9. Heavy ion-acoustic rogue waves in electron-positron multi-ion plasmas

    NASA Astrophysics Data System (ADS)

    Chowdhury, N. A.; Mannan, A.; Hasan, M. M.; Mamun, A. A.

    2017-09-01

    The nonlinear propagation of heavy-ion-acoustic (HIA) waves (HIAWs) in a four-component multi-ion plasma (containing inertial heavy negative ions and light positive ions, as well as inertialess nonextensive electrons and positrons) has been theoretically investigated. The nonlinear Schrödinger (NLS) equation is derived by employing the reductive perturbation method. It is found that the NLS equation leads to the modulational instability (MI) of HIAWs, and to the formation of HIA rogue waves (HIARWs), which are due to the effects of nonlinearity and dispersion in the propagation of HIAWs. The conditions for the MI of HIAWs and the basic properties of the generated HIARWs are identified. It is observed that the striking features (viz., instability criteria, growth rate of MI, amplitude and width of HIARWs, etc.) of the HIAWs are significantly modified by the effects of nonextensivity of electrons and positrons, the ratio of light positive ion mass to heavy negative ion mass, the ratio of electron number density to light positive ion number density, the ratio of electron temperature to positron temperature, etc. The relevancy of our present investigation to the observations in space (viz., cometary comae and earth's ionosphere) and laboratory (viz., solid-high intense laser plasma interaction experiments) plasmas is pointed out.

  10. Heavy ion-acoustic rogue waves in electron-positron multi-ion plasmas.

    PubMed

    Chowdhury, N A; Mannan, A; Hasan, M M; Mamun, A A

    2017-09-01

    The nonlinear propagation of heavy-ion-acoustic (HIA) waves (HIAWs) in a four-component multi-ion plasma (containing inertial heavy negative ions and light positive ions, as well as inertialess nonextensive electrons and positrons) has been theoretically investigated. The nonlinear Schrödinger (NLS) equation is derived by employing the reductive perturbation method. It is found that the NLS equation leads to the modulational instability (MI) of HIAWs, and to the formation of HIA rogue waves (HIARWs), which are due to the effects of nonlinearity and dispersion in the propagation of HIAWs. The conditions for the MI of HIAWs and the basic properties of the generated HIARWs are identified. It is observed that the striking features (viz., instability criteria, growth rate of MI, amplitude and width of HIARWs, etc.) of the HIAWs are significantly modified by the effects of nonextensivity of electrons and positrons, the ratio of light positive ion mass to heavy negative ion mass, the ratio of electron number density to light positive ion number density, the ratio of electron temperature to positron temperature, etc. The relevancy of our present investigation to the observations in space (viz., cometary comae and earth's ionosphere) and laboratory (viz., solid-high intense laser plasma interaction experiments) plasmas is pointed out.

  11. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.

  12. Generalized statistical mechanics approaches to earthquakes and tectonics

    PubMed Central

    Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548

  13. Arbitrary amplitude ion-acoustic solitary waves in electronegative plasmas with electrons featuring Tsallis distribution

    NASA Astrophysics Data System (ADS)

    Ghebache, Siham; Tribeche, Mouloud

    2017-10-01

    The problem of arbitrary amplitude ion-acoustic solitary waves (IASWs) that accompany electronegative plasmas containing positive ions, negative ions, and nonextensive electrons is addressed. The energy integral equation with a new Sagdeev potential is analyzed to examine the existence regions of the IASWs. Different types of electronegative plasmas inspired by the experimental studies of Ichiki et al. (2001) are discussed. Our results show that such plasmas support IASWs whose amplitude and nature depend sensitively on the mass and density ratios of the positive and negative ions as well as on the q-nonextensive parameter. Interestingly, one finds that our plasma model supports the coexistence of smooth rarefactive and spiky compressive IASWs. Our results complement and provide new insights on previously published findings on this problem.

  14. Tsallis’ non-extensive free energy as a subjective value of an uncertain reward

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2009-03-01

    Recent studies in neuroeconomics and econophysics revealed the importance of reward expectation in decision under uncertainty. Behavioral neuroeconomic studies have proposed that the unpredictability and the probability of an uncertain reward are distinctly encoded as entropy and a distorted probability weight, respectively, in the separate neural systems. However, previous behavioral economic and decision-theoretic models could not quantify reward-seeking and uncertainty aversion in a theoretically consistent manner. In this paper, we have: (i) proposed that generalized Helmholtz free energy in Tsallis’ non-extensive thermostatistics can be utilized to quantify a perceived value of an uncertain reward, and (ii) empirically examined the explanatory powers of the models. Future study directions in neuroeconomics and econophysics by utilizing the Tsallis’ free energy model are discussed.

  15. Nonlinear nature of composite structure induced by the interaction of nonplanar solitons in a nonextensive plasma

    NASA Astrophysics Data System (ADS)

    Han, Jiu-Ning; Luo, Jun-Hua; Liu, Zhen-Lai; Shi, Jun; Xiang, Gen-Xiang; Li, Jun-Xiu

    2015-06-01

    The nonlinear properties of the composite structure induced by the head-on collision of electron-acoustic solitons in a general plasma composed of cold fluid electrons, hot nonextensively distributed electrons, and stationary ions are studied. We have made a detailed investigation of the time evolution of this merged wave structure. It is found that the structure survives for some time interval, and that the properties of the composite structures induced in cylindrical and spherical geometries are clearly different. Moreover, it is shown that there are both positive and negative phase shifts for each colliding soliton after the interaction. For fixed plasma parameters, the soliton receives the largest phase shift in spherical geometry, followed by the cylindrical and one-dimensional planar geometries.

  16. The cosmological lithium problem revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertulani, C. A., E-mail: carlos.bertulani@tamuc.edu; Department of Physics and Astronomy, Texas A&M University, College Station, TX 75429; Mukhamedzhanov, A. M., E-mail: akram@comp.tamu.edu

    After a brief review of the cosmological lithium problem, we report a few recent attempts to find theoretical solutions by our group at Texas A&M University (Commerce & College Station). We will discuss our studies on the theoretical description of electron screening, the possible existence of parallel universes of dark matter, and the use of non-extensive statistics during the Big Bang nucleosynthesis epoch. Last but not least, we discuss possible solutions within the nuclear physics realm. The impact of recent measurements of relevant nuclear reaction cross sections for Big Bang nucleosynthesis based on indirect methods is also assessed. Although our attempts may not be able to explain the observed discrepancies between theory and observations, they suggest theoretical developments that can be useful also for stellar nucleosynthesis.

  17. Tsallis thermostatistics for finite systems: a Hamiltonian approach

    NASA Astrophysics Data System (ADS)

    Adib, Artur B.; Moreira, André A.; Andrade, José S., Jr.; Almeida, Murilo P.

    2003-05-01

    The derivation of the Tsallis generalized canonical distribution from the traditional approach of the Gibbs microcanonical ensemble is revisited (Phys. Lett. A 193 (1994) 140). We show that finite systems whose Hamiltonians obey a generalized homogeneity relation rigorously follow the nonextensive thermostatistics of Tsallis. In the thermodynamical limit, however, our results indicate that the Boltzmann-Gibbs statistics is always recovered, regardless of the type of potential among interacting particles. This approach provides, moreover, a one-to-one correspondence between the generalized entropy and the Hamiltonian structure of a wide class of systems, revealing a possible origin for the intrinsic nonlinear features present in the Tsallis formalism that lead naturally to power-law behavior. Finally, we confirm these exact results through extensive numerical simulations of the Fermi-Pasta-Ulam chain of anharmonic oscillators.

  18. Tsallis entropy and complexity theory in the understanding of physics of precursory accelerating seismicity.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos; Chatzopoulos, George

    2014-05-01

    Strong observational indications support the hypothesis that many large earthquakes are preceded by accelerating seismic release rates, which are described by a power-law time-to-failure relation. In the present work, a unified theoretical framework is discussed based on the ideas of non-extensive statistical physics along with fundamental principles of physics such as energy conservation in a faulted crustal volume undergoing stress loading. We derive the time-to-failure power-law of: a) the cumulative number of earthquakes, b) the cumulative Benioff strain and c) the cumulative energy released in a fault system that obeys a hierarchical distribution law extracted from Tsallis entropy. Considering the analytic conditions near the time of failure, we derive from first principles the time-to-failure power-law and show that a common critical exponent m(q) exists, which is a function of the non-extensive entropic parameter q. We conclude that the cumulative precursory parameters are functions of the energy supplied to the system and the size of the precursory volume. In addition, the q-exponential distribution which describes the fault system is a crucial factor in the appearance of power-law acceleration in the seismicity. Our results, based on Tsallis entropy and energy conservation, give a new view on the empirical laws derived by other researchers. Examples and applications of this technique to observations of accelerating seismicity will also be presented and discussed. This work was implemented through the project IMPACT-ARC in the framework of action "ARCHIMEDES III-Support of Research Teams at TEI of Crete" (MIS380353) of the Operational Program "Education and Lifelong Learning" and is co-financed by the European Union (European Social Fund) and Greek national funds.
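
    To make the time-to-failure relation concrete, the sketch below generates a synthetic cumulative release curve of the commonly used form Ω(t) = A - B(t_f - t)^m and recovers the exponent by a log-log least-squares fit; all numbers are synthetic placeholders, and A and t_f are assumed known here, which is of course not the case for real precursory sequences.

      import numpy as np

      rng = np.random.default_rng(2)

      # synthetic accelerating release: Omega(t) = A - B * (t_f - t)^m  (placeholder values)
      A, B, m_true, t_f = 100.0, 8.0, 0.3, 10.0
      t = np.linspace(0.0, 9.5, 200)
      omega = A - B * (t_f - t) ** m_true + rng.normal(0.0, 0.05, t.size)

      # recover m from a linear fit of log(A - Omega) versus log(t_f - t)
      y = np.log(np.clip(A - omega, 1e-9, None))
      x = np.log(t_f - t)
      slope, intercept = np.polyfit(x, y, 1)
      print(f"fitted exponent m = {slope:.3f} (true value {m_true})")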

  19. Spatio-temporal analysis of aftershock sequences in terms of Non Extensive Statistical Physics.

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kalliopi; Vallianatos, Filippos

    2017-04-01

    Earth's seismicity is considered as an extremely complicated process where long-range interactions and fracturing exist (Vallianatos et al., 2016). For this reason, in order to analyze it, we use an innovative methodological approach, introduced by Tsallis (Tsallis, 1988; 2009), named Non Extensive Statistical Physics. This approach introduces a generalization of Boltzmann-Gibbs statistical mechanics and is based on the definition of the Tsallis entropy Sq, whose maximization leads to the so-called q-exponential function, i.e. the probability distribution function that maximizes Sq. In the present work, we utilize the concept of Non Extensive Statistical Physics in order to analyze the spatiotemporal properties of several aftershock series. Marekova (2014) suggested that the probability densities of the inter-event distances between successive aftershocks follow a beta distribution. Using the same data set, we analyze the inter-event distance distribution of several aftershock sequences in different geographic regions by calculating the non-extensive parameters that determine the behavior of the system and by fitting the q-exponential function, which expresses the degree of non-extensivity of the investigated system. Furthermore, the inter-event time distribution of the aftershocks as well as the frequency-magnitude distribution have been analyzed. The results support the applicability of Non Extensive Statistical Physics ideas in aftershock sequences, where a strong correlation exists along with memory effects. References C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52 (1988) 479-487. doi:10.1007/BF01016429 C. Tsallis, Introduction to nonextensive statistical mechanics: Approaching a complex world, 2009. doi:10.1007/978-0-387-85359-8. E. Marekova, Analysis of the spatial distribution between successive earthquakes in aftershocks series, Annals of Geophysics, 57, 5, doi:10.4401/ag-6556, 2014. F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics. Proc. R. Soc. A, 472, 20160497, 2016.

  20. Global regionalized seismicity in view of Non-Extensive Statistical Physics

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kalliopi; Vallianatos, Filippos; Michas, Georgios

    2018-03-01

    In the present work we study the distribution of the Earth's shallow seismicity in different seismic zones, as it occurred from 1981 to 2011, extracted from the Centroid Moment Tensor (CMT) catalog. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. For this, we use the Flinn-Engdahl regionalization (FE) (Flinn and Engdahl, 1965), which consists of fifty seismic zones, as modified by Lombardi and Marzocchi (2007). The latter authors grouped the 50 FE zones into larger tectonically homogeneous ones, utilizing the cumulative moment tensor method, resulting in thirty-nine seismic zones. In each one of these seismic zones we study the distribution of seismicity in terms of the frequency-magnitude distribution and the inter-event time distribution between successive earthquakes, a task that is essential for hazard assessment and for a better understanding of global and regional geodynamics. In our analysis we use non-extensive statistical physics (NESP), which seems to be one of the most adequate and promising methodological tools for analyzing complex systems, such as the Earth's seismicity, introducing the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy as defined by Tsallis (1988). The qE parameter is significantly greater than one for all the seismic regions analyzed, with values ranging from 1.294 to 1.504, indicating that magnitude correlations are particularly strong. Furthermore, the qT parameter shows some temporal correlations, but variations with the cut-off magnitude show greater temporal correlations when the smaller magnitude earthquakes are included. The qT for earthquakes with magnitude greater than 5 takes values from 1.043 to 1.353, and as we increase the cut-off magnitude to 5.5 and 6 the qT value ranges from 1.001 to 1.242 and from 1.001 to 1.181 respectively, presenting a significant decrease. Our findings support the ideas of universality within the Tsallis approach to describe the Earth's seismicity and present strong evidence of temporal clustering and long-range correlations of seismicity in each of the tectonic zones analyzed.
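
    As an illustration of the kind of estimate behind the reported qT values (the CMT catalogue itself is not reproduced here), the sketch below fits a q-exponential to synthetic interevent times by maximum likelihood. The density normalization used, f(τ) = ((2-q)/τ0)[1 + (q-1)τ/τ0]^(-1/(q-1)) for 1 < q < 2, and the synthetic data are assumptions for demonstration.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(3)

      def sample_qexp(n, q, tau0):
          """Inverse-CDF sampling of a q-exponential waiting-time distribution (1 < q < 2)."""
          u = rng.uniform(size=n)
          return tau0 / (q - 1.0) * (u ** (-(q - 1.0) / (2.0 - q)) - 1.0)

      def neg_loglik(params, tau):
          q, tau0 = params
          if not (1.0 < q < 2.0) or tau0 <= 0:
              return np.inf
          z = 1.0 + (q - 1.0) * tau / tau0
          return -np.sum(np.log(2.0 - q) - np.log(tau0) - np.log(z) / (q - 1.0))

      tau = sample_qexp(50_000, q=1.3, tau0=10.0)          # synthetic interevent times (placeholder)
      res = minimize(neg_loglik, x0=np.array([1.5, 5.0]), args=(tau,), method="Nelder-Mead")
      q_hat, tau0_hat = res.x
      print(f"estimated qT = {q_hat:.3f}, tau0 = {tau0_hat:.2f} (true: 1.3, 10.0)")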

  1. On the rogue waves propagation in non-Maxwellian complex space plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El-Tantawy, S. A., E-mail: samireltantawy@yahoo.com; El-Awady, E. I., E-mail: eielawady@hotmail.com; Tribeche, M., E-mail: mouloudtribeche@yahoo.fr, E-mail: mtribeche@usthb.dz

    2015-11-15

    The implications of non-Maxwellian electron distributions (nonthermal, suprathermal, or nonextensive distributions) are examined for dust-ion acoustic (DIA) rogue/freak waves in a dusty warm plasma. Using a reductive perturbation technique, the basic set of fluid equations is reduced to a nonlinear Schrödinger equation. The latter is used to study the nonlinear evolution of modulationally unstable DIA wavepackets and to describe the propagation of rogue waves (RWs). Rogue waves are large-amplitude short-lived wave groups, routinely observed in space plasmas. The possible region for the rogue waves to exist is defined precisely for typical parameters of space plasmas. It is shown that the RWs strengthen for decreasing plasma nonthermality and increasing superthermality. For nonextensive electrons, the RWs amplitude exhibits a somewhat more complex behavior, depending on the entropic index q. Moreover, our numerical results reveal that the RWs exist for all values of the ion-to-electron temperature ratio σ for nonthermal and superthermal distributions, and there is no limitation for the freak waves to propagate for either of these two distributions in the present plasma system. For the nonextensive electron distribution, however, bright- and dark-type waves can propagate, which means that there is a limitation for the existence of freak waves. Our systematic investigation should be useful in understanding the properties of DIA solitary waves that may occur in non-Maxwellian space plasmas.

  2. How multiplicity determines entropy and the derivation of the maximum entropy principle for complex systems.

    PubMed

    Hanel, Rudolf; Thurner, Stefan; Gell-Mann, Murray

    2014-05-13

    The maximum entropy principle (MEP) is a method for obtaining the most likely distribution functions of observables from statistical systems by maximizing entropy under constraints. The MEP has found hundreds of applications in ergodic and Markovian systems in statistical mechanics, information theory, and statistics. For several decades there has been an ongoing controversy over whether the notion of the maximum entropy principle can be extended in a meaningful way to nonextensive, nonergodic, and complex statistical systems and processes. In this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related to multiplicities of independent random processes. We then show how the relaxation of independence naturally leads to the most general entropies that are compatible with the first three Shannon-Khinchin axioms, the (c,d)-entropies. We demonstrate that the MEP is a perfectly consistent concept for nonergodic and complex statistical systems if their relative entropy can be factored into a generalized multiplicity and a constraint term. The problem of finding such a factorization reduces to finding an appropriate representation of relative entropy in a linear basis. In a particular example we show that path-dependent random processes with memory naturally require specific generalized entropies. The example is to our knowledge the first exact derivation of a generalized entropy from the microscopic properties of a path-dependent random process.

  3. Entropic lattice Boltzmann representations required to recover Navier-Stokes flows.

    PubMed

    Keating, Brian; Vahala, George; Yepez, Jeffrey; Soe, Min; Vahala, Linda

    2007-03-01

    There are two disparate formulations of the entropic lattice Boltzmann scheme: one of these theories revolves around the analog of the discrete Boltzmann H function of standard extensive statistical mechanics, while the other revolves around the nonextensive Tsallis entropy. It is shown here that it is the nonenforcement of the pressure tensor moment constraints that leads to extremizations of entropy resulting in Tsallis-like forms. However, with the imposition of the pressure tensor moment constraint, as is fundamentally necessary for the recovery of the Navier-Stokes equations, it is proved that the entropy function must be of the discrete Boltzmann form. Three-dimensional simulations are performed which illustrate some of the differences between standard lattice Boltzmann and entropic lattice Boltzmann schemes, as well as the role played by the number of phase-space velocities used in the discretization.

  4. Propagation of Electron Acoustic Soliton, Periodic and Shock Waves in Dissipative Plasma with a q-Nonextensive Electron Velocity Distribution

    NASA Astrophysics Data System (ADS)

    El-Hanbaly, A. M.; El-Shewy, E. K.; Elgarayhi, A.; Kassem, A. I.

    2015-11-01

    The nonlinear properties of small amplitude electron-acoustic (EA) solitary and shock waves in a homogeneous system of unmagnetized collisionless plasma with a nonextensive distribution for hot electrons have been investigated. A reductive perturbation method is used to obtain the Kadomtsev-Petviashvili-Burgers equation. Bifurcation analysis is discussed for the non-dissipative system in the absence of the Burgers term and reveals different classes of traveling wave solutions. The obtained solutions are related to periodic and soliton waves and their behavior is shown graphically. In the presence of the Burgers term, the EXP-function method is used to solve the Kadomtsev-Petviashvili-Burgers equation and the obtained solution is related to a shock wave. The obtained results may be helpful for a better understanding of wave propagation in various space plasma environments as well as in inertial confinement fusion laboratory plasmas.

  5. Swine influenza and vaccines: an alternative approach for decision making about pandemic prevention.

    PubMed

    Basili, Marcello; Ferrini, Silvia; Montomoli, Emanuele

    2013-08-01

    During the global pandemic of A/H1N1/California/07/2009 (A/H1N1/Cal) influenza, many governments signed contracts with vaccine producers for a universal influenza immunization program and bought hundreds of millions of vaccine doses. We argue that, as Health Ministers assumed the occurrence of the worst possible scenario (generalized pandemic influenza) and followed the strong version of the Precautionary Principle, they undervalued the possibility of a mild or weak pandemic wave. An alternative decision rule, based on the non-extensive entropy principle, is introduced, and a different characterization of the Precautionary Principle is applied. This approach values extreme negative results (catastrophic events) in a different way and predicts more plausible and mild events. It introduces less pessimistic forecasts in the case of uncertain influenza pandemic outbreaks. A simplified application is presented using seasonal data on morbidity and severity of influenza-like illness among Italian children for the period 2003-10. Established literature results predict an average attack rate of not less than 15% for the next pandemic influenza [Meltzer M, Cox N, Fukuda K. The economic impact of pandemic influenza in the United States: implications for setting priorities for interventions. Emerg Infect Dis 1999;5:659-71; Meltzer M, Cox N, Fukuda K. Modeling the Economic Impact of Pandemic Influenza in the United States: Implications for Setting Priorities for Intervention. Background paper. Atlanta, GA: CDC, 1999. Available at: http://www.cdc.gov/ncidod/eid/vol5no5/melt_back.htm (7 January 2011, date last accessed)]. The strong version of the Precautionary Principle would suggest using this prediction for vaccination campaigns. On the contrary, the non-extensive maximum entropy principle predicts a lower attack rate, which induces a 20% saving in public funding for vaccine doses. The need for an effective influenza pandemic prevention program, coupled with an efficient use of public funding, calls for a rethinking of the Precautionary Principle. The non-extensive maximum entropy principle, which incorporates the vague and incomplete information available to decision makers, produces a more coherent forecast of possible influenza pandemics and a more conservative use of public funding.

  6. Generalized permutation entropy analysis based on the two-index entropic form S_{q,δ}

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian

    2015-05-01

    Permutation entropy (PE) is a novel measure to quantify the complexity of nonlinear time series. In this paper, we propose a generalized permutation entropy (PE_{q,δ}) based on the recently postulated entropic form S_{q,δ}, which was proposed as a unification of the well-known Sq of nonextensive statistical mechanics and S_δ, a possibly appropriate candidate for the black-hole entropy. We find that PE_{q,δ} with appropriate parameters can amplify minor changes and trends of complexities in comparison to PE. Experiments with this generalized permutation entropy method are performed on both synthetic and stock data, showing its power. Results show that PE_{q,δ} is an exponential function of q and that the power k(δ) is a constant once δ is fixed. Some discussion about k(δ) is provided. Besides, we also find some interesting results about the power law.
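
    For orientation, the sketch below computes the standard (Shannon) permutation entropy from ordinal-pattern frequencies, along with a single-index Tsallis S_q built on the same pattern probabilities; it is a simplified stand-in for, not a reproduction of, the two-index PE_{q,δ} proposed in the paper, and the test signals are arbitrary.

      import numpy as np
      from itertools import permutations

      def ordinal_probs(x, order=3):
          """Relative frequencies of the ordinal (Bandt-Pompe) patterns of the given embedding order."""
          counts = {p: 0 for p in permutations(range(order))}
          for start in range(len(x) - order + 1):
              key = tuple(int(k) for k in np.argsort(x[start:start + order]))
              counts[key] += 1
          freq = np.array(list(counts.values()), dtype=float)
          return freq / freq.sum()

      def permutation_entropy(x, order=3, q=1.0):
          """Shannon permutation entropy (q = 1) or a Tsallis S_q built on the same pattern probabilities."""
          p = ordinal_probs(x, order)
          p = p[p > 0]
          if abs(q - 1.0) < 1e-12:
              return -np.sum(p * np.log(p))
          return (1.0 - np.sum(p**q)) / (q - 1.0)

      rng = np.random.default_rng(4)
      noise = rng.normal(size=5000)                      # irregular test signal
      sine = np.sin(np.linspace(0, 20 * np.pi, 5000))    # regular test signal
      for name, sig in (("noise", noise), ("sine", sine)):
          print(name, permutation_entropy(sig, q=1.0), permutation_entropy(sig, q=1.5))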

  7. On the emergence of a generalised Gamma distribution. Application to traded volume in financial markets

    NASA Astrophysics Data System (ADS)

    Duarte Queirós, S. M.

    2005-08-01

    This letter reports on a stochastic dynamical scenario whose associated stationary probability density function is exactly a generalised form, with a power-law instead of exponential decay, of the ubiquitous Gamma distribution. This generalisation, also known as the F-distribution, was first proposed empirically to fit high-frequency stock traded volume distributions in financial markets and was verified in experiments with granular material. The dynamical assumption presented herein is based on local temporal fluctuations of the average value of the observable under study. This proposal is related to superstatistics and thus to the current nonextensive statistical mechanics framework. For the specific case of stock traded volume, we connect the local fluctuations in the mean stock traded volume with the typical herding behaviour presented by financial traders. Last of all, NASDAQ 1- and 2-minute stock traded volume sequences and probability density functions are numerically reproduced.
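
    One common way to write such a power-law-tailed generalization of the Gamma density in the traded-volume literature is p(v) ∝ v^(α-1) e_q(-v/θ), with e_q the q-exponential; whether this matches the letter's exact parametrization is not guaranteed, so the form and parameter values in the sketch below are assumptions used only to show how the q > 1 tail fattens relative to the ordinary Gamma case.

      import numpy as np

      def e_q(x, q):
          """q-exponential e_q(x) = [1 + (1-q) x]^(1/(1-q)); reduces to exp(x) at q = 1."""
          if abs(q - 1.0) < 1e-12:
              return np.exp(x)
          return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

      def traded_volume_pdf(v, alpha=2.0, theta=1.0, q=1.2):
          """Assumed power-law-tailed Gamma generalization: p(v) ~ v^(alpha-1) * e_q(-v/theta)."""
          return v ** (alpha - 1.0) * e_q(-v / theta, q)

      v = np.linspace(1e-6, 200.0, 400_000)
      dv = v[1] - v[0]
      for q in (1.0, 1.2):
          p = traded_volume_pdf(v, q=q)
          p /= p.sum() * dv                    # numerical normalization
          print(f"q = {q}: P(V > 20) = {p[v > 20].sum() * dv:.2e}")   # heavier tail for q > 1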

  8. On generalisations of the log-Normal distribution by means of a new product definition in the Kapteyn process

    NASA Astrophysics Data System (ADS)

    Duarte Queirós, Sílvio M.

    2012-07-01

    We discuss the modification of the Kapteyn multiplicative process using the q-product of Borges [E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340 (2004) 95]. Depending on the value of the index q a generalisation of the log-Normal distribution is yielded. Namely, the distribution increases the tail for small (when q<1) or large (when q>1) values of the variable under analysis. The usual log-Normal distribution is retrieved when q=1, which corresponds to the traditional Kapteyn multiplicative process. The main statistical features of this distribution as well as related random number generators and tables of quantiles of the Kolmogorov-Smirnov distance are presented. Finally, we illustrate the validity of this scenario by describing a set of variables of biological and financial origin.
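
    For concreteness, the Borges q-product and a q-deformed Kapteyn-like multiplicative process built from it can be sketched as follows (the cutoff convention and parameter choices are assumptions, not the paper's code); q = 1 recovers the ordinary product and hence the log-Normal limit.

```python
# Sketch of the Borges q-product,
#   x (x)_q y = [x**(1-q) + y**(1-q) - 1]_+ ** (1/(1-q)),
# and of a Kapteyn-like multiplicative process in which ordinary products of
# positive random factors are replaced by q-products.
import numpy as np

def q_product(x, y, q):
    if np.isclose(q, 1.0):
        return x * y
    base = x**(1.0 - q) + y**(1.0 - q) - 1.0
    out = np.zeros_like(base)
    pos = base > 0.0                      # [.]_+ cutoff
    out[pos] = base[pos]**(1.0 / (1.0 - q))
    return out

def kapteyn_q(n_steps, n_samples, q, rng):
    """Iterate X_{t+1} = X_t (x)_q f_t with positive i.i.d. factors f_t."""
    x = np.ones(n_samples)
    for _ in range(n_steps):
        factors = rng.lognormal(mean=0.0, sigma=0.1, size=n_samples)
        x = q_product(x, factors, q)
    return x

rng = np.random.default_rng(2)
for q in (1.0, 0.9, 1.1):
    x = kapteyn_q(200, 20000, q, rng)
    logx = np.log(x[x > 0])
    print(f"q = {q}: mean log X = {logx.mean():+.3f}, std log X = {logx.std():.3f}")
```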

  9. Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis

    PubMed Central

    Ré, Miguel A.; Azad, Rajeev K.

    2014-01-01

    Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise because of a number of attributes including generalization to any number of probability distributions and association of weights to the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as, non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including that of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms. PMID:24728338
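
    To make the quantities concrete, the sketch below computes the standard JSD between dinucleotide frequency vectors of two sequences and a simple Tsallis-type variant in which Shannon entropy is replaced by S_q = (1 - Σ_i p_i^q)/(q - 1). This substitution is only the simplest of the generalizations discussed in the paper (the Markovian and integrated Tsallis-Markovian versions are more involved), and the toy sequences are made up.

```python
# Sketch: Jensen-Shannon divergence between dinucleotide frequency vectors,
# plus a Tsallis-type variant obtained by replacing Shannon entropy with
# S_q = (1 - sum_i p_i**q) / (q - 1). Illustrative only.
import numpy as np
from collections import Counter

def kmer_vector(seq, vocab, k=2):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    v = np.array([counts.get(w, 0) for w in vocab], dtype=float)
    return v / v.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def tsallis(p, q):
    p = p[p > 0]
    return shannon(p) if np.isclose(q, 1.0) else (1.0 - np.sum(p**q)) / (q - 1.0)

def jsd(p, r, entropy=shannon, **kw):
    m = 0.5 * (p + r)
    return entropy(m, **kw) - 0.5 * (entropy(p, **kw) + entropy(r, **kw))

a = "ATGCGCGTATATGCGCATCGATCGGCGCTA"
b = "ATATATATGCGCGCGCATATGCGCTATATA"
vocab = [x + y for x in "ACGT" for y in "ACGT"]
p, r = kmer_vector(a, vocab), kmer_vector(b, vocab)
print("JSD (Shannon):       ", jsd(p, r))
print("JSD (Tsallis, q=1.5):", jsd(p, r, entropy=tsallis, q=1.5))
```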

  10. Generalization of entropy based divergence measures for symbolic sequence analysis.

    PubMed

    Ré, Miguel A; Azad, Rajeev K

    2014-01-01

    Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise because of a number of attributes including generalization to any number of probability distributions and association of weights to the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as, non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including that of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.

  11. Nonlinear excitations for the positron acoustic shock waves in dissipative nonextensive electron-positron-ion plasmas

    NASA Astrophysics Data System (ADS)

    Saha, Asit

    2017-03-01

    Positron acoustic shock waves (PASHWs) in unmagnetized electron-positron-ion (e-p-i) plasmas consisting of mobile cold positrons, immobile positive ions, q-nonextensive distributed electrons, and hot positrons are studied. The cold positron kinematic viscosity is considered and the reductive perturbation technique is used to derive the Burgers equation. Applying a traveling wave transformation, the Burgers equation is transformed to a one-dimensional dynamical system. All possible vector fields corresponding to the dynamical system are presented. We have analyzed the dynamical system with the help of potential energy, which helps to identify the stability and instability of the equilibrium points. It is found that the viscous force acting on the cold mobile positron fluid is a source of dissipation and is responsible for the formation of the PASHWs. Furthermore, fully nonlinear arbitrary amplitude positron acoustic waves are also studied applying the theory of planar dynamical systems. It is also observed that the fundamental features of the small amplitude and arbitrary amplitude PASHWs are significantly affected by the physical parameters q_e, q_h, μ_e, μ_h, σ, η, and U. This work can be useful to understand the qualitative changes in the dynamics of nonlinear small amplitude and fully nonlinear arbitrary amplitude PASHWs in the solar wind, the ionosphere, the lower part of the magnetosphere, and auroral acceleration regions.
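
    For orientation, the monotonic shock solution of a generic Burgers equation ∂φ/∂τ + Aφ ∂φ/∂χ = C ∂²φ/∂χ² is the kink φ = (φ_m/2)[1 - tanh((χ - U₀τ)/W)] with amplitude φ_m = 2U₀/A and width W = 4C/(Aφ_m). The sketch below only evaluates this generic profile; the dependence of A and C on the plasma parameters q_e, q_h, μ_e, μ_h, σ and η derived in the paper is not reproduced, and the numbers used are placeholders.

```python
# Sketch: kink (shock) solution of a generic Burgers equation
#   d(phi)/d(tau) + A*phi*d(phi)/d(chi) = C*d2(phi)/d(chi)^2,
# namely phi = (phi_m/2)*[1 - tanh((chi - U0*tau)/W)] with phi_m = 2*U0/A and
# W = 4*C/(A*phi_m). A and C are placeholder numbers here, not the plasma
# expressions of the paper.
import numpy as np

def burgers_shock(chi, tau, U0, A, C):
    phi_m = 2.0 * U0 / A                     # shock amplitude
    W = 4.0 * C / (A * phi_m)                # shock width (= 2*C/U0)
    return 0.5 * phi_m * (1.0 - np.tanh((chi - U0 * tau) / W)), phi_m, W

chi = np.linspace(-20.0, 20.0, 9)
for C in (0.5, 1.0, 2.0):                    # stronger dissipation -> wider shock
    profile, phi_m, W = burgers_shock(chi, tau=0.0, U0=1.0, A=1.0, C=C)
    print(f"C = {C}: amplitude = {phi_m:.2f}, width = {W:.2f}, "
          f"phi(0) = {profile[len(chi)//2]:.2f}")
```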

  12. Avalanches and generalized memory associativity in a network model for conscious and unconscious mental functioning

    NASA Astrophysics Data System (ADS)

    Siddiqui, Maheen; Wedemann, Roseli S.; Jensen, Henrik Jeldtoft

    2018-01-01

    We explore statistical characteristics of avalanches associated with the dynamics of a complex-network model, where two modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's ideas regarding the neuroses and that consciousness is related to symbolic and linguistic memory activity in the brain. It incorporates the Stariolo-Tsallis generalization of the Boltzmann Machine in order to model memory retrieval and associativity. In the present work, we define and measure avalanche size distributions during memory retrieval, in order to gain insight regarding basic aspects of the functioning of these complex networks. The avalanche sizes defined for our model should be related to the time consumed and also to the size of the neuronal region which is activated during memory retrieval. This allows the qualitative comparison of the behaviour of the distribution of cluster sizes, obtained during fMRI measurements of the propagation of signals in the brain, with the distribution of avalanche sizes obtained in our simulation experiments. This comparison corroborates the indication that the Nonextensive Statistical Mechanics formalism may indeed be better suited to model the complex networks which constitute brain and mental structure.

  13. Using Non-Extension Volunteering as an Experiential Learning Activity for Extension Professionals

    ERIC Educational Resources Information Center

    Andrews, Kevin B.; Lockett, Landry L.

    2013-01-01

    Extension professionals can gain much-needed competencies in volunteer administration through experiential learning by participating in volunteer activities. Experiential learning is a means of behavior change that allows the individual learner to reflect on, abstract, and apply their experiences to new situations. This article expands on…

  14. Dynamic properties of small-scale solar wind plasma fluctuations.

    PubMed

    Riazantseva, M O; Budaev, V P; Zelenyi, L M; Zastenker, G N; Pavlos, G P; Safrankova, J; Nemecek, Z; Prech, L; Nemec, F

    2015-05-13

    The paper presents the latest results of studies of small-scale fluctuations in the turbulent flow of the solar wind (SW) using measurements with extremely high temporal resolution (up to 0.03 s) of the bright monitor of the solar wind (BMSW) plasma spectrometer operating on the astrophysical SPECTR-R spacecraft at distances up to 350,000 km from the Earth. The spectra of SW ion flux fluctuations in the range of scales between 0.03 and 100 s are systematically analysed. The difference of slopes in the low- and high-frequency parts of the spectra and the frequency of the break point between these two characteristic slopes were analysed for different conditions in the SW. The statistical properties of the SW ion flux fluctuations were thoroughly analysed on scales less than 10 s. A high level of intermittency is demonstrated. The extended self-similarity of the SW ion flux turbulent flow is constantly observed. The approximation of the non-Gaussian probability distribution function of ion flux fluctuations by the Tsallis statistics shows the non-extensive character of SW fluctuations. Statistical characteristics of ion flux fluctuations are compared with the predictions of a log-Poisson model. The log-Poisson parametrization of the structure function scaling has shown that well-defined filament-like plasma structures are, as a rule, observed in the turbulent SW flows. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
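
    As a pointer to how such non-extensive behaviour is typically quantified (a generic sketch, not the authors' pipeline), one can fit a q-Gaussian, p(x) ∝ [1 - (1 - q)βx²]^{1/(1-q)}, to the empirical PDF of increments of a signal; q > 1 flags heavier-than-Gaussian tails. Synthetic heavy-tailed data stand in for the ion-flux measurements below.

```python
# Sketch: fitting a q-Gaussian, p(x) = p0*[1 - (1 - q)*beta*x**2]**(1/(1 - q)),
# to the empirical PDF of normalized signal increments. Synthetic heavy-tailed
# data replace the BMSW ion-flux measurements; q > 1 indicates fat tails.
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, p0, beta, q):
    arg = 1.0 - (1.0 - q) * beta * x**2
    return p0 * np.maximum(arg, 1e-12)**(1.0 / (1.0 - q))

rng = np.random.default_rng(3)
signal = rng.standard_t(df=4, size=200_000)       # heavy-tailed surrogate signal
increments = np.diff(signal)
increments /= increments.std()

hist, edges = np.histogram(increments, bins=201, range=(-10, 10), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = hist > 0
popt, _ = curve_fit(q_gaussian, centers[mask], hist[mask],
                    p0=(0.4, 1.0, 1.3), maxfev=20_000)
print(f"fitted q = {popt[2]:.3f}, beta = {popt[1]:.3f}")
```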

  15. A Snapshot of Organizational Climate: Perceptions of Extension Faculty

    ERIC Educational Resources Information Center

    Tower, Leslie E.; Bowen, Elaine; Alkadry, Mohamad G.

    2011-01-01

    This article provides a snapshot of the perceptions of workplace climate of Extension faculty at a land-grant, research-high activity university, compared with the perceptions of non-Extension faculty at the same university. An online survey was conducted with a validated instrument. The response rate for university faculty was 44% (968); the…

  16. Generalized Entanglement Entropy and Holography

    NASA Astrophysics Data System (ADS)

    Obregón, O.

    2018-04-01

    A nonextensive statistical mechanics entropy that depends only on the probability distribution is proposed in the framework of superstatistics. It is based on a Γ(χ²) distribution that depends on β and also on p_l. The corresponding modified von Neumann entropy is constructed; it is shown that it can also be obtained from a generalized replica trick. We address the question whether the generalized entanglement entropy can play a role in the gauge/gravity duality. We pay attention to 2d CFTs and their gravity duals. The correction terms to the von Neumann entropy turn out to be more relevant than the usual UV ones (for c = 1) and also than those due to the area-dependent AdS3 entropy, which are comparable to the UV ones. Then the correction terms due to the new entropy would modify the Ryu-Takayanagi identification between the CFT entanglement entropy and the AdS entropy in a different manner than the UV ones or than the corrections to the AdS3 area-dependent entropy.

  17. Phase-space interference in extensive and nonextensive quantum heat engines

    NASA Astrophysics Data System (ADS)

    Hardal, Ali Ü. C.; Paternostro, Mauro; Müstecaplıoǧlu, Özgür E.

    2018-04-01

    Quantum interference is at the heart of what sets the quantum and classical worlds apart. We demonstrate that quantum interference effects involving a many-body working medium are responsible for genuinely nonclassical features in the performance of a quantum heat engine. The features with which quantum interference manifests itself in the work output of the engine depend strongly on the extensive nature of the working medium. While identifying the class of work substances that optimize the performance of the engine, our results shed light on the optimal size of such media of quantum workers to maximize the work output and efficiency of quantum energy machines.

  18. Impurity effects in highly frustrated diamond-lattice antiferromagnets

    NASA Astrophysics Data System (ADS)

    Savary, Lucile; Gull, Emanuel; Trebst, Simon; Alicea, Jason; Bergman, Doron; Balents, Leon

    2011-08-01

    We consider the effects of local impurities in highly frustrated diamond-lattice antiferromagnets, which exhibit large but nonextensive ground-state degeneracies. Such models are appropriate to many A-site magnetic spinels. We argue very generally that sufficiently dilute impurities induce an ordered magnetic ground state and provide a mechanism of degeneracy breaking. The states that are selected can be determined by a “swiss cheese model” analysis, which we demonstrate numerically for a particular impurity model in this case. Moreover, we present criteria for estimating the stability of the resulting ordered phase to a competing frozen (spin glass) one. The results may explain the contrasting finding of frozen and ordered ground states in CoAl2O4 and MnSc2S4, respectively.

  19. Impurity Effects in Highly Frustrated Diamond-Lattice Antiferromagnets

    NASA Astrophysics Data System (ADS)

    Savary, Lucile

    2012-02-01

    We consider the effects of local impurities in highly frustrated diamond lattice antiferromagnets, which exhibit large but non-extensive ground state degeneracies. Such models are appropriate to many A-site magnetic spinels. We argue very generally that sufficiently dilute impurities induce an ordered magnetic ground state, and provide a mechanism of degeneracy breaking. The states which are selected can be determined by a ``swiss cheese model'' analysis, which we demonstrate numerically for a particular impurity model in this case. Moreover, we present criteria for estimating the stability of the resulting ordered phase to a competing frozen (spin glass) one. The results may explain the contrasting finding of frozen and ordered ground states in CoAl2O4 and MnSc2S4, respectively.

  20. The United Nations Basic Space Science Initiative

    NASA Astrophysics Data System (ADS)

    Haubold, H. J.

    2006-08-01

    Pursuant to recommendations of the United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) and deliberations of the United Nations Committee on the Peaceful Uses of Outer Space (UNCOPUOS), annual UN/European Space Agency workshops on basic space science have been held around the world since 1991. These workshops contribute to the development of astrophysics and space science, particularly in developing nations. Following a process of prioritization, the workshops identified the following elements as particularly important for international cooperation in the field: (i) operation of astronomical telescope facilities implementing TRIPOD, (ii) virtual observatories, (iii) astrophysical data systems, (iv) concurrent design capabilities for the development of international space missions, and (v) theoretical astrophysics such as applications of nonextensive statistical mechanics. Beginning in 2005, the workshops focus on preparations for the International Heliophysical Year 2007 (IHY2007). The workshops continue to facilitate the establishment of astronomical telescope facilities as pursued by Japan and the development of low-cost, ground-based, world-wide instrument arrays as led by the IHY secretariat. Wamsteker, W., Albrecht, R. and Haubold, H.J.: Developing Basic Space Science World-Wide: A Decade of UN/ESA Workshops. Kluwer Academic Publishers, Dordrecht 2004. http://ihy2007.org http://www.unoosa.org/oosa/en/SAP/bss/ihy2007/index.html http://www.cbpf.br/GrupPesq/StatisticalPhys/biblio.htm

  1. Differentiation and concordance in smallholder land use strategies in southern Mexico's conservation frontier.

    PubMed

    Roy Chowdhury, Rinku

    2010-03-30

    Forest cover transitions in the developing tropics are conditioned by agricultural change. The expansion, intensification, and diversification of agricultural land uses are tied to regional economic/environmental regimes and decisions of local farming households. Land change science and agrarian systems research share an interest in the drivers of household strategies, land use impacts, and typologies of those land uses/drivers. This study derives a typology of farming households in southern Mexico based on emergent patterns in their land use combinations and analyzes their household and policy drivers. The results reveal broadly diversified household land use portfolios as well as three emergent clusters of farmstead production orientation: (i) extensive subsistence-oriented conservationists, (ii) dual extensive-intensive farmers, and (iii) nonextensive diversified land users. Household membership in these clusters is uneven and strongly related to tenancy, land endowments, wage labor, and policy subsidies. Although most households are following a nonextensive agricultural strategy incorporating off-farm incomes, the likelihood of a regional forest transition remains debatable because of the disproportionate deforestation impacts of the less common strategies. Conservation development policies in the region need to accommodate diverse smallholder farming rationales, increase off-farm opportunities, and target sustainable development with the assistance of community conservation leaders.

  2. Carnot cycle for interacting particles in the absence of thermal noise.

    PubMed

    Curado, Evaldo M F; Souza, Andre M C; Nobre, Fernando D; Andrade, Roberto F S

    2014-02-01

    A thermodynamic formalism is developed for a system of interacting particles under overdamped motion, which has been recently analyzed within the framework of nonextensive statistical mechanics. It amounts to expressing the interaction energy of the system in terms of a temperature θ, conjugated to a generalized entropy s_q, with q = 2. Since θ assumes much higher values than those of typical room temperatures T ≪ θ, the thermal noise can be neglected for this system (T/θ ≃ 0). This framework is now extended by the introduction of a work term δW which, together with the formerly defined heat contribution (δQ = θ ds_q), allows for the statement of a proper energy conservation law that is analogous to the first law of thermodynamics. These definitions lead to the derivation of an equation of state and to the characterization of s_q adiabatic and θ isothermic transformations. On this basis, a Carnot cycle is constructed, whose efficiency is shown to be η = 1 - (θ_2/θ_1), where θ_1 and θ_2 are the effective temperatures of the two isothermic transformations, with θ_1 > θ_2. The results for a generalized thermodynamic description of this system open the possibility for further physical consequences, like the realization of a thermal engine based on energy exchanges gauged by the temperature θ.

  3. Teacher Factors Associated with Innovative Curriculum Goals and Pedagogical Practices: Differences between Extensive and Non-Extensive ICT-Using Science Teachers

    ERIC Educational Resources Information Center

    Voogt, J.

    2010-01-01

    Second Information Technology in Education Study (SITES) 2006 was an international study about pedagogical practices and the use of information and communication technology (ICT) in math and science classrooms. One of the findings of SITES 2006 was that--across educational systems--a proportion of the math and science teachers in the 22 countries…

  4. The United Nations Basic Space Science Initiative (UNBSSI): A Historical Introduction

    NASA Astrophysics Data System (ADS)

    Haubold, H. J.

    2006-11-01

    Pursuant to recommendations of the Third United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) and deliberations of the United Nations Committee on the Peaceful Uses of Outer Space (UNCOPUOS), annual UN/European Space Agency workshops on basic space science have been held around the world since 1991. These workshops contributed to the development of astrophysics and space science, particularly in developing nations. Following a process of prioritization, the workshops identified the following elements as particularly important for international cooperation in the field: (i) operation of astronomical telescope facilities implementing TRIPOD, (ii) virtual observatories, (iii) astrophysical data systems, (iv) concurrent design capabilities for the development of international space missions, and (v) theoretical astrophysics such as applications of non-extensive statistical mechanics. Beginning in 2005, the workshops are focusing on preparations for the International Heliophysical Year 2007 (IHY2007). The workshops continue to facilitate the establishment of astronomical telescope facilities as pursued by Japan and the development of low-cost, ground-based, world-wide instrument arrays as led by the IHY secretariat. Wamsteker, W., Albrecht, R. and Haubold, H.J.: Developing Basic Space Science World-Wide: A Decade of UN/ESA Workshops. Kluwer Academic Publishers, Dordrecht 2004. http://ihy2007.org http://www.unoosa.org/oosa/en/SAP/bss/ihy2007/index.html http://www.cbpf.br/GrupPesq/StatisticalPhys/biblio.htm

  5. The United Nations Basic Space Science Initiative

    NASA Astrophysics Data System (ADS)

    Haubold, H. J.

    Pursuant to recommendations of the United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) and deliberations of the United Nations Committee on the Peaceful Uses of Outer Space (UNCOPUOS), annual UN/European Space Agency workshops on basic space science have been held around the world since 1991. These workshops contribute to the development of astrophysics and space science, particularly in developing nations. Following a process of prioritization, the workshops identified the following elements as particularly important for international cooperation in the field: (i) operation of astronomical telescope facilities implementing TRIPOD, (ii) virtual observatories, (iii) astrophysical data systems, (iv) concurrent design capabilities for the development of international space missions, and (v) theoretical astrophysics such as applications of nonextensive statistical mechanics. Beginning in 2005, the workshops focus on preparations for the International Heliophysical Year 2007 (IHY2007). The workshops continue to facilitate the establishment of astronomical telescope facilities as pursued by Japan and the development of low-cost, ground-based, world-wide instrument arrays as led by the IHY secretariat. Further information: Wamsteker, W., Albrecht, R. and Haubold, H.J.: Developing Basic Space Science World-Wide: A Decade of UN/ESA Workshops. Kluwer Academic Publishers, Dordrecht 2004. http://ihy2007.org http://www.oosa.unvienna.org/SAP/bss/ihy2007/index.html http://www.cbpf.br/GrupPesq/StatisticalPhys/biblio.htm

  6. Kappa Distribution in a Homogeneous Medium: Adiabatic Limit of a Super-diffusive Process?

    NASA Astrophysics Data System (ADS)

    Roth, I.

    2015-12-01

    Classical statistical theory predicts that an ergodic, weakly interacting system, like charged particles in the presence of electromagnetic fields performing Brownian motions (characterized by small-range deviations in phase space and short-term microscopic memory), converges to Gibbs-Boltzmann statistics. Observation of distributions with kappa power-law tails in homogeneous systems contradicts this prediction and necessitates a renewed analysis of the basic axioms of the diffusion process: the characteristics of the transition probability density function (pdf) for a single interaction, with the possibility of a non-Markovian process and non-local interactions. The non-local, Levy-walk deviation is related to the non-extensive statistical framework. Particles bouncing along a (solar) magnetic field with evolving pitch angles, phases and velocities, as they interact resonantly with waves, undergo energy changes at undetermined time intervals, satisfying these postulates. The dynamic evolution of a general continuous-time random walk is determined by the pdf of jumps and waiting times, resulting in a fractional Fokker-Planck equation with non-integer derivatives whose solution is given by a Fox H-function. The resulting procedure involves the known, although not frequently used in physics, fractional calculus, while the local, Markovian process recasts the evolution into the standard Fokker-Planck equation. Solution of the fractional Fokker-Planck equation with the help of the Mellin transform and evaluation of its residues at the poles of its Gamma functions results in a slowly converging sum with power laws. It is suggested that these tails form the kappa function. Gradual vs impulsive solar electron distributions serve as prototypes of this description.
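
    For reference, the isotropic kappa distribution commonly written as f(v) ∝ [1 + v²/(κθ²)]^{-(κ+1)} reduces to a Maxwellian as κ → ∞ and carries power-law tails for finite κ. The sketch below normalizes the speed distribution numerically and compares tail probabilities; it is a textbook-style illustration, not the fractional Fokker-Planck calculation described in the abstract.

```python
# Sketch: speed distribution for the isotropic kappa form
#   f(v) ∝ [1 + v**2/(kappa*theta**2)]**(-(kappa + 1))
# versus its Maxwellian limit, both normalized numerically on a uniform grid.
import numpy as np

def speed_pdf(shape_fn, v):
    pdf = 4.0 * np.pi * v**2 * shape_fn(v)
    return pdf / (pdf.sum() * (v[1] - v[0]))          # grid normalization

theta = 1.0
v = np.linspace(0.0, 20.0, 200_001)
dv = v[1] - v[0]
tail = v > 5.0 * theta

maxwell = speed_pdf(lambda u: np.exp(-u**2 / theta**2), v)
for kappa in (2.0, 4.0, 8.0):
    kap = speed_pdf(lambda u: (1.0 + u**2 / (kappa * theta**2))**(-(kappa + 1.0)), v)
    print(f"kappa = {kappa}: P(v > 5*theta) = {kap[tail].sum() * dv:.2e} "
          f"(Maxwellian: {maxwell[tail].sum() * dv:.2e})")
```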

  7. A non-extensive thermodynamic theory of ecological systems

    NASA Astrophysics Data System (ADS)

    Van Xuan, Le; Khac Ngoc, Nguyen; Lan, Nguyen Tri; Viet, Nguyen Ai

    2017-06-01

    After almost 30 years of development, it is no longer a controversial issue that the so-called Tsallis entropy provides a useful approach to studying complexity in systems where non-additivity is frequently met. In ecological research, Tsallis entropy, or in other words q-entropy, has established itself as a generalized approach to defining a range of diversity indices, including the Shannon-Wiener and Simpson indices. As a further stage of theoretical development, a thermodynamic theory based on Tsallis entropy or diversity indices has to be constructed for ecological systems to provide knowledge of their macroscopic behavior. The standard methods of theoretical physics are used in the manipulation, and establishing the equivalence between phenomenological thermodynamics and ecological quantities is the purpose of the ongoing research. The present work is in line with the authors' research programme of implementing the Tsallis non-extensivity approach to obtain the most important thermodynamic quantities of ecological systems, such as the internal energy U_q and temperature T_q, based on a modeled truncated Boltzmann distribution of the Whittaker plot for a dataset. These quantities have their own ecological meaning; in particular, the temperature T_q provides insight into the equilibrium condition among ecological systems, as is well known from the 0th law of thermodynamics.
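
    To connect the entropic form with the diversity indices mentioned above, the snippet below evaluates the Tsallis entropy S_q = (1 - Σ_i p_i^q)/(q - 1) on a made-up vector of relative species abundances; q → 1 recovers the Shannon-Wiener index and q = 2 gives the Gini-Simpson index. The quantities U_q and T_q of the paper are not computed here.

```python
# Sketch: Tsallis q-entropy S_q = (1 - sum_i p_i**q)/(q - 1) as a diversity
# index for a hypothetical species-abundance vector. q -> 1 gives the
# Shannon-Wiener index; q = 2 gives the Gini-Simpson index 1 - sum_i p_i**2.
import numpy as np

def tsallis_entropy(abundances, q):
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0] / p.sum()
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))                 # Shannon-Wiener limit
    return (1.0 - np.sum(p**q)) / (q - 1.0)

abundances = [120, 60, 30, 15, 8, 4, 2, 1]            # hypothetical counts
for q in (0.5, 1.0, 2.0, 3.0):
    print(f"q = {q}: S_q = {tsallis_entropy(abundances, q):.4f}")

p = np.array(abundances, dtype=float) / sum(abundances)
print("Gini-Simpson check (q = 2):", 1.0 - np.sum(p**2))
```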

  8. Thermodynamic framework for compact q-Gaussian distributions

    NASA Astrophysics Data System (ADS)

    Souza, Andre M. C.; Andrade, Roberto F. S.; Nobre, Fernando D.; Curado, Evaldo M. F.

    2018-02-01

    Recent works have associated systems of particles, characterized by short-range repulsive interactions and evolving under overdamped motion, to a nonlinear Fokker-Planck equation within the class of nonextensive statistical mechanics, with a nonlinear diffusion contribution whose exponent is given by ν = 2 - q. The particular case ν = 2 applies to interacting vortices in type-II superconductors, whereas ν > 2 covers systems of particles characterized by short-range power-law interactions, where correlations among particles are taken into account. In the former case, several studies presented a consistent thermodynamic framework based on the definition of an effective temperature θ (presenting experimental values much higher than typical room temperatures T, so that thermal noise could be neglected), conjugated to a generalized entropy s_ν (with ν = 2). Herein, the whole thermodynamic scheme is revisited and extended to systems of particles interacting repulsively, through short-ranged potentials, described by an entropy s_ν, with ν > 1, covering the ν = 2 (vortices in type-II superconductors) and ν > 2 (short-range power-law interactions) physical examples. One basic requirement concerns a cutoff in the equilibrium distribution P_eq(x), approached due to a confining external harmonic potential, ϕ(x) = αx²/2 (α > 0). The main results achieved are: (a) the definition of an effective temperature θ conjugated to the entropy s_ν; (b) the construction of a Carnot cycle, whose efficiency is shown to be η = 1 - (θ_2/θ_1), where θ_1 and θ_2 are the effective temperatures associated with two isothermal transformations, with θ_1 > θ_2; (c) thermodynamic potentials, Maxwell relations, and response functions. The present thermodynamic framework, for a system of interacting particles under the above-mentioned conditions, and associated to an entropy s_ν, with ν > 1, certainly enlarges the possibility of experimental verifications.

  9. Repulsive particles under a general external potential: Thermodynamics by neglecting thermal noise.

    PubMed

    Ribeiro, Mauricio S; Nobre, Fernando D

    2016-08-01

    A recent proposal of an effective temperature θ, conjugated to a generalized entropy s_{q}, typical of nonextensive statistical mechanics, has led to a consistent thermodynamic framework in the case q=2. The proposal was explored for repulsively interacting vortices, currently used for modeling type-II superconductors. In these systems, the variable θ presents values much higher than those of typical room temperatures T, so that the thermal noise can be neglected (T/θ≃0). The whole procedure was developed for an equilibrium state obtained after a sufficiently long-time evolution, associated with a nonlinear Fokker-Planck equation and approached due to a confining external harmonic potential, ϕ(x)=αx^{2}/2 (α>0). Herein, the thermodynamic framework is extended to a quite general confining potential, namely ϕ(x)=α|x|^{z}/z (z>1). It is shown that the main results of the previous analyses hold for any z>1: (i) The definition of the effective temperature θ conjugated to the entropy s_{2}. (ii) The construction of a Carnot cycle, whose efficiency is shown to be η=1-(θ_{2}/θ_{1}), where θ_{1} and θ_{2} are the effective temperatures associated with two isothermal transformations, with θ_{1}>θ_{2}. The special character of the Carnot cycle is indicated by analyzing another cycle that presents an efficiency depending on z. (iii) Applying Legendre transformations for a distinct pair of variables, different thermodynamic potentials are obtained, and furthermore, Maxwell relations and response functions are derived. The present approach shows a consistent thermodynamic framework, suggesting that these results should hold for a general confining potential ϕ(x), increasing the possibility of experimental verifications.

  10. On the origin of non-exponential fluorescence decays in enzyme-ligand complex

    NASA Astrophysics Data System (ADS)

    Wlodarczyk, Jakub; Kierdaszuk, Borys

    2004-05-01

    Complex fluorescence decays have usually been analyzed with the aid of a multi-exponential model, but the interpretation of the individual exponential terms has not been adequately characterized. In such cases the intensity decays were also analyzed in terms of a continuous lifetime distribution, as a consequence of an interaction of the fluorophore with its environment, conformational heterogeneity, or their dynamical nature. We show that the non-exponential fluorescence decay of enzyme-ligand complexes may result from time-dependent energy transport. The latter, in our opinion, may be accounted for by electron transport from the protein tyrosines to their neighbor residues. We introduce the time-dependent hopping rate in the form v(t) ~ (a + bt)^{-1}. This in turn leads to the luminescence decay function in the form I(t) = I_0 exp(-t/τ_1)(1 + lt/(γτ_2))^{-γ}. Such a decay function provides good fits to highly complex fluorescence decays. The power-like tail implies a time hierarchy in the energy-migration process due to the hierarchical energy-level structure. Moreover, such a power-like term is a manifestation of the so-called Tsallis nonextensive statistics and is suitable for the description of systems with long-range interactions, memory effects, as well as fluctuations of the characteristic fluorescence lifetime. The proposed decay function was applied in the analysis of fluorescence decays of a tyrosine protein, i.e. the enzyme purine nucleoside phosphorylase from E. coli in a complex with formycin A (an inhibitor) and orthophosphate (a co-substrate).
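
    To show the shape implied by the quoted decay law, the snippet below evaluates I(t) = I₀ exp(-t/τ₁)(1 + l·t/(γτ₂))^{-γ} with arbitrary parameter values (the symbol l is kept as it appears in the abstract and set to 1 here) and compares it with a single exponential of lifetime τ₁.

```python
# Sketch: the decay law quoted in the abstract,
#   I(t) = I0 * exp(-t/tau1) * (1 + l*t/(gamma*tau2))**(-gamma),
# with arbitrary parameter values, next to a single exponential of lifetime
# tau1. The power-like factor produces the long, non-exponential tail.
import numpy as np

def decay(t, I0=1.0, tau1=5.0, tau2=1.0, gamma=2.0, l=1.0):
    return I0 * np.exp(-t / tau1) * (1.0 + l * t / (gamma * tau2))**(-gamma)

t = np.linspace(0.0, 30.0, 7)
for ti, di, ei in zip(t, decay(t), np.exp(-t / 5.0)):
    print(f"t = {ti:5.1f}   I(t) = {di:.4e}   exp(-t/tau1) = {ei:.4e}")
```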

  11. Innovative techniques to analyze time series of geomagnetic activity indices

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos

    2016-04-01

    Magnetic storms are undoubtedly among the most important phenomena in space physics and also a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization. Other entropy measures such as Block Entropy, T-Complexity, Approximate Entropy, Sample Entropy and Fuzzy Entropy verify the above-mentioned result. Importantly, the wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, which is characterized by a fractional Brownian anti-persistent behavior. Finally, we observe universality in the magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for the Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support the aforementioned proposal.

  12. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of some of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are non-extensive, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data flow paradigm.

  13. Sensitivity to initial conditions in the Bak-Sneppen model of biological evolution

    NASA Astrophysics Data System (ADS)

    Tamarit, F. A.; Cannas, S. A.; Tsallis, C.

    1998-03-01

    We consider biological evolution as described within the Bak and Sneppen 1993 model. We exhibit, at the self-organized critical state, a power-law sensitivity to the initial conditions, calculate the associated exponent, and relate it to the recently introduced nonextensive thermostatistics. The scenario which emerges here without tuning is strongly reminiscent of the tuned onset of chaos in, say, logistic-like one-dimensional maps. We also calculate the dynamical exponent z.
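
    For readers unfamiliar with the model, a minimal Bak-Sneppen simulation (1993 rules: at every step the least-fit site and its two nearest neighbours on a ring receive fresh random fitnesses) is sketched below. The sensitivity-to-initial-conditions analysis of the paper is not reproduced; the quoted critical threshold is approximate.

```python
# Minimal Bak-Sneppen model on a ring: at each step the site with the lowest
# fitness and its two nearest neighbours get new random fitnesses in [0, 1).
# After a transient, most fitness values sit above the self-organized critical
# threshold (~0.667 for the 1-D model).
import numpy as np

def bak_sneppen(n_sites=512, n_steps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    fitness = rng.random(n_sites)
    min_history = np.empty(n_steps)
    for t in range(n_steps):
        i = np.argmin(fitness)
        min_history[t] = fitness[i]
        for j in (i - 1, i, (i + 1) % n_sites):       # periodic boundaries
            fitness[j] = rng.random()
    return fitness, min_history

fitness, mins = bak_sneppen()
print("95th percentile of late-time minimum fitness:", np.quantile(mins[-50_000:], 0.95))
print("fraction of sites with fitness above 0.66:   ", np.mean(fitness > 0.66))
```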

  14. Modeling statistics and kinetics of the natural aggregation structures and processes with the solution of generalized logistic equation

    NASA Astrophysics Data System (ADS)

    Maslov, Lev A.; Chebotarev, Vladimir I.

    2017-02-01

    The generalized logistic equation is proposed to model the kinetics and statistics of natural processes such as earthquakes, forest fires, floods, landslides, and many others. This equation has the form dN(A)/dA = s·(1 - N(A))·N(A)^q·A^{-α}, where q > 0, A > 0 is the size of an element of a structure, and α ≥ 0. The equation contains two exponents, α and q, taking into account two important properties of the elements of a system: their fractal geometry, and their ability to interact either to enhance or to damp the process of aggregation. The function N(A) can be understood as an approximation to the number of elements the size of which is less than A. The function dN(A)/dA, where N(A) is the general solution of this equation for q = 1, is a product of an increasing bounded function and a power-law function with stretched-exponential cut-off. The relation with Tsallis non-extensive statistics is demonstrated by solving the generalized logistic equation for q > 0. In the case 0 < q < 1 it models super-additive structures, while for q > 1 it models sub-additive structures. The Gutenberg-Richter (G-R) formula results from interpretation of empirical data as a straight line in the area of the stretched exponent with small α. The solution is applied to modeling the distribution of foreshocks and aftershocks in the regions of the Napa Valley 2014 and Sumatra 2004 earthquakes, fitting the observed data well, both qualitatively and quantitatively.
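
    A quick numerical look at how solutions of this equation behave (a sketch with arbitrary parameters and initial condition, not the authors' calibration to earthquake data) can be obtained with a standard ODE solver, as below.

```python
# Sketch: numerical solution of the generalized logistic equation
#   dN/dA = s * (1 - N) * N**q * A**(-alpha)
# for a few values of q, with arbitrary parameters s, alpha and N(A0).
import numpy as np
from scipy.integrate import solve_ivp

def rhs(A, N, s, q, alpha):
    return s * (1.0 - N) * np.maximum(N, 0.0)**q * A**(-alpha)

s, alpha, A0, N0 = 1.0, 0.5, 1.0, 1e-3
A_eval = np.geomspace(A0, 1e4, 6)
for q in (0.5, 1.0, 1.5):
    sol = solve_ivp(rhs, (A0, 1e4), [N0], args=(s, q, alpha),
                    t_eval=A_eval, rtol=1e-8, atol=1e-10)
    print(f"q = {q}: N at A = {np.round(A_eval, 1).tolist()}")
    print("        ", np.round(sol.y[0], 4).tolist())
```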

  15. Closer look at time averages of the logistic map at the edge of chaos

    NASA Astrophysics Data System (ADS)

    Tirnakli, Ugur; Tsallis, Constantino; Beck, Christian

    2009-05-01

    The probability distribution of sums of iterates of the logistic map at the edge of chaos has been recently shown [U. Tirnakli, Phys. Rev. E 75, 040106(R) (2007)] to be numerically consistent with a q-Gaussian, the distribution which, under appropriate constraints, maximizes the nonadditive entropy Sq, which is the basis of nonextensive statistical mechanics. This analysis was based on a study of the tails of the distribution. We now check the entire distribution, in particular its central part. This is important in view of a recent q-generalization of the central limit theorem, which states that for certain classes of strongly correlated random variables the rescaled sum approaches a q-Gaussian limit distribution. We numerically investigate, for the logistic map with a parameter in a small vicinity of the critical point, under which conditions there is convergence to a q-Gaussian both in the central region and in the tail region, and find a scaling law involving the Feigenbaum constant δ. Our results are consistent with a large body of already available analytical and numerical evidence that the edge of chaos is well described in terms of the entropy Sq and its associated concepts.
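
    A bare-bones version of the numerical experiment described above is sketched below: iterate x_{t+1} = 1 - a x_t² at a ≈ a_c ≈ 1.401155189 (the Feigenbaum point, quoted to limited precision), sum N consecutive iterates for many initial conditions, and standardize the sums. The careful choices of N, ensemble size and vicinity of a_c made in the paper are not reproduced.

```python
# Sketch: centred, rescaled sums of iterates of x_{t+1} = 1 - a*x_t**2 near the
# Feigenbaum (edge-of-chaos) point a_c ~ 1.401155189. Ensemble sizes here are
# far smaller than in the published analyses.
import numpy as np

A_C = 1.401155189                      # approximate edge-of-chaos parameter

def rescaled_sums(a, n_sum=2048, n_transient=1024, n_init=1000, seed=0):
    rng = np.random.default_rng(seed)
    y = np.empty(n_init)
    for k in range(n_init):
        x = rng.uniform(-1.0, 1.0)
        for _ in range(n_transient):   # discard transient
            x = 1.0 - a * x * x
        s = 0.0
        for _ in range(n_sum):
            x = 1.0 - a * x * x
            s += x
        y[k] = s
    return (y - y.mean()) / y.std()

y = rescaled_sums(A_C)
print("excess kurtosis of rescaled sums:", float((y**4).mean() - 3.0))
```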

  16. Distinguishing Functional DNA Words; A Method for Measuring Clustering Levels

    NASA Astrophysics Data System (ADS)

    Moghaddasi, Hanieh; Khalifeh, Khosrow; Darooneh, Amir Hossein

    2017-01-01

    Functional DNA sub-sequences and genome elements are spatially clustered throughout the genome, just as keywords are in literary texts. Therefore, some of the methods for ranking words in texts can also be used to compare different DNA sub-sequences. In analogy with literary texts, here we claim that the distribution of distances between successive sub-sequences (words) is q-exponential, which is the distribution function of non-extensive statistical mechanics. Thus the q-parameter can be used as a measure of word clustering levels. Here, we analyzed the distribution of distances between consecutive occurrences of the 16 possible dinucleotides in human chromosomes to obtain their corresponding q-parameters. We found that CG, a biologically important two-letter word owing to its methylation, has the highest clustering level. This finding shows the predictive ability of the method in biology. We also proposed that chromosome 18, with the largest value of the q-parameter for promoters of genes, is more sensitive to diet and lifestyle. We extended our study to compare the genomes of some selected organisms and concluded that the clustering level of CGs increases in higher evolutionary organisms compared to lower ones.
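
    The basic measurement is easy to reproduce in outline (the sequences and fitting procedure of the paper are more elaborate): locate successive occurrences of a two-letter word, collect the gaps between them, and fit a q-exponential survival function S(r) = [1 + (q - 1) r/r₀]^{1/(1-q)}. A random sequence is used below, so the fitted q is expected to be close to 1 and is not biologically meaningful.

```python
# Sketch: gaps between consecutive occurrences of a dinucleotide (here "CG")
# and a q-exponential fit, S(r) = [1 + (q - 1)*r/r0]**(1/(1 - q)), to their
# empirical survival function. A random sequence stands in for a chromosome.
import numpy as np
from scipy.optimize import curve_fit

def word_gaps(seq, word="CG"):
    pos = np.array([i for i in range(len(seq) - len(word) + 1)
                    if seq[i:i + len(word)] == word])
    return np.diff(pos)

def q_exp_survival(r, r0, q):
    base = np.maximum(1.0 + (q - 1.0) * r / r0, 1e-12)
    return base**(1.0 / (1.0 - q))

rng = np.random.default_rng(4)
seq = "".join(rng.choice(list("ACGT"), size=200_000))
gaps = word_gaps(seq)

r = np.sort(gaps)
survival = 1.0 - np.arange(1, len(r) + 1) / len(r)    # empirical S(r)
mask = survival > 0
popt, _ = curve_fit(q_exp_survival, r[mask], survival[mask],
                    p0=(gaps.mean(), 1.05), maxfev=20_000)
print(f"fitted r0 = {popt[0]:.1f}, q = {popt[1]:.3f} (random sequence, q near 1 expected)")
```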

  17. Information and complexity measures in the interface of a metal and a superconductor

    NASA Astrophysics Data System (ADS)

    Moustakidis, Ch. C.; Panos, C. P.

    2018-06-01

    Fisher information, Shannon information entropy and Statistical Complexity are calculated for the interface of a normal metal and a superconductor, as functions of the temperature for several materials. The order parameter Ψ(r) derived from the Ginzburg-Landau theory is used as an input, together with experimental values of the critical transition temperature Tc and the superconducting coherence length ξ0. Analytical expressions are obtained for the information and complexity measures. Thus Tc is directly related in a simple way to disorder and complexity. An analytical relation is found between the Fisher information and the energy profile of superconductivity, i.e. the ratio of the surface free energy to the bulk free energy. We verify that a simple relation holds between the Shannon and Fisher information, i.e. a decomposition of a global information quantity (Shannon) in terms of two local ones (Fisher information), previously derived and verified for atoms and molecules by Liu et al. Finally, we find analytical expressions for generalized information measures like the Tsallis entropy and Fisher information. We conclude that the proper value of the non-extensivity parameter is q ≃ 1, in agreement with previous work using a different model, where q ≃ 1.005.

  18. A RCT comparing lumbosacral orthosis to routine physical therapy on postural stability in patients with chronic low back pain.

    PubMed

    Azadinia, Fatemeh; Ebrahimi-Takamjani, Ismail; Kamyab, Mojtaba; Parnianpour, Mohamad; Asgari, Morteza

    2017-01-01

    Background: Poor balance performance and impaired postural control have been frequently reported in patients with low back pain. However, postural control is rarely monitored during the course of treatment even though poor postural control may contribute to chronicity and recurrence of symptoms. Therefore, the present study aimed at investigating the effect of a nonextensible lumbosacral orthosis (LSO) versus routine physical therapy on postural stability of patients with nonspecific chronic low back pain. Methods: This was a randomized controlled trial conducted between November 2015 and May 2016 at the outpatient physical therapy clinic of the School of Rehabilitation Sciences. Patients with nonspecific chronic low back pain aged 20 to 55 years were randomly allocated to the intervention and control groups. Both groups received 8 sessions of physical therapy twice weekly for 4 weeks. The intervention group received nonextensible LSO in addition to routine physical therapy. Pain intensity, functional disability, fear of movement/ (re)injury, and postural stability in 3 levels of postural difficulty were measured before and after 4 weeks of intervention. A 2×2×3 mixed model of analysis of variance (ANOVA) was used to determine the main and interactive effects of the 3 factors including group, time, and postural difficulty conditions for each variable of postural stability. Results: The LSO and control groups displayed significant improvement in postural stability at the most difficult postural task conditions (P-value for 95% area ellipse was 0.003; and for phase plane, the mean total velocity and standard deviation of velocity was <0.001). Both groups exhibited a decrease in pain intensity, Oswestry Disability Index, and Tampa Scale of Kinesiophobia after 4 weeks of intervention. A significant difference between groups was found only for functional disability, with greater improvement in the orthosis group (t = 3.60, P<0.001). Conclusion: Both routine physical therapy and LSO significantly improved clinical and postural stability outcomes immediately after 4 weeks of intervention. The orthosis group did not display superior outcomes, except for functional disability.

  19. A RCT comparing lumbosacral orthosis to routine physical therapy on postural stability in patients with chronic low back pain

    PubMed Central

    Azadinia, Fatemeh; Ebrahimi-Takamjani, Ismail; Kamyab, Mojtaba; Parnianpour, Mohamad; Asgari, Morteza

    2017-01-01

    Background: Poor balance performance and impaired postural control have been frequently reported in patients with low back pain. However, postural control is rarely monitored during the course of treatment even though poor postural control may contribute to chronicity and recurrence of symptoms. Therefore, the present study aimed at investigating the effect of a nonextensible lumbosacral orthosis (LSO) versus routine physical therapy on postural stability of patients with nonspecific chronic low back pain. Methods: This was a randomized controlled trial conducted between November 2015 and May 2016 at the outpatient physical therapy clinic of the School of Rehabilitation Sciences. Patients with nonspecific chronic low back pain aged 20 to 55 years were randomly allocated to the intervention and control groups. Both groups received 8 sessions of physical therapy twice weekly for 4 weeks. The intervention group received nonextensible LSO in addition to routine physical therapy. Pain intensity, functional disability, fear of movement/ (re)injury, and postural stability in 3 levels of postural difficulty were measured before and after 4 weeks of intervention. A 2×2×3 mixed model of analysis of variance (ANOVA) was used to determine the main and interactive effects of the 3 factors including group, time, and postural difficulty conditions for each variable of postural stability. Results: The LSO and control groups displayed significant improvement in postural stability at the most difficult postural task conditions (P-value for 95% area ellipse was 0.003; and for phase plane, the mean total velocity and standard deviation of velocity was <0.001). Both groups exhibited a decrease in pain intensity, Oswestry Disability Index, and Tampa Scale of Kinesiophobia after 4 weeks of intervention. A significant difference between groups was found only for functional disability, with greater improvement in the orthosis group (t = 3.60, P<0.001). Conclusion: Both routine physical therapy and LSO significantly improved clinical and postural stability outcomes immediately after 4 weeks of intervention. The orthosis group did not display superior outcomes, except for functional disability. PMID:29445655

  20. PREFACE: 4th International Workshop on Statistical Physics and Mathematics for Complex Systems (SPMCS)

    NASA Astrophysics Data System (ADS)

    Wang, Alexandre; Abe, Sumiyoshi; Li, Wei

    2015-04-01

    This volume contains 24 contributed papers presented at the 4th International Workshop on Statistical Physics and Mathematics for Complex Systems (SPMCS) held during October 12-16, 2014 in Yichang, China. Each paper was peer-reviewed by at least one referee chosen from a distinguished international panel. The previous three workshops of this series were organized in 2008, 2010, and 2012, in Le Mans, France, Wuhan, China, and Kazan, Russia, respectively. The SPMCS international workshop series is intended mainly to communicate and exchange research results and information on the fundamental challenges and questions in the vanguard of statistical physics, thermodynamics and mathematics for complex systems. More specifically, the topics of interest touch, but are not limited to, the following: • Fundamental aspects in the application of statistical physics and thermodynamics to complex systems and their modeling • Finite-size and non-extensive systems • Fluctuation theorems and equalities, quantum thermodynamics • Variational principle for random dynamics • Fractal geometry, fractional mathematics. More than 50 participants from 7 countries participated in SPMCS-2014, and 35 oral contributions were presented at the workshop. We would like to take this opportunity to thank the members of the Scientific Program Committee, many of whom acted as reviewers of the papers and responded promptly. We would also like to thank the organizing committee, the session chairs, the technicians and the students for the smooth running of the whole workshop. Thanks also go to China Three Gorges University, which provided generous support for the conference venue, as well as exquisite refreshments for the tea breaks. The workshop was also partially supported by Central China Normal University and the Programme of Introducing Talents of Discipline to Universities under grant NO. B08033. Special thanks are due to Ms Juy Zhu who has done excellent editing work with great effort.

  1. Cell biology perspectives in phage biology.

    PubMed

    Ansaldi, Mireille

    2012-01-01

    Cellular biology has long been restricted to large cellular organisms. However, as the resolution of microscopic methods increased, it became possible to study smaller cells, in particular bacterial cells. Bacteriophage biology is one aspect of bacterial cell biology that has recently gained insight from cell biology. Despite their small size, bacteriophages could be successfully labeled and their cycle studied in the host cells. This review aims to put together, although non-extensively, several cell biology studies that recently pushed the elucidation of key mechanisms in phage biology, such as the lysis-lysogeny decision in temperate phages or genome replication and transcription, one step further.

  2. A maximum (non-extensive) entropy approach to equity options bid-ask spread

    NASA Astrophysics Data System (ADS)

    Tapiero, Oren J.

    2013-07-01

    The cross-section of options bid-ask spreads with their strikes is modelled by maximising the Kaniadakis entropy. A theoretical model results in which the bid-ask spread depends explicitly on the implied volatility, the probability of expiring at-the-money, and an asymmetric information parameter (κ). Considering AIG as a test case for the period between January 2006 and October 2008, we find that information flows uniquely from the trading activity in the underlying asset to its derivatives, suggesting that κ is possibly an option-implied measure of the current state of trading liquidity in the underlying asset.
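
    For orientation, the deformed exponential underlying the Kaniadakis entropy is exp_κ(x) = (√(1 + κ²x²) + κx)^{1/κ}, which tends to the ordinary exponential as κ → 0 and decays as a power law for large negative arguments. The snippet below checks this numerically; it is not the paper's spread model.

```python
# Sketch: the Kaniadakis kappa-exponential,
#   exp_kappa(x) = (sqrt(1 + kappa**2 * x**2) + kappa*x)**(1/kappa),
# which reduces to exp(x) as kappa -> 0 and has power-law tails for kappa > 0.
import numpy as np

def exp_kappa(x, kappa):
    if np.isclose(kappa, 0.0):
        return np.exp(x)
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x)**(1.0 / kappa)

x = np.array([-50.0, -10.0, -2.0, 0.0, 2.0])
for kappa in (0.0, 0.3, 0.6):
    print(f"kappa = {kappa}:", np.array2string(exp_kappa(x, kappa), precision=6))
```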

  3. The effects of variable dust size and charge on dust acoustic waves propagating in a hybrid Cairns–Tsallis complex plasma

    NASA Astrophysics Data System (ADS)

    El-Taibany, W. F.; El-Siragy, N. M.; Behery, E. E.; Elbendary, A. A.; Taha, R. M.

    2018-05-01

    The propagation characteristics of dust acoustic waves (DAWs) in a dusty plasma consisting of variable-size dust grains, hybrid Cairns-Tsallis-distributed electrons, and nonthermal ions are studied. The charging of the dust grains is described by the orbital-motion-limited theory and the size of the dust grains obeys a power-law dust size distribution. To describe the nonlinear propagation of the DAWs, a Zakharov-Kuznetsov equation is derived using a reductive perturbation method. It is found that the nonthermal and nonextensive parameters influence the main properties of DAWs. Moreover, our results reveal that rarefactive waves can propagate mainly in the proposed plasma model, while compressive waves can be detected only for a very small range of the distribution parameters of the plasma species, and the DAWs are faster and wider for smaller dust grains. Applications of the present results to dusty plasma observations are briefly discussed.

  4. Role of dimensionality in preferential attachment growth in the Bianconi-Barabási model

    NASA Astrophysics Data System (ADS)

    Nunes, Thiago C.; Brito, Samurai; da Silva, Luciano R.; Tsallis, Constantino

    2017-09-01

    Scale-free networks are quite popular nowadays since many systems are well represented by such structures. In order to study these systems, several models were proposed. However, most of them do not take into account the node-to-node Euclidean distance, i.e. the geographical distance. In real networks, the distance between sites can be very relevant, e.g. in those cases where it is intended to minimize costs. Within this scenario we studied the role of dimensionality d in the Bianconi-Barabási model with a preferential attachment growth involving Euclidean distances. The preferential attachment in this model follows the rule Π_i ∝ η_i k_i / r_ij^{α_A} (1 ≤ i < j; α_A ≥ 0), where η_i characterizes the fitness of the ith site and is randomly chosen within the (0, 1] interval. We verified that the degree distribution P(k) for dimensions d = 1, 2, 3, 4 is well fitted by P(k) ∝ e_q^{-k/κ}, where e_q^{-k/κ} is the q-exponential function naturally emerging within nonextensive statistical mechanics. We determine the indices q and κ as functions of the quantities α_A and d, and numerically verify that both present a universal behavior with respect to the scaled variable α_A/d. The same behavior is also displayed by the dynamical exponent β which characterizes the steadily growing number of links of a given site.
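
    A compact simulation of this type of growth (a sketch only: each new node, dropped uniformly in the unit d-cube, attaches with a single link to an existing node i with probability ∝ η_i k_i / r_i^{α_A}; the geometric placement rule and the number of links per node used in the paper may differ) is given below.

```python
# Sketch of fitness-plus-distance preferential attachment in d dimensions:
# a new node j attaches to one existing node i with probability proportional
# to eta_i * k_i / r_ij**alpha_A. Uniform positions in the unit d-cube and one
# link per new node are simplifying assumptions.
import numpy as np

def grow_network(n_nodes=5000, d=2, alpha_A=1.0, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.random((n_nodes, d))
    eta = rng.random(n_nodes)                 # fitness, roughly in (0, 1]
    k = np.zeros(n_nodes)
    k[0] = k[1] = 1.0                         # seed: nodes 0 and 1 linked
    for j in range(2, n_nodes):
        r = np.linalg.norm(pos[:j] - pos[j], axis=1)
        w = eta[:j] * k[:j] / np.maximum(r, 1e-6)**alpha_A
        i = rng.choice(j, p=w / w.sum())
        k[i] += 1.0
        k[j] = 1.0
    return k

k = grow_network().astype(int)
degrees, counts = np.unique(k, return_counts=True)
print("lowest degrees (degree: count):", dict(zip(degrees[:8].tolist(), counts[:8].tolist())))
print("maximum degree:", degrees.max())
```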

  5. Fracture and earthquake physics in a non extensive view

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.

    2009-04-01

    It is well known that the Gutenberg-Richter (G-R) power-law distribution has to be modified for large seismic moments because of energy conservation and geometrical reasons. Several models have been proposed, either in terms of a second power law with a larger b value beyond a crossover magnitude, or based on a magnitude cut-off using an exponential taper. In the present work we point out that the non-extensivity viewpoint is applicable to seismic processes. In the frame of a non-extensive approach which is based on Tsallis entropy we construct a generalized expression of the Gutenberg-Richter (GGR) law. The existence of a lower or/and upper bound to magnitude is discussed and the conditions under which GGR leads to the classical GR law are analysed. For the lowest earthquake sizes (i.e., energy levels) the correlations between the different parts of the elements involved in the evolution of an earthquake are short-ranged and GR can be deduced on the basis of the maximum entropy principle using BG statistics. As the size (i.e., energy) increases, long-range correlation becomes much more important, implying the necessity of using Tsallis entropy as an appropriate generalization of BG entropy. The power-law behaviour is derived as a special case, leading to b-values being functions of the non-extensivity parameter q. Furthermore, a theoretical analysis of the similarities presented by stress-stimulated electric and acoustic emissions and earthquakes is discussed, not only in the frame of GGR but also taking into account a universality in the description of the interevent-time distribution. Its particular form can be well expressed in the frame of a non-extensive approach. This formulation is very different from the exponential distribution expected for simple random Poisson processes and indicates the existence of a nontrivial universal mechanism in the generation process. All the aforementioned similarities between stress-stimulated electrical and acoustic emissions and seismicity suggest a connection with fracture phenomena at much larger scales, implying that a basic general mechanism is "actively hidden" behind all these phenomena. Acknowledgements: This work is partially supported by the Greek General Secretariat of Research and Technology in the frame of Crete Regional Project 2000-2006 (M1.2): "TALOS: An integrated system of seismic hazard monitoring and management in the front of the Hellenic Arc", CRETE PEP 7 (KP 7). [1] F. Vallianatos and A. Tzanis, Phys. Chem. Earth 23, 933 (1998). [2] A. Tzanis and F. Vallianatos, Natural Hazards and Earth Syst. Sciences 3 (2003). [3] F. Vallianatos, D. Triantis, A. Tzanis, C. Anastasiadis and I. Stavrakas, Phys. Chem. Earth 29, 339 (2004). [4] C. Anastasiadis, D. Triantis, I. Stavrakas and F. Vallianatos, Ann. Geophys. 47, 21 (2004). [5] F. Vallianatos, Proc. 2nd WSEAS Int. Conference on Seismology (Cambridge, UK, 2008). [6] F. Vallianatos, Natural Hazards and Earth Syst. Sciences (2009).
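
    The q-exponential form that typically arises from this kind of maximization, and that is used for interevent-time distributions in this context, can be written and normalized numerically as follows (a generic illustration with arbitrary scale, not the specific GGR expressions of the abstract); for q = 1 it reduces to the ordinary exponential, and for q > 1 it develops a power-law tail.

```python
# Sketch: the q-exponential density p(x) ∝ [1 - (1 - q)*x/X0]_+**(1/(1 - q)),
# x >= 0, normalized on a truncated grid, showing how the tail fattens as q
# grows above 1 (q = 1 is the ordinary exponential).
import numpy as np

def q_exponential_pdf(x, X0, q):
    if np.isclose(q, 1.0):
        shape = np.exp(-x / X0)
    else:
        shape = np.maximum(1.0 - (1.0 - q) * x / X0, 0.0)**(1.0 / (1.0 - q))
    return shape / (shape.sum() * (x[1] - x[0]))      # grid normalization

x = np.linspace(0.0, 200.0, 400_001)
dx = x[1] - x[0]
for q in (1.0, 1.2, 1.5):
    p = q_exponential_pdf(x, X0=1.0, q=q)
    print(f"q = {q}: P(X > 10) = {p[x > 10].sum() * dx:.3e}")
```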

  6. Fundamental properties of fracture and seismicity in a non extensive statistical physics framework.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2010-05-01

    A fundamental challenge in many scientific disciplines concerns upscaling, that is, determining the regularities and laws of evolution at some large scale from those known at a lower scale. Earthquake physics is no exception, with the challenge of understanding the transition from the laboratory scale to the scale of fault networks and large earthquakes. In this context, statistical physics has a remarkably successful track record in addressing the upscaling problem in physics. It is natural then to consider that the physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, and in this sense we can consider the use of statistical physics not only appropriate but necessary to understand the collective properties of earthquakes [see Corral 2004, 2005a,b,c]. A significant attempt is given in a series of works [Main 1996; Rundle et al., 1997; Main et al., 2000; Main and Al-Kindy, 2002; Rundle et al., 2003; Vallianatos and Triantis, 2008a] that uses classical statistical physics to describe seismicity. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the fracture level to the seismicity scale? The application of non extensive statistical physics offers a consistent theoretical framework, based on a generalization of entropy, to analyze the behavior of natural systems with fractal or multi-fractal distribution of their elements. Such natural systems, where long-range interactions or intermittency are important, lead to power law behavior. This contrasts with the classical thermodynamic approach to natural systems that rapidly attain equilibrium, which leads to exponential-law behavior. In the frame of the non extensive statistical physics approach, the probability function p(X) is calculated using the maximum entropy formulation of Tsallis entropy, which involves the introduction of at least two constraints (Tsallis et al., 1998). The first one is the classical normalization of p(X). The second one is based on the definition of the expectation value, which has to be generalized to the "q-expectation value", according to the generalization of the entropy [Abe and Suzuki, 2003]. In order to calculate p(X) we apply the technique of Lagrange multipliers, maximizing an appropriate functional and leading to maximization of the Tsallis entropy under the constraints on the normalization and the q-expectation value. It is well known that the Gutenberg-Richter (G-R) power law distribution has to be modified for large seismic moments because of energy conservation and geometrical reasons. Several models have been proposed, either in terms of a second power law with a larger b value beyond a crossover magnitude, or based on a magnitude cut-off using an exponential taper. In the present work we point out that the non extensivity viewpoint is applicable to seismic processes. In the frame of a non-extensive approach which is based on Tsallis entropy we construct a generalized expression of the Gutenberg-Richter (GGR) law [Vallianatos, 2008]. The existence of lower and/or upper bounds to magnitude is discussed and the conditions under which GGR leads to the classical GR law are analysed. For the lowest earthquake sizes (i.e., energy levels) the correlations between the different elements involved in the evolution of an earthquake are short-ranged and GR can be deduced on the basis of the maximum entropy principle using BG statistics. As the size (i.e., energy) increases, long-range correlations become much more important, implying the necessity of using Tsallis entropy as an appropriate generalization of BG entropy. The power law behaviour is derived as a special case, leading to b-values being functions of the non-extensivity parameter q. Furthermore, a theoretical analysis of the similarities presented in stress-stimulated electric and acoustic emissions and earthquakes is discussed, not only in the frame of GGR but also taking into account a universality in the description of the interevent times distribution. Its particular form can be well expressed in the frame of a non extensive approach. This formulation is very different from an exponential distribution expected for simple random Poisson processes and indicates the existence of a nontrivial universal mechanism in the generation process. All the aforementioned similarities within stress-stimulated electrical and acoustic emissions and seismicity suggest a connection with fracture phenomena at much larger scales, implying that a basic general mechanism is "actively hidden" behind all these phenomena [Vallianatos and Triantis, 2008b]. Examples from S. Aegean seismicity are given. Acknowledgements: This work is partially supported by the "NEXT EARTH" project FP7-PEOPLE, 2009-2011 References Abe S. and Suzuki N., J. Geophys. Res. 108 (B2), 2113, 2003. Corral A., Phys. Rev. Lett. 92, 108501, 2004. Corral A., Nonlinear Proc. Geophys. 12, 89, 2005a. Corral A., Phys. Rev. E 71, 017101, 2005b. Corral A., Phys. Rev. Lett. 95, 028501, 2005c. Main I. G., Rev. of Geoph., 34, 433, 1996. Main I. G., O'Brien G. and Henderson R., J. Geoph. Res., 105, 6105, 2000. Main I. G. and Al-Kindy F. H., Geoph. Res. Let., 29, 7, 2002. Rundle J. B., Gross S., Klein W., Fergunson C. and Turcotte D., Tectonophysics, 277, 147-164, 1997. Rundle J. B., Turcotte D. L., Shcherbakov R., Klein W. and Sammis C., Rev. Geophys. 41, 1019, 2003. Tsallis C., J. Stat. Phys. 52, 479, 1988; See also http://tsallis.cat.cbpf.br/biblio.htm for an updated bibliography. Vallianatos, F., 2nd IASME/WSEAS International Conference on Geology and Seismology (GES08), Cambridge, U.K, 2008. Vallianatos F. and Triantis D., Physica A, 387, 4940-4946, 2008a.
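    The abstracts above do not spell out the closed form of the generalized Gutenberg-Richter law, so the following is only a generic sketch, under stated assumptions, of how a q-exponential energy distribution obtained by maximizing Tsallis entropy develops a GR-like power-law tail whose slope is controlled by q. The symbols q, beta_q and the energy range are illustrative placeholders.

```python
import numpy as np

def q_exp(x, q):
    """q-exponential e_q(x); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(x)
    return np.power(np.clip(1.0 + (1.0 - q) * x, 0.0, None), 1.0 / (1.0 - q))

# Maximizing Tsallis entropy under a normalization and a q-expectation constraint
# on the energy E gives a q-exponential distribution p(E) ~ e_q(-beta_q * E).
q, beta_q = 1.5, 1e-3
E = np.logspace(0, 6, 200)          # relative seismic energy (arbitrary units)
p = q_exp(-beta_q * E, q)

# For E >> 1/beta_q the distribution decays as E^(-1/(q-1)), i.e. a GR-like
# power law whose slope (and hence an effective b-value) is set by q.
tail_slope = np.polyfit(np.log10(E[-50:]), np.log10(p[-50:]), 1)[0]
print("asymptotic log-log slope:", tail_slope, " expected:", -1.0 / (q - 1.0))
```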

  7. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    NASA Astrophysics Data System (ADS)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper operation. This is a very broad problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose to use an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources and queuing theory. The analytical results are related to a practical experiment, showing interesting and valuable results.
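    The paper's modified stretched-exponential model is not given in the abstract; as a labeled assumption, the sketch below fits only a plain stretched exponential (Kohlrausch/Weibull-type) decay, which is the baseline such models generalize. The data and parameter names are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def stretched_exp(t, A, tau, beta):
    """Plain stretched exponential A*exp(-(t/tau)^beta); beta < 1 is often read as a
    signature of long-range dependence in the underlying process."""
    return A * np.exp(-np.power(t / tau, beta))

# Synthetic decay curve standing in for, e.g., a cache hit-rate relaxation.
t = np.linspace(0.1, 100.0, 300)
rng = np.random.default_rng(0)
y = stretched_exp(t, 1.0, 12.0, 0.6) + 0.01 * rng.normal(size=t.size)

(A, tau, beta), _ = curve_fit(stretched_exp, t, y, p0=(1.0, 10.0, 0.8))
print(f"A={A:.2f}, tau={tau:.1f}, beta={beta:.2f}")
```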

  8. Pattern formation in nonextensive thermodynamics: selection criterion based on the Renyi entropy production.

    PubMed

    Cybulski, Olgierd; Matysiak, Daniel; Babin, Volodymyr; Holyst, Robert

    2005-05-01

    We analyze a system of two different types of Brownian particles confined in a cubic box with periodic boundary conditions. Particles of different types annihilate when they come into close contact. The annihilation rate is matched by the birth rate; thus the total number of particles of each kind is conserved. When in a stationary state, the system is divided by an interface into two subregions, each occupied by one type of particle. All possible stationary states correspond to the Laplacian eigenfunctions. We show that the system evolves towards those stationary distributions of particles which minimize the Renyi entropy production. In all cases, the Renyi entropy production decreases monotonically during the evolution despite the fact that the topology and geometry of the interface exhibit abrupt and violent changes.

  9. PIC simulation of compressive and rarefactive dust ion-acoustic solitary waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhong-Zheng; Zhang, Heng; Hong, Xue-Ren

    The nonlinear propagation of dust ion-acoustic solitary waves in a collisionless four-component unmagnetized dusty plasma system containing nonextensive electrons, inertial negative ions, Maxwellian positive ions, and negatively charged static dust grains has been investigated by the particle-in-cell method. By comparing the simulation results with those obtained from the traditional reductive perturbation method, it is observed that the rarefactive KdV solitons propagate stably at low amplitude, and when the amplitude is increased, the initial waveform evolves and then gradually breaks into several small-amplitude solitary waves near the tail of the soliton structure. The compressive KdV solitons propagate unstably, and oscillation arises near the tail of the soliton structure. The finite-amplitude rarefactive and compressive Gardner solitons seem to propagate stably.
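    For orientation, the single-soliton solution of the standard KdV equation u_t + 6 u u_x + u_xxx = 0 has the familiar sech^2 profile used as the initial condition in such comparisons. The sketch below evaluates only that textbook profile; the plasma-specific coefficients and normalizations of the paper are not reproduced and the amplitude is an arbitrary choice.

```python
import numpy as np

def kdv_soliton(x, t, amplitude, delta=0.0):
    """Single soliton u = A * sech^2[(x - v*t - delta)/W] of u_t + 6 u u_x + u_xxx = 0,
    for which the speed is v = 2*A and the width is W = sqrt(2/A)."""
    v = 2.0 * amplitude
    width = np.sqrt(2.0 / amplitude)
    return amplitude / np.cosh((x - v * t - delta) / width) ** 2

x = np.linspace(-40.0, 40.0, 800)
u0 = kdv_soliton(x, t=0.0, amplitude=0.5)
print(u0.max())   # peak equals the chosen amplitude
```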

  10. The limit behavior of the evolution of the Tsallis entropy in self-gravitating systems

    NASA Astrophysics Data System (ADS)

    Zheng, Yahui; Du, Jiulin; Liang, Faku

    2017-06-01

    In this letter, we study the limit behavior of the evolution of the Tsallis entropy in self-gravitating systems. The study is carried out under two different situations, leading to the same conclusion. Whether in the energy transfer process or in the mass transfer process inside the system, when the nonextensive parameter q is greater than unity the total entropy is bounded; on the contrary, when this parameter is less than unity the total entropy is unbounded. Both theory and observation indicate that q is always greater than unity. So the Tsallis entropy in self-gravitating systems generally exhibits a bounded property. This indicates the existence of a global maximum of the Tsallis entropy. It is possible for self-gravitating systems to evolve to thermodynamically stable states.

  11. Optimal percolation on multiplex networks.

    PubMed

    Osat, Saeed; Faqeeh, Ali; Radicchi, Filippo

    2017-11-16

    Optimal percolation is the problem of finding the minimal set of nodes whose removal from a network fragments the system into non-extensive disconnected clusters. The solution to this problem is important for strategies of immunization in disease spreading, and influence maximization in opinion dynamics. Optimal percolation has received considerable attention in the context of isolated networks. However, its generalization to multiplex networks has not yet been considered. Here we show that approximating the solution of the optimal percolation problem on a multiplex network with solutions valid for single-layer networks extracted from the multiplex may have serious consequences in the characterization of the true robustness of the system. We reach this conclusion by extending many of the methods for finding approximate solutions of the optimal percolation problem from single-layer to multiplex networks, and performing a systematic analysis on synthetic and real-world multiplex networks.
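    As a toy illustration of the percolation problem on a single-layer network (not the multiplex algorithms studied in the paper), the sketch below applies a simple adaptive highest-degree removal heuristic and tracks the surviving giant component. It assumes networkx is available; the graph model and removal budget are placeholders.

```python
import networkx as nx

def giant_component_fraction(G):
    """Fraction of the remaining nodes that belong to the largest connected cluster."""
    if G.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(G)) / G.number_of_nodes()

def adaptive_degree_attack(G, budget):
    """Greedy heuristic: repeatedly remove the current highest-degree node."""
    G = G.copy()
    removed = []
    for _ in range(budget):
        node = max(G.degree, key=lambda nd: nd[1])[0]
        G.remove_node(node)
        removed.append(node)
    return removed, giant_component_fraction(G)

G = nx.barabasi_albert_graph(2000, 3, seed=1)
removed, frac = adaptive_degree_attack(G, budget=200)
print(f"giant component fraction after removing {len(removed)} nodes: {frac:.3f}")
```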

  12. An experimental approach to non - extensive statistical physics and Epidemic Type Aftershock Sequence (ETAS) modeling. The case of triaxially deformed sandstones using acoustic emissions.

    NASA Astrophysics Data System (ADS)

    Stavrianaki, K.; Vallianatos, F.; Sammonds, P. R.; Ross, G. J.

    2014-12-01

    Fracturing is the most prevalent deformation mechanism in rocks deformed in the laboratory under simulated upper crustal conditions. Fracturing produces acoustic emissions (AE) at the laboratory scale and earthquakes on a crustal scale. The AE technique provides a means to analyse microcracking activity inside the rock volume and, since experiments can be performed under confining pressure to simulate depth of burial, AE can be used as a proxy for natural processes such as earthquakes. Experimental rock deformation provides us with several ways to investigate time-dependent brittle deformation. Two main types of experiments can be distinguished: (1) "constant strain rate" experiments in which stress varies as a result of deformation, and (2) "creep" experiments in which deformation and deformation rate vary over time as a result of an imposed constant stress. We conducted constant strain rate experiments on air-dried Darley Dale sandstone samples at a variety of confining pressures (30 MPa, 50 MPa, 80 MPa) and on water-saturated samples with 20 MPa initial pore fluid pressure. The results from these experiments were used to determine the initial loading in the creep experiments. A non-extensive statistical physics approach was applied to the AE data in order to investigate the spatio-temporal pattern of cracks close to failure. A more detailed study was performed for the data from the creep experiments. When axial strain is plotted against time we obtain the trimodal creep curve. Calculation of the Tsallis entropic index q is performed for each stage of the curve and the results are compared with those from the constant strain rate experiments. The Epidemic Type Aftershock Sequence (ETAS) model is also applied to each stage of the creep curve and the ETAS parameters are calculated. We investigate whether these parameters are constant across all stages of the curve, or whether there are interesting patterns of variation. This research has been co-funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project of the "Education & Lifelong Learning" Operational Programme.
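    The ETAS model referred to above is usually specified through its temporal conditional intensity; the sketch below implements that standard form (background rate plus Omori-Utsu aftershock terms). The parameter values and the toy catalogue are placeholders, not the values estimated from the experiments.

```python
import numpy as np

def etas_intensity(t, event_times, event_mags, mu, K, alpha, c, p, m0):
    """Temporal ETAS conditional intensity:
    lambda(t) = mu + sum_{t_i < t} K * exp(alpha*(M_i - m0)) / (t - t_i + c)**p
    where mu is the background rate and the sum runs over past events."""
    t_i = np.asarray(event_times, dtype=float)
    m_i = np.asarray(event_mags, dtype=float)
    past = t_i < t
    contrib = K * np.exp(alpha * (m_i[past] - m0)) / (t - t_i[past] + c) ** p
    return mu + contrib.sum()

# Illustrative catalogue of (time, magnitude) pairs, e.g. AE event times and amplitudes.
times = [0.5, 1.2, 1.3, 2.7]
mags = [3.1, 4.0, 3.3, 3.6]
print(etas_intensity(3.0, times, mags, mu=0.1, K=0.05, alpha=1.5, c=0.01, p=1.1, m0=3.0))
```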

  13. Financial market dynamics: superdiffusive or not?

    NASA Astrophysics Data System (ADS)

    Devi, Sandhya

    2017-08-01

    The behavior of stock market returns over a period of 1-60 d has been investigated for S&P 500 and Nasdaq within the framework of nonextensive Tsallis statistics. Even for such long terms, the distributions of the returns are non-Gaussian. They have fat tails indicating that the stock returns do not follow a random walk model. In this work, a good fit to a Tsallis q-Gaussian distribution is obtained for the distributions of all the returns using the method of Maximum Likelihood Estimate. For all the regions of data considered, the values of the scaling parameter q, estimated from 1 d returns, lie in the range 1.4-1.65. The estimated inverse mean square deviations (beta) show a power law behavior in time with exponent values between -0.91 and -1.1 indicating normal to mildly subdiffusive behavior. Quite often, the dynamics of market return distributions is modelled by a Fokker-Planck (FP) equation either with a linear drift and a nonlinear diffusion term or with just a nonlinear diffusion term. Both of these cases support a q-Gaussian distribution as a solution. The distributions obtained from current estimated parameters are compared with the solutions of the FP equations. For negligible drift term, the inverse mean square deviations (betaFP) from the FP model follow a power law with exponent values between -1.25 and -1.48 indicating superdiffusion. When the drift term is non-negligible, the corresponding betaFP do not follow a power law and become stationary after certain characteristic times that depend on the values of the drift parameter and q. Neither of these behaviors is supported by the results of the empirical fit.
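    A sketch of a maximum-likelihood fit of a q-Gaussian, consistent with the method named above but not the authors' code: it uses the standard Tsallis normalization valid for 1 < q < 3, scipy for the optimization, and Student-t synthetic "returns" as a stand-in for the actual data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def q_gaussian_logpdf(x, q, beta):
    """log of the normalized q-Gaussian for 1 < q < 3:
    p(x) = C(q, beta) * [1 + (q-1)*beta*x^2]^(-1/(q-1))."""
    logC = 0.5 * np.log(beta * (q - 1.0) / np.pi) \
         + gammaln(1.0 / (q - 1.0)) - gammaln((3.0 - q) / (2.0 * (q - 1.0)))
    return logC - np.log1p((q - 1.0) * beta * x**2) / (q - 1.0)

def fit_q_gaussian(returns):
    """Maximize the log-likelihood over (q, beta) within their admissible ranges."""
    nll = lambda th: -np.sum(q_gaussian_logpdf(returns, th[0], th[1]))
    res = minimize(nll, x0=np.array([1.4, 1.0]),
                   bounds=[(1.01, 2.99), (1e-6, None)])
    return res.x  # (q, beta)

rng = np.random.default_rng(0)
r = rng.standard_t(df=4, size=5000)   # fat-tailed stand-in for daily returns
q_hat, beta_hat = fit_q_gaussian(r)
print(f"q = {q_hat:.3f}, beta = {beta_hat:.3f}")
```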

  14. Physical approach to complex systems

    NASA Astrophysics Data System (ADS)

    Kwapień, Jarosław; Drożdż, Stanisław

    2012-06-01

    Typically, complex systems are natural or social systems which consist of a large number of nonlinearly interacting elements. These systems are open, they interchange information or mass with the environment and constantly modify their internal structure and patterns of activity in the process of self-organization. As a result, they are flexible and easily adapt to variable external conditions. However, the most striking property of such systems is the existence of emergent phenomena which cannot be simply derived or predicted solely from the knowledge of the systems’ structure and the interactions among their individual elements. This property points to the holistic approaches which require giving parallel descriptions of the same system on different levels of its organization. There is strong evidence, consolidated also in the present review, that different, even apparently disparate complex systems can have astonishingly similar characteristics both in their structure and in their behaviour. One can thus expect the existence of some common, universal laws that govern their properties. Physics methodology proves helpful in addressing many of the related issues. In this review, we advocate some of the computational methods which in our opinion are especially fruitful in extracting information on selected, but at the same time most representative, complex systems like the human brain, financial markets and natural language, from the time series representing the observables associated with these systems. The properties we focus on comprise the collective effects and their coexistence with noise, long-range interactions, the interplay between determinism and flexibility in evolution, scale invariance, criticality, multifractality and hierarchical structure. The methods described either originate from “hard” physics, like the random matrix theory, and then were transmitted to other fields of science via the field of complex systems research, or they originated elsewhere but turned out to be very useful also in physics, like, for example, fractal geometry. Further methods discussed borrow from the formalism of complex networks, from the theory of critical phenomena and from nonextensive statistical mechanics. Each of these methods is helpful in analyses of specific aspects of complexity and all of them are mutually complementary.

  15. Anomalous diffusion associated with nonlinear fractional derivative fokker-planck-like equation: exact time-dependent solutions

    PubMed

    Bologna; Tsallis; Grigolini

    2000-08-01

    We consider the d=1 nonlinear Fokker-Planck-like equation with fractional derivatives, ∂P(x,t)/∂t = D ∂^γ[P(x,t)]^ν/∂x^γ. Exact time-dependent solutions are found for ν = (2-γ)/(1+γ) (-∞ < γ ≤ 2).

  16. The complex networks approach for authorship attribution of books

    NASA Astrophysics Data System (ADS)

    Mehri, Ali; Darooneh, Amir H.; Shariati, Ashrafalsadat

    2012-04-01

    Authorship analysis by means of textual features is an important task in linguistic studies. We employ complex networks theory to tackle this disputed problem. In this work, we focus on some measurable quantities of the word co-occurrence network of each book for authorship characterization. Based on the network features, an attribution probability is defined for authorship identification. Furthermore, two scaling exponents, the q-parameter and the α-exponent, are combined to classify personal writing style with acceptably high resolving power. The q-parameter, generally known as the nonextensivity measure, is calculated for the degree distribution, and the α-exponent comes from a power-law relationship between the number of links and the number of nodes in the co-occurrence networks constructed for different books written by each author. The applicability of the presented method is evaluated in an experiment with thirty-six books by five Persian litterateurs. Our results show a high accuracy rate in authorship attribution.
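    A minimal sketch of building a word co-occurrence network, from which quantities such as the degree distribution (for the q-parameter) and the link and node counts (for the α-exponent) can be extracted. The construction below simply links consecutive tokens and assumes networkx; the paper's exact windowing and weighting are not specified in the abstract.

```python
import networkx as nx

def cooccurrence_network(tokens):
    """Word co-occurrence network: one node per distinct word, one edge per pair
    of words that appear next to each other in the token stream."""
    G = nx.Graph()
    for w1, w2 in zip(tokens, tokens[1:]):
        if w1 != w2:
            G.add_edge(w1, w2)
    return G

tokens = "the cat sat on the mat and the dog sat on the rug".split()
G = cooccurrence_network(tokens)
# Number of nodes and links: the inputs to the alpha-exponent scaling relation.
print(G.number_of_nodes(), G.number_of_edges())
```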

  17. Feasibility study and preliminary design of load-assisting clothes for lumbar protection inspired by human musculoskeletal systems

    NASA Astrophysics Data System (ADS)

    Hashimoto, Riho; Masuda, Arata; Chen, Hao; Kobayashi, Sou

    2016-04-01

    The purpose of this paper is to develop load assisting clothes for caregivers. Low back pain is one of the most major reasons for caregivers to leave their jobs. In this study, load assisting clothes which reduce the risks of low back pain of caregivers are designed and manufactured, targeting at the use in small care-houses and family caregiving. The load assisting clothes should have two functions. One is to reduce the compressive load acting on the lumbar spine as well as the tensile load on the lumbar muscles by providing an appropriate assisting force. The other is not to interfere with wearers' motion. The proposed approach in this study is to put elastic compressive members and tensioner belts integrated in the garment to provide the assisting forces without hindering natural movement and comfortable feeling. We study human musculoskeletal systems in the lumbar part, and consider to construct a parallel reinforcement of it on the body surface by embedding passive support structures. The arrangement of those elements is determined based on the study of the principal strain directions and the non-extension directions of the body surface to manage the appropriate assisting force without spoiling the mobility. The effectiveness of the proposed support principle is verified through experimental studies.

  18. Universalities of thermodynamic signatures in topological phases

    PubMed Central

    Kempkes, S. N.; Quelle, A.; Smith, C. Morais

    2016-01-01

    Topological insulators (superconductors) are materials that host symmetry-protected metallic edge states in an insulating (superconducting) bulk. Although they are well understood, a thermodynamic description of these materials remained elusive, firstly because the edges yield a non-extensive contribution to the thermodynamic potential, and secondly because topological field theories involve non-local order parameters, and cannot be captured by the Ginzburg-Landau formalism. Recently, this challenge has been overcome: by using Hill thermodynamics to describe the Bernevig-Hughes-Zhang model in two dimensions, it was shown that at the topological phase transition the thermodynamic potential does not scale extensively due to boundary effects. Here, we extend this approach to different topological models in various dimensions (the Kitaev chain and Su-Schrieffer-Heeger model in one dimension, the Kane-Mele model in two dimensions and the Bernevig-Hughes-Zhang model in three dimensions) at zero temperature. Surprisingly, all models exhibit the same universal behavior in the order of the topological-phase transition, depending on the dimension. Moreover, we derive the topological phase diagram at finite temperature using this thermodynamic description, and show that it displays a good agreement with the one calculated from the Uhlmann phase. Our work reveals unexpected universalities and opens the path to a thermodynamic description of systems with a non-local order parameter. PMID:27929041

  19. Universalities of thermodynamic signatures in topological phases.

    PubMed

    Kempkes, S N; Quelle, A; Smith, C Morais

    2016-12-08

    Topological insulators (superconductors) are materials that host symmetry-protected metallic edge states in an insulating (superconducting) bulk. Although they are well understood, a thermodynamic description of these materials remained elusive, firstly because the edges yield a non-extensive contribution to the thermodynamic potential, and secondly because topological field theories involve non-local order parameters, and cannot be captured by the Ginzburg-Landau formalism. Recently, this challenge has been overcome: by using Hill thermodynamics to describe the Bernevig-Hughes-Zhang model in two dimensions, it was shown that at the topological phase transition the thermodynamic potential does not scale extensively due to boundary effects. Here, we extend this approach to different topological models in various dimensions (the Kitaev chain and Su-Schrieffer-Heeger model in one dimension, the Kane-Mele model in two dimensions and the Bernevig-Hughes-Zhang model in three dimensions) at zero temperature. Surprisingly, all models exhibit the same universal behavior in the order of the topological-phase transition, depending on the dimension. Moreover, we derive the topological phase diagram at finite temperature using this thermodynamic description, and show that it displays a good agreement with the one calculated from the Uhlmann phase. Our work reveals unexpected universalities and opens the path to a thermodynamic description of systems with a non-local order parameter.

  20. Developing an agenda for research about policies to improve access to healthy foods in rural communities: a concept mapping study

    PubMed Central

    2014-01-01

    Background Policies that improve access to healthy, affordable foods may improve population health and reduce health disparities. In the United States most food access policy research focuses on urban communities even though residents of rural communities face disproportionately higher risk for nutrition-related chronic diseases compared to residents of urban communities. The purpose of this study was to (1) identify the factors associated with access to healthy, affordable food in rural communities in the United States; and (2) prioritize a meaningful and feasible rural food policy research agenda. Methods This study was conducted by the Rural Food Access Workgroup (RFAWG), a workgroup facilitated by the Nutrition and Obesity Policy Research and Evaluation Network. A national sample of academic and non-academic researchers, public health and cooperative extension practitioners, and other experts who focus on rural food access and economic development was invited to complete a concept mapping process that included brainstorming the factors that are associated with rural food access, sorting and organizing the factors into similar domains, and rating the importance of policies and research to address these factors. As a last step, RFAWG members convened to interpret the data and establish research recommendations. Results Seventy-five participants in the brainstorming exercise represented the following sectors: non-extension research (n = 27), non-extension program administration (n = 18), “other” (n = 14), policy advocacy (n = 10), and cooperative extension service (n = 6). The brainstorming exercise generated 90 distinct statements about factors associated with rural food access in the United States; these were sorted into 5 clusters. Go Zones were established for the factors that were rated highly as both a priority policy target and a priority for research. The highest ranked policy and research priorities include strategies designed to build economic viability in rural communities, improve access to federal food and nutrition assistance programs, improve food retail systems, and increase the personal food production capacity of rural residents. Respondents also prioritized the development of valid and reliable research methodologies to measure variables associated with rural food access. Conclusions This collaborative, trans-disciplinary, participatory process created a map to guide and prioritize research about policies to improve healthy, affordable food access in rural communities. PMID:24919425

  1. Developing an agenda for research about policies to improve access to healthy foods in rural communities: a concept mapping study.

    PubMed

    Johnson, Donna B; Quinn, Emilee; Sitaker, Marilyn; Ammerman, Alice; Byker, Carmen; Dean, Wesley; Fleischhacker, Sheila; Kolodinsky, Jane; Pinard, Courtney; Pitts, Stephanie B Jilcott; Sharkey, Joseph

    2014-06-12

    Policies that improve access to healthy, affordable foods may improve population health and reduce health disparities. In the United States most food access policy research focuses on urban communities even though residents of rural communities face disproportionately higher risk for nutrition-related chronic diseases compared to residents of urban communities. The purpose of this study was to (1) identify the factors associated with access to healthy, affordable food in rural communities in the United States; and (2) prioritize a meaningful and feasible rural food policy research agenda. This study was conducted by the Rural Food Access Workgroup (RFAWG), a workgroup facilitated by the Nutrition and Obesity Policy Research and Evaluation Network. A national sample of academic and non-academic researchers, public health and cooperative extension practitioners, and other experts who focus on rural food access and economic development was invited to complete a concept mapping process that included brainstorming the factors that are associated with rural food access, sorting and organizing the factors into similar domains, and rating the importance of policies and research to address these factors. As a last step, RFAWG members convened to interpret the data and establish research recommendations. Seventy-five participants in the brainstorming exercise represented the following sectors: non-extension research (n = 27), non-extension program administration (n = 18), "other" (n = 14), policy advocacy (n = 10), and cooperative extension service (n = 6). The brainstorming exercise generated 90 distinct statements about factors associated with rural food access in the United States; these were sorted into 5 clusters. Go Zones were established for the factors that were rated highly as both a priority policy target and a priority for research. The highest ranked policy and research priorities include strategies designed to build economic viability in rural communities, improve access to federal food and nutrition assistance programs, improve food retail systems, and increase the personal food production capacity of rural residents. Respondents also prioritized the development of valid and reliable research methodologies to measure variables associated with rural food access. This collaborative, trans-disciplinary, participatory process created a map to guide and prioritize research about policies to improve healthy, affordable food access in rural communities.

  2. Prediction of Reverse Remodeling at Cardiac MR Imaging Soon after First ST-Segment-Elevation Myocardial Infarction: Results of a Large Prospective Registry.

    PubMed

    Bodi, Vicente; Monmeneu, Jose V; Ortiz-Perez, Jose T; Lopez-Lereu, Maria P; Bonanad, Clara; Husser, Oliver; Minana, Gemma; Gomez, Cristina; Nunez, Julio; Forteza, Maria J; Hervas, Arantxa; de Dios, Elena; Moratal, David; Bosch, Xavier; Chorro, Francisco J

    2016-01-01

    To assess predictors of reverse remodeling by using cardiac magnetic resonance (MR) imaging soon after ST-segment-elevation myocardial infarction (STEMI). Written informed consent was obtained from all patients, and the study protocol was approved by the institutional committee on human research, ensuring that it conformed to the ethical guidelines of the 1975 Declaration of Helsinki. Five hundred seven patients (mean age, 58 years; age range, 24-89 years) with a first STEMI were prospectively studied. Infarct size and microvascular obstruction (MVO) were quantified at late gadolinium-enhanced imaging. Reverse remodeling was defined as a decrease in left ventricular (LV) end-systolic volume index (LVESVI) of more than 10% from 1 week to 6 months after STEMI. For statistical analysis, a simple (from a clinical perspective) multiple regression model was applied, with infarct size and MVO preanalyzed via univariate receiver operating characteristic techniques. Patients with reverse remodeling (n = 211, 42%) had a lesser extent (percentage of LV mass) of 1-week infarct size (mean ± standard deviation: 18% ± 13 vs 23% ± 14) and MVO (median, 0% vs 0%; interquartile range, 0%-1% vs 0%-4%) than those without reverse remodeling (n = 296, 58%) (P < .001 in pairwise comparisons). The independent predictors of reverse remodeling were infarct size (odds ratio, 0.98; 95% confidence interval [CI]: 0.97, 0.99; P = .04) and MVO (odds ratio, 0.92; 95% CI: 0.86, 0.99; P = .03). Once infarct size and MVO were dichotomized by using univariate receiver operating characteristic techniques, the only independent predictor of reverse remodeling was the presence of simultaneous nonextensive infarct-size MVO (infarct size < 30% of LV mass and MVO < 2.5% of LV mass) (odds ratio, 3.2; 95% CI: 1.8, 5.7; P < .001). Assessment of infarct size and MVO with cardiac MR imaging soon after STEMI enables prediction of reverse remodeling. © RSNA, 2015

  3. Effects of dynamical paths on the energy gap and the corrections to the free energy in path integrals of mean-field quantum spin systems

    NASA Astrophysics Data System (ADS)

    Koh, Yang Wei

    2018-03-01

    In current studies of mean-field quantum spin systems, much attention is placed on the calculation of the ground-state energy and the excitation gap, especially the latter, which plays an important role in quantum annealing. In pure systems, the finite gap can be obtained by various existing methods such as the Holstein-Primakoff transform, while the tunneling splitting at first-order phase transitions has also been studied in detail using instantons in many previous works. In disordered systems, however, it remains challenging to compute the gap of large-size systems with specific realization of disorder. Hitherto, only quantum Monte Carlo techniques are practical for such studies. Recently, Knysh [Nature Comm. 7, 12370 (2016), 10.1038/ncomms12370] proposed a method where the exponentially large dimensionality of such systems is condensed onto a random potential of much lower dimension, enabling efficient study of such systems. Here we propose a slightly different approach, building upon the method of static approximation of the partition function widely used for analyzing mean-field models. Quantum effects giving rise to the excitation gap and nonextensive corrections to the free energy are accounted for by incorporating dynamical paths into the path integral. The time-dependence of the trace of the time-ordered exponential of the effective Hamiltonian is calculated by solving a differential equation perturbatively, yielding a finite-size series expansion of the path integral. Formulae for the first excited-state energy are proposed to aid in computing the gap. We illustrate our approach using the infinite-range ferromagnetic Ising model and the Hopfield model, both in the presence of a transverse field.
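    The paper's path-integral construction is not reproduced here; purely as an assumed baseline for comparison, the sketch below computes the first excitation gap of the infinite-range (mean-field) ferromagnetic Ising model in a transverse field by brute-force diagonalization in the permutation-symmetric spin sector. The Hamiltonian convention H = -(J/N) Sz^2 - Γ Sx, the system size and the couplings are illustrative choices.

```python
import numpy as np

def gap_infinite_range_ising(N, J, Gamma):
    """First excitation gap of H = -(J/N)*Sz^2 - Gamma*Sx restricted to the
    permutation-symmetric sector S = N/2 (dimension N + 1)."""
    S = N / 2.0
    m = np.arange(-S, S + 1.0)               # Sz eigenvalues
    dim = m.size
    Sz = np.diag(m)
    # Ladder operator matrix elements: S+|S,m> = sqrt(S(S+1) - m(m+1)) |S,m+1>
    splus = np.zeros((dim, dim))
    for i in range(dim - 1):
        splus[i + 1, i] = np.sqrt(S * (S + 1) - m[i] * (m[i] + 1))
    Sx = 0.5 * (splus + splus.T)
    H = -(J / N) * Sz @ Sz - Gamma * Sx
    E = np.linalg.eigvalsh(H)
    return E[1] - E[0]

for Gamma in (0.2, 1.0, 2.0):
    print(Gamma, gap_infinite_range_ising(N=200, J=1.0, Gamma=Gamma))
```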

  4. Towards a Unified Source-Propagation Model of Cosmic Rays

    NASA Astrophysics Data System (ADS)

    Taylor, M.; Molla, M.

    2010-07-01

    It is well known that the cosmic ray energy spectrum is multifractal, with the analysis of cosmic ray fluxes as a function of energy revealing a first “knee” slightly below 10^16 eV, a second knee slightly below 10^18 eV and an “ankle” close to 10^19 eV. The behaviour of the highest energy cosmic rays around and above the ankle is still a mystery and precludes the development of a unified source-propagation model of cosmic rays from their source origin to Earth. A variety of acceleration and propagation mechanisms have been proposed to explain different parts of the spectrum, the most famous of course being Fermi acceleration in magnetised turbulent plasmas (Fermi 1949). Many others have been proposed for energies at and below the first knee (Peters & Cimento (1961); Lagage & Cesarsky (1983); Drury et al. (1984); Wdowczyk & Wolfendale (1984); Ptuskin et al. (1993); Dova et al. (0000); Horandel et al. (2002); Axford (1991)) as well as at higher energies between the first knee and the ankle (Nagano & Watson (2000); Bhattacharjee & Sigl (2000); Malkov & Drury (2001)). The recent fit of most of the cosmic ray spectrum up to the ankle using non-extensive statistical mechanics (NESM) (Tsallis et al. (2003)) provides what may be the strongest evidence for a source-propagation system deviating significantly from Boltzmann statistics. As Tsallis has shown (Tsallis et al. (2003)), the knees appear as crossovers between two fractal-like thermal regimes. In this work, we have developed a generalisation of the second order NESM model (Tsallis et al. (2003)) to higher orders and we have fit the complete spectrum including the ankle with third order NESM. We find that, towards the GZK limit, a new mechanism comes into play. Surprisingly it also presents as a modulation akin to that in our own local neighbourhood of cosmic rays emitted by the sun. We propose that this is due to modulation at the source and is possibly due to processes in the shell of the originating supernova. We report that the entire spectrum, spanning cosmic rays of local solar origin and those emanating from galactic and extra-galactic sources, can be explained using a new diagnostic: the gradient of the log-log plot. This diagnostic reveals the known Boltzmann statistics in the solar-terrestrial neighbourhood and also at the highest energies, presumably at the cosmic ray source, with clearly separated fractal scales in between. We interpret this as modulation at the source followed by Fermi acceleration facilitated by galactic and extra-galactic magnetic fields with a final modulation in the solar-terrestrial neighbourhood. We conclude that the gradient of multifractal curves appears to be an excellent detector of fractality.
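    The "gradient of the log-log plot" diagnostic amounts to the running spectral index d log10(flux)/d log10(E); a minimal sketch on a toy broken power-law spectrum is given below (the actual data and NESM fits of the paper are not reproduced).

```python
import numpy as np

def local_spectral_index(energy, flux):
    """d log10(flux) / d log10(E): the running slope of the log-log spectrum.
    Breaks ("knees", "ankle") show up as changes in this gradient."""
    return np.gradient(np.log10(flux), np.log10(energy))

E = np.logspace(12, 20, 400)                                # eV
flux = np.where(E < 1e16, E**-2.7, 1e16**0.3 * E**-3.0)     # toy broken power law
print(local_spectral_index(E, flux)[::100])                 # ~ -2.7 before the break, ~ -3.0 after
```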

  5. Dynamical complexity detection in geomagnetic activity indices using wavelet transforms and Tsallis entropy

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Daglis, I. A.; Papadimitriou, C.; Kalimeri, M.; Anastasiadis, A.; Eftaxias, K.

    2008-12-01

    Dynamical complexity detection for output time series of complex systems is one of the foremost problems in physics, biology, engineering, and economic sciences. Especially in magnetospheric physics, accurate detection of the dissimilarity between normal and abnormal states (e.g. pre-storm activity and magnetic storms) can vastly improve space weather diagnosis and, consequently, the mitigation of space weather hazards. Herein, we examine the fractal spectral properties of the Dst data using a wavelet analysis technique. We show that distinct changes in associated scaling parameters occur (i.e., transition from anti-persistent to persistent behavior) as an intense magnetic storm approaches. We then analyze Dst time series by introducing the non-extensive Tsallis entropy, Sq, as an appropriate complexity measure. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). The Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization.
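    A minimal sketch of the kind of non-extensive Tsallis entropy estimate used in such Dst analyses: the series is coarse-grained into symbols (here, simple histogram bins) and S_q = (1 - Σ p_i^q)/(q - 1) is computed in sliding windows. The window length, bin number, q value and the synthetic "Dst-like" series are all illustrative assumptions.

```python
import numpy as np

def tsallis_entropy(series, q=1.8, n_bins=8):
    """S_q = (1 - sum_i p_i^q) / (q - 1) for the histogram of the series.
    Reduces to the Shannon entropy in the limit q -> 1."""
    counts, _ = np.histogram(series, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    if abs(q - 1.0) < 1e-9:
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p**q)) / (q - 1.0)

# Sliding-window entropy: lower values flag a more organized ("stormy") state.
rng = np.random.default_rng(1)
dst_like = np.concatenate([rng.normal(-10, 5, 500),
                           -80 + 30 * np.sin(np.linspace(0, 20, 500))])
window = 100
sq = [tsallis_entropy(dst_like[i:i + window])
      for i in range(0, len(dst_like) - window, window)]
print(np.round(sq, 3))
```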

  6. CE microchips: an opened gate to food analysis.

    PubMed

    Escarpa, Alberto; González, María Cristina; Crevillén, Agustín González; Blasco, Antonio Javier

    2007-03-01

    CE microchips are the first generation of micro total analysis systems (μTAS) emerging in the miniaturization scene of food analysis. CE microchips for food analysis are fabricated in both glass and polymer materials, such as PDMS and poly(methyl methacrylate) (PMMA), and use simple layouts of single and double T crosses. Nowadays, the preferred detection route is electrochemical, in both amperometry and conductivity modes, using end-channel and contactless configurations, respectively. Food applications using CE microchips are only now emerging since food samples present complex matrices, selectivity being a very important challenge because the total integration of analytical steps into the microchip format is very difficult. As a consequence, the first contributions that have recently appeared in the relevant literature are based primarily on fast separations of analytes of high food significance. These protocols are combined with different strategies to achieve selectivity, using a suitable nonextensive sample preparation and/or strategically choosing detection routes. Polyphenolic compounds, amino acids, preservatives, and organic and inorganic ions have been studied using CE microchips. Thus, new and exciting future expectations arise in the domain of food analysis. However, several drawbacks could easily be found and assumed within the miniaturization map.

  7. Evolutionary pulsational mode dynamics in nonthermal turbulent viscous astrofluids

    NASA Astrophysics Data System (ADS)

    Karmakar, Pralay Kumar; Dutta, Pranamika

    2017-11-01

    The pulsational mode of gravitational collapse in a partially ionized self-gravitating inhomogeneous viscous nonthermal nonextensive astrofluid in the presence of turbulence pressure is illustratively analyzed. The constitutive thermal species, the lighter electrons and ions, are thermostatistically treated with nonthermal κ-distribution laws. The inertial species, such as identical heavier neutral and charged dust microspheres, are modelled in the turbulent fluid framework. All the possible linear processes responsible for dust-dust collisions are accounted for. The Larson logatropic equations of state relating the dust thermal (linear) and turbulence (nonlinear) pressures to the dust densities are included. A regular linear normal-mode perturbation analysis (local) over the complex astrocloud results in a generalized quartic dispersion relation with plasma-dependent multi-parametric coefficients of a unique nature. A numerical standpoint is provided to showcase the basic mode features in a judicious astronomical paradigm. It is shown that both the kinematic viscosity of the dust fluids and the nonthermality parameter (kappa, the power-law tail index) of the thermal species act as stabilizing (damping) agents against gravity. The underlying evolutionary microphysics is explored. The significance of redistributing astrofluid material via wave-induced accretion in dynamic nonhomologous structureless cloud collapse, leading to hierarchical astrostructure formation, is highlighted.
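    For reference, a sketch of the standard isotropic kappa (generalized Lorentzian) velocity distribution commonly used for such nonthermal electrons and ions; the normalization follows the usual textbook convention and may differ from the one adopted in the paper, and the parameter values are illustrative.

```python
import numpy as np
from scipy.special import gamma

def kappa_distribution(v, n, theta, kappa):
    """Isotropic kappa distribution (standard form):
    f(v) = n / (pi*kappa*theta^2)^{3/2} * Gamma(kappa+1)/Gamma(kappa-1/2)
           * [1 + v^2/(kappa*theta^2)]^{-(kappa+1)}
    Approaches a Maxwellian as kappa -> infinity."""
    norm = n / (np.pi * kappa * theta**2) ** 1.5 * gamma(kappa + 1.0) / gamma(kappa - 0.5)
    return norm * (1.0 + v**2 / (kappa * theta**2)) ** (-(kappa + 1.0))

v = np.linspace(0.0, 5.0, 6)   # speeds in units of the thermal speed theta
print(kappa_distribution(v, n=1.0, theta=1.0, kappa=3.0))
```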

  8. Tsallis entropy and decoherence of CsI quantum pseudo dot qubit

    NASA Astrophysics Data System (ADS)

    Tiotsop, M.; Fotue, A. J.; Fotsin, H. B.; Fai, L. C.

    2017-05-01

    A polaron in a CsI quantum pseudo dot under an electromagnetic field was considered, and the ground and first excited state energies were derived by employing the combined Pekar variational and unitary transformation methods. With the two-level system obtained, a single qubit was envisioned and the decoherence was studied using the non-extensive (Tsallis) entropy. Numerical results showed: (i) the increase (decrease) of the energy levels (period of oscillation) with the increase of the chemical potential, the zero point of the pseudo dot, the cyclotron frequency, and the transverse and longitudinal confinements; (ii) the Tsallis entropy evolved as a wave envelope that increases with the non-extensive parameter, and with increasing electric field strength, zero point of the pseudo dot and cyclotron frequency the wave envelope evolves periodically with a reduced period; (iii) the transition probability increases from the boundary to the centre of the dot, where it has its maximum value. It was also noted that the probability density oscillates with period T0 = ℏ/ΔE with the tunnelling of the chemical potential and the zero point of the pseudo dot. These results are helpful in the control of decoherence in quantum systems and may also be useful for the design of quantum computers.

  9. Fixing the Big Bang Theory's Lithium Problem

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2017-02-01

    How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the cosmological lithium problem. Have scientists now found a solution? Too Much Lithium: In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble-spot remains: the abundance of lithium. [Figure caption: The arrows show the primary reactions involved in Big Bang nucleosynthesis, and their flux ratios, as predicted by the authors' model, are given on the right. Synthesizing primordial elements is complicated! (Hou et al. 2017)] According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe's existence. This produced most of the universe's helium and small amounts of other light nuclides, including deuterium and lithium. But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the cosmological lithium problem, and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful. In a recent publication led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences) and advisor Jianjun He (Institute of Modern Physics National Astronomical Observatories, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem. [Figure caption: Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe. The authors' model (dotted lines) successfully predicts a lower abundance of the beryllium isotope, which eventually decays into lithium, relative to the classical Maxwell-Boltzmann distribution (solid lines), without changing the predicted abundances of deuterium or helium. (Hou et al. 2017)] Questioning Statistics: Hou and collaborators questioned a key assumption in Big Bang nucleosynthesis theory: that the nuclei involved in the process are all in thermodynamic equilibrium, and that their velocities, which determine the thermonuclear reaction rates, are described by the classical Maxwell-Boltzmann distribution. But do nuclei still obey this classical distribution in the extremely complex, fast-expanding Big Bang hot plasma? Hou and collaborators propose that the lithium nuclei don't, and that they must instead be described by a slightly modified version of the classical distribution, accounted for using what's known as non-extensive statistics. The authors show that using the modified velocity distributions described by these statistics, they can successfully predict the observed primordial abundances of deuterium, helium, and lithium simultaneously. If this solution to the cosmological lithium problem is correct, the Big Bang theory is now one step closer to fully describing the formation of our universe. Citation: S. Q. Hou et al. 2017 ApJ 834 165. doi:10.3847/1538-4357/834/2/165
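    As a labeled assumption (the post does not give the exact prescription), such non-extensive modifications are commonly implemented by replacing the Maxwell-Boltzmann factor exp(-E/kT) with the Tsallis q-exponential; the sketch below compares the classical and q-modified energy distributions for a small deviation from q = 1.

```python
import numpy as np

def mb_energy(E, kT):
    """Classical Maxwell-Boltzmann energy distribution (unnormalized): sqrt(E)*exp(-E/kT)."""
    return np.sqrt(E) * np.exp(-E / kT)

def tsallis_energy(E, kT, q):
    """Non-extensive modification: exp(-E/kT) replaced by the q-exponential
    [1 - (1 - q)*E/kT]^{1/(1-q)}; recovers the classical form as q -> 1."""
    base = np.clip(1.0 - (1.0 - q) * E / kT, 0.0, None)
    return np.sqrt(E) * base ** (1.0 / (1.0 - q))

E = np.linspace(0.01, 10.0, 5)                # energies in units of kT
print(mb_energy(E, kT=1.0))
print(tsallis_energy(E, kT=1.0, q=1.075))     # small deviation from q = 1
```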

  10. Investigating dynamical complexity in the magnetosphere using various entropy measures

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos

    2009-09-01

    The complex system of the Earth's magnetosphere corresponds to an open spatially extended nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has been recently introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) to the investigation of dynamical complexity in the magnetosphere. We show that as the magnetic storm approaches there is clear evidence of significant lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously, from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet time to the storm time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, we claim that our results suggest an important principle: significant complexity decrease and accession of persistency in Dst time series can be confirmed as the magnetic storm approaches, which can be used as diagnostic tools for the magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison to the other entropy measures presented herein. Ultimately, the analysis tools developed in the course of this study for the treatment of Dst index can provide convenience for space weather applications.

  11. Thermodynamics of an ideal generalized gas: II. Means of order alpha.

    PubMed

    Lavenda, B H

    2005-11-01

    The property that power means are monotonically increasing functions of their order is shown to be the basis of the second laws not only for processes involving heat conduction, but also for processes involving deformations. This generalizes earlier work involving only pure heat conduction and underlines the incomparability of the internal energy and adiabatic potentials when expressed as powers of the adiabatic variable. In an L-potential equilibration, the final state will be one of maximum entropy, whereas in an entropy equilibration, the final state will be one of minimum L. Unlike classical equilibrium thermodynamic phase space, which lacks an intrinsic metric structure insofar as distances and other geometrical concepts do not have an intrinsic thermodynamic significance in such spaces, a metric space can be constructed for the power means: the distance between means of different order is related to the Carnot efficiency. In the ideal classical gas limit, the average change in the entropy is shown to be proportional to the difference between the Shannon and Rényi entropies for nonextensive systems that are multifractal in nature. The L potential, like the internal energy, is a Schur convex function of the empirical temperature, which satisfies Jensen's inequality, and serves as a measure of the tendency to uniformity in processes involving pure thermal conduction.
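    A quick numerical illustration of the property the paper builds on, namely that the weighted power mean is a monotonically non-decreasing function of its order; the weights and values below are arbitrary.

```python
import numpy as np

def power_mean(x, w, alpha):
    """Weighted power mean M_alpha = (sum_i w_i x_i^alpha)^(1/alpha); the
    alpha -> 0 limit is the weighted geometric mean."""
    x, w = np.asarray(x, float), np.asarray(w, float)
    w = w / w.sum()
    if abs(alpha) < 1e-12:
        return np.exp(np.sum(w * np.log(x)))
    return np.sum(w * x**alpha) ** (1.0 / alpha)

x = [1.0, 2.0, 5.0, 9.0]
w = [0.1, 0.4, 0.3, 0.2]
orders = [-2, -1, 0, 1, 2, 3]
means = [power_mean(x, w, a) for a in orders]
print(all(m2 >= m1 for m1, m2 in zip(means, means[1:])))   # True: monotone in the order
```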

  12. Green functions and Langevin equations for nonlinear diffusion equations: A comment on ‘Markov processes, Hurst exponents, and nonlinear diffusion equations’ by Bassler et al.

    NASA Astrophysics Data System (ADS)

    Frank, T. D.

    2008-02-01

    We discuss two central claims made in the study by Bassler et al. [K.E. Bassler, G.H. Gunaratne, J.L. McCauley, Physica A 369 (2006) 343]. Bassler et al. claimed that Green functions and Langevin equations cannot be defined for nonlinear diffusion equations. In addition, they claimed that nonlinear diffusion equations are linear partial differential equations disguised as nonlinear ones. We review bottom-up and top-down approaches that have been used in the literature to derive Green functions for nonlinear diffusion equations and, in doing so, show that the first claim needs to be revised. We show that the second claim as well needs to be revised. To this end, we point out similarities and differences between non-autonomous linear Fokker-Planck equations and autonomous nonlinear Fokker-Planck equations. In this context, we raise the question whether Bassler et al.’s approach to financial markets is physically plausible because it necessitates the introduction of external traders and causes. Such external entities can easily be eliminated when taking self-organization principles and concepts of nonextensive thermostatistics into account and modeling financial processes by means of nonlinear Fokker-Planck equations.

  13. Minimization of the Renyi entropy production in the space-partitioning process.

    PubMed

    Cybulski, O; Babin, V; Hołyst, R

    2005-04-01

    The spontaneous division of space in Fleming-Viot processes is studied in terms of non-extensive thermodynamics. We analyze a system of n different types of Brownian particles confined in a box. Particles of different types annihilate each other when they come into close contact. Each process of annihilation is accompanied by a simultaneous nucleation of a particle of the same type, so that the number of particles of each component remains constant. The system eventually reaches a stationary state, in which the available space is divided into n separate subregions, each occupied by particles of one type. Within each subregion, the particle density distribution minimizes the Renyi entropy production. We show that the sum of these entropy productions in the stationary state is also minimized, i.e., the resulting boundaries between different components adopt a configuration which minimizes the total entropy production. The evolution of the system leads to a monotonic decrease of the total entropy production in time, irrespective of the initial conditions. In some circumstances the stationary state is not unique: the entropy production may have several local minima for different configurations. In the case of a rectangular box, the existence and stability of different stationary states are studied as a function of the aspect ratio of the rectangle.

  14. Topical petrolatum gel alone versus topical silver sulfadiazine with standard gauze dressings for the treatment of superficial partial thickness burns in adults: a randomized controlled trial.

    PubMed

    Genuino, Glenn Angelo S; Baluyut-Angeles, Kathrina Victoria; Espiritu, Andre Paolo T; Lapitan, Marie Carmela M; Buckley, Brian S

    2014-11-01

    Non-extensive superficial partial thickness burns constitute a major proportion of burns. Conventional treatment involves regular changing of absorptive dressings including the application of a topical antimicrobial, commonly silver sulfadiazine. A systematic review has found insufficient evidence to support or refute such antimicrobial prophylaxis. Another review compared silver sulfadiazine dressings with other occlusive and non-antimicrobial dressings and found insufficient evidence to guide practice. Other research has suggested that dressings with petrolatum gel are as effective as silver sulfadiazine. This was a single-center, randomized, controlled parallel-group trial comparing conventional silver sulfadiazine dressings with treatment with petrolatum gel alone. Consenting adults 18-45 years old with superficial partial thickness burns ≤10% total body surface area seen within 24 h of the injury were randomized to daily dressing either with petrolatum gel without top dressings or conventional silver sulfadiazine treatment with gauze dressings. Primary outcomes were blinded assessment of time to complete re-epithelialization, wound infection or allergic contact dermatitis. Secondary outcomes included assessment of ease, time and pain of dressing changes. Twenty-six patients were randomized to petrolatum and 24 to silver sulfadiazine dressings. Follow-up data were available for 19 patients in each group. Mean time to re-epithelialization was 6.2 days (SD 2.8) in the petrolatum group and 7.8 days (SD 2.1) in the silver sulfadiazine group (p=0.050). No wound infection or dermatitis was observed in either group. Scores for adherence to wound, ease of dressing removal and time required to change dressings were significantly better in the petrolatum treatment arm (p<0.01). Petrolatum gel without top dressings may be at least as effective as silver sulfadiazine gauze dressings with regard to time to re-epithelialization, and incidence of infection and allergic contact dermatitis. Petrolatum gel appears to be an effective, affordable and widely available alternative in the treatment of minor superficial partial thickness burns in adults. Copyright © 2014 Elsevier Ltd and ISBI. All rights reserved.

  15. Fractal analysis of the spatial distribution of earthquakes along the Hellenic Subduction Zone

    NASA Astrophysics Data System (ADS)

    Papadakis, Giorgos; Vallianatos, Filippos; Sammonds, Peter

    2014-05-01

    The Hellenic Subduction Zone (HSZ) is the most seismically active region in Europe. Many destructive earthquakes have taken place along the HSZ in the past. The evolution of such active regions is expressed through seismicity and is characterized by complex phenomenology. The understanding of the tectonic evolution process and the physical state of subducting regimes is crucial in earthquake prediction. In recent years there has been growing interest in an approach to seismicity based on the science of complex systems (Papadakis et al., 2013; Vallianatos et al., 2012). In this study we calculate the fractal dimension of the spatial distribution of earthquakes along the HSZ and we aim to understand the significance of the obtained values for the tectonic and geodynamic evolution of this area. We use the external seismic sources provided by Papaioannou and Papazachos (2000) to create a dataset regarding the subduction zone. According to the aforementioned authors, we define five seismic zones. We then compile an earthquake dataset based on the updated and extended earthquake catalogue for Greece and the adjacent areas by Makropoulos et al. (2012), covering the period 1976-2009. The fractal dimension of the spatial distribution of earthquakes is calculated for each seismic zone and for the HSZ as a unified system using the box-counting method (Turcotte, 1997; Robertson et al., 1995; Caneva and Smirnov, 2004). Moreover, the variation of the fractal dimension is demonstrated in different time windows. These spatiotemporal variations could be used as an additional index to inform us about the physical state of each seismic zone. The use of the fractal dimension as a precursor in earthquake forecasting appears to be a promising direction for future work. Acknowledgements Giorgos Papadakis wishes to acknowledge the Greek State Scholarships Foundation (IKY). References Caneva, A., Smirnov, V., 2004. Using the fractal dimension of earthquake distributions and the slope of the recurrence curve to forecast earthquakes in Colombia. Earth Sci. Res. J., 8, 3-9. Makropoulos, K., Kaviris, G., Kouskouna, V., 2012. An updated and extended earthquake catalogue for Greece and adjacent areas since 1900. Nat. Hazards Earth Syst. Sci., 12, 1425-1430. Papadakis, G., Vallianatos, F., Sammonds, P., 2013. Evidence of non extensive statistical physics behavior of the Hellenic Subduction Zone seismicity. Tectonophysics, 608, 1037-1048. Papaioannou, C.A., Papazachos, B.C., 2000. Time-independent and time-dependent seismic hazard in Greece based on seismogenic sources. Bull. Seismol. Soc. Am., 90, 22-33. Robertson, M.C., Sammis, C.G., Sahimi, M., Martin, A.J., 1995. Fractal analysis of three-dimensional spatial distributions of earthquakes with a percolation interpretation. J. Geophys. Res., 100, 609-620. Turcotte, D.L., 1997. Fractals and chaos in geology and geophysics. Second Edition, Cambridge University Press. Vallianatos, F., Michas, G., Papadakis, G., Sammonds, P., 2012. A non-extensive statistical physics view to the spatiotemporal properties of the June 1995, Aigion earthquake (M6.2) aftershock sequence (West Corinth rift, Greece). Acta Geophys., 60, 758-768.
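
    A compact sketch of the box-counting estimate referred to above is given below. It treats epicentre coordinates as planar and uses synthetic points on a line as a sanity check (expected dimension close to 1); the function name and box sizes are illustrative, not taken from the study.

```python
import numpy as np

def box_counting_dimension(x, y, sizes):
    """Box-counting estimate of the fractal dimension of epicentre locations.

    x, y  : epicentre coordinates (treated as planar)
    sizes : decreasing sequence of box edge lengths in the same units
    Returns the slope of log N(s) versus log(1/s).
    """
    counts = []
    for s in sizes:
        boxes = np.floor(np.column_stack([x, y]) / s)   # box index of each event
        counts.append(len(np.unique(boxes, axis=0)))    # number of occupied boxes
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# sanity check on synthetic data: points along a line should give D close to 1
rng = np.random.default_rng(1)
x = rng.uniform(20.0, 30.0, 2000)
y = 0.5 * x + 35.0
print(round(box_counting_dimension(x, y, sizes=[2.0, 1.0, 0.5, 0.25, 0.125]), 2))
```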

  16. 3D skin length deformation of lower body during knee joint flexion for the practical application of functional sportswear.

    PubMed

    Choi, Jiyoung; Hong, Kyunghi

    2015-05-01

    With the advent of 3D technology in the design process, a tremendous amount of scanned data is available. However, it is difficult to trace the quantitative skin deformation of a designated location on the 3D body surface data during movement. Without identical landmarks or reflective markers, tracing the same reference points across different body postures is not easy because of the complex shape change of the body. To find the least deformed location on the body, which is regarded as the optimal position of seams for the various lengths of functional compression pants, landmarks were directly marked on the skin of six subjects and scanned during knee joint flexion. Lines of non-extension (LoNE) and maximum stretch (LoMS) were identified both by tracing landmarks and by newly drawn guidelines based on ratio division in various directions. Considering the waist as the anchoring position of the pants, holistic changes were quantified and visualized from the waistline in lengthwise and curvilinear deformation along the dermatomes of the lower body for various lengths of pants. Widthwise and unit-area skin deformation data were also provided as guidelines for further uses such as streamlined pants or the design of other local wearable devices.

  17. Multifractals embedded in short time series: An unbiased estimation of probability moment

    NASA Astrophysics Data System (ADS)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as the multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it provides an unbiased estimate of the probability moments of continuous order. Calculations on a probability redistribution model verify that it can exactly extract multifractal behaviors from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series of several hundred points, a comparison with well-established tools shows significant performance advantages over the other methods. The factorial-moment-based estimation correctly evaluates scaling behaviors over a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima shows unacceptable fluctuations. Besides the scaling invariance that is the focus of the present paper, the proposed factorial moment of continuous order has various other uses, such as finding nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.
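
    For context, the sketch below computes the naive plug-in estimate of a probability moment of continuous order from box counts and compares it with the exact value for a known measure; the bias visible for a short record is exactly the problem the factorial-moment estimator proposed in the abstract is meant to remove. This is not an implementation of that estimator.

```python
import numpy as np

def plug_in_moment(counts, q):
    """Naive plug-in estimate of the probability moment sum_i p_i**q from box counts."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return np.sum(p ** q)

rng = np.random.default_rng(2)
true_p = np.full(32, 1.0 / 32)             # a known uniform measure over 32 boxes
sample = rng.multinomial(300, true_p)      # a "short" record of only 300 events
for q in (0.5, 2.0, 3.0):
    est, exact = plug_in_moment(sample, q), np.sum(true_p ** q)
    print(f"q = {q}: plug-in {est:.3f} vs exact {exact:.3f}")  # visible finite-size bias
```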

  18. Ideas for Effective Communication of Statistical Results

    DOE PAGES

    Anderson-Cook, Christine M.

    2015-03-01

    Effective presentation of statistical results to those with less statistical training, including managers and decision-makers, requires planning, anticipation, and thoughtful delivery. Here are several recommendations for effectively presenting statistical results.

  19. Low-Cost Methodology for Skin Strain Measurement of a Flexed Biological Limb.

    PubMed

    Lin, Bevin; Moerman, Kevin M; McMahan, Connor G; Pasch, Kenneth A; Herr, Hugh M

    2017-12-01

    The purpose of this manuscript is to compute skin strain data from a flexed biological limb, using portable, inexpensive, and easily available resources. We apply and evaluate this approach on a person with bilateral transtibial amputations, imaging left and right residual limbs in extended and flexed knee postures. We map 3-D deformations to a flexed biological limb using freeware and a simple point-and-shoot camera. Mean principal strain, maximum shear strain, as well as lines of maximum, minimum, and nonextension are computed from 3-D digital models to inform directional mappings of the strain field for an unloaded residual limb. Peak tensile strains are ∼0.3 on the anterior surface of the knee in the proximal region of the patella, whereas peak compressive strains are ∼ -0.5 on the posterior surface of the knee. Peak maximum shear strains are ∼0.3 on the posterior surface of the knee. The accuracy and precision of this methodology are assessed for a ground-truth model. The mean point location distance is found to be 0.08 cm, and the overall standard deviation for point location difference vectors is 0.05 cm. This low-cost and mobile methodology may prove critical for applications such as the prosthetic socket interface where whole-limb skin strain data are required from patients in the field outside of traditional, large-scale clinical centers. Such data may inform the design of wearable technologies that directly interface with human skin.
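
    The strain quantities listed above can be illustrated with a small worked example. The sketch below computes Green-Lagrange principal strains for a single triangle of skin markers tracked between two postures; the marker coordinates are hypothetical, and the choice of strain measure is an assumption since the abstract does not state the exact definition used.

```python
import numpy as np

def principal_strains(ref_tri, def_tri):
    """Green-Lagrange principal strains for one triangle of skin markers.

    ref_tri, def_tri : (3, 2) marker coordinates before and after flexion.
    """
    ref_tri, def_tri = np.asarray(ref_tri, float), np.asarray(def_tri, float)
    dX = np.column_stack([ref_tri[1] - ref_tri[0], ref_tri[2] - ref_tri[0]])
    dx = np.column_stack([def_tri[1] - def_tri[0], def_tri[2] - def_tri[0]])
    F = dx @ np.linalg.inv(dX)                   # deformation gradient
    E = 0.5 * (F.T @ F - np.eye(2))              # Green-Lagrange strain tensor
    return np.sort(np.linalg.eigvalsh(E))[::-1]  # (max, min) principal strains

# hypothetical markers: 30% stretch along x and 10% shortening along y
ref = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
flexed = [[0.0, 0.0], [1.3, 0.0], [0.0, 0.9]]
e_max, e_min = principal_strains(ref, flexed)
print(round(e_max, 3), round(e_min, 3))   # approximately 0.345 and -0.095
```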

  20. Transition from the Unipolar Region to the Sector Zone: Voyager 2, 2013 and 2014

    NASA Astrophysics Data System (ADS)

    Burlaga, L. F.; Ness, N. F.; Richardson, J. D.

    2017-05-01

    We discuss magnetic field and plasma observations of the heliosheath made by Voyager 2 (V2) during 2013 and 2014 near solar maximum. A transition from a unipolar region to a sector zone was observed in the azimuthal angle λ between ˜2012.45 and 2013.82. The distribution of λ was strongly singly peaked at 270° in the unipolar region and double peaked in the sector zone. The δ-distribution was strongly peaked in the unipolar region and very broad in the sector zone. The distribution of daily averages of the magnetic field strength B was Gaussian in the unipolar region and lognormal in the sector zone. The correlation function of B was exponential with an e-folding time of ˜5 days in both regions. The distribution of hourly increments of B was a Tsallis distribution with nonextensivity parameter q = 1.7 ± 0.04 in the unipolar region and q = 1.44 ± 0.12 in the sector zone. The CR-B relationship qualitatively describes the 2013 observations, but not the 2014 observations. A 40 km s^-1 increase in the bulk speed associated with an increase in B near 2013.5 might have been produced by the merging of streams. A "D sheet" (a broad depression in B containing a current sheet) moved past V2 from days 320 to 345, 2013. The R- and N-components of the plasma velocity changed across the current sheet.
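
    The q values quoted above come from fitting a Tsallis (q-Gaussian) form to the increments of B. The sketch below shows one way such a fit can be set up by maximum likelihood; synthetic Student-t increments stand in for the Voyager data, and the symmetric parameterisation used here is a common convention rather than necessarily the one used in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def neg_loglik(params, x):
    """Negative log-likelihood of a symmetric q-Gaussian, valid for 1 < q < 3."""
    q, beta = params
    if not (1.0 < q < 3.0) or beta <= 0.0:
        return np.inf
    m = 1.0 / (q - 1.0)
    # normalisation Z = sqrt(pi / ((q-1) beta)) * Gamma(m - 1/2) / Gamma(m)
    log_z = 0.5 * np.log(np.pi / ((q - 1.0) * beta)) + gammaln(m - 0.5) - gammaln(m)
    return np.sum(m * np.log1p((q - 1.0) * beta * x**2)) + x.size * log_z

# synthetic "hourly increments": a scaled Student-t with 3 degrees of freedom,
# which corresponds to a q-Gaussian with q = 1 + 2/(nu + 1) = 1.5
rng = np.random.default_rng(3)
dB = 0.02 * rng.standard_t(df=3, size=5000)

res = minimize(neg_loglik, x0=[1.3, 1.0 / dB.var()], args=(dB,), method="Nelder-Mead")
q_hat, beta_hat = res.x
print(f"fitted nonextensivity parameter q = {q_hat:.2f}")  # expected near 1.5
```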

  1. Nonlinear Landau damping and formation of Bernstein-Greene-Kruskal structures for plasmas with q-nonextensive velocity distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raghunathan, M.; Ganesh, R.

    2013-03-15

    In the past, long-time evolution of an initial perturbation in collisionless Maxwellian plasma (q = 1) has been simulated numerically. The controversy over the nonlinear fate of such electrostatic perturbations was resolved by Manfredi [Phys. Rev. Lett. 79, 2815-2818 (1997)] using long-time simulations up to t = 1600 ω_p^-1. The oscillations were found to continue indefinitely, leading to Bernstein-Greene-Kruskal (BGK)-like phase-space vortices (from here on referred to as 'BGK structures'). Using a newly developed, high-resolution 1D Vlasov-Poisson solver based on the piecewise-parabolic method (PPM) advection scheme, we investigate the nonlinear Landau damping in 1D plasma described by toy q-distributions for long times, up to t = 3000 ω_p^-1. We show that BGK structures are found only for a certain range of q-values around q = 1. Beyond this window, for the generic parameters, no BGK structures were observed. We observe that for values of q<1, where velocity distributions have long tails, strong Landau damping inhibits the formation of BGK structures. On the other hand, for q>1, where the distribution has a sharp fall in velocity, the formation of BGK structures is rendered difficult due to the high wave number damping imposed by the steep velocity profile, which had not been previously reported. Wherever relevant, we compare our results with past work.
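
    The qualitative difference between q < 1 and q > 1 velocity distributions mentioned above can be seen from the q-distribution itself. The sketch below evaluates one common form of the 1D q-Maxwellian (an assumption about the exact convention used in the paper): for q < 1 the tails are long power laws, while for q > 1 the distribution is cut off at a finite speed.

```python
import numpy as np

def q_maxwellian(v, q, v_th=1.0):
    """One common (unnormalised) 1D q-velocity distribution.

    q < 1: the bracket stays positive for all v, giving long power-law tails.
    q > 1: the distribution is cut off at |v| = v_th * sqrt(2 / (q - 1)).
    """
    v = np.asarray(v, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(-v**2 / (2.0 * v_th**2))
    arg = 1.0 - (q - 1.0) * v**2 / (2.0 * v_th**2)
    out = np.zeros_like(v)
    mask = arg > 0
    out[mask] = arg[mask] ** (1.0 / (q - 1.0))
    return out

v = np.linspace(-6, 6, 2001)
i0, i4 = np.argmin(np.abs(v)), np.argmin(np.abs(v - 4.0))
for q in (0.8, 1.0, 1.2):
    f = q_maxwellian(v, q)
    f /= f.sum() * (v[1] - v[0])          # normalise numerically on the grid
    print(f"q = {q}: f(0) = {f[i0]:.3f}, f(4 v_th) = {f[i4]:.2e}")
```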

  2. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size.

    PubMed

    Heidel, R Eric

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
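
    The five components listed above map directly onto the inputs of a standard a priori calculation. The sketch below runs one such calculation for a two-group comparison of a continuous outcome; the effect size, alpha, and power values are illustrative, not taken from the article.

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative a priori calculation for two independent groups and a continuous
# outcome: anticipated standardised effect size d = 0.5, alpha = 0.05, power = 0.80.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                          power=0.80, alternative="two-sided")
print(f"required sample size per group: {n_per_group:.0f}")   # about 64
```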

  3. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    PubMed

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  4. A General Class of Test Statistics for Van Valen’s Red Queen Hypothesis

    PubMed Central

    Wiltshire, Jelani; Huffer, Fred W.; Parker, William C.

    2014-01-01

    Van Valen’s Red Queen hypothesis states that within a homogeneous taxonomic group the age is statistically independent of the rate of extinction. The case of the Red Queen hypothesis being addressed here is when the homogeneous taxonomic group is a group of similar species. Since Van Valen’s work, various statistical approaches have been used to address the relationship between taxon age and the rate of extinction. We propose a general class of test statistics that can be used to test for the effect of age on the rate of extinction. These test statistics allow for a varying background rate of extinction and attempt to remove the effects of other covariates when assessing the effect of age on extinction. No model is assumed for the covariate effects. Instead we control for covariate effects by pairing or grouping together similar species. Simulations are used to compare the power of the statistics. We apply the test statistics to data on Foram extinctions and find that age has a positive effect on the rate of extinction. A derivation of the null distribution of one of the test statistics is provided in the supplementary material. PMID:24910489

  5. A General Class of Test Statistics for Van Valen's Red Queen Hypothesis.

    PubMed

    Wiltshire, Jelani; Huffer, Fred W; Parker, William C

    2014-09-01

    Van Valen's Red Queen hypothesis states that within a homogeneous taxonomic group the age is statistically independent of the rate of extinction. The case of the Red Queen hypothesis being addressed here is when the homogeneous taxonomic group is a group of similar species. Since Van Valen's work, various statistical approaches have been used to address the relationship between taxon age and the rate of extinction. We propose a general class of test statistics that can be used to test for the effect of age on the rate of extinction. These test statistics allow for a varying background rate of extinction and attempt to remove the effects of other covariates when assessing the effect of age on extinction. No model is assumed for the covariate effects. Instead we control for covariate effects by pairing or grouping together similar species. Simulations are used to compare the power of the statistics. We apply the test statistics to data on Foram extinctions and find that age has a positive effect on the rate of extinction. A derivation of the null distribution of one of the test statistics is provided in the supplementary material.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vignat, C.; Bercher, J.-F.

    The family of Tsallis entropies was introduced by Tsallis in 1988. The Shannon entropy belongs to this family as the limit case q → 1. The canonical distributions in R^n that maximize this entropy under a covariance constraint are easily derived as Student-t (q<1) and Student-r (q>1) multivariate distributions. A nice geometrical result about these Student-r distributions is that they are marginals of uniform distributions on a sphere of larger dimension d, with the relationship d = n + 2 + 2/(q-1). As q → 1, we recover the famous Poincare observation according to which a Gaussian vector can be viewed as the projection of a vector uniformly distributed on the infinite-dimensional sphere. A related property in the case q<1 is also available. Often associated with Renyi-Tsallis entropies is the notion of escort distributions. We provide here a geometric interpretation of these distributions. Another result concerns a universal system in physics, the harmonic oscillator: in the usual quantum context, the waveform of the n-th state of the harmonic oscillator is a Gaussian waveform multiplied by the degree n Hermite polynomial. We show, starting from recent results by Carinena et al., that the quantum harmonic oscillator on spaces with constant curvature is described by maximal Tsallis entropy waveforms multiplied by the extended Hermite polynomials derived from this measure. This gives a neat interpretation of the non-extensive parameter q in terms of the curvature of the space the oscillator evolves on; as q → 1, the curvature of the space goes to 0 and we recover the classical harmonic oscillator in R^3.
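
    The Poincare observation quoted above is easy to verify numerically. The sketch below samples points uniformly on the sphere of radius sqrt(d) in R^d and checks that a single coordinate approaches a standard Gaussian as d grows; it does not attempt the Student-t/Student-r constructions themselves.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def sphere_marginal(d, n=100_000):
    """First coordinate of points uniform on the sphere of radius sqrt(d) in R^d."""
    g = rng.standard_normal((n, d))
    u = g / np.linalg.norm(g, axis=1, keepdims=True)   # uniform on the unit sphere
    return np.sqrt(d) * u[:, 0]

for d in (3, 10, 100):
    x = sphere_marginal(d)
    # the Kolmogorov-Smirnov distance to a standard normal shrinks as d grows
    print(d, round(stats.kstest(x, "norm").statistic, 3))
```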

  7. Kappa and other nonequilibrium distributions from the Fokker-Planck equation and the relationship to Tsallis entropy.

    PubMed

    Shizgal, Bernie D

    2018-05-01

    This paper considers two nonequilibrium model systems described by linear Fokker-Planck equations for the time-dependent velocity distribution functions that yield steady state Kappa distributions for specific system parameters. The first system describes the time evolution of a charged test particle in a constant temperature heat bath of a second charged particle. The time dependence of the distribution function of the test particle is given by a Fokker-Planck equation with drift and diffusion coefficients for Coulomb collisions as well as a diffusion coefficient for wave-particle interactions. A second system involves the Fokker-Planck equation for electrons dilutely dispersed in a constant temperature heat bath of atoms or ions and subject to an external time-independent uniform electric field. The momentum transfer cross section for collisions between the two components is assumed to be a power law in reduced speed. The time-dependent Fokker-Planck equations for both model systems are solved with a numerical finite difference method and the approach to equilibrium is rationalized with the Kullback-Leibler relative entropy. For particular choices of the system parameters for both models, the steady distribution is found to be a Kappa distribution. Kappa distributions were introduced as an empirical fitting function that well describe the nonequilibrium features of the distribution functions of electrons and ions in space science as measured by satellite instruments. The calculation of the Kappa distribution from the Fokker-Planck equations provides a direct physically based dynamical approach in contrast to the nonextensive entropy formalism by Tsallis [J. Stat. Phys. 53, 479 (1988), 10.1007/BF01016429].
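
    A small numerical sketch can connect two ingredients mentioned above, the Kappa distribution and the Kullback-Leibler relative entropy. The Kappa-type form below is a simplified (q-Gaussian-like) parameterisation chosen so that its large-kappa limit is the Maxwellian used for comparison; it is not the exact space-physics convention of the paper.

```python
import numpy as np

def normalised(f, v):
    return f / (f.sum() * (v[1] - v[0]))

def kappa_like(v, kappa, theta=1.0):
    """Simplified Kappa-type (q-Gaussian) form whose kappa -> inf limit is Maxwellian."""
    return normalised((1.0 + v**2 / (2.0 * kappa * theta**2)) ** (-kappa), v)

v = np.linspace(-10, 10, 4001)
maxwellian = normalised(np.exp(-v**2 / 2.0), v)

for kappa in (2, 5, 20, 100):
    fk = kappa_like(v, kappa)
    # Kullback-Leibler relative entropy D(f_kappa || Maxwellian) on the grid
    d_kl = np.sum(fk * np.log(fk / maxwellian)) * (v[1] - v[0])
    print(f"kappa = {kappa:>3}: D_KL = {d_kl:.4f}")   # decreases toward 0 as kappa grows
```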

  8. Kappa and other nonequilibrium distributions from the Fokker-Planck equation and the relationship to Tsallis entropy

    NASA Astrophysics Data System (ADS)

    Shizgal, Bernie D.

    2018-05-01

    This paper considers two nonequilibrium model systems described by linear Fokker-Planck equations for the time-dependent velocity distribution functions that yield steady state Kappa distributions for specific system parameters. The first system describes the time evolution of a charged test particle in a constant temperature heat bath of a second charged particle. The time dependence of the distribution function of the test particle is given by a Fokker-Planck equation with drift and diffusion coefficients for Coulomb collisions as well as a diffusion coefficient for wave-particle interactions. A second system involves the Fokker-Planck equation for electrons dilutely dispersed in a constant temperature heat bath of atoms or ions and subject to an external time-independent uniform electric field. The momentum transfer cross section for collisions between the two components is assumed to be a power law in reduced speed. The time-dependent Fokker-Planck equations for both model systems are solved with a numerical finite difference method and the approach to equilibrium is rationalized with the Kullback-Leibler relative entropy. For particular choices of the system parameters for both models, the steady distribution is found to be a Kappa distribution. Kappa distributions were introduced as an empirical fitting function that well describe the nonequilibrium features of the distribution functions of electrons and ions in space science as measured by satellite instruments. The calculation of the Kappa distribution from the Fokker-Planck equations provides a direct physically based dynamical approach in contrast to the nonextensive entropy formalism by Tsallis [J. Stat. Phys. 53, 479 (1988), 10.1007/BF01016429].

  9. The Effect Size Statistic: Overview of Various Choices.

    ERIC Educational Resources Information Center

    Mahadevan, Lakshmi

    Over the years, methodologists have been recommending that researchers use magnitude of effect estimates in result interpretation to highlight the distinction between statistical and practical significance (cf. R. Kirk, 1996). A magnitude of effect statistic (i.e., effect size) tells to what degree the dependent variable can be controlled,…

  10. A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L

    2014-01-01

    We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
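
    For reference, the usual between-groups d that the single-case statistic is designed to be comparable with can be computed as below; this is the standard pooled-SD estimator with an optional small-sample (Hedges-type) correction, not the paper's single-case estimator, and the data are hypothetical.

```python
import numpy as np

def standardised_mean_difference(x, y, small_sample_correction=True):
    """Usual between-groups d (pooled SD), with an optional Hedges-type correction."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = x.size, y.size
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    d = (x.mean() - y.mean()) / np.sqrt(pooled_var)
    if small_sample_correction:
        d *= 1.0 - 3.0 / (4.0 * (nx + ny) - 9.0)
    return d

rng = np.random.default_rng(5)
treated = rng.normal(0.5, 1.0, 30)    # hypothetical data with true d = 0.5
control = rng.normal(0.0, 1.0, 30)
print(round(standardised_mean_difference(treated, control), 2))
```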

  11. Modeling Ka-band low elevation angle propagation statistics

    NASA Technical Reports Server (NTRS)

    Russell, Thomas A.; Weinfield, John; Pearson, Chris; Ippolito, Louis J.

    1995-01-01

    The statistical variability of the secondary atmospheric propagation effects on satellite communications cannot be ignored at frequencies of 20 GHz or higher, particularly if the propagation margin allocation is such that link availability falls below 99 percent. The secondary effects considered in this paper are gaseous absorption, cloud absorption, and tropospheric scintillation; rain attenuation is the primary effect. Techniques and example results are presented for estimation of the overall combined impact of the atmosphere on satellite communications reliability. Statistical methods are employed throughout and the most widely accepted models for the individual effects are used wherever possible. The degree of correlation between the effects is addressed and some bounds on the expected variability in the combined effects statistics are derived from the expected variability in correlation. Example estimates of combined effects statistics are presented for the Washington D.C. area at 20 GHz and a 5 deg elevation angle. The statistics of water vapor are shown to be sufficient for estimation of the statistics of gaseous absorption at 20 GHz. A computer model based on monthly surface weather is described and tested. Significant improvement in prediction of absorption extremes is demonstrated with the use of path weather data instead of surface data.

  12. An R2 statistic for fixed effects in the linear mixed model.

    PubMed

    Edwards, Lloyd J; Muller, Keith E; Wolfinger, Russell D; Qaqish, Bahjat F; Schabenberger, Oliver

    2008-12-20

    Statisticians most often use the linear mixed model to analyze Gaussian longitudinal data. The value and familiarity of the R2 statistic in the linear univariate model naturally creates great interest in extending it to the linear mixed model. We define and describe how to compute a model R2 statistic for the linear mixed model by using only a single model. The proposed R2 statistic measures multivariate association between the repeated outcomes and the fixed effects in the linear mixed model. The R2 statistic arises as a 1-1 function of an appropriate F statistic for testing all fixed effects (except typically the intercept) in a full model. The statistic compares the full model with a null model with all fixed effects deleted (except typically the intercept) while retaining exactly the same covariance structure. Furthermore, the R2 statistic leads immediately to a natural definition of a partial R2 statistic. A mixed model in which ethnicity gives a very small p-value as a longitudinal predictor of blood pressure (BP) compellingly illustrates the value of the statistic. In sharp contrast to the extreme p-value, a very small R2, a measure of statistical and scientific importance, indicates that ethnicity has an almost negligible association with the repeated BP outcomes for the study.
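
    The abstract notes that the model R2 is a one-to-one function of the F statistic for the fixed effects. Assuming the standard mapping R2 = (df_num x F) / (df_num x F + df_den), a sketch of that conversion is given below; the F value and degrees of freedom are hypothetical, and in a mixed model the denominator degrees of freedom depend on the approximation used.

```python
def r2_from_f(f_stat, df_num, df_den):
    """R2 implied by the F statistic for the fixed effects (one-to-one mapping)."""
    return (df_num * f_stat) / (df_num * f_stat + df_den)

# hypothetical example: F = 4.2 testing 3 fixed effects with 96 denominator df
print(round(r2_from_f(4.2, 3, 96), 3))   # about 0.116
```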

  13. Pore Pressure Diffusion as a possible mechanism for the Ag. Ioanis 2001 earthquake swarm activity (Gulf of Corinth, Central Greece).

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Michas, G.; Papadakis, G.; Sammonds, P.

    2012-04-01

    The Gulf of Corinth rift (Central Greece) is one of the most seismotectonically active areas in Europe (Ambraseys and Jackson, 1990; 1997), with an important continental N-S extension of about 13 mm/yr and 6 mm/yr at the west and east part respectively (Clarke et al., 1997a). The seismicity of the area includes 5 main earthquakes of magnitude greater than 5.8 since 1960. In the western part of the rift, where the extension reaches its maximum value, earthquake swarms are often observed (Bourouis and Cornet, 2009). Such an earthquake crisis occurred in 2001 at the southern margin of the west part of the rift. The crisis lasted about 100 days, with the Ag. Ioanis earthquake (Mw 4.3) of 8 April 2001 as the major event (Pacchiani and Lyon-Caen, 2010). The possible relation between fluid flow and the observed earthquake swarms at the west part of the Gulf of Corinth rift has been discussed in the works of Bourouis and Cornet (2009) and Pacchiani and Lyon-Caen (2010). In the present work we examine the spatiotemporal properties of the Ag. Ioanis 2001 earthquake swarm, using data from the CRL network (http://crlab.eu/). We connect these properties to a mechanism due to pore pressure diffusion (Shapiro et al., 1997) and we estimate the hydraulic diffusivity and the permeability of the surrounding rocks. A back front of the seismicity (Parotidis et al., 2004) has also been observed, related to the migration of seismicity and the development of a quiescence region near the area of the initial pore pressure perturbation. Moreover, anisotropy of the hydraulic diffusivity has been observed, revealing the heterogeneity of the surrounding rocks and the fracture systems. This anisotropy is consistent in direction with the fault zone responsible for the Ag. Ioanis earthquake (Pacchiani and Lyon-Caen, 2010). Our results indicate that fluid flow and pore pressure perturbations are possible mechanisms for the initiation and the evolution of the Ag. Ioanis 2001 earthquake swarm activity and reveal the possible connection of the complex fracture network to the spatial evolution of seismicity in an active tectonic region such as the Gulf of Corinth rift. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non-extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).
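
    The pore-pressure diffusion argument above rests on the triggering-front relation r(t) = sqrt(4*pi*D*t) of Shapiro et al. (1997). The sketch below generates a synthetic swarm with a known diffusivity and recovers D from a simple envelope percentile of r^2/(4*pi*t); this percentile shortcut is a simplification of the formal front-fitting procedure, and all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

# synthetic swarm generated with a "true" hydraulic diffusivity of 0.5 m^2/s:
# event times (converted to seconds) and distances from the assumed source
d_true = 0.5
t_sec = rng.uniform(1.0, 100.0, 500) * 86400.0
r_front = np.sqrt(4.0 * np.pi * d_true * t_sec)          # triggering front r(t)
r = r_front * np.sqrt(rng.uniform(0.0, 1.0, 500))        # events inside the front

# recover D so that r(t) = sqrt(4 pi D t) envelops most (~95%) of the cloud
d_est = np.percentile(r**2 / (4.0 * np.pi * t_sec), 95)
print(f"estimated hydraulic diffusivity: {d_est:.2f} m^2/s")   # close to 0.5
```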

  14. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    ERIC Educational Resources Information Center

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  15. Quantifying the impact of between-study heterogeneity in multivariate meta-analyses

    PubMed Central

    Jackson, Dan; White, Ian R; Riley, Richard D

    2012-01-01

    Measures that quantify the impact of heterogeneity in univariate meta-analysis, including the very popular I2 statistic, are now well established. Multivariate meta-analysis, where studies provide multiple outcomes that are pooled in a single analysis, is also becoming more commonly used. The question of how to quantify heterogeneity in the multivariate setting is therefore raised. It is the univariate R2 statistic, the ratio of the variance of the estimated treatment effect under the random and fixed effects models, that generalises most naturally, so this statistic provides our basis. This statistic is then used to derive a multivariate analogue of I2, which we call . We also provide a multivariate H2 statistic, the ratio of a generalisation of Cochran's heterogeneity statistic and its associated degrees of freedom, with an accompanying generalisation of the usual I2 statistic, . Our proposed heterogeneity statistics can be used alongside all the usual estimates and inferential procedures used in multivariate meta-analysis. We apply our methods to some real datasets and show how our statistics are equally appropriate in the context of multivariate meta-regression, where study level covariate effects are included in the model. Our heterogeneity statistics may be used when applying any procedure for fitting the multivariate random effects model. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22763950
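
    The univariate building blocks generalised in this work are Cochran's Q and the I2 statistic. A minimal sketch of those univariate quantities is given below with hypothetical study-level estimates; the multivariate analogues defined in the paper are not implemented here.

```python
import numpy as np

def cochran_q(effects, variances):
    """Cochran's heterogeneity statistic Q for k study-level estimates."""
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / variances
    pooled = np.sum(w * effects) / np.sum(w)     # fixed-effect pooled estimate
    return np.sum(w * (effects - pooled) ** 2)

def i_squared(effects, variances):
    """Univariate I2 = max(0, (Q - df) / Q), the quantity generalised in the paper."""
    q = cochran_q(effects, variances)
    df = len(effects) - 1
    return max(0.0, (q - df) / q)

# hypothetical meta-analysis of five studies (effect estimates and their variances)
effects = [0.10, 0.35, -0.05, 0.50, 0.20]
variances = [0.04, 0.02, 0.05, 0.03, 0.02]
print(f"Q = {cochran_q(effects, variances):.2f}, "
      f"I2 = {100 * i_squared(effects, variances):.0f}%")
```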

  16. The Effect of Student-Driven Projects on the Development of Statistical Reasoning

    ERIC Educational Resources Information Center

    Sovak, Melissa M.

    2010-01-01

    Research has shown that even if students pass a standard introductory statistics course, they often still lack the ability to reason statistically. Many instructional techniques for enhancing the development of statistical reasoning have been discussed, although there is often little to no experimental evidence that they produce effective results…

  17. Technological Tools in the Introductory Statistics Classroom: Effects on Student Understanding of Inferential Statistics

    ERIC Educational Resources Information Center

    Meletiou-Mavrotheris, Maria

    2004-01-01

    While technology has become an integral part of introductory statistics courses, the programs typically employed are professional packages designed primarily for data analysis rather than for learning. Findings from several studies suggest that use of such software in the introductory statistics classroom may not be very effective in helping…

  18. Perceived Effectiveness among College Students of Selected Statistical Measures in Motivating Exercise Behavior

    ERIC Educational Resources Information Center

    Merrill, Ray M.; Chatterley, Amanda; Shields, Eric C.

    2005-01-01

    This study explored the effectiveness of selected statistical measures at motivating or maintaining regular exercise among college students. The study also considered whether ease in understanding these statistical measures was associated with perceived effectiveness at motivating or maintaining regular exercise. Analyses were based on a…

  19. The Influence of Statistical versus Exemplar Appeals on Indian Adults' Health Intentions: An Investigation of Direct Effects and Intervening Persuasion Processes.

    PubMed

    McKinley, Christopher J; Limbu, Yam; Jayachandran, C N

    2017-04-01

    In two separate investigations, we examined the persuasive effectiveness of statistical versus exemplar appeals on Indian adults' smoking cessation and mammography screening intentions. To more comprehensively address persuasion processes, we explored whether message response and perceived message effectiveness functioned as antecedents to persuasive effects. Results showed that statistical appeals led to higher levels of health intentions than exemplar appeals. In addition, findings from both studies indicated that statistical appeals stimulated more attention and were perceived as more effective than anecdotal accounts. Among male smokers, statistical appeals also generated greater cognitive processing than exemplar appeals. Subsequent mediation analyses revealed that message response and perceived message effectiveness fully carried the influence of appeal format on health intentions. Given these findings, future public health initiatives conducted among similar populations should design messages that include substantive factual information while ensuring that this content is perceived as credible and valuable.

  20. Designing a Course in Statistics for a Learning Health Systems Training Program

    ERIC Educational Resources Information Center

    Samsa, Gregory P.; LeBlanc, Thomas W.; Zaas, Aimee; Howie, Lynn; Abernethy, Amy P.

    2014-01-01

    The core pedagogic problem considered here is how to effectively teach statistics to physicians who are engaged in a "learning health system" (LHS). This is a special case of a broader issue--namely, how to effectively teach statistics to academic physicians for whom research--and thus statistics--is a requirement for professional…

  1. Statistical Learning Effects in Musicians and Non-Musicians: An MEG Study

    ERIC Educational Resources Information Center

    Paraskevopoulos, Evangelos; Kuchenbuch, Anja; Herholz, Sibylle C.; Pantev, Christo

    2012-01-01

    This study aimed to assess the effect of musical training in statistical learning of tone sequences using Magnetoencephalography (MEG). Specifically, MEG recordings were used to investigate the neural and functional correlates of the pre-attentive ability for detection of deviance, from a statistically learned tone sequence. The effect of…

  2. Effects of drain bias on the statistical variation of double-gate tunnel field-effect transistors

    NASA Astrophysics Data System (ADS)

    Choi, Woo Young

    2017-04-01

    The effects of drain bias on the statistical variation of double-gate (DG) tunnel field-effect transistors (TFETs) are discussed in comparison with DG metal-oxide-semiconductor FETs (MOSFETs). Statistical variation corresponds to the variation of threshold voltage (V_th), subthreshold swing (SS), and drain-induced barrier thinning (DIBT). The unique statistical variation characteristics of DG TFETs and DG MOSFETs with the variation of drain bias are analyzed by using full three-dimensional technology computer-aided design (TCAD) simulation in terms of the three dominant variation sources: line-edge roughness (LER), random dopant fluctuation (RDF) and workfunction variation (WFV). It is observed that DG TFETs suffer from less severe statistical variation as drain voltage increases, unlike DG MOSFETs.

  3. The large sample size fallacy.

    PubMed

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
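
    The fallacy described above is easy to reproduce numerically: with very large samples, a trivially small true difference yields an extremely small p-value while the standardised effect size stays negligible. The simulation below uses arbitrary illustrative numbers.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 100_000                                   # very large samples
a = rng.normal(0.00, 1.0, n)
b = rng.normal(0.03, 1.0, n)                  # trivially small true difference

t, p = stats.ttest_ind(a, b)
d = (b.mean() - a.mean()) / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
print(f"p = {p:.1e}   (typically 'highly significant')")
print(f"Cohen's d = {d:.3f}   (practically negligible)")
```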

  4. Effect of numbers vs pictures on perceived effectiveness of a public safety awareness advertisement.

    PubMed

    Bochniak, S; Lammers, H B

    1991-08-01

    In a 2 x 2 completely randomized factorial experiment, 24 women and 16 men rated the perceived effectiveness of an earthquake preparedness advertisement which contained either a picture or no picture of prior earthquake damage and contained either statistics or no statistics on likelihood of an earthquake. A main effect for superiority of the picture was found. The presence of statistics had no main or interactive effects on the perceived effectiveness of the advertisement.

  5. On P values and effect modification.

    PubMed

    Mayer, Martin

    2017-12-01

    A crucial element of evidence-based healthcare is the sound understanding and use of statistics. As part of instilling sound statistical knowledge and practice, it seems useful to highlight instances of unsound statistical reasoning or practice, not merely in captious or vitriolic spirit, but rather, to use such error as a springboard for edification by giving tangibility to the concepts at hand and highlighting the importance of avoiding such error. This article aims to provide an instructive overview of two key statistical concepts: effect modification and P values. A recent article published in the Journal of the American College of Cardiology on side effects related to statin therapy offers a notable example of errors in understanding effect modification and P values, and although not so critical as to entirely invalidate the article, the errors still demand considerable scrutiny and correction. In doing so, this article serves as an instructive overview of the statistical concepts of effect modification and P values. Judicious handling of statistics is imperative to avoid muddying their utility. This article contributes to the body of literature aiming to improve the use of statistics, which in turn will help facilitate evidence appraisal, synthesis, translation, and application.

  6. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    ERIC Educational Resources Information Center

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  7. Needle Acupuncture for Substance Use Disorders: A Systematic Review

    DTIC Science & Technology

    2015-01-01

    RCTs). We did identify statistically significant, clinically medium effects in favor of acupuncture (as an adjunctive or monotherapy) versus any comparator at postintervention.

  8. Can statistical linkage of missing variables reduce bias in treatment effect estimates in comparative effectiveness research studies?

    PubMed

    Crown, William; Chang, Jessica; Olson, Melvin; Kahler, Kristijan; Swindle, Jason; Buzinec, Paul; Shah, Nilay; Borah, Bijan

    2015-09-01

    Missing data, particularly missing variables, can create serious analytic challenges in observational comparative effectiveness research studies. Statistical linkage of datasets is a potential method for incorporating missing variables. Prior studies have focused upon the bias introduced by imperfect linkage. This analysis uses a case study of hepatitis C patients to estimate the net effect of statistical linkage on bias, also accounting for the potential reduction in missing variable bias. The results show that statistical linkage can reduce bias while also enabling parameter estimates to be obtained for the formerly missing variables. The usefulness of statistical linkage will vary depending upon the strength of the correlations of the missing variables with the treatment variable, as well as the outcome variable of interest.

  9. Précis of statistical significance: rationale, validity, and utility.

    PubMed

    Chow, S L

    1998-04-01

    The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.

  10. Statistical Power of Alternative Structural Models for Comparative Effectiveness Research: Advantages of Modeling Unreliability.

    PubMed

    Coman, Emil N; Iordache, Eugen; Dierker, Lisa; Fifield, Judith; Schensul, Jean J; Suggs, Suzanne; Barbour, Russell

    2014-05-01

    The advantages of modeling the unreliability of outcomes when evaluating the comparative effectiveness of health interventions are illustrated. Adding an action-research intervention component to a regular summer job program for youth was expected to help in preventing risk behaviors. A series of simple two-group alternative structural equation models are compared to test the effect of the intervention on one key attitudinal outcome in terms of model fit and statistical power with Monte Carlo simulations. Some models presuming parameters equal across the intervention and comparison groups were underpowered to detect the intervention effect, yet modeling the unreliability of the outcome measure increased their statistical power and helped in the detection of the hypothesized effect. Comparative Effectiveness Research (CER) could benefit from flexible multi-group alternative structural models organized in decision trees, and modeling unreliability of measures can be of tremendous help for both the fit of statistical models to the data and their statistical power.

  11. Measuring an Effect Size from Dichotomized Data: Contrasted Results Whether Using a Correlation or an Odds Ratio

    ERIC Educational Resources Information Center

    Rousson, Valentin

    2014-01-01

    It is well known that dichotomizing continuous data has the effect to decrease statistical power when the goal is to test for a statistical association between two variables. Modern researchers however are focusing not only on statistical significance but also on an estimation of the "effect size" (i.e., the strength of association…
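
    The power loss from dichotomisation mentioned above can be demonstrated with a short simulation: the same bivariate data are analysed once with the continuous predictor and once after a median split. The sample size, correlation, and number of replications below are arbitrary illustrative choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

def one_trial(n=80, rho=0.3):
    """p-values for the same data analysed continuously vs. after a median split."""
    x = rng.standard_normal(n)
    y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
    p_continuous = stats.pearsonr(x, y)[1]
    high = x > np.median(x)                       # dichotomised predictor
    p_dichotomised = stats.ttest_ind(y[high], y[~high])[1]
    return p_continuous, p_dichotomised

pvals = np.array([one_trial() for _ in range(2000)])
power = (pvals < 0.05).mean(axis=0)
print(f"power, continuous analysis:   {power[0]:.2f}")
print(f"power, dichotomised analysis: {power[1]:.2f}")   # noticeably lower
```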

  12. Extensive BMI Gain in Puberty is Associated with Lower Increments in Bone Mineral Density in Estonian Boys with Overweight and Obesity: A 3-Year Longitudinal Study.

    PubMed

    Mengel, Eva; Tillmann, Vallo; Remmel, Liina; Kool, Pille; Purge, Priit; Lätt, Evelin; Jürimäe, Jaak

    2017-08-01

    The aim of this 3-year prospective study was to examine changes in bone mineral characteristics during pubertal maturation in boys with different BMI values at the beginning of puberty and with different BMI increments during puberty. 26 boys with overweight and obesity (OWB) and 29 normal weight boys (NWB) were studied yearly for 3 years from the age of 11 years to measure the changes in different bone mineral characteristics. The OWB group was further divided into two subgroups according to extensive or non-extensive BMI increment during 3-year period. OWB had higher (P < 0.01) baseline total body (TB) bone mineral density (BMD), TB bone mineral content (BMC), TB BMC for height, lumbar spine (LS) BMD, and LS BMC compared to NWB. Throughout the study period, OWB gained more TB BMD (P = 0.0001), TB BMC (P = 0.0048), TB BMC for height (P = 0.0124), LS BMD (P = 0.0029), and LS BMC (P = 0.0022) compared to NWB. Also during the study period, TB BMD (P = 0.0065), TB BMC (P = 0.0141), TB BMC for height (P = 0.0199), LS BMD (P = 0.0066), LS apparent volumetric BMD (BMAD) (P = 0.0075), and LS BMC (P = 0.017) increased significantly less in those OWB whose BMI increased more extensively. Extensive BMI gain is associated with lower increments in bone mineral characteristics in boys with overweight and obesity. Unfavorable increment in total body fat mass and percentage during pubertal years could be one reason for that.

  13. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    PubMed

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in the fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.

  14. Suspended Draft: Effects on the Composition and Quality of the Military Workforce in the German Armed Forces

    DTIC Science & Technology

    2016-06-01

    Table 2. Summary of Statistics from GGSS Data. Table 3. Summary of Statistics from... similar approach are unsurprisingly quite consistent in outcomes within statistical variance. The model is used to estimate the effects of exogenous... of German residents (~82 million), excluding diplomats, foreign military and homeless persons (German Federal Office of Statistics, 2013, p. 475).

  15. A Sorting Statistic with Application in Neurological Magnetic Resonance Imaging of Autism.

    PubMed

    Levman, Jacob; Takahashi, Emi; Forgeron, Cynthia; MacDonald, Patrick; Stewart, Natalie; Lim, Ashley; Martel, Anne

    2018-01-01

    Effect size refers to the assessment of the extent of differences between two groups of samples on a single measurement. Assessing effect size in medical research is typically accomplished with Cohen's d statistic. Cohen's d statistic assumes that average values are good estimators of the position of a distribution of numbers and also assumes Gaussian (or bell-shaped) underlying data distributions. In this paper, we present an alternative evaluative statistic that can quantify differences between two data distributions in a manner that is similar to traditional effect size calculations; however, the proposed approach avoids making assumptions regarding the shape of the underlying data distribution. The proposed sorting statistic is compared with Cohen's d statistic and is demonstrated to be capable of identifying feature measurements of potential interest for which Cohen's d statistic implies the measurement would be of little use. This proposed sorting statistic has been evaluated on a large clinical autism dataset from Boston Children's Hospital, Harvard Medical School, demonstrating that it can potentially play a constructive role in future healthcare technologies.

  16. A Sorting Statistic with Application in Neurological Magnetic Resonance Imaging of Autism

    PubMed Central

    Takahashi, Emi; Lim, Ashley; Martel, Anne

    2018-01-01

    Effect size refers to the assessment of the extent of differences between two groups of samples on a single measurement. Assessing effect size in medical research is typically accomplished with Cohen's d statistic. Cohen's d statistic assumes that average values are good estimators of the position of a distribution of numbers and also assumes Gaussian (or bell-shaped) underlying data distributions. In this paper, we present an alternative evaluative statistic that can quantify differences between two data distributions in a manner that is similar to traditional effect size calculations; however, the proposed approach avoids making assumptions regarding the shape of the underlying data distribution. The proposed sorting statistic is compared with Cohen's d statistic and is demonstrated to be capable of identifying feature measurements of potential interest for which Cohen's d statistic implies the measurement would be of little use. This proposed sorting statistic has been evaluated on a large clinical autism dataset from Boston Children's Hospital, Harvard Medical School, demonstrating that it can potentially play a constructive role in future healthcare technologies. PMID:29796236

  17. A critique of the usefulness of inferential statistics in applied behavior analysis

    PubMed Central

    Hopkins, B. L.; Cole, Brian L.; Mason, Tina L.

    1998-01-01

    Researchers continue to recommend that applied behavior analysts use inferential statistics in making decisions about effects of independent variables on dependent variables. In many other approaches to behavioral science, inferential statistics are the primary means for deciding the importance of effects. Several possible uses of inferential statistics are considered. Rather than being an objective means for making decisions about effects, as is often claimed, inferential statistics are shown to be subjective. It is argued that the use of inferential statistics adds nothing to the complex and admittedly subjective nonstatistical methods that are often employed in applied behavior analysis. Attacks on inferential statistics that are being made, perhaps with increasing frequency, by those who are not behavior analysts, are discussed. These attackers are calling for banning the use of inferential statistics in research publications and commonly recommend that behavioral scientists should switch to using statistics aimed at interval estimation or the method of confidence intervals. Interval estimation is shown to be contrary to the fundamental assumption of behavior analysis that only individuals behave. It is recommended that authors who wish to publish the results of inferential statistics be asked to justify them as a means for helping us to identify any ways in which they may be useful. PMID:22478304

  18. A proposal for the measurement of graphical statistics effectiveness: Does it enhance or interfere with statistical reasoning?

    NASA Astrophysics Data System (ADS)

    Agus, M.; Penna, M. P.; Peró-Cebollero, M.; Guàrdia-Olmos, J.

    2015-02-01

    Numerous studies have examined students' difficulties in understanding some notions related to statistical problems. Some authors observed that the presentation of distinct visual representations could increase statistical reasoning, supporting the principle of graphical facilitation. But other researchers disagree with this viewpoint, emphasising the impediments related to the use of illustrations that could overload the cognitive system with insignificant data. In this work we aim to compare probabilistic statistical reasoning across two different formats of problem presentation: graphical and verbal-numerical. We conceived and presented five pairs of homologous simple problems in the verbal-numerical and graphical formats to 311 undergraduate Psychology students (n=156 in Italy and n=155 in Spain) without statistical expertise. The purpose of our work was to evaluate the effect of graphical facilitation in probabilistic statistical reasoning. Each undergraduate solved each pair of problems in the two formats in different problem presentation orders and sequences. Data analyses highlighted that the effect of graphical facilitation is infrequent in psychology undergraduates. This effect is related to many factors (such as knowledge, abilities, attitudes, and anxiety); moreover, it might be considered the result of an interaction between individual and task characteristics.

  19. Improved Statistics for Genome-Wide Interaction Analysis

    PubMed Central

    Ueki, Masao; Cordell, Heather J.

    2012-01-01

    Recently, Wu and colleagues [1] proposed two novel statistics for genome-wide interaction analysis using case/control or case-only data. In computer simulations, their proposed case/control statistic outperformed competing approaches, including the fast-epistasis option in PLINK and logistic regression analysis under the correct model; however, reasons for its superior performance were not fully explored. Here we investigate the theoretical properties and performance of Wu et al.'s proposed statistics and explain why, in some circumstances, they outperform competing approaches. Unfortunately, we find minor errors in the formulae for their statistics, resulting in tests that have higher than nominal type 1 error. We also find minor errors in PLINK's fast-epistasis and case-only statistics, although theory and simulations suggest that these errors have only negligible effect on type 1 error. We propose adjusted versions of all four statistics that, both theoretically and in computer simulations, maintain correct type 1 error rates under the null hypothesis. We also investigate statistics based on correlation coefficients that maintain similar control of type 1 error. Although designed to test specifically for interaction, we show that some of these previously-proposed statistics can, in fact, be sensitive to main effects at one or both loci, particularly in the presence of linkage disequilibrium. We propose two new “joint effects” statistics that, provided the disease is rare, are sensitive only to genuine interaction effects. In computer simulations we find, in most situations considered, that highest power is achieved by analysis under the correct genetic model. Such an analysis is unachievable in practice, as we do not know this model. However, generally high power over a wide range of scenarios is exhibited by our joint effects and adjusted Wu statistics. We recommend use of these alternative or adjusted statistics and urge caution when using Wu et al.'s originally-proposed statistics, on account of the inflated error rate that can result. PMID:22496670

  20. A composite six bp in-frame deletion in the melanocortin 1 receptor (MC1R) gene is associated with the Japanese brindling coat colour in rabbits (Oryctolagus cuniculus).

    PubMed

    Fontanesi, Luca; Scotti, Emilio; Colombo, Michela; Beretti, Francesca; Forestier, Lionel; Dall'Olio, Stefania; Deretz, Séverine; Russo, Vincenzo; Allain, Daniel; Oulmouden, Ahmad

    2010-07-01

    In the domestic rabbit (Oryctolagus cuniculus), classical genetic studies have identified five alleles at the Extension locus: ED (dominant black), ES (steel, a weaker version of ED), E (wild type, normal extension of black), eJ (Japanese brindling, mosaic distribution of black and yellow) and e (non-extension of black, yellow/red with white belly). Sequencing almost the complete coding sequence (CDS) of the rabbit MC1R gene, we recently identified two in-frame deletions associated with dominant black (c.280_285del6; alleles ED or ES) and recessive red (c.304_333del30; allele e) coat colours. It remained to characterize the eJ allele, whose phenotypic effect is similar to that of the Orange and Sex-linked yellow loci of cat and Syrian hamster. We sequenced the whole CDS in 25 rabbits of different coat colours, including 10 Japanese and 10 Rhinelander (tricolour) rabbits, and identified another 6 bp in-frame deletion flanked by a G > A transition in 5' (c.[124G>A;125_130del6]) that was present in all animals with the Japanese brindling coat colour and pattern. These mutations eliminate two amino acids in the first transmembrane domain and, in addition, cause an amino acid substitution at position 44 of the wild type sequence. Genotyping 371 rabbits of 31 breeds with different coat colours showed that this allele (eJ) was present in the homozygous state only in Japanese, Rhinelander and Dutch tricolour rabbits (except one albino rabbit). Rabbits with the eJ/eJ genotype were not fixed for the non-agouti mutation we previously identified in the ASIP gene. Segregation in F1 and F2 families confirmed the order of dominance already determined by classical genetic experiments, with a possible dose effect evident when comparing eJ/eJ and eJ/e animals. MC1R mRNA was expressed in black hair skin regions only. The c.[124A;125_130del6] allele may be responsible for a MC1R variant determining eumelanin production in the black areas. However, the mechanism determining the presence of both red and black hairs in the same animal seems more complex. Expression analyses of the c.[124A;125_130del6] allele suggest that MC1R transcription may be regulated epigenetically in rabbits with the Japanese brindling phenotype. Further studies are needed to clarify this issue.

  1. New heterogeneous test statistics for the unbalanced fixed-effect nested design.

    PubMed

    Guo, Jiin-Huarng; Billard, L; Luh, Wei-Ming

    2011-05-01

    When the underlying variances are unknown and/or unequal, using the conventional F test is problematic in the two-factor hierarchical data structure. Prompted by the approximate test statistics (Welch and Alexander-Govern methods), the authors develop four new heterogeneous test statistics to test factor A and factor B nested within A for the unbalanced fixed-effect two-stage nested design under variance heterogeneity. The actual significance levels and statistical power of the test statistics were compared in a simulation study. The results show that the proposed procedures maintain better Type I error rate control and have greater statistical power than the conventional F test in various conditions. Therefore, the proposed test statistics are recommended in terms of robustness and easy implementation. ©2010 The British Psychological Society.
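
    The approximate heteroscedastic tests mentioned above build on Welch's one-way procedure. The sketch below implements the generic Welch ANOVA on simulated groups with unequal variances; it is an illustration of that building block using the standard textbook formula, not the authors' four nested-design statistics.

```python
# Hedged sketch: Welch's heteroscedastic one-way ANOVA on simulated data.
import numpy as np
from scipy.stats import f

def welch_anova(groups):
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                                   # precision weights
    grand = np.sum(w * m) / np.sum(w)
    a = np.sum(w * (m - grand) ** 2) / (k - 1)
    lam = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    b = 1 + 2 * (k - 2) * lam / (k ** 2 - 1)
    f_stat = a / b
    df1, df2 = k - 1, (k ** 2 - 1) / (3 * lam)
    return f_stat, df1, df2, f.sf(f_stat, df1, df2)

rng = np.random.default_rng(5)
groups = [rng.normal(0.0, 1.0, 20), rng.normal(0.5, 2.0, 35), rng.normal(0.2, 0.5, 15)]
print(welch_anova(groups))   # (F, df1, df2, p-value)
```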

  2. Key statistical and analytical issues for evaluating treatment effects in periodontal research.

    PubMed

    Tu, Yu-Kang; Gilthorpe, Mark S

    2012-06-01

    Statistics is an indispensable tool for evaluating treatment effects in clinical research. Due to the complexities of periodontal disease progression and data collection, statistical analyses for periodontal research have been a great challenge for both clinicians and statisticians. The aim of this article is to provide an overview of several basic, but important, statistical issues related to the evaluation of treatment effects and to clarify some common statistical misconceptions. Some of these issues are general, concerning many disciplines, and some are unique to periodontal research. We first discuss several statistical concepts that have sometimes been overlooked or misunderstood by periodontal researchers. For instance, decisions about whether to use the t-test or analysis of covariance, or whether to use parametric tests such as the t-test or its non-parametric counterpart, the Mann-Whitney U-test, have perplexed many periodontal researchers. We also describe more advanced methodological issues that have sometimes been overlooked by researchers. For instance, the phenomenon of regression to the mean is a fundamental issue to be considered when evaluating treatment effects, and collinearity amongst covariates is a conundrum that must be resolved when explaining and predicting treatment effects. Quick and easy solutions to these methodological and analytical issues are not always available in the literature, and careful statistical thinking is paramount when conducting useful and meaningful research. © 2012 John Wiley & Sons A/S.

  3. The Importance of Teaching Power in Statistical Hypothesis Testing

    ERIC Educational Resources Information Center

    Olinsky, Alan; Schumacher, Phyllis; Quinn, John

    2012-01-01

    In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
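
    As a concrete illustration of the power considerations discussed above, the sketch below uses statsmodels' TTestIndPower for a two-sample t-test; the effect size, alpha, and target power are arbitrary teaching values, not figures from the article.

```python
# A minimal sketch of a prospective power calculation for a two-sample t-test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample size per group needed to detect a medium effect (Cohen's d = 0.5)
# with alpha = 0.05 and 80% power in a two-sided test.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                    alternative='two-sided')
print(round(n_per_group))   # roughly 64 per group

# Conversely, the power achieved with only 30 subjects per group.
power = analysis.solve_power(effect_size=0.5, nobs1=30, alpha=0.05)
print(round(power, 2))
```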

  4. Confidence Intervals for Effect Sizes: Applying Bootstrap Resampling

    ERIC Educational Resources Information Center

    Banjanovic, Erin S.; Osborne, Jason W.

    2016-01-01

    Confidence intervals for effect sizes (CIES) provide readers with an estimate of the strength of a reported statistic as well as the relative precision of the point estimate. These statistics offer more information and context than null hypothesis significance testing. Although confidence intervals have been recommended by scholars for many years,…
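
    A minimal sketch of the bootstrap-resampling idea behind CIES, applied to Cohen's d on simulated two-group data; the data, the 5,000 resamples, and the percentile method are illustrative choices, not the authors' procedure.

```python
# Hedged sketch: a percentile bootstrap confidence interval for Cohen's d.
import numpy as np

rng = np.random.default_rng(0)

def cohens_d(x, y):
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

# Simulated example data; in practice these would be the observed samples.
group1 = rng.normal(0.5, 1.0, size=40)
group2 = rng.normal(0.0, 1.0, size=40)

boot = [cohens_d(rng.choice(group1, len(group1), replace=True),
                 rng.choice(group2, len(group2), replace=True))
        for _ in range(5000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"d = {cohens_d(group1, group2):.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```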

  5. Knowledge level of effect size statistics, confidence intervals and meta-analysis in Spanish academic psychologists.

    PubMed

    Badenes-Ribera, Laura; Frias-Navarro, Dolores; Pascual-Soler, Marcos; Monterde-I-Bort, Héctor

    2016-11-01

    The statistical reform movement and the American Psychological Association (APA) defend the use of estimators of the effect size and its confidence intervals, as well as the interpretation of the clinical significance of the findings. A survey was conducted in which academic psychologists were asked about their behavior in designing and carrying out their studies. The sample was composed of 472 participants (45.8% men). The mean number of years as a university professor was 13.56 (SD = 9.27). The use of effect-size estimators is becoming generalized, as is the consideration of meta-analytic studies. However, several inadequate practices still persist: a traditional model of methodological behavior based on statistical significance tests is maintained, marked by the predominance of Cohen's d and the unadjusted R2/η2 (which are not robust to outliers, departures from normality, or violations of statistical assumptions) and by the under-reporting of confidence intervals of effect-size statistics. The paper concludes with recommendations for improving statistical practice.

  6. Quantifying, displaying and accounting for heterogeneity in the meta-analysis of RCTs using standard and generalised Q statistics

    PubMed Central

    2011-01-01

    Background Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
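
    For orientation, the sketch below computes a fixed-effect pooled estimate, Cochran's Q, and I2 from a handful of made-up trial results; it shows only the standard Q statistic, not the generalised Q methods reviewed in the paper.

```python
# Illustrative sketch (assumed data, not the 18 reviewed meta-analyses):
# fixed-effect pooling with Cochran's Q and the I^2 heterogeneity statistic.
import numpy as np

# Per-trial effect estimates (e.g., log hazard ratios) and their variances.
effects = np.array([-0.30, -0.10, -0.25, 0.05, -0.40])
variances = np.array([0.04, 0.02, 0.05, 0.03, 0.06])

w = 1.0 / variances                       # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)  # fixed-effect pooled estimate
Q = np.sum(w * (effects - pooled) ** 2)   # Cochran's Q
df = len(effects) - 1
I2 = max(0.0, (Q - df) / Q) * 100         # I^2 as a percentage

print(f"pooled effect = {pooled:.3f}, Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%")
```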

  7. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
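
    The diagnostic-accuracy measures covered in the review can be computed directly from a 2x2 table; the sketch below does so for hypothetical counts.

```python
# Sketch with hypothetical counts: sensitivity, specificity, accuracy and
# likelihood ratios from a 2x2 diagnostic table.
tp, fn = 90, 10    # diseased patients: test positive / test negative
tn, fp = 160, 40   # healthy patients: test negative / test positive

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + fn + tn + fp)
lr_positive = sensitivity / (1 - specificity)
lr_negative = (1 - sensitivity) / specificity

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
      f"accuracy={accuracy:.2f}, LR+={lr_positive:.2f}, LR-={lr_negative:.2f}")
```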

  8. Standardized Effect Sizes for Moderated Conditional Fixed Effects with Continuous Moderator Variables

    PubMed Central

    Bodner, Todd E.

    2017-01-01

    Wilkinson and Task Force on Statistical Inference (1999) recommended that researchers include information on the practical magnitude of effects (e.g., using standardized effect sizes) to distinguish between the statistical and practical significance of research results. To date, however, researchers have not widely incorporated this recommendation into the interpretation and communication of the conditional effects and differences in conditional effects underlying statistical interactions involving a continuous moderator variable where at least one of the involved variables has an arbitrary metric. This article presents a descriptive approach to investigate two-way statistical interactions involving continuous moderator variables where the conditional effects underlying these interactions are expressed in standardized effect size metrics (i.e., standardized mean differences and semi-partial correlations). This approach permits researchers to evaluate and communicate the practical magnitude of particular conditional effects and differences in conditional effects using conventional and proposed guidelines, respectively, for the standardized effect size and therefore provides the researcher important supplementary information lacking under current approaches. The utility of this approach is demonstrated with two real data examples and important assumptions underlying the standardization process are highlighted. PMID:28484404

  9. The Effect of a Student-Designed Data Collection: Project on Attitudes toward Statistics

    ERIC Educational Resources Information Center

    Carnell, Lisa J.

    2008-01-01

    Students often enter an introductory statistics class with less than positive attitudes about the subject. They tend to believe statistics is difficult and irrelevant to their lives. Observational evidence from previous studies suggests including projects in a statistics course may enhance students' attitudes toward statistics. This study examines…

  10. Identifying Measurement Disturbance Effects Using Rasch Item Fit Statistics and the Logit Residual Index.

    ERIC Educational Resources Information Center

    Mount, Robert E.; Schumacker, Randall E.

    1998-01-01

    A Monte Carlo study was conducted using simulated dichotomous data to determine the effects of guessing on Rasch item fit statistics and the Logit Residual Index. Results indicate that no significant differences were found between the mean Rasch item fit statistics for each distribution type as the probability of guessing the correct answer…

  11. Teaching Statistical Inference for Causal Effects in Experiments and Observational Studies

    ERIC Educational Resources Information Center

    Rubin, Donald B.

    2004-01-01

    Inference for causal effects is a critical activity in many branches of science and public policy. The field of statistics is the one field most suited to address such problems, whether from designed experiments or observational studies. Consequently, it is arguably essential that departments of statistics teach courses in causal inference to both…

  12. Advances in Testing the Statistical Significance of Mediation Effects

    ERIC Educational Resources Information Center

    Mallinckrodt, Brent; Abraham, W. Todd; Wei, Meifen; Russell, Daniel W.

    2006-01-01

    P. A. Frazier, A. P. Tix, and K. E. Barron (2004) highlighted a normal theory method popularized by R. M. Baron and D. A. Kenny (1986) for testing the statistical significance of indirect effects (i.e., mediator variables) in multiple regression contexts. However, simulation studies suggest that this method lacks statistical power relative to some…

  13. The effect of normalization of Partial Directed Coherence on the statistical assessment of connectivity patterns: a simulation study.

    PubMed

    Toppi, J; Petti, M; Vecchiato, G; Cincotti, F; Salinari, S; Mattia, D; Babiloni, F; Astolfi, L

    2013-01-01

    Partial Directed Coherence (PDC) is a spectral multivariate estimator for effective connectivity, relying on the concept of Granger causality. Even though its original definition derived directly from information theory, two modifications were introduced in order to provide better physiological interpretations of the estimated networks: i) normalization of the estimator by rows, ii) squared transformation. In the present paper we investigated the effect of PDC normalization on the performance of the statistical validation process applied to the investigated connectivity patterns under different conditions of signal-to-noise ratio (SNR) and amount of data available for the analysis. Results of the statistical analysis revealed an effect of PDC normalization only on the percentages of type I and type II errors committed when the Shuffling procedure was used to assess the connectivity patterns. The PDC formulation had no effect on the performance achieved when validation was instead carried out by means of the Asymptotic Statistic approach. Moreover, the percentages of both false positives and false negatives committed by the Asymptotic Statistic approach were always lower than those of the Shuffling procedure, for each type of normalization.

  14. [A Review on the Use of Effect Size in Nursing Research].

    PubMed

    Kang, Hyuncheol; Yeon, Kyupil; Han, Sang Tae

    2015-10-01

    The purpose of this study was to introduce the main concepts of statistical testing and effect size and to provide researchers in nursing science with guidance on how to calculate the effect size for the statistical analysis methods mainly used in nursing. For the t-test, analysis of variance, correlation analysis, and regression analysis, which are used frequently in nursing research, the generally accepted definitions of effect size are explained. Formulae for calculating the effect size are described with several examples from nursing research. Furthermore, the authors present the required minimum sample size for each example using G*Power 3, the most widely used program for calculating sample size. It is noted that statistical significance testing and effect size measurement serve different purposes, and reliance on only one of them may be misleading. Some practical guidelines are recommended for combining statistical significance testing and effect size measures in order to make more balanced decisions in quantitative analyses.
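
    Two of the textbook effect-size formulae discussed in the review, computed from hypothetical summary numbers (the values are not from the article):

```python
# Sketch: Cohen's d from group means and a pooled SD, and eta-squared from
# a one-way ANOVA table (all numbers are hypothetical).
mean_a, mean_b, sd_pooled = 72.0, 65.0, 12.0
d = (mean_a - mean_b) / sd_pooled

ss_between, ss_total = 150.0, 900.0
eta_squared = ss_between / ss_total

print(f"Cohen's d = {d:.2f}, eta^2 = {eta_squared:.2f}")
```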

  15. Whose statistical reasoning is facilitated by a causal structure intervention?

    PubMed

    McNair, Simon; Feeney, Aidan

    2015-02-01

    People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430-450, 2007) proposed that a causal Bayesian framework accounts for people's errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
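
    The classic statistical reasoning problems referred to above ask for a posterior probability from a base rate, a hit rate, and a false-alarm rate. The sketch below works through a generic textbook-style example; the numbers are not the stimuli used in these experiments.

```python
# Hedged illustration of the Bayesian estimate such problems require.
base_rate = 0.01        # P(disease)
hit_rate = 0.80         # P(positive test | disease)
false_alarm = 0.096     # P(positive test | no disease)

p_positive = hit_rate * base_rate + false_alarm * (1 - base_rate)
posterior = hit_rate * base_rate / p_positive   # P(disease | positive test)
print(f"P(disease | positive) = {posterior:.3f}")  # ~0.078, far below the hit rate
```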

  16. An Exploration of Student Attitudes and Satisfaction in a GAISE-Influenced Introductory Statistics Course

    ERIC Educational Resources Information Center

    Paul, Warren; Cunnington, R. Clare

    2017-01-01

    We used the Survey of Attitudes Toward Statistics to (1) evaluate using presemester data the Students' Attitudes Toward Statistics Model (SATS-M), and (2) test the effect on attitudes of an introductory statistics course redesigned according to the Guidelines for Assessment and Instruction in Statistics Education (GAISE) by examining the change in…

  17. The Effect on Prospective Teachers of the Learning Environment Supported by Dynamic Statistics Software

    ERIC Educational Resources Information Center

    Koparan, Timur

    2016-01-01

    In this study, the effect of a learning environment supported by dynamic statistics software on the achievement and attitudes of prospective teachers is examined. With this aim, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study…

  18. Predicting Acquisition of Learning Outcomes: A Comparison of Traditional and Activity-Based Instruction in an Introductory Statistics Course.

    ERIC Educational Resources Information Center

    Geske, Jenenne A.; Mickelson, William T.; Bandalos, Deborah L.; Jonson, Jessica; Smith, Russell W.

    The bulk of experimental research related to reforms in the teaching of statistics concentrates on the effects of alternative teaching methods on statistics achievement. This study expands on that research by including an examination of the effects of instructor and the interaction between instructor and method on achievement as well as attitudes,…

  19. The Effect of Project Based Learning on the Statistical Literacy Levels of Student 8th Grade

    ERIC Educational Resources Information Center

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study examines the effect of project based learning on 8th grade students' statistical literacy levels. A performance test was developed for this aim. Quasi-experimental research model was used in this article. In this context, the statistics were taught with traditional method in the control group and it was taught using project based…

  20. The Effects of Pre-Lecture Quizzes on Test Anxiety and Performance in a Statistics Course

    ERIC Educational Resources Information Center

    Brown, Michael J.; Tallon, Jennifer

    2015-01-01

    The purpose of our study was to examine the effects of pre-lecture quizzes in a statistics course. Students (N = 70) from 2 sections of an introductory statistics course served as participants in this study. One section completed pre-lecture quizzes whereas the other section did not. Completing pre-lecture quizzes was associated with improved exam…

  1. The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning

    ERIC Educational Resources Information Center

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study investigates the effect of the project based learning approach on 8th grade students' attitude towards statistics. With this aim, an attitude scale towards statistics was developed. Quasi-experimental research model was used in this study. Following this model in the control group the traditional method was applied to teach statistics…

  2. A new u-statistic with superior design sensitivity in matched observational studies.

    PubMed

    Rosenbaum, Paul R

    2011-09-01

    In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments-that is, it often has good Pitman efficiency-but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.
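
    For context, the sketch below applies Wilcoxon's signed-rank statistic, the baseline member of the family, to simulated matched-pair differences; the new u-statistics with higher design sensitivity are not implemented here.

```python
# Minimal sketch: Wilcoxon's signed-rank test on simulated matched-pair
# differences (not the paper's new u-statistics or its sensitivity analysis).
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
pair_diffs = rng.normal(loc=0.5, scale=1.0, size=250)  # treated minus control

stat, p_value = wilcoxon(pair_diffs)
print(f"Wilcoxon signed-rank statistic = {stat:.1f}, p = {p_value:.2g}")
```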

  3. Nonparametric estimation and testing of fixed effects panel data models

    PubMed Central

    Henderson, Daniel J.; Carroll, Raymond J.; Li, Qi

    2009-01-01

    In this paper we consider the problem of estimating nonparametric panel data models with fixed effects. We introduce an iterative nonparametric kernel estimator. We also extend the estimation method to the case of a semiparametric partially linear fixed effects model. To determine whether a parametric, semiparametric or nonparametric model is appropriate, we propose test statistics to test between the three alternatives in practice. We further propose a test statistic for testing the null hypothesis of random effects against fixed effects in a nonparametric panel data regression model. Simulations are used to examine the finite sample performance of the proposed estimators and the test statistics. PMID:19444335

  4. Effect Size as the Essential Statistic in Developing Methods for mTBI Diagnosis.

    PubMed

    Gibson, Douglas Brandt

    2015-01-01

    The descriptive statistic known as "effect size" measures the distinguishability of two sets of data. Distinguishability is at the core of diagnosis. This article is intended to point out the importance of effect size in the development of effective diagnostics for mild traumatic brain injury, and the applicability of the effect size statistic in comparing diagnostic efficiency across the main proposed TBI diagnostic methods: psychological, physiological, biochemical, and radiologic. Comparing diagnostic approaches is difficult because different researchers in different fields have different approaches to measuring efficacy. Converting diverse measures to effect sizes, as is done in meta-analysis, is a relatively easy way to make studies comparable.

  5. A powerful approach for association analysis incorporating imprinting effects

    PubMed Central

    Xia, Fan; Zhou, Ji-Yuan; Fung, Wing Kam

    2011-01-01

    Motivation: For a diallelic marker locus, the transmission disequilibrium test (TDT) is a simple and powerful design for genetic studies. The TDT was originally proposed for use in families with both parents available (complete nuclear families) and has further been extended to 1-TDT for use in families with only one of the parents available (incomplete nuclear families). Currently, the increasing interest in the influence of parental imprinting on heritability indicates the importance of incorporating imprinting effects into the mapping of association variants. Results: In this article, we extend the TDT-type statistics to incorporate imprinting effects and develop a series of new test statistics in a general two-stage framework for association studies. Our test statistics enjoy the nature of family-based designs that need no assumption of Hardy–Weinberg equilibrium. Also, the proposed methods accommodate complete and incomplete nuclear families with one or more affected children. In the simulation study, we verify the validity of the proposed test statistics under various scenarios, and compare the powers of the proposed statistics with some existing test statistics. It is shown that our methods greatly improve the power for detecting association in the presence of imprinting effects. We further demonstrate the advantage of our methods by the application of the proposed test statistics to a rheumatoid arthritis dataset. Contact: wingfung@hku.hk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21798962

  6. A powerful approach for association analysis incorporating imprinting effects.

    PubMed

    Xia, Fan; Zhou, Ji-Yuan; Fung, Wing Kam

    2011-09-15

    For a diallelic marker locus, the transmission disequilibrium test (TDT) is a simple and powerful design for genetic studies. The TDT was originally proposed for use in families with both parents available (complete nuclear families) and has further been extended to 1-TDT for use in families with only one of the parents available (incomplete nuclear families). Currently, the increasing interest in the influence of parental imprinting on heritability indicates the importance of incorporating imprinting effects into the mapping of association variants. In this article, we extend the TDT-type statistics to incorporate imprinting effects and develop a series of new test statistics in a general two-stage framework for association studies. Our test statistics enjoy the nature of family-based designs that need no assumption of Hardy-Weinberg equilibrium. Also, the proposed methods accommodate complete and incomplete nuclear families with one or more affected children. In the simulation study, we verify the validity of the proposed test statistics under various scenarios, and compare the powers of the proposed statistics with some existing test statistics. It is shown that our methods greatly improve the power for detecting association in the presence of imprinting effects. We further demonstrate the advantage of our methods by the application of the proposed test statistics to a rheumatoid arthritis dataset. Contact: wingfung@hku.hk. Supplementary data are available at Bioinformatics online.
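
    For reference, the sketch below computes the classical TDT statistic that these methods extend, from hypothetical transmission counts; it does not include the imprinting-aware two-stage statistics developed in the article.

```python
# Hedged sketch of the classical TDT: with b transmissions and c
# non-transmissions of an allele from heterozygous parents, the statistic
# (b - c)^2 / (b + c) is referred to a chi-square distribution with 1 df.
from scipy.stats import chi2

b, c = 120, 85   # hypothetical transmission counts
tdt = (b - c) ** 2 / (b + c)
p_value = chi2.sf(tdt, df=1)
print(f"TDT = {tdt:.2f}, p = {p_value:.3g}")
```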

  7. Corneal biomechanical parameters and intraocular pressure: the effect of topical anesthesia

    PubMed Central

    Ogbuehi, Kelechi C

    2012-01-01

    Background The intraocular pressures and biomechanical parameters measured by the ocular response analyzer make the analyzer a useful tool for the diagnosis and management of anterior segment disease. This observational study was designed to investigate the effect of topical anesthesia on the parameters measured by the ocular response analyzer: corneal hysteresis, corneal resistance factor, Goldmann-correlated intraocular pressure (IOPg), and corneal-compensated intraocular pressure (IOPcc). Methods Two sets of measurements were made for 78 eyes of 39 subjects, approximately 1 week apart. In session 1, each eye of each subject was randomized into one of three groups: polyvinyl alcohol (0.5%), tetracaine hydrochloride (0.5%), or oxybuprocaine hydrochloride (0.4%). In session 2, eyes that were in the polyvinyl alcohol group in session 1 were assigned to the tetracaine group, those in the tetracaine group in session 1 were assigned to oxybuprocaine group, and those in the oxybuprocaine group in session 1 were assigned to the polyvinyl alcohol group. For both sessions, each subject first had his or her central corneal thickness assessed with a specular microscope, followed by measurements of intraocular pressure and corneal biomechanical parameters with the Ocular Response Analyzer. All measurements were repeated for 2 minutes and 5 minutes following the instillation of either polyvinyl alcohol, tetracaine, or oxybuprocaine. The level of statistical significance was 0.05. Results Polyvinyl alcohol, tetracaine hydrochloride, and oxybuprocaine hydrochloride had no statistically significant (P > 0.05) effect on any of the biomechanical parameters of the cornea. There was no statistically significant effect on either IOPg (P > 0.05) or IOPcc (P > 0.05) 2 minutes after the eye drops were instilled in either session. Five minutes after the eye drops were instilled, polyvinyl alcohol showed no statistically significant effect on either IOPg (P > 0.05) or IOPcc (P > 0.05) in either session. Oxybuprocaine and tetracaine caused statistically significant (P < 0.05) reductions in IOPg in session 1, but only tetracaine had a significant (P < 0.05) effect in session 2. Tetracaine also caused a statistically significant (P < 0.05) reduction in IOPcc in session 1. Conclusion The statistically significant effect of topical anesthesia on IOPg varies with the anesthetic used, and while this effect was statistically significant in this study, the small effect is probably not clinically relevant. There was no effect on any of the biomechanical parameters of the cornea. PMID:22791966

  8. Corneal biomechanical parameters and intraocular pressure: the effect of topical anesthesia.

    PubMed

    Ogbuehi, Kelechi C

    2012-01-01

    The intraocular pressures and biomechanical parameters measured by the ocular response analyzer make the analyzer a useful tool for the diagnosis and management of anterior segment disease. This observational study was designed to investigate the effect of topical anesthesia on the parameters measured by the ocular response analyzer: corneal hysteresis, corneal resistance factor, Goldmann-correlated intraocular pressure (IOPg), and corneal-compensated intraocular pressure (IOPcc). Two sets of measurements were made for 78 eyes of 39 subjects, approximately 1 week apart. In session 1, each eye of each subject was randomized into one of three groups: polyvinyl alcohol (0.5%), tetracaine hydrochloride (0.5%), or oxybuprocaine hydrochloride (0.4%). In session 2, eyes that were in the polyvinyl alcohol group in session 1 were assigned to the tetracaine group, those in the tetracaine group in session 1 were assigned to oxybuprocaine group, and those in the oxybuprocaine group in session 1 were assigned to the polyvinyl alcohol group. For both sessions, each subject first had his or her central corneal thickness assessed with a specular microscope, followed by measurements of intraocular pressure and corneal biomechanical parameters with the Ocular Response Analyzer. All measurements were repeated for 2 minutes and 5 minutes following the instillation of either polyvinyl alcohol, tetracaine, or oxybuprocaine. The level of statistical significance was 0.05. Polyvinyl alcohol, tetracaine hydrochloride, and oxybuprocaine hydrochloride had no statistically significant (P > 0.05) effect on any of the biomechanical parameters of the cornea. There was no statistically significant effect on either IOPg (P > 0.05) or IOPcc (P > 0.05) 2 minutes after the eye drops were instilled in either session. Five minutes after the eye drops were instilled, polyvinyl alcohol showed no statistically significant effect on either IOPg (P > 0.05) or IOPcc (P > 0.05) in either session. Oxybuprocaine and tetracaine caused statistically significant (P < 0.05) reductions in IOPg in session 1, but only tetracaine had a significant (P < 0.05) effect in session 2. Tetracaine also caused a statistically significant (P < 0.05) reduction in IOPcc in session 1. The statistically significant effect of topical anesthesia on IOPg varies with the anesthetic used, and while this effect was statistically significant in this study, the small effect is probably not clinically relevant. There was no effect on any of the biomechanical parameters of the cornea.

  9. The Roles of Experience, Gender, and Individual Differences in Statistical Reasoning

    ERIC Educational Resources Information Center

    Martin, Nadia; Hughes, Jeffrey; Fugelsang, Jonathan

    2017-01-01

    We examine the joint effects of gender and experience on statistical reasoning. Participants with various levels of experience in statistics completed the Statistical Reasoning Assessment (Garfield, 2003), along with individual difference measures assessing cognitive ability and thinking dispositions. Although the performance of both genders…

  10. Potentiation Following Ballistic and Nonballistic Complexes: The Effect of Strength Level.

    PubMed

    Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H

    2016-07-01

    Suchomel, TJ, Sato, K, DeWeese, BH, Ebben, WP, and Stone, MH. Potentiation following ballistic and nonballistic complexes: the effect of strength level. J Strength Cond Res 30(7): 1825-1833, 2016-The purpose of this study was to compare the temporal profile of strong and weak subjects during ballistic and nonballistic potentiation complexes. Eight strong (relative back squat = 2.1 ± 0.1 times body mass) and 8 weak (relative back squat = 1.6 ± 0.2 times body mass) males performed squat jumps immediately and every minute up to 10 minutes following potentiation complexes that included ballistic or nonballistic concentric-only half-squat (COHS) performed at 90% of their 1 repetition maximum COHS. Jump height (JH) and allometrically scaled peak power (PPa) were compared using a series of 2 × 12 repeated measures analyses of variance. No statistically significant strength level main effects for JH (p = 0.442) or PPa (p = 0.078) existed during the ballistic condition. In contrast, statistically significant main effects for time existed for both JH (p = 0.014) and PPa (p < 0.001); however, no statistically significant pairwise comparisons were present (p > 0.05). Statistically significant strength level main effects existed for PPa (p = 0.039) but not for JH (p = 0.137) during the nonballistic condition. Post hoc analysis revealed that the strong subjects produced statistically greater PPa than the weaker subjects (p = 0.039). Statistically significant time main effects existed for PPa (p = 0.015), but not for JH (p = 0.178). No statistically significant strength level × time interaction effects for JH (p = 0.319) or PPa (p = 0.203) were present for the ballistic or nonballistic conditions. Practical significance indicated by effect sizes and the relationships between maximum potentiation and relative strength suggest that stronger subjects potentiate earlier and to a greater extent than weaker subjects during ballistic and nonballistic potentiation complexes.

  11. Looking Back over Their Shoulders: A Qualitative Analysis of Portuguese Teachers' Attitudes towards Statistics

    ERIC Educational Resources Information Center

    Martins, Jose Alexandre; Nascimento, Maria Manuel; Estrada, Assumpta

    2012-01-01

    Teachers' attitudes towards statistics can have a significant effect on their own statistical training, their teaching of statistics, and the future attitudes of their students. The influence of attitudes in teaching statistics in different contexts was previously studied in the work of Estrada et al. (2004, 2010a, 2010b) and Martins et al.…

  12. An Update on Statistical Boosting in Biomedicine.

    PubMed

    Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf

    2017-01-01

    Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.

  13. Gene-Based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions.

    PubMed

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y; Chen, Wei

    2016-02-01

    Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, here we develop Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT), which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in the whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. © 2016 WILEY PERIODICALS, INC.

  14. Gene-based Association Analysis for Censored Traits Via Fixed Effect Functional Regressions

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Yan, Qi; Ding, Ying; Weeks, Daniel E.; Lu, Zhaohui; Ren, Haobo; Cook, Richard J; Xiong, Momiao; Swaroop, Anand; Chew, Emily Y.; Chen, Wei

    2015-01-01

    Summary Genetic studies of survival outcomes have been proposed and conducted recently, but statistical methods for identifying genetic variants that affect disease progression are rarely developed. Motivated by our ongoing real studies, we develop here Cox proportional hazard models using functional regression (FR) to perform gene-based association analysis of survival traits while adjusting for covariates. The proposed Cox models are fixed effect models where the genetic effects of multiple genetic variants are assumed to be fixed. We introduce likelihood ratio test (LRT) statistics to test for associations between the survival traits and multiple genetic variants in a genetic region. Extensive simulation studies demonstrate that the proposed Cox FR LRT statistics have well-controlled type I error rates. To evaluate power, we compare the Cox FR LRT with the previously developed burden test (BT) in a Cox model and sequence kernel association test (SKAT) which is based on mixed effect Cox models. The Cox FR LRT statistics have higher power than or similar power as Cox SKAT LRT except when 50%/50% causal variants had negative/positive effects and all causal variants are rare. In addition, the Cox FR LRT statistics have higher power than Cox BT LRT. The models and related test statistics can be useful in the whole genome and whole exome association studies. An age-related macular degeneration dataset was analyzed as an example. PMID:26782979
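
    A rough sketch of the fixed-effect Cox likelihood-ratio test that underlies this approach, fitted with the third-party lifelines package on simulated data; the functional-regression smoothing of variant effects, which is the paper's contribution, is not reproduced here, and all variable names and values are assumptions.

```python
# Hedged sketch: gene-based LRT in a fixed-effect Cox model on simulated data,
# comparing covariates-only vs covariates-plus-variants (requires lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from scipy.stats import chi2

rng = np.random.default_rng(8)
n = 500
df = pd.DataFrame({
    "age": rng.normal(60, 8, n),
    "snp1": rng.binomial(2, 0.2, n),
    "snp2": rng.binomial(2, 0.3, n),
})
hazard = np.exp(0.02 * (df["age"] - 60) + 0.3 * df["snp1"])
df["time"] = rng.exponential(1.0 / hazard)       # simulated survival times
df["event"] = rng.binomial(1, 0.8, n)            # some censoring

reduced = CoxPHFitter().fit(df[["age", "time", "event"]],
                            duration_col="time", event_col="event")
full = CoxPHFitter().fit(df, duration_col="time", event_col="event")

lrt = 2 * (full.log_likelihood_ - reduced.log_likelihood_)
p = chi2.sf(lrt, df=2)                           # two variants added
print(f"LRT = {lrt:.2f}, p = {p:.3g}")
```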

  15. Implementation of Discovery Projects in Statistics

    ERIC Educational Resources Information Center

    Bailey, Brad; Spence, Dianna J.; Sinn, Robb

    2013-01-01

    Researchers and statistics educators consistently suggest that students will learn statistics more effectively by conducting projects through which they actively engage in a broad spectrum of tasks integral to statistical inquiry, in the authentic context of a real-world application. In keeping with these findings, we share an implementation of…

  16. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    ERIC Educational Resources Information Center

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…

  17. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    PubMed

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (Available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
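
    As a crude stand-in for the visual check that gPCA formalizes, the sketch below runs ordinary PCA on simulated expression-like data from two batches and tests whether PC1 separates the batches; the actual gPCA test statistic is implemented in the authors' gPCA R package, not here.

```python
# Hedged sketch: ordinary PCA plus a crude PC1-vs-batch association check
# (not the guided PCA statistic described in the article).
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import f_oneway

rng = np.random.default_rng(2)
n_per_batch, n_features = 30, 200
batch1 = rng.normal(0.0, 1.0, size=(n_per_batch, n_features))
batch2 = rng.normal(0.5, 1.0, size=(n_per_batch, n_features))  # shifted batch
X = np.vstack([batch1, batch2])
batch = np.array([0] * n_per_batch + [1] * n_per_batch)

pc1 = PCA(n_components=2).fit_transform(X)[:, 0]
stat, p = f_oneway(pc1[batch == 0], pc1[batch == 1])
print(f"PC1 vs batch: F = {stat:.1f}, p = {p:.2g}")
```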

  18. Effect of multiple spin species on spherical shell neutron transmission analysis

    NASA Technical Reports Server (NTRS)

    Semler, T. T.

    1972-01-01

    A series of Monte Carlo calculations was performed in order to evaluate the effect of separated versus merged spin statistics on the analysis of spherical shell neutron transmission experiments for gold. It is shown that the use of separated spin statistics results in larger average capture cross sections of gold at 24 keV. This effect is explained by stronger windows in the total cross section caused by the interference between potential and J(+) resonances and by J(+) and J(-) resonance overlap allowed by the use of separated spin statistics.

  19. Further Insight into the Reaction FeO+ + H2 Yields Fe+ + H2O: Temperature Dependent Kinetics, Isotope Effects, and Statistical Modeling (Postprint)

    DTIC Science & Technology

    2014-07-31

    …a laminar flow tube via a Venturi inlet, where ~10^4 to 10^5 collisions with a He buffer gas act to thermalize the ions and carry them downstream…

  20. Effective control of complex turbulent dynamical systems through statistical functionals.

    PubMed

    Majda, Andrew J; Qi, Di

    2017-05-30

    Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.

  1. Characterizing the Joint Effect of Diverse Test-Statistic Correlation Structures and Effect Size on False Discovery Rates in a Multiple-Comparison Study of Many Outcome Measures

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Ploutz-Snyder, Robert; Fiedler, James

    2011-01-01

    In their 2009 Annals of Statistics paper, Gavrilov, Benjamini, and Sarkar report the results of a simulation assessing the robustness of their adaptive step-down procedure (GBS) for controlling the false discovery rate (FDR) when normally distributed test statistics are serially correlated. In this study we extend the investigation to the case of multiple comparisons involving correlated non-central t-statistics, in particular when several treatments or time periods are being compared to a control in a repeated-measures design with many dependent outcome measures. In addition, we consider several dependence structures other than serial correlation and illustrate how the FDR depends on the interaction between effect size and the type of correlation structure as indexed by Foerstner's distance metric from an identity. The relationship between the correlation matrix R of the original dependent variables and the correlation matrix of the associated t-statistics is also studied; in general, the latter depends not only on R, but also on sample size and the signed effect sizes for the multiple comparisons.
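
    For orientation, the sketch below applies the standard Benjamini-Hochberg step-up procedure to simulated p-values; the adaptive GBS procedure studied in the paper and the correlated non-central t-statistics are not simulated here.

```python
# Hedged sketch: Benjamini-Hochberg FDR control on assumed p-values.
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
p_null = rng.uniform(size=90)             # 90 true null hypotheses
p_alt = rng.beta(0.5, 10.0, size=10)      # 10 non-null tests with small p-values
p_values = np.concatenate([p_null, p_alt])

reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method='fdr_bh')
print(f"{reject.sum()} discoveries at FDR 0.05")
```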

  2. Health significance and statistical uncertainty. The value of P-value.

    PubMed

    Consonni, Dario; Bertazzi, Pier Alberto

    2017-10-27

    The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P<0.05" (defined as "statistically significant") and "P>0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. Our aim is to show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors. We provide examples of distorted use of the P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters, the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not consider the whole interval of the CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistically significant or not). When reporting the statistical results of scientific research, present effect estimates with their confidence intervals and do not qualify the P-value as "significant" or "not significant".
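
    A worked example of the recommended reporting style, using hypothetical cohort counts: the effect estimate (a risk ratio) is presented with its 95% confidence interval rather than a bare P-value.

```python
# Sketch: risk ratio and its 95% CI from hypothetical cohort counts.
import math

# Exposed: 30 cases out of 200; unexposed: 15 cases out of 200.
a, n1 = 30, 200
b, n2 = 15, 200

rr = (a / n1) / (b / n2)
se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {ci_low:.2f} to {ci_high:.2f}")
```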

  3. "Suicide shall cease to be a crime": suicide and undetermined death trends 1970-2000 before and after the decriminalization of suicide in Ireland 1993.

    PubMed

    Osman, Mugtaba; Parnell, Andrew C; Haley, Clifford

    2017-02-01

    Suicide is criminalized in more than 100 countries around the world. A dearth of research exists into the effect of suicide legislation on suicide rates, and the available statistics are mixed. This study investigates 10,353 suicide deaths in Ireland that took place between 1970 and 2000. Irish 1970-2000 annual suicide data were obtained from the Central Statistics Office and modelled via a negative binomial regression approach. We examined the effect of suicide legislation on different age groups and on both sexes. We used Bonferroni correction for multiple modelling. Statistical analysis was performed using the R statistical package version 3.1.2. The coefficient for the effect of the suicide act on overall suicide deaths was -9.094 (95% confidence interval (CI) -34.086 to 15.899), which was statistically non-significant (p = 0.476). The coefficient for the effect of the suicide act on undetermined deaths was statistically significant (p < 0.001) and was estimated to be -644.4 (95% CI -818.6 to -469.9). The results of our study indicate that legalization of suicide is not associated with a significant increase in subsequent suicide deaths. However, undetermined death verdict rates dropped significantly following the legalization of suicide.
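
    A minimal sketch of the kind of negative binomial regression used here, fitted to simulated annual counts with an indicator for the 1993 act and a population offset; the variable names, simulated values, and fixed dispersion are assumptions for illustration, not the study's data or code.

```python
# Hedged sketch: negative binomial regression of annual counts on a linear
# trend and a post-law indicator, with a log-population offset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
years = np.arange(1970, 2001)
post_law = (years >= 1993).astype(float)          # indicator for the 1993 act
population = np.full(years.shape, 3.5e6)          # assumed constant population
expected = np.exp(-9.5 + 0.02 * (years - 1970) + 0.05 * post_law) * population
counts = rng.poisson(expected)                    # stand-in for observed deaths

X = sm.add_constant(np.column_stack([years - 1970, post_law]))
model = sm.GLM(counts, X, family=sm.families.NegativeBinomial(),
               offset=np.log(population))
result = model.fit()
print(result.params)   # the coefficient on post_law estimates the law's effect
```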

  4. Anti-TNF-A therapy about infliximab and adalimamab for the effectiveness in ulcerative colitis compared with conventional therapy: a meta-analysis.

    PubMed

    Zhou, Zheng; Dai, Cong; Liu, Wei-Xin

    2015-01-01

    TNF-α has an important role in the pathogenesis of ulcerative colitis (UC), and anti-TNF-α therapy appears to be beneficial in the treatment of UC. The aim was to assess the effectiveness of infliximab and adalimumab in UC compared with conventional therapy. The Pubmed and Embase databases were searched for studies investigating the efficacy of infliximab and adalimumab in UC. Infliximab had a statistically significant effect on induction of clinical response (RR = 1.67; 95% CI 1.12 to 2.50) in UC compared with conventional therapy, but did not have a statistically significant effect on clinical remission (RR = 1.63; 95% CI 0.84 to 3.18) or on reduction of the colectomy rate (RR = 0.54; 95% CI 0.26 to 1.12). Adalimumab had a statistically significant effect on induction of clinical remission (RR = 1.82; 95% CI 1.24 to 2.67) and clinical response (RR = 1.36; 95% CI 1.13 to 1.64) in UC compared with conventional therapy. Our meta-analyses suggest that infliximab has a statistically significant effect on induction of clinical response in UC compared with conventional therapy, and that adalimumab has a statistically significant effect on induction of both clinical remission and clinical response.

  5. Anti-TNF-A Therapy about Infliximab and Adalimamab for the Effectiveness in Ulcerative Colitis Compared with Conventional Therapy: A Meta-Analysis.

    PubMed

    Zhou, Zheng; Dai, Cong; Liu, Wei-xin

    2015-06-01

    TNF-α has an important role in the pathogenesis of ulcerative colitis (UC), and anti-TNF-α therapy appears to be beneficial in the treatment of UC. The aim was to assess the effectiveness of infliximab and adalimumab in UC compared with conventional therapy. The Pubmed and Embase databases were searched for studies investigating the efficacy of infliximab and adalimumab in UC. Infliximab had a statistically significant effect on induction of clinical response (RR = 1.67; 95% CI 1.12 to 2.50) in UC compared with conventional therapy, but did not have a statistically significant effect on clinical remission (RR = 1.63; 95% CI 0.84 to 3.18) or on reduction of the colectomy rate (RR = 0.54; 95% CI 0.26 to 1.12). Adalimumab had a statistically significant effect on induction of clinical remission (RR = 1.82; 95% CI 1.24 to 2.67) and clinical response (RR = 1.36; 95% CI 1.13 to 1.64) in UC compared with conventional therapy. Our meta-analyses suggest that infliximab has a statistically significant effect on induction of clinical response in UC compared with conventional therapy, and that adalimumab has a statistically significant effect on induction of both clinical remission and clinical response.

  6. Strengthening Statistics Graduate Programs with Statistical Collaboration--The Case of Hawassa University, Ethiopia

    ERIC Educational Resources Information Center

    Goshu, Ayele Taye

    2016-01-01

    This paper describes the experiences gained from the statistical collaboration center established at Hawassa University in May 2015 as part of the LISA 2020 [Laboratory for Interdisciplinary Statistical Analysis] network. The center has a setup similar to that of LISA at Virginia Tech. Statisticians are trained on how to become more effective scientific…

  7. A Unifying Framework for Teaching Nonparametric Statistical Tests

    ERIC Educational Resources Information Center

    Bargagliotti, Anna E.; Orrison, Michael E.

    2014-01-01

    Increased importance is being placed on statistics at both the K-12 and undergraduate level. Research divulging effective methods to teach specific statistical concepts is still widely sought after. In this paper, we focus on best practices for teaching topics in nonparametric statistics at the undergraduate level. To motivate the work, we…

  8. Inferring Demographic History Using Two-Locus Statistics.

    PubMed

    Ragsdale, Aaron P; Gutenkunst, Ryan N

    2017-06-01

    Population demographic history may be learned from contemporary genetic variation data. Methods based on aggregating the statistics of many single loci into an allele frequency spectrum (AFS) have proven powerful, but such methods ignore potentially informative patterns of linkage disequilibrium (LD) between neighboring loci. To leverage such patterns, we developed a composite-likelihood framework for inferring demographic history from aggregated statistics of pairs of loci. Using this framework, we show that two-locus statistics are more sensitive to demographic history than single-locus statistics such as the AFS. In particular, two-locus statistics escape the notorious confounding of depth and duration of a bottleneck, and they provide a means to estimate effective population size based on the recombination rate rather than the mutation rate. We applied our approach to a Zambian population of Drosophila melanogaster. Notably, using both single- and two-locus statistics, we inferred a substantially lower ancestral effective population size than previous works and did not infer a bottleneck history. Together, our results demonstrate the broad potential for two-locus statistics to enable powerful population genetic inference. Copyright © 2017 by the Genetics Society of America.
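    As a point of reference for the single-locus statistics that the two-locus framework above generalizes, the following toy sketch builds an allele frequency spectrum (AFS) from a synthetic 0/1 haplotype matrix; it is not the authors' software, and the matrix dimensions are arbitrary.

```python
# Toy sketch: build an (unfolded) allele frequency spectrum from a 0/1 haplotype
# matrix (rows = sampled haplotypes, columns = loci). Synthetic data only; the
# two-locus statistics in the paper generalize this single-locus summary.
import numpy as np

rng = np.random.default_rng(1)
haplotypes = rng.integers(0, 2, size=(20, 1000))    # 20 haplotypes, 1000 loci

derived_counts = haplotypes.sum(axis=0)             # derived-allele count per locus
n = haplotypes.shape[0]
afs = np.bincount(derived_counts, minlength=n + 1)  # entry i = number of loci at count i
print(afs[1:n])                                     # the polymorphic entries
```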

  9. Statistical characteristics of trajectories of diamagnetic unicellular organisms in a magnetic field.

    PubMed

    Gorobets, Yu I; Gorobets, O Yu

    2015-01-01

    A statistical model is proposed in this paper for describing the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The model is applicable when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria exhibit significant "active random movement", i.e. randomizing motion of a non-thermal nature, for example movement by means of flagella. The energy of this randomizing active self-motion is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Perceptual statistical learning over one week in child speech production.

    PubMed

    Richtsmeier, Peter T; Goffman, Lisa

    2017-07-01

    What cognitive mechanisms account for the trajectory of speech sound development, in particular, gradually increasing accuracy during childhood? An intriguing potential contributor is statistical learning, a type of learning that has been studied frequently in infant perception but less often in child speech production. To assess the relevance of statistical learning to developing speech accuracy, we carried out a statistical learning experiment with four- and five-year-olds in which statistical learning was examined over one week. Children were familiarized with and tested on word-medial consonant sequences in novel words. There was only modest evidence for statistical learning, primarily in the first few productions of the first session. This initial learning effect nevertheless aligns with previous statistical learning research. Furthermore, the overall learning effect was similar to an estimate of weekly accuracy growth based on normative studies. The results implicate other important factors in speech sound development, particularly learning via production. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Modelling unsupervised online-learning of artificial grammars: linking implicit and statistical learning.

    PubMed

    Rohrmeier, Martin A; Cross, Ian

    2014-07-01

    Humans rapidly learn complex structures in various domains. Findings of above-chance performance of some untrained control groups in artificial grammar learning studies raise questions about the extent to which learning can occur in an untrained, unsupervised testing situation with both correct and incorrect structures. The plausibility of unsupervised online-learning effects was modelled with n-gram, chunking and simple recurrent network models. A novel evaluation framework was applied, which alternates forced binary grammaticality judgments and subsequent learning of the same stimulus. Our results indicate a strong online learning effect for n-gram and chunking models and a weaker effect for simple recurrent network models. Such findings suggest that online learning is a plausible effect of statistical chunk learning that is possible when ungrammatical sequences contain a large proportion of grammatical chunks. Such common effects of continuous statistical learning may underlie statistical and implicit learning paradigms and raise implications for study design and testing methodologies. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Multidimensional effects in nonadiabatic statistical theories of spin-forbidden kinetics. A case study of 3O + CO → CO2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasper, Ahren

    2015-04-14

    The appropriateness of treating crossing seams of electronic states of different spins as nonadiabatic transition states in statistical calculations of spin-forbidden reaction rates is considered. We show that the spin-forbidden reaction coordinate, the nuclear coordinate perpendicular to the crossing seam, is coupled to the remaining nuclear degrees of freedom. We found that this coupling gives rise to multidimensional effects that are not typically included in statistical treatments of spin-forbidden kinetics. Three qualitative categories of multidimensional effects may be identified: static multidimensional effects due to the geometry-dependence of the local shape of the crossing seam and of the spin–orbit coupling, dynamical multidimensional effects due to energy exchange with the reaction coordinate during the seam crossing, and nonlocal (history-dependent) multidimensional effects due to interference of the electronic variables at second, third, and later seam crossings. Nonlocal multidimensional effects are intimately related to electronic decoherence, where electronic dephasing acts to erase the history of the system. A semiclassical model based on short-time full-dimensional trajectories that includes all three multidimensional effects as well as a model for electronic decoherence is presented. The results of this multidimensional nonadiabatic statistical theory (MNST) for the 3O + CO → CO2 reaction are compared with the results of statistical theories employing one-dimensional (Landau–Zener and weak coupling) models for the transition probability and with those calculated previously using multistate trajectories. The MNST method is shown to accurately reproduce the multistate decay-of-mixing trajectory results, so long as consistent thresholds are used. Furthermore, the MNST approach has several advantages over multistate trajectory approaches and is more suitable in chemical kinetics calculations at low temperatures and for complex systems. The error in statistical calculations that neglect multidimensional effects is shown to be as large as a factor of 2 for this system, with static multidimensional effects identified as the largest source of error.

  13. A Study of the Effectiveness of the Contextual Lab Activity in the Teaching and Learning Statistics at the UTHM (Universiti Tun Hussein Onn Malaysia)

    ERIC Educational Resources Information Center

    Kamaruddin, Nafisah Kamariah Md; Jaafar, Norzilaila bt; Amin, Zulkarnain Md

    2012-01-01

    Inaccurate concepts in statistics contribute to the assumption by students that statistics do not relate to the real world and are not relevant to the engineering field. Some universities have introduced the learning of statistics using statistics lab activities. However, the learning focuses more on how to use the software and not to…

  14. Patient education about anticoagulant medication: is narrative evidence or statistical evidence more effective?

    PubMed

    Mazor, Kathleen M; Baril, Joann; Dugan, Elizabeth; Spencer, Frederick; Burgwinkle, Pamela; Gurwitz, Jerry H

    2007-12-01

    The aim was to determine the relative impact of incorporating narrative evidence, statistical evidence or both into patient education about warfarin, a widely used oral anticoagulant medication. A total of 600 patients receiving anticoagulant therapy were randomly assigned to view one of three versions of a video depicting a physician-patient encounter where anticoagulation treatment was discussed, or usual care (no video). The videos differed in whether the physician used narrative evidence (patient anecdotes), statistical evidence, or both to highlight key information. A total of 317 patients completed both the baseline and post-test questionnaires. Questions assessed knowledge, beliefs and adherence to medication and laboratory monitoring regimens. All three approaches positively affected patients' warfarin-related knowledge and beliefs about the importance of lab testing; there was also some indication that viewing a video strengthened belief in the benefits of warfarin. There was some indication that narrative evidence had a greater impact than statistical evidence on beliefs about the importance of lab testing and on knowledge. No other evidence of the differential effectiveness of either approach was found. No statistically significant effect was found on intent to adhere or on documented adherence to lab monitoring. Videos depicting a physician-patient dialogue about warfarin were effective in educating patients about anticoagulant medication, and had a positive impact on their beliefs. The use of narrative evidence in the form of patient anecdotes may be more effective than statistical evidence for some patient outcomes. Patients on oral anticoagulant therapy may benefit from periodic educational efforts reinforcing key medication safety information, even after initial education and ongoing monitoring. Incorporating patient anecdotes into physician-patient dialogues or educational materials may increase the effectiveness of the message.

  15. Effects of an Instructional Gaming Characteristic on Learning Effectiveness, Efficiency, and Engagement: Using a Storyline to Teach Basic Statistical Analytical Skills

    ERIC Educational Resources Information Center

    Novak, Elena

    2012-01-01

    The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. In addition, the study focused on examining the effects of a storyline GC on specific learning…

  16. The effect of iconicity of visual displays on statistical reasoning: evidence in favor of the null hypothesis.

    PubMed

    Sirota, Miroslav; Kostovičová, Lenka; Juanchich, Marie

    2014-08-01

    Knowing which properties of visual displays facilitate statistical reasoning has practical and theoretical implications. Therefore, we studied the effect of one property of visual displays, iconicity (i.e., the resemblance of a visual sign to its referent), on Bayesian reasoning. Two main accounts of statistical reasoning predict different effects of iconicity on Bayesian reasoning. The ecological-rationality account predicts a positive iconicity effect, because more highly iconic signs resemble more individuated objects, which tap better into an evolutionarily designed frequency-coding mechanism that, in turn, facilitates Bayesian reasoning. The nested-sets account predicts a null iconicity effect, because iconicity does not affect the salience of the nested-sets structure, the factor facilitating Bayesian reasoning, which is processed by a general reasoning mechanism. In two well-powered experiments (N = 577), we found no support for a positive iconicity effect across different iconicity levels that were manipulated in different visual displays (meta-analytical overall effect: log OR = -0.13, 95% CI [-0.53, 0.28]). A Bayes factor analysis provided strong evidence in favor of the null hypothesis, that is, a null iconicity effect. Thus, these findings corroborate the nested-sets rather than the ecological-rationality account of statistical reasoning.
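    For readers unfamiliar with the scale on which the overall effect above is reported, the following sketch computes a log odds ratio and its 95% CI from a single invented 2x2 table of correct and incorrect Bayesian answers; the counts are hypothetical and not taken from the experiments.

```python
# Worked example: log odds ratio with a 95% CI from one invented 2x2 table of
# correct/incorrect Bayesian answers under two display conditions.
import numpy as np

a, b = 42, 58   # correct / incorrect, high-iconicity display (hypothetical counts)
c, d = 45, 55   # correct / incorrect, low-iconicity display (hypothetical counts)

log_or = np.log((a * d) / (b * c))
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)          # Woolf standard error
lo, hi = log_or - 1.96 * se, log_or + 1.96 * se
print(f"log OR = {log_or:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```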

  17. An Assessment of the Effectiveness of Air Force Risk Management Practices in Program Acquisition Using Survey Instrument Analysis

    DTIC Science & Technology

    2015-06-18

    Engineering Effectiveness Survey. CMU/SEI-2012-SR-009. Carnegie Mellon University. November 2012. Field, Andy. Discovering Statistics Using SPSS, 3rd...enough into the survey to begin answering questions on risk practices. All of the statistical analysis of the data will be performed using SPSS. Prior to...probabilistically using distributions for likelihood and impact. Statistical methods like Monte Carlo can more comprehensively evaluate the cost and

  18. Statistical Approach To Extraction Of Texture In SAR

    NASA Technical Reports Server (NTRS)

    Rignot, Eric J.; Kwok, Ronald

    1992-01-01

    An improved statistical method for the extraction of textural features in synthetic-aperture-radar (SAR) images takes account of the effects of the scheme used to sample raw SAR data, system noise, the resolution of the radar equipment, and speckle. The treatment of speckle is incorporated into an overall statistical treatment of speckle, system noise, and natural variations in texture. The speckle autocorrelation function is computed from the system transfer function, which expresses the effect of the radar aperture and incorporates the range and azimuth resolutions.

  19. A quadratically regularized functional canonical correlation analysis for identifying the global structure of pleiotropy with NGS data

    PubMed Central

    Zhu, Yun; Fan, Ruzong; Xiong, Momiao

    2017-01-01

    Investigating the pleiotropic effects of genetic variants can increase statistical power, provide important information to achieve deep understanding of the complex genetic structures of disease, and offer powerful tools for designing effective treatments with fewer side effects. However, the current multiple phenotype association analysis paradigm lacks breadth (number of phenotypes and genetic variants jointly analyzed at the same time) and depth (hierarchical structure of phenotypes and genotypes). A key issue for high dimensional pleiotropic analysis is to effectively extract informative internal representations and features from high dimensional genotype and phenotype data. To explore the correlation information of genetic variants, effectively reduce data dimensions, and overcome critical barriers in advancing the development of novel statistical methods and computational algorithms for genetic pleiotropic analysis, we propose a new statistical method, referred to as quadratically regularized functional CCA (QRFCCA), for association analysis, which combines three approaches: (1) quadratically regularized matrix factorization, (2) functional data analysis and (3) canonical correlation analysis (CCA). Large-scale simulations show that QRFCCA has much higher power than the ten competing statistics while retaining appropriate type I error rates. To further evaluate performance, QRFCCA and the ten other statistics are applied to the whole genome sequencing dataset from the TwinsUK study. We identify a total of 79 genes with rare variants and 67 genes with common variants significantly associated with the 46 traits using QRFCCA. The results show that QRFCCA substantially outperforms the ten other statistics. PMID:29040274
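    The following is a baseline sketch only: ordinary canonical correlation analysis on synthetic genotype and phenotype matrices using scikit-learn. It omits the quadratic regularization and functional smoothing that define QRFCCA, and the data dimensions are invented.

```python
# Baseline sketch only: ordinary CCA on synthetic genotype (X) and phenotype (Y)
# matrices. QRFCCA's quadratic regularization and functional smoothing are not
# implemented here.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(2)
X = rng.integers(0, 3, size=(200, 50)).astype(float)   # 200 subjects, 50 variants (0/1/2)
Y = rng.normal(size=(200, 5))                          # 5 phenotypes
Y[:, 0] += 0.4 * X[:, 0]                               # plant one genotype-phenotype link

cca = CCA(n_components=2).fit(X, Y)
U, V = cca.transform(X, Y)
r1 = np.corrcoef(U[:, 0], V[:, 0])[0, 1]               # first canonical correlation
print(f"first canonical correlation: {r1:.2f}")
```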

  20. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    ERIC Educational Resources Information Center

    Porter, Kristin E.

    2018-01-01

    Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…
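    As an illustration of the multiple testing procedures the (truncated) abstract above refers to, here is a minimal sketch using statsmodels with made-up p-values; the choice of the Holm procedure is an assumption for illustration, not a recommendation from the paper.

```python
# Sketch: apply a multiple testing procedure to a family of made-up p-values,
# one per outcome/subgroup, as discussed in the abstract above.
from statsmodels.stats.multitest import multipletests

pvals = [0.003, 0.021, 0.045, 0.18, 0.41]
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")
for p, pa, r in zip(pvals, p_adj, reject):
    print(f"raw p = {p:.3f}  adjusted p = {pa:.3f}  reject: {r}")
```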

  1. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    PubMed

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  2. Can You Explain that in Plain English? Making Statistics Group Projects Work in a Multicultural Setting

    ERIC Educational Resources Information Center

    Sisto, Michelle

    2009-01-01

    Students increasingly need to learn to communicate statistical results clearly and effectively, as well as to become competent consumers of statistical information. These two learning goals are particularly important for business students. In line with reform movements in Statistics Education and the GAISE guidelines, we are working to implement…

  3. Influence of environmental statistics on inhibition of saccadic return

    PubMed Central

    Farrell, Simon; Ludwig, Casimir J. H.; Ellis, Lucy A.; Gilchrist, Iain D.

    2009-01-01

    Initiating an eye movement is slowed if the saccade is directed to a location that has been fixated in the recent past. We show that this inhibitory effect is modulated by the temporal statistics of the environment: If a return location is likely to become behaviorally relevant, inhibition of return is absent. By fitting an accumulator model of saccadic decision-making, we show that the inhibitory effect and the sensitivity to local statistics can be dissociated in their effects on the rate of accumulation of evidence, and the threshold controlling the amount of evidence needed to generate a saccade. PMID:20080778

  4. A Not-So-Fundamental Limitation on Studying Complex Systems with Statistics: Comment on Rabin (2011)

    NASA Astrophysics Data System (ADS)

    Thomas, Drew M.

    2012-12-01

    Although living organisms are affected by many interrelated and unidentified variables, this complexity does not automatically impose a fundamental limitation on statistical inference. Nor need one invoke such complexity as an explanation of the "Truth Wears Off" or "decline" effect; similar "decline" effects occur with far simpler systems studied in physics. Selective reporting and publication bias, and scientists' biases in favor of reporting eye-catching results (in general) or conforming to others' results (in physics) better explain this feature of the "Truth Wears Off" effect than Rabin's suggested limitation on statistical inference.

  5. The statistical average of optical properties for alumina particle cluster in aircraft plume

    NASA Astrophysics Data System (ADS)

    Li, Jingying; Bai, Lu; Wu, Zhensen; Guo, Lixin

    2018-04-01

    We establish a model in which the monomer radius and monomer number of alumina particle clusters in the plume follow lognormal distributions. Based on the Multi-Sphere T-Matrix (MSTM) theory, we provide a method for finding the statistical average of the optical properties of alumina particle clusters in the plume, analyze the effect of different distributions and different detection wavelengths on this statistical average, and compare the statistically averaged optical properties under the alumina particle cluster model established in this study with those under three simplified alumina particle models. The calculations show that the monomer number of an alumina particle cluster and its size distribution have a considerable effect on its statistically averaged optical properties. The statistically averaged optical properties at common detection wavelengths exhibit obvious differences, which have a great effect on modeling the IR and UV radiation properties of the plume. Compared with the three simplified models, the alumina particle cluster model herein features both higher extinction and scattering efficiencies. An accurate description of the scattering properties of alumina particles in an aircraft plume is therefore of great significance for the study of plume radiation properties.

  6. Statistical characterization of the fatigue behavior of composite lamina

    NASA Technical Reports Server (NTRS)

    Yang, J. N.; Jones, D. L.

    1979-01-01

    A theoretical model was developed to predict statistically the effects of constant and variable amplitude fatigue loadings on the residual strength and fatigue life of composite lamina. The parameters in the model were established from the results of a series of static tensile tests and a fatigue scan, and a number of verification tests were performed. Abstracts for two other papers on the effect of load sequence on the statistical fatigue of composites are also presented.

  7. Enhanced understanding of the relationship between erection and satisfaction in ED treatment: application of a longitudinal mediation model.

    PubMed

    Bushmakin, A G; Cappelleri, J C; Symonds, T; Stecher, V J

    2014-01-01

    To apportion the direct effect and the indirect effect (through erections) that sildenafil (vs placebo) has on individual satisfaction and couple satisfaction over time, longitudinal mediation modeling was applied to outcomes on the Sexual Experience Questionnaire. The model included data from weeks 4 and 10 (double-blind phase) and week 16 (open-label phase) of a controlled study. Data from 167 patients with erectile dysfunction (ED) were available for analysis. Estimation of statistical significance was based on bootstrap simulations, which allowed inferences at and between time points. Percentages (and corresponding 95% confidence intervals) for direct and indirect effects of treatment were calculated using the model. For the individual satisfaction and couple satisfaction domains, direct treatment effects were negligible (not statistically significant) whereas indirect treatment effects via the erection domain represented >90% of the treatment effects (statistically significant). Week 4 vs week 10 percentages of direct and indirect effects were not statistically different, indicating that the mediation effects are longitudinally invariant. As there was no placebo arm in the open-label phase, mediation effects at week 16 were not estimable. In conclusion, erection has a crucial role as a mediator in restoring individual satisfaction and couple satisfaction in men with ED treated with sildenafil.
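    The following is a minimal cross-sectional sketch of estimating an indirect (mediated) effect as the product of the treatment-to-mediator and mediator-to-outcome paths, with a bootstrap confidence interval. It uses synthetic data and is not the longitudinal mediation model applied in the study.

```python
# Minimal sketch of a bootstrap test of an indirect (mediated) effect:
# treatment -> mediator (a path) -> outcome (b path), indirect effect = a * b.
# Synthetic cross-sectional data; the study uses a longitudinal mediation model.
import numpy as np

rng = np.random.default_rng(3)
n = 167
treat = rng.integers(0, 2, size=n).astype(float)
mediator = 1.0 * treat + rng.normal(size=n)                    # e.g., erection score
outcome = 0.8 * mediator + 0.05 * treat + rng.normal(size=n)   # e.g., satisfaction

def indirect(idx):
    t, m, y = treat[idx], mediator[idx], outcome[idx]
    a = np.polyfit(t, m, 1)[0]                           # slope of mediator on treatment
    design = np.column_stack([np.ones_like(t), t, m])    # outcome on treatment + mediator
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]
    return a * b

boot = [indirect(rng.integers(0, n, size=n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% CI for the indirect effect: [{lo:.2f}, {hi:.2f}]")
```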

  8. Statistical projection effects in a hydrodynamic pilot-wave system

    NASA Astrophysics Data System (ADS)

    Sáenz, Pedro J.; Cristea-Platon, Tudor; Bush, John W. M.

    2018-03-01

    Millimetric liquid droplets can walk across the surface of a vibrating fluid bath, self-propelled through a resonant interaction with their own guiding or `pilot' wave fields. These walking droplets, or `walkers', exhibit several features previously thought to be peculiar to the microscopic, quantum realm. In particular, walkers confined to circular corrals manifest a wave-like statistical behaviour reminiscent of that of electrons in quantum corrals. Here we demonstrate that localized topological inhomogeneities in an elliptical corral may lead to resonant projection effects in the walker's statistics similar to those reported in quantum corrals. Specifically, we show that a submerged circular well may drive the walker to excite specific eigenmodes in the bath that result in drastic changes in the particle's statistical behaviour. The well tends to attract the walker, leading to a local peak in the walker's position histogram. By placing the well at one of the foci, a mode with maxima near the foci is preferentially excited, leading to a projection effect in the walker's position histogram towards the empty focus, an effect strongly reminiscent of the quantum mirage. Finally, we demonstrate that the mean pilot-wave field has the same form as the histogram describing the walker's statistics.

  9. Bootstrap versus Statistical Effect Size Corrections: A Comparison with Data from the Finding Embedded Figures Test.

    ERIC Educational Resources Information Center

    Thompson, Bruce; Melancon, Janet G.

    Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…

  10. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    ERIC Educational Resources Information Center

    Porter, Kristin E.

    2016-01-01

    In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…

  11. A General Model for Estimating and Correcting the Effects of Nonindependence in Meta-Analysis.

    ERIC Educational Resources Information Center

    Strube, Michael J.

    A general model is described which can be used to represent the four common types of meta-analysis: (1) estimation of effect size by combining study outcomes; (2) estimation of effect size by contrasting study outcomes; (3) estimation of statistical significance by combining study outcomes; and (4) estimation of statistical significance by…

  12. Effect size and statistical power in the rodent fear conditioning literature - A systematic review.

    PubMed

    Carneiro, Clarissa F D; Moulin, Thiago C; Macleod, Malcolm R; Amaral, Olavo B

    2018-01-01

    Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science.

  13. Effect size and statistical power in the rodent fear conditioning literature – A systematic review

    PubMed Central

    Macleod, Malcolm R.

    2018-01-01

    Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science. PMID:29698451
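    Below is a sketch of the kind of a priori sample-size and power calculation the review finds is rarely reported, using statsmodels. The effect size (Cohen's d = 1.0) and the group size of 10 are assumed illustrative values, not figures taken from the review.

```python
# Sketch of an a priori sample-size / power calculation for a two-group design.
# The effect size (Cohen's d = 1.0) and n = 10 are assumed illustrative values.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=1.0, power=0.80, alpha=0.05)
print(f"animals per group for 80% power: {n_per_group:.1f}")

achieved = analysis.solve_power(effect_size=1.0, nobs1=10, alpha=0.05)
print(f"power with 10 animals per group: {achieved:.2f}")
```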

  14. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    PubMed

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is not much. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
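    A minimal sketch of the minimum p-value approach described above, assuming synthetic two-group data and two pre-specified test statistics (a t-test and a rank-based test); the permutation distribution of the minimum p-value supplies the reference for inference.

```python
# Sketch of the minimum p-value permutation approach: pre-specify several test
# statistics, take the smallest p-value, and calibrate it against the
# permutation distribution of that minimum. Synthetic two-group data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, size=60)                 # control arm (synthetic)
y = rng.normal(0.4, 1.0, size=60)                 # treated arm (synthetic)

def min_p(a, b):
    p_t = stats.ttest_ind(a, b).pvalue            # t-test
    p_w = stats.mannwhitneyu(a, b).pvalue         # rank-based test
    return min(p_t, p_w)

observed = min_p(x, y)
pooled = np.concatenate([x, y])
null = []
for _ in range(2000):                             # permutation null distribution
    rng.shuffle(pooled)
    null.append(min_p(pooled[:60], pooled[60:]))

p_value = np.mean(np.array(null) <= observed)
print(f"min p = {observed:.4f}, permutation p-value = {p_value:.4f}")
```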

  15. Recurrence time statistics for finite size intervals

    NASA Astrophysics Data System (ADS)

    Altmann, Eduardo G.; da Silva, Elton C.; Caldas, Iberê L.

    2004-12-01

    We investigate the statistics of recurrences to finite size intervals for chaotic dynamical systems. We find that the typical distribution presents an exponential decay for almost all recurrence times except for a few short times affected by a kind of memory effect. We interpret this effect as being related to the unstable periodic orbits inside the interval. Although it is restricted to a few short times it changes the whole distribution of recurrences. We show that for systems with strong mixing properties the exponential decay converges to the Poissonian statistics when the width of the interval goes to zero. However, we alert that special attention to the size of the interval is required in order to guarantee that the short time memory effect is negligible when one is interested in numerically or experimentally calculated Poincaré recurrence time statistics.
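    A small numerical sketch of recurrence times to a finite-size interval, using the fully chaotic logistic map as a stand-in for the systems discussed above; the interval and iteration count are arbitrary choices, not values from the paper.

```python
# Numerical sketch: recurrence times of the fully chaotic logistic map to a
# finite-size interval; the mean recurrence time should be close to the inverse
# of the interval's invariant measure, while short returns reflect the memory
# effect discussed above.
import numpy as np

x = 0.123
lo, hi = 0.30, 0.31                     # finite-size target interval (arbitrary)
times, last, inside = [], None, False
for t in range(2_000_000):
    x = 4.0 * x * (1.0 - x)             # logistic map at r = 4
    now_inside = lo <= x <= hi
    if now_inside and not inside:       # count recurrences at entry into the interval
        if last is not None:
            times.append(t - last)
        last = t
    inside = now_inside

times = np.array(times)
print(f"mean recurrence time: {times.mean():.0f} iterations")
print(f"fraction of short returns (< 10 iterations): {np.mean(times < 10):.3f}")
```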

  16. Structural Model of the Effects of Cognitive and Affective Factors on the Achievement of Arabic-Speaking Pre-Service Teachers in Introductory Statistics

    ERIC Educational Resources Information Center

    Nasser, Fadia M.

    2004-01-01

    This study examined the extent to which statistics and mathematics anxiety, attitudes toward mathematics and statistics, motivation and mathematical aptitude can explain the achievement of Arabic speaking pre-service teachers in introductory statistics. Complete data were collected from 162 pre-service teachers enrolled in an academic…

  17. Using Real-Life Data When Teaching Statistics: Student Perceptions of this Strategy in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Neumann, David L.; Hood, Michelle; Neumann, Michelle M.

    2013-01-01

    Many teachers of statistics recommend using real-life data during class lessons. However, there has been little systematic study of what effect this teaching method has on student engagement and learning. The present study examined this question in a first-year university statistics course. Students (n = 38) were interviewed and their reflections…

  18. 76 FR 11393 - Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical Area 630 in the Gulf...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... Statistical Area 630 in the Gulf of Alaska AGENCY: National Marine Fisheries Service (NMFS), National Oceanic.... SUMMARY: NMFS is opening directed fishing for pollock in Statistical Area 630 of the Gulf of Alaska (GOA... catch (TAC) of pollock in Statistical Area 630 of the GOA. DATES: Effective 1200 hrs, Alaska local time...

  19. 76 FR 10779 - Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical Area 610 in the Gulf...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-28

    ... Statistical Area 610 in the Gulf of Alaska AGENCY: National Marine Fisheries Service (NMFS), National Oceanic.... SUMMARY: NMFS is opening directed fishing for pollock in Statistical Area 610 of the Gulf of Alaska (GOA... pollock in Statistical Area 610 of the GOA. DATES: Effective 1200 hrs, Alaska local time (A.l.t...

  20. Developing Conceptual Understanding in a Statistics Course: Merrill's First Principles and Real Data at Work

    ERIC Educational Resources Information Center

    Tu, Wendy; Snyder, Martha M.

    2017-01-01

    Difficulties in learning statistics primarily at the college-level led to a reform movement in statistics education in the early 1990s. Although much work has been done, effective learning designs that facilitate active learning, conceptual understanding of statistics, and the use of real-data in the classroom are needed. Guided by Merrill's First…

  1. Low statistical power in biomedical science: a review of three human research domains.

    PubMed

    Dumas-Mallet, Estelle; Button, Katherine S; Boraud, Thomas; Gonon, Francois; Munafò, Marcus R

    2017-02-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0-10% or 11-20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation.

  2. Low statistical power in biomedical science: a review of three human research domains

    PubMed Central

    Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois

    2017-01-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation. PMID:28386409

  3. Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?

    PubMed Central

    Tressoldi, Patrizio E.

    2012-01-01

    The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena, subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception, it was found that, except for semantic priming on categorization, the statistical power of the typical study to detect the expected effect size (ES) is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase the statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with small ES. PMID:22783215

  4. Effects of quantum coherence on work statistics

    NASA Astrophysics Data System (ADS)

    Xu, Bao-Ming; Zou, Jian; Guo, Li-Sha; Kong, Xiang-Mu

    2018-05-01

    In the conventional two-point measurement scheme of quantum thermodynamics, quantum coherence is destroyed by the first measurement. However, coherence plays an important role in quantum thermodynamic processes, and how to describe the work statistics of a quantum coherent process is still an open question. In this paper, we use the full counting statistics method to investigate the effects of quantum coherence on work statistics. First, we give a general discussion and show that for a quantum coherent process, the work statistics are very different from those of the two-point measurement scheme; specifically, the average work is increased or decreased and the work fluctuation can be decreased by quantum coherence, depending strongly on the relative phase, the energy level structure, and the external protocol. Then, we concretely consider a quenched one-dimensional transverse Ising model and show that quantum coherence has a more significant influence on work statistics in the ferromagnetic regime than in the paramagnetic regime, so that due to the presence of quantum coherence the work statistics can exhibit the critical phenomenon even at high temperature.
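    For contrast with the coherent case discussed above, the following sketch computes the conventional two-point measurement work distribution for a single qubit under a sudden quench; the Hamiltonians and inverse temperature are illustrative values, not taken from the paper.

```python
# Sketch of the conventional two-point measurement (TPM) work statistics for a
# single qubit under a sudden quench H0 -> H1; parameters are illustrative.
import numpy as np

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])

H0 = 1.0 * sz                       # pre-quench Hamiltonian
H1 = 1.0 * sz + 0.8 * sx            # post-quench Hamiltonian
beta = 1.0                          # inverse temperature

e0, v0 = np.linalg.eigh(H0)
e1, v1 = np.linalg.eigh(H1)
p0 = np.exp(-beta * e0) / np.exp(-beta * e0).sum()   # thermal occupations of H0

works, probs = [], []
for i in range(2):                  # first measurement outcome
    for j in range(2):              # second measurement outcome
        transition = abs(v0[:, i] @ v1[:, j]) ** 2
        works.append(e1[j] - e0[i])
        probs.append(p0[i] * transition)

avg_work = sum(w * p for w, p in zip(works, probs))
for w, p in zip(works, probs):
    print(f"W = {w:+.3f}  P = {p:.3f}")
print(f"<W> = {avg_work:.3f}")
```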

  5. A Guerilla Guide to Common Problems in ‘Neurostatistics’: Essential Statistical Topics in Neuroscience

    PubMed Central

    Smith, Paul F.

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins. PMID:29371855

  6. A Guerilla Guide to Common Problems in 'Neurostatistics': Essential Statistical Topics in Neuroscience.

    PubMed

    Smith, Paul F

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins.

  7. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    PubMed Central

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  8. Selecting Summary Statistics in Approximate Bayesian Computation for Calibrating Stochastic Models

    PubMed Central

    Burr, Tom

    2013-01-01

    Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the “go-to” option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, the user requirement to choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters is investigated and illustrated by example, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that for some choices of summary statistics, the posterior distribution of model parameters is closely approximated and for other choices of summary statistics, the posterior distribution is not closely approximated. A strategy to choose effective summary statistics is suggested in cases where the stochastic computer model can be run at many trial parameter settings, as in the example. PMID:24288668

  9. Selecting summary statistics in approximate Bayesian computation for calibrating stochastic models.

    PubMed

    Burr, Tom; Skurikhin, Alexei

    2013-01-01

    Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the "go-to" option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, the user requirement to choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters is investigated and illustrated by example, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that for some choices of summary statistics, the posterior distribution of model parameters is closely approximated and for other choices of summary statistics, the posterior distribution is not closely approximated. A strategy to choose effective summary statistics is suggested in cases where the stochastic computer model can be run at many trial parameter settings, as in the example.
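    A minimal rejection-ABC sketch illustrating the role of user-chosen summary statistics: a toy normal model summarized by its sample mean and standard deviation. Nothing here is from the mitochondrial DNA example, and the priors and tolerance are arbitrary choices.

```python
# Minimal rejection-ABC sketch: a toy normal model summarized by its sample mean
# and standard deviation. The priors, tolerance, and data are all invented.
import numpy as np

rng = np.random.default_rng(5)
observed = rng.normal(3.0, 1.5, size=100)
s_obs = np.array([observed.mean(), observed.std()])    # user-chosen summary statistics

accepted = []
for _ in range(50_000):
    mu = rng.uniform(-5, 10)                           # draw parameters from the prior
    sigma = rng.uniform(0.1, 5)
    sim = rng.normal(mu, sigma, size=100)              # simulate from the model
    s_sim = np.array([sim.mean(), sim.std()])
    if np.linalg.norm(s_sim - s_obs) < 0.3:            # keep draws with close summaries
        accepted.append((mu, sigma))

accepted = np.array(accepted)
print(f"accepted {len(accepted)} draws; "
      f"approximate posterior mean of mu: {accepted[:, 0].mean():.2f}")
```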

  10. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures.

    PubMed

    Rock, Adam J; Coventry, William L; Morgan, Methuen I; Loi, Natasha M

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology.

  11. Source credibility and evidence format: examining the effectiveness of HIV/AIDS messages for young African Americans.

    PubMed

    Major, Lesa Hatley; Coleman, Renita

    2012-01-01

    Using experimental methodology, this study tests the effectiveness of HIV/AIDS prevention messages tailored specifically to college-aged African Americans. To test interaction effects, it intersects source role and evidence format. The authors used gain-framed and loss-framed information specific to young African Americans and HIV to test message effectiveness between statistical and emotional evidence formats, and for the first time, a statistical/emotional combination format. It tests which source--physician or minister--that young African Americans believe is more effective when delivering HIV/AIDS messages to young African Americans. By testing the interaction between source credibility and evidence format, this research expands knowledge on creating effective health messages in several major areas. Findings include a significant interaction between the role of physician and the combined statistical/emotional format. This message was rated as the most effective way to deliver HIV/AIDS prevention messages.

  12. Match statistics related to winning in the group stage of 2014 Brazil FIFA World Cup.

    PubMed

    Liu, Hongyou; Gomez, Miguel-Ángel; Lago-Peñas, Carlos; Sampaio, Jaime

    2015-01-01

    Identifying match statistics that strongly contribute to winning in football matches is a very important step towards a more predictive and prescriptive performance analysis. The current study aimed to determine relationships between 24 match statistics and the match outcome (win, loss and draw) in all games and close games of the group stage of the FIFA World Cup (2014, Brazil) by employing a generalised linear model. A cumulative logistic regression was run, taking the value of each match statistic as the independent variable to predict the logarithm of the odds of winning. Relationships were assessed as effects of a two-standard-deviation increase in the value of each variable on the change in the probability of a team winning a match. Non-clinical magnitude-based inferences were employed and were evaluated by using the smallest worthwhile change. Results showed that for all the games, nine match statistics had clearly positive effects on the probability of winning (Shot, Shot on Target, Shot from Counter Attack, Shot from Inside Area, Ball Possession, Short Pass, Average Pass Streak, Aerial Advantage and Tackle), four had clearly negative effects (Shot Blocked, Cross, Dribble and Red Card), and the other 12 statistics had either trivial or unclear effects. For the close games, the effects of Aerial Advantage and Yellow Card became trivial and clearly negative, respectively. Information from the tactical modelling can provide a more thorough and objective match understanding to coaches and performance analysts for evaluating post-match performances and for scouting upcoming oppositions.
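    As a simplified sketch of the "effect of a two-standard-deviation increase" idea, the following fits a binary (win vs. not win) logistic regression on one standardized synthetic predictor; the paper itself uses a cumulative logistic model over win/draw/loss, which is not reproduced here.

```python
# Simplified sketch of the "effect of a 2-SD increase" idea: a binary
# (win vs. not win) logistic regression on one standardized synthetic predictor.
# The paper fits a cumulative logistic model over win/draw/loss instead.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 48                                                # number of matches (synthetic)
shots_on_target = rng.poisson(5, size=n).astype(float)
win_prob = 1.0 / (1.0 + np.exp(-0.5 * (shots_on_target - 5)))
win = rng.binomial(1, win_prob)

z = (shots_on_target - shots_on_target.mean()) / (2 * shots_on_target.std())  # 2-SD units
X = sm.add_constant(z)
fit = sm.Logit(win, X).fit(disp=0)
print(f"odds of winning change by a factor of {np.exp(fit.params[1]):.2f} "
      f"per 2-SD increase in shots on target")
```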

  13. A first-order statistical smoothing approximation for the coherent wave field in porous random media

    NASA Astrophysics Data System (ADS)

    Müller, Tobias M.; Gurevich, Boris

    2005-04-01

    An important dissipation mechanism for waves in randomly inhomogeneous poroelastic media is the effect of wave-induced fluid flow. In the framework of Biot's theory of poroelasticity, this mechanism can be understood as scattering from fast into slow compressional waves. To describe this conversion-scattering effect in poroelastic random media, the dynamic characteristics of the coherent wavefield are analyzed using the theory of statistical wave propagation. In particular, the method of statistical smoothing is applied to Biot's equations of poroelasticity. Within the accuracy of first-order statistical smoothing, an effective wave number of the coherent field, which accounts for the effect of wave-induced flow, is derived. This wave number is complex and involves an integral over the correlation function of the medium's fluctuations. It is shown that the known one-dimensional (1-D) result can be obtained as a special case of the present 3-D theory. The expression for the effective wave number makes it possible to derive a model for elastic attenuation and dispersion due to wave-induced fluid flow. These wavefield attributes are analyzed in a companion paper.

  14. The use and misuse of statistical methodologies in pharmacology research.

    PubMed

    Marino, Michael J

    2014-01-01

    Descriptive, exploratory, and inferential statistics are necessary components of hypothesis-driven biomedical research. Despite the ubiquitous need for these tools, the emphasis on statistical methods in pharmacology has become dominated by inferential methods often chosen more by the availability of user-friendly software than by any understanding of the data set or the critical assumptions of the statistical tests. Such frank misuse of statistical methodology and the quest to reach the mystical α<0.05 criterion has hampered research via the publication of incorrect analyses driven by rudimentary statistical training. Perhaps more critically, a poor understanding of statistical tools limits the conclusions that may be drawn from a study by divorcing the investigator from their own data. The net result is a decrease in quality and confidence in research findings, fueling recent controversies over the reproducibility of high profile findings and effects that appear to diminish over time. The recent development of "omics" approaches leading to the production of massive higher dimensional data sets has amplified these issues, making it clear that new approaches are needed to appropriately and effectively mine this type of data. Unfortunately, statistical education in the field has not kept pace. This commentary provides a foundation for an intuitive understanding of statistics that fosters an exploratory approach and an appreciation for the assumptions of various statistical tests that hopefully will increase the correct use of statistics, the application of exploratory data analysis, and the use of statistical study design, with the goal of increasing reproducibility and confidence in the literature. Copyright © 2013. Published by Elsevier Inc.

  15. Online incidental statistical learning of audiovisual word sequences in adults: a registered report.

    PubMed

    Kuppuraj, Sengottuvel; Duta, Mihaela; Thompson, Paul; Bishop, Dorothy

    2018-02-01

    Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory-picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from a continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a function of the statistical complexity of the condition and exposure. Third, our novel approach to measure online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test-retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings noting the benefits of online measures in tracking the learning process.

  16. Online incidental statistical learning of audiovisual word sequences in adults: a registered report

    PubMed Central

    Duta, Mihaela; Thompson, Paul

    2018-01-01

    Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory–picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a factor of statistical complexity of the condition and exposure. Third, our novel approach to measure online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test–retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings noting the benefits of online measures in tracking the learning process. PMID:29515876

  17. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
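
    A minimal sketch (Python, simulated monthly data) of the segmented-regression form of interrupted time series analysis mentioned above: the model contains a baseline level, a pre-intervention slope, a level change at the intervention, and a slope change afterwards. The series length and intervention point are assumptions.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      t = np.arange(24)                      # 24 monthly observations (assumed)
      post = (t >= 12).astype(float)         # intervention after month 12 (assumed)
      y = 50 + 0.2 * t - 5 * post - 0.4 * post * (t - 12) + rng.normal(0, 1.5, t.size)

      X = sm.add_constant(np.column_stack([t, post, post * (t - 12)]))
      fit = sm.OLS(y, X).fit()
      # Parameters: baseline level, pre-intervention slope, level change, slope change.
      print(fit.params)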

  18. The Statistical Power of Planned Comparisons.

    ERIC Educational Resources Information Center

    Benton, Roberta L.

    Basic principles underlying statistical power are examined; and issues pertaining to effect size, sample size, error variance, and significance level are highlighted via the use of specific hypothetical examples. Analysis of variance (ANOVA) and related methods remain popular, although other procedures sometimes have more statistical power against…

  19. Dominant role of many-body effects on the carrier distribution function of quantum dot lasers

    NASA Astrophysics Data System (ADS)

    Peyvast, Negin; Zhou, Kejia; Hogg, Richard A.; Childs, David T. D.

    2016-03-01

    The effects of free-carrier-induced shift and broadening on the carrier distribution function are studied considering different extreme cases for carrier statistics (Fermi-Dirac and random carrier distributions) as well as quantum dot (QD) ensemble inhomogeneity and state separation using a Monte Carlo model. Using this model, we show that the dominant factor determining the carrier distribution function is the free-carrier effects rather than the choice of carrier statistics. By using empirical values of the free-carrier-induced shift and broadening, good agreement is obtained with experimental data of QD materials obtained under electrical injection for both extreme cases of carrier statistics.

  20. Effects of an Instructional Gaming Characteristic on Learning Effectiveness, Efficiency, and Engagement: Using a Storyline for Teaching Basic Statistical Skills

    ERIC Educational Resources Information Center

    Novak, Elena; Johnson, Tristan E.; Tenenbaum, Gershon; Shute, Valerie J.

    2016-01-01

    The study explored instructional benefits of a storyline gaming characteristic (GC) on learning effectiveness, efficiency, and engagement with the use of an online instructional simulation for graduate students in an introductory statistics course. A storyline is a game-design element that connects scenes with the educational content. In order to…

  1. A General Framework for Power Analysis to Detect the Moderator Effects in Two- and Three-Level Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben

    2016-01-01

    The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…

  2. The Effects of the Recession on Child Poverty: Poverty Statistics for 2008 and Growth in Need during 2009

    ERIC Educational Resources Information Center

    Isaacs, Julia B.

    2009-01-01

    Nearly one in five children under age 18 lived in poor families in 2008, according to poverty statistics released by the Census Bureau in September 2009. Though high, this statistic does not capture the full impact of the economic downturn, which is expected to drive poverty even higher in 2009. However, updated poverty statistics will not be…

  3. 77 FR 16950 - Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical Area 630 in the Gulf...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-23

    ... Statistical Area 630 in the Gulf of Alaska AGENCY: National Marine Fisheries Service (NMFS), National Oceanic.... SUMMARY: NMFS is opening directed fishing for pollock in Statistical Area 630 of the Gulf of Alaska (GOA... catch of pollock in Statistical Area 630 of the GOA. DATES: Effective 1200 hrs, Alaska local time (A.l.t...

  4. 78 FR 17886 - Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical Area 630 in the Gulf...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-25

    ... Statistical Area 630 in the Gulf of Alaska AGENCY: National Marine Fisheries Service (NMFS), National Oceanic.... SUMMARY: NMFS is opening directed fishing for pollock in Statistical Area 630 of the Gulf of Alaska (GOA... Statistical Area 630 of the GOA. DATES: Effective 1200 hrs, Alaska local time (A.l.t.), March 22, 2013...

  5. 75 FR 64958 - Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical Area 630 of the Gulf...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-21

    .... 0910131362-0087-02] RIN 0648-XZ84 Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical... is opening directed fishing for pollock in Statistical Area 630 of the Gulf of Alaska (GOA) for 72... for Statistical Area 630 of the GOA. DATES: Effective 1200 hrs, Alaska local time (A.l.t.), October 15...

  6. 75 FR 14359 - Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical Area 630 in the Gulf...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    .... 0910131362-0087-02] RIN 0648-XV45 Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical... is reopening directed fishing for pollock in Statistical Area 630 of the Gulf of Alaska (GOA) for 72... (TAC) of pollock in Statistical Area 630 of the GOA. DATES: Effective 1200 hrs, Alaska local time (A.l...

  7. 76 FR 13097 - Fisheries of the Exclusive Economic Zone Off Alaska; Pollock in Statistical Area 630 in the Gulf...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-10

    ... Statistical Area 630 in the Gulf of Alaska AGENCY: National Marine Fisheries Service (NMFS), National Oceanic.... SUMMARY: NMFS is opening directed fishing for pollock in Statistical Area 630 of the Gulf of Alaska (GOA... pollock in Statistical Area 630 of the GOA. DATES: Effective 1200 hrs, Alaska local time (A.l.t.), March 7...

  8. Field Penetration in a Rectangular Box Using Numerical Techniques: An Effort to Obtain Statistical Shielding Effectiveness

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Yu, Shih-Pin

    2006-01-01

    This paper emphasizes the application of numerical methods to explore the ideas related to shielding effectiveness from a statistical view. An empty rectangular box is examined using a hybrid modal/moment method. The basic computational method is presented followed by the results for single- and multiple observation points within the over-moded empty structure. The statistics of the field are obtained by using frequency stirring, borrowed from the ideas connected with reverberation chamber techniques, and extends the ideas of shielding effectiveness well into the multiple resonance regions. The study presented in this paper will address the average shielding effectiveness over a broad spatial sample within the enclosure as the frequency is varied.

  9. Spatial and temporal patterns of shoreline change of a 280-km high-energy disrupted sandy coast from 1950 to 2014: SW France

    NASA Astrophysics Data System (ADS)

    Castelle, Bruno; Guillot, Benoit; Marieu, Vincent; Chaumillon, Eric; Hanquiez, Vincent; Bujan, Stéphane; Poppeschi, Coline

    2018-01-01

    A dataset of 15 geo-referenced orthomosaics was generated to address long-term shoreline change along approximately 270 km of high-energy sandy coast in SW France between 1950 and 2014. The coast consists of sandy beaches backed by coastal dunes, which are only disrupted by two wide tidal inlets (Arcachon and Maumusson), a wide estuary mouth (Gironde) and a few small wave-dominated inlets and coastal towns. A time- and spatially-averaged erosion trend of 1.12 m/year is found over 1950-2014, with a maximum local erosion of approximately 11 m/year and a maximum local accretion of approximately 6 m/year. Maximum shoreline evolutions are observed along coasts adjacent to the inlets and to the estuary mouth, with erosion and accretion alternating over time on the timescale of decades. The two inlet-sandspit systems of Arcachon and Maumusson show a quasi-synchronous behaviour, with the two updrift coasts accreting until the 1970s and eroding since then, which suggests that shoreline change at these locations is controlled by allocyclic mechanisms. Despite sea level rise and the well-established increase in winter wave height over the last decades, no significant increase in the mean erosion rate is captured. This is hypothesized to be partly the result of relevant coastal dune management works from the 1960s to the 1980s, after a long period of coastal dune disrepair during and after the Second World War. This study suggests that long-term shoreline change of high-energy sandy coasts disrupted by inlets and/or estuaries is complex and needs to consider a wide range of parameters including, non-extensively, waves, tides, inlet dynamics, sea level rise, coastal dune management and coastal defences, which challenges the development of reliable long-term coastal evolution numerical models.

  10. An entropy-based analysis of lane changing behavior: An interactive approach.

    PubMed

    Kosun, Caglar; Ozdemir, Serhan

    2017-05-19

    As a novelty, this article proposes the nonadditive entropy framework for the description of driver behaviors during lane changing. The authors also state that this entropy framework governs lane changing behavior in traffic flow in accordance with long-range vehicular interactions and traffic safety. The nonadditive entropy framework is a generalized theory of thermostatistical mechanics. Vehicular interactions during lane changing are considered within this framework. The interactive approach to drivers' lane changing behavior is presented through the traffic flow scenarios described in the article. According to these scenarios, 4 categories of traffic flow and driver behaviors are obtained. Through the scenarios, comparative analyses of the nonadditive and additive entropy domains are also provided. Two quadrants of the categories belong to the nonadditive entropy domain; the rest fall in the additive entropy domain. Driving behaviors are extracted, and the scenarios show that nonadditivity matches safe driving well, whereas additivity corresponds to unsafe driving. Furthermore, the cooperative traffic system is considered in the nonadditive domain, where long-range interactions are present, whereas the uncooperative traffic system falls into the additive domain. The analyses also indicate possible traffic flow transitions among the quadrants. This article shows that lane changing behavior could be generalized as nonadditive, with additivity as a special case, based on the given traffic conditions. The nearest and close neighbor models are well within the conventional additive entropy framework. In this article, both long-range vehicular interactions and safe driving behavior in traffic are handled in the nonadditive entropy domain. It is also inferred that the Tsallis entropy region would correspond to mandatory lane changing behavior, whereas the additive and either extensive or nonextensive entropy region would match discretionary lane changing behavior. This article states that driver behaviors should lie in the nonadditive entropy domain to provide a safe traffic stream and hence help prevent vehicle accidents.
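
    Since the abstract rests on the contrast between nonadditive (Tsallis) and additive entropy, a minimal sketch may help: the Tsallis q-entropy S_q = (1 - sum_i p_i^q)/(q - 1) of a discrete distribution reduces to the additive Shannon-Boltzmann-Gibbs entropy as q -> 1. The probabilities below are hypothetical, not taken from the article.

      import numpy as np

      def tsallis_entropy(p, q):
          """Tsallis q-entropy of a discrete distribution (k_B = 1)."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          if np.isclose(q, 1.0):
              return -np.sum(p * np.log(p))       # additive (Shannon) limit
          return (1.0 - np.sum(p ** q)) / (q - 1.0)

      p = [0.5, 0.3, 0.2]                         # hypothetical outcome probabilities
      for q in (0.5, 0.999, 1.5):
          print(q, round(tsallis_entropy(p, q), 4))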

  11. Wholly Patient-tailored Ablation of Atrial Fibrillation Guided by Spatio-Temporal Dispersion of Electrograms in the Absence of Pulmonary Veins Isolation

    PubMed Central

    Seitz, Julien; Bars, Clément; Théodore, Guillaume; Beurtheret, Sylvain; Lellouche, Nicolas; Bremondy, Michel; Ferracci, Ange; Faure, Jacques; Penaranda, Guillaume; Yamazaki, Masatoshi; Avula, Uma Mahesh R.; Curel, Laurence; Siame, Sabrina; Berenfeld, Omer; Pisapia, André; Kalifa, Jérôme

    2017-01-01

    Background The use of intra-cardiac electrograms to guide atrial fibrillation (AF) ablation has yielded conflicting results. We evaluated an electrogram marker of AF drivers: the clustering of electrograms exhibiting spatio-temporal dispersion, regardless of whether such electrograms were fractionated or not. Objective To evaluate the usefulness of spatio-temporal dispersion, a visually recognizable electric footprint of AF drivers, for the ablation of all forms of AF. Methods We prospectively enrolled 105 patients admitted for AF ablation. AF was sequentially mapped in both atria with a 20-pole PentaRay catheter. We tagged and ablated only regions displaying electrogram dispersion during AF. Results were compared to a validation set in which a conventional ablation approach was used (pulmonary vein isolation/stepwise approach). To establish the mechanism underlying spatio-temporal dispersion of AF electrograms, we conducted realistic numerical simulations of AF drivers in a 2-dimensional model and optical mapping of ovine atrial scar-related AF. Results Ablation at dispersion areas terminated AF in 95%. After ablation of 17±10% of the left atrial surface and 18 months of follow-up, the atrial arrhythmia recurrence rate was 15% after 1.4±0.5 procedure/patient vs 41% in the validation set after 1.5±0.5 procedure/patient (arrhythmia-free survival rates: 85% vs 59%, log rank P<0.001). In comparison with the validation set, radiofrequency times (49 ± 21 minutes vs 85 ± 34.5 minutes, p=0.001) and procedure times (168 ± 42 minutes vs. 230 ± 67 minutes, p<.0001) were shorter. In simulations and optical mapping experiments, virtual PentaRay recordings demonstrated that electrogram dispersion is mostly recorded in the vicinity of a driver. Conclusions The clustering of intra-cardiac electrograms exhibiting spatio-temporal dispersion is indicative of AF drivers. Their ablation allows for a non-extensive and patient-tailored approach to AF ablation. ClinicalTrials.gov number: NCT02093949 PMID:28104073

  12. Effect of Internet-Based Cognitive Apprenticeship Model (i-CAM) on Statistics Learning among Postgraduate Students.

    PubMed

    Saadati, Farzaneh; Ahmad Tarmizi, Rohani; Mohd Ayub, Ahmad Fauzi; Abu Bakar, Kamariah

    2015-01-01

    Students' ability to use statistics, which is mathematical in nature, is a concern of educators, so embedding the pedagogical characteristics of learning within an e-learning system adds value by supplementing the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship for learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study applied an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, compared with the conventional mathematics learning model, i-CAM significantly promoted students' problem-solving performance at the end of each phase. In addition, the combined differences in students' test scores were statistically significant after controlling for pre-test scores. The findings confirm the considerable value of i-CAM for improving statistics learning among non-specialized postgraduate students.

  13. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    PubMed

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between the two beams plays the key role in first-order coherence. In contrast, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, employing a fundamental interfering configuration with two classical point sources, we show that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; it therefore provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  14. Mathematical background and attitudes toward statistics in a sample of Spanish college students.

    PubMed

    Carmona, José; Martínez, Rafael J; Sánchez, Manuel

    2005-08-01

    To examine the relation between mathematical background and initial attitudes toward statistics among Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. The analysis suggested that grades in previous courses are more related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for possible research are discussed.

  15. Significant Statistics: Viewed with a Contextual Lens

    ERIC Educational Resources Information Center

    Tait-McCutcheon, Sandi

    2010-01-01

    This paper examines the pedagogical and organisational changes three lead teachers made to their statistics teaching and learning programs. The lead teachers posed the research question: What would the effect of contextually integrating statistical investigations and literacies into other curriculum areas be on student achievement? By finding the…

  16. Concurrent Movement Impairs Incidental but Not Intentional Statistical Learning

    ERIC Educational Resources Information Center

    Stevens, David J.; Arciuli, Joanne; Anderson, David I.

    2015-01-01

    The effect of concurrent movement on incidental versus intentional statistical learning was examined in two experiments. In Experiment 1, participants learned the statistical regularities embedded within familiarization stimuli implicitly, whereas in Experiment 2 they were made aware of the embedded regularities and were instructed explicitly to…

  17. Teaching Principles of Linkage and Gene Mapping with the Tomato.

    ERIC Educational Resources Information Center

    Hawk, James A.; And Others

    1980-01-01

    A three-point linkage system in tomatoes is used to explain concepts of gene mapping, linkage, and statistical analysis. The system is designed for teaching the effective use of statistics and the power of genetic analysis derived from statistical analysis of phenotypic ratios. (Author/SA)

  18. Modelling the Effects of Land-Use Changes on Climate: a Case Study on Yamula DAM

    NASA Astrophysics Data System (ADS)

    Köylü, Ü.; Geymen, A.

    2016-10-01

    Dams block the flow of rivers and create artificial reservoirs that affect the climate and the land-use characteristics of the river basin. In this research, the effect of the large water body impounded by Yamula Dam in the Kızılırmak Basin on the surrounding land use and local climate is analysed. The Mann-Kendall non-parametric test, the Theil-Sen slope method, Inverse Distance Weighting (IDW) and the Soil Conservation Service Curve Number (SCS-CN) method are integrated for spatial and temporal analysis of the study area. Humidity, temperature, wind speed and precipitation observations collected at 16 weather stations near the Kızılırmak Basin are analysed, and these statistics are then combined with GIS data over the years. An application for the GIS analysis was developed in Python and integrated with ArcGIS; the statistical analyses were performed in the R Project for Statistical Computing and integrated with the developed application. According to the statistical analysis of the extracted time series of meteorological parameters, statistically significant spatiotemporal trends are observed in climate and land-use characteristics. In this study, we demonstrate the effect of large dams on the local climate of the semi-arid Yamula Dam region.
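
    A minimal sketch (Python, synthetic series, not the authors' application) of the trend statistics named above: the Mann-Kendall S statistic with its normal approximation (no-ties variance), together with SciPy's Theil-Sen slope estimator.

      import numpy as np
      from scipy import stats

      def mann_kendall(x):
          """Mann-Kendall trend test (no-ties variance, continuity-corrected z)."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
          p = 2 * (1 - stats.norm.cdf(abs(z)))
          return s, z, p

      rng = np.random.default_rng(2)
      temps = 12 + 0.03 * np.arange(40) + rng.normal(0, 0.4, 40)   # hypothetical series
      print(mann_kendall(temps))
      print(stats.theilslopes(temps))      # Theil-Sen slope with confidence limits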

  19. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. Copyright © 2015, American Association for the Advancement of Science.

  20. Using a higher criticism statistic to detect modest effects in a genome-wide study of rheumatoid arthritis

    PubMed Central

    2009-01-01

    In high-dimensional studies such as genome-wide association studies, the correction for multiple testing in order to control total type I error results in decreased power to detect modest effects. We present a new analytical approach based on the higher criticism statistic that allows identification of the presence of modest effects. We apply our method to the genome-wide study of rheumatoid arthritis provided in the Genetic Analysis Workshop 16 Problem 1 data set. There is evidence for unknown bias in this study that could be explained by the presence of undetected modest effects. We compared the asymptotic and empirical thresholds for the higher criticism statistic. Using the asymptotic threshold we detected the presence of modest effects genome-wide. We also detected modest effects using the 90th percentile of the empirical null distribution as a threshold; however, there was no such evidence when the 95th and 99th percentiles were used. While the higher criticism method suggests that there is some evidence for modest effects, interpreting individual single-nucleotide polymorphisms with significant higher criticism statistics is of undetermined value. The goal of higher criticism is to alert the researcher that genetic effects remain to be discovered and to promote the use of more targeted and powerful studies to detect the remaining effects. PMID:20018032
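
    A minimal sketch (Python, simulated p-values, not the workshop analysis) of the higher criticism statistic in the Donoho-Jin form assumed here: with ordered p-values p_(1) <= ... <= p_(N), HC is the maximum of sqrt(N) * (i/N - p_(i)) / sqrt(p_(i)(1 - p_(i))) taken over the smallest fraction of p-values.

      import numpy as np

      def higher_criticism(pvals, alpha0=0.5):
          p = np.sort(np.clip(np.asarray(pvals, dtype=float), 1e-12, 1 - 1e-12))
          n = len(p)
          i = np.arange(1, n + 1)
          hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
          k = max(1, int(alpha0 * n))      # restrict to the smallest alpha0 * n p-values
          return hc[:k].max()

      rng = np.random.default_rng(3)
      null_p = rng.uniform(size=10000)                          # global null
      mixed_p = np.concatenate([null_p, rng.beta(0.2, 5, 50)])  # a few modest effects added
      print(higher_criticism(null_p), higher_criticism(mixed_p))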

  1. Effects of a Value-Reappraisal Intervention on Statistics Students' Motivation and Performance

    ERIC Educational Resources Information Center

    Acee, Taylor W.; Weinstein, Claire Ellen

    2010-01-01

    The authors investigated the effects of an exploratory value-reappraisal intervention on students' motivation and performance in an undergraduate introductory statistics course. They sampled 82 students from 2 instructors' sections during both the fall and spring semesters. Students were randomly assigned within each section to either the…

  2. An Introduction to Confidence Intervals for Both Statistical Estimates and Effect Sizes.

    ERIC Educational Resources Information Center

    Capraro, Mary Margaret

    This paper summarizes methods of estimating confidence intervals, including classical intervals and intervals for effect sizes. The recent American Psychological Association (APA) Task Force on Statistical Inference report suggested that confidence intervals should always be reported, and the fifth edition of the APA "Publication Manual"…

  3. Teacher Effects, Value-Added Models, and Accountability

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2014-01-01

    Background: In the last decade, the effects of teachers on student performance (typically manifested as state-wide standardized tests) have been re-examined using statistical models that are known as value-added models. These statistical models aim to compute the unique contribution of the teachers in promoting student achievement gains from grade…

  4. Self-Explanation in the Domain of Statistics: An Expertise Reversal Effect

    ERIC Educational Resources Information Center

    Leppink, Jimmie; Broers, Nick J.; Imbos, Tjaart; van der Vleuten, Cees P. M.; Berger, Martijn P. F.

    2012-01-01

    This study investigated the effects of four instructional methods on cognitive load, propositional knowledge, and conceptual understanding of statistics, for low prior knowledge students and for high prior knowledge students. The instructional methods were (1) a reading-only control condition, (2) answering open-ended questions, (3) answering…

  5. Global statistics of liquid water content and effective number density of water clouds over ocean derived from combined CALIPSO and MODIS measurements

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Vaughan, M.; McClain, C.; Behrenfeld, M.; Maring, H.; Anderson, D.; Sun-Mack, S.; Flittner, D.; Huang, J.; Wielicki, B.; Minnis, P.; Weimer, C.; Trepte, C.; Kuehn, R.

    2007-03-01

    This study presents an empirical relation that links layer integrated depolarization ratios, the extinction coefficients, and effective radii of water clouds, based on Monte Carlo simulations of CALIPSO lidar observations. Combined with cloud effective radius retrieved from MODIS, cloud liquid water content and effective number density of water clouds are estimated from CALIPSO lidar depolarization measurements in this study. Global statistics of the cloud liquid water content and effective number density are presented.

  6. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
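
    A minimal sketch (Python, simulated data) of the weighting step assumed in the abstract: a propensity score is estimated by logistic regression and converted into stabilized inverse-probability-of-treatment weights, which would then be passed to a weighted time-to-event model such as a weighted Cox regression.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n = 2000
      x = rng.normal(size=(n, 3))                        # baseline confounders (assumed)
      p_treat = 1 / (1 + np.exp(-(0.4 * x[:, 0] - 0.3 * x[:, 1])))
      z = rng.binomial(1, p_treat)                       # non-random treatment assignment

      ps = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]
      # Stabilized IPTW weights: marginal treatment probability over the propensity score.
      w = np.where(z == 1, z.mean() / ps, (1 - z.mean()) / (1 - ps))
      print(w.mean(), w.max())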

  7. Potential errors and misuse of statistics in studies on leakage in endodontics.

    PubMed

    Lucena, C; Lopez, J M; Pulgar, R; Abalos, C; Valderrama, M J

    2013-04-01

    To assess the quality of the statistical methodology used in studies of leakage in Endodontics, and to compare the results found using appropriate versus inappropriate inferential statistical methods. The search strategy used the descriptors 'root filling' 'microleakage', 'dye penetration', 'dye leakage', 'polymicrobial leakage' and 'fluid filtration' for the time interval 2001-2010 in journals within the categories 'Dentistry, Oral Surgery and Medicine' and 'Materials Science, Biomaterials' of the Journal Citation Report. All retrieved articles were reviewed to find potential pitfalls in statistical methodology that may be encountered during study design, data management or data analysis. The database included 209 papers. In all the studies reviewed, the statistical methods used were appropriate for the category attributed to the outcome variable, but in 41% of the cases, the chi-square test or parametric methods were inappropriately selected subsequently. In 2% of the papers, no statistical test was used. In 99% of cases, a statistically 'significant' or 'not significant' effect was reported as a main finding, whilst only 1% also presented an estimation of the magnitude of the effect. When the appropriate statistical methods were applied in the studies with originally inappropriate data analysis, the conclusions changed in 19% of the cases. Statistical deficiencies in leakage studies may affect their results and interpretation and might be one of the reasons for the poor agreement amongst the reported findings. Therefore, more effort should be made to standardize statistical methodology. © 2012 International Endodontic Journal.

  8. Applications of spatial statistical network models to stream data

    USGS Publications Warehouse

    Isaak, Daniel J.; Peterson, Erin E.; Ver Hoef, Jay M.; Wenger, Seth J.; Falke, Jeffrey A.; Torgersen, Christian E.; Sowder, Colin; Steel, E. Ashley; Fortin, Marie-Josée; Jordan, Chris E.; Ruesch, Aaron S.; Som, Nicholas; Monestiez, Pascal

    2014-01-01

    Streams and rivers host a significant portion of Earth's biodiversity and provide important ecosystem services for human populations. Accurate information regarding the status and trends of stream resources is vital for their effective conservation and management. Most statistical techniques applied to data measured on stream networks were developed for terrestrial applications and are not optimized for streams. A new class of spatial statistical model, based on valid covariance structures for stream networks, can be used with many common types of stream data (e.g., water quality attributes, habitat conditions, biological surveys) through application of appropriate distributions (e.g., Gaussian, binomial, Poisson). The spatial statistical network models account for spatial autocorrelation (i.e., nonindependence) among measurements, which allows their application to databases with clustered measurement locations. Large amounts of stream data exist in many areas where spatial statistical analyses could be used to develop novel insights, improve predictions at unsampled sites, and aid in the design of efficient monitoring strategies at relatively low cost. We review the topic of spatial autocorrelation and its effects on statistical inference, demonstrate the use of spatial statistics with stream datasets relevant to common research and management questions, and discuss additional applications and development potential for spatial statistics on stream networks. Free software for implementing the spatial statistical network models has been developed that enables custom applications with many stream databases.

  9. Noise exposure-response relationships established from repeated binary observations: Modeling approaches and applications.

    PubMed

    Schäffer, Beat; Pieren, Reto; Mendolia, Franco; Basner, Mathias; Brink, Mark

    2017-05-01

    Noise exposure-response relationships are used to estimate the effects of noise on individuals or a population. Such relationships may be derived from independent or repeated binary observations, and modeled by different statistical methods. Depending on the method by which they were established, their application in population risk assessment or estimation of individual responses may yield different results, i.e., predict "weaker" or "stronger" effects. As far as the present body of literature on noise effect studies is concerned, however, the underlying statistical methodology to establish exposure-response relationships has not always been paid sufficient attention. This paper gives an overview on two statistical approaches (subject-specific and population-averaged logistic regression analysis) to establish noise exposure-response relationships from repeated binary observations, and their appropriate applications. The considerations are illustrated with data from three noise effect studies, estimating also the magnitude of differences in results when applying exposure-response relationships derived from the two statistical approaches. Depending on the underlying data set and the probability range of the binary variable it covers, the two approaches yield similar to very different results. The adequate choice of a specific statistical approach and its application in subsequent studies, both depending on the research question, are therefore crucial.
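
    A minimal sketch (Python, simulated repeated binary responses) of the population-averaged approach discussed above: a GEE logistic regression with exchangeable within-subject correlation. A subject-specific analysis would instead fit a random-intercept logistic model. Exposure levels, effect sizes and sample sizes are assumptions.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n_subj, n_rep = 100, 8
      subj = np.repeat(np.arange(n_subj), n_rep)
      u = rng.normal(0, 1.0, n_subj)[subj]               # unobserved subject heterogeneity
      level = rng.uniform(45, 75, n_subj * n_rep)        # noise exposure level in dB (assumed)
      y = rng.binomial(1, 1 / (1 + np.exp(-(-12 + 0.18 * level + u))))

      gee = sm.GEE(y, sm.add_constant(level), groups=subj,
                   family=sm.families.Binomial(),
                   cov_struct=sm.cov_struct.Exchangeable()).fit()
      print(gee.params)     # population-averaged intercept and slope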

  10. [Value influence of different compatibilities of main active parts in yangyintongnao granule on pharmacokinetics parameters in rats with cerebral ischemia reperfusion injury by total amount statistic moment method].

    PubMed

    Guo, Ying; Yang, Jiehong; Zhang, Hengyi; Fu, Xuchun; Zhang, Yuyan; Wan, Haitong

    2010-02-01

    To study the influence of different combinations of the main active parts of Yangyintongnao granule on the pharmacokinetic parameters of its two active components, ligustrazine and puerarin, using the total amount statistic moment method. Combinations were formed according to the dosages of the four active parts (alkaloid, flavone, saponin, naphtha) in an L9 (3^4) orthogonal experiment. Blood concentrations of ligustrazine and puerarin were determined by HPLC at different time points. The zero-order moment (AUC) and first-order moment (MRT, mean residence time) of ligustrazine and puerarin were calculated, and the total amount statistic moment parameters of Yangyintongnao granule were derived by the total amount statistic moment method. The influence of the different combinations on the pharmacokinetic parameters was analysed by orthogonal test. Flavone had a stronger effect than saponin on the total AUC, and ligustrazine had the strongest effect on the total MRT. Saponin had little effect on the two parameters, whereas naphtha had a greater effect on both of them, indicating that naphtha may promote the metabolism of ligustrazine and puerarin in rats. Total amount statistic moment parameters can be used to guide the compatibility of TCM.
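
    A minimal sketch (Python, hypothetical concentration-time data) of the statistical moments underlying the parameters above: AUC as the zero-order moment, AUMC as the first-order moment, and MRT = AUMC / AUC, computed by trapezoidal integration.

      import numpy as np

      t = np.array([0.0, 0.25, 0.5, 1, 2, 4, 8, 12])            # sampling times, h (assumed)
      c = np.array([0.0, 1.8, 2.6, 2.1, 1.3, 0.6, 0.2, 0.05])   # concentrations, mg/L (assumed)

      dt = np.diff(t)
      auc = np.sum(dt * (c[1:] + c[:-1]) / 2)                   # zero-order moment
      aumc = np.sum(dt * ((c * t)[1:] + (c * t)[:-1]) / 2)      # first-order moment
      mrt = aumc / auc                                          # mean residence time
      print(auc, aumc, mrt)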

  11. Statistics Anxiety Update: Refining the Construct and Recommendations for a New Research Agenda.

    PubMed

    Chew, Peter K H; Dillon, Denise B

    2014-03-01

    Appreciation of the importance of statistics literacy for citizens of a democracy has resulted in an increasing number of degree programs making statistics courses mandatory for university students. Unfortunately, empirical evidence suggests that students in nonmathematical disciplines (e.g., social sciences) regard statistics courses as the most anxiety-inducing course in their degree programs. Although a literature review exists for statistics anxiety, it was done more than a decade ago, and newer studies have since added findings for consideration. In this article, we provide a current review of the statistics anxiety literature. Specifically, related variables, definitions, and measures of statistics anxiety are reviewed with the goal of refining the statistics anxiety construct. Antecedents, effects, and interventions of statistics anxiety are also reviewed to provide recommendations for statistics instructors and for a new research agenda. © The Author(s) 2014.

  12. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    PubMed

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
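
    A minimal sketch (Python) of the a priori power analysis the review calls for, here for an independent-samples t test as an assumed design, using statsmodels.

      from statsmodels.stats.power import TTestIndPower

      analysis = TTestIndPower()
      # Sample size per group to detect a medium effect (d = 0.5) at alpha = .05 with 80% power.
      n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
      # Power achieved with 30 per group for a small effect (d = 0.2).
      power_small = analysis.solve_power(effect_size=0.2, alpha=0.05, nobs1=30)
      print(round(n_per_group), round(power_small, 2))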

  13. Effective Vaccine Communication during the Disneyland Measles Outbreak

    PubMed Central

    Broniatowski, David Andre; Hilyard, Karen M.; Dredze, Mark

    2016-01-01

    Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4,686) collected during the 2014–2015 Disneyland measles outbreak for content including statistics, stories, or opinions containing bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line opinions, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support Fuzzy Trace Theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. PMID:27179915

  14. Effective vaccine communication during the disneyland measles outbreak.

    PubMed

    Broniatowski, David A; Hilyard, Karen M; Dredze, Mark

    2016-06-14

    Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4581) collected during the 2014-2015 Disneyland measles outbreak for content including statistics, stories, or bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line gists, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support Fuzzy Trace Theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Counting statistics for genetic switches based on effective interaction approximation

    NASA Astrophysics Data System (ADS)

    Ohkubo, Jun

    2012-09-01

    Applicability of counting statistics to a system with an infinite number of states is investigated. Counting statistics has been studied extensively for systems with a finite number of states. While the scheme can in principle be used to count specific transitions in a system with an infinite number of states, the resulting equations are in general not closed. A simple genetic switch can be described by a master equation with an infinite number of states, and we use counting statistics to count the number of transitions from inactive to active states in the gene. To avoid the non-closed equations, an effective interaction approximation is employed. As a result, it is shown that the switching problem can be treated approximately as a simple two-state model, which immediately indicates that the switching obeys non-Poisson statistics.
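
    A minimal sketch (Python, not the paper's formalism) of counting switch activations: a two-state inactive/active system is simulated with exponential waiting times, inactive -> active transitions are counted in a fixed window, and a Fano factor different from 1 signals non-Poisson counting statistics. Rates and window length are assumptions.

      import numpy as np

      def count_activations(k_on, k_off, t_end, rng):
          """Count inactive -> active transitions up to time t_end."""
          t, state, n_on = 0.0, 0, 0
          while True:
              rate = k_on if state == 0 else k_off
              t += rng.exponential(1.0 / rate)
              if t > t_end:
                  return n_on
              state = 1 - state
              if state == 1:
                  n_on += 1

      rng = np.random.default_rng(5)
      counts = np.array([count_activations(0.5, 2.0, 100.0, rng) for _ in range(2000)])
      print(counts.mean(), counts.var() / counts.mean())   # Fano factor below 1 here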

  16. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling

    PubMed Central

    Wood, John

    2017-01-01

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered—some very seriously so—but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. PMID:28706080
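
    A minimal sketch (Python, synthetic data rather than the Button et al. sample) of the mixture-modeling idea described above: Gaussian mixtures are fitted to a distribution of study-level statistical power and the number of subcomponents is chosen by BIC.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(6)
      power = np.concatenate([rng.normal(0.12, 0.05, 400),     # hypothetical low-power cluster
                              rng.normal(0.45, 0.10, 200),
                              rng.normal(0.90, 0.05, 130)]).clip(0.01, 1.0).reshape(-1, 1)

      fits = {k: GaussianMixture(n_components=k, random_state=0).fit(power)
              for k in range(1, 5)}
      best_k = min(fits, key=lambda k: fits[k].bic(power))
      print(best_k, fits[best_k].means_.ravel())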

  17. The T(ea) Test: Scripted Stories Increase Statistical Method Selection Skills

    ERIC Educational Resources Information Center

    Hackathorn, Jana; Ashdown, Brien

    2015-01-01

    To teach statistics, teachers must attempt to overcome pedagogical obstacles, such as dread, anxiety, and boredom. There are many options available to teachers that facilitate a pedagogically conducive environment in the classroom. The current study examined the effectiveness of incorporating scripted stories and humor into statistical method…

  18. Intelligence Reform in Colombia: Transparency and Effectiveness against Internal Threats

    DTIC Science & Technology

    2007-05-01

    Humanos Y Derecho Internacional Humano 2004, page 228. Statistics for 2006 may be viewed at: http://www.derechoshumanos.gov.co/index.php?newsecc...used here are projections based on official government statistics through May of 2006. Statistics for 2003 are contained in Informe Annual de Derechos

  19. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  20. Flexibility in Statistical Word Segmentation: Finding Words in Foreign Speech

    ERIC Educational Resources Information Center

    Graf Estes, Katharine; Gluck, Stephanie Chen-Wu; Bastos, Carolina

    2015-01-01

    The present experiments investigated the flexibility of statistical word segmentation. There is ample evidence that infants can use statistical cues (e.g., syllable transitional probabilities) to segment fluent speech. However, it is unclear how effectively infants track these patterns in unfamiliar phonological systems. We examined whether…

  1. Second Language Experience Facilitates Statistical Learning of Novel Linguistic Materials

    ERIC Educational Resources Information Center

    Potter, Christine E.; Wang, Tianlin; Saffran, Jenny R.

    2017-01-01

    Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In this research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning…

  2. Statistical Knowledge and the Over-Interpretation of Student Evaluations of Teaching

    ERIC Educational Resources Information Center

    Boysen, Guy A.

    2017-01-01

    Research shows that teachers interpret small differences in student evaluations of teaching as meaningful even when available statistical information indicates that the differences are not reliable. The current research explored the effect of statistical training on college teachers' tendency to over-interpret student evaluation differences. A…

  3. Statistically Modeling Individual Students' Learning over Successive Collaborative Practice Opportunities

    ERIC Educational Resources Information Center

    Olsen, Jennifer; Aleven, Vincent; Rummel, Nikol

    2017-01-01

    Within educational data mining, many statistical models capture the learning of students working individually. However, not much work has been done to extend these statistical models of individual learning to a collaborative setting, despite the effectiveness of collaborative learning activities. We extend a widely used model (the additive factors…

  4. The influence of narrative v. statistical information on perceiving vaccination risks.

    PubMed

    Betsch, Cornelia; Ulshöfer, Corina; Renkewitz, Frank; Betsch, Tilmann

    2011-01-01

    Health-related information found on the Internet is increasing and impacts patient decision making, e.g. regarding vaccination decisions. In addition to statistical information (e.g. incidence rates of vaccine adverse events), narrative information, such as postings on online bulletin boards, is also widely available. Previous research has shown that narrative information can impact treatment decisions, even when statistical information is presented concurrently. As the determinants of this effect are largely unknown, we varied features of the narratives to identify mechanisms through which narratives impact risk judgments. An online bulletin board setting provided participants with statistical information and authentic narratives about the occurrence and nonoccurrence of adverse events. Experiment 1 followed a single factorial design with 1, 2, or 4 narratives out of 10 reporting adverse events. Experiment 2 implemented a 2 (statistical risk 20% vs. 40%) × 2 (2/10 vs. 4/10 narratives reporting adverse events) × 2 (high vs. low richness) × 2 (high vs. low emotionality) between-subjects design. Dependent variables were perceived risk of side-effects and vaccination intentions. Experiment 1 showed an inverse relation between the number of narratives reporting adverse events and vaccination intentions, which was mediated by the perceived risk of vaccinating. Experiment 2 showed a stronger influence of the number of narratives than of the statistical risk information. High (vs. low) emotional narratives had a greater impact on the perceived risk, while richness had no effect. The number of narratives thus influences risk judgments and can potentially override statistical information about risk.

  5. Ergodicity of Truncated Stochastic Navier Stokes with Deterministic Forcing and Dispersion

    NASA Astrophysics Data System (ADS)

    Majda, Andrew J.; Tong, Xin T.

    2016-10-01

    Turbulence in idealized geophysical flows is a very rich and important topic. The anisotropic effects of explicit deterministic forcing, dispersive effects from rotation due to the β-plane and F-plane, and topography together with random forcing all combine to produce a remarkable number of realistic phenomena. These effects have been studied through careful numerical experiments in the truncated geophysical models. These important results include transitions between coherent jets and vortices, and direct and inverse turbulence cascades as parameters are varied, and it is a contemporary challenge to explain these diverse statistical predictions. Here we contribute to these issues by proving with full mathematical rigor that for any values of the deterministic forcing, the β- and F-plane effects and topography, with minimal stochastic forcing, there is geometric ergodicity for any finite Galerkin truncation. This means that there is a unique smooth invariant measure which attracts all statistical initial data at an exponential rate. In particular, this rigorous statistical theory guarantees that there are no bifurcations to multiple stable and unstable statistical steady states as geophysical parameters are varied in contrast to claims in the applied literature. The proof utilizes a new statistical Lyapunov function to account for enstrophy exchanges between the statistical mean and the variance fluctuations due to the deterministic forcing. It also requires careful proofs of hypoellipticity with geophysical effects and uses geometric control theory to establish reachability. To illustrate the necessity of these conditions, a two-dimensional example is developed which has the square of the Euclidean norm as the Lyapunov function and is hypoelliptic with nonzero noise forcing, yet fails to be reachable or ergodic.

  6. Construction of cosmic string induced temperature anisotropy maps with CMBFAST and statistical analysis

    NASA Astrophysics Data System (ADS)

    Simatos, N.; Perivolaropoulos, L.

    2001-01-01

    We use the publicly available code CMBFAST, as modified by Pogosian and Vachaspati, to simulate the effects of wiggly cosmic strings on the cosmic microwave background (CMB). Using the modified CMBFAST code, which takes into account vector modes and models wiggly cosmic strings by the one-scale model, we go beyond the angular power spectrum to construct CMB temperature maps with a resolution of a few degrees. The statistics of these maps are then studied using conventional and recently proposed statistical tests optimized for the detection of hidden temperature discontinuities induced by the Gott-Kaiser-Stebbins effect. We show, however, that these realistic maps cannot be distinguished in a statistically significant way from purely Gaussian maps with an identical power spectrum.

  7. Ganymede - A relationship between thermal history and crater statistics

    NASA Technical Reports Server (NTRS)

    Phillips, R. J.; Malin, M. C.

    1980-01-01

    An approach for factoring the effects of a planetary thermal history into a predicted set of crater statistics for an icy satellite is developed and forms the basis for subsequent data inversion studies. The key parameter is a thermal evolution-dependent critical time for which craters of a particular size forming earlier do not contribute to present-day statistics. An example is given for the satellite Ganymede and the effect of the thermal history is easily seen in the resulting predicted crater statistics. A preliminary comparison with the data, subject to the uncertainties in ice rheology and impact flux history, suggests a surface age of 3.8 × 10^9 years and a radionuclide abundance of 0.3 times the chondritic value.

  8. New selection effect in statistical investigations of supernova remnants

    NASA Astrophysics Data System (ADS)

    Allakhverdiev, A. O.; Guseinov, O. Kh.; Kasumov, F. K.

    1986-01-01

    The influence of H II regions on the parameters of supernova remnants (SNR) is investigated. It has been shown that the projection of such regions on the SNRs leads to: a) local changes in the morphological structure of young shell-type SNRs and b) considerable distortions of the integral parameters of evolved shell-type SNRs (with D > 10 pc) and plerions, up to their complete undetectability against the background of classical and giant H II regions. These factors thus give rise to a new selection effect, connected with the additional limitations that the real structure of the interstellar medium imposes on statistical investigations of SNRs. The influence of this effect on the statistical completeness of objects has been estimated.

  9. Statistical correlations in an ideal gas of particles obeying fractional exclusion statistics.

    PubMed

    Pellegrino, F M D; Angilella, G G N; March, N H; Pucci, R

    2007-12-01

    After a brief discussion of the concepts of fractional exchange and fractional exclusion statistics, we report partly analytical and partly numerical results on thermodynamic properties of assemblies of particles obeying fractional exclusion statistics. The effect of dimensionality is one focal point, the ratio μ/k_B T of chemical potential to thermal energy being obtained numerically as a function of a scaled particle density. Pair correlation functions are also presented as a function of the statistical parameter, with Friedel oscillations developing close to the fermion limit, for sufficiently large density.

  10. Effectiveness of groundwater governance structures and institutions in Tanzania

    NASA Astrophysics Data System (ADS)

    Gudaga, J. L.; Kabote, S. J.; Tarimo, A. K. P. R.; Mosha, D. B.; Kashaigili, J. J.

    2018-05-01

    This paper examines the effectiveness of groundwater governance structures and institutions in Mbarali District, Mbeya Region. The paper adopts an exploratory sequential research design to collect quantitative and qualitative data. A random sample of 90 groundwater users, 50% of them women, was involved in the survey. Descriptive statistics, the Kruskal-Wallis H test and the Mann-Whitney U test were used to compare the differences in responses between groups, while qualitative data were subjected to content analysis. The results show that the Village Councils and Community Water Supply Organizations (COWSOs) were effective in governing groundwater. The results also show a statistically significant difference in the overall extent of effectiveness of the Village Councils in governing groundwater between villages (P = 0.0001), yet there was no significant difference (P > 0.05) between male and female responses on the effectiveness of Village Councils, village water committees and COWSOs. The Mann-Whitney U test showed a statistically significant difference between male and female responses on the effectiveness of formal and informal institutions (P = 0.0001), with informal institutions being more effective than formal institutions. The Kruskal-Wallis H test also showed a statistically significant difference (P ≤ 0.05) in the extent of effectiveness of formal institutions, norms and values between the low, medium and high categories. The paper concludes that COWSOs were more effective in governing groundwater than other groundwater governance structures. Similarly, norms and values were more effective than formal institutions. The paper recommends sensitization and awareness creation on formal institutions so that they can influence water users' behaviour to govern groundwater.
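
    A minimal sketch of the nonparametric comparisons named above (Kruskal-Wallis H and Mann-Whitney U), using SciPy; the group labels and effectiveness ratings below are hypothetical stand-ins, not the study's survey data:

      # Hypothetical 5-point effectiveness ratings from three villages
      from scipy.stats import kruskal, mannwhitneyu

      village_a = [4, 5, 3, 4, 4, 5]
      village_b = [2, 3, 3, 2, 4, 3]
      village_c = [3, 4, 4, 3, 3, 4]

      h_stat, p_kw = kruskal(village_a, village_b, village_c)   # Kruskal-Wallis H test across all groups
      u_stat, p_mw = mannwhitneyu(village_a, village_b)          # Mann-Whitney U test for two groups
      print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.4f}")
      print(f"Mann-Whitney:   U = {u_stat:.2f}, p = {p_mw:.4f}")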

  11. Principles of Statistics: What the Sports Medicine Professional Needs to Know.

    PubMed

    Riemann, Bryan L; Lininger, Monica R

    2018-07-01

    Understanding the results and statistics reported in original research remains a large challenge for many sports medicine practitioners and, in turn, may be one of the biggest barriers to integrating research into sports medicine practice. The purpose of this article is to provide the minimal essentials a sports medicine practitioner needs to know about interpreting statistics and research results to facilitate the incorporation of the latest evidence into practice. Topics covered include the difference between statistical significance and clinical meaningfulness; effect sizes and confidence intervals; reliability statistics, including the minimal detectable difference and minimal important difference; and statistical power. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. The effect of a major cigarette price change on smoking behavior in california: a zero-inflated negative binomial model.

    PubMed

    Sheu, Mei-Ling; Hu, Teh-Wei; Keeler, Theodore E; Ong, Michael; Sung, Hai-Yen

    2004-08-01

    The objective of this paper is to determine the price sensitivity of smokers in their consumption of cigarettes, using evidence from a major increase in California cigarette prices due to Proposition 10 and the Tobacco Settlement. The study sample consists of individual survey data from Behavioral Risk Factor Survey (BRFS) and price data from the Bureau of Labor Statistics between 1996 and 1999. A zero-inflated negative binomial (ZINB) regression model was applied for the statistical analysis. The statistical model showed that price did not have an effect on reducing the estimated prevalence of smoking. However, it indicated that among smokers the price elasticity was at the level of -0.46 and statistically significant. Since smoking prevalence is significantly lower than it was a decade ago, price increases are becoming less effective as an inducement for hard-core smokers to quit, although they may respond by decreasing consumption. For those who only smoke occasionally (many of them being young adults) price increases alone may not be an effective inducement to quit smoking. Additional underlying behavioral factors need to be identified so that more effective anti-smoking strategies can be developed.
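
    As an illustration of the model class named above, here is a hedged sketch of fitting a zero-inflated negative binomial count model with statsmodels; the simulated data and variable names are assumptions, not the BRFS and price data used in the study:

      import numpy as np
      import statsmodels.api as sm
      from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

      rng = np.random.default_rng(0)
      n = 500
      log_price = rng.normal(1.0, 0.1, n)          # hypothetical log cigarette price per respondent
      x = sm.add_constant(log_price)               # design matrix for both model parts
      cigs = rng.negative_binomial(2, 0.3, n)      # hypothetical cigarettes-per-day counts
      cigs[rng.random(n) < 0.4] = 0                # excess zeros (non-smokers)

      model = ZeroInflatedNegativeBinomialP(cigs, x, exog_infl=x, p=2)
      result = model.fit(maxiter=200, disp=False)  # convergence may need tuning on real data
      print(result.summary())
      # With a log-priced regressor in a log-link count model, the count-part coefficient
      # is interpretable as the price elasticity among smokers.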

  13. The effect of inclusion classrooms on the science achievement of general education students

    NASA Astrophysics Data System (ADS)

    Dodd, Matthew Robert

    General Education and Special Education students from three high schools in Rutherford County were sampled to determine the effect of Inclusion classrooms on their academic achievement on the Tennessee Biology I Gateway Exam. Each student's predicted and actual Gateway Exam scores from the academic year 2006-2007 were used to determine the effect the student's classroom had on his or her academic achievement. Independent variables used in the study were gender, ethnicity, socioeconomic level, grade point average, type of classroom (general or Inclusion), and type of student (General Education or Special Education). The statistical tests used in this study were a t-test and a Mann-Whitney U test. In this study, the effect of the Inclusion classroom on General Education students was not statistically significant. Although the Inclusion classroom allows the Special Education student to succeed in the classroom, the effect on General Education students is negligible. This study also provided statistical evidence that the Inclusion classroom did not improve the Special Education students' academic performances on the Gateway Exam. Students in a general education classroom with a GPA above 3.000 and those from a household without a low socioeconomic status performed at a statistically different level in this study.

  14. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
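
    The abstract describes pooling single-case d estimates with fixed- and random-effects averages (the authors supply SPSS macros and R syntax for this). Purely as a language-neutral illustration, here is a short Python sketch of generic DerSimonian-Laird random-effects pooling with hypothetical effect sizes; it is not the authors' code:

      import numpy as np

      d = np.array([0.42, 0.65, 0.30, 0.51])   # hypothetical single-case d estimates
      v = np.array([0.04, 0.06, 0.05, 0.03])   # their estimated sampling variances

      w = 1.0 / v                                              # fixed-effect weights
      d_fixed = np.sum(w * d) / np.sum(w)
      q = np.sum(w * (d - d_fixed) ** 2)                       # Cochran's Q
      c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
      tau2 = max(0.0, (q - (len(d) - 1)) / c)                  # DL between-study variance

      w_star = 1.0 / (v + tau2)                                # random-effects weights
      d_pooled = np.sum(w_star * d) / np.sum(w_star)
      se_pooled = np.sqrt(1.0 / np.sum(w_star))
      print(f"pooled d = {d_pooled:.3f} (SE {se_pooled:.3f}), tau^2 = {tau2:.3f}")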

  15. Measuring the statistical validity of summary meta-analysis and meta-regression results for use in clinical practice.

    PubMed

    Willis, Brian H; Riley, Richard D

    2017-09-20

    An important question for clinicians appraising a meta-analysis is: are the findings likely to be valid in their own practice? Does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity, where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple ('leave-one-out') cross-validation technique, we demonstrate how we may test meta-analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta-analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta-analysis and a tailored meta-regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within-study variance, between-study variance, study sample size, and the number of studies in the meta-analysis. Finally, we apply Vn to two published meta-analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta-analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
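
    To make the cross-validation idea concrete, the sketch below holds out each study in turn and compares it with the fixed-effect pool of the remaining studies. This is only a generic illustration with hypothetical numbers; the paper's Vn statistic and its derived distribution are not reproduced here:

      import numpy as np

      y = np.array([0.20, 0.35, 0.15, 0.40, 0.25])       # hypothetical study effect estimates
      v = np.array([0.010, 0.015, 0.012, 0.020, 0.008])  # their within-study variances

      for i in range(len(y)):
          keep = np.arange(len(y)) != i
          w = 1.0 / v[keep]
          pooled = np.sum(w * y[keep]) / np.sum(w)        # pooled estimate without study i
          pooled_var = 1.0 / np.sum(w)
          z = (y[i] - pooled) / np.sqrt(v[i] + pooled_var)  # standardized held-out discrepancy
          print(f"study {i}: held-out z = {z:.2f}")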

  16. Older and Younger Workers: The Equalling Effects of Health

    ERIC Educational Resources Information Center

    Beck, Vanessa; Quinn, Martin

    2012-01-01

    Purpose: The purpose of this paper is to consider the statistical evidence on the effects that ill health has on labour market participation and opportunities for younger and older workers in the East Midlands (UK). Design/methodology/approach: A statistical analysis of Labour Force Survey data was undertaken to demonstrate that health issues…

  17. Statistical Significance and Effect Size: Two Sides of a Coin.

    ERIC Educational Resources Information Center

    Fan, Xitao

    This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…

  18. Effectiveness of "Essentials for College Math" as a High School Transitional Course

    ERIC Educational Resources Information Center

    Riggleman, Jennifer S.

    2017-01-01

    Statistics on the number of students who leave high school underprepared for postsecondary education, and have to take remedial coursework upon entrance to college vary, but, unfortunately, for at least the last 10 years, these statistics have remained high. This study examined the effectiveness of one transitional high school math curriculum…

  19. Effectiveness of Project Based Learning in Statistics for Lower Secondary Schools

    ERIC Educational Resources Information Center

    Siswono, Tatag Yuli Eko; Hartono, Sugi; Kohar, Ahmad Wachidul

    2018-01-01

    Purpose: This study aimed at investigating the effectiveness of implementing Project Based Learning (PBL) on the topic of statistics at a lower secondary school in Surabaya city, Indonesia, indicated by examining student learning outcomes, student responses, and student activity. Research Methods: A quasi experimental method was conducted over two…

  20. The Effects and Side-Effects of Statistics Education: Psychology Students' (Mis-)Conceptions of Probability

    ERIC Educational Resources Information Center

    Morsanyi, Kinga; Primi, Caterina; Chiesi, Francesca; Handley, Simon

    2009-01-01

    In three studies we looked at two typical misconceptions of probability: the representativeness heuristic, and the equiprobability bias. The literature on statistics education predicts that some typical errors and biases (e.g., the equiprobability bias) increase with education, whereas others decrease. This is in contrast with reasoning theorists'…

  1. An Empirical Consideration of a Balanced Amalgamation of Learning Strategies in Graduate Introductory Statistics Classes

    ERIC Educational Resources Information Center

    Vaughn, Brandon K.

    2009-01-01

    This study considers the effectiveness of a "balanced amalgamated" approach to teaching graduate level introductory statistics. Although some research stresses replacing traditional lectures with more active learning methods, the approach of this study is to combine effective lecturing with active learning and team projects. The results of this…

  2. The Misdirection of Public Policy: Comparing and Combining Standardised Effect Sizes

    ERIC Educational Resources Information Center

    Simpson, Adrian

    2017-01-01

    Increased attention on "what works" in education has led to an emphasis on developing policy from evidence based on comparing and combining a particular statistical summary of intervention studies: the standardised effect size. It is assumed that this statistical summary provides an estimate of the educational impact of interventions and…

  3. The statistics of laser returns from cube-corner arrays on satellite

    NASA Technical Reports Server (NTRS)

    Lehr, C. G.

    1973-01-01

    A method first presented by Goodman is used to derive an equation for the statistical effects associated with laser returns from satellites having retroreflecting arrays of cube corners. The effect of the distribution on the returns of a satellite-tracking system is illustrated by a computation based on randomly generated numbers.

  4. Uncertainty quantification of effective nuclear interactions

    DOE PAGES

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    2016-03-02

    We give a brief review on the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  5. Uncertainty quantification of effective nuclear interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    We give a brief review on the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  6. Cost-Effectiveness Analysis: a proposal of new reporting standards in statistical analysis

    PubMed Central

    Bang, Heejung; Zhao, Hongwei

    2014-01-01

    Cost-effectiveness analysis (CEA) is a method for evaluating the outcomes and costs of competing strategies designed to improve health, and has been applied to a variety of different scientific fields. Yet, there are inherent complexities in cost estimation and CEA from statistical perspectives (e.g., skewness, bi-dimensionality, and censoring). The incremental cost-effectiveness ratio that represents the additional cost per one unit of outcome gained by a new strategy has served as the most widely accepted methodology in the CEA. In this article, we call for expanded perspectives and reporting standards reflecting a more comprehensive analysis that can elucidate different aspects of available data. Specifically, we propose that mean and median-based incremental cost-effectiveness ratios and average cost-effectiveness ratios be reported together, along with relevant summary and inferential statistics as complementary measures for informed decision making. PMID:24605979
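
    A toy numerical illustration of the ratios discussed above, with hypothetical mean costs and effectiveness values for a new strategy versus a comparator (the article's point is that mean- and median-based incremental ratios and average ratios should be reported together):

      new_cost, new_effect = 12000.0, 4.2   # hypothetical mean cost and effectiveness (e.g. QALYs)
      old_cost, old_effect = 9000.0, 3.8

      icer = (new_cost - old_cost) / (new_effect - old_effect)  # incremental cost-effectiveness ratio
      acer_new = new_cost / new_effect                           # average cost-effectiveness ratios
      acer_old = old_cost / old_effect
      print(f"ICER = {icer:.0f} per additional unit of effect")
      print(f"ACERs: new = {acer_new:.0f}, comparator = {acer_old:.0f}")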

  7. An Improved Rank Correlation Effect Size Statistic for Single-Case Designs: Baseline Corrected Tau.

    PubMed

    Tarlow, Kevin R

    2017-07-01

    Measuring treatment effects when an individual's pretreatment performance is improving poses a challenge for single-case experimental designs. It may be difficult to determine whether improvement is due to the treatment or due to the preexisting baseline trend. Tau-U is a popular single-case effect size statistic that purports to control for baseline trend. However, despite its strengths, Tau-U has substantial limitations: Its values are inflated and not bound between -1 and +1, it cannot be visually graphed, and its relatively weak method of trend control leads to unacceptable levels of Type I error wherein ineffective treatments appear effective. An improved effect size statistic based on rank correlation and robust regression, Baseline Corrected Tau, is proposed and field-tested with both published and simulated single-case time series. A web-based calculator for Baseline Corrected Tau is also introduced for use by single-case investigators.
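
    A rough sketch of the underlying idea, assuming the usual ingredients (a Theil-Sen fit to the baseline trend, then Kendall's tau between phase membership and the trend-corrected scores); the data are hypothetical and this is an illustration, not the published implementation or its decision rules:

      import numpy as np
      from scipy.stats import theilslopes, kendalltau

      baseline = np.array([3.0, 4.0, 4.5, 5.0, 5.5])    # improving baseline (preexisting trend)
      treatment = np.array([7.0, 7.5, 8.0, 8.5, 9.0])
      y = np.concatenate([baseline, treatment])
      t = np.arange(len(y))
      phase = np.concatenate([np.zeros(len(baseline)), np.ones(len(treatment))])

      slope, intercept, _, _ = theilslopes(baseline, t[:len(baseline)])  # robust baseline trend
      corrected = y - (intercept + slope * t)             # project the trend across both phases
      tau, p = kendalltau(phase, corrected)               # rank correlation of phase with corrected data
      print(f"baseline-corrected tau = {tau:.2f} (p = {p:.3f})")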

  8. Long-term strategy for the statistical design of a forest health monitoring system

    Treesearch

    Hans T. Schreuder; Raymond L. Czaplewski

    1993-01-01

    A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...

  9. Decision Support Systems: Applications in Statistics and Hypothesis Testing.

    ERIC Educational Resources Information Center

    Olsen, Christopher R.; Bozeman, William C.

    1988-01-01

    Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…

  10. Strategies for Testing Statistical and Practical Significance in Detecting DIF with Logistic Regression Models

    ERIC Educational Resources Information Center

    Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza

    2014-01-01

    This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…

  11. Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization

    ERIC Educational Resources Information Center

    Lock, Robin H.; Lock, Patti Frazer

    2008-01-01

    Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
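
    A minimal example of the kind of simulation-based inference such a curriculum introduces: a percentile bootstrap confidence interval for a difference in group means, with hypothetical measurements:

      import numpy as np

      rng = np.random.default_rng(1)
      group_a = np.array([5.1, 6.3, 5.8, 7.0, 6.1, 5.5])
      group_b = np.array([4.2, 5.0, 4.8, 5.6, 4.4, 5.1])

      boot_diffs = []
      for _ in range(10_000):
          a = rng.choice(group_a, size=len(group_a), replace=True)  # resample with replacement
          b = rng.choice(group_b, size=len(group_b), replace=True)
          boot_diffs.append(a.mean() - b.mean())

      lo, hi = np.percentile(boot_diffs, [2.5, 97.5])
      print(f"95% bootstrap CI for the mean difference: ({lo:.2f}, {hi:.2f})")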

  12. Student and Professor Gender Effects in Introductory Business Statistics

    ERIC Educational Resources Information Center

    Haley, M. Ryan; Johnson, Marianne F.; Kuennen, Eric W.

    2007-01-01

    Studies have yielded highly mixed results as to differences in male and female student performance in statistics courses; the role that professors play in these differences is even less clear. In this paper, we consider the impact of professor and student gender on student performance in an introductory business statistics course taught by…

  13. The Effect of "Clickers" on Attendance in an Introductory Statistics Course: An Action Research Study

    ERIC Educational Resources Information Center

    Amstelveen, Raoul H.

    2013-01-01

    The purpose of this study was to design and implement a Classroom Response System, also known as a "clicker," to increase attendance in introductory statistics courses at an undergraduate university. Since 2010, non-attendance had been prevalent in introductory statistics courses. Moreover, non-attendance created undesirable classrooms…

  14. An analysis of the relationship of flight hours and naval rotary wing aviation mishaps

    DTIC Science & Technology

    2017-03-01

    Estimates found enough evidence to support indicators used for sequestration; high flight hours, night flight, and overwater flight had statistically significant effects on...

  15. Effect of Task Presentation on Students' Performances in Introductory Statistics Courses

    ERIC Educational Resources Information Center

    Tomasetto, Carlo; Matteucci, Maria Cristina; Carugati, Felice; Selleri, Patrizia

    2009-01-01

    Research on academic learning indicates that many students experience major difficulties with introductory statistics and methodology courses. We hypothesized that students' difficulties may depend in part on the fact that statistics tasks are commonly viewed as related to the threatening domain of math. In two field experiments which we carried…

  16. Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.

    PubMed

    Counsell, Alyssa; Harlow, Lisa L

    2017-05-01

    With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of and areas needing improvement for reporting quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.

  17. Antecedents of students' achievement in statistics

    NASA Astrophysics Data System (ADS)

    Awaludin, Izyan Syazana; Razak, Ruzanna Ab; Harris, Hezlin; Selamat, Zarehan

    2015-02-01

    The applications of statistics in most fields have been vast. Many degree programmes at local universities require students to enroll in at least one statistics course. The standard of these courses varies across different degree programmes. This is because of students' diverse academic backgrounds, some of which are far removed from the field of statistics. The high failure rate in statistics courses among non-science stream students has been a concern every year. The purpose of this research is to investigate the antecedents of students' achievement in statistics. A total of 272 students participated in the survey. Multiple linear regression was applied to examine the relationship between the factors and achievement. We found that statistics anxiety was a significant predictor of students' achievement. We also found that students' age has a significant effect on achievement. Older students are more likely to achieve lower scores in statistics. Students' level of study also has a significant impact on their achievement in statistics.
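
    For illustration only, a multiple linear regression of the kind described can be fitted as below; the predictors (anxiety, age, level of study), their coding, and the simulated values are assumptions, not the survey data:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 272
      anxiety = rng.normal(3.0, 0.8, n)                      # hypothetical anxiety scale scores
      age = rng.integers(19, 40, n).astype(float)
      level = rng.integers(1, 4, n).astype(float)            # assumed coding of level of study
      score = 80 - 4 * anxiety - 0.3 * age + rng.normal(0, 5, n)

      X = sm.add_constant(np.column_stack([anxiety, age, level]))
      fit = sm.OLS(score, X).fit()                           # achievement regressed on the factors
      print(fit.summary())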

  18. Effect of Internet-Based Cognitive Apprenticeship Model (i-CAM) on Statistics Learning among Postgraduate Students

    PubMed Central

    Saadati, Farzaneh; Ahmad Tarmizi, Rohani

    2015-01-01

    Because students’ ability to use statistics, which is mathematical in nature, is a concern of educators, embedding the pedagogical characteristics of learning within an e-learning system is ‘value added’ because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students’ problem-solving performance at the end of each phase. In addition, the differences in students’ test scores were statistically significant after controlling for the pre-test scores. The findings conveyed in this paper confirmed the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students. PMID:26132553

  19. Phenformin-induced Hypoglycaemia in Normal Subjects*

    PubMed Central

    Lyngsøe, J.; Trap-Jensen, J.

    1969-01-01

    Study of the effect of phenformin on the blood glucose level in normal subjects before and during 70 hours of starvation showed a statistically significant hypoglycaemic effect after 40 hours of starvation. This effect was not due to increased glucose utilization. Another finding in this study was a statistically significant decrease in total urinary nitrogen excretion during starvation in subjects given phenformin. These findings show that the hypoglycaemic effect of phenformin in starved normal subjects is due to inhibition of gluconeogenesis. PMID:5780431

  20. Statistical methods to estimate treatment effects from multichannel electroencephalography (EEG) data in clinical trials.

    PubMed

    Ma, Junshui; Wang, Shubing; Raubertas, Richard; Svetnik, Vladimir

    2010-07-15

    With the increasing popularity of using electroencephalography (EEG) to reveal the treatment effect in drug development clinical trials, the vast volume and complex nature of EEG data compose an intriguing, but challenging, topic. In this paper the statistical analysis methods recommended by the EEG community, along with methods frequently used in the published literature, are first reviewed. A straightforward adjustment of the existing methods to handle multichannel EEG data is then introduced. In addition, based on the spatial smoothness property of EEG data, a new category of statistical methods is proposed. The new methods use a linear combination of low-degree spherical harmonic (SPHARM) basis functions to represent a spatially smoothed version of the EEG data on the scalp, which is close to a sphere in shape. In total, seven statistical methods, including both the existing and the newly proposed methods, are applied to two clinical datasets to compare their power to detect a drug effect. Contrary to the EEG community's recommendation, our results suggest that (1) the nonparametric method does not outperform its parametric counterpart; and (2) including baseline data in the analysis does not always improve the statistical power. In addition, our results recommend that (3) simple paired statistical tests should be avoided due to their poor power; and (4) the proposed spatially smoothed methods perform better than their unsmoothed versions. Copyright 2010 Elsevier B.V. All rights reserved.
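
    A minimal sketch of the spatial-smoothing idea described above: project one time-slice of multichannel data onto low-degree spherical harmonics by least squares and reconstruct a smoothed scalp map. The electrode angles, degree cut-off, and real-valued basis construction are simplifying assumptions, not the paper's pipeline:

      import numpy as np
      from scipy.special import sph_harm

      rng = np.random.default_rng(3)
      n_chan = 32
      theta = rng.uniform(0, 2 * np.pi, n_chan)     # assumed azimuth of each electrode
      phi = rng.uniform(0, np.pi / 2, n_chan)       # assumed polar angle (upper hemisphere)
      values = rng.normal(0, 1, n_chan)             # one time point of EEG, hypothetical

      max_degree = 3
      basis = []
      for n in range(max_degree + 1):
          for m in range(-n, n + 1):
              y = sph_harm(m, n, theta, phi)
              basis.append(y.real if m >= 0 else y.imag)   # simple real-valued basis functions
      A = np.column_stack(basis)

      coef, *_ = np.linalg.lstsq(A, values, rcond=None)    # least-squares SPHARM coefficients
      smoothed = A @ coef                                   # spatially smoothed channel values
      rms = np.sqrt(np.mean((values - smoothed) ** 2))
      print(f"{A.shape[1]} basis functions, residual RMS = {rms:.3f}")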

  1. Improving Education in Medical Statistics: Implementing a Blended Learning Model in the Existing Curriculum

    PubMed Central

    Milic, Natasa M.; Trajkovic, Goran Z.; Bukumiric, Zoran M.; Cirkovic, Andja; Nikolic, Ivan M.; Milin, Jelena S.; Milic, Nikola V.; Savic, Marko D.; Corac, Aleksandar M.; Marinkovic, Jelena M.; Stanisavljevic, Dejana M.

    2016-01-01

    Background Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. Methods This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013–14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Results Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p<0.001). Conclusion This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics. PMID:26859832

  2. Improving Education in Medical Statistics: Implementing a Blended Learning Model in the Existing Curriculum.

    PubMed

    Milic, Natasa M; Trajkovic, Goran Z; Bukumiric, Zoran M; Cirkovic, Andja; Nikolic, Ivan M; Milin, Jelena S; Milic, Nikola V; Savic, Marko D; Corac, Aleksandar M; Marinkovic, Jelena M; Stanisavljevic, Dejana M

    2016-01-01

    Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013-14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p<0.001). This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics.

  3. Design of order statistics filters using feedforward neural networks

    NASA Astrophysics Data System (ADS)

    Maslennikova, Yu. S.; Bochkarev, V. V.

    2016-08-01

    In recent years significant progress has been made in the development of nonlinear data processing techniques. Such techniques are widely used in digital data filtering and image enhancement. Many of the most effective nonlinear filters are based on order statistics. The widely used median filter is the best known order statistic filter. A generalized form of these filters can be constructed based on Lloyd's statistics. Filters based on order statistics have excellent robustness properties in the presence of impulsive noise. In this paper, we present a special approach for the synthesis of order statistics filters using artificial neural networks. Optimal Lloyd's statistics are used for selecting the initial weights of the neural network. The adaptive properties of neural networks provide opportunities to optimize order statistics filters for data with asymmetric distribution functions. Different examples demonstrate the properties and performance of the presented approach.
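
    To make the filter class concrete, here is a small sketch of an order-statistics (L-) filter: each output sample is a weighted combination of the sorted samples in a sliding window, and putting all weight on the middle rank recovers the median filter. The weights below are illustrative, not Lloyd-optimal, and no neural network is involved:

      import numpy as np

      def l_filter(x, weights):
          k = len(weights)
          pad = k // 2
          xp = np.pad(x, pad, mode="edge")
          out = np.empty_like(x, dtype=float)
          for i in range(len(x)):
              window = np.sort(xp[i:i + k])      # order statistics of the current window
              out[i] = np.dot(weights, window)   # weighted combination of sorted samples
          return out

      signal = np.array([1.0, 1.2, 0.9, 8.0, 1.1, 1.0, 0.95, -6.0, 1.05, 1.1])  # impulsive noise
      median_weights = np.array([0, 0, 1, 0, 0], dtype=float)   # median filter as a special case
      print(l_filter(signal, median_weights))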

  4. Precision, Reliability, and Effect Size of Slope Variance in Latent Growth Curve Models: Implications for Statistical Power Analysis

    PubMed Central

    Brandmaier, Andreas M.; von Oertzen, Timo; Ghisletta, Paolo; Lindenberger, Ulman; Hertzog, Christopher

    2018-01-01

    Latent Growth Curve Models (LGCM) have become a standard technique to model change over time. Prediction and explanation of inter-individual differences in change are major goals in lifespan research. The major determinants of statistical power to detect individual differences in change are the magnitude of true inter-individual differences in linear change (LGCM slope variance), design precision, alpha level, and sample size. Here, we show that design precision can be expressed as the inverse of effective error. Effective error is determined by instrument reliability and the temporal arrangement of measurement occasions. However, it also depends on another central LGCM component, the variance of the latent intercept and its covariance with the latent slope. We derive a new reliability index for LGCM slope variance—effective curve reliability (ECR)—by scaling slope variance against effective error. ECR is interpretable as a standardized effect size index. We demonstrate how effective error, ECR, and statistical power for a likelihood ratio test of zero slope variance formally relate to each other and how they function as indices of statistical power. We also provide a computational approach to derive ECR for arbitrary intercept-slope covariance. With practical use cases, we argue for the complementary utility of the proposed indices of a study's sensitivity to detect slope variance when making a priori longitudinal design decisions or communicating study designs. PMID:29755377
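
    Purely as a toy illustration, and under the assumption that the proposed index takes the familiar variance-ratio form (slope variance scaled against effective error), a back-of-the-envelope calculation might look like the following; the numbers are hypothetical, and the paper's actual derivation of effective error from instrument reliability, measurement occasions, and the intercept-slope covariance is not reproduced:

      slope_var = 0.25        # hypothetical true inter-individual variance in linear change
      effective_error = 0.75  # hypothetical effective error folding in reliability, design, intercept terms

      # Assumed variance-ratio form: larger values indicate a design more sensitive to slope variance
      ecr = slope_var / (slope_var + effective_error)
      print(f"illustrative effective curve reliability = {ecr:.2f}")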

  5. Statistics for characterizing data on the periphery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, James P; Hush, Donald R

    2010-01-01

    We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
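
    The detection setting referred to above can be illustrated with a plain Mahalanobis-distance score under an ellipsoidal model; the sample covariance is used here only for simplicity, whereas the paper recommends alternatives that model the periphery more efficiently:

      import numpy as np

      rng = np.random.default_rng(4)
      background = rng.multivariate_normal([0, 0], [[2.0, 0.8], [0.8, 1.0]], size=500)
      mean = background.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(background, rowvar=False))

      def mahalanobis_sq(x):
          d = x - mean
          return float(d @ cov_inv @ d)   # squared distance under the ellipsoidal model

      print(mahalanobis_sq(np.array([0.1, -0.2])))   # typical sample: small distance
      print(mahalanobis_sq(np.array([6.0, 5.0])))    # peripheral/anomalous sample: large distance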

  6. Statistical model specification and power: recommendations on the use of test-qualified pooling in analysis of experimental data

    PubMed Central

    Colegrave, Nick

    2017-01-01

    A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is only carried out if statistical testing on the basis of applying that data to a previous more complicated model provides motivation for this model simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure. PMID:28330912
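
    For concreteness, the procedure being critiqued looks roughly like the sketch below (shown to make the practice explicit, not to endorse it): fit a factorial model, and if the interaction is non-significant, refit without it so its variation is pooled into the error term. The data are simulated and the 0.05 threshold is the conventional assumption:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      rng = np.random.default_rng(5)
      df = pd.DataFrame({
          "a": np.repeat(["a1", "a2"], 20),
          "b": np.tile(np.repeat(["b1", "b2"], 10), 2),
      })
      df["y"] = rng.normal(0, 1, len(df)) + (df["a"] == "a2") * 0.5

      full = smf.ols("y ~ a * b", data=df).fit()
      anova_full = anova_lm(full)
      if anova_full.loc["a:b", "PR(>F)"] > 0.05:          # the 'test-qualified' step
          reduced = smf.ols("y ~ a + b", data=df).fit()    # interaction variation pooled into error
          print(anova_lm(reduced))
      else:
          print(anova_full)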

  7. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
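
    As one minimal example of the simpler analyses surveyed above (using operation time as a proxy for learning in a single-operator series), a power-law learning curve can be fitted to consecutive case times; the data are simulated, and the review's multilevel and time-series methods are not reproduced here:

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(6)
      case_number = np.arange(1, 51)
      op_time = 120 * case_number ** -0.2 + rng.normal(0, 5, case_number.size)  # hypothetical minutes

      def power_law(n, a, b):
          return a * n ** b

      (a_hat, b_hat), _ = curve_fit(power_law, case_number, op_time, p0=(100.0, -0.1))
      print(f"fitted curve: time = {a_hat:.1f} * case^{b_hat:.3f}  (negative exponent indicates learning)")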

  8. Current and future health care professionals attitudes toward and knowledge of statistics: How confidence influences learning.

    PubMed

    Baghi, Heibatollah; Kornides, Melanie L

    2013-01-01

    Health care professionals require some understanding of statistics to successfully implement evidence based practice. Developing competency in statistical reasoning is necessary for students training in health care administration, research, and clinical care. Recently, interest in healthcare professionals' attitudes toward statistics has increased substantially due to evidence that these attitudes can hinder professionals in developing an understanding of statistical concepts. In this study, we analyzed pre- and post-instruction attitudes towards and knowledge of statistics obtained from health science graduate students, including nurses and nurse practitioners, enrolled in an introductory graduate course in statistics (n = 165). Results show that the students already held generally positive attitudes toward statistics at the beginning of the course. However, these attitudes, along with the students' statistical proficiency, improved after 10 weeks of instruction. The results have implications for curriculum design and delivery methods as well as for health professionals' effective use of statistics in critically evaluating and utilizing research in their practices.

  9. Current and future health care professionals attitudes toward and knowledge of statistics: How confidence influences learning

    PubMed Central

    Baghi, Heibatollah; Kornides, Melanie L.

    2014-01-01

    Background Health care professionals require some understanding of statistics to successfully implement evidence based practice. Developing competency in statistical reasoning is necessary for students training in health care administration, research, and clinical care. Recently, interest in healthcare professionals' attitudes toward statistics has increased substantially due to evidence that these attitudes can hinder professionals in developing an understanding of statistical concepts. Methods In this study, we analyzed pre- and post-instruction attitudes towards and knowledge of statistics obtained from health science graduate students, including nurses and nurse practitioners, enrolled in an introductory graduate course in statistics (n = 165). Results and Conclusions Results show that the students already held generally positive attitudes toward statistics at the beginning of the course. However, these attitudes—along with the students’ statistical proficiency—improved after 10 weeks of instruction. The results have implications for curriculum design and delivery methods as well as for health professionals’ effective use of statistics in critically evaluating and utilizing research in their practices. PMID:25419256

  10. Intensity statistics in the presence of translational noncrystallographic symmetry.

    PubMed

    Read, Randy J; Adams, Paul D; McCoy, Airlie J

    2013-02-01

    In the case of translational noncrystallographic symmetry (tNCS), two or more copies of a component in the asymmetric unit of the crystal are present in a similar orientation. This causes systematic modulations of the reflection intensities in the diffraction pattern, leading to problems with structure determination and refinement methods that assume, either implicitly or explicitly, that the distribution of intensities is a function only of resolution. To characterize the statistical effects of tNCS accurately, it is necessary to determine the translation relating the copies, any small rotational differences in their orientations, and the size of random coordinate differences caused by conformational differences. An algorithm to estimate these parameters and refine their values against a likelihood function is presented, and it is shown that by accounting for the statistical effects of tNCS it is possible to unmask the competing statistical effects of twinning and tNCS and to more robustly assess the crystal for the presence of twinning.
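
    As a toy illustration of why tNCS modulates intensities: for two identical copies related by a pure translation t (in fractional coordinates), the expected intensity of reflection h picks up a factor of roughly 2(1 + D cos(2π h·t)), where the damping factor D stands in for the rotational and conformational differences the algorithm refines. The numbers below are hypothetical, and this is not the refinement procedure described in the abstract:

      import numpy as np

      t = np.array([0.5, 0.0, 0.25])   # hypothetical tNCS translation vector (fractional coordinates)
      D = 0.8                           # hypothetical damping factor for rotation/coordinate differences
      hkl = np.array([[1, 0, 0], [2, 0, 0], [0, 0, 2], [0, 0, 4], [1, 1, 1]])

      modulation = 2.0 * (1.0 + D * np.cos(2.0 * np.pi * hkl @ t))   # expected intensity factors
      for h, g in zip(hkl, modulation):
          print(h, f"expected intensity factor ~ {g:.2f}")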

  11. Effects of statistical learning on the acquisition of grammatical categories through Qur'anic memorization: A natural experiment.

    PubMed

    Zuhurudeen, Fathima Manaar; Huang, Yi Ting

    2016-03-01

    Empirical evidence for statistical learning comes from artificial language tasks, but it is unclear how these effects scale up outside of the lab. The current study turns to a real-world test case of statistical learning where native English speakers encounter the syntactic regularities of Arabic through memorization of the Qur'an. This unique input provides extended exposure to the complexity of a natural language, with minimal semantic cues. Memorizers were asked to distinguish unfamiliar nouns and verbs based on their co-occurrence with familiar pronouns in an Arabic language sample. Their performance was compared to that of classroom learners who had explicit knowledge of pronoun meanings and grammatical functions. Grammatical judgments were more accurate in memorizers compared to non-memorizers. No effects of classroom experience were found. These results demonstrate that real-world exposure to the statistical properties of a natural language facilitates the acquisition of grammatical categories. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Data analysis report on ATS-F COMSAT millimeter wave propagation experiment, part 1. [effects of hydrometeors on ground to satellite communication

    NASA Technical Reports Server (NTRS)

    Hyde, G.

    1976-01-01

    The 13/18 GHz COMSAT Propagation Experiment (CPE) was performed to measure attenuation caused by hydrometeors along slant paths from transmitting terminals on the ground to the ATS-6 satellite. The effectiveness of site diversity in overcoming this impairment was also studied. Problems encountered in assembling a valid data base of rain induced attenuation data for statistical analysis are considered. The procedures used to obtain the various statistics are then outlined. The graphs and tables of statistical data for the 15 dual frequency (13 and 18 GHz) site diversity locations are discussed. Cumulative rain rate statistics for the Fayetteville and Boston sites based on point rainfall data collected are presented along with extrapolations of the attenuation and point rainfall data.

  13. Targeted On-Demand Team Performance App Development

    DTIC Science & Technology

    2016-10-01

    ... from three sites; 6) Preliminary analysis indicates a larger-than-estimated effect size and the study is sufficiently powered for generalizable outcomes ... statistical analyses, and examine any resulting qualitative data for trends or connections to statistical outcomes.

  14. Demonstrating the Effectiveness of an Integrated and Intensive Research Methods and Statistics Course Sequence

    ERIC Educational Resources Information Center

    Pliske, Rebecca M.; Caldwell, Tracy L.; Calin-Jageman, Robert J.; Taylor-Ritzler, Tina

    2015-01-01

    We developed a two-semester series of intensive (six-contact hours per week) behavioral research methods courses with an integrated statistics curriculum. Our approach includes the use of team-based learning, authentic projects, and Excel and SPSS. We assessed the effectiveness of our approach by examining our students' content area scores on the…

  15. Differential Effects of Goal Setting and Value Reappraisal on College Women's Motivation and Achievement in Statistics

    ERIC Educational Resources Information Center

    Acee, Taylor Wayne

    2009-01-01

    The purpose of this dissertation was to investigate the differential effects of goal setting and value reappraisal on female students' self-efficacy beliefs, value perceptions, exam performance and continued interest in statistics. It was hypothesized that the Enhanced Goal Setting Intervention (GS-E) would positively impact students'…

  16. The Effects of Clinically Relevant Multiple-Choice Items on the Statistical Discrimination of Physician Clinical Competence.

    ERIC Educational Resources Information Center

    Downing, Steven M.; Maatsch, Jack L.

    To test the effect of clinically relevant multiple-choice item content on the validity of statistical discriminations of physicians' clinical competence, data were collected from a field test of the Emergency Medicine Examination, test items for the certification of specialists in emergency medicine. Two 91-item multiple-choice subscales were…

  17. 78 FR 19098 - Wage Methodology for the Temporary Non-Agricultural Employment H-2B Program; Delay of Effective Date

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-29

    ... by dividing the Bureau of Labor Statistics Occupational Employment Statistics Survey (OES survey... DEPARTMENT OF LABOR Employment and Training Administration 20 CFR Part 655 RIN 1205-AB61 Wage Methodology for the Temporary Non-Agricultural Employment H-2B Program; Delay of Effective Date AGENCY...

  18. Solar-terrestrial predictions proceedings. Volume 4: Prediction of terrestrial effects of solar activity

    NASA Technical Reports Server (NTRS)

    Donnelly, R. E. (Editor)

    1980-01-01

    Papers on the prediction of ionospheric and radio propagation conditions based primarily on empirical or statistical relations are discussed. Predictions of sporadic E, spread F, and scintillations generally involve statistical or empirical predictions. The correlation between solar activity and terrestrial seismic activity and the possible relation between solar activity and biological effects are discussed.

  19. A Study of the Effectiveness of Web-Based Homework in Teaching Undergraduate Business Statistics

    ERIC Educational Resources Information Center

    Palocsay, Susan W.; Stevens, Scott P.

    2008-01-01

    Web-based homework (WBH) Technology can simplify the creation and grading of assignments as well as provide a feasible platform for assessment testing, but its effect on student learning in business statistics is unknown. This is particularly true of the latest software development of Web-based tutoring agents that dynamically evaluate individual…

  20. The Effect of Using Case Studies in Business Statistics

    ERIC Educational Resources Information Center

    Pariseau, Susan E.; Kezim, Boualem

    2007-01-01

    The authors evaluated the effect on learning of using case studies in business statistics courses. The authors divided students into 3 groups: a control group, a group that completed 1 case study, and a group that completed 3 case studies. Results evidenced that, on average, students whom the authors required to complete a case analysis received…

  1. Confidence crisis of results in biomechanics research.

    PubMed

    Knudson, Duane

    2017-11-01

    Many biomechanics studies have small sample sizes and incorrect statistical analyses, so the reporting of inaccurate inferences and inflated effect magnitudes is common in the field. This review examines these issues in biomechanics research and summarises potential solutions from research in other fields to increase confidence in the experimental effects reported in biomechanics. Authors, reviewers and editors of biomechanics research reports are encouraged to improve sample sizes and the resulting statistical power, improve reporting transparency, improve the rigour of the statistical analyses used, and increase the acceptance of replication studies to improve the validity of inferences from data in biomechanics research. The application of sports biomechanics research results would also improve if a larger percentage of unbiased effects and their uncertainty were reported in the literature.
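
    The sample-size and power issues raised here are straightforward to quantify a priori. The sketch below is not from the review itself; it assumes a two-group comparison with a medium standardized effect size and illustrative targets.

```python
# A minimal a-priori power calculation for a two-group comparison of a
# continuous biomechanical outcome; effect size and targets are assumed values.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,  # medium Cohen's d (assumed)
                                    alpha=0.05,
                                    power=0.80)
print(f"participants needed per group: {n_per_group:.0f}")

# Conversely, the power actually achieved with a typical small sample:
achieved = analysis.power(effect_size=0.5, nobs1=10, alpha=0.05)
print(f"power with n = 10 per group: {achieved:.2f}")
```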

  2. Grounding statistical learning in context: The effects of learning and retrieval contexts on cross-situational word learning.

    PubMed

    Chen, Chi-Hsin; Yu, Chen

    2017-06-01

    Natural language environments usually provide structured contexts for learning. This study examined the effects of semantically themed contexts-in both learning and retrieval phases-on statistical word learning. Results from 2 experiments consistently showed that participants had higher performance in semantically themed learning contexts. In contrast, themed retrieval contexts did not affect performance. Our work suggests that word learners are sensitive to statistical regularities not just at the level of individual word-object co-occurrences but also at another level containing a whole network of associations among objects and their properties.

  3. New applications of maximum likelihood and Bayesian statistics in macromolecular crystallography.

    PubMed

    McCoy, Airlie J

    2002-10-01

    Maximum likelihood methods are well known to macromolecular crystallographers as the methods of choice for isomorphous phasing and structure refinement. Recently, the use of maximum likelihood and Bayesian statistics has extended to the areas of molecular replacement and density modification, placing these methods on a stronger statistical foundation and making them more accurate and effective.

  4. Strengthening Statistics Graduate Programs with Statistical Collaboration--The Case of Hawassa University, Ethiopia

    ERIC Educational Resources Information Center

    Goshu, Ayele Taye

    2016-01-01

    This paper describes the experiences gained from the statistical collaboration center established at Hawassa University as part of the LISA 2020 network. The center has a setup similar to that of LISA at Virginia Tech. Statisticians are trained in how to become more effective scientific collaborators with researchers. The services have been delivered since…

  5. Outliers in Questionnaire Data: Can They Be Detected and Should They Be Removed?

    ERIC Educational Resources Information Center

    Zijlstra, Wobbe P.; van der Ark, L. Andries; Sijtsma, Klaas

    2011-01-01

    Outliers in questionnaire data are unusual observations, which may bias statistical results, and outlier statistics may be used to detect such outliers. The authors investigated the effect outliers have on the specificity and the sensitivity of each of six different outlier statistics. The Mahalanobis distance and the item-pair based outlier…

  6. The Precision-Power-Gradient Theory for Teaching Basic Research Statistical Tools to Graduate Students.

    ERIC Educational Resources Information Center

    Cassel, Russell N.

    This paper relates educational and psychological statistics to certain "Research Statistical Tools" (RSTs) necessary to accomplish and understand general research in the behavioral sciences. Emphasis is placed on acquiring an effective understanding of the RSTs, and to this end they are ordered on a continuum scale in terms of individual…

  7. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    ERIC Educational Resources Information Center

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  8. Linking Science and Statistics: Curriculum Expectations in Three Countries

    ERIC Educational Resources Information Center

    Watson, Jane M.

    2017-01-01

    This paper focuses on the curriculum links between statistics and science that teachers need to understand and apply in order to be effective teachers of the two fields of study. Meaningful statistics does not exist without context and science is the context for this paper. Although curriculum documents differ from country to country, this paper…

  9. A Model for Developing and Assessing Community College Students' Conceptions of the Range, Interquartile Range, and Standard Deviation

    ERIC Educational Resources Information Center

    Turegun, Mikhail

    2011-01-01

    Traditional curricular materials and pedagogical strategies have not been effective in developing conceptual understanding of statistics topics and statistical reasoning abilities of students. Many of the changes proposed by statistics education research and the reform movement over the past decade have supported efforts to transform teaching…

  10. Enhancing an Undergraduate Business Statistics Course: Linking Teaching and Learning with Assessment Issues

    ERIC Educational Resources Information Center

    Fairfield-Sonn, James W.; Kolluri, Bharat; Rogers, Annette; Singamsetti, Rao

    2009-01-01

    This paper examines several ways in which teaching effectiveness and student learning in an undergraduate Business Statistics course can be enhanced. First, we review some key concepts in Business Statistics that are often challenging to teach and show how using real data sets assists students in developing a deeper understanding of the concepts.…

  11. Reform-Oriented Teaching of Introductory Statistics in the Health, Social and Behavioral Sciences--Historical Context and Rationale

    ERIC Educational Resources Information Center

    Hassad, Rossi A.

    2009-01-01

    There is widespread emphasis on reform in the teaching of introductory statistics at the college level. Underpinning this reform is a consensus among educators and practitioners that traditional curricular materials and pedagogical strategies have not been effective in promoting statistical literacy, a competency that is becoming increasingly…

  12. Incorporating Code-Based Software in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  13. A Mediation Model to Explain the Role of Mathematics Skills and Probabilistic Reasoning on Statistics Achievement

    ERIC Educational Resources Information Center

    Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca

    2016-01-01

    Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…

  14. Accounting for Population Structure in Gene-by-Environment Interactions in Genome-Wide Association Studies Using Mixed Models.

    PubMed

    Sul, Jae Hoon; Bilow, Michael; Yang, Wen-Yun; Kostem, Emrah; Furlotte, Nick; He, Dan; Eskin, Eleazar

    2016-03-01

    Although genome-wide association studies (GWASs) have discovered numerous novel genetic variants associated with many complex traits and diseases, those genetic variants typically explain only a small fraction of phenotypic variance. Factors that account for phenotypic variance include environmental factors and gene-by-environment interactions (GEIs). Recently, several studies have conducted genome-wide gene-by-environment association analyses and demonstrated important roles of GEIs in complex traits. One of the main challenges in these association studies is to control effects of population structure that may cause spurious associations. Many studies have analyzed how population structure influences statistics of genetic variants and developed several statistical approaches to correct for population structure. However, the impact of population structure on GEI statistics in GWASs has not been extensively studied, nor have there been methods designed to correct for population structure on GEI statistics. In this paper, we show both analytically and empirically that population structure may cause spurious GEIs and use both simulation and two GWAS datasets to support our finding. We propose a statistical approach based on mixed models to account for population structure on GEI statistics. We find that our approach effectively controls population structure on statistics for GEIs as well as for genetic variants.
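
    The mixed-model adjustment can be sketched by contrasting an ordinary least-squares test of the gene-by-environment term with a generalized least-squares fit that conditions on a genomic relationship matrix. This is only a simplified illustration, not the authors' method: variance components are taken as known (real mixed-model approaches estimate them), and the simulated data are entirely invented.

```python
# Sketch: testing a GxE term while conditioning on relatedness (GLS) versus
# ignoring it (OLS). Variance components are assumed known for simplicity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, m = 500, 200                                    # individuals, markers for kinship
G = rng.binomial(2, 0.3, size=(n, m)).astype(float)
Gs = (G - G.mean(axis=0)) / G.std(axis=0)
K = Gs @ Gs.T / m                                  # simple genomic relationship matrix

g = G[:, 0]                                        # test SNP
e = rng.normal(size=n)                             # environmental exposure
u = rng.multivariate_normal(np.zeros(n), 0.5 * K)  # structured polygenic effect
y = 0.2 * g + 0.3 * e + u + rng.normal(scale=0.7, size=n)   # no true GxE

X = sm.add_constant(np.column_stack([g, e, g * e]))
ols = sm.OLS(y, X).fit()                           # ignores relatedness
V = 0.5 * K + 0.49 * np.eye(n)                     # assumed-known covariance
gls = sm.GLS(y, X, sigma=V).fit()                  # mixed-model-style adjustment
print(f"GxE p-value, OLS: {ols.pvalues[-1]:.3f}  GLS: {gls.pvalues[-1]:.3f}")
```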

  15. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births

    PubMed Central

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A. Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers. PMID:29546120

  16. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births.

    PubMed

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers.

  17. Statistical analysis of sparse infection data and its implications for retroviral treatment trials in primates.

    PubMed Central

    Spouge, J L

    1992-01-01

    Reports on retroviral primate trials rarely publish any statistical analysis. Present statistical methodology lacks appropriate tests for these trials and effectively discourages quantitative assessment. This paper describes the theory behind VACMAN, a user-friendly computer program that calculates statistics for in vitro and in vivo infectivity data. VACMAN's analysis applies to many retroviral trials using i.v. challenges and is valid whenever the viral dose-response curve has a particular shape. Statistics from actual i.v. retroviral trials illustrate some unappreciated principles of effective animal use: dilutions other than 1:10 can improve titration accuracy; infecting titration animals at the lowest doses possible can lower challenge doses; and finally, challenging test animals in small trials with more virus than controls safeguards against false successes, "reuses" animals, and strengthens experimental conclusions. The theory presented also explains the important concept of viral saturation, a phenomenon that may cause in vitro and in vivo titrations to agree for some retroviral strains and disagree for others. PMID:1323844

  18. Australia 31-GHz brightness temperature exceedance statistics

    NASA Technical Reports Server (NTRS)

    Gary, B. L.

    1988-01-01

    Water vapor radiometer measurements were made at DSS 43 during an 18 month period. Brightness temperatures at 31 GHz were subjected to a statistical analysis which included correction for the effects of occasional water on the radiometer radome. An exceedance plot was constructed, and the 1 percent exceedance level occurs at 120 K. The 5 percent exceedance level occurs at 70 K, compared with 75 K in Spain. These values are valid for all of the three month groupings that were studied.
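
    Exceedance statistics of this kind are simply upper percentiles of the empirical brightness-temperature distribution. A minimal sketch with simulated values follows; these are illustrative numbers, not the DSS 43 measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative 31 GHz sky brightness temperatures (K); not the measured data.
tb = rng.gamma(shape=3.0, scale=15.0, size=50_000)

for pct in (1, 5):
    # The p% exceedance level is the value exceeded p% of the time,
    # i.e. the (100 - p)th percentile of the distribution.
    level = np.percentile(tb, 100 - pct)
    print(f"{pct}% exceedance level: {level:.0f} K")
```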

  19. The boundary is mixed

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo

    2017-08-01

    We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.

  20. Modulation Doped GaAs/AlxGa(1-x)As Layered Structures with Applications to Field Effect Transistors.

    DTIC Science & Technology

    1982-02-15

    function of the doping density at 300 and 77 K for the classical Boltzmann statistics or depletion approximation (solid line) and for the approximate ... Fermi-Dirac statistics (equation (19), dotted line). This comparison demonstrates that the deviation from Boltzmann statistics is quite noticeable ... tunneling Schottky barriers cannot be obtained at these doping levels. The dotted lines are obtained when Boltzmann statistics are used in the Al Ga ...

  1. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that can be applied to the management, control and improvement of processes for the purpose of analysing technical measurement results. An analysis of these international standards and guides on statistical methods for the estimation of measurement results, together with recommendations for their application in laboratories, is presented. To support this analysis, cause-and-effect Ishikawa diagrams concerning the application of statistical methods to the estimation of measurement results are constructed.

  2. Effects of Consecutive Basketball Games on the Game-Related Statistics that Discriminate Winner and Losing Teams

    PubMed Central

    Ibáñez, Sergio J.; García, Javier; Feu, Sebastian; Lorenzo, Alberto; Sampaio, Jaime

    2009-01-01

    The aim of the present study was to identify the game-related statistics that discriminated basketball winning and losing teams in each of the three consecutive games played in a condensed tournament format. The data were obtained from the Spanish Basketball Federation and included game-related statistics from the Under-20 league (2005-2006 and 2006-2007 seasons). A total of 223 games were analyzed with the following game-related statistics: two and three-point field goals (made and missed), free-throws (made and missed), offensive and defensive rebounds, assists, steals, turnovers, blocks (made and received), fouls committed, ball possessions and offensive rating. Results showed that winning teams in this competition had better values in all game-related statistics, with the exception of three-point field goals made, free-throws missed and turnovers (p ≥ 0.05). The main effect of game number was identified only in turnovers, with a statistically significant decrease between the second and third game. No interaction was found in the analysed variables. A discriminant analysis identified two-point field goals made, defensive rebounds and assists as discriminators between winning and losing teams in all three games. In addition, the three-point field goals made contributed to discriminating teams in game three, suggesting a moderate effect of fatigue. Coaches may benefit from being aware of this variation in game-determinant statistics and from adjusting offensive and defensive strategies in the third game to exploit or protect three-point field-goal performance. Key points: Overall team performances across the three consecutive games were very similar, not confirming an accumulated fatigue effect. The results for three-point field goals in the third game suggested that winning teams were able to shoot better from longer distances, which could reflect a higher conditioning status and/or the losing teams' lower conditioning in defense. PMID:24150011
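
    The discriminant step can be sketched with a linear discriminant analysis on box-score statistics. The sketch below uses simulated games: the feature names follow the abstract, but all values and group means are invented.

```python
# A minimal sketch of discriminating winning from losing teams with LDA,
# on simulated box-score data; none of these numbers come from real games.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
n = 223  # games per outcome class here is arbitrary
features = ["2pt_made", "def_rebounds", "assists", "3pt_made"]
winners = rng.normal(loc=[28, 25, 15, 7], scale=4, size=(n, 4))
losers = rng.normal(loc=[24, 21, 11, 6], scale=4, size=(n, 4))

X = np.vstack([winners, losers])
y = np.array([1] * n + [0] * n)            # 1 = win, 0 = loss

lda = LinearDiscriminantAnalysis().fit(X, y)
# Which statistics drive the discrimination in this simulated setting:
for name, coef in zip(features, lda.coef_[0]):
    print(f"{name:>12}: {coef:+.3f}")
print(f"resubstitution accuracy: {lda.score(X, y):.2f}")
```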

  3. The thresholds for statistical and clinical significance – a five-step procedure for evaluation of intervention effects in randomised clinical trials

    PubMed Central

    2014-01-01

    Background: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore, assessment of intervention effects in randomised clinical trials deserves more rigour in order to become more valid. Methods: Several methodologies for assessing the statistical and clinical significance of intervention effects in randomised clinical trials were considered. Balancing simplicity and comprehensiveness, a simple five-step procedure was developed. Results: For a more valid assessment of results from a randomised clinical trial we propose the following five steps: (1) report the confidence intervals and the exact P-values; (2) report the Bayes factor for the primary outcome, being the ratio of the probability that a given trial result is compatible with a ‘null’ effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance threshold if the trial is stopped early or if interim analyses have been conducted; (4) adjust the confidence intervals and the P-values for multiplicity due to the number of outcome comparisons; and (5) assess clinical significance of the trial results. Conclusions: If the proposed five-step procedure is followed, this may increase the validity of assessments of intervention effects in randomised clinical trials. PMID:24588900
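
    Step (2) can be sketched with a normal approximation: the Bayes factor described above is the likelihood of the observed effect under a ‘null’ effect divided by its likelihood under the effect hypothesised in the sample-size calculation. The numbers below are invented for illustration.

```python
# Step (2) as a normal-approximation sketch; the observed effect, its standard
# error, and the design effect are illustrative numbers, not from any trial.
from scipy.stats import norm

observed_effect = 0.25   # e.g. estimated log odds ratio (assumed)
standard_error = 0.12    # its standard error (assumed)
design_effect = 0.40     # effect hypothesised in the sample-size calculation

like_null = norm.pdf(observed_effect, loc=0.0, scale=standard_error)
like_design = norm.pdf(observed_effect, loc=design_effect, scale=standard_error)

bayes_factor = like_null / like_design   # < 1 favours the hypothesised effect
print(f"Bayes factor (null vs. design effect): {bayes_factor:.2f}")
```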

  4. Reporting of statistically significant results at ClinicalTrials.gov for completed superiority randomized controlled trials.

    PubMed

    Dechartres, Agnes; Bond, Elizabeth G; Scheer, Jordan; Riveros, Carolina; Atal, Ignacio; Ravaud, Philippe

    2016-11-30

    Publication bias and other reporting bias have been well documented for journal articles, but no study has evaluated the nature of results posted at ClinicalTrials.gov. We aimed to assess how many randomized controlled trials (RCTs) with results posted at ClinicalTrials.gov report statistically significant results and whether the proportion of trials with significant results differs when no treatment effect estimate or p-value is posted. We searched ClinicalTrials.gov in June 2015 for all studies with results posted. We included completed RCTs with a superiority hypothesis and considered results for the first primary outcome with results posted. For each trial, we assessed whether a treatment effect estimate and/or p-value was reported at ClinicalTrials.gov and if yes, whether results were statistically significant. If no treatment effect estimate or p-value was reported, we calculated the treatment effect and corresponding p-value using results per arm posted at ClinicalTrials.gov when sufficient data were reported. From the 17,536 studies with results posted at ClinicalTrials.gov, we identified 2823 completed phase 3 or 4 randomized trials with a superiority hypothesis. Of these, 1400 (50%) reported a treatment effect estimate and/or p-value. Results were statistically significant for 844 trials (60%), with a median p-value of 0.01 (Q1-Q3: 0.001-0.26). For the 1423 trials with no treatment effect estimate or p-value posted, we could calculate the treatment effect and corresponding p-value using results reported per arm for 929 (65%). For 494 trials (35%), p-values could not be calculated mainly because of insufficient reporting, censored data, or repeated measurements over time. For the 929 trials we could calculate p-values, we found statistically significant results for 342 (37%), with a median p-value of 0.19 (Q1-Q3: 0.005-0.59). Half of the trials with results posted at ClinicalTrials.gov reported a treatment effect estimate and/or p-value, with significant results for 60% of these. p-values could be calculated from results reported per arm at ClinicalTrials.gov for only 65% of the other trials. The proportion of significant results was much lower for these trials, which suggests a selective posting of treatment effect estimates and/or p-values when results are statistically significant.
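
    The per-arm recalculation described above can be illustrated for a binary outcome with a two-proportion z-test. This is a minimal sketch with invented counts; as the abstract notes, posted results may instead require other effect measures, censoring-aware methods, or repeated-measures models.

```python
# Recomputing a treatment effect and p-value from per-arm counts for a binary
# outcome; the counts are invented for illustration.
import math
from scipy.stats import norm

events_a, n_a = 45, 150   # events / participants, experimental arm (assumed)
events_b, n_b = 30, 150   # events / participants, control arm (assumed)

p_a, p_b = events_a / n_a, events_b / n_b
risk_difference = p_a - p_b

pooled = (events_a + events_b) / (n_a + n_b)
se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
z = risk_difference / se
p_value = 2 * norm.sf(abs(z))

print(f"risk difference = {risk_difference:.3f}, p = {p_value:.3f}")
```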

  5. Power-up: A Reanalysis of 'Power Failure' in Neuroscience Using Mixture Modeling.

    PubMed

    Nord, Camilla L; Valton, Vincent; Wood, John; Roiser, Jonathan P

    2017-08-23

    Recently, evidence for endemically low statistical power has cast neuroscience findings into doubt. If low statistical power plagues neuroscience, then this reduces confidence in the reported effects. However, if statistical power is not uniformly low, then such blanket mistrust might not be warranted. Here, we provide a different perspective on this issue, analyzing data from an influential study reporting a median power of 21% across 49 meta-analyses (Button et al., 2013). We demonstrate, using Gaussian mixture modeling, that the sample of 730 studies included in that analysis comprises several subcomponents so the use of a single summary statistic is insufficient to characterize the nature of the distribution. We find that statistical power is extremely low for studies included in meta-analyses that reported a null result and that it varies substantially across subfields of neuroscience, with particularly low power in candidate gene association studies. Therefore, whereas power in neuroscience remains a critical issue, the notion that studies are systematically underpowered is not the full story: low power is far from a universal problem. SIGNIFICANCE STATEMENT Recently, researchers across the biomedical and psychological sciences have become concerned with the reliability of results. One marker for reliability is statistical power: the probability of finding a statistically significant result given that the effect exists. Previous evidence suggests that statistical power is low across the field of neuroscience. Our results present a more comprehensive picture of statistical power in neuroscience: on average, studies are indeed underpowered-some very seriously so-but many studies show acceptable or even exemplary statistical power. We show that this heterogeneity in statistical power is common across most subfields in neuroscience. This new, more nuanced picture of statistical power in neuroscience could affect not only scientific understanding, but potentially policy and funding decisions for neuroscience research. Copyright © 2017 Nord, Valton et al.
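
    The mixture-modeling step can be sketched by fitting Gaussian mixtures with increasing numbers of components to study-level power estimates and comparing the fits by BIC. The power values below are simulated from two overlapping groups; they are not the 730 studies reanalysed above.

```python
# Mixture modeling of study-level power values, as a sketch on simulated data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
low_power = rng.beta(2, 8, size=500)     # a low-power subcomponent
high_power = rng.beta(8, 2, size=230)    # a well-powered subcomponent
power = np.concatenate([low_power, high_power]).reshape(-1, 1)

for k in (1, 2, 3):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(power)
    print(f"k = {k}: BIC = {gmm.bic(power):.1f}")
# A lower BIC for k > 1 indicates that a single summary statistic (one
# component) under-describes the distribution of power across studies.
```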

  6. Effects of illness and disability on job separation.

    PubMed

    Magee, William

    2004-03-01

    Effects of illness and disability on job separation result from both voluntary and involuntary processes. Voluntary processes range from the reasoned actions of workers who weigh illness and disability in their decision-making, to reactive stress-avoidance responses. Involuntary processes include employer discrimination against ill or disabled workers. Analyses of the effects of illness and disability that differentiate reasons for job separation can illuminate the processes involved. This paper reports on an evaluation of effects of illness and disability on job separation predicted by theories of reasoned action, stress, and employer discrimination against ill and disabled workers. Effects of four illness/disability conditions on the rate of job separation for 12 reasons are estimated using data from a longitudinal study of a representative sample of the Canadian population-the Survey of Labour and Income Dynamics (SLID). Two of the four effects that are statistically significant (under conservative Bayesian criteria for statistical significance) are consistent with the idea that workers weigh illness and disability as costs, and calculate the costs and benefits of continuing to work with an illness or disability: (1) disabling illness increases the hazard of leaving a job in order to engage in caregiving, and (2) work-related disability increases the hazard of leaving a job due to poor pay. The other two significant effects indicate that: (3) disabling illness decreases the hazard of layoff, and (4) non-work disability increases the hazard of leaving one job to take a different job. This last effect is consistent with a stress-interruption process. Other effects are statistically significant under conventional criteria for statistical significance, and most of these effects are also consistent with cost-benefit and stress theories. Some effects of illness and disability are sex and age-specific, and reasons for the specificity of these effects are discussed.

  7. What Should Researchers Expect When They Replicate Studies? A Statistical View of Replicability in Psychological Science.

    PubMed

    Patil, Prasad; Peng, Roger D; Leek, Jeffrey T

    2016-07-01

    A recent study of the replicability of key psychological findings is a major contribution toward understanding the human side of the scientific process. Despite the careful and nuanced analysis reported, the simple narrative disseminated by the mass, social, and scientific media was that in only 36% of the studies were the original results replicated. In the current study, however, we showed that 77% of the replication effect sizes reported were within a 95% prediction interval calculated using the original effect size. Our analysis suggests two critical issues in understanding replication of psychological studies. First, researchers' intuitive expectations for what a replication should show do not always match with statistical estimates of replication. Second, when the results of original studies are very imprecise, they create wide prediction intervals-and a broad range of replication effects that are consistent with the original estimates. This may lead to effects that replicate successfully, in that replication results are consistent with statistical expectations, but do not provide much information about the size (or existence) of the true effect. In this light, the results of the Reproducibility Project: Psychology can be viewed as statistically consistent with what one might expect when performing a large-scale replication experiment. © The Author(s) 2016.
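
    One common normal-approximation formulation of such a prediction interval (not necessarily the exact procedure used in the paper) combines the standard errors of the original and replication estimates. The effect sizes and standard errors below are invented.

```python
# A 95% prediction interval for a replication effect, using a common
# normal-approximation formulation; all numbers are assumed values.
import math
from scipy.stats import norm

d_original, se_original = 0.45, 0.15   # original study (assumed)
se_replication = 0.12                  # replication's standard error (assumed)

half_width = norm.ppf(0.975) * math.sqrt(se_original**2 + se_replication**2)
lower, upper = d_original - half_width, d_original + half_width
print(f"95% prediction interval for the replication: ({lower:.2f}, {upper:.2f})")

d_replication = 0.20                   # observed replication effect (assumed)
print("consistent with original:", lower <= d_replication <= upper)
```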

  8. Transportation statistics annual report, 2012

    DOT National Transportation Integrated Search

    2013-01-01

    The Transportation Statistics Annual Report describes the Nation's transportation system, the system's performance, its contributions to the economy, and its effects on the environment. This 17th edition of the report, covering 2011 and...

  9. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    PubMed

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages-SAS GLIMMIX Laplace and SuperMix Gaussian quadrature-perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
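
    The model family being compared across packages can be made concrete by simulating from a two-level random-intercept logistic regression. Fitting (via penalized quasi-likelihood, Laplace, or Gauss-Hermite quadrature) is left to the packages discussed above, so the sketch below only generates data with the assumed structure; cluster count and variances are invented.

```python
# Simulating data from a two-level mixed-effects logistic regression, the model
# family compared across packages above; all parameter values are assumed.
import numpy as np

rng = np.random.default_rng(4)
n_clusters, n_per_cluster = 100, 30
sigma_u = 1.0                                   # SD of the cluster random intercept

cluster = np.repeat(np.arange(n_clusters), n_per_cluster)
u = rng.normal(0.0, sigma_u, size=n_clusters)   # random intercepts
x = rng.normal(size=cluster.size)               # an individual-level covariate

logit = -0.5 + 0.8 * x + u[cluster]             # fixed effects + random intercept
p = 1.0 / (1.0 + np.exp(-logit))
y = rng.binomial(1, p)

print("overall event rate:", y.mean().round(3))
print("between-cluster SD of observed rates:",
      np.std([y[cluster == c].mean() for c in range(n_clusters)]).round(3))
```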

  10. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    PubMed Central

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415

  11. Nursing students' attitudes toward statistics: Effect of a biostatistics course and association with examination performance.

    PubMed

    Kiekkas, Panagiotis; Panagiotarou, Aliki; Malja, Alvaro; Tahirai, Daniela; Zykai, Rountina; Bakalis, Nick; Stefanopoulos, Nikolaos

    2015-12-01

    Although statistical knowledge and skills are necessary for promoting evidence-based practice, health sciences students have expressed anxiety about statistics courses, which may hinder their learning of statistical concepts. To evaluate the effects of a biostatistics course on nursing students' attitudes toward statistics and to explore the association between these attitudes and their performance in the course examination. One-group quasi-experimental pre-test/post-test design. Undergraduate nursing students of the fifth or higher semester of studies, who attended a biostatistics course. Participants were asked to complete the pre-test and post-test forms of The Survey of Attitudes Toward Statistics (SATS)-36 scale at the beginning and end of the course respectively. Pre-test and post-test scale scores were compared, while correlations between post-test scores and participants' examination performance were estimated. Among 156 participants, post-test scores of the overall SATS-36 scale and of the Affect, Cognitive Competence, Interest and Effort components were significantly higher than pre-test ones, indicating that the course was followed by more positive attitudes toward statistics. Among 104 students who participated in the examination, higher post-test scores of the overall SATS-36 scale and of the Affect, Difficulty, Interest and Effort components were significantly but weakly correlated with higher examination performance. Students' attitudes toward statistics can be improved through appropriate biostatistics courses, while positive attitudes contribute to higher course achievements and possibly to improved statistical skills in later professional life. Copyright © 2015 Elsevier Ltd. All rights reserved.
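
    Analytically, the two comparisons reported correspond to a paired pre/post test and a correlation between post-course attitudes and examination marks. The sketch below uses simulated scores, not the study's data, and ignores the SATS-36 subscale structure.

```python
# Paired pre/post comparison and attitude-performance correlation, as a sketch
# on simulated scores; sample size and score distributions are invented.
import numpy as np
from scipy.stats import ttest_rel, pearsonr

rng = np.random.default_rng(5)
n = 156
pre = rng.normal(4.2, 0.8, size=n)          # pre-course attitude-style scores
post = pre + rng.normal(0.3, 0.5, size=n)   # modest average improvement

t_stat, p_paired = ttest_rel(post, pre)
print(f"pre vs post: t = {t_stat:.2f}, p = {p_paired:.3g}")

exam = 50 + 6 * (post - post.mean()) + rng.normal(0, 10, size=n)
r, p_corr = pearsonr(post, exam)
print(f"post-course attitudes vs exam: r = {r:.2f}, p = {p_corr:.3g}")
```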

  12. Confidence Intervals for the Between-Study Variance in Random Effects Meta-Analysis Using Generalised Cochran Heterogeneity Statistics

    ERIC Educational Resources Information Center

    Jackson, Dan

    2013-01-01

    Statistical inference is problematic in the common situation in meta-analysis where the random effects model is fitted to just a handful of studies. In particular, the asymptotic theory of maximum likelihood provides a poor approximation, and Bayesian methods are sensitive to the prior specification. Hence, less efficient, but easily computed and…

  13. A hierarchical fire frequency model to simulate temporal patterns of fire regimes in LANDIS

    Treesearch

    Jian Yang; Hong S. He; Eric J. Gustafson

    2004-01-01

    Fire disturbance has important ecological effects in many forest landscapes. Existing statistically based approaches can be used to examine the effects of a fire regime on forest landscape dynamics. Most examples of statistically based fire models divide a fire occurrence into two stages--fire ignition and fire initiation. However, the exponential and Weibull fire-...

  14. [Medical nutrition in Alzheimer's: the trials].

    PubMed

    Scheltens, Philip; Twisk, Jos W R

    2013-01-01

    We describe the small but statistically significant effects of the medical nutrition diet 'Souvenaid' on memory in early Alzheimer's disease in two published randomised clinical trials. We specifically discuss the design and statistical approach, which were predefined and meet current standards in the field. Further research is needed to substantiate the long term effects and learn more about the mode of action of Souvenaid.

  15. The statistical reason why some researchers say some silvicultural treatments "wash-out" over time

    Treesearch

    David B. South; Curtis L. VanderSchaaf

    2006-01-01

    The initial effects of a silvicultural treatment on height or volume growth sometimes decline over time, and the early gains eventually disappear with very long rotations. However, in some reports initial gains are maintained until harvest but due to statistical analyses, a researcher might conclude the treatment effect has "washed-out" by ages 10 to 18 years...

  16. Method for simulating atmospheric turbulence phase effects for multiple time slices and anisoplanatic conditions.

    PubMed

    Roggemann, M C; Welsh, B M; Montera, D; Rhoadarmer, T A

    1995-07-10

    Simulating the effects of atmospheric turbulence on optical imaging systems is an important aspect of understanding the performance of these systems. Simulations are particularly important for understanding the statistics of some adaptive-optics system performance measures, such as the mean and variance of the compensated optical transfer function, and for understanding the statistics of estimators used to reconstruct intensity distributions from turbulence-corrupted image measurements. Current methods of simulating the performance of these systems typically make use of random phase screens placed in the system pupil. Methods exist for making random draws of phase screens that have the correct spatial statistics. However, simulating temporal effects and anisoplanatism requires one or more phase screens at different distances from the aperture, possibly moving with different velocities. We describe and demonstrate a method for creating random draws of phase screens with the correct space-time statistics for arbitrary turbulence and wind-velocity profiles, which can be placed in the telescope pupil in simulations. Results are provided for both the von Kármán and the Kolmogorov turbulence spectra. We also show how to simulate anisoplanatic effects with this technique.
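
    The core of phase-screen generation is spectral filtering of white noise by the square root of the assumed turbulence power spectrum. Below is a minimal single-screen Kolmogorov sketch; the normalization is only approximate, and the paper's distinguishing features (correct space-time statistics, multiple moving screens for anisoplanatism, von Kármán outer scale) are omitted.

```python
# A single Kolmogorov phase screen generated by spectrally filtering complex
# white noise; scaling is approximate and temporal/anisoplanatic effects
# (multiple moving screens) are omitted for brevity.
import numpy as np

N, dx, r0 = 256, 0.02, 0.1           # grid size, pixel size (m), Fried parameter (m)

fx = np.fft.fftfreq(N, d=dx)
kx, ky = np.meshgrid(fx, fx)
k = np.sqrt(kx**2 + ky**2)
k[0, 0] = 1e-6                       # avoid the singularity at zero frequency

# Kolmogorov phase power spectral density, piston term suppressed.
psd = 0.023 * r0**(-5.0 / 3.0) * k**(-11.0 / 3.0)
psd[0, 0] = 0.0

rng = np.random.default_rng(6)
noise = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
delta_k = 1.0 / (N * dx)
screen = np.fft.ifft2(noise * np.sqrt(psd) * delta_k) * N**2
phase = np.real(screen)              # phase screen in radians (approximate scaling)

print("screen RMS phase (rad):", np.std(phase).round(2))
```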

  17. Midweek Intensification of Rain in the U.S.: Does Air Pollution Invigorate Storms?

    NASA Technical Reports Server (NTRS)

    Bell, T. L.; Rosenfeld, D.; Hahnenberger, M.

    2005-01-01

    The effect of pollution on rainfall has been observed to depend on both the type of pollution and the precipitating environment. The climatological consequences of pollution for rainfall are uncertain. In some urban areas, pollution varies with the day of the week because of weekly variations in human activity, in effect providing a repeated experiment on the effects of pollution. Weekly variations in temperature, pressure, cloud characteristics, hail and lightning are observed in many areas. Observing a weekly cycle in rainfall statistics has proven to be more difficult, although there is some evidence for it. Here we examine rainfall statistics from the Tropical Rainfall Measuring Mission (TRMM) satellite over the southern U.S. and adjacent waters, and find that there is a distinct, statistically significant weekly cycle in summertime rainfall over the southeast U.S., as well as weekly variations in rainfall over the nearby Atlantic and the Gulf of Mexico. Rainfall over land peaks in the middle of the week, suggesting that summer rainfall on large scales may increase as pollution levels rise. Both rain statistics over land and what appear to be compensating effects over adjacent seas support the suggestion that air pollution invigorates convection and outflow aloft.

  18. A Didactic Experience of Statistical Analysis for the Determination of Glycine in a Nonaqueous Medium Using ANOVA and a Computer Program

    ERIC Educational Resources Information Center

    Santos-Delgado, M. J.; Larrea-Tarruella, L.

    2004-01-01

    The back-titration methods are compared statistically to determine glycine in a nonaqueous medium of acetic acid. Important variations in the mean values of glycine due to interaction effects are examined with the analysis of variance (ANOVA) technique and a statistical study using a computer program.
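
    The interaction effects mentioned can be examined with a two-way ANOVA. The sketch below uses invented back-titration results and statsmodels rather than the computer program described in the article; the factor names (method, analyst) are assumptions made for illustration.

```python
# Two-way ANOVA with interaction for back-titration results, as a sketch; the
# factors and glycine percentages are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(7)
methods = np.repeat(["direct", "back_titration"], 12)
analysts = np.tile(np.repeat(["A", "B"], 6), 2)
glycine = 75.0 + rng.normal(0, 0.4, size=24)
glycine[(methods == "back_titration") & (analysts == "B")] += 0.8  # interaction

df = pd.DataFrame({"glycine": glycine, "method": methods, "analyst": analysts})
model = ols("glycine ~ C(method) * C(analyst)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```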

  19. Reasoning, Not Recipes: Helping Your Students Develop Statistical Understanding and Enjoy the Experience!

    ERIC Educational Resources Information Center

    Mooney, Gai

    2010-01-01

    Statistics is often presented to students as a series of algorithms to be learnt by heart and applied at the appropriate time to get "the correct answer". This approach, while it may in fact produce the right answer, has been shown to be minimally effective at helping students understand the underlying statistical concepts. As Holmes noted,…

  20. Effects of Matching Multiple Memory Strategies with Computer-Assisted Instruction on Students' Statistics Learning Achievement

    ERIC Educational Resources Information Center

    Liao, Ying; Lin, Wen-He

    2016-01-01

    In the era when digitalization is pursued, numbers are the major medium of information performance and statistics is the primary instrument to interpret and analyze numerical information. For this reason, the cultivation of fundamental statistical literacy should be a key in the learning area of mathematics at the stage of compulsory education.…
