Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall
2016-01-01
Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to the analysis of medical time series data: (1) the classical statistical approach, represented by the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center, over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches (the Kaplan-Meier estimator, the Cox model, and the dynamic Bayesian network). However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) offers individualized risk assessment, which is more cumbersome to obtain with classical statistical approaches.
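To make the classical side of such a comparison concrete, here is a minimal sketch in Python, assuming the lifelines package; the screening data frame and its columns are hypothetical stand-ins, not the Magee-Womens data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical screening records: follow-up time in months, event indicator
# (1 = cancer diagnosed during follow-up), and one binary covariate.
df = pd.DataFrame({
    "months":   [12, 30, 45, 60, 22, 80, 51, 33, 70, 15],
    "event":    [1, 0, 1, 0, 1, 0, 0, 1, 0, 1],
    "abnormal": [1, 0, 1, 0, 1, 0, 1, 1, 0, 1],
})

# Kaplan-Meier: nonparametric estimate of the cancer-free survival curve.
km = KaplanMeierFitter()
km.fit(durations=df["months"], event_observed=df["event"])
print(km.survival_function_)

# Cox proportional hazards: hazard ratio associated with the covariate.
cox = CoxPHFitter()
cox.fit(df, duration_col="months", event_col="event")
cox.print_summary()
```

The dynamic Bayesian network side of the comparison would instead be built in a graphical-modeling tool and queried for individualized risk, which is the flexibility the abstract emphasizes.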
The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes
ERIC Educational Resources Information Center
Cartier, Stephen F.
2011-01-01
A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
Classical Statistics and Statistical Learning in Imaging Neuroscience
Bzdok, Danilo
2017-01-01
Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using the t-test and ANOVA. In recent years, statistical learning methods have enjoyed increasing popularity, especially for applications to rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It retraces how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce the current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques. PMID:29056896
Georges, Patrick
2017-01-01
This paper proposes a statistical analysis that captures similarities and differences between classical music composers with the eventual aim of understanding why particular composers 'sound' different even if their 'lineages' (influences network) are similar, or why they 'sound' alike if their 'lineages' are different. In order to do this, we use statistical methods and measures of association or similarity (based on presence/absence of traits such as specific 'ecological' characteristics and personal musical influences) that have been developed in biosystematics, scientometrics, and bibliographic coupling. This paper also represents a first step towards a more ambitious goal of developing an evolutionary model of Western classical music.
Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.
Verde, Pablo E
2010-12-30
In recent decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection to be performed. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
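The paper implements its models in WinBUGS and R; purely to illustrate the core bivariate random-effects structure, here is a hedged sketch in Python with PyMC, using hypothetical study counts (study-level logit-sensitivities and logit-specificities share a common bivariate normal distribution).

```python
import numpy as np
import pymc as pm

# Hypothetical per-study counts: true positives among the diseased,
# true negatives among the healthy.
tp = np.array([20, 15, 40, 12]); diseased = np.array([25, 20, 50, 15])
tn = np.array([70, 50, 90, 30]); healthy = np.array([80, 60, 100, 35])

with pm.Model():
    # Population means of logit-sensitivity and logit-specificity.
    mu = pm.Normal("mu", 0.0, 2.0, shape=2)
    # Between-study covariance through an LKJ prior on its Cholesky factor.
    chol, corr, sds = pm.LKJCholeskyCov("chol", n=2, eta=2.0,
                                        sd_dist=pm.Exponential.dist(1.0))
    # Study-specific (sensitivity, specificity) pairs on the logit scale.
    theta = pm.MvNormal("theta", mu=mu, chol=chol, shape=(len(tp), 2))
    pm.Binomial("y_sens", n=diseased, p=pm.math.invlogit(theta[:, 0]),
                observed=tp)
    pm.Binomial("y_spec", n=healthy, p=pm.math.invlogit(theta[:, 1]),
                observed=tn)
    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```

The predictive posterior distributions for sensitivity and specificity that summarize the meta-analysis would then be obtained by drawing a new study effect from the fitted bivariate normal.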
Data Analysis Techniques for Physical Scientists
NASA Astrophysics Data System (ADS)
Pruneau, Claude A.
2017-10-01
Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.
The Shock and Vibration Bulletin. Part 2. Invited Papers, Structural Dynamics
1974-08-01
Contents include: VIKING LANDER DYNAMICS (Joseph C. Pohlen, Martin Marietta Aerospace, Denver, Colorado) and, under Structural Dynamics, PERFORMANCE OF STATISTICAL ENERGY ANALYSIS. Analytical prediction of the high-frequency vibration environments of aerospace structures is beyond the current scope of classical modal techniques. Statistical energy analysis methods have been developed that circumvent the difficulties of high-frequency modal analysis. These statistical energy analysis methods are evaluated.
Davis, J.C.
2000-01-01
Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
A Review of Classical Methods of Item Analysis.
ERIC Educational Resources Information Center
French, Christine L.
Item analysis is a very important consideration in the test development process. It is a statistical procedure to analyze test items that combines methods used to evaluate the important characteristics of test items, such as difficulty, discrimination, and distractibility of the items in a test. This paper reviews some of the classical methods for…
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e., an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. Unlike other Rotation Techniques (RT), this rotation uses no localization criterion; only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution appears able to overcome the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
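A small simulation in the spirit of that experiment, assuming scikit-learn (the sources and mixing matrix below are invented for illustration): PCA decorrelates but still mixes the two non-Gaussian sources, while FastICA, which begins by whitening the data (a PCA step), rotates the solution onto the sources.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
n = 2000
# Two independent non-Gaussian sources: heavy-tailed and sub-Gaussian.
sources = np.column_stack([rng.laplace(size=n), rng.uniform(-1, 1, size=n)])
mixing = np.array([[1.0, 0.6], [0.4, 1.0]])  # the "linear sum" of phenomena
X = sources @ mixing.T

pca_comp = PCA(n_components=2).fit_transform(X)
ica_comp = FastICA(n_components=2, whiten="unit-variance",
                   random_state=0).fit_transform(X)

# Correlate recovered components with the true sources: the ICA rotation
# concentrates each component on one source, where PCA mixes both.
for name, comp in [("PCA", pca_comp), ("ICA", ica_comp)]:
    c = np.corrcoef(np.column_stack([comp, sources]).T)[:2, 2:]
    print(name, np.round(np.abs(c), 2))
```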
Bayesian Statistics for Biological Data: Pedigree Analysis
ERIC Educational Resources Information Center
Stanfield, William D.; Carlton, Matthew A.
2004-01-01
Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayesian and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college biology students can thereby be introduced to Bayesian statistics.
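As a worked instance of the kind of calculation involved (the pedigree here is a generic textbook case, not necessarily the one in the article): for an unaffected offspring of two heterozygous (Aa) carriers, naive reasoning gives a carrier probability of 1/2 among all offspring, while conditioning on the unaffected phenotype with Bayes' formula gives 2/3.

```python
# Prior genotype probabilities for an offspring of an Aa x Aa cross,
# and the likelihood of being unaffected (recessive trait) per genotype.
prior = {"AA": 0.25, "Aa": 0.50, "aa": 0.25}
likelihood_unaffected = {"AA": 1.0, "Aa": 1.0, "aa": 0.0}

# Bayes' formula: posterior proportional to prior times likelihood.
evidence = sum(prior[g] * likelihood_unaffected[g] for g in prior)
posterior = {g: prior[g] * likelihood_unaffected[g] / evidence for g in prior}
print(posterior)  # {'AA': 1/3, 'Aa': 2/3, 'aa': 0.0}
```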
Probability and Statistics: A Prelude.
ERIC Educational Resources Information Center
Goodman, A. F.; Blischke, W. R.
Probability and statistics have become indispensable to scientific, technical, and management progress. They serve as essential dialects of mathematics, the classical language of science, and as instruments necessary for intelligent generation and analysis of information. A prelude to probability and statistics is presented by examination of the…
ERIC Educational Resources Information Center
Mauriello, David
1984-01-01
Reviews an interactive statistical analysis package (designed to run on 8- and 16-bit machines that utilize CP/M 80 and MS-DOS operating systems), considering its features and uses, documentation, operation, and performance. The package consists of 40 general purpose statistical procedures derived from the classic textbook "Statistical…
Asymptotic Linear Spectral Statistics for Spiked Hermitian Random Matrices
NASA Astrophysics Data System (ADS)
Passemier, Damien; McKay, Matthew R.; Chen, Yang
2015-07-01
Using the Coulomb Fluid method, this paper derives central limit theorems (CLTs) for linear spectral statistics of three "spiked" Hermitian random matrix ensembles. These include Johnstone's spiked model (i.e., central Wishart with spiked correlation), non-central Wishart with rank-one non-centrality, and a related class of non-central matrices. For a generic linear statistic, we derive simple and explicit CLT expressions as the matrix dimensions grow large. For all three ensembles under consideration, we find that the primary effect of the spike is to introduce a correction term to the asymptotic mean of the linear spectral statistic, which we characterize with simple formulas. The utility of our proposed framework is demonstrated through application to three different linear statistics problems: the classical likelihood ratio test for a population covariance, the capacity analysis of multi-antenna wireless communication systems with a line-of-sight transmission path, and a classical multiple sample significance testing problem.
Frequent statistics of link-layer bit stream data based on AC-IM algorithm
NASA Astrophysics Data System (ADS)
Cao, Chenghong; Lei, Yingke; Xu, Yiming
2017-08-01
At present, there is much research on data processing using classical pattern matching and its improved algorithms, but little on frequent-pattern statistics over link-layer bit stream data. Because classical multi-pattern matching algorithms such as the Aho-Corasick (AC) algorithm have high computational complexity and low efficiency and cannot be applied directly to binary bit stream data, this paper adopts a frequent-statistics method for link-layer bit stream data based on the AC-IM algorithm, whose maximum jump distance in the pattern tree is the length of the shortest pattern string plus 3, with no matches missed. Theoretical analysis of the algorithm's construction is presented first; the experimental results then show that the algorithm adapts to the binary bit stream data environment and extracts frequent sequences accurately, with an obvious effect. Moreover, compared with the classical AC algorithm and other improved algorithms, the AC-IM algorithm achieves a greater maximum jump distance and is less time-consuming.
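The AC-IM variant itself is specific to this paper, but the classical Aho-Corasick baseline it improves on is standard; a minimal sketch over a binary alphabet follows (bits as '0'/'1' characters for simplicity; a real link-layer implementation would operate on raw bits).

```python
from collections import deque

def build_ac(patterns):
    """Classical Aho-Corasick automaton: trie + failure links + outputs."""
    goto, fail, out = [{}], [0], [set()]
    for p in patterns:                       # build the trie (goto function)
        s = 0
        for ch in p:
            if ch not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        out[s].add(p)
    q = deque(goto[0].values())              # BFS to fill failure links
    while q:
        s = q.popleft()
        for ch, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]           # inherit suffix matches
    return goto, fail, out

def count_matches(bits, automaton):
    """Count occurrences of every pattern in one pass over the bit stream."""
    goto, fail, out = automaton
    counts, s = {}, 0
    for ch in bits:
        while s and ch not in goto[s]:
            s = fail[s]
        s = goto[s].get(ch, 0)
        for p in out[s]:
            counts[p] = counts.get(p, 0) + 1
    return counts

ac = build_ac(["101", "0110", "11"])
print(count_matches("0110101101110", ac))  # {'11': 4, '0110': 2, '101': 3}
```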
Statistical Methodology for the Analysis of Repeated Duration Data in Behavioral Studies
ERIC Educational Resources Information Center
Letué, Frédérique; Martinez, Marie-José; Samson, Adeline; Vilain, Anne; Vilain, Coriandre
2018-01-01
Purpose: Repeated duration data are frequently used in behavioral studies. Classical linear or log-linear mixed models are often inadequate to analyze such data, because they usually consist of nonnegative and skew-distributed variables. Therefore, we recommend use of a statistical methodology specific to duration data. Method: We propose a…
Accessible Information Without Disturbing Partially Known Quantum States on a von Neumann Algebra
NASA Astrophysics Data System (ADS)
Kuramochi, Yui
2018-04-01
This paper addresses the problem of how much information we can extract without disturbing a statistical experiment, which is a family of partially known normal states on a von Neumann algebra. We define the classical part of a statistical experiment as the restriction of the equivalent minimal sufficient statistical experiment to the center of the outcome space, which, in the case of density operators on a Hilbert space, corresponds to the classical probability distributions appearing in the maximal decomposition by Koashi and Imoto (Phys. Rev. A 66, 022318, 2002). We show that we can access by a Schwarz or completely positive channel at most the classical part of a statistical experiment if we do not disturb the states. We apply this result to the broadcasting problem of a statistical experiment. We also show that the classical part of the direct product of statistical experiments is the direct product of the classical parts of the statistical experiments. The proof of the latter result is based on the theorem that the direct product of minimal sufficient statistical experiments is also minimal sufficient.
Statistical mechanics based on fractional classical and quantum mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. At the first stage, we present the thermodynamical properties of the classical ideal gas and a system of N classical oscillators. In both cases, the Hamiltonian contains fractional exponents of the phase space variables (position and momentum). At the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamical properties of black-body radiation and study Bose-Einstein statistics, with the related problem of condensation, and Fermi-Dirac statistics.
Quintela-del-Río, Alejandro; Francisco-Fernández, Mario
2011-02-01
The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists of fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
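For reference, the classical parametric baseline that the nonparametric estimators are compared against can be sketched in a few lines, assuming SciPy; the annual maxima below are synthetic, not AURN data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic annual maximum ozone values (arbitrary units).
annual_max_ozone = stats.genextreme.rvs(c=-0.1, loc=80, scale=10,
                                        size=40, random_state=rng)

# Maximum-likelihood GEV fit; note SciPy's shape convention c = -xi.
c, loc, scale = stats.genextreme.fit(annual_max_ozone)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
T = 50
return_level = stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
print(f"{T}-year return level: {return_level:.1f}")
```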
Characterizing chaotic melodies in automatic music composition
NASA Astrophysics Data System (ADS)
Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang
2010-09-01
In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2000-01-01
The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that the Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
ERIC Educational Resources Information Center
Shirley, Dennis
1986-01-01
Makes accessible Bourdieu's comprehensive and systematic sociology of French education, which integrates classical sociological theory and statistical analysis. Isolates and explicates key terminology, links these concepts together, and critiques the work from the perspective of the philosophy of praxis. (LHW)
Little, Max A.; Costello, Declan A. E.; Harries, Meredydd L.
2010-01-01
Clinical acoustic voice-recording analysis is usually performed using classical perturbation measures, including jitter, shimmer, and noise-to-harmonic ratios (NHRs). However, restrictive mathematical limitations of these measures prevent analysis for severely dysphonic voices. Previous studies of alternative nonlinear random measures addressed wide varieties of vocal pathologies. Here, we analyze a single vocal pathology cohort, testing the performance of these alternative measures alongside classical measures. We present voice analysis pre- and postoperatively in 17 patients with unilateral vocal fold paralysis (UVFP). The patients underwent standard medialization thyroplasty surgery, and the voices were analyzed using jitter, shimmer, NHR, nonlinear recurrence period density entropy (RPDE), detrended fluctuation analysis (DFA), and correlation dimension. In addition, we similarly analyzed 11 healthy controls. Systematizing the preanalysis editing of the recordings, we found that the novel measures were more stable and, hence, reliable than the classical measures on healthy controls. RPDE and jitter are sensitive to improvements pre- to postoperation. Shimmer, NHR, and DFA showed no significant change (P > 0.05). All measures detect statistically significant and clinically important differences between controls and patients, both treated and untreated (P < 0.001, area under curve [AUC] > 0.7). Pre- to postoperation grade, roughness, breathiness, asthenia, and strain (GRBAS) ratings show statistically significant and clinically important improvement in overall dysphonia grade (G) (AUC = 0.946, P < 0.001). Recalculating AUCs from other study data, we compare these results in terms of clinical importance. We conclude that, when preanalysis editing is systematized, nonlinear random measures may be useful for monitoring UVFP-treatment effectiveness, and there may be applications to other forms of dysphonia. PMID:19900790
DOE Office of Scientific and Technical Information (OSTI.GOV)
Öztürk, Hande; Noyan, I. Cevdet
2017-08-24
A rigorous study of sampling and intensity statistics applicable for a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviations for both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742-753] appears as a special case, limited to large crystallite sizes, here. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.
Quantum formalism for classical statistics
NASA Astrophysics Data System (ADS)
Wetterich, C.
2018-06-01
In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
[Bayesian statistics in medicine -- part II: main applications and inference].
Montomoli, C; Nichelatti, M
2008-01-01
Bayesian statistics is not only used when one is dealing with 2-way tables; it can also be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing their foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary processes at the basis of the analysis are compared to those of frequentist (classical) statistical analysis. Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
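The diagnostic analogy can be made explicit with a one-line application of Bayes' theorem; the sensitivity, specificity, and prevalence below are illustrative values only.

```python
# Posterior probability of disease given a positive test result.
sens, spec, prev = 0.90, 0.95, 0.02

p_pos = sens * prev + (1 - spec) * (1 - prev)      # total probability
p_disease_given_pos = sens * prev / p_pos          # Bayes' theorem
print(f"P(disease | positive test) = {p_disease_given_pos:.3f}")  # ~0.269
```

Even with a fairly accurate test, the low prevalence keeps the posterior modest, which is exactly the kind of prior-driven reasoning a clinician performs informally.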
ERIC Educational Resources Information Center
Trueman, Mark
1985-01-01
Critically reviews the influential study "Malnutrition and Environmental Enrichment" by Winick et al. (1975) and highlights what are considered to be statistical flaws in its analysis. Data in the classic study of height, weight, and IQ changes in three groups of adopted, malnourished Korean girls are reanalysed and conclusions…
Carvalho, Pedro; Marques, Rui Cunha
2016-02-15
This study aims to search for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of applying Bayesian statistics for inference in SFA over traditional SFA, which uses only classical statistics. The resulting Bayesian methods allow some problems that arise in the application of traditional SFA to be overcome, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nedic, Vladimir, E-mail: vnedic@kg.ac.rs; Despotovic, Danijela, E-mail: ddespotovic@kg.ac.rs; Cvetanovic, Slobodan, E-mail: slobodan.cvetanovic@eknfak.ni.ac.rs
2014-11-15
Traffic is the main source of noise in urban environments and significantly affects human mental and physical health and labor productivity. Therefore it is very important to model the noise produced by various vehicles. Techniques for traffic noise prediction are mainly based on regression analysis, which generally is not good enough to describe the trends of noise. In this paper the application of artificial neural networks (ANNs) for the prediction of traffic noise is presented. As input variables of the neural network, the proposed structure of the traffic flow and the average speed of the traffic flow are chosen. The output variable of the network is the equivalent noise level in the given time period Leq. Based on these parameters, the network is modeled, trained and tested through a comparative analysis of the calculated values and measured levels of traffic noise using the originally developed user friendly software package. It is shown that artificial neural networks can be a useful tool for the prediction of noise with sufficient accuracy. In addition, the measured values were also used to calculate the equivalent noise level by means of classical methods, and a comparative analysis is given. The results clearly show that the ANN approach is superior in traffic noise level prediction to any other statistical method. - Highlights: • We proposed an ANN model for prediction of traffic noise. • We developed an originally designed user friendly software package. • The results are compared with classical statistical methods. • The results show much better predictive capabilities of the ANN model.
Statistical benchmark for BosonSampling
NASA Astrophysics Data System (ADS)
Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas
2016-03-01
Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects.
Color stability of shade guides after autoclave sterilization.
Schmeling, Max; Sartori, Neimar; Monteiro, Sylvio; Baratieri, Luiz
2014-01-01
This study evaluated the influence of 120 autoclave sterilization cycles on the color stability of two commercial shade guides (Vita Classical and Vita System 3D-Master). The specimens were evaluated by spectrophotometer before and after the sterilization cycles. The color was described using the three-dimensional CIELab system. The statistical analysis was performed in three chromaticity coordinates, before and after sterilization cycles, using the paired samples t test. All specimens became darker after autoclave sterilization cycles. However, specimens of Vita Classical became redder, while those of the Vita System 3D-Master became more yellow. Repeated cycles of autoclave sterilization caused statistically significant changes in the color coordinates of the two shade guides. However, these differences are considered clinically acceptable.
On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics
NASA Astrophysics Data System (ADS)
Busch, Paul; Quadt, Ralf
1990-10-01
Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic support and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized to be applicable to a large representative class of classical statistical systems.
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2011-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741
Asymptotic modal analysis and statistical energy analysis
NASA Technical Reports Server (NTRS)
Dowell, Earl H.
1992-01-01
Asymptotic Modal Analysis (AMA) is a method which is used to model linear dynamical systems with many participating modes. The AMA method was originally developed to show the relationship between statistical energy analysis (SEA) and classical modal analysis (CMA). In the limit of a large number of modes of a vibrating system, the classical modal analysis result can be shown to be equivalent to the statistical energy analysis result. As the CMA result evolves into the SEA result, a number of systematic assumptions are made. Most of these assumptions are based upon the supposition that the number of modes approaches infinity. It is for this reason that the term 'asymptotic' is used. AMA is the asymptotic result of taking the limit of CMA as the number of modes approaches infinity. AMA refers to any of the intermediate results between CMA and SEA, as well as the SEA result which is derived from CMA. The main advantage of the AMA method is that individual modal characteristics are not required in the model or computations. By contrast, CMA requires that each modal parameter be evaluated at each frequency. In the latter, contributions from each mode are computed and the final answer is obtained by summing over all the modes in the particular band of interest. AMA evaluates modal parameters only at their center frequency and does not sum the individual contributions from each mode in order to obtain a final result. The method is similar to SEA in this respect. However, SEA is only capable of obtaining spatial averages or means, as it is a statistical method. Since AMA is systematically derived from CMA, it can obtain local spatial information as well.
Statistical analysis of 4 types of neck whiplash injuries based on classical meridian theory.
Chen, Yemeng; Zhao, Yan; Xue, Xiaolin; Li, Hui; Wu, Xiuyan; Zhang, Qunce; Zheng, Xin; Wang, Tianfang
2015-01-01
As one component of the Chinese medicine meridian system, the meridian sinew (Jingjin, (see text), tendino-musculo) is specially described as being for acupuncture treatment of the musculoskeletal system because of its dynamic attributes and tender point correlations. In recent decades, the therapeutic importance of the sinew meridian has been revalued in clinical application. Based on this theory, the authors have established therapeutic strategies of acupuncture treatment in Whiplash-Associated Disorders (WAD) by categorizing four types of neck symptom presentations. The advantage of this new system is to make it much easier for the clinician to find effective acupuncture points. This study attempts to prove the significance of the proposed therapeutic strategies by analyzing data collected from a clinical survey of various WAD using unsupervised statistical methods, such as correlation analysis, factor analysis, and cluster analysis. The clinical survey data have successfully verified discrete characteristics of the four neck syndromes, based upon the range of motion (ROM) and tender point location findings. A summary of the relationships among the symptoms of the four neck syndromes has shown statistically significant correlation coefficients (P < 0.01 or P < 0.05), especially with regard to ROM. Furthermore, factor and cluster analyses resulted in a total of 11 categories of general symptoms, which implies syndrome factors are more related to the Liver, as originally described in classical theory. The hypothesis of meridian sinew syndromes in WAD is clearly supported by the statistical analysis of the clinical trials. This new discovery should be beneficial in improving therapeutic outcomes.
Cerruela García, G; García-Pedrajas, N; Luque Ruiz, I; Gómez-Nieto, M Á
2018-03-01
This paper proposes a method for molecular activity prediction in QSAR studies using ensembles of classifiers constructed by means of two supervised subspace projection methods, namely nonparametric discriminant analysis (NDA) and hybrid discriminant analysis (HDA). We studied the performance of the proposed ensembles compared to classical ensemble methods using four molecular datasets and eight different models for the representation of the molecular structure. Using several measures and statistical tests for classifier comparison, we observe that our proposal improves the classification results with respect to classical ensemble methods. Therefore, we show that ensembles constructed using supervised subspace projections offer an effective way of creating classifiers in cheminformatics.
Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory
ERIC Educational Resources Information Center
Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya
2015-01-01
Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…
ERIC Educational Resources Information Center
Ammentorp, William
There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…
Deconstructing multivariate decoding for the study of brain function.
Hebart, Martin N; Baker, Chris I
2017-08-04
Multivariate decoding methods were developed originally as tools to enable accurate predictions in real-world applications. The realization that these methods can also be employed to study brain function has led to their widespread adoption in the neurosciences. However, prior to the rise of multivariate decoding, the study of brain function was firmly embedded in a statistical philosophy grounded on univariate methods of data analysis. In this way, multivariate decoding for brain interpretation grew out of two established frameworks: multivariate decoding for predictions in real-world applications, and classical univariate analysis based on the study and interpretation of brain activation. We argue that this led to two confusions, one reflecting a mixture of multivariate decoding for prediction or interpretation, and the other a mixture of the conceptual and statistical philosophies underlying multivariate decoding and classical univariate analysis. Here we attempt to systematically disambiguate multivariate decoding for the study of brain function from the frameworks it grew out of. After elaborating these confusions and their consequences, we describe six, often unappreciated, differences between classical univariate analysis and multivariate decoding. We then focus on how the common interpretation of what is signal and noise changes in multivariate decoding. Finally, we use four examples to illustrate where these confusions may impact the interpretation of neuroimaging data. We conclude with a discussion of potential strategies to help resolve these confusions in interpreting multivariate decoding results, including the potential departure from multivariate decoding methods for the study of brain function. Copyright © 2017. Published by Elsevier Inc.
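The contrast between the two frameworks can be seen in miniature on simulated data, assuming scikit-learn and SciPy: a classical mass-univariate t-test asks which individual features differ between conditions, while decoding asks how well held-out trials can be classified from all features jointly.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, p = 80, 50                        # trials per condition, "voxels"
signal = 0.3 * rng.normal(size=p)    # weak effect distributed over voxels
A = rng.normal(size=(n, p)) + signal # condition A
B = rng.normal(size=(n, p))          # condition B

# Classical univariate analysis: one t-test per voxel.
t, pvals = stats.ttest_ind(A, B, axis=0)
print("voxels significant at p<0.05:", int(np.sum(pvals < 0.05)), "of", p)

# Multivariate decoding: out-of-sample accuracy of a linear classifier.
X = np.vstack([A, B])
y = np.array([1] * n + [0] * n)
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print("cross-validated decoding accuracy:", acc.mean().round(2))
```

The point of the toy example is that the two outputs answer different questions, which is precisely the confusion the paper aims to disambiguate.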
Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach
ERIC Educational Resources Information Center
Holmes, Karen Y.; Dodd, Brett A.
2012-01-01
In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)
The Importance of Variance in Statistical Analysis: Don't Throw Out the Baby with the Bathwater.
ERIC Educational Resources Information Center
Peet, Martha W.
This paper analyzes what happens to the effect size of a given dataset when the variance is removed by categorization for the purpose of applying "OVA" methods (analysis of variance, analysis of covariance). The dataset is from a classic study by Holzinger and Swineford (1939) in which more than 20 ability tests were administered to 301…
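A small simulation of the paper's point (with invented data): median-splitting a continuous predictor so that "OVA" methods can be applied discards variance and attenuates the observed effect.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300
ability = rng.normal(size=n)
outcome = 0.5 * ability + rng.normal(size=n)   # a genuine linear effect

r_continuous = np.corrcoef(ability, outcome)[0, 1]
median_split = (ability > np.median(ability)).astype(float)
r_dichotomized = np.corrcoef(median_split, outcome)[0, 1]

print(f"r using scores:       {r_continuous:.2f}")
print(f"r after median split: {r_dichotomized:.2f}")  # noticeably smaller
```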
The Development of Bayesian Theory and Its Applications in Business and Bioinformatics
NASA Astrophysics Data System (ADS)
Zhang, Yifei
2018-03-01
Bayesian theory originated with an essay by the British mathematician Thomas Bayes, published in 1763, and after its development in the 20th century, Bayesian statistics has played a significant part in statistical study across all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian statistics has been improved and perfected, and it can now be used to solve problems that classical statistics failed to solve. This paper summarizes the history, concepts and applications of Bayesian statistics, which are illustrated in five parts: the history of Bayesian statistics, the weakness of classical statistics, Bayesian theory, its development, and its applications. The first two parts make a comparison between Bayesian statistics and classical statistics from a macroscopic perspective. The last three parts focus on Bayesian theory specifically, from introducing particular Bayesian statistical concepts to outlining their development and finally their applications.
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata
2012-05-01
The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distribution of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9, IPS e.max Ceram using layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a Universal Testing Machine. The data were analyzed using classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to the Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139; m=7.8) than that of IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for only total fracture, IPS e.max Ceram presented the lowest fracture load for chipping with both classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture with chipping (classical distribution) was considered as failure, IPS e.max Ceram did not show significant fracture load for total fracture (μ=1054, σ=110) compared to other groups (GC Initial ZR: μ=1039, σ=152, VITA VM9: μ=1170, σ=166). According to Weibull distributed data, VITA VM9 showed significantly higher fracture load (s=1228, m=9.4) than those of other groups. Both classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
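The two descriptions of the same fracture-load sample can be computed side by side, assuming SciPy; the data below are synthetic, generated to resemble the reported magnitudes for one veneering ceramic (s ≈ 1043 N, m ≈ 7.2).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic fracture loads (N) for n = 30 crowns.
loads = stats.weibull_min.rvs(7.2, scale=1043, size=30, random_state=rng)

# Classical description: normal distribution parameters.
mu, sigma = loads.mean(), loads.std(ddof=1)

# Weibull description: two-parameter fit with location fixed at zero.
m, _, s = stats.weibull_min.fit(loads, floc=0)

print(f"normal:  mu = {mu:.0f} N, sigma = {sigma:.0f} N")
print(f"Weibull: s = {s:.0f} N (characteristic load), m = {m:.1f} (modulus)")
```

The Weibull modulus m summarizes scatter in a way that is standard for brittle-fracture data, which is why the paper reports both analyses.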
JOURNAL SCOPE GUIDELINES: Paper classification scheme
NASA Astrophysics Data System (ADS)
2005-06-01
This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.
NASA Astrophysics Data System (ADS)
Carnevale, V.; Raugei, S.
2009-12-01
Lysine acetylation is a post-translational modification which modulates the affinity of protein-protein and/or protein-DNA complexes. Its crucial role as a switch in signaling pathways highlights the relevance of charged chemical groups in determining the interactions between water and biomolecules. Great effort has recently been devoted to assessing the reliability of classical molecular dynamics simulations in describing the solvation properties of charged moieties. In the spirit of these investigations, we performed classical and Car-Parrinello molecular dynamics simulations of lysine and acetylated lysine in aqueous solution. A comparative analysis between the two computational schemes is presented, with a focus on the first solvation shell of the charged groups. An accurate structural analysis unveils subtle, yet statistically significant, differences, which are discussed in connection with the significant electronic charge transfer occurring between the solute and the surrounding water molecules.
A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale
Pérez Sánchez, Carlos Javier
2014-01-01
Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has mainly been considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but provide more information on the measures of agreement. For the informative case, some guidelines are presented to elicit the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
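A minimal version of the Monte Carlo idea for one agreement measure, Cohen's kappa on a 2x2 table (the counts are hypothetical): place a Dirichlet posterior on the cell probabilities and push posterior draws through the kappa formula to obtain its full posterior distribution.

```python
import numpy as np

counts = np.array([40, 9, 6, 45])   # cells: (1,1), (1,2), (2,1), (2,2)
prior = np.ones(4)                  # uniform (non-informative) Dirichlet prior

rng = np.random.default_rng(0)
draws = rng.dirichlet(counts + prior, size=20000)

p = draws.reshape(-1, 2, 2)
po = p[:, 0, 0] + p[:, 1, 1]                   # observed agreement
row, col = p.sum(axis=2), p.sum(axis=1)        # marginal rater probabilities
pe = (row * col).sum(axis=1)                   # chance agreement
kappa = (po - pe) / (1 - pe)

print("posterior mean kappa:", kappa.mean().round(3))
print("95% credible interval:", np.quantile(kappa, [0.025, 0.975]).round(3))
```

An informative scenario would simply replace the uniform prior with pseudo-counts elicited from experts or historical data, which is the flexibility the abstract highlights.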
Statistical correlation analysis for comparing vibration data from test and analysis
NASA Technical Reports Server (NTRS)
Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.
1986-01-01
A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and a symmetric analytical portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is established with small classical structures.
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
High-Speed Imaging Analysis of Register Transitions in Classically and Jazz-Trained Male Voices.
Dippold, Sebastian; Voigt, Daniel; Richter, Bernhard; Echternach, Matthias
2015-01-01
Little data are available concerning register function in different styles of singing, such as classically or jazz-trained voices. Differences between registers seem to be much more audible in jazz singing than in classical singing, and so we hypothesized that classically trained singers exhibit a smoother register transition, stemming from more regular vocal fold oscillation patterns. High-speed digital imaging (HSDI) was used for 19 male singers (10 jazz-trained, 9 classically trained) who performed a glissando from modal to falsetto register across the register transition. Vocal fold oscillation patterns were analyzed in terms of different parameters of regularity, such as relative average perturbation (RAP), correlation dimension (D2) and shimmer. HSDI observations showed more regular vocal fold oscillation patterns during the register transition for the classically trained singers. Additionally, the RAP and D2 values were generally lower and more consistent for the classically trained singers compared to the jazz singers. However, intergroup comparisons showed no statistically significant differences. Some of our results may support the hypothesis that classically trained singers exhibit a smoother register transition from modal to falsetto register. © 2015 S. Karger AG, Basel.
Atzori, F; Sabatini, L; Deledda, D; Schirò, M; Lo Baido, R; Massè, A
2015-04-01
Total knee arthroplasty gives excellent objective results. Nevertheless, the subjective findings do not match the perception of a normal knee: often, this depends on the onset of patellar pain. In this study, we analyzed clinical and radiological items that can affect resurfaced patellar tracking, and the role of a patella-friendly femoral component and of patellar size in the onset of patellar pain. Thirty consecutive patients were implanted using the same cemented posterior-stabilized TKA associated with patella resurfacing. Fifteen patients were implanted using a classical femoral component, while another 15 patients were implanted using a patella-friendly femoral component. The statistical analysis was set to detect a significant difference (p < 0.05) in clinical and radiological outcomes related to several surgical parameters. Clinical and functional outcomes were recorded using the Knee Society Scoring System (KSS) and patellar pain with the Burnett questionnaire. Mean follow-up was 25 months. KSS results were excellent in both groups. Group 2 (patella-friendly femoral model) reached a higher percentage of 100-point scores in the clinical and functional KSS, but there was no statistical difference. Also, no statistical differences in Burnett questionnaire results were recorded. We had one case of patellar clunk syndrome in the standard femoral component group and one poor result in the second group. Postoperative radiographic measurements evidenced no statistical differences between the two groups. In group 1 (classical femoral component), a significantly better result (p < 0.05) was recorded at clinical evaluation according to the KSS when a wider patellar component was resurfaced. The present study reveals no statistically significant difference in the incidence of anterior knee pain between classical and "patella-friendly" femoral components. With the particular type of implant design utilized in this study, when the classical femoral component is used, bigger patellar implant sizes (38 and 41 mm) showed superior clinical outcomes.
The coordinate-based meta-analysis of neuroimaging data
Samartsidis, Pantelis; Montagna, Silvia; Nichols, Thomas E.; Johnson, Timothy D.
2017-01-01
Neuroimaging meta-analysis is an area of growing interest in statistics. The special characteristics of neuroimaging data render classical meta-analysis methods inapplicable and therefore new methods have been developed. We review existing methodologies, explaining the benefits and drawbacks of each. A demonstration on a real dataset of emotion studies is included. We discuss some still-open problems in the field to highlight the need for future research. PMID:29545671
Nonlinear multivariate and time series analysis by neural network methods
NASA Astrophysics Data System (ADS)
Hsieh, William W.
2004-03-01
Methods in multivariate statistical analysis are essential for working with large amounts of geophysical data, data from observational arrays, from satellites, or from numerical model output. In classical multivariate statistical analysis, there is a hierarchy of methods, starting with linear regression at the base, followed by principal component analysis (PCA) and finally canonical correlation analysis (CCA). A multivariate time series method, the singular spectrum analysis (SSA), has been a fruitful extension of the PCA technique. The common drawback of these classical methods is that only linear structures can be correctly extracted from the data. Since the late 1980s, neural network methods have become popular for performing nonlinear regression and classification. More recently, neural network methods have been extended to perform nonlinear PCA (NLPCA), nonlinear CCA (NLCCA), and nonlinear SSA (NLSSA). This paper presents a unified view of the NLPCA, NLCCA, and NLSSA techniques and their applications to various data sets of the atmosphere and the ocean (especially for the El Niño-Southern Oscillation and the stratospheric quasi-biennial oscillation). These data sets reveal that the linear methods are often too simplistic to describe real-world systems, with a tendency to scatter a single oscillatory phenomenon into numerous unphysical modes or higher harmonics, which can be largely alleviated in the new nonlinear paradigm.
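The bottleneck-autoencoder route to nonlinear PCA can be sketched with generic tools. The following illustrative reconstruction of the idea (not the author's network architecture) uses scikit-learn; the data lie on a one-dimensional curve that linear PCA would need two modes to describe, while the one-neuron bottleneck recovers the curve parameter:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, 500)
X = np.c_[t, t**2] + 0.05 * rng.standard_normal((500, 2))  # data on a parabola

# Autoencoder with a 1-neuron bottleneck: the bottleneck activation is the nonlinear PC.
ae = MLPRegressor(hidden_layer_sizes=(8, 1, 8), activation="tanh",
                  max_iter=20000, tol=1e-6, random_state=0).fit(X, X)

def bottleneck(X, ae):
    # Forward pass up to and including the bottleneck layer.
    a = X
    for W, b in zip(ae.coefs_[:2], ae.intercepts_[:2]):
        a = np.tanh(a @ W + b)
    return a.ravel()

# The bottleneck coordinate should track the underlying curve parameter t.
print(np.corrcoef(bottleneck(X, ae), t)[0, 1])
```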
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
A simple white noise analysis of neuronal light responses.
Chichilnisky, E J
2001-05-01
A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
Statistical mechanics in the context of special relativity. II.
Kaniadakis, G
2005-09-01
The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
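For reference, the κ-deformed logarithm and exponential from Kaniadakis's framework, which underlie the entropy and distribution described above; the κ → 0 limit recovers the ordinary ln and exp, and exp_κ(−E) decays as a power law ~ E^(−1/κ), consistent with the heavy tails mentioned in the abstract:

```latex
\ln_\kappa(x) = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa},
\qquad
\exp_\kappa(x) = \left(\sqrt{1+\kappa^{2}x^{2}} + \kappa x\right)^{1/\kappa},
\qquad
\lim_{\kappa \to 0}\exp_\kappa(x) = e^{x}.
```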
Free Fermions and the Classical Compact Groups
NASA Astrophysics Data System (ADS)
Cunden, Fabio Deelan; Mezzadri, Francesco; O'Connell, Neil
2018-06-01
There is a close connection between the ground state of non-interacting fermions in a box with classical (absorbing, reflecting, and periodic) boundary conditions and the eigenvalue statistics of the classical compact groups. The associated determinantal point processes can be extended in two natural directions: (i) we consider the full family of admissible quantum boundary conditions (i.e., self-adjoint extensions) for the Laplacian on a bounded interval, and the corresponding projection correlation kernels; (ii) we construct the grand canonical extensions at finite temperature of the projection kernels, interpolating from Poisson to random matrix eigenvalue statistics. The scaling limits in the bulk and at the edges are studied in a unified framework, and the question of universality is addressed. Whether the finite temperature determinantal processes correspond to the eigenvalue statistics of some matrix models is, a priori, not obvious. We complete the picture by constructing a finite temperature extension of the Haar measure on the classical compact groups. The eigenvalue statistics of the resulting grand canonical matrix models (of random size) corresponds exactly to the grand canonical measure of free fermions with classical boundary conditions.
NEUROBEHAVIORAL EVALUATIONS OF BINARY AND TERNARY MIXTURES OF CHEMICALS: LESSONS LEARNED.
The classical approach to the statistical analysis of binary chemical mixtures is to construct full dose-response curves for one compound in the presence of a range of doses of the second compound (isobolographic analyses). For interaction studies using more than two chemicals, ...
A Bayesian approach to meta-analysis of plant pathology studies.
Mila, A L; Ngugi, H K
2011-01-01
Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard), which was evaluated in only seven studies in this example. Whereas both the classical analysis (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allows for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.
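As a concrete illustration of the simple random-effects setup described above, here is a minimal sketch in PyMC; the effect sizes, standard errors, and priors are made up for illustration and are not the paper's data:

```python
import numpy as np
import pymc as pm

# Toy per-study effect sizes (log response ratios) and standard errors.
y = np.array([-0.45, -0.30, -0.12, -0.60, -0.25, -0.05, -0.38])
se = np.array([0.20, 0.15, 0.25, 0.30, 0.18, 0.22, 0.24])

with pm.Model():
    mu = pm.Normal("mu", 0.0, 10.0)      # vague prior on the overall effect
    tau = pm.HalfNormal("tau", 1.0)      # among-study heterogeneity
    # Swap pm.Normal for pm.StudentT(..., nu=4) to get the heavy-tailed variant.
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=len(y))
    pm.Normal("obs", mu=theta, sigma=se, observed=y)
    idata = pm.sample(2000, tune=1000, chains=2, random_seed=1)

# 95% credibility interval for the overall effect.
print(idata.posterior["mu"].quantile([0.025, 0.975]).values)
```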
A reductionist perspective on quantum statistical mechanics: Coarse-graining of path integrals.
Sinitskiy, Anton V; Voth, Gregory A
2015-09-07
Computational modeling of the condensed phase based on classical statistical mechanics has been rapidly developing over the last few decades and has yielded important information on various systems containing up to millions of atoms. However, if a system of interest contains important quantum effects, well-developed classical techniques cannot be used. One way of treating finite temperature quantum systems at equilibrium has been based on Feynman's imaginary time path integral approach and the ensuing quantum-classical isomorphism. This isomorphism is exact only in the limit of infinitely many classical quasiparticles representing each physical quantum particle. In this work, we present a reductionist perspective on this problem based on the emerging methodology of coarse-graining. This perspective allows for the representations of one quantum particle with only two classical-like quasiparticles and their conjugate momenta. One of these coupled quasiparticles is the centroid particle of the quantum path integral quasiparticle distribution. Only this quasiparticle feels the potential energy function. The other quasiparticle directly provides the observable averages of quantum mechanical operators. The theory offers a simplified perspective on quantum statistical mechanics, revealing its most reductionist connection to classical statistical physics. By doing so, it can facilitate a simpler representation of certain quantum effects in complex molecular environments.
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson, The Statistical Analysis of Time Series; T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics; Emil Artin, Geometric Algebra; Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences; Robert G. Bartle, The Elements of Integration and Lebesgue Measure; George E. P. Box & Norman R. Draper, Evolutionary Operation: A Statistical Method for Process Improvement; George E. P. Box & George C. Tiao, Bayesian Inference in Statistical Analysis; R. W. Carter, Finite Groups of Lie Type: Conjugacy Classes and Complex Characters; R. W. Carter, Simple Groups of Lie Type; William G. Cochran & Gertrude M. Cox, Experimental Designs, Second Edition; Richard Courant, Differential and Integral Calculus, Volume I; Richard Courant, Differential and Integral Calculus, Volume II; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume I; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume II; D. R. Cox, Planning of Experiments; Harold S. M. Coxeter, Introduction to Geometry, Second Edition; Charles W. Curtis & Irving Reiner, Representation Theory of Finite Groups and Associative Algebras; Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I; Charles W. Curtis & Irving Reiner, Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II; Cuthbert Daniel, Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition; Bruno de Finetti, Theory of Probability, Volume I; Bruno de Finetti, Theory of Probability, Volume 2; W. Edwards Deming, Sample Design in Business Research
Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan
2016-03-29
Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between two beams plays the key role in first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we show that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, where λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuations for incoherent light sources.
Models of dyadic social interaction.
Griffin, Dale; Gonzalez, Richard
2003-01-01
We discuss the logic of research designs for dyadic interaction and present statistical models with parameters that are tied to psychologically relevant constructs. Building on Karl Pearson's classic nineteenth-century statistical analysis of within-organism similarity, we describe several approaches to indexing dyadic interdependence and provide graphical methods for visualizing dyadic data. We also describe several statistical and conceptual solutions to the 'levels of analysis' problem in analysing dyadic data. These analytic strategies allow the researcher to examine and measure psychological questions of interdependence and social influence. We provide illustrative data from casually interacting and romantic dyads. PMID:12689382
Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis
NASA Astrophysics Data System (ADS)
Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.
2013-05-01
Seismic hazard analysis has become a very important issue in the last few decades. Recently, improved technologies and newly available data have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc., and they have begun to understand the role of uncertainty in seismic hazard analysis. However, a significant problem remains: how to handle the existing uncertainty. The same lack of information makes it difficult to quantify uncertainty accurately. Usually, attenuation curves are obtained in a statistical way, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlapping takes place not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts from classifying sites using geological terms, these site coefficients are not classified at all. In the present study, this problem is addressed using fuzzy set theory. Using membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California and compared with the conventional approach. In this study, the standard deviations that show the variation within each site class obtained by fuzzy set theory are compared with those obtained in the classical way. The results of this analysis show that, when data for hazard assessment are insufficient, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
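A minimal sketch of the membership-function idea; the class boundaries below loosely echo common Vs30-based site classes and are purely illustrative, not the study's actual classification:

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

vs30 = 420.0  # site shear-wave velocity, m/s (illustrative measurement)
classes = {"soft": (0, 180, 360), "stiff": (180, 360, 760), "rock": (360, 760, 1500)}
memberships = {name: triangular(vs30, *abc) for name, abc in classes.items()}
print(memberships)  # the site partially belongs to both "stiff" and "rock"
```

Instead of forcing the site into a single crisp class, its contribution to each class is weighted by these memberships, which is how the border ambiguities are avoided.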
Whitley, Heather D.; Scullard, Christian R.; Benedict, Lorin X.; ...
2014-12-04
Here, we present a discussion of kinetic theory treatments of linear electrical and thermal transport in hydrogen plasmas, for a regime of interest to inertial confinement fusion applications. In order to assess the accuracy of one of the more involved of these approaches, classical Lenard-Balescu theory, we perform classical molecular dynamics simulations of hydrogen plasmas using two-body quantum statistical potentials and compute both electrical and thermal conductivity from our particle trajectories using the Kubo approach. Our classical Lenard-Balescu results employing the identical statistical potentials agree well with the simulations.
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
Laganà, Pasqualina; Moscato, Umberto; Poscia, Andrea; La Milia, Daniele Ignazio; Boccia, Stefania; Avventuroso, Emanuela; Delia, Santi
2015-01-01
Legionnaires' disease is normally acquired by inhalation of legionellae from a contaminated environmental source. Water systems of large buildings, such as hospitals, are often contaminated with legionellae and therefore represent a potential risk for the hospital population. The aim of this study was to evaluate the potential contamination by Legionella pneumophila (LP) in a large hospital in Italy through georeferenced statistical analysis, to assess the possible sources of dispersion and, consequently, the risk of exposure for both health care staff and patients. The distribution of LP serogroups 1 and 2-14 was considered in the wards housed on two consecutive floors of the hospital building. On the basis of information provided by 53 bacteriological analyses, a 'random' grid of points was chosen, and spatial geostatistics (FAI-k kriging) was applied and compared with the results of classical statistical analysis. Over 50% of the examined samples were positive for Legionella pneumophila. LP 1 was isolated in 69% of samples from the ground floor and in 60% of samples from the first floor; LP 2-14 in 36% of samples from the ground floor and 24% from the first. The iso-estimation maps clearly show the most contaminated pipe and the difference in the diffusion of the different L. pneumophila serogroups. This work demonstrates that geostatistical methods applied to the microbiological analysis of water matrices allow better modeling of the phenomenon under study, greater potential for risk management, and a greater choice of methods of prevention and environmental recovery, with respect to classical statistical analysis.
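A hedged sketch of the kriging step using the pykrige package as one possible tool; the coordinates, concentrations, and variogram model below are all made up for illustration and are not the hospital data:

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(0)
# Sampling points (e.g., tap locations, metres) and measured log CFU/L of LP.
x, y = rng.uniform(0, 100, 53), rng.uniform(0, 40, 53)
z = 2.0 + 0.02 * x + rng.normal(0, 0.3, 53)   # synthetic spatial trend + noise

ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
gx, gy = np.arange(0, 100, 2.0), np.arange(0, 40, 2.0)
z_map, var_map = ok.execute("grid", gx, gy)   # iso-estimation map + kriging variance
print(z_map.shape, var_map.max())
```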
NASA Technical Reports Server (NTRS)
Hong, Z. C.
1975-01-01
A review of various methods for calculating turbulent chemically reacting flow, such as the Green function method, the Navier-Stokes equations, and others, is presented. Nonequilibrium degrees of freedom were employed to study the mixing behavior of a multiscale turbulence field. Classical and modern theories are discussed.
Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz
An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…
NASA Astrophysics Data System (ADS)
Gourdol, L.; Hissler, C.; Pfister, L.
2012-04-01
The Luxembourg Sandstone aquifer is of major relevance for the national supply of drinking water in Luxembourg. The city of Luxembourg (20% of the country's population) gets almost two thirds of its drinking water from this aquifer. As a consequence, the study of the groundwater hydrochemistry, as well as of its spatial and temporal variations, is considered of highest priority. Since 2005, a monitoring network has been implemented by the Water Department of Luxembourg City, with a view to a more sustainable management of this strategic water resource. The data collected to date form a large and complex dataset describing spatial and temporal variations of many hydrochemical parameters. The issue of data treatment is tightly connected to this kind of water monitoring program and its complex database. Standard multivariate statistical techniques, such as principal components analysis and hierarchical cluster analysis, have been widely used as unbiased methods for extracting meaningful information from groundwater quality data and are now classically used in many hydrogeological studies, in particular to characterize temporal or spatial hydrochemical variations induced by natural and anthropogenic factors. However, these classical multivariate methods deal with two-way matrices, usually parameters/sites or parameters/time, while the dataset resulting from a water quality monitoring program should often be seen as a datacube of parameters/sites/time. Three-way matrices, such as the one we propose here, are difficult to handle and to analyse with classical multivariate statistical tools and should thus be treated with approaches designed for three-way data structures. One possible approach is partial triadic analysis (PTA). PTA has previously been used with success in many ecological studies, but never to date in the domain of hydrogeology. Applied to the dataset of the Luxembourg Sandstone aquifer, PTA appears to be a promising new statistical instrument for hydrogeologists, in particular to characterize temporal and spatial hydrochemical variations induced by natural and anthropogenic factors. This new approach to groundwater management offers potential for 1) identifying a common multivariate spatial structure, 2) uncovering the different hydrochemical patterns and explaining their controlling factors, and 3) analysing the temporal variability of this structure and grasping hydrochemical changes.
Quantile regression for the statistical analysis of immunological data with many non-detects.
Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth
2012-07-07
Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an implementation to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
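A minimal sketch of the idea using statsmodels' quantile regression on synthetic data containing non-detects; the variable names, dose structure, and detection limit are illustrative, not the trial's:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({"dose": np.repeat([0, 1, 2, 3], 50)})
ige = np.exp(1.0 + 0.4 * df["dose"] + rng.normal(0, 1.0, len(df)))
LOD = 2.0
df["ige"] = np.maximum(ige, LOD)   # non-detects recorded at the detection limit

# The median trend is unaffected by the exact values below the LOD as long as
# fewer than 50% of the observations in a group are non-detects (use a higher
# quantile q if the non-detect fraction is larger).
fit = smf.quantreg("ige ~ dose", df).fit(q=0.5)
print(fit.params, fit.pvalues)
```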
Yu, Pei; Li, Zi-Yuan; Xu, Hong-Ya; Huang, Liang; Dietz, Barbara; Grebogi, Celso; Lai, Ying-Cheng
2016-12-01
A crucial result in quantum chaos, which has been established for a long time, is that the spectral properties of classically integrable systems generically are described by Poisson statistics, whereas those of time-reversal symmetric, classically chaotic systems coincide with those of random matrices from the Gaussian orthogonal ensemble (GOE). Does this result hold for two-dimensional Dirac material systems? To address this fundamental question, we investigate the spectral properties in a representative class of graphene billiards with shapes of classically integrable circular-sector billiards. Naively one may expect to observe Poisson statistics, which is indeed true for energies close to the band edges where the quasiparticle obeys the Schrödinger equation. However, for energies near the Dirac point, where the quasiparticles behave like massless Dirac fermions, Poisson statistics is extremely rare in the sense that it emerges only under quite strict symmetry constraints on the straight boundary parts of the sector. An arbitrarily small amount of imperfection of the boundary results in GOE statistics. This implies that, for circular-sector confinements with arbitrary angle, the spectral properties will generically be GOE. These results are corroborated by extensive numerical computation. Furthermore, we provide a physical understanding for our results.
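The two reference spacing distributions invoked here are easy to reproduce numerically. A crude sketch that samples a GOE matrix and compares its (roughly unfolded) nearest-neighbor spacings with the Poisson and Wigner-surmise forms; the unfolding below is deliberately simplistic:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
H = rng.standard_normal((N, N))
H = (H + H.T) / np.sqrt(2)              # GOE matrix (diagonal variance 2, off-diagonal 1)
ev = np.sort(np.linalg.eigvalsh(H))
bulk = ev[N // 4: 3 * N // 4]           # keep the bulk, where the density is nearly flat
s = np.diff(bulk)
s /= s.mean()                           # crude unfolding: rescale to unit mean spacing

# Reference distributions:
#   Poisson (integrable):   P(s) = exp(-s)
#   Wigner surmise (GOE):   P(s) = (pi*s/2) * exp(-pi*s^2/4)
hist, edges = np.histogram(s, bins=30, range=(0, 3), density=True)
mid = 0.5 * (edges[1:] + edges[:-1])
print(np.c_[mid, hist, np.exp(-mid), (np.pi * mid / 2) * np.exp(-np.pi * mid**2 / 4)][:5])
```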
NASA Astrophysics Data System (ADS)
Dinç, Erdal; Kanbur, Murat; Baleanu, Dumitru
2007-10-01
Comparative simultaneous determination of chlortetracycline and benzocaine in a commercial veterinary powder product was carried out by continuous wavelet transform (CWT) and classical derivative spectrophotometry. In this quantitative spectral analysis, the two proposed analytical methods do not require any chemical separation process. In the first step, several wavelet families were tested to find an optimal CWT for processing the overlapping signals of the analyzed compounds. We observed that the coiflets method (COIF-CWT) with dilation parameter a = 400 gives suitable results for this analytical application. For comparison, the classical derivative spectrophotometry (CDS) approach was also applied to the simultaneous quantitative resolution of the same analytical problem. Calibration functions were obtained by measuring the transform amplitudes corresponding to zero-crossing points for both the CWT and CDS methods. The utility of the two analytical approaches was verified by analyzing various synthetic mixtures of chlortetracycline and benzocaine, and they were applied to real samples of the veterinary powder formulation. The experimental results obtained with the COIF-CWT approach were statistically compared with those obtained by classical derivative spectrophotometry, and successful results were reported.
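An illustrative sketch of zero-crossing CWT calibration on a synthetic two-band spectrum. PyWavelets' continuous transform does not provide coiflets, so a Morlet wavelet stands in for the paper's COIF-CWT; all signal parameters are made up:

```python
import numpy as np
import pywt

# Two overlapping Gaussian "absorption bands" standing in for the two analytes.
wl = np.linspace(200, 400, 512)                     # wavelength axis, nm
spectrum = (1.0 * np.exp(-((wl - 280) / 12) ** 2)
            + 0.6 * np.exp(-((wl - 300) / 15) ** 2))

# CWT at a single fixed scale (analogous to the paper's fixed dilation parameter).
coef, _ = pywt.cwt(spectrum, scales=[40], wavelet="morl")
transformed = coef[0]

# Calibration reads the transform amplitude at a zero-crossing point of the
# interfering component; here we simply locate the zero crossings.
zc = np.where(np.diff(np.sign(transformed)))[0]
print(wl[zc])
```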
Molenaar, Peter C M
2008-01-01
It is argued that general mathematical-statistical theorems imply that standard statistical analysis techniques of inter-individual variation are invalid to investigate developmental processes. Developmental processes have to be analyzed at the level of individual subjects, using time series data characterizing the patterns of intra-individual variation. It is shown that standard statistical techniques based on the analysis of inter-individual variation appear to be insensitive to the presence of arbitrary large degrees of inter-individual heterogeneity in the population. An important class of nonlinear epigenetic models of neural growth is described which can explain the occurrence of such heterogeneity in brain structures and behavior. Links with models of developmental instability are discussed. A simulation study based on a chaotic growth model illustrates the invalidity of standard analysis of inter-individual variation, whereas time series analysis of intra-individual variation is able to recover the true state of affairs. (c) 2007 Wiley Periodicals, Inc.
Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
2010-08-15
One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward real physical theory.
Pauli structures arising from confined particles interacting via a statistical potential
NASA Astrophysics Data System (ADS)
Batle, Josep; Ciftja, Orion; Farouk, Ahmed; Alkhambashi, Majid; Abdalla, Soliman
2017-09-01
There have been suggestions that the Pauli exclusion principle alone can lead a non-interacting (free) system of identical fermions to form crystalline structures dubbed Pauli crystals. Single-shot imaging experiments for the case of ultra-cold systems of free spin-polarized fermionic atoms in a two-dimensional harmonic trap appear to show geometric arrangements that cannot be characterized as Wigner crystals. This work explores this idea and considers a well-known approach that enables one to treat a quantum system of free fermions as a system of classical particles interacting with a statistical interaction potential. The model under consideration, though classical in nature, incorporates the quantum statistics by endowing the classical particles with an effective interaction potential. The reasonable expectation is that possible Pauli crystal features seen in experiments may manifest in this model that captures the correct quantum statistics as a first order correction. We use the Monte Carlo simulated annealing method to obtain the most stable configurations of finite two-dimensional systems of confined particles that interact with an appropriate statistical repulsion potential. We consider both an isotropic harmonic and a hard-wall confinement potential. Despite minor differences, the most stable configurations observed in our model correspond to the reported Pauli crystals in single-shot imaging experiments of free spin-polarized fermions in a harmonic trap. The crystalline configurations observed appear to be different from the expected classical Wigner crystal structures that would emerge should the confined classical particles had interacted with a pair-wise Coulomb repulsion.
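The "statistical interaction potential" alluded to here is presumably of the textbook Uhlenbeck-Gropper form for ideal quantum gases; whether the authors use exactly this form is an assumption. The lower sign gives the fermionic "statistical repulsion" that endows classical particles with Pauli-like behaviour (upper sign: bosonic attraction), with λ_T the thermal de Broglie wavelength:

```latex
v(r) = -\,k_B T \,\ln\!\left[\,1 \pm e^{-2\pi r^{2}/\lambda_T^{2}}\,\right],
\qquad
\lambda_T = \frac{h}{\sqrt{2\pi m k_B T}}.
```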
A Pedagogical Approach to the Boltzmann Factor through Experiments and Simulations
ERIC Educational Resources Information Center
Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.
2009-01-01
The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that are exchanging energy with their environment. To understand why the expression has this specific form involves a deep mathematical analysis, whose flow of logic is hard to…
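For reference, the factor in question, with Z the canonical partition function:

```latex
p_i = \frac{e^{-E_i/k_B T}}{Z},
\qquad
Z = \sum_j e^{-E_j/k_B T}.
```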
Smoking and Cancers: Case-Robust Analysis of a Classic Data Set
ERIC Educational Resources Information Center
Bentler, Peter M.; Satorra, Albert; Yuan, Ke-Hai
2009-01-01
A typical structural equation model is intended to reproduce the means, variances, and correlations or covariances among a set of variables based on parameter estimates of a highly restricted model. It is not widely appreciated that the sample statistics being modeled can be quite sensitive to outliers and influential observations, leading to bias…
Developing a Test for Assessing Elementary Students' Comprehension of Science Texts
ERIC Educational Resources Information Center
Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien
2012-01-01
This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…
Minimized state complexity of quantum-encoded cryptic processes
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.
2016-05-01
The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.
Single-snapshot DOA estimation by using Compressed Sensing
NASA Astrophysics Data System (ADS)
Fortunati, Stefano; Grasso, Raffaele; Gini, Fulvio; Greco, Maria S.; LePage, Kevin
2014-12-01
This paper deals with the problem of estimating the directions of arrival (DOA) of multiple source signals from a single observation vector of array data. In particular, four estimation algorithms based on the theory of compressed sensing (CS), i.e., the classical ℓ1 minimization (or Least Absolute Shrinkage and Selection Operator, LASSO), the fast smooth ℓ0 minimization, the Sparse Iterative Covariance-based Estimator (SPICE), and the Iterative Adaptive Approach for Amplitude and Phase Estimation (IAA-APES), are analyzed, and their statistical properties are investigated and compared with the classical Fourier beamformer (FB) in different simulated scenarios. We show that, unlike the classical FB, a CS-based beamformer (CSB) has some desirable properties typical of the adaptive algorithms (e.g., Capon and MUSIC) even in the single-snapshot case. Particular attention is devoted to the super-resolution property. Theoretical arguments and simulation analysis provide evidence that a CS-based beamformer can achieve resolution beyond the classical Rayleigh limit. Finally, the theoretical findings are validated by processing a real sonar dataset.
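A self-contained sketch of the ℓ1 route for the single-snapshot grid model y = Ax + n. An ISTA proximal-gradient solver is written out by hand so that complex data poses no problem; the array geometry, angle grid, and λ are illustrative, and this is not the paper's exact algorithm:

```python
import numpy as np

def complex_lasso_ista(A, y, lam, n_iter=500):
    """ISTA for min_x 0.5*||y - Ax||^2 + lam*||x||_1 with complex A, y, x;
    the soft threshold shrinks magnitudes and keeps phases."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(n_iter):
        z = x + (A.conj().T @ (y - A @ x)) / L   # gradient step
        mag = np.abs(z)
        x = np.where(mag > lam / L, (1 - lam / (L * mag + 1e-30)) * z, 0)
    return x

# Toy single-snapshot DOA: M-element half-wavelength ULA, grid of candidate angles.
rng = np.random.default_rng(0)
M, G = 16, 181
angles = np.deg2rad(np.linspace(-90, 90, G))
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(angles)))
x_true = np.zeros(G, dtype=complex)
x_true[[60, 100]] = [1.0, 0.8j]            # two on-grid sources
y = A @ x_true + 0.05 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

x_hat = complex_lasso_ista(A, y, lam=0.5)
print(np.argsort(np.abs(x_hat))[-2:])      # indices of the two largest peaks
```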
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croft, S.; Favalli, Andrea; Weaver, Brian Phillip
2015-10-06
In this paper we develop and investigate several criteria for assessing how well a proposed spectral form fits observed spectra. We consider the classical improved figure of merit (FOM) along with several modifications, as well as criteria motivated by Poisson regression from the statistical literature. We also develop a new FOM that is based on the statistical idea of the bootstrap. A spectral simulator has been developed to assess the performance of these different criteria under multiple data configurations.
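The bootstrap idea can be sketched generically: simulate Poisson realizations of the fitted spectrum to learn what FOM values counting statistics alone would produce. The FOM below is a plain chi-square-like criterion, not the authors' new one, and the spectral shape is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def fom(observed, model):
    # Chi-square-like figure of merit per channel.
    return np.sum((observed - model) ** 2 / np.maximum(model, 1)) / observed.size

model = 50 * np.exp(-0.5 * ((np.arange(128) - 64) / 6) ** 2) + 5  # fitted shape + background
observed = rng.poisson(model)                                     # "measured" spectrum

# Parametric bootstrap: distribution of the FOM under pure counting statistics.
boot = np.array([fom(rng.poisson(model), model) for _ in range(2000)])
p_value = np.mean(boot >= fom(observed, model))
print(f"FOM = {fom(observed, model):.3f}, bootstrap p = {p_value:.3f}")
```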
Janssen, Dirk P
2012-03-01
Psychologists, psycholinguists, and other researchers using language stimuli have been struggling for more than 30 years with the problem of how to analyze experimental data that contain two crossed random effects (items and participants). The classical analysis of variance does not apply; alternatives have been proposed but have failed to catch on, and a statistically unsatisfactory procedure of using two approximations (known as F1 and F2) has become the standard. A simple and elegant solution using mixed model analysis has been available for 15 years, and recent improvements in statistical software have made mixed models analysis widely available. The aim of this article is to increase the use of mixed models by giving a concise practical introduction and by giving clear directions for undertaking the analysis in the most popular statistical packages. The article also introduces the DJMIXED add-on package for SPSS, which makes entering the models and reporting their results as straightforward as possible.
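A sketch of a crossed-random-effects fit in Python. One pattern statsmodels supports is to treat the whole data set as a single trivial group and enter subjects and items as variance components; the data are simulated, and this incantation is offered as an assumption rather than as the article's own SPSS-based procedure:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_item = 20, 15
df = pd.DataFrame([(s, i) for s in range(n_subj) for i in range(n_item)],
                  columns=["subject", "item"])
df["condition"] = rng.integers(0, 2, len(df))
subj_re = rng.normal(0, 30, n_subj)        # per-subject random intercepts
item_re = rng.normal(0, 20, n_item)        # per-item random intercepts
df["rt"] = (500 + 40 * df["condition"]
            + subj_re[df["subject"]] + item_re[df["item"]]
            + rng.normal(0, 50, len(df)))

df["all"] = 1  # one trivial group: both factors then enter as crossed variance components
vc = {"subject": "0 + C(subject)", "item": "0 + C(item)"}
fit = smf.mixedlm("rt ~ condition", data=df, groups="all",
                  vc_formula=vc, re_formula="0").fit()
print(fit.summary())
```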
Statistical Analysis for Collision-free Boson Sampling.
Huang, He-Liang; Zhong, Han-Sen; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su
2017-11-10
Boson sampling is strongly believed to be intractable for classical computers but solvable with photons in linear optics, and it has therefore attracted widespread attention as a rapid way to demonstrate quantum supremacy. However, because its solution is mathematically unverifiable, certifying the experimental results is a major difficulty in a boson sampling experiment. Here, we develop a statistical analysis scheme to experimentally certify collision-free boson sampling. Numerical simulations are performed to show the feasibility and practicability of our scheme, and the effects of realistic experimental conditions are also considered, demonstrating that our proposed scheme is experimentally friendly. Moreover, our broad approach is expected to be generally applicable to investigating multi-particle coherent dynamics beyond boson sampling.
NASA Technical Reports Server (NTRS)
Stephens, J. B.; Sloan, J. C.
1976-01-01
A method is described for developing a statistical air quality assessment for the launch of an aerospace vehicle from the Kennedy Space Center in terms of existing climatological data sets. The procedure can be refined as developing meteorological conditions are identified for use with the NASA-Marshall Space Flight Center Rocket Exhaust Effluent Diffusion (REED) description. Classical climatological regimes for the long range analysis can be narrowed as the synoptic and mesoscale structure is identified. Only broad synoptic regimes are identified at this stage of analysis. As the statistical data matrix is developed, synoptic regimes will be refined in terms of the resulting eigenvectors as applicable to aerospace air quality predictions.
Camara, Jorge G.; Ruszkowski, Joseph M.; Worak, Sandra R.
2008-01-01
Context Music and surgery. Objective To determine the effect of live classical piano music on vital signs of patients undergoing ophthalmic surgery. Design Retrospective case series. Setting and Patients 203 patients who underwent various ophthalmologic procedures in a period during which a piano was present in the operating room of St. Francis Medical Center. [Note: St. Francis Medical Center has recently been renamed Hawaii Medical Center East.] Intervention Demographic data, surgical procedures, and the vital signs of 203 patients who underwent ophthalmic procedures were obtained from patient records. Blood pressure, heart rate, and respiratory rate measured in the preoperative holding area were compared with the same parameters taken in the operating room, with and without exposure to live piano music. A paired t-test was used for statistical analysis. Main outcome measure Mean arterial pressure, heart rate, and respiratory rate. Results 115 patients who were exposed to live piano music showed a statistically significant decrease in mean arterial blood pressure, heart rate, and respiratory rate in the operating room compared with their vital signs measured in the preoperative holding area (P < .0001). The control group of 88 patients not exposed to live piano music showed a statistically significant increase in mean arterial blood pressure (P < .0002) and heart rate and respiratory rate (P < .0001). Conclusion Live classical piano music lowered the blood pressure, heart rate, and respiratory rate in patients undergoing ophthalmic surgery. PMID:18679538
Zhu, Zhaozhong; Anttila, Verneri; Smoller, Jordan W; Lee, Phil H
2018-01-01
Advances in recent genome-wide association studies (GWAS) suggest that pleiotropic effects on human complex traits are widespread. A number of classic and recent meta-analysis methods have been used to identify genetic loci with pleiotropic effects, but the overall performance of these methods is not well understood. In this work, we use extensive simulations and case studies of GWAS datasets to investigate the power and type-I error rates of ten meta-analysis methods. We specifically focus on three conditions commonly encountered in studies of multiple traits: (1) extensive heterogeneity of genetic effects; (2) characterization of trait-specific association; and (3) inflated correlation of GWAS due to overlapping samples. Although statistical power is highly variable under distinct study conditions, several methods showed superior power under diverse heterogeneity. In particular, the classic fixed-effects model performed surprisingly well when a variant is associated with more than half of the study traits. As the number of traits with null effects increases, ASSET performed best, with competitive specificity and sensitivity. With opposite directional effects, CPASSOC featured first-rate power. However, caution is advised when using CPASSOC to study genetically correlated traits with overlapping samples. We conclude with a discussion of unresolved issues and directions for future research.
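For orientation, the classic fixed-effects baseline the authors refer to is inverse-variance weighting; a minimal sketch with made-up per-study effects and standard errors:

```python
import numpy as np
from scipy import stats

def fixed_effects_meta(betas, ses):
    """Inverse-variance-weighted fixed-effects meta-analysis of per-study effects."""
    w = 1.0 / np.asarray(ses) ** 2
    beta = np.sum(w * np.asarray(betas)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    z = beta / se
    p = 2 * stats.norm.sf(abs(z))
    return beta, se, p

# Effects of one variant on three traits (illustrative values).
print(fixed_effects_meta([0.12, 0.08, 0.15], [0.04, 0.05, 0.06]))
```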
NASA Astrophysics Data System (ADS)
Waqas, Abi; Melati, Daniele; Manfredi, Paolo; Grassi, Flavia; Melloni, Andrea
2018-02-01
The building block (BB) approach has recently emerged in photonics as a suitable strategy for the analysis and design of complex circuits. Each BB can be foundry related and contains a mathematical macro-model of its functionality. As is well known, statistical variations in fabrication processes can have a strong effect on functionality and ultimately affect the yield. In order to predict the statistical behavior of the circuit, proper analysis of the effects of these uncertainties is crucial. This paper presents a method to build a novel class of stochastic process design kits for the analysis of photonic circuits. The proposed design kits directly store the information on the stochastic behavior of each building block in the form of a generalized-polynomial-chaos-based augmented macro-model obtained by properly exploiting stochastic collocation and Galerkin methods. Using this approach, we demonstrate that the augmented macro-models of the BBs can be calculated once, stored in a BB (foundry-dependent) library, and then used for the analysis of any desired circuit. The main advantage of this approach, shown here for the first time in photonics, is that the stochastic moments of an arbitrary photonic circuit can be evaluated by a single simulation only, without the need for repeated simulations. The accuracy and the significant speed-up with respect to classical Monte Carlo analysis are verified by means of a classical photonic circuit example with multiple uncertain variables.
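A toy sketch of the generalized-polynomial-chaos machinery for a single standard-normal uncertain parameter; the response function and expansion order are illustrative, and the paper's macro-models are of course more elaborate. Once the coefficients are known, mean and variance follow without any Monte Carlo loop:

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

# Gauss-Hermite(e) rule for a standard normal parameter xi ~ N(0, 1).
nodes, weights = He.hermegauss(40)
weights = weights / sqrt(2 * pi)          # normalize so the weights sum to 1

def f(xi):
    # Toy "building block" response, e.g. transmission vs. a fabrication deviation.
    return np.sin(1.0 + 0.1 * xi) ** 2

order = 6
coeffs = []
for k in range(order + 1):
    Hk = He.hermeval(nodes, [0] * k + [1])               # He_k at the quadrature nodes
    ck = np.sum(weights * f(nodes) * Hk) / factorial(k)  # E[He_k^2] = k!
    coeffs.append(ck)

mean = coeffs[0]
var = sum(c ** 2 * factorial(k) for k, c in enumerate(coeffs[1:], start=1))
print(mean, var)
```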
Wilson's Disease: a challenge of diagnosis. The 5-year experience of a tertiary centre.
Gheorghe, Liana; Popescu, Irinel; Iacob, Speranta; Gheorghe, Cristian; Vaidan, Roxana; Constantinescu, Alexandra; Iacob, Razvan; Becheanu, Gabriel; Angelescu, Corina; Diculescu, Mircea
2004-09-01
Because molecular diagnosis is considered impractical and no pathognomonic features have been described, the diagnosis of Wilson's disease (WD) using clinical and biochemical findings is still challenging. We analysed predictive factors for the diagnosis in 55 patients with WD diagnosed in our centre between 1 January 1999 and 1 April 2004. All patients presented predominant liver disease classified as: 1) asymptomatic, found incidentally; 2) chronic hepatitis or cirrhosis; or 3) fulminant hepatic failure. The diagnosis was considered classic (two out of the three following criteria: 1) serum ceruloplasmin < 20 mg/dl, 2) the presence of Kayser-Fleischer rings, and/or 3) hepatic copper > 250 µg/g dry weight liver tissue) or non-classic (clinical manifestations plus laboratory parameters suggesting impaired copper metabolism). The association between the predictive factors and non-classic diagnosis was assessed based on the level of statistical significance (p value < 0.05) associated with the chi-squared test in contingency tables. Multivariate analysis was performed by logistic regression using SPSS 10. There were 31 males (56.3%) and 24 females (43.7%) with a mean age at diagnosis of 20.92 +/- 9.97 years (4-52 years); 51 patients (92.7%) were younger than 40 years. Asymptomatic WD was diagnosed in 14 patients (25.4%), chronic liver disease due to WD in 29 patients (52.8%) and fulminant hepatic failure in 12 patients (21.8%). The classic diagnosis was made in 32 patients (58.18%). In the univariate analysis the non-classic diagnosis was associated with: age > 18 years (p = 0.03), increased copper excretion (p < 0.0001), Coombs-negative hemolysis (p = 0.03), and absence of neurological manifestations (p < 0.0001). Multivariate analysis identified age over 18 years, increased urinary copper, and isolated hepatic involvement as independent predictors. In clinical practice, WD should be considered also in patients who do not fulfil the classic criteria. Independent factors associated with a non-classic diagnosis were age over 18 years, increased cupriuresis and isolated liver disease.
Nonclassical light revealed by the joint statistics of simultaneous measurements.
Luis, Alfredo
2016-04-15
Nonclassicality cannot be a single-observable property, since the statistics of any single quantum observable is compatible with classical physics. We develop a general procedure to reveal nonclassical behavior of light states from the joint statistics arising in the practical measurement of multiple observables. Besides embracing previous approaches, this protocol can disclose nonclassical features for standard examples of classical-like behavior, such as SU(2) and Glauber coherent states. When combined with other criteria, this would imply that every light state is nonclassical.
PDF-based heterogeneous multiscale filtration model.
Gong, Jian; Rutland, Christopher J
2015-04-21
Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
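A toy rendering of the headline idea, overall efficiency as a pore-size-PDF-weighted sum of collector contributions; the log-normal PDF and the collector-efficiency expression below are stand-ins, not the model's actual unit-collector correlations:

```python
import numpy as np
from scipy import stats

# Hypothetical log-normal pore-size PDF (micrometres); parameters are illustrative.
pore = stats.lognorm(s=0.4, scale=15.0)

def single_collector_eff(d_pore_um, d_particle_um=0.1):
    # Toy collector efficiency: smaller pores capture better (illustrative only).
    return 1.0 - np.exp(-5.0 * d_particle_um / d_pore_um)

# Overall clean-filter efficiency = integral of eta(d) * f(d) over pore sizes.
d = np.linspace(1.0, 60.0, 600)
dx = d[1] - d[0]
overall = np.sum(single_collector_eff(d) * pore.pdf(d)) * dx
print(f"clean-filter efficiency ~ {overall:.3f}")
```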
Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum
NASA Astrophysics Data System (ADS)
Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.
The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.
Space-time models based on random fields with local interactions
NASA Astrophysics Data System (ADS)
Hristopulos, Dionissios T.; Tsantili, Ivi C.
2016-08-01
The analysis of space-time data from complex, real-life phenomena requires the use of flexible and physically motivated covariance functions. In most cases, it is not possible to explicitly solve the equations of motion for the fields or the respective covariance functions. In the statistical literature, covariance functions are often based on mathematical constructions. In this paper, we propose deriving space-time covariance functions by solving “effective equations of motion”, which can be used as statistical representations of systems with diffusive behavior. In particular, we propose to formulate space-time covariance functions based on an equilibrium effective Hamiltonian using the linear response theory. The effective space-time dynamics is then generated by a stochastic perturbation around the equilibrium point of the classical field Hamiltonian leading to an associated Langevin equation. We employ a Hamiltonian which extends the classical Gaussian field theory by including a curvature term and leads to a diffusive Langevin equation. Finally, we derive new forms of space-time covariance functions.
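For a linear Langevin relaxation around the minimum of a curvature-extended Gaussian Hamiltonian of the kind described above, each Fourier mode is an Ornstein-Uhlenbeck process, which fixes the generic space-time spectral density; matching the authors' exact parametrization is not claimed, and D denotes the noise strength:

```latex
S(\mathbf{k},\omega) = \frac{2D}{\omega^{2} + \left(\lambda + \eta_1 k^{2} + \eta_2 k^{4}\right)^{2}},
```

whose inverse Fourier transform in ω gives, per mode, an exponentially damped temporal covariance with relaxation rate λ + η₁k² + η₂k⁴.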
A scaling procedure for the response of an isolated system with high modal overlap factor
NASA Astrophysics Data System (ADS)
De Rosa, S.; Franco, F.
2008-10-01
The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system; it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses so as to increase the frequency band of validity of the discrete or continuous coordinates model, through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper compares ASMA with the statistical energy analysis and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally, it is shown that the linear dynamic response predicted with the scaling procedure has the same quality and characteristics as the statistical energy analysis, but it can be useful when the system cannot be solved appropriately by standard Statistical Energy Analysis (SEA).
NASA Astrophysics Data System (ADS)
Šantić, Neven; Fusaro, Adrien; Salem, Sabeur; Garnier, Josselin; Picozzi, Antonio; Kaiser, Robin
2018-02-01
The nonlinear Schrödinger equation, used to describe the dynamics of quantum fluids, is known to be valid not only for massive particles but also for the propagation of light in a nonlinear medium, predicting condensation of classical waves. Here we report on the initial evolution of random waves with Gaussian statistics using atomic vapors as an efficient two-dimensional nonlinear medium. Experimental and theoretical analysis of near-field images reveals a phenomenon of nonequilibrium precondensation, characterized by a fast relaxation towards a precondensate fraction of up to 75%. Such precondensation is in contrast to complete thermalization to the Rayleigh-Jeans equilibrium distribution, which would require prohibitively long interaction lengths.
NASA Astrophysics Data System (ADS)
Kovalevsky, Louis; Langley, Robin S.; Caro, Stephane
2016-05-01
Due to the high cost of experimental EMI measurements, significant attention has been focused on numerical simulation. Classical methods such as the Method of Moments or Finite Difference Time Domain are not well suited for this type of problem, as they require a fine discretisation of space and fail to take uncertainties into account. In this paper, the authors show that Statistical Energy Analysis is well suited for this type of application. SEA is a statistical approach employed to solve high-frequency problems of electromagnetically reverberant cavities at a reduced computational cost. The key aspects of this approach are (i) to consider an ensemble of systems that share the same gross parameters, and (ii) to avoid solving Maxwell's equations inside the cavity, using the power balance principle. The output is an estimate of the field magnitude distribution in each cavity. The method is applied to a typical aircraft structure.
HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.
Song, Chi; Tseng, George C
2014-01-01
Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculation and simulation show better performance of rOP compared to the classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
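A minimal sketch of the rOP statistic itself, in Python. It relies only on the standard order-statistic fact that the r-th smallest of n independent Uniform(0,1) p-values follows a Beta(r, n - r + 1) distribution; the paper's data-driven estimation of r and its one-sided correction are not reproduced, and the p-value matrix below is synthetic.

    # rOP sketch: for each gene, take the r-th smallest p-value across n
    # studies; under the null (independent uniform p-values) the r-th order
    # statistic follows Beta(r, n - r + 1).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    pvals = rng.uniform(size=(1000, 8))   # 1000 genes x 8 studies (toy data)
    r, n = 6, pvals.shape[1]              # r fixed by hand here; the paper
                                          # proposes data-driven estimation of r

    rop = np.sort(pvals, axis=1)[:, r - 1]          # r-th ordered p-value
    p_combined = stats.beta.cdf(rop, r, n - r + 1)  # meta-analysis p-value
    print(p_combined[:5])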
Perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases.
Mohammadzadeh, Hosein; Adli, Fereshteh; Nouri, Sahereh
2016-12-01
We investigate perturbative thermodynamic geometry of nonextensive ideal classical, Bose, and Fermi gases. We show that the intrinsic statistical interaction of a nonextensive Bose (Fermi) gas is attractive (repulsive), similar to the extensive case, but the value of the thermodynamic curvature is changed by the nonextensive parameter. In contrast to the extensive ideal classical gas, the nonextensive one may be divided into two different regimes. Depending on the deviation parameter of the system from the extensive case, one can find a special value of the fugacity, z*, where the sign of the thermodynamic curvature changes. Therefore, we argue that the nonextensive parameter induces an attractive (repulsive) statistical interaction for z
NASA Astrophysics Data System (ADS)
Verrier, Sébastien; Crépon, Michel; Thiria, Sylvie
2014-09-01
Spectral scaling properties have already been evidenced in oceanic numerical simulations and have been subject to several interpretations. They can be used to evaluate classical turbulence theories that predict scaling with specific exponents, and to evaluate the quality of GCM outputs from a statistical and multiscale point of view. However, a more complete framework based on multifractal cascades is able to generalize the classical but restrictive second-order spectral framework to other moment orders, providing an accurate description of the probability distributions of the fields at multiple scales. The predictions of this formalism still needed systematic verification in oceanic GCMs, while they have been confirmed recently for their atmospheric counterparts by several papers. The present paper is devoted to a systematic analysis of several oceanic fields produced by the NEMO oceanic GCM. Attention is focused on regional, idealized configurations that permit evaluation of the NEMO engine core from a scaling point of view, free of the limitations imposed by land masks. Based on classical multifractal analysis tools, multifractal properties were evidenced for several oceanic state variables (sea surface temperature and salinity, velocity components, etc.). While first-order structure functions estimated a different nonconservativity parameter H in two scaling ranges, the multiorder statistics of turbulent fluxes were scaling over almost the whole available scaling range. This multifractal scaling was then parameterized with the help of the universal multifractal framework, providing parameters that are consistent with the existing empirical literature. Finally, we argue that knowledge of these properties may be useful for oceanographers. The framework seems very well suited for the statistical evaluation of OGCM outputs. Moreover, it also provides practical solutions for simulating subpixel variability stochastically for GCM downscaling purposes. As an independent perspective, the existence of multifractal properties in oceanic flows also seems interesting for investigating scale dependencies in remote sensing inversion algorithms.
ERIC Educational Resources Information Center
Martuza, Victor R.; Engel, John D.
Results from classical power analysis (Brewer, 1972) suggest that a researcher should not set α = p (when p is less than α) in a posteriori fashion when a study yields statistically significant results, because of a resulting decrease in power. The purpose of the present report is to use Bayesian theory in examining the validity of this…
Decomposition of the Inequality of Income Distribution by Income Types—Application for Romania
NASA Astrophysics Data System (ADS)
Andrei, Tudorel; Oancea, Bogdan; Richmond, Peter; Dhesi, Gurjeet; Herteliu, Claudiu
2017-09-01
This paper identifies the salient factors that characterize the inequality of the income distribution for Romania. Data analysis is rigorously carried out using sophisticated techniques borrowed from classical statistics (the Theil index). Decomposition of the inequalities measured by the Theil index is also performed. This study relies on an exhaustive data set (11.1 million records for 2014) for the total personal gross income of Romanian citizens.
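For concreteness, a sketch of the Theil index and its classical between-group/within-group decomposition. The formulas are the standard ones; the data and the grouping variable are synthetic stand-ins, not the Romanian income records.

    # Theil index T = mean((x/mu) * ln(x/mu)) and its exact decomposition
    # into between-group and within-group terms. Data below are made up.
    import numpy as np

    def theil(x):
        x = np.asarray(x, dtype=float)
        m = x.mean()
        return np.mean((x / m) * np.log(x / m))

    def theil_decomposition(x, groups):
        x = np.asarray(x, dtype=float)
        mu, n = x.mean(), len(x)
        between = within = 0.0
        for g in np.unique(groups):
            xg = x[groups == g]
            wg = (len(xg) / n) * xg.mean() / mu   # group income share
            between += wg * np.log(xg.mean() / mu)
            within += wg * theil(xg)
        return between, within                    # between + within == theil(x)

    rng = np.random.default_rng(1)
    income = rng.lognormal(mean=9, sigma=0.7, size=10_000)
    region = rng.integers(0, 4, size=10_000)
    b, w = theil_decomposition(income, region)
    print(theil(income), b + w)                   # should match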
Plastino, A; Rocca, M C
2017-06-01
Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM), the first axiomatic SM theory that successfully explained equilibrium thermodynamics, we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.
Naguib, Ibrahim A; Abdelrahman, Maha M; El Ghobashy, Mohamed R; Ali, Nesma A
2016-01-01
Two accurate, sensitive, and selective stability-indicating methods are developed and validated for the simultaneous quantitative determination of agomelatine (AGM) and its forced degradation products (Deg I and Deg II), whether in pure form or in pharmaceutical formulations. Partial least-squares regression (PLSR) and spectral residual augmented classical least-squares (SRACLS) are two chemometric models compared here through handling UV spectral data in the range 215-350 nm. For proper analysis, a three-factor, four-level experimental design was established, resulting in a training set of 16 mixtures containing different ratios of the interfering species. An independent test set of eight mixtures was used to validate the prediction ability of the suggested models. The results presented indicate the ability of the mentioned multivariate calibration models to analyze AGM, Deg I, and Deg II with high selectivity and accuracy. The analysis results for the pharmaceutical formulations were statistically compared to a reference HPLC method, with no significant differences observed regarding accuracy and precision. The SRACLS model gives results comparable to the PLSR model; however, it keeps the qualitative spectral information of the classical least-squares algorithm for the analyzed components.
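The PLSR side of such a comparison can be sketched with scikit-learn; SRACLS has no standard library implementation and is omitted. The wavelength grid matches the paper's 215-350 nm range, but the component spectra, noise level, and mixture design below are synthetic assumptions.

    # PLSR calibration sketch on synthetic three-component UV spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    wl = np.linspace(215, 350, 271)                 # nm, as in the paper
    def band(center, width):                        # toy Gaussian band
        return np.exp(-0.5 * ((wl - center) / width) ** 2)

    pure = np.vstack([band(240, 12), band(275, 15), band(310, 18)])
    C_train = rng.uniform(0.1, 1.0, size=(16, 3))   # 16 training mixtures
    X_train = C_train @ pure + rng.normal(0, 1e-3, (16, wl.size))

    pls = PLSRegression(n_components=3)
    pls.fit(X_train, C_train)

    C_test = rng.uniform(0.1, 1.0, size=(8, 3))     # 8 validation mixtures
    X_test = C_test @ pure + rng.normal(0, 1e-3, (8, wl.size))
    rmsep = np.sqrt(np.mean((pls.predict(X_test) - C_test) ** 2, axis=0))
    print(rmsep)   # per-component prediction error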
EHME: a new word database for research in Basque language.
Acha, Joana; Laka, Itziar; Landa, Josu; Salaburu, Pello
2014-11-14
This article presents EHME, the frequency dictionary of Basque structure, an online program that enables researchers in psycholinguistics to extract word and nonword stimuli, based on a broad range of statistics concerning the properties of Basque words. The database consists of 22.7 million tokens, and available properties include morphological structure frequency and word-similarity measures, apart from classical indexes: word frequency, orthographic structure, orthographic similarity, bigram and biphone frequency, and syllable-based measures. Measures are indexed at the lemma, morpheme, and word level. We include reliability and validation analyses. The application is freely available, and enables the user to extract words based on concrete statistical criteria, as well as to obtain statistical characteristics from a list of words.
Williamson, N B
1975-03-01
This paper reports a decrease in the interval from calving to conception in two commercial dairy herds, associated with the use of KaMaR Heat Mount Detectors. An economic analysis of the results uses a neoclassical decision theory approach to demonstrate that the use of heat mount detectors is likely to be profitable, with an expected net return of $154.18 per 100 calvings. The analysis demonstrates the suitability of a decision-theoretic approach to the analysis of applied research, and illustrates some of the weaknesses of "Classical" statistical analysis in such circumstances.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U.; Wenning, Thomas J.; Guo, Wei
In the United States, manufacturing facilities accounted for about 32% of total domestic energy consumption in 2014. Robust energy tracking methodologies are critical to understanding energy performance in manufacturing facilities. Due to its simplicity and intuitiveness, the classic energy intensity method (i.e., the ratio of total energy use to total production) is the most widely adopted. However, the classic energy intensity method does not take into account the variation of other relevant parameters (product type, feedstock type, weather, etc.). Furthermore, the energy intensity method assumes that a facility's base energy consumption (energy use at zero production) is zero, which rarely holds true. Therefore, it is commonly recommended to utilize regression models rather than the energy intensity approach for tracking improvements at the facility level. Unfortunately, many energy managers have difficulty understanding why regression models are statistically better than the classic energy intensity method. While anecdotes and qualitative information may convince some, many have major reservations about the accuracy of regression models and whether it is worth the time and effort to gather data and build quality regression models. This paper explains why regression models are theoretically and quantitatively more accurate for tracking energy performance improvements. Based on the analysis of data from 114 manufacturing plants over 12 years, this paper presents quantitative results on the importance of utilizing regression models over the energy intensity methodology. It also documents scenarios where regression models do not have significant advantages over the energy intensity method.
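The statistical point is easy to demonstrate: when base (zero-production) energy use is nonzero, the intensity ratio is a misspecified model, while a regression with an intercept is not. A toy Python comparison, with made-up monthly data:

    # Why regression beats the intensity ratio when base load is nonzero:
    # toy monthly data with energy = base + slope * production + noise.
    import numpy as np

    rng = np.random.default_rng(3)
    production = rng.uniform(500, 1500, size=36)               # units/month (toy)
    energy = 2000 + 3.0 * production + rng.normal(0, 50, 36)   # base load 2000

    # Classic intensity method assumes energy = intensity * production
    intensity = energy.sum() / production.sum()
    pred_intensity = intensity * production

    # Regression model allows a nonzero intercept (base energy use)
    slope, intercept = np.polyfit(production, energy, 1)
    pred_regression = intercept + slope * production

    for name, pred in [("intensity", pred_intensity), ("regression", pred_regression)]:
        rmse = np.sqrt(np.mean((energy - pred) ** 2))
        print(name, round(rmse, 1))
    # The intensity model over-predicts at high production and under-predicts
    # at low production whenever the true intercept is nonzero.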
Influence of complaints and singing style in singers voice handicap.
Moreti, Felipe; Ávila, Maria Emília Barros de; Rocha, Clara; Borrego, Maria Cristina de Menezes; Oliveira, Gisele; Behlau, Mara
2012-01-01
The aim of this research was to verify whether differences in singing style and the presence of vocal complaints influence the perception of voice handicap in singers. One hundred eighteen singing voice handicap self-assessment protocols were selected: 17 popular singers with vocal complaints, 42 popular singers without complaints, 17 classical singers with complaints, and 42 classical singers without complaints. The groups were similar regarding age, gender, and voice types. Both protocols used, the Modern Singing Handicap Index (MSHI) and the Classical Singing Handicap Index (CSHI), have questions specific to their respective singing styles and consist of 30 items equally divided into three subscales: disability (functional domain), handicap (emotional domain), and impairment (organic domain), answered according to the frequency of occurrence. Each subscale has a maximum of 40 points, and the total score is 120 points. The higher the score, the higher the perceived singing voice handicap. For statistical analysis, we used ANOVA with a 5% significance level. Classical and popular singers reported higher impairment, followed by disability and handicap. However, the degree of this perception varied according to singing style and the presence of vocal complaints. Classical singers with vocal complaints showed a higher voice handicap than popular singers with vocal complaints, while classical singers without complaints reported a lower handicap than popular singers without complaints. This suggests that classical singers have a higher perception of their own voice, and that vocal disturbances in this group may cause a greater voice handicap when compared to popular singers.
New robust statistical procedures for the polytomous logistic regression models.
Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro
2018-05-17
This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real-life examples are presented to justify the requirement of suitable robust statistical procedures in place of likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.
Microgravity experiments on vibrated granular gases in a dilute regime: non-classical statistics
NASA Astrophysics Data System (ADS)
Leconte, M.; Garrabos, Y.; Falcon, E.; Lecoutre-Chabot, C.; Palencia, F.; Évesque, P.; Beysens, D.
2006-07-01
We report on an experimental study of a dilute gas of steel spheres colliding inelastically and excited by a piston performing sinusoidal vibration, in low gravity. Using an improved experimental apparatus, here we present some results concerning the collision statistics of particles on a wall of the container. We also propose a simple model where the non-classical statistics obtained from our data are attributed to the boundary condition playing the role of a 'velostat' instead of a thermostat. The significant differences from the kinetic theory of ordinary gases are related to the inelasticity of collisions.
NASA Astrophysics Data System (ADS)
Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.
2018-01-01
We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.
Quantum Mechanics From the Cradle?
ERIC Educational Resources Information Center
Martin, John L.
1974-01-01
States that the major problem in learning quantum mechanics is often the student's ignorance of classical mechanics and that one conceptual hurdle in quantum mechanics is its statistical nature, in contrast to the determinism of classical mechanics. (MLH)
Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte; Astrup, Thomas Fruergaard
2017-11-01
Data for fractional solid waste composition provide relative magnitudes of individual waste fractions, the percentages of which always sum to 100, thereby connecting them intrinsically. Due to this sum constraint, waste composition data represent closed data, and their interpretation and analysis require statistical methods other than classical statistics, which are suitable only for unconstrained data such as absolute values. However, the closed characteristics of waste composition data are often ignored when analysed. The results of this study showed, for example, that unavoidable animal-derived food waste amounted to 2.21±3.12% with a confidence interval of (-4.03; 8.45), which highlights the problem of biased negative proportions. A Pearson's correlation test, applied to waste fraction generation (kg mass), indicated a positive correlation between avoidable vegetable food waste and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing the closed characteristics of these data, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing means, standard deviations, and correlation coefficients. Copyright © 2017 Elsevier Ltd. All rights reserved.
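One standard remedy for such closed data is a log-ratio transform, for example the centered log-ratio (clr) of Aitchison-style compositional data analysis. A minimal sketch on made-up fraction data; the specific transform choice is an illustration, not necessarily the one used in the study.

    # Centered log-ratio (clr) transform for closed (sum-constrained)
    # compositional data, applied before classical statistics.
    import numpy as np

    def clr(parts):
        """clr(x) = ln(x / geometric_mean(x)), row-wise; requires x > 0."""
        parts = np.asarray(parts, dtype=float)
        logx = np.log(parts)
        return logx - logx.mean(axis=1, keepdims=True)

    rng = np.random.default_rng(4)
    raw = rng.dirichlet(alpha=[4, 2, 1, 3], size=200)   # 200 samples x 4 fractions
    z = clr(raw)

    # Correlations computed on clr coordinates avoid the spurious negative
    # bias forced on raw percentages by the 100% sum constraint.
    print(np.corrcoef(raw, rowvar=False).round(2))
    print(np.corrcoef(z, rowvar=False).round(2))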
Quantum-mechanical analysis of low-gain free-electron laser oscillators
NASA Astrophysics Data System (ADS)
Fares, H.; Yamada, M.; Chiadroni, E.; Ferrario, M.
2018-05-01
In the previous classical theory of low-gain free-electron laser (FEL) oscillators, the electron is described as a point-like particle, a delta function in position space. On the other hand, in previous quantum treatments, the electron is described as a plane wave with a single momentum state, a delta function in momentum space. In reality, an electron must have statistical uncertainties in both the position and momentum domains; it is neither a point-like charge nor a plane wave of a single momentum. In this paper, we rephrase the theory of the low-gain FEL with the interacting electron represented quantum mechanically by a plane wave with a finite spreading length (i.e., a wave packet). Using the concepts of the transformation of reference frames and statistical quantum mechanics, an expression for the single-pass radiation gain is derived. The spectral broadening of the radiation is expressed in terms of the spreading length of an electron, the relaxation time characterizing the energy spread of electrons, and the interaction time. We compare our results with those obtained in the known classical analyses and find good agreement. While this correspondence is demonstrated, novel insights into the electron dynamics and the interaction mechanism are also presented.
NASA Astrophysics Data System (ADS)
Descartes, R.; Rota, G.-C.; Euler, L.; Bernoulli, J. D.; Siegel, Edward Carl-Ludwig
2011-03-01
Quantum-statistics dichotomy: Fermi-Dirac (FDQS) versus Bose-Einstein (BEQS) quantum statistics, with contact repulsion/non-condensation versus attraction/condensation (BEC) respectively, are manifestly demonstrated by Taylor expansion of only their denominator exponential. Both are identified as Descartes analytic-geometry conic sections: FDQS as an ellipse (homotopic to the rectangular FDQS distribution function), passing via Maxwell-Boltzmann classical statistics (MBCS) as a parabola morphism, versus BEQS as a hyperbola (Archimedes' hyperbolicity inevitability). They are likewise identified through generating functions [Abramowitz-Stegun, Handbook of Mathematical Functions, p. 804], respectively of Euler numbers/functions via the Riemann zeta-function (which dominates quantum statistics [Pathria, Statistical Mechanics; Huang, Statistical Mechanics]) versus Bernoulli numbers/functions. Much can be learned about statistical physics from Euler numbers/functions via the Riemann zeta-function versus Bernoulli numbers/functions [Conway-Guy, The Book of Numbers], and vice versa; for example, a physics proof of the Riemann hypothesis partly as BEQS Bose-Einstein condensation/attraction.
A quantitative approach to evolution of music and philosophy
NASA Astrophysics Data System (ADS)
Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira, Osvaldo N., Jr.; da Fontoura Costa, Luciano
2012-08-01
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
Entropy in sound and vibration: towards a new paradigm.
Le Bot, A
2017-01-01
This paper describes a discussion on the method and the status of a statistical theory of sound and vibration, called statistical energy analysis (SEA). SEA is a simple theory of sound and vibration in elastic structures that applies when the vibrational energy is diffusely distributed. We show that SEA is a thermodynamical theory of sound and vibration, based on a law of exchange of energy analogous to the Clausius principle. We further investigate the notion of entropy in this context and discuss its meaning. We show that entropy is a measure of the information lost in the passage from the classical theory of sound and vibration to SEA, its thermodynamical counterpart.
Chae, K-H; Biggs, A D; Blandford, R D; Browne, I W A; De Bruyn, A G; Fassnacht, C D; Helbig, P; Jackson, N J; King, L J; Koopmans, L V E; Mao, S; Marlow, D R; McKean, J P; Myers, S T; Norbury, M; Pearson, T J; Phillips, P M; Readhead, A C S; Rusin, D; Sykes, C M; Wilkinson, P N; Xanthopoulos, E; York, T
2002-10-07
We derive constraints on cosmological parameters and the properties of the lensing galaxies from gravitational lens statistics based on the final Cosmic Lens All Sky Survey data. For a flat universe with a classical cosmological constant, we find that the present matter fraction of the critical density is Omega_m = 0.31 +0.27/-0.14 (68%) +0.12/-0.10 (syst). For a flat universe with a constant equation of state for dark energy, w = p_x (pressure) / rho_x (energy density), we find w < -0.55 +0.18/-0.11 (68%).
Statistical Thermodynamics and Microscale Thermophysics
NASA Astrophysics Data System (ADS)
Carey, Van P.
1999-08-01
Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.
A heuristic statistical stopping rule for iterative reconstruction in emission tomography.
Ben Bouallègue, F; Crouzet, J F; Mariano-Goulart, D
2013-01-01
We propose a statistical stopping criterion for iterative reconstruction in emission tomography based on a heuristic statistical description of the reconstruction process. The method was assessed for MLEM reconstruction. Based on Monte Carlo numerical simulations and using a perfectly modeled system matrix, our method was compared with classical iterative reconstruction followed by low-pass filtering in terms of Euclidean distance to the exact object, noise, and resolution. The stopping criterion was then evaluated with realistic PET data of a Hoffman brain phantom produced using the GATE platform for different count levels. The numerical experiments showed that, compared with the classical method, our technique yielded a significant improvement of the noise-resolution tradeoff for a wide range of counting statistics compatible with routine clinical settings. When working with realistic data, the stopping rule allowed a qualitatively and quantitatively efficient determination of the optimal image. Our method appears to give a reliable estimation of the optimal stopping point for iterative reconstruction. It should thus be of practical interest as it produces images with similar or better quality than classical post-filtered iterative reconstruction with a mastered computation time.
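For orientation, the sketch below shows the standard MLEM update together with a generic statistical stopping heuristic (stop when the Poisson chi-square per detector bin drops to about one). The heuristic is a simplified stand-in for the paper's criterion, and the system matrix and data are toys.

    # MLEM iteration with a generic statistical stopping heuristic.
    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.uniform(0, 1, size=(200, 64))    # toy system matrix (bins x pixels)
    x_true = rng.uniform(0.5, 2.0, size=64)
    y = rng.poisson(A @ x_true)              # photon-limited measurements

    x = np.ones(64)                          # flat initial image
    sens = A.sum(axis=0)                     # sensitivity image A^T 1
    for it in range(500):
        proj = A @ x
        x *= (A.T @ (y / np.maximum(proj, 1e-12))) / sens   # MLEM step
        chi2 = np.sum((y - proj) ** 2 / np.maximum(proj, 1e-12)) / y.size
        if chi2 <= 1.0:                      # heuristic stopping point
            print("stopped at iteration", it, "chi2/bin =", round(chi2, 3))
            break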
Information transport in classical statistical systems
NASA Astrophysics Data System (ADS)
Wetterich, C.
2018-02-01
For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.
Multi-Scale Surface Descriptors
Cipriano, Gregory; Phillips, George N.; Gleicher, Michael
2010-01-01
Local shape descriptors compactly characterize regions of a surface, and have been applied to tasks in visualization, shape matching, and analysis. Classically, curvature has been used as a shape descriptor; however, this differential property characterizes only an infinitesimal neighborhood. In this paper, we provide shape descriptors for surface meshes designed to be multi-scale, that is, capable of characterizing regions of varying size. These descriptors capture statistically the shape of a neighborhood around a central point by fitting a quadratic surface. They therefore mimic differential curvature, are efficient to compute, and encode anisotropy. We show how simple variants of mesh operations can be used to compute the descriptors without resorting to expensive parameterizations, and additionally provide a statistical approximation for reduced computational cost. We show how these descriptors apply to a number of uses in visualization, analysis, and matching of surfaces, particularly to tasks in protein surface analysis. PMID:19834190
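The descriptor computation reduces to a linear least-squares quadric fit. A sketch under the assumption that the neighborhood is already expressed in a local frame where the patch is a graph z = f(x, y) centered at the origin; the Hessian eigenvalues of the fit then serve as curvature-like, anisotropy-encoding descriptors.

    # Least-squares quadric fit to a surface neighborhood in a local frame.
    import numpy as np

    def quadric_descriptor(pts):
        x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
        # Fit z ~ a x^2 + b xy + c y^2 + d x + e y + f
        M = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
        (a, b, c, d, e, f), *_ = np.linalg.lstsq(M, z, rcond=None)
        H = np.array([[2 * a, b], [b, 2 * c]])     # Hessian at the origin
        k1, k2 = np.linalg.eigvalsh(H)             # principal-curvature proxies
        return k1, k2                              # k1 != k2 encodes anisotropy

    rng = np.random.default_rng(6)
    xy = rng.uniform(-1, 1, size=(400, 2))
    z = 0.3 * xy[:, 0] ** 2 - 0.1 * xy[:, 1] ** 2 + rng.normal(0, 0.01, 400)
    print(quadric_descriptor(np.column_stack([xy, z])))   # ~ (-0.2, 0.6)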
Optical nonclassicality test based on third-order intensity correlations
NASA Astrophysics Data System (ADS)
Rigovacca, L.; Kolthammer, W. S.; Di Franco, C.; Kim, M. S.
2018-03-01
We develop a nonclassicality criterion for the interference of three delayed, but otherwise identical, light fields in a three-mode Bell interferometer. We do so by comparing the predictions of quantum mechanics with those of a classical framework in which independent sources emit electric fields with random phases. In particular, we evaluate third-order correlations among output intensities as a function of the delays, and show how the presence of a correlation revival for small delays cannot be explained by the classical model of light. The observation of a revival is thus a nonclassicality signature, which can be achieved only by sources with photon-number statistics that are highly sub-Poissonian. Our analysis provides strong evidence for the nonclassicality of the experiment discussed by Menssen et al. [Phys. Rev. Lett. 118, 153603 (2017), 10.1103/PhysRevLett.118.153603], and shows how a collective "triad" phase affects the interference of any three or more light fields, irrespective of their quantum or classical character.
Quantum theory of multiscale coarse-graining.
Han, Yining; Jin, Jaehyeok; Wagner, Jacob W; Voth, Gregory A
2018-03-14
Coarse-grained (CG) models serve as a powerful tool to simulate molecular systems at much longer temporal and spatial scales. Previously, CG models and methods have been built upon classical statistical mechanics. The present paper develops a theory and numerical methodology for coarse-graining in quantum statistical mechanics, by generalizing the multiscale coarse-graining (MS-CG) method to quantum Boltzmann statistics. A rigorous derivation of the sufficient thermodynamic consistency condition is first presented via imaginary time Feynman path integrals. It identifies the optimal choice of CG action functional and effective quantum CG (qCG) force field to generate a quantum MS-CG (qMS-CG) description of the equilibrium system that is consistent with the quantum fine-grained model projected onto the CG variables. A variational principle then provides a class of algorithms for optimally approximating the qMS-CG force fields. Specifically, a variational method based on force matching, which was also adopted in the classical MS-CG theory, is generalized to quantum Boltzmann statistics. The qMS-CG numerical algorithms and practical issues in implementing this variational minimization procedure are also discussed. Then, two numerical examples are presented to demonstrate the method. Finally, as an alternative strategy, a quasi-classical approximation for the thermal density matrix expressed in the CG variables is derived. This approach provides an interesting physical picture for coarse-graining in quantum Boltzmann statistical mechanics in which the consistency with the quantum particle delocalization is obviously manifest, and it opens up an avenue for using path integral centroid-based effective classical force fields in a coarse-graining methodology.
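The variational force-matching step, in its simplest classical form, is a linear least-squares problem. A one-dimensional toy sketch: the basis and target below are illustrative assumptions (the MS-CG method typically uses spline bases, and the qMS-CG version would match quantum Boltzmann averages instead).

    # Force matching as linear least squares: expand the CG force in a basis
    # and fit the coefficients against mapped reference forces.
    import numpy as np

    rng = np.random.default_rng(7)
    r = rng.uniform(0.8, 2.5, size=2000)             # CG pair distances
    f_ref = 48 * r**-13 - 24 * r**-7 + rng.normal(0, 0.2, r.size)  # mapped forces

    # Basis: powers of 1/r (an illustrative choice, not the MS-CG splines)
    B = np.column_stack([r**-13, r**-7, r**-2])
    coef, *_ = np.linalg.lstsq(B, f_ref, rcond=None)
    print(coef)   # recovers ~ (48, -24, 0) for the Lennard-Jones-like target

    f_model = B @ coef
    print("residual force RMS:", np.sqrt(np.mean((f_model - f_ref) ** 2)))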
Wang, Yinqing; Cai, Ranze; Wang, Rui; Wang, Chunhua; Chen, Chunmei
2018-06-01
This is a retrospective study. The aim of this study was to illustrate the survival outcomes of patients with classic ependymoma (CE) and identify potential prognostic factors. CE is the most common category of spinal ependymomas, but few published studies have discussed predictors of survival outcome. A Boolean search of the PubMed, Embase, and OVID databases was conducted by 2 investigators independently. The subjects were intramedullary grade II ependymomas according to the 2007 WHO classification. Univariate Kaplan-Meier analysis and log-rank tests were performed to identify variables associated with progression-free survival (PFS) or overall survival (OS). Multivariate Cox regression was performed to assess hazard ratios (HRs) with 95% confidence intervals (95% CIs). Statistical analysis was performed with SPSS version 23.0 (IBM Corp.), with statistical significance defined as P < .05. A total of 35 studies were identified, including 169 cases of CE. The mean follow-up time across cases was 64.2 ± 51.5 months. Univariate analysis showed that patients who had undergone total resection (TR) had better PFS and OS than those with subtotal resection (STR) and biopsy (P = .002 and P = .004, respectively). In univariate and multivariate analyses (P = .000 and P = .07, respectively), histological type was an independent prognostic factor for PFS of CE [papillary type: HR 0.002, 95% CI (0.000-0.073), P = .001; tanycytic type: HR 0.010, 95% CI (0.000-0.218), P = .003]. This is the first integrative analysis of CE to elucidate the correlation between various factors and prognostic outcomes. Definite histological typing and safe TR are the foundation of CE management. Level of evidence: 4.
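The survival workflow reported here (Kaplan-Meier estimates, log-rank comparison, multivariate Cox regression) can be reproduced with the lifelines Python package. A sketch on a fabricated data frame; all column names and values are hypothetical.

    # Kaplan-Meier, log-rank, and Cox regression with lifelines on toy data.
    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(8)
    df = pd.DataFrame({
        "months": rng.exponential(60, 169).round(1),   # follow-up time
        "progressed": rng.integers(0, 2, 169),         # event indicator
        "total_resection": rng.integers(0, 2, 169),
    })

    kmf = KaplanMeierFitter()
    tr = df["total_resection"] == 1
    kmf.fit(df.loc[tr, "months"], df.loc[tr, "progressed"], label="TR")

    res = logrank_test(df.loc[tr, "months"], df.loc[~tr, "months"],
                       df.loc[tr, "progressed"], df.loc[~tr, "progressed"])
    print(res.p_value)

    cph = CoxPHFitter()
    cph.fit(df, duration_col="months", event_col="progressed")
    print(cph.summary[["exp(coef)", "p"]])    # hazard ratios and p-values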
Thermodynamics and statistical mechanics. [thermodynamic properties of gases
NASA Technical Reports Server (NTRS)
1976-01-01
The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.
An astronomer's guide to period searching
NASA Astrophysics Data System (ADS)
Schwarzenberg-Czerny, A.
2003-03-01
We concentrate on the analysis of unevenly sampled time series, interrupted by periodic gaps, as often encountered in astronomy. While some of our conclusions may appear surprising, all are based on the classical statistical principles of Fisher and his successors. Except for the discussion of resolution issues, it is best for the reader to forget temporarily about Fourier transforms and to concentrate on the problem of fitting a time series with a model curve. According to their statistical content we divide the issues into several sections, consisting of: (ii) statistical and numerical aspects of model fitting, (iii) evaluation of fitted models as hypothesis testing, (iv) the role of orthogonal models in signal detection, (v) conditions for equivalence of periodograms, and (vi) rating sensitivity by test power. An experienced observer working with individual objects would benefit little from a formalized statistical approach. However, we demonstrate the usefulness of this approach in evaluating the performance of periodograms and in the quantitative design of large variability surveys.
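In this model-fitting spirit, the workhorse for unevenly sampled astronomical time series is the Lomb-Scargle periodogram, which is equivalent to least-squares fitting of a sinusoid at each trial frequency. A sketch with astropy on toy data:

    # Lomb-Scargle period search on an unevenly sampled toy signal.
    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(9)
    t = np.sort(rng.uniform(0, 100, 300))        # uneven time sampling
    y = 0.8 * np.sin(2 * np.pi * t / 7.3) + rng.normal(0, 0.3, t.size)

    frequency, power = LombScargle(t, y).autopower()
    best_period = 1 / frequency[np.argmax(power)]
    print(round(best_period, 2))                 # ~7.3

    # False-alarm probability: the hypothesis-testing view of detection
    fap = LombScargle(t, y).false_alarm_probability(power.max())
    print(fap)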
Quick Overview Scout 2008 Version 1.0
The Scout 2008 version 1.0 statistical software package has been updated from past DOS and Windows versions to provide classical and robust univariate and multivariate graphical and statistical methods that are not typically available in commercial or freeware statistical software.
ERIC Educational Resources Information Center
Falcon, Evelyn
2017-01-01
The purpose of this study was to examine whether background classical music played in a 7th and 8th grade classroom setting has any effect on reading comprehension. This study also examined whether there was a statistically significant difference in test anxiety when listening to classical music while completing a test. Reading…
Algorithms for tensor network renormalization
NASA Astrophysics Data System (ADS)
Evenbly, G.
2017-01-01
We discuss in detail algorithms for implementing tensor network renormalization (TNR) for the study of classical statistical and quantum many-body systems. First, we recall established techniques for how the partition function of a 2D classical many-body system or the Euclidean path integral of a 1D quantum system can be represented as a network of tensors, before describing how TNR can be implemented to efficiently contract the network via a sequence of coarse-graining transformations. The efficacy of the TNR approach is then benchmarked for the 2D classical statistical and 1D quantum Ising models; in particular the ability of TNR to maintain a high level of accuracy over sustained coarse-graining transformations, even at a critical point, is demonstrated.
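The first step described, writing the 2D classical Ising partition function as a tensor network, can be sketched directly. The construction below uses a symmetric square root of the bond Boltzmann matrix and checks the resulting network against brute-force enumeration on a 2x2 periodic lattice; the TNR coarse-graining itself is not reproduced.

    # Encode the 2D classical Ising partition function as a tensor network
    # and verify it on a tiny 2x2 torus by brute force.
    import numpy as np
    from itertools import product

    beta, J = 0.4, 1.0
    M = np.array([[np.exp(beta * J), np.exp(-beta * J)],
                  [np.exp(-beta * J), np.exp(beta * J)]])
    lam, U = np.linalg.eigh(M)
    W = U @ np.diag(np.sqrt(lam)) @ U.T        # M = W @ W, W symmetric

    # Vertex tensor: T[i,j,k,l] = sum_s W[s,i] W[s,j] W[s,k] W[s,l]
    T = np.einsum("si,sj,sk,sl->ijkl", W, W, W, W)

    # Contract the 2x2 torus network (each pair of neighbors shares 2 bonds)
    Z_net = np.einsum("aibj,icjd,ckdl,kalb->", T, T, T, T)

    # Brute force over the 2^4 spin configurations of the 2x2 torus
    Z_brute = 0.0
    for conf in product([1, -1], repeat=4):
        s = np.array(conf).reshape(2, 2)
        E = -J * sum(s[i, j] * (s[(i + 1) % 2, j] + s[i, (j + 1) % 2])
                     for i in range(2) for j in range(2))
        Z_brute += np.exp(-beta * E)
    print(Z_net, Z_brute)   # should agree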
Pre-attentive auditory discrimination skill in Indian classical vocal musicians and non-musicians.
Sanju, Himanshu Kumar; Kumar, Prawin
2016-09-01
To test for pre-attentive auditory discrimination skills in Indian classical vocal musicians and non-musicians. Mismatch negativity (MMN) was recorded to test for pre-attentive auditory discrimination skills with a stimulus pair of /1000 Hz/ and /1100 Hz/, with /1000 Hz/ as the frequent stimulus and /1100 Hz/ as the infrequent stimulus. Onset, offset, and peak latencies were the latency parameters considered, whereas peak amplitude and area under the curve were considered for amplitude analysis. Fifty participants were included in the study: the experimental group comprised 25 adult Indian classical vocal musicians, and 25 age-matched non-musicians served as the control group. Experimental group participants had at least 10 years of professional experience in Indian classical vocal music, whereas control group participants did not have any formal training in music. Descriptive statistics showed better waveform morphology in the experimental group as compared to the control group. MANOVA showed significantly better onset latency, peak amplitude, and area under the curve in the experimental group, but no significant difference in the offset and peak latencies between the two groups. The present study probably points towards an enhancement of pre-attentive auditory discrimination skills in Indian classical vocal musicians compared to non-musicians. It indicates that Indian classical musical training enhances pre-attentive auditory discrimination skills in musicians, leading to higher peak amplitude and a greater area under the curve compared to non-musicians.
Rescaled earthquake recurrence time statistics: application to microrepeaters
NASA Astrophysics Data System (ADS)
Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru
2009-01-01
Slip on major faults primarily occurs during 'characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the shortness of the sequences of characteristic earthquakes that are available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA as well as NE Japan. We find that, once the respective sequence can be considered to be of sufficient stationarity, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
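A simplified sketch of the rescaled-combination idea: normalize each short sequence (here by its mean only, a simplification of the paper's mean-and-standard-deviation rescaling), pool the rescaled samples, and fit the candidate distributions. The sequences are synthetic Weibull draws, not earthquake catalogs.

    # Rescale short recurrence-time sequences, pool them, fit distributions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(10)
    # Three short "sequences" with different characteristic recurrence times
    sequences = [stats.weibull_min.rvs(1.8, scale=s, size=n, random_state=rng)
                 for s, n in [(30, 12), (90, 9), (200, 15)]]

    pooled = np.concatenate([seq / seq.mean() for seq in sequences])

    shape, loc, scale = stats.weibull_min.fit(pooled, floc=0)
    print("Weibull shape:", round(shape, 2))    # ~1.8, recovered from pooled data

    s_ln, loc_ln, scale_ln = stats.lognorm.fit(pooled, floc=0)
    # Compare fits, e.g., with the Kolmogorov-Smirnov statistic
    print(stats.kstest(pooled, "weibull_min", args=(shape, loc, scale)).statistic)
    print(stats.kstest(pooled, "lognorm", args=(s_ln, loc_ln, scale_ln)).statistic)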
Introduction to bioinformatics.
Can, Tolga
2014-01-01
Bioinformatics is an interdisciplinary field mainly involving molecular biology and genetics, computer science, mathematics, and statistics. Data intensive, large-scale biological problems are addressed from a computational point of view. The most common problems are modeling biological processes at the molecular level and making inferences from collected data. A bioinformatics solution usually involves the following steps: Collect statistics from biological data. Build a computational model. Solve a computational modeling problem. Test and evaluate a computational algorithm. This chapter gives a brief introduction to bioinformatics by first providing an introduction to biological terminology and then discussing some classical bioinformatics problems organized by the types of data sources. Sequence analysis is the analysis of DNA and protein sequences for clues regarding function and includes subproblems such as identification of homologs, multiple sequence alignment, searching sequence patterns, and evolutionary analyses. Protein structures are three-dimensional data and the associated problems are structure prediction (secondary and tertiary), analysis of protein structures for clues regarding function, and structural alignment. Gene expression data is usually represented as matrices and analysis of microarray data mostly involves statistics analysis, classification, and clustering approaches. Biological networks such as gene regulatory networks, metabolic pathways, and protein-protein interaction networks are usually modeled as graphs and graph theoretic approaches are used to solve associated problems such as construction and analysis of large-scale networks.
Continuous quantum measurement and the quantum to classical transition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Tanmoy; Habib, Salman; Jacobs, Kurt
2003-04-01
While ultimately they are described by quantum mechanics, macroscopic mechanical systems are nevertheless observed to follow the trajectories predicted by classical mechanics. Hence, in the regime defining macroscopic physics, the trajectories of the correct classical motion must emerge from quantum mechanics, a process referred to as the quantum to classical transition. Extending previous work [Bhattacharya, Habib, and Jacobs, Phys. Rev. Lett. 85, 4852 (2000)], here we elucidate this transition in some detail, showing that once the measurement processes that affect all macroscopic systems are taken into account, quantum mechanics indeed predicts the emergence of classical motion. We derive inequalities that describe the parameter regime in which classical motion is obtained, and provide numerical examples. We also demonstrate two further important properties of the classical limit: first, that multiple observers all agree on the motion of an object, and second, that classical statistical inference may be used to correctly track the classical motion.
ERIC Educational Resources Information Center
Hester, Yvette
Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least…
Scout 2008 Version 1.0 User Guide
The Scout 2008 version 1.0 software package provides a wide variety of classical and robust statistical methods that are not typically available in other commercial software packages. A major part of Scout deals with classical, robust, and resistant univariate and multivariate outlier methods…
Machine learning of frustrated classical spin models. I. Principal component analysis
NASA Astrophysics Data System (ADS)
Wang, Ce; Zhai, Hui
2017-10-01
This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by the classical Monte Carlo simulation for the XY model in frustrated triangular and union jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of different orders in different phases, and the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
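The analysis pipeline is easy to sketch end to end: Monte Carlo sampling of spin configurations, stacking them as feature vectors, and PCA. For brevity the toy below uses the ferromagnetic square-lattice Ising model rather than the frustrated XY models of the paper; the first principal component then tracks the magnetization order parameter.

    # PCA on raw Monte Carlo spin configurations across temperatures.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(11)

    def metropolis_ising(L, T, sweeps=200):
        s = rng.choice([-1, 1], size=(L, L))
        for _ in range(sweeps * L * L):
            i, j = rng.integers(L, size=2)
            dE = 2 * s[i, j] * (s[(i+1) % L, j] + s[(i-1) % L, j]
                                + s[i, (j+1) % L] + s[i, (j-1) % L])
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] = -s[i, j]
        return s

    samples, temps = [], []
    for T in [1.5, 2.0, 2.27, 2.6, 3.2]:      # around Tc ~ 2.269
        for _ in range(10):
            samples.append(metropolis_ising(10, T).ravel())
            temps.append(T)

    X = np.array(samples, dtype=float)
    proj = PCA(n_components=2).fit_transform(X)
    # |PC1| is large in the ordered phase and collapses above Tc.
    for T in sorted(set(temps)):
        mask = np.array(temps) == T
        print(T, round(np.abs(proj[mask, 0]).mean(), 2))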
Liu, Lu; Wei, Jianrong; Zhang, Huishu; Xin, Jianhong; Huang, Jiping
2013-01-01
Because classical music has greatly affected our life and culture in its long history, it has attracted extensive attention from researchers seeking to understand the laws behind it. Based on statistical physics, here we use a different method to investigate classical music, namely, by analyzing cumulative distribution functions (CDFs) and autocorrelation functions of pitch fluctuations in compositions. We analyze 1,876 compositions by five representative classical music composers across 164 years, from Bach, to Mozart, to Beethoven, to Mendelssohn, and to Chopin. We report that the biggest pitch fluctuations of a composer gradually increase as time evolves from Bach's time to Mendelssohn's and Chopin's time. In particular, for the compositions of a composer, the positive and negative tails of a CDF of pitch fluctuations are distributed not only as power laws (with the scale-free property), but also symmetrically (namely, the probability of a treble following a bass and that of a bass following a treble are basically the same for each composer). The power-law exponent decreases as time elapses. Further, we also calculate the autocorrelation function of the pitch fluctuations. The autocorrelation function shows a power-law distribution for each composer. Notably, the power-law exponents vary with the composers, indicating their different levels of long-range correlation of notes. This work not only suggests a way to understand and develop music from the viewpoint of statistical physics, but also enriches the realm of traditional statistical physics by analyzing music.
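A sketch of the two measurements described, the tail exponent of the fluctuation CDF and the autocorrelation function, on a toy pitch series (a heavy-tailed random walk standing in for a MIDI-derived note sequence):

    # Tail exponent and autocorrelation of pitch fluctuations on toy data.
    import numpy as np

    rng = np.random.default_rng(12)
    pitch = np.cumsum(rng.standard_t(df=3, size=5000)).round()   # toy pitch series
    fluct = np.diff(pitch)                                       # pitch fluctuations

    # Empirical CCDF of positive fluctuations; a power-law tail is a straight
    # line in log-log coordinates, with slope -alpha
    pos = np.sort(fluct[fluct > 0])
    ccdf = 1.0 - np.arange(pos.size) / pos.size
    tail = pos.size // 2
    alpha = -np.polyfit(np.log(pos[tail:]), np.log(ccdf[tail:]), 1)[0]
    print("tail exponent ~", round(alpha, 2))

    # Autocorrelation function of the fluctuation series
    f = fluct - fluct.mean()
    acf = np.correlate(f, f, mode="full")[f.size - 1:] / (f.var() * f.size)
    print(acf[:5])   # decay profile indicates the degree of long-range correlation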
Multiscale hidden Markov models for photon-limited imaging
NASA Astrophysics Data System (ADS)
Nowak, Robert D.
1999-06-01
Photon-limited image analysis is often hindered by low signal-to-noise ratios. A novel Bayesian multiscale modeling and analysis method is developed in this paper to assist in these challenging situations. In addition to providing a very natural and useful framework for modeling and processing images, Bayesian multiscale analysis is often much less computationally demanding than classical Markov random field models. This paper focuses on a probabilistic graph model called the multiscale hidden Markov model (MHMM), which captures the key inter-scale dependencies present in natural image intensities. The MHMM framework presented here is specifically designed for photon-limited imaging applications involving Poisson statistics, and applications to image intensity analysis are examined.
Local coexistence of VO2 phases revealed by deep data analysis
Strelcov, Evgheni; Ievlev, Anton; Tselev, Alexander; ...
2016-07-07
We report a synergistic approach of micro-Raman spectroscopic mapping and deep data analysis to study the distribution of crystallographic phases and ferroelastic domains in a defected Al-doped VO2 microcrystal. Bayesian linear unmixing revealed an uneven distribution of the T phase, which is stabilized by surface defects and uneven local doping that went undetected by other classical analysis techniques such as PCA and SIMPLISMA. This work demonstrates the impact of information recovery via statistical analysis and full mapping in spectroscopic studies of vanadium dioxide systems, which are commonly studied instead by averaging or single-point probing approaches, both of which suffer from information misinterpretation due to low resolving power.
Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging
NASA Technical Reports Server (NTRS)
Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.
2011-01-01
The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes only have phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images most closely mimicking those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.
Rossoe, Ed Wilson Tsuneo; Tebcherani, Antonio José; Sittart, José Alexandre; Pires, Mario Cezar
2011-01-01
Chronic actinic cheilitis is actinic keratosis located on the vermilion border. Treatment is essential because of the potential for malignant transformation. To evaluate the aesthetic and functional results of vermilionectomy using the classic and W-plasty techniques in actinic cheilitis. In the classic technique, the scar is linear; in the W-plasty technique, it is a broken line. 32 patients with clinical and histopathological diagnosis of actinic cheilitis were treated: 15 underwent the W-plasty technique and 17 underwent the classic one. We evaluated parameters such as scar retraction and functional changes. A statistically significant association was found between the technique used and scar retraction, with retraction occurring when the classic technique was used (p = 0.01 with Yates' correction). The odds ratio was calculated at 11.25, i.e., there was a greater chance of retraction in patients undergoing the classic technique. Neither technique produced functional changes. We evaluated postoperative complications such as the presence of crusts, dry lips, paresthesia, and suture dehiscence. There was no statistically significant association between complications and the technique used (p = 0.69). We concluded that vermilionectomy using the W-plasty technique shows better cosmetic results and similar complication rates.
Use of Fermi-Dirac statistics for defects in solids
NASA Astrophysics Data System (ADS)
Johnson, R. A.
1981-12-01
The Fermi-Dirac distribution function is an approximation describing a special case of Boltzmann statistics. A general occupation probability formula is derived and a criterion given for the use of Fermi-Dirac statistics. Application to classical problems of defects in solids is discussed.
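A minimal numerical comparison of the two statistics, under illustrative energies that are not taken from the paper (numpy assumed):

```python
import numpy as np

kT = 0.025  # ~room temperature, in eV
E_minus_mu = np.array([-0.10, -0.02, 0.0, 0.02, 0.10])  # level energy relative to the Fermi level

fd = 1.0 / (np.exp(E_minus_mu / kT) + 1.0)   # Fermi-Dirac occupation
boltz = np.exp(-E_minus_mu / kT)             # Boltzmann occupation (can exceed 1)

for e, f, b in zip(E_minus_mu, fd, boltz):
    print(f"E - mu = {e:+.2f} eV: FD = {f:.3f}, Boltzmann = {b:.3f}")
# The two agree when E - mu >> kT; near and below the Fermi level the
# Boltzmann form overcounts occupation, which is where a criterion for
# choosing Fermi-Dirac statistics matters for defect problems.
```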
Comparison of Classical and Quantum Mechanical Uncertainties.
ERIC Educational Resources Information Center
Peslak, John, Jr.
1979-01-01
Comparisons are made for the particle-in-a-box, the harmonic oscillator, and the one-electron atom. A classical uncertainty principle is derived and compared with its quantum-mechanical counterpart. The results are discussed in terms of the statistical interpretation of the uncertainty principle. (Author/BB)
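The particle-in-a-box case can be sketched numerically from the standard textbook results (our example, not the article's own derivation): a classical particle bouncing in a box of length L spends equal time everywhere, so Delta-x = L/sqrt(12), while the quantum eigenstate n has Delta-x^2 = L^2 (1/12 - 1/(2 n^2 pi^2)).

```python
import numpy as np

L = 1.0
dx_classical = L / np.sqrt(12.0)             # uniform position distribution
for n in (1, 2, 5, 50):
    dx_quantum = L * np.sqrt(1.0 / 12.0 - 1.0 / (2.0 * n**2 * np.pi**2))
    print(f"n = {n:3d}: quantum dx = {dx_quantum:.4f}, classical dx = {dx_classical:.4f}")
# As n grows, the quantum uncertainty approaches the classical one, in line
# with a statistical reading of the correspondence principle.
```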
Information categorization approach to literary authorship disputes
NASA Astrophysics Data System (ADS)
Yang, Albert C.-C.; Peng, C.-K.; Yien, H.-W.; Goldberger, Ary L.
2003-11-01
Scientific analysis of the linguistic styles of different authors has generated considerable interest. We present a generic approach to measuring the similarity of two symbolic sequences that requires minimal background knowledge about a given human language. Our analysis is based on word rank order-frequency statistics and phylogenetic tree construction. We demonstrate the applicability of this method to historic authorship questions related to the classic Chinese novel “The Dream of the Red Chamber,” to the plays of William Shakespeare, and to the Federalist papers. This method may also provide a simple approach to other large databases based on their information content.
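A simplified sketch of a rank-order frequency comparison in this spirit (our own toy metric, not the authors' exact measure):

```python
from collections import Counter

def rank_distance(text_a: str, text_b: str, top: int = 50) -> float:
    """Average rank displacement of the most frequent shared words."""
    ranks = []
    for text in (text_a, text_b):
        counts = Counter(text.lower().split())
        ordered = [w for w, _ in counts.most_common()]
        ranks.append({w: r for r, w in enumerate(ordered)})
    shared = [w for w in list(ranks[0])[:top] if w in ranks[1]]
    if not shared:
        return float("inf")
    return sum(abs(ranks[0][w] - ranks[1][w]) for w in shared) / len(shared)

# Texts with similar word-rank profiles score low; a matrix of such pairwise
# distances between candidate authors' texts can feed a phylogenetic-tree
# construction of the kind described above.
```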
Révész, Kinga M; Landwehr, Jurate M
2002-01-01
A new method was developed to analyze the stable carbon and oxygen isotope ratios of small samples (400 ± 20 µg) of calcium carbonate. This new method streamlines the classical phosphoric acid/calcium carbonate (H(3)PO(4)/CaCO(3)) reaction method by making use of a recently available Thermoquest-Finnigan GasBench II preparation device and a Delta Plus XL continuous flow isotope ratio mass spectrometer. Conditions for which the H(3)PO(4)/CaCO(3) reaction produced reproducible and accurate results with minimal error had to be determined. When the acid/carbonate reaction temperature was kept at 26 degrees C and the reaction time was between 24 and 54 h, the precision of the carbon and oxygen isotope ratios for pooled samples from three reference standard materials was ≤0.1 and ≤0.2 per mill (‰), respectively, although later analysis showed that materials from one specific standard required a reaction time between 34 and 54 h for delta(18)O to achieve this level of precision. Aliquot screening methods were shown to further minimize the total error. The accuracy and precision of the new method were analyzed and confirmed by statistical analysis. The utility of the method was verified by analyzing calcite from Devils Hole, Nevada, for which isotope-ratio values had previously been obtained by the classical method. Devils Hole core DH-11 recently had been re-cut and re-sampled, and isotope-ratio values were obtained using the new method. The results were comparable with those obtained by the classical method, with correlation = +0.96 for both isotope ratios. The consistency of the isotopic results is such that an alignment offset could be identified in the re-sampled core material, and two cutting errors that occurred during re-sampling then were confirmed independently. This result indicates that the new method is a viable alternative to the classical reaction method. In particular, the new method requires less sample material, permitting finer resolution, and allows automation of some processes, resulting in considerable time savings.
On Some Assumptions of the Null Hypothesis Statistical Testing
ERIC Educational Resources Information Center
Patriota, Alexandre Galvão
2017-01-01
Bayesian and classical statistical approaches are based on different types of logical principles. In order to avoid mistaken inferences and misguided interpretations, the practitioner must respect the inference rules embedded into each statistical method. Ignoring these principles leads to the paradoxical conclusions that the hypothesis…
NASA Astrophysics Data System (ADS)
Cohen, E. G. D.
Lecture notes are organized around the key word dissipation, while focusing on a presentation of modern theoretical developments in the study of irreversible phenomena. A broad cross-disciplinary perspective towards non-equilibrium statistical mechanics is backed by the general theory of nonlinear and complex dynamical systems. The classical-quantum intertwine and the semiclassical dissipative borderline issue (decoherence, "classical out of quantum") are included here. Special emphasis is put on links between the theory of classical and quantum dynamical systems (temporal disorder, dynamical chaos and transport processes) and central problems of non-equilibrium statistical mechanics, e.g., the connection between dynamics and thermodynamics, relaxation towards equilibrium states, and mechanisms capable of driving and then maintaining a physical system far from equilibrium, in a non-equilibrium steady (stationary) state. The notion of an equilibrium state - towards which a system naturally evolves if left undisturbed - is a fundamental concept of equilibrium statistical mechanics. It is taken as a primitive point of reference that gives an unambiguous status to near-equilibrium and far-from-equilibrium systems, together with the dynamical notion of relaxation (decay) towards a prescribed asymptotic invariant measure or probability distribution (properties of ergodicity and mixing are implicit). A related issue is to keep under control the process of driving a physical system away from an initial state of equilibrium and either keeping it in another (non-equilibrium) steady state or allowing it to restore the initial data (return back, relax). To this end various models of the environment (heat bath, reservoir, thermostat, measuring instrument, etc.) and of the environment-system coupling are analyzed. The central theme of the book is the dynamics of dissipation and the various mechanisms responsible for the irreversible behaviour (transport properties) of open systems on the classical and quantum levels of description. A distinguishing feature of these lecture notes is that the microscopic foundations of irreversibility are investigated basically in terms of "small" systems, where the "system" and/or "environment" may have a finite (and small) number of degrees of freedom and may be bounded. This is to be contrasted with the customary understanding of statistical mechanics as referring to systems with a very large number of degrees of freedom. In fact, it is commonly accepted that the accumulation of effects due to many (of the order of the Avogadro number) particles is required for statistical mechanics reasoning. Yet such large numbers are not at all sufficient for transport properties. A helpful hint towards this conceptual turnover comes from the observation that for chaotic dynamical systems the random time evolution proves to be compatible with the underlying purely deterministic laws of motion. Chaotic features of classical dynamics already appear in systems with two degrees of freedom, and such systems need to be described in statistical terms if we wish to quantify the dynamics of relaxation towards an invariant ergodic measure. The relaxation towards equilibrium finds a statistical description through an analysis of statistical ensembles. This entails an extension of the range of validity of statistical mechanics to small classical systems.
On the other hand, the dynamics of fluctuations in macroscopic dissipative systems (due to their molecular composition and thermal mobility) may render a characterization of such systems as chaotic. That motivates attempts at understanding the role of microscopic chaos and of various "chaotic hypotheses" in non-equilibrium transport phenomena - the dynamical systems approach is being pushed down to the level of atoms, molecules and complex matter constituents, whose natural substitutes are low-dimensional model subsystems (encompassing as well the mesoscopic "quantum chaos"). Along the way a number of questions are addressed, e.g.: is there a connection between chaos (the modern theory of dynamical systems) and irreversible thermodynamics, and what is its nature; can quantum chaos really explain some peculiar features of quantum transport? The answer in both cases is positive, modulo a careful discrimination between viewing dynamical chaos as a necessary or a sufficient basis for irreversibility. In those dynamical contexts, another key term, dynamical semigroups, refers to major technical tools appropriate for the "dissipative mathematics" modelling irreversible behaviour on the classical and quantum levels of description. Dynamical systems theory and "quantum chaos" research involve both a high level of mathematical sophistication and heavy computer "experimentation". One of the present volume's specific flavors is a tutorial access to quite advanced mathematical tools. They gradually penetrate the classical and quantum dynamical semigroup description, culminating in the noncommutative Brillouin zone construction as a prerequisite to understanding transport in aperiodic solids. The lecture notes are structured into chapters to give a better insight into the major conceptual streamlines. Chapter I is devoted to a discussion of non-equilibrium steady states and, through the so-called chaotic hypothesis combined with suitable fluctuation theorems, elucidates the role of the Sinai-Ruelle-Bowen distribution in both equilibrium and non-equilibrium statistical physics frameworks (E. G. D. Cohen). Links between dynamics and statistics (Boltzmann versus Tsallis) are also discussed. Fluctuation relations and a survey of deterministic thermostats are given in the context of non-equilibrium steady states of fluids (L. Rondoni). The response of systems driven far from equilibrium is analyzed on the basis of a central assertion about the existence of a statistical representation in terms of an ensemble of dynamical realizations of the driving process. A non-equilibrium work relation is deduced for irreversible processes (C. Jarzynski). The survey of non-equilibrium steady states in the statistical mechanics of classical and quantum systems employs heat bath models and input from random matrix theory. The quantum heat bath analysis and the derivation of fluctuation-dissipation theorems are performed by means of the influence functional technique, adopted to solve quantum master equations (D. Kusnezov). Chapter II deals with the issue of relaxation and its dynamical theory in both classical and quantum contexts. The Pollicott-Ruelle resonance background for the exponential decay scenario is discussed for irreversible processes of diffusion in the Lorentz gas and multibaker models (P. Gaspard).
The Pollicott-Ruelle theory reappears as a major inspiration in the survey of the behaviour of ensembles of chaotic systems, with a focus on model systems for which no rigorous results concerning the exponential decay of correlations in time are available (S. Fishman). The observation that non-equilibrium transport processes in simple classical chaotic systems can be described in terms of fractal structures developing in the system phase space links their formation and properties with the entropy production in the course of diffusion processes displaying a low-dimensional deterministic (chaotic) origin (J. R. Dorfman). Chapter III offers an introduction to the theory of dynamical semigroups. Asymptotic properties of Markov operators and Markov semigroups acting on the set of probability densities (the statistical ensemble notion is implicit) are analyzed. Ergodicity, mixing, strong (complete) mixing and sweeping are discussed in the familiar setting of "noise, chaos and fractals" (R. Rudnicki). The next step comprises a passage to quantum dynamical semigroups and completely positive dynamical maps, with the ultimate goal of introducing a consistent framework for the analysis of irreversible phenomena in open quantum systems, where dissipation and decoherence are crucial concepts (R. Alicki). Friction and damping in the classical and quantum mechanics of finite dissipative systems are analyzed by means of Markovian quantum semigroups, with special emphasis on the issue of complete positivity (M. Fannes). Specific two-level model systems of elementary particle physics (kaons) and rudiments of neutron interferometry are employed to elucidate the distinction between positivity and complete positivity (F. Benatti). Quantization of the dynamics of stochastic models related to equilibrium Gibbs states results in dynamical maps which form quantum stochastic dynamical semigroups (W. A. Majewski). Chapter IV addresses diverse but deeply interrelated features of driven chaotic (mesoscopic) classical and quantum systems, their dissipative properties, and the notions of quantum irreversibility, entanglement, dephasing and decoherence. A survey of non-perturbative quantum effects for open quantum systems is concluded by outlining the discrepancies between random matrix theory and non-perturbative semiclassical predictions (D. Cohen). As a useful supplement to the subject of bounded open systems, methods of quantum state control in a cavity (coherent versus incoherent dynamics and dissipation) are described for low-dimensional quantum systems (A. Buchleitner). The dynamics of open quantum systems can alternatively be described by means of a non-Markovian stochastic Schrödinger equation, jointly for an open system and its environment, which moves us beyond the Lindblad evolution scenario of Markovian dynamical semigroups. The quantum Brownian motion is considered (W. Strunz). Chapter V enforces a conceptual transition from "small" to "large" systems, with emphasis on the irreversible thermodynamics of quantum transport. Typical features of the statistical mechanics of infinitely extended systems and of the dynamical (small) systems approach are described by means of representative examples of relaxation towards asymptotic steady states: a quantum one-dimensional lattice conductor and an open multibaker map (S. Tasaki). Dissipative transport in aperiodic solids is reviewed by invoking methods of noncommutative geometry. The anomalous Drude formula is derived. The occurrence of quantum chaos is discussed together with its main consequences (J.
Bellissard). The chapter is concluded by a survey of scaling limits of the N-body Schrödinger quantum dynamics, where classical evolution equations of irreversible statistical mechanics (linear Boltzmann, Hartree, Vlasov) emerge "out of quantum". In particular, a scaling limit of one-body quantum dynamics with impurities (static random potential) and that of quantum dynamics with weakly coupled phonons are shown to yield the linear Boltzmann equation (L. Erdös). Various interrelations between chapters and individual lectures, plus detailed, fine-grained information about the subject matter coverage of the volume, can be recovered by examining an extensive index.
Bennett, Derrick A; Landry, Denise; Little, Julian; Minelli, Cosetta
2017-09-19
Several statistical approaches have been proposed to assess and correct for exposure measurement error. We aimed to provide a critical overview of the most common approaches used in nutritional epidemiology. MEDLINE, EMBASE, BIOSIS and CINAHL were searched for reports published in English up to May 2016 in order to ascertain studies that described methods aimed to quantify and/or correct for measurement error for a continuous exposure in nutritional epidemiology using a calibration study. We identified 126 studies, 43 of which described statistical methods and 83 that applied any of these methods to a real dataset. The statistical approaches in the eligible studies were grouped into: a) approaches to quantify the relationship between different dietary assessment instruments and "true intake", which were mostly based on correlation analysis and the method of triads; b) approaches to adjust point and interval estimates of diet-disease associations for measurement error, mostly based on regression calibration analysis and its extensions. Two approaches (multiple imputation and moment reconstruction) were identified that can deal with differential measurement error. For regression calibration, the most common approach to correct for measurement error used in nutritional epidemiology, it is crucial to ensure that its assumptions and requirements are fully met. Analyses that investigate the impact of departures from the classical measurement error model on regression calibration estimates can be helpful to researchers in interpreting their findings. With regard to the possible use of alternative methods when regression calibration is not appropriate, the choice of method should depend on the measurement error model assumed, the availability of suitable calibration study data and the potential for bias due to violation of the classical measurement error model assumptions. On the basis of this review, we provide some practical advice for the use of methods to assess and adjust for measurement error in nutritional epidemiology.
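As a concrete illustration of regression calibration under the classical measurement error model W = X + U, the following simulation sketch (numpy assumed; all numbers simulated for illustration) shows the attenuation of a naive slope and its correction with a calibration substudy:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
x = rng.normal(0, 1, n)              # true intake (unobserved in the main study)
w = x + rng.normal(0, 1, n)          # measured intake with classical error
y = 0.5 * x + rng.normal(0, 1, n)    # outcome; true diet-disease slope is 0.5

naive = np.polyfit(w, y, 1)[0]       # slope attenuated by var(X)/var(W)

# Calibration study: a subset with a reference ("true") measurement available.
cal = rng.choice(n, 500, replace=False)
lam = np.polyfit(w[cal], x[cal], 1)[0]   # calibration slope of E[X | W]
corrected = naive / lam                  # regression calibration correction

print(f"naive = {naive:.3f}, calibration slope = {lam:.3f}, corrected = {corrected:.3f}")
# With equal error and intake variances the attenuation factor is ~0.5, so
# the naive slope lands near 0.25 and the corrected one recovers ~0.5.
```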
Daniel Goodman’s empirical approach to Bayesian statistics
Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina
2016-01-01
Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combined comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. If based on a true representation of our knowledge and uncertainty, Goodman argued that risk assessment and decision-making could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.
Robust Statistics: What They Are, and Why They Are So Important
ERIC Educational Resources Information Center
Corlu, Sencer M.
2009-01-01
The problem with "classical" statistics, all of which invoke the mean, is that these estimates are notoriously influenced by atypical scores (outliers), partly because the mean itself is differentially influenced by outliers. In theory, "modern" statistics may generate more replicable characterizations of data, because at least in some…
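A quick demonstration of the point (scipy assumed; the scores are invented for illustration): one outlier moves the mean substantially but barely affects the median or a trimmed mean.

```python
import numpy as np
from scipy.stats import trim_mean

scores = np.array([12, 13, 14, 15, 15, 16, 17, 18, 19, 95])  # 95 is an atypical score
print("mean        :", np.mean(scores))          # dragged upward by the outlier
print("median      :", np.median(scores))        # robust location estimate
print("trimmed mean:", trim_mean(scores, 0.2))   # drops 20% from each tail
```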
Teaching Classical Statistical Mechanics: A Simulation Approach.
ERIC Educational Resources Information Center
Sauer, G.
1981-01-01
Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
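In the same spirit, a minimal microcanonical Monte Carlo sketch (our own construction, not the article's program; numpy assumed) samples velocity configurations uniformly on the constant-kinetic-energy shell and recovers Maxwell-Boltzmann-like single-particle statistics:

```python
import numpy as np

rng = np.random.default_rng(2)
N, m, E_total = 1_000, 1.0, 500.0          # N particles, fixed total kinetic energy
radius = np.sqrt(2.0 * E_total / m)        # radius of the energy shell in velocity space

samples = []
for _ in range(2_000):
    v = rng.normal(size=N)
    v *= radius / np.linalg.norm(v)        # uniform point on the shell
    samples.append(v[0])                   # record the velocity of particle 1

samples = np.asarray(samples)
print("var(v1) =", samples.var(), " equipartition kT/m =", 2 * E_total / (m * N))
# For large N the marginal distribution of one particle's velocity tends to a
# Gaussian: the Maxwell-Boltzmann distribution emerging from a fixed-energy
# ensemble, as in the simulation approach described above.
```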
NASA Astrophysics Data System (ADS)
VandeVondele, Joost; Rothlisberger, Ursula
2000-09-01
We present a method for calculating multidimensional free energy surfaces within the limited time scale of a first-principles molecular dynamics scheme. The sampling efficiency is enhanced using selected terms of a classical force field as a bias potential. This simple procedure yields a very substantial increase in sampling accuracy while retaining the high quality of the underlying ab initio potential surface and can thus be used for a parameter free calculation of free energy surfaces. The success of the method is demonstrated by the applications to two gas phase molecules, ethane and peroxynitrous acid, as test case systems. A statistical analysis of the results shows that the entire free energy landscape is well converged within a 40 ps simulation at 500 K, even for a system with barriers as high as 15 kcal/mol.
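The reweighting idea behind such a bias potential can be sketched on a toy surface (numpy assumed; a 1D double well stands in for the ab initio landscape, and the bias is chosen for illustration only). Sampling runs on U + V_bias; unbiased statistics are recovered with weights exp(+V_bias/kT):

```python
import numpy as np

rng = np.random.default_rng(3)
kT = 1.0

def U(s):       # "expensive" double well, barrier ~ 8 kT
    return 8.0 * (s**2 - 1.0)**2

def V_bias(s):  # crude classical bias that lowers the barrier to ~ 2 kT
    return -6.0 * (s**2 - 1.0)**2

s, traj = 1.0, []
for _ in range(200_000):                   # Metropolis walk on the biased surface
    s_new = s + rng.normal(scale=0.3)
    dE = (U(s_new) + V_bias(s_new)) - (U(s) + V_bias(s))
    if dE <= 0 or rng.random() < np.exp(-dE / kT):
        s = s_new
    traj.append(s)

traj = np.asarray(traj)
w = np.exp(V_bias(traj) / kT)              # reweighting factors remove the bias
hist, edges = np.histogram(traj, bins=60, range=(-2, 2), weights=w, density=True)
centers = 0.5 * (edges[1:] + edges[:-1])
F = -kT * np.log(np.clip(hist, 1e-12, None))   # unbiased free energy profile
print("barrier estimate:", F[np.argmin(np.abs(centers))] - F.min())  # ~ 8 kT
```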
Inference in the age of big data: Future perspectives on neuroscience.
Bzdok, Danilo; Yeo, B T Thomas
2017-07-15
Neuroscience is undergoing faster changes than ever before. Over 100 years our field qualitatively described and invasively manipulated single or few organisms to gain anatomical, physiological, and pharmacological insights. In the last 10 years neuroscience spawned quantitative datasets of unprecedented breadth (e.g., microanatomy, synaptic connections, and optogenetic brain-behavior assays) and size (e.g., cognition, brain imaging, and genetics). While growing data availability and information granularity have been amply discussed, we direct attention to a less explored question: How will the unprecedented data richness shape data analysis practices? Statistical reasoning is becoming more important to distill neurobiological knowledge from healthy and pathological brain measurements. We argue that large-scale data analysis will use more statistical models that are non-parametric, generative, and mixing frequentist and Bayesian aspects, while supplementing classical hypothesis testing with out-of-sample predictions.
A Meta-Analysis of Hypnotherapeutic Techniques in the Treatment of PTSD Symptoms.
O'Toole, Siobhan K; Solomon, Shelby L; Bergdahl, Stephen A
2016-02-01
The efficacy of hypnotherapeutic techniques as treatment for symptoms of posttraumatic stress disorder (PTSD) was explored through meta-analytic methods. Studies were selected through a search of 29 databases. Altogether, 81 studies discussing hypnotherapy and PTSD were reviewed for inclusion criteria. The outcomes of 6 studies representing 391 participants were analyzed using meta-analysis. Evaluation of effect sizes related to avoidance and intrusion, in addition to overall PTSD symptoms after hypnotherapy treatment, revealed that all studies showed that hypnotherapy had a positive effect on PTSD symptoms. The overall Cohen's d was large (-1.18) and statistically significant (p < .001). Effect sizes varied based on study quality; however, they were large and statistically significant. Using the classic fail-safe N to assess for publication bias, it was determined it would take 290 nonsignificant studies to nullify these findings.
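The classic fail-safe N mentioned here is commonly computed with Rosenthal's formula, N_fs = (sum of Z)^2 / z_crit^2 - k, with z_crit = 1.645 for one-tailed alpha = .05. A hedged sketch (the study Z-scores below are illustrative placeholders, not the meta-analysis's actual values):

```python
def fail_safe_n(z_values, z_crit=1.645):
    """Rosenthal's fail-safe N: studies averaging null results needed to
    raise the combined p-value above the significance threshold."""
    s = sum(z_values)
    k = len(z_values)
    return (s / z_crit) ** 2 - k

# Six hypothetical study Z-scores standing in for the meta-analyzed effects:
print(round(fail_safe_n([3.1, 2.8, 2.4, 3.5, 2.9, 3.0])))
```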
Lei, Pingguang; Lei, Guanghe; Tian, Jianjun; Zhou, Zengfen; Zhao, Miao; Wan, Chonghua
2014-10-01
This paper aims to develop the irritable bowel syndrome (IBS) scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-IBS) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-IBS was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, and quantitative statistical procedures. One hundred twelve inpatients with IBS provided data, with QOL measured three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t tests, and also G studies and D studies of generalizability theory analysis. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. Test-retest reliability coefficients (Pearson r and intra-class correlation (ICC)) for the overall score and all domains were higher than 0.80; the internal consistency α for all domains at both measurements was higher than 0.70, except for the social domain (0.55 and 0.67, respectively). The overall score and scores for all domains/facets showed statistically significant changes after treatment, with moderate or higher effect sizes (standardized response mean, SRM) ranging from 0.72 to 1.02 at the domain level. G coefficients and indexes of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-IBS has good validity, reliability, responsiveness, and some notable strengths, and can be used as a quality of life instrument for patients with IBS.
Classical Electrodynamics: Lecture notes
NASA Astrophysics Data System (ADS)
Likharev, Konstantin K.
2018-06-01
Essential Advanced Physics is a series comprising four parts: Classical Mechanics, Classical Electrodynamics, Quantum Mechanics and Statistical Mechanics. Each part consists of two volumes, Lecture notes and Problems with solutions, further supplemented by an additional collection of test problems and solutions available to qualifying university instructors. This volume, Classical Electrodynamics: Lecture notes, is intended to be the basis for a two-semester graduate-level course on electricity and magnetism, including not only the interaction and dynamics of charged point particles, but also properties of dielectric, conducting, and magnetic media. The course also covers special relativity, including its kinematics and particle-dynamics aspects, and electromagnetic radiation by relativistic particles.
Greaves, Paul; Clear, Andrew; Coutinho, Rita; Wilson, Andrew; Matthews, Janet; Owen, Andrew; Shanyinde, Milensu; Lister, T. Andrew; Calaminici, Maria; Gribben, John G.
2013-01-01
Purpose The immune microenvironment is key to the pathophysiology of classical Hodgkin lymphoma (CHL). Twenty percent of patients experience failure of their initial treatment, and others receive excessively toxic treatment. Prognostic scores and biomarkers have yet to influence outcomes significantly. Previous biomarker studies have been limited by the extent of tissue analyzed, statistical inconsistencies, and failure to validate findings. We aimed to overcome these limitations by validating recently identified microenvironment biomarkers (CD68, FOXP3, and CD20) in a new patient cohort with a greater extent of tissue and by using rigorous statistical methodology. Patients and Methods Diagnostic tissue from 122 patients with CHL was microarrayed and stained, and positive cells were counted across 10 to 20 high-powered fields per patient by using an automated system. Two statistical analyses were performed: a categorical analysis with test/validation set-defined cut points and Kaplan-Meier estimated outcome measures of 5-year overall survival (OS), disease-specific survival (DSS), and freedom from first-line treatment failure (FFTF) and an independent multivariate analysis of absolute uncategorized counts. Results Increased CD20 expression confers superior OS. Increased FOXP3 expression confers superior OS, and increased CD68 confers inferior FFTF and OS. FOXP3 varies independently of CD68 expression and retains significance when analyzed as a continuous variable in multivariate analysis. A simple score combining FOXP3 and CD68 discriminates three groups: FFTF 93%, 62%, and 47% (P < .001), DSS 93%, 82%, and 63% (P = .03), and OS 93%, 82%, and 59% (P = .002). Conclusion We have independently validated CD68, FOXP3, and CD20 as prognostic biomarkers in CHL, and we demonstrate, to the best of our knowledge for the first time, that combining FOXP3 and CD68 may further improve prognostic stratification. PMID:23045593
Ehrenfest dynamics is purity non-preserving: A necessary ingredient for decoherence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alonso, J. L.; Instituto de Biocomputacion y Fisica de Sistemas Complejos; Unidad Asociada IQFR-BIFI, Universidad de Zaragoza, Mariano Esquillor s/n, E-50018 Zaragoza
2012-08-07
We discuss the evolution of purity in mixed quantum/classical approaches to electronic nonadiabatic dynamics in the context of the Ehrenfest model. As it is impossible to exactly determine initial conditions for a realistic system, we choose to work in the statistical Ehrenfest formalism that we introduced in Alonso et al. [J. Phys. A: Math. Theor. 44, 396004 (2011)]. From it, we develop a new framework to determine exactly the change in the purity of the quantum subsystem along the evolution of a statistical Ehrenfest system. In a simple case, we verify how and to what extent Ehrenfest statistical dynamics makes a system with more than one classical trajectory and an initial pure quantum state become a mixed one. We prove this numerically, showing how the evolution of purity depends on time, on the dimension of the quantum state space D, and on the number of classical trajectories N of the initial distribution. The results in this work open new perspectives for studying decoherence with Ehrenfest dynamics.
Generalized relative entropies in the classical limit
NASA Astrophysics Data System (ADS)
Kowalski, A. M.; Martin, M. T.; Plastino, A.
2015-03-01
Our protagonists are (i) the Cressie-Read family of divergences (characterized by the parameter γ), (ii) Tsallis' generalized relative entropies (characterized by the parameter q), and, as a particular instance of both, (iii) the Kullback-Leibler (KL) relative entropy. In their normalized versions, we ascertain the equivalence between (i) and (ii). Additionally, we employ these three entropic quantifiers to provide a statistical investigation of the classical limit of a semiclassical model whose properties are well known from a purely dynamic viewpoint. This places us in a good position to assess the appropriateness of our statistical quantifiers for describing involved systems. We compare the behaviour of (i), (ii), and (iii) as one proceeds towards the classical limit, and determine optimal ranges for γ and/or q. It is shown that the Tsallis quantifier is better than the KL one for 1.5 < q < 2.5.
NASA Astrophysics Data System (ADS)
Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio
2012-09-01
In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007), 10.1063/1.2430711]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced by considering a scrambling matrix which depends on energy and determines the probability of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H_5^+ complexes and, as a consequence, the exchange mechanism occurs in lower proportion. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, so an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is explained by the need to correct the purely classical level number of the H_5^+ complex, as done in classical simulations of unimolecular processes and to get equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix allows one to obtain a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011), 10.1063/1.3587246] at room temperature. At lower temperatures, however, the present simulations predict too high ratios because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.
APPROACH TO EQUILIBRIUM OF A QUANTUM PLASMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balescu, R.
1961-01-01
The treatment of irreversible processes in a classical plasma (R. Balescu, Phys. Fluids 3, 62 (1960)) was extended to a gas of charged particles obeying quantum statistics. The various contributions to the equation of evolution for the reduced one-particle Wigner function were written in a form analogous to the classical formalism. The summation was then performed in a straightforward manner. The resulting equation describes collisions between particles "dressed" by their polarization clouds, exactly as in the classical situation. (auth)
Unbiased estimators for spatial distribution functions of classical fluids
NASA Astrophysics Data System (ADS)
Adib, Artur B.; Jarzynski, Christopher
2005-01-01
We use a statistical-mechanical identity closely related to the familiar virial theorem to derive unbiased estimators for spatial distribution functions of classical fluids. In particular, we obtain estimators for both the fluid density ρ(r) in the vicinity of a fixed solute and the pair correlation g(r) of a homogeneous classical fluid. We illustrate the utility of our estimators with numerical examples, which reveal advantages over traditional histogram-based methods of computing such distributions.
GWAR: robust analysis and meta-analysis of genome-wide association studies.
Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G
2017-05-15
In the context of genome-wide association studies (GWAS), there are a variety of statistical techniques for conducting the analysis, but, in most cases, the underlying genetic model is unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under recessive, additive and dominant models of inheritance, as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2, were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration, resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed- or random-effects meta-analysis setting using summary data, with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online.
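For reference, a compact sketch of the CATT itself (numpy/scipy assumed; the genotype counts are illustrative). Scores (0, 1, 2) give the additive model, (0, 1, 1) dominant, and (0, 0, 1) recessive; the variance used here is the permutation (hypergeometric) form obtained by conditioning on the table margins:

```python
import numpy as np
from scipy.stats import norm

def catt(cases, controls, scores=(0, 1, 2)):
    """Cochran-Armitage trend test for a 2 x K genotype table."""
    r, s, t = map(np.asarray, (cases, controls, scores))
    n = r + s                      # genotype totals
    N, R = n.sum(), r.sum()
    U = np.sum(t * (r - n * R / N))                         # score statistic
    var = R * (N - R) / (N * (N - 1)) * (np.sum(n * t**2) - np.sum(n * t)**2 / N)
    z = U / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))  # two-sided p-value

z, p = catt(cases=[100, 150, 50], controls=[120, 130, 30])  # illustrative counts
print(f"Z = {z:.3f}, p = {p:.4f}")
```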
Realistic finite temperature simulations of magnetic systems using quantum statistics
NASA Astrophysics Data System (ADS)
Bergqvist, Lars; Bergman, Anders
2018-01-01
We have performed realistic atomistic simulations at finite temperatures using Monte Carlo and atomistic spin dynamics simulations incorporating quantum (Bose-Einstein) statistics. The description is much improved at low temperatures compared to classical (Boltzmann) statistics normally used in these kind of simulations, while at higher temperatures the classical statistics are recovered. This corrected low-temperature description is reflected in both magnetization and the magnetic specific heat, the latter allowing for improved modeling of the magnetic contribution to free energies. A central property in the method is the magnon density of states at finite temperatures, and we have compared several different implementations for obtaining it. The method has no restrictions regarding chemical and magnetic order of the considered materials. This is demonstrated by applying the method to elemental ferromagnetic systems, including Fe and Ni, as well as Fe-Co random alloys and the ferrimagnetic system GdFe3.
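The effect of swapping Boltzmann for Bose-Einstein statistics can be illustrated with a model magnon density of states (our sketch; the units and the sqrt(omega) DOS, roughly appropriate for a 3D ferromagnet, are illustrative rather than taken from the paper):

```python
import numpy as np

omega = np.linspace(1e-3, 1.0, 2_000)    # magnon energies, band maximum = 1
dw = omega[1] - omega[0]
g = np.sqrt(omega)                        # model magnon DOS ~ sqrt(omega)
g /= (g * dw).sum()                       # normalize the DOS

def excited_magnons(T):
    n_qm = 1.0 / np.expm1(omega / T)      # Bose-Einstein occupation
    n_cl = T / omega                      # classical (equipartition) occupation
    return (g * n_qm * dw).sum(), (g * n_cl * dw).sum()

for T in (0.02, 0.1, 0.5):
    qm, cl = excited_magnons(T)
    print(f"T = {T}: quantum = {qm:.4f}, classical = {cl:.4f}")
# At low T the classical count vastly overestimates magnon excitation (hence
# the poor classical low-temperature magnetization); the two statistics
# converge as the temperature grows, as the abstract describes.
```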
Imperial College near infrared spectroscopy neuroimaging analysis framework.
Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong
2018-01-01
This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
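The modified Beer-Lambert reconstruction step can be sketched as a small linear inversion (numpy assumed; the extinction coefficients and geometry below are hypothetical placeholders, not values taken from ICNNA):

```python
import numpy as np

# Rows: wavelengths (e.g., ~760 nm, ~850 nm); columns: HbO2, HHb extinction.
E = np.array([[1.4, 3.8],
              [2.5, 1.8]])          # placeholder extinction matrix
L, DPF = 3.0, 6.0                   # source-detector distance (cm) and pathlength factor

dOD = np.array([0.012, 0.018])      # measured change in optical density per wavelength
# Modified Beer-Lambert law: dOD = E @ d_conc * L * DPF  ->  solve for d_conc.
d_conc = np.linalg.solve(E * L * DPF, dOD)
print(f"delta HbO2 = {d_conc[0]:.2e}, delta HHb = {d_conc[1]:.2e}")
```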
NASA Astrophysics Data System (ADS)
Aragonès, Àngels; Maxit, Laurent; Guasch, Oriol
2015-08-01
Statistical modal energy distribution analysis (SmEdA) extends classical statistical energy analysis (SEA) to the mid-frequency range by establishing power balance equations between modes in different subsystems. This circumvents the SEA requirement of modal energy equipartition and enables applying SmEdA to cases of low modal overlap and locally excited subsystems, and to deal with complex heterogeneous subsystems as well. Yet widening the range of application of SEA comes at a price: models can become large, because the number of modes per subsystem grows considerably as the frequency increases. It would therefore be worthwhile to have at one's disposal tools for quick identification and ranking of the resonant and non-resonant paths involved in modal energy transmission between subsystems. It will be shown that graph theory algorithms previously developed for transmission path analysis (TPA) in SEA can be adapted to SmEdA and prove useful for that purpose. The case of airborne transmission between two cavities separated by homogeneous and ribbed plates will be addressed first to illustrate the potential of the graph approach. A more complex case representing transmission between non-contiguous cavities in a shipbuilding structure will also be presented.
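The adaptation rests on a standard trick: with edge weights set to -log of a transmission ratio, the strongest transmission path becomes the shortest path, so k-shortest-path algorithms rank paths by strength. A hedged sketch with toy couplings (networkx assumed; the network and ratios are illustrative, not the paper's model):

```python
import math
import networkx as nx

couplings = {           # hypothetical energy-transmission ratios between nodes
    ("src", "a"): 0.30, ("src", "b"): 0.10,
    ("a", "rcv"): 0.20, ("b", "rcv"): 0.50, ("a", "b"): 0.05,
}

G = nx.DiGraph()
for (u, v), ratio in couplings.items():
    G.add_edge(u, v, weight=-math.log(ratio))   # product of ratios -> sum of weights

# Paths come out ordered from strongest to weakest transmission.
for path in nx.shortest_simple_paths(G, "src", "rcv", weight="weight"):
    strength = math.exp(-nx.path_weight(G, path, weight="weight"))
    print(path, f"transmission ~ {strength:.4f}")
```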
NASA Astrophysics Data System (ADS)
Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia
2014-02-01
How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective, we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both growth behaviors observed. A linear theory, by contrast, ignoring the possibility of interaction effects, would underestimate the quantifiers by up to 30% when immigrant densities are low, and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.
Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows
NASA Astrophysics Data System (ADS)
Qi, Di; Majda, Andrew J.
2018-04-01
Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.
Gavrilenko, T V; Es'kov, V M; Khadartsev, A A; Khimikova, O I; Sokolova, A A
2014-01-01
The behavior of the state vector of the human cardiovascular system in different age groups was investigated using methods of the theory of chaos-self-organization and methods of classical statistics. Observations were made on the indigenous people of the North of the Russian Federation. Using methods of the theory of chaos-self-organization, differences in the parameters of quasi-attractors of the state vector of the cardiovascular system were demonstrated. A comparison with the results obtained by classical statistics was made.
The Multiphoton Interaction of Lambda Model Atom and Two-Mode Fields
NASA Technical Reports Server (NTRS)
Liu, Tang-Kun
1996-01-01
The system of two-mode fields interacting with an atom through multiphoton processes is addressed, and the non-classical statistical properties of the interacting two-mode fields are discussed. Through mathematical calculation, some new rules for how the non-classical effects of the two-mode fields evolve with time are established.
Non-classical State via Superposition of Two Opposite Coherent States
NASA Astrophysics Data System (ADS)
Ren, Gang; Du, Jian-ming; Yu, Hai-jun
2018-04-01
We study the non-classical properties of the states generated by superpositions of two opposite coherent states with arbitrary relative phase factors. We show that the relative phase factor plays an important role in these superpositions. We demonstrate this result by discussing their squeezing properties, quantum statistical properties, and fidelity in principle.
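The photon statistics of such superpositions are easy to probe numerically. A sketch (numpy assumed; our own illustration of the state family |psi> ~ |alpha> + e^{i phi} |-alpha>) computing the Mandel Q parameter, where Q < 0 signals sub-Poissonian, non-classical light:

```python
import numpy as np
from math import factorial

def mandel_q(alpha, phi, nmax=60):
    n = np.arange(nmax)
    # Fock amplitudes of |alpha> up to the common normalization factor.
    coh = np.array([alpha**k / np.sqrt(factorial(k)) for k in n])
    amp = coh + np.exp(1j * phi) * (-1) ** n * coh   # |-alpha> flips odd components
    p = np.abs(amp) ** 2
    p /= p.sum()                                     # photon-number distribution
    mean = (n * p).sum()
    var = ((n - mean) ** 2 * p).sum()
    return (var - mean) / mean

for phi in (0.0, np.pi / 2, np.pi):
    print(f"phi = {phi:.2f}: Q = {mandel_q(2.0, phi):.3f}")
# phi = 0 (even cat) and phi = pi (odd cat) bracket the range; the relative
# phase visibly reshapes the photon statistics, echoing the paper's point.
```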
For a statistical interpretation of Helmholtz' thermal displacement
NASA Astrophysics Data System (ADS)
Podio-Guidugli, Paolo
2016-11-01
Starting from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz, whose time derivative is by definition the absolute temperature.
Estimation of plant sampling uncertainty: an example based on chemical analysis of moss samples.
Dołęgowska, Sabina
2016-11-01
In order to estimate the level of uncertainty arising from sampling, 54 samples (primary and duplicate) of the moss species Pleurozium schreberi (Brid.) Mitt. were collected within three forested areas (Wierna Rzeka, Piaski, Posłowice Range) in the Holy Cross Mountains (south-central Poland). During the fieldwork, each primary sample composed of 8 to 10 increments (subsamples) was taken over an area of 10 m², whereas duplicate samples were collected in the same way at a distance of 1-2 m. Subsequently, all samples were triple-rinsed with deionized water, dried, milled, and digested (8 mL HNO3 (1:1) + 1 mL 30% H2O2) in a closed microwave system Multiwave 3000. The prepared solutions were analyzed twice for Cu, Fe, Mn, and Zn using FAAS and GFAAS techniques. All datasets were checked for normality. For the normally distributed elements (Cu from Piaski; Zn from Posłowice; Fe and Zn from Wierna Rzeka), the sampling uncertainty was computed with (i) classical ANOVA, (ii) classical RANOVA, (iii) modified RANOVA, and (iv) range statistics. For the remaining elements, the sampling uncertainty was calculated with traditional and/or modified RANOVA (if the proportion of outliers did not exceed 10%) or with classical ANOVA after Box-Cox transformation (if the proportion of outliers exceeded 10%). The highest concentrations of all elements were found in moss samples from Piaski, and the sampling uncertainty calculated with the different statistical methods ranged from 4.1 to 22%.
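The core of a duplicate-based uncertainty estimate can be sketched in a few lines (numpy assumed; the concentration pairs are invented for illustration). For k primary/duplicate pairs, the within-pair variance sum(d^2)/(2k) estimates the combined sampling-plus-analysis variance, as in a one-way ANOVA on duplicates:

```python
import numpy as np

primary   = np.array([10.2, 11.8,  9.5, 12.4, 10.9])   # hypothetical Cu, mg/kg
duplicate = np.array([ 9.6, 12.5, 10.1, 11.6, 11.3])   # hypothetical duplicates

d = primary - duplicate
s_meas = np.sqrt(np.sum(d**2) / (2 * len(d)))           # sampling + analysis s.d.
rel_uncertainty = 100 * 2 * s_meas / np.concatenate([primary, duplicate]).mean()
print(f"s = {s_meas:.3f} mg/kg, expanded relative uncertainty ~ {rel_uncertainty:.1f}%")
# The factor 2 is the usual coverage factor for an expanded uncertainty.
```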
Towards Structural Analysis of Audio Recordings in the Presence of Musical Variations
NASA Astrophysics Data System (ADS)
Müller, Meinard; Kurth, Frank
2006-12-01
One major goal of structural analysis of an audio recording is to automatically extract the repetitive structure or, more generally, the musical form of the underlying piece of music. Recent approaches to this problem work well for music where the repetitions largely agree with respect to instrumentation and tempo, as is typically the case for popular music. For other classes of music such as Western classical music, however, musically similar audio segments may exhibit significant variations in parameters such as dynamics, timbre, execution of note groups, modulation, articulation, and tempo progression. In this paper, we propose a robust and efficient algorithm for audio structure analysis, which allows us to identify musically similar segments even in the presence of large variations in these parameters. To account for such variations, our main idea is to incorporate invariance at various levels simultaneously: we design a new type of statistical features to absorb microvariations, introduce an enhanced local distance measure to account for local variations, and describe a new strategy for structure extraction that can cope with the global variations. Our experimental results with classical and popular music show that our algorithm performs successfully even in the presence of significant musical variations.
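The starting point of such an analysis can be sketched with chroma features and a self-similarity matrix (librosa assumed; "recording.wav" is a placeholder input). Chroma features absorb much timbre variation, and repetitions then appear as diagonal stripes in the matrix:

```python
import librosa
import numpy as np

y, sr = librosa.load("recording.wav")                  # hypothetical input file
chroma = librosa.feature.chroma_stft(y=y, sr=sr)       # 12 x frames feature matrix
chroma = chroma / (np.linalg.norm(chroma, axis=0, keepdims=True) + 1e-9)

ssm = chroma.T @ chroma                                # cosine self-similarity matrix
print(ssm.shape)
# Tempo variation bends the diagonal stripes, which is why the paper's
# enhanced local distance measure and extraction strategy are needed on top
# of this raw matrix.
```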
Fenton, Norman; Neil, Martin; Berger, Daniel
2016-01-01
Although the last forty years has seen considerable growth in the use of statistics in legal proceedings, it is primarily classical statistical methods rather than Bayesian methods that have been used. Yet the Bayesian approach avoids many of the problems of classical statistics and is also well suited to a broader range of problems. This paper reviews the potential and actual use of Bayes in the law and explains the main reasons for its lack of impact on legal practice. These include misconceptions by the legal community about Bayes’ theorem, over-reliance on the use of the likelihood ratio and the lack of adoption of modern computational methods. We argue that Bayesian Networks (BNs), which automatically produce the necessary Bayesian calculations, provide an opportunity to address most concerns about using Bayes in the law. PMID:27398389
Uniform quantized electron gas
NASA Astrophysics Data System (ADS)
Høye, Johan S.; Lomba, Enrique
2016-10-01
In this work we study the correlation energy of the quantized electron gas of uniform density at temperature T = 0. To do so we utilize methods from classical statistical mechanics. The basis for this is the Feynman path integral for the partition function of quantized systems. With this representation the quantum mechanical problem can be interpreted as, and is equivalent to, a classical polymer problem in four dimensions where the fourth dimension is imaginary time. Thus methods, results, and properties obtained in the statistical mechanics of classical fluids can be utilized. From this viewpoint we recover the well known RPA (random phase approximation). Then to improve it we modify the RPA by requiring the corresponding correlation function to be such that electrons with equal spins can not be on the same position. Numerical evaluations are compared with well known results of a standard parameterization of Monte Carlo correlation energies.
Classical Electrodynamics: Problems with solutions
NASA Astrophysics Data System (ADS)
Likharev, Konstantin K.
2018-06-01
Essential Advanced Physics is a series comprising four parts: Classical Mechanics, Classical Electrodynamics, Quantum Mechanics and Statistical Mechanics. Each part consists of two volumes, Lecture notes and Problems with solutions, further supplemented by an additional collection of test problems and solutions available to qualifying university instructors. This volume, Classical Electrodynamics: Problems with solutions, accompanies the Lecture notes volume of a two-semester graduate-level course on electricity and magnetism, including not only the interaction and dynamics of charged point particles, but also properties of dielectric, conducting, and magnetic media. The course also covers special relativity, including its kinematics and particle-dynamics aspects, and electromagnetic radiation by relativistic particles.
Bekiroğlu, Tansel; Ovayolu, Nimet; Ergün, Yusuf; Ekerbiçer, Hasan Çetin
2013-06-01
Existing studies suggest that music therapy can have favorable effects on hypertension and anxiety. We therefore set out to investigate the effect of Turkish classical music. To investigate whether Turkish classical music has positive effects on blood pressures and anxiety levels in elderly patients. This was a randomized controlled trial performed on 60 hypertensive patients living in a local elderly home in Adana, Turkey. Following the completion of a socio-demographic form for each patient, the Hamilton anxiety scale was applied. Thereafter, the subjects were randomly divided into two equal-size groups and were allowed to either listen to Turkish classical music (music therapy group) or have a resting period (control group) for 25 min. The primary and secondary outcome measures were blood pressure and Hamilton anxiety scale scores, respectively. The mean reduction in systolic blood pressure was 13.00 mmHg in the music therapy group and 6.50 mmHg in the control group. The baseline adjusted between treatment group difference was not statistically significant (95% CI 6.80-9.36). The median reductions in diastolic blood pressures were 10 mmHg both in the music therapy and control groups. The between treatment group difference was not statistically significant (Mann-Whitney U test, P = 0.839). The mean reduction in HAMA-A was 1.63 in the music therapy group and 0.77 in the control group. The baseline adjusted between treatment group difference was not statistically significant (95% CI 0.82-1.92). The study demonstrated that both Turkish classical music and resting alone have positive effects on blood pressure in patients with hypertension.
Shapes of rotating superfluid helium nanodroplets
Bernando, Charles; Tanyag, Rico Mayro P.; Jones, Curtis; ...
2017-02-16
Rotating superfluid He droplets of approximately 1 μm in diameter were obtained in a free nozzle beam expansion of liquid He in vacuum and were studied by single-shot coherent diffractive imaging using an x-ray free electron laser. The formation of strongly deformed droplets is evidenced by large anisotropies and intensity anomalies (streaks) in the obtained diffraction images. The analysis of the images shows that in addition to previously described axially symmetric oblate shapes, some droplets exhibit prolate shapes. Forward modeling of the diffraction images indicates that the shapes of rotating superfluid droplets are very similar to their classical counterparts, giving direct access to the droplet angular momenta and angular velocities. Here, the analyses of the radial intensity distribution and appearance statistics of the anisotropic images confirm the existence of oblate metastable superfluid droplets with large angular momenta beyond the classical bifurcation threshold.
Thermal stability of charged rotating quantum black holes
NASA Astrophysics Data System (ADS)
Sinha, Aloke Kumar; Majumdar, Parthasarathi
2017-12-01
Criteria for thermal stability of charged rotating black holes of any dimension are derived for horizon areas that are large relative to the Planck area (in these dimensions). The derivation is based on generic assumptions of quantum geometry, supported by some results of loop quantum gravity, and equilibrium statistical mechanics of the Grand Canonical ensemble. There is no explicit use of classical spacetime geometry in this analysis. The only assumption is that the mass of the black hole is a function of its horizon area, charge and angular momentum. Our stability criteria are then tested in detail against specific classical black holes in spacetime dimensions 4 and 5, whose metrics provide us with explicit relations for the dependence of the mass on the charge and angular momentum of the black holes. This enables us to predict which of these black holes are expected to be thermally unstable under Hawking radiation.
Medlicott, Shaun A C; Guggisberg, Kelly A; DesCôteaux, Jean-Gaston; Beck, Paul
2006-07-01
Enterocolic lymphocytic phlebitis is a rare cause of segmental ischemic enterocolitis. This artery-sparing transmural vasculitis is classically a circumferential phlebitis with perivenular lymphocyte cuffing and thrombi in the absence of systemic manifestations. Myointimal hyperplasia may represent a chronic phase of enterocolic lymphocytic phlebitis. Subclinical or early stage enterocolic lymphocytic phlebitis is not well delineated. We analyzed 600 submucosal and subserosal veins from both ischemic and intact bowel segments to discern if vascular morphology varied between sites. Crescentic and circumferential lymphocytic phlebitis is more common in viable bowel than in the ischemic segment. A nonsignificant trend was found for increased crescentic morphology in intact bowel remote from the ischemic focus compared with that adjacent to the ischemic focus. Hallmarks of ischemic bowel are necrotizing phlebitis and thrombus formation. Thrombophlebitis morphology is thus distinctly different in viable and ischemic bowel, changing from classic lymphocytic lesions in the former to necrotizing lesions in the latter.
Connected Text Reading and Differences in Text Reading Fluency in Adult Readers
Wallot, Sebastian; Hollis, Geoff; van Rooij, Marieke
2013-01-01
The process of connected text reading has received very little attention in contemporary cognitive psychology. This lack of attention is in part due to a research tradition that emphasizes the role of basic lexical constituents, which can be studied in isolated words or sentences. However, it is also due in part to the lack of statistical analysis techniques that accommodate interdependent time series. In this study, we investigate text reading performance with traditional and nonlinear analysis techniques and show how outcomes from multiple analyses can be used to create a more detailed picture of the process of text reading. Specifically, we investigate reading performance of groups of literate adult readers that differ in reading fluency during a self-paced text reading task. Our results indicate that classical metrics of reading (such as word frequency) do not capture text reading very well, and that classical measures of reading fluency (such as average reading time) distinguish relatively poorly between participant groups. Nonlinear analyses of distribution tails and reading time fluctuations provide more fine-grained information about the reading process and reading fluency. PMID:23977177
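The nonlinear analyses of reading time fluctuations mentioned above are not spelled out in the abstract; a common choice for interdependent time series of this kind is detrended fluctuation analysis (DFA). The sketch below is an illustrative reconstruction under that assumption, not the authors' code; the window sizes and the synthetic reading-time series are placeholders.

    import numpy as np

    def dfa(x, windows):
        """Detrended fluctuation analysis: returns the scaling exponent alpha."""
        y = np.cumsum(x - np.mean(x))                  # integrated (profile) series
        fluct = []
        for w in windows:
            rms = []
            for i in range(len(y) // w):
                seg = y[i * w:(i + 1) * w]
                t = np.arange(w)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend per window
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            fluct.append(np.mean(rms))
        # slope of log F(w) versus log w estimates the exponent
        return np.polyfit(np.log(windows), np.log(fluct), 1)[0]

    rng = np.random.default_rng(0)
    reading_times = rng.lognormal(-1.0, 0.4, size=2048)    # synthetic per-word times
    print(dfa(reading_times, [16, 32, 64, 128, 256]))      # ~0.5 for uncorrelated noise

Exponents near 0.5 indicate uncorrelated fluctuations; values approaching 1 indicate the long-range correlations that distinguish readers in analyses of this kind.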
Chatterji, Madhabi
2002-01-01
This study examines the validity of data generated by the School Readiness for Reforms: Leader Questionnaire (SRR-LQ) using an iterative procedure that combines classical and Rasch rating scale analysis. Following content-validation and pilot-testing, principal axis factor extraction and promax rotation of factors yielded a five factor structure consistent with the content-validated subscales of the original instrument. Factors were identified based on inspection of pattern and structure coefficients. The rotated factor pattern, inter-factor correlations, convergent validity coefficients, and Cronbach's alpha reliability estimates supported the hypothesized construct properties. To further examine unidimensionality and efficacy of the rating scale structures, item-level data from each factor-defined subscale were subjected to analysis with the Rasch rating scale model. Data-to-model fit statistics and separation reliability for items and persons met acceptable criteria. Rating scale results suggested consistency of expected and observed step difficulties in rating categories, and correspondence of step calibrations with increases in the underlying variables. The combined approach yielded more comprehensive diagnostic information on the quality of the five SRR-LQ subscales; further research is continuing.
Poloni, Soraia; Leistner-Segal, Sandra; Bandeira, Isabel Cristina; D'Almeida, Vânia; de Souza, Carolina Fischinger Moura; Spritzer, Poli Mara; Castro, Kamila; Tonon, Tássia; Nalin, Tatiéle; Imbard, Apolline; Blom, Henk J; Schwartz, Ida V D
2014-08-10
Classical homocystinuria is a rare genetic disease caused by cystathionine β-synthase deficiency, resulting in homocysteine accumulation. Growing evidence suggests that reduced fat mass in patients with classical homocystinuria may be associated with alterations in choline and homocysteine pathways. This study aimed to evaluate the body composition of patients with classical homocystinuria, identifying changes in body fat percentage and correlating findings with biochemical markers of homocysteine and choline pathways, lipoprotein levels and bone mineral density (BMD) T-scores. Nine patients with classical homocystinuria were included in the study. Levels of homocysteine, methionine, cysteine, choline, betaine, dimethylglycine and ethanolamine were determined. Body composition was assessed by bioelectrical impedance analysis (BIA) in patients and in 18 controls. Data on the last BMD measurement and lipoprotein profile were obtained from medical records. Of 9 patients, 4 (44%) had a low body fat percentage, but no statistically significant differences were found between patients and controls. Homocysteine and methionine levels were negatively correlated with body mass index (BMI), while cysteine showed a positive correlation with BMI (p<0.05). There was a trend between total choline levels and body fat percentage (r=0.439, p=0.07). HDL cholesterol correlated with choline and ethanolamine levels (r=0.757, p=0.049; r=0.847, p=0.016, respectively), and total cholesterol also correlated with choline levels (r=0.775, p=0.041). There was no association between BMD T-scores and body composition. These results suggest that reduced fat mass is common in patients with classical homocystinuria, and that alterations in homocysteine and choline pathways affect body mass and lipid metabolism. Copyright © 2014 Elsevier B.V. All rights reserved.
A Gibbs sampler for Bayesian analysis of site-occupancy data
Dorazio, Robert M.; Rodriguez, Daniel Taylor
2012-01-01
1. A Bayesian analysis of site-occupancy data containing covariates of species occurrence and species detection probabilities is usually completed using Markov chain Monte Carlo methods in conjunction with software programs that can implement those methods for any statistical model, not just site-occupancy models. Although these software programs are quite flexible, considerable experience is often required to specify a model and to initialize the Markov chain so that summaries of the posterior distribution can be estimated efficiently and accurately. 2. As an alternative to these programs, we develop a Gibbs sampler for Bayesian analysis of site-occupancy data that include covariates of species occurrence and species detection probabilities. This Gibbs sampler is based on a class of site-occupancy models in which probabilities of species occurrence and detection are specified as probit-regression functions of site- and survey-specific covariate measurements. 3. To illustrate the Gibbs sampler, we analyse site-occupancy data of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly species in Switzerland. Our analysis includes a comparison of results based on Bayesian and classical (non-Bayesian) methods of inference. We also provide code (based on the R software program) for conducting Bayesian and classical analyses of site-occupancy data.
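The conjugacy that makes such a Gibbs sampler tractable comes from the Albert-Chib latent-variable representation of the probit link. The following is a minimal sketch of that core update for a single probit regression (e.g., the occurrence submodel), assuming a N(0, tau^2 I) prior; the detection submodel and the full site-occupancy structure are omitted, so this illustrates the idea rather than reproducing the authors' R code.

    import numpy as np
    from scipy.stats import truncnorm

    def probit_gibbs(X, y, n_iter=2000, tau2=100.0, seed=1):
        """Albert-Chib Gibbs sampler for probit regression."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        V = np.linalg.inv(X.T @ X + np.eye(p) / tau2)   # posterior covariance
        L = np.linalg.cholesky(V)
        beta = np.zeros(p)
        draws = np.empty((n_iter, p))
        for it in range(n_iter):
            mu = X @ beta
            # latent u_i ~ N(mu_i, 1), truncated to u_i > 0 iff y_i = 1
            lo = np.where(y == 1, -mu, -np.inf)
            hi = np.where(y == 1, np.inf, -mu)
            u = mu + truncnorm.rvs(lo, hi, random_state=rng)
            beta = V @ (X.T @ u) + L @ rng.standard_normal(p)  # conjugate draw
            draws[it] = beta
        return draws

    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(200), rng.standard_normal(200)])
    beta_true = np.array([-0.5, 1.0])
    y = (X @ beta_true + rng.standard_normal(200) > 0).astype(int)
    print(probit_gibbs(X, y)[1000:].mean(axis=0))   # posterior mean near beta_true

Because every full conditional is available in closed form, no tuning or general-purpose MCMC software is needed, which is the efficiency argument made in the abstract.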
USDA-ARS?s Scientific Manuscript database
A classic paper on the integrated control concept appeared in the later part of the 1950’s, led by Vernon Stern, Ray Smith, Robert van den Bosch, and Kenneth Hagen. Numerous concepts and definitions were formulated at that time. In this presentation, a short philosophical summary will be presented...
Yang, James J; Li, Jia; Williams, L Keoki; Buu, Anne
2016-01-05
In genome-wide association studies (GWAS) for complex diseases, the association between a SNP and each phenotype is usually weak. Combining multiple related phenotypic traits can increase the power of gene search and thus is a practically important area that requires methodology work. This study provides a comprehensive review of existing methods for conducting GWAS on complex diseases with multiple phenotypes, including the multivariate analysis of variance (MANOVA), the principal component analysis (PCA), the generalized estimating equations (GEE), the trait-based association test involving the extended Simes procedure (TATES), and the classical Fisher combination test. We propose a new method that relaxes the unrealistic independence assumption of the classical Fisher combination test and is computationally efficient. To demonstrate applications of the proposed method, we also present the results of statistical analysis on the Study of Addiction: Genetics and Environment (SAGE) data. Our simulation study shows that the proposed method has higher power than existing methods while controlling the type I error rate. The GEE and the classical Fisher combination test, on the other hand, do not control the type I error rate and thus are not recommended. In general, the power of the competing methods decreases as the correlation between phenotypes increases. All the methods tend to have lower power when the multivariate phenotypes come from long-tailed distributions. The real data analysis also demonstrates that the proposed method allows us to compare the marginal results with the multivariate results and to specify which SNPs are specific to a particular phenotype or contribute to the common construct. The proposed method outperforms existing methods in most settings and also has great applications in GWAS on complex diseases with multiple phenotypes such as substance abuse disorders.
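For reference, the classical Fisher combination statistic that the proposed method relaxes is T = -2 * sum_i log p_i, referred to a chi-squared distribution with 2k degrees of freedom, which is valid only when the k phenotype-level p-values are independent. A quick sketch (the paper's dependence-adjusted reference distribution is not reproduced here):

    import numpy as np
    from scipy import stats

    pvals = np.array([0.04, 0.20, 0.01])    # per-phenotype p-values for one SNP

    # Classical Fisher combination, assuming independent phenotypes
    T = -2.0 * np.sum(np.log(pvals))
    p_fisher = stats.chi2.sf(T, df=2 * len(pvals))

    # Equivalent scipy one-liner
    stat, p = stats.combine_pvalues(pvals, method="fisher")
    print(T, p_fisher, p)

With correlated phenotypes the chi-squared reference is wrong, which is exactly why this test loses control of the type I error rate in the comparison above.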
CTTITEM: SAS macro and SPSS syntax for classical item analysis.
Lei, Pui-Wa; Wu, Qiong
2007-08-01
This article describes the functions of a SAS macro and an SPSS syntax that produce common statistics for conventional item analysis including Cronbach's alpha, item difficulty index (p-value or item mean), and item discrimination indices (D-index, point biserial and biserial correlations for dichotomous items and item-total correlation for polytomous items). These programs represent an improvement over the existing SAS and SPSS item analysis routines in terms of completeness and user-friendliness. To promote routine evaluations of item qualities in instrument development of any scale, the programs are available at no charge for interested users. The program codes along with a brief user's manual that contains instructions and examples are downloadable from suen.ed.psu.edu/-pwlei/plei.htm.
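The statistics these macros report are easy to reproduce elsewhere; the Python sketch below (an illustration, not the paper's SAS/SPSS code) computes Cronbach's alpha, the item difficulty p-value, and a corrected item-total discrimination index for dichotomous items.

    import numpy as np

    def item_analysis(scores):
        """scores: (n_persons, n_items) array of 0/1 item scores."""
        n_items = scores.shape[1]
        total = scores.sum(axis=1)
        # Cronbach's alpha
        alpha = n_items / (n_items - 1) * (
            1 - scores.var(axis=0, ddof=1).sum() / total.var(ddof=1))
        # Item difficulty: proportion correct (the CTT "p-value")
        difficulty = scores.mean(axis=0)
        # Corrected item-total correlation (item excluded from the total);
        # for 0/1 items this Pearson r is the point-biserial correlation
        discrimination = np.array([
            np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]
            for j in range(n_items)])
        return alpha, difficulty, discrimination

    rng = np.random.default_rng(2)
    demo = (rng.random((100, 10)) < 0.6).astype(int)   # synthetic responses
    alpha, diff, disc = item_analysis(demo)
    print(alpha, diff.round(2), disc.round(2))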
Wright, L E
1997-02-01
Enamel hypoplasias, which record interacting stresses of nutrition and illness during the period of tooth formation, are a key tool in the study of childhood health in prehistory. But interpretation of the age of peak morbidity is complicated by differences in susceptibility to stress both between tooth positions and within a single tooth. Here, hypoplasias are used to evaluate the prevailing ecological model for the collapse of Classic Period Lowland Maya civilization, circa AD 900. Hypoplasias were recorded in the full dentition of 160 adult skeletons from six archaeological sites in the Pasion River region of Guatemala. Instead of constructing a composite scale of stress experience, teeth are considered separately by position in the analysis. No statistical differences are found in the proportion of teeth affected by hypoplasia between "Early," Late Classic, and Terminal Classic Periods for anterior teeth considered to be most susceptible to stress, indicating stability in the overall stress loads affecting children of the three chronological periods. However, hypoplasia trends in posterior teeth may imply a change in the ontogenetic timing of more severe stress episodes during the final occupation and perhaps herald a shift in child-care practices. These results provide little support for the ecological model of collapse but do call to attention the potential of posterior teeth to reveal subtle changes in childhood morbidity when considered individually.
Prudlo, Johannes; Bißbort, Charlotte; Glass, Aenne; Grossmann, Annette; Hauenstein, Karlheinz; Benecke, Reiner; Teipel, Stefan J
2012-09-01
The aim of this work was to investigate white-matter microstructural changes within and outside the corticospinal tract in classical amyotrophic lateral sclerosis (ALS) and in lower motor neuron (LMN) ALS variants by means of diffusion tensor imaging (DTI). We investigated 22 ALS patients and 21 age-matched controls utilizing a whole-brain approach with a 1.5-T scanner for DTI. The patient group comprised 15 classical ALS patients and seven LMN ALS-variant patients (progressive muscular atrophy, flail arm and flail leg syndrome). Disease severity was measured by the revised version of the functional rating scale. White matter fractional anisotropy (FA) was assessed using tract-based spatial statistics (TBSS) and a region of interest (ROI) approach. We found significant FA reductions in motor and extra-motor cerebral fiber tracts in classical ALS and in the LMN ALS-variant patients compared to controls. The voxel-based TBSS results were confirmed by the ROI findings. The white matter damage correlated with the disease severity in the patient group and was found in a similar distribution, but to a lesser extent, among the LMN ALS-variant subgroup. ALS and LMN ALS variants are multisystem degenerations. DTI shows the potential to determine an earlier diagnosis, particularly in LMN ALS variants. The statistically identical findings of white matter lesions in classical ALS and LMN variants as ascertained by DTI further underline that these variants should be regarded as part of the ALS spectrum.
Interference in the classical probabilistic model and its representation in complex Hilbert space
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei Yu.
2005-10-01
The notion of a context (complex of physical conditions, that is to say: specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy: conservation of probabilities. In general the Hilbert space projection of the prespace dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
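The interference of probabilities mentioned here has a compact form. In Khrennikov's contextual framework the formula of total probability for a dichotomous complementary observable acquires a cosine cross term, schematically (the notation below is a paraphrase, not a quotation from the paper):

    % b is a value of the measured observable, a ranges over the two values
    % of the complementary observable, C is the context
    P(b \mid C) = \sum_{a} P(a \mid C)\, P(b \mid a)
                + 2 \cos\theta_b \sqrt{\prod_{a} P(a \mid C)\, P(b \mid a)}

Setting cos theta_b = 0 recovers the classical formula of total probability, while |cos theta_b| <= 1 yields the trigonometric representation in complex Hilbert space described in the abstract.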
Epistemic View of Quantum States and Communication Complexity of Quantum Channels
NASA Astrophysics Data System (ADS)
Montina, Alberto
2012-09-01
The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previous known upper bound was 1.85 bits.
Asymptotic modal analysis and statistical energy analysis
NASA Technical Reports Server (NTRS)
Dowell, Earl H.
1988-01-01
Statistical Energy Analysis (SEA) is defined by considering the asymptotic limit of Classical Modal Analysis, an approach called Asymptotic Modal Analysis (AMA). The general approach is described for both structural and acoustical systems. The theoretical foundation is presented for structural systems, and experimental verification is presented for a structural plate responding to a random force. Work accomplished subsequent to the grant initiation focused on the acoustic response of an interior cavity (i.e., an aircraft or spacecraft fuselage) with a portion of the wall vibrating in a large number of structural modes. First results were presented at the ASME Winter Annual Meeting in December, 1987, and accepted for publication in the Journal of Vibration, Acoustics, Stress and Reliability in Design. It is shown that asymptotically, as the number of acoustic modes excited becomes large, the pressure level in the cavity becomes uniform except at the cavity boundaries. However, the mean square pressure at the cavity corner, edge and wall is, respectively, 8, 4, and 2 times the value in the cavity interior. Also it is shown that when the portion of the wall which is vibrating is near a cavity corner or edge, the response is significantly higher.
Revesz, Kinga M.; Landwehr, Jurate M.
2002-01-01
A new method was developed to analyze the stable carbon and oxygen isotope ratios of small samples (400 ± 20 µg) of calcium carbonate. This new method streamlines the classical phosphoric acid/calcium carbonate (H3PO4/CaCO3) reaction method by making use of a recently available Thermoquest-Finnigan GasBench II preparation device and a Delta Plus XL continuous flow isotope ratio mass spectrometer. Conditions for which the H3PO4/CaCO3 reaction produced reproducible and accurate results with minimal error had to be determined. When the acid/carbonate reaction temperature was kept at 26 °C and the reaction time was between 24 and 54 h, the precision of the carbon and oxygen isotope ratios for pooled samples from three reference standard materials was ≤0.1 and ≤0.2 per mill or ‰, respectively, although later analysis showed that materials from one specific standard required reaction time between 34 and 54 h for δ18O to achieve this level of precision. Aliquot screening methods were shown to further minimize the total error. The accuracy and precision of the new method were analyzed and confirmed by statistical analysis. The utility of the method was verified by analyzing calcite from Devils Hole, Nevada, for which isotope-ratio values had previously been obtained by the classical method. Devils Hole core DH-11 recently had been re-cut and re-sampled, and isotope-ratio values were obtained using the new method. The results were comparable with those obtained by the classical method with correlation = +0.96 for both isotope ratios. The consistency of the isotopic results is such that an alignment offset could be identified in the re-sampled core material, and two cutting errors that occurred during re-sampling then were confirmed independently. This result indicates that the new method is a viable alternative to the classical reaction method. In particular, the new method requires less sample material permitting finer resolution and allows automation of some processes resulting in considerable time savings.
Bravini, Elisabetta; Franchignoni, Franco; Giordano, Andrea; Sartorio, Francesco; Ferriero, Giorgio; Vercelli, Stefano; Foti, Calogero
2015-01-01
To perform a comprehensive analysis of the psychometric properties and dimensionality of the Upper Limb Functional Index (ULFI) using both classical test theory and Rasch analysis (RA). Prospective, single-group observational design. Freestanding rehabilitation center. Convenience sample of Italian-speaking subjects with upper limb musculoskeletal disorders (N=174). Not applicable. The Italian version of the ULFI. Data were analyzed using parallel analysis, exploratory factor analysis, and RA for evaluating dimensionality, functioning of rating scale categories, item fit, hierarchy of item difficulties, and reliability indices. Parallel analysis revealed 2 factors explaining 32.5% and 10.7% of the response variance. RA confirmed the failure of the unidimensionality assumption, and 6 items out of the 25 misfitted the Rasch model. When the analysis was rerun excluding the misfitting items, the scale showed acceptable fit values, loading meaningfully to a single factor. Item separation reliability and person separation reliability were .98 and .89, respectively. Cronbach alpha was .92. RA revealed weakness of the scale concerning dimensionality and internal construct validity. However, a set of 19 ULFI items defined through the statistical process demonstrated a unidimensional structure, good psychometric properties, and clinical meaningfulness. These findings represent a useful starting point for further analyses of the tool (based on modern psychometric approaches and confirmatory factor analysis) in larger samples, including different patient populations and nationalities. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Clopper-Pearson bounds from HEP data cuts
NASA Astrophysics Data System (ADS)
Berg, B. A.
2001-08-01
For the measurement of N_s signals in N events, rigorous confidence bounds on the true signal probability p_exact were established in a classical paper by Clopper and Pearson [Biometrika 26, 404 (1934)]. Here, their bounds are generalized to the HEP situation where cuts on the data tag signals with probability P_s and background data with likelihood P_b
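For orientation, the classical (non-generalized) Clopper-Pearson bounds follow directly from beta quantiles. A short sketch, with k observed signals in n events as placeholder inputs; the HEP generalization with tagging probabilities P_s and P_b is beyond this snippet:

    from scipy.stats import beta

    def clopper_pearson(k, n, conf=0.95):
        """Exact (conservative) confidence bounds on a binomial proportion."""
        a = (1.0 - conf) / 2.0
        lower = beta.ppf(a, k, n - k + 1) if k > 0 else 0.0
        upper = beta.ppf(1.0 - a, k + 1, n - k) if k < n else 1.0
        return lower, upper

    print(clopper_pearson(7, 100))   # bounds on the true signal probability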
Wash-out in N_2-dominated leptogenesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hahn-Woernle, F., E-mail: fhahnwo@mppmu.mpg.de
2010-08-01
We study the wash-out of a cosmological baryon asymmetry produced via leptogenesis by subsequent interactions. To this end, we focus on a scenario in which a lepton asymmetry is established in the out-of-equilibrium decays of the next-to-lightest right-handed neutrino. We apply the full classical Boltzmann equations, without the assumption of kinetic equilibrium and including all quantum statistical factors, to calculate the wash-out of the lepton asymmetry by interactions of the lightest right-handed state. We include scattering processes with top quarks in our analysis. This is of particular interest since the wash-out is enhanced by scatterings and by the use of mode equations with quantum statistical distribution functions. In this way we provide a restriction on the parameter space for this scenario.
On the Chronological Structure of the Solutrean in Southern Iberia
Cascalheira, João; Bicho, Nuno
2015-01-01
The Solutrean techno-complex has gained particular significance over time for representing a clear demographic and techno-typological deviation from the developments occurred during the course of the Upper Paleolithic in Western Europe. Some of Solutrean’s most relevant features are the diversity and techno-typological characteristics of the lithic armatures. These have been recurrently used as pivotal elements in numerous Solutrean-related debates, including the chronological organization of the techno-complex across Iberia and Southwestern France. In Southern Iberia, patterns of presence and/or absence of specific point types in stratified sequences tend to validate the classical ordering of the techno-complex into Lower, Middle and Upper phases, although some evidence, namely radiocarbon determinations, have not always been corroborative. Here we present the first comprehensive analysis of the currently available radiocarbon data for the Solutrean in Southern Iberia. We use a Bayesian statistical approach from 13 stratified sequences to compare the duration, and the start and end moments of each classic Solutrean phase across sites. We conclude that, based on the current data, the traditional organization of the Solutrean cannot be unquestionably confirmed for Southern Iberia, calling into doubt the status of the classically-defined type-fossils as precise temporal markers. PMID:26355459
Wetting of heterogeneous substrates. A classical density-functional-theory approach
NASA Astrophysics Data System (ADS)
Yatsyshin, Peter; Parry, Andrew O.; Rascón, Carlos; Duran-Olivencia, Miguel A.; Kalliadasis, Serafim
2017-11-01
Wetting is the nucleation of a third phase (liquid) on the interface between two different phases (solid and gas). In many experimentally accessible cases of wetting, the interplay between the substrate structure and the fluid-fluid and fluid-substrate intermolecular interactions leads to the appearance of a whole "zoo" of exciting interface phase transitions, associated with the formation of nano-droplets/bubbles and thin films. Practical applications of wetting at small scales are numerous and include the design of lab-on-a-chip devices and superhydrophobic surfaces. In this talk, we will use a fully microscopic approach to explore the phase space of a planar wall, decorated with patches of different hydrophobicity, and demonstrate the highly non-trivial behaviour of the liquid-gas interface near the substrate. We will present fluid density profiles, adsorption isotherms and wetting phase diagrams. Our analysis is based on a formulation of statistical mechanics commonly known as classical density-functional theory. It provides a computationally friendly and rigorous framework, suitable for probing the small-scale physics of classical fluids and other soft-matter systems. EPSRC Grants No. EP/L027186 and EP/K503733; ERC Advanced Grant No. 247031.
Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas
2009-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimental, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual class; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention. PMID:19750185
Statistical Interpretation of the Local Field Inside Dielectrics.
ERIC Educational Resources Information Center
Berrera, Ruben G.; Mello, P. A.
1982-01-01
Compares several derivations of the Clausius-Mossotti relation to analyze consistently the nature of the approximations used and their range of applicability. Also presents a statistical-mechanical calculation of the local field for a classical system of harmonic oscillators interacting via the Coulomb potential. (Author/SK)
Portfolio Analysis for Vector Calculus
ERIC Educational Resources Information Center
Kaplan, Samuel R.
2015-01-01
Classic stock portfolio analysis provides an applied context for Lagrange multipliers that undergraduate students appreciate. Although modern methods of portfolio analysis are beyond the scope of vector calculus, classic methods reinforce the utility of this material. This paper discusses how to introduce classic stock portfolio analysis in a…
Properties of the Boltzmann equation in the classical approximation
Epelbaum, Thomas; Gelis, François; Tanji, Naoto; ...
2014-12-30
We examine the Boltzmann equation with elastic point-like scalar interactions in two different versions of the classical approximation. Although solving the Boltzmann equation numerically with the unapproximated collision term poses no problem, this allows one to study the effect of the ultraviolet cutoff in these approximations. This cutoff dependence in the classical approximations of the Boltzmann equation is closely related to the non-renormalizability of the classical statistical approximation of the underlying quantum field theory. The kinetic theory setup that we consider here allows one to study the dependence on the ultraviolet cutoff in a much simpler way, since one also has access to the non-approximated result for comparison.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Hongqiang; Chen, Hao; Bao, Lei
2005-01-01
Genetic loci that regulate inherited traits are routinely identified using quantitative trait locus (QTL) mapping methods. However, the genotype-phenotype associations do not provide information on the gene expression program through which the genetic loci regulate the traits. Transcription modules are 'self-consistent regulatory units' and are closely related to the modular components of gene regulatory networks [Ihmels, J., Friedlander, G., Bergmann, S., Sarig, O., Ziv, Y. and Barkai, N. (2002) Revealing modular organization in the yeast transcriptional network. Nat. Genet., 31, 370-377; Segal, E., Shapira, M., Regev, A., Pe'er, D., Botstein, D., Koller, D. and Friedman, N. (2003) Module networks: identifying regulatory modules and their condition-specific regulators from gene expression data. Nat. Genet., 34, 166-176]. We used genome-wide genotype and gene expression data of a genetic reference population that consists of mice of 32 recombinant inbred strains to identify the transcription modules and the genetic loci regulating them. Twenty-nine transcription modules defined by genetic variations were identified. Statistically significant associations between the transcription modules and 18 classical physiological and behavioral traits were found. Genome-wide interval mapping showed that major QTLs regulating the transcription modules are often co-localized with the QTLs regulating the associated classical traits. The association and the possible co-regulation of the classical trait and transcription module indicate that the transcription module may be involved in the gene pathways connecting the QTL and the classical trait. Our results show that a transcription module may associate with multiple seemingly unrelated classical traits and a classical trait may associate with different modules. Literature mining results provided strong independent evidence for the relations among genes of the transcription modules, genes in the regions of the QTLs regulating the transcription modules, and the keywords representing the classical traits.
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
Evidence of non-classical (squeezed) light in biological systems
NASA Astrophysics Data System (ADS)
Popp, F. A.; Chang, J. J.; Herzog, A.; Yan, Z.; Yan, Y.
2002-01-01
By use of coincidence measurements on “ultraweak” photon emission, the photocount statistics (PCS) of artificial visible light turns out, as expected, to follow super-Poissonian PCS. Biophotons, originating spontaneously or after light induction from living systems, display super-Poissonian, Poissonian and even sub-Poissonian PCS. This result shows for the first time evidence of non-classical (squeezed) light in living tissues.
Rydberg Atoms in Strong Fields: a Testing Ground for Quantum Chaos.
NASA Astrophysics Data System (ADS)
Courtney, Michael
1995-01-01
Rydberg atoms in strong static electric and magnetic fields provide experimentally accessible systems for studying the connections between classical chaos and quantum mechanics in the semiclassical limit. This experimental accessibility has motivated the development of reliable quantum mechanical solutions. This thesis uses both experimental and computed quantum spectra to test the central approaches to quantum chaos. These approaches consist mainly of developing methods to compute the spectra of quantum systems in non-perturbative regimes, correlating statistical descriptions of eigenvalues with the classical behavior of the same Hamiltonian, and developing semiclassical methods such as periodic-orbit theory. Particular emphasis is given to identifying the spectral signature of recurrences: quantum wave packets which follow classical orbits. The new findings include: the breakdown of the connection between energy-level statistics and classical chaos in odd-parity diamagnetic lithium, the discovery of the signature of very long period orbits in atomic spectra, quantitative evidence for the scattering of recurrences by the alkali-metal core, quantitative description of the behavior of recurrences near bifurcations, and a semiclassical interpretation of the evolution of continuum Stark spectra.
Prequantum classical statistical field theory: background field as a source of everything?
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2011-07-01
Prequantum classical statistical field theory (PCSFT) is a new attempt to consider quantum mechanics (QM) as an emergent phenomenon, cf. De Broglie's "double solution" approach, Bohmian mechanics, stochastic electrodynamics (SED), Nelson's stochastic QM and its generalization by Davidson, 't Hooft's models and their development by Elze. PCSFT is a comeback to a purely wave viewpoint on QM, cf. the early Schrödinger. There are no quantum particles at all, only waves. In particular, photons are simply wave-pulses of the classical electromagnetic field, cf. SED. Moreover, even massive particles are special "prequantum fields": the electron field, the neutron field, and so on. PCSFT claims that (sooner or later) people will be able to measure components of these fields: components of the "photonic field" (the classical electromagnetic field of low intensity), the electronic field, the neutronic field, and so on. At the moment we are able to produce quantum correlations as correlations of classical Gaussian random fields. In this paper we are interested in the mathematical and physical reasons for the usage of Gaussian fields. We consider prequantum signals (corresponding to quantum systems) as composed of a huge number of wave-pulses (on a very fine prequantum time scale). We speculate that the prequantum background field (the field of "vacuum fluctuations") might play the role of a source of such pulses, i.e., the source of everything.
Scaling properties of the two-dimensional randomly stirred Navier-Stokes equation.
Mazzino, Andrea; Muratore-Ginanneschi, Paolo; Musacchio, Stefano
2007-10-05
We inquire into the scaling properties of the 2D Navier-Stokes equation sustained by a force field with Gaussian statistics, white noise in time, and with a power-law correlation in momentum space of degree 2 - 2ε. This is at variance with the setting usually assumed to derive Kraichnan's classical theory. We contrast accurate numerical experiments with the different predictions provided for the small-ε regime by Kraichnan's double cascade theory and by renormalization group analysis. We give clear evidence that for all ε, Kraichnan's theory is consistent with the observed phenomenology. Our results call for a revision in the renormalization group analysis of (2D) fully developed turbulence.
Tukiendorf, Andrzej; Mansournia, Mohammad Ali; Wydmański, Jerzy; Wolny-Rokicka, Edyta
2017-04-01
Background: Clinical datasets for epithelial ovarian cancer patients with brain metastases are usually small in size. When adequate case numbers are lacking, the resulting estimates of regression coefficients may demonstrate bias. One of the direct approaches to reduce such sparse-data bias is based on penalized estimation. Methods: A re-analysis of formerly reported hazard ratios in diagnosed patients was performed using penalized Cox regression with a popular SAS package, providing additional software codes for the statistical computational procedure. Results: It was found that the penalized approach can readily diminish sparse-data artefacts and radically reduce the magnitude of estimated regression coefficients. Conclusions: It was confirmed that classical statistical approaches may exaggerate regression estimates or distort study interpretations and conclusions. The results support the thesis that penalization via weakly informative priors and data augmentation are the safest approaches to shrinking sparse-data artefacts frequently occurring in epidemiological research.
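The paper's computations are in SAS; as an illustration of the same idea in another environment, a ridge-type penalty on the Cox partial likelihood shrinks coefficients estimated from sparse event data. A hedged sketch using the Python lifelines package (the toy data and penalty weight are placeholders, not values from the study):

    import pandas as pd
    from lifelines import CoxPHFitter

    # Placeholder survival data: duration, event indicator, two sparse covariates
    df = pd.DataFrame({
        "time":  [5, 8, 12, 3, 9, 14, 2, 7],
        "event": [1, 0, 1, 1, 0, 1, 1, 0],
        "x1":    [0, 0, 1, 0, 0, 1, 0, 0],
        "x2":    [1, 0, 0, 1, 1, 0, 0, 1],
    })

    # penalizer > 0 adds a quadratic (ridge) penalty that pulls the
    # log-hazard-ratio estimates away from sparse-data extremes
    cph = CoxPHFitter(penalizer=0.5)
    cph.fit(df, duration_col="time", event_col="event")
    print(cph.summary[["coef", "exp(coef)"]])

Increasing the penalizer corresponds to a tighter weakly informative prior on the coefficients, which is the Bayesian reading of the shrinkage advocated in the conclusions.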
Quantum signature of chaos and thermalization in the kicked Dicke model
NASA Astrophysics Data System (ADS)
Ray, S.; Ghosh, A.; Sinha, S.
2016-09-01
We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.
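The quasienergy spacing analysis compares the Floquet spectrum with random-matrix ensembles; a model-independent diagnostic of the same kind is the consecutive-spacing-ratio statistic, whose ensemble means (about 0.53 for the Gaussian orthogonal ensemble, about 0.39 for Poisson) separate chaotic from regular spectra. A minimal sketch on surrogate spectra (not the kicked Dicke model itself):

    import numpy as np

    def mean_spacing_ratio(levels):
        """Mean of r_n = min(s_n, s_n+1) / max(s_n, s_n+1) over sorted levels."""
        s = np.diff(np.sort(levels))
        r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
        return r.mean()

    rng = np.random.default_rng(3)
    N = 1000
    A = rng.standard_normal((N, N))
    goe = (A + A.T) / 2.0                               # GOE surrogate matrix
    print(mean_spacing_ratio(np.linalg.eigvalsh(goe)))  # ~0.53: level repulsion
    print(mean_spacing_ratio(rng.uniform(size=N)))      # ~0.39: Poisson-like

Unlike the raw spacing distribution, the ratio statistic needs no spectral unfolding, which makes it convenient for quasienergies.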
NASA Technical Reports Server (NTRS)
Jasperson, W. H.; Nastrom, G. D.; Davis, R. E.; Holdeman, J. D.
1984-01-01
Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle concentration data gathered concurrently with the cloud observations. Cloud encounters are shown on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical.
Counting statistics of chaotic resonances at optical frequencies: Theory and experiments
NASA Astrophysics Data System (ADS)
Lippolis, Domenico; Wang, Li; Xiao, Yun-Feng
2017-07-01
A deformed dielectric microcavity is used as an experimental platform for the analysis of the statistics of chaotic resonances, in the perspective of testing fractal Weyl laws at optical frequencies. In order to surmount the difficulties that arise from reading strongly overlapping spectra, we exploit the mixed nature of the phase space at hand, and only count the high-Q whispering-gallery modes (WGMs) directly. That enables us to draw statistical information on the more lossy chaotic resonances, coupled to the high-Q regular modes via dynamical tunneling. Three different models [classical, Random-Matrix-Theory (RMT) based, semiclassical] to interpret the experimental data are discussed. On the basis of least-squares analysis, theoretical estimates of Ehrenfest time, and independent measurements, we find that a semiclassically modified RMT-based expression best describes the experiment in all its realizations, particularly when the resonator is coupled to visible light, while RMT alone still works quite well in the infrared. In this work we reexamine and substantially extend the results of a short paper published earlier [L. Wang et al., Phys. Rev. E 93, 040201(R) (2016), 10.1103/PhysRevE.93.040201].
Statistical Methodology for the Analysis of Repeated Duration Data in Behavioral Studies.
Letué, Frédérique; Martinez, Marie-José; Samson, Adeline; Vilain, Anne; Vilain, Coriandre
2018-03-15
Repeated duration data are frequently used in behavioral studies. Classical linear or log-linear mixed models are often inadequate to analyze such data, because such data usually consist of nonnegative and skew-distributed variables. Therefore, we recommend use of a statistical methodology specific to duration data. We propose a methodology based on Cox mixed models and written under the R language. This semiparametric model is indeed flexible enough to fit duration data. To compare log-linear and Cox mixed models in terms of goodness-of-fit on real data sets, we also provide a procedure based on simulations and quantile-quantile plots. We present two examples from a data set of speech and gesture interactions, which illustrate the limitations of linear and log-linear mixed models, as compared to Cox models. The linear models are not validated on our data, whereas Cox models are. Moreover, in the second example, the Cox model exhibits a significant effect that the linear model does not. We provide methods to select the best-fitting models for repeated duration data and to compare statistical methodologies. In this study, we show that Cox models are best suited to the analysis of our data set.
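The simulation-and-QQ-plot comparison described above is generic enough to sketch for the log-linear side. Below is an illustrative Python version using statsmodels (the R fit for the Cox mixed model is not reproduced; variable names and the synthetic data are assumptions):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n_subj, n_rep = 30, 10
    subj = np.repeat(np.arange(n_subj), n_rep)
    b = rng.normal(0, 0.3, n_subj)                       # random subject effects
    dur = np.exp(0.5 + b[subj] + rng.normal(0, 0.5, subj.size))
    df = pd.DataFrame({"logdur": np.log(dur), "subj": subj})

    # Fit the log-linear mixed model
    m = smf.mixedlm("logdur ~ 1", df, groups=df["subj"]).fit()

    # Simulate replicated data from the fitted model, then compare quantiles
    sigma_b = np.sqrt(m.cov_re.iloc[0, 0])               # random-effect SD
    sigma_e = np.sqrt(m.scale)                           # residual SD
    sim = (m.fe_params.iloc[0] + rng.normal(0, sigma_b, n_subj)[subj]
           + rng.normal(0, sigma_e, subj.size))

    probs = np.linspace(0.05, 0.95, 19)
    qq = np.column_stack([np.quantile(df["logdur"], probs), np.quantile(sim, probs)])
    print(qq)   # points near the diagonal indicate an adequate fit

When the observed durations stay strongly skewed even after the log transform, the simulated quantiles drift off the diagonal, which is the failure mode that motivates the Cox mixed model.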
Hidden Statistics of Schroedinger Equation
NASA Technical Reports Server (NTRS)
Zak, Michail
2011-01-01
Work was carried out to determine the mathematical origin of randomness in quantum mechanics and to create a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of the hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.
Steenbergen, K G; Gaston, N
2014-02-14
Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
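The per-frame PCA described here amounts to diagonalizing the positional covariance (gyration) tensor of each snapshot, so three eigenvalues trace the cluster's shape through time. A minimal sketch on synthetic coordinates (trajectory parsing and the PCC bond analysis are omitted; the data are placeholders):

    import numpy as np

    def shape_eigenvalues(frame):
        """Eigenvalues of the 3x3 positional covariance (gyration) tensor
        of one MD frame with shape (n_atoms, 3), in descending order."""
        centered = frame - frame.mean(axis=0)
        gyr = centered.T @ centered / len(frame)
        return np.sort(np.linalg.eigvalsh(gyr))[::-1]

    rng = np.random.default_rng(5)
    traj = rng.normal(size=(200, 55, 3))    # 200 synthetic frames, 55 atoms
    eig = np.array([shape_eigenvalues(f) for f in traj])
    # eigenvalue anisotropy as a per-frame shape/stability signal
    print((eig[:, 0] / eig[:, 2]).mean())

A structurally stable cluster shows eigenvalues that fluctuate around constant values; isomerization events appear as jumps in this per-frame signature.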
Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Brunett, Acacia
2015-04-26
The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and a capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact on statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
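The abstract does not reproduce the two interval constructions, but the baseline is simple: the failure probability estimated from N Monte Carlo load/capacity comparisons is a binomial proportion, so classical intervals apply directly. A hedged sketch (both distributions are placeholders, not the report's models):

    import numpy as np
    from statsmodels.stats.proportion import proportion_confint

    rng = np.random.default_rng(6)
    N = 10_000
    load = rng.normal(100.0, 15.0, N)        # placeholder load distribution
    capacity = rng.normal(150.0, 10.0, N)    # placeholder capacity distribution

    failures = int(np.sum(load >= capacity))
    # Exact (Clopper-Pearson) 95% interval on the failure probability; the
    # interval width quantifies how many simulations the analysis needs
    lo, hi = proportion_confint(failures, N, alpha=0.05, method="beta")
    print(failures / N, (lo, hi))

Halving the interval width requires roughly four times as many simulations, which is the kind of trade-off such confidence intervals make explicit.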
Statistical quality control through overall vibration analysis
NASA Astrophysics Data System (ADS)
Carnero, M.ª Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos
2010-05-01
The present study introduces the concept of statistical quality control in automotive wheel-bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tests to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust the sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
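The assumption checks named in the abstract precede the ANOVA itself; a compact Python equivalent of that pipeline, with placeholder vibration readings for three grinding setups (the actual factors and levels of the experiment are not reproduced):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    g1 = rng.normal(2.0, 0.3, 30)    # overall vibration readings, setup 1
    g2 = rng.normal(2.4, 0.3, 30)    # setup 2
    g3 = rng.normal(2.1, 0.3, 30)    # setup 3

    for g in (g1, g2, g3):
        print("Shapiro-Wilk p =", stats.shapiro(g).pvalue)   # normality
    print("Levene p =", stats.levene(g1, g2, g3).pvalue)     # homoscedasticity

    f, p = stats.f_oneway(g1, g2, g3)                        # one-way ANOVA
    print("ANOVA:", f, p)
    # Nonparametric fallback when the assumptions fail:
    print("Kruskal-Wallis:", stats.kruskal(g1, g2, g3))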
Symbolic Analysis of Heart Rate Variability During Exposure to Musical Auditory Stimulation.
Vanderlei, Franciele Marques; de Abreu, Luiz Carlos; Garner, David Matthew; Valenti, Vitor Engrácia
2016-01-01
In recent years, the application of nonlinear methods for the analysis of heart rate variability (HRV) has increased. However, studies on the influence of music on cardiac autonomic modulation in those circumstances are rare. The research team aimed to evaluate the acute effects on HRV of selected auditory stimulation by 2 musical styles, measuring the results using nonlinear methods of analysis: Shannon entropy, symbolic analysis, and correlation-dimension analysis. This was a prospective controlled study in which the volunteers were exposed to music and variables were compared between the control condition (no auditory stimulation) and exposure to music. All procedures were performed in a sound-proofed room at the Faculty of Science and Technology at São Paulo State University (UNESP), São Paulo, Brazil. Participants were 22 healthy female students, aged between 18 and 30 y. Prior to the actual intervention, the participants remained at rest for 20 min, and then they were exposed to one of the selected types of music, either classical baroque (64-84 dB) or heavy metal (75-84 dB). Each musical session lasted a total of 5 min and 15 s. At a point up to 1 wk after that day, the participants listened to the second type of music. The 2 types of music were delivered in a random sequence that depended on the group to which the participant was assigned. The study analyzed the following HRV indices: Shannon entropy; the symbolic-analysis indices 0V%, 1V%, 2LV%, and 2ULV%; and correlation-dimension analysis. During exposure to auditory stimulation by heavy-metal or classical baroque music, the study found no statistically significant variations in Shannon entropy, the symbolic-analysis indices 0V%, 1V%, and 2ULV%, or the correlation-dimension analysis. However, during heavy-metal music, the 2LV% index in the symbolic analysis was reduced compared with the controls. Auditory stimulation with heavy-metal music reduced the parasympathetic modulation of HRV, whereas no significant changes occurred in cardiac autonomic modulation during exposure to the classical music.
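The symbolic-analysis indices named above follow a standard construction (quantize the RR series into levels, form three-beat words, and classify each word by the number and direction of its variations). The sketch below is a generic implementation of that construction, not the study's code; the six-level uniform quantization is an assumption.

```python
import numpy as np
from collections import Counter

def symbolic_hrv(rr_ms, levels=6):
    """Symbolic analysis of an RR-interval series: quantize into
    `levels` bins, form 3-beat words, classify by variations, and
    compute the Shannon entropy of the word distribution."""
    rr = np.asarray(rr_ms, dtype=float)
    edges = np.linspace(rr.min(), rr.max(), levels + 1)
    sym = np.clip(np.digitize(rr, edges) - 1, 0, levels - 1)
    words = np.stack([sym[:-2], sym[1:-1], sym[2:]], axis=1)
    cats = Counter()
    for w in words:
        d = np.diff(w)
        nvar = np.count_nonzero(d)
        if nvar == 0:
            cats["0V"] += 1              # no variation
        elif nvar == 1:
            cats["1V"] += 1              # one variation
        elif d[0] * d[1] > 0:
            cats["2LV"] += 1             # two like variations
        else:
            cats["2ULV"] += 1            # two unlike variations
    n = len(words)
    pct = {k: 100 * v / n for k, v in cats.items()}
    counts = Counter(map(tuple, words))
    p = np.array(list(counts.values())) / n
    entropy = float(-(p * np.log2(p)).sum())   # bits
    return pct, entropy
```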
Statistical measures of Planck scale signal correlations in interferometers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogan, Craig J.; Kwon, Ohkyung
2015-06-22
A model-independent statistical framework is presented to interpret data from systems where the mean time derivative of positional cross correlation between world lines, a measure of spreading in a quantum geometrical wave function, is measured with a precision smaller than the Planck time. The framework provides a general way to constrain possible departures from perfect independence of classical world lines, associated with Planck scale bounds on positional information. A parametrized candidate set of possible correlation functions is shown to be consistent with the known causal structure of the classical geometry measured by an apparatus, and the holographic scaling of information suggested by gravity. Frequency-domain power spectra are derived that can be compared with interferometer data. As a result, simple projections of sensitivity for specific experimental set-ups suggest that measurements will directly yield constraints on a universal time derivative of the correlation function, and thereby confirm or rule out a class of Planck scale departures from classical geometry.
Shaĭdakov, E V; Grigorian, A G; Iliukhin, E A; Bulatov, V L; Gal'chenko, M I
2014-01-01
Data in the current literature concerning the effect of the target vein's diameter on the efficacy of radiofrequency obliteration (RFO) are limited. The aims were to assess the efficacy of RFO and stripping and the peculiarities of the postoperative course with regard to the diameter of the target veins, and to compare the outcomes of RFO and classical phlebectomy in the treatment of varicose disease over a 1-year follow-up using a composite end point. A multicenter prospective non-randomized study was based on analysing therapeutic outcomes in a total of 218 patients presenting with varicose disease (C2-C3 according to CEAP). RFO was performed in 108 patients and phlebectomy in 110. The results were assessed by means of a composite end point including four components: technical outcome at 1-year follow-up, pain, subcutaneous haemorrhage, and paresthesias. The RFO and phlebectomy groups were each subdivided into two subgroups according to the target vein's diameter, with a cut-off of 14 mm. Statistical analysis: we used non-parametric methods (contingency tables, chi-squared test), calculating the odds ratio (OR) for a favourable outcome with a 95% confidence interval; pain dynamics were assessed by data-mining methods (cluster analysis). "Phlebectomy ≥ 14 mm" versus "RFO ≥ 14 mm": the incidence of a good outcome in these subgroups amounted to 20 (30.8%) and 61 (95.3%), respectively; the odds ratio for a favourable outcome between the RFO and phlebectomy subgroups amounted to 45.8; 95% CI (44.5-47.0). "RFO ≥ 14 mm" versus "RFO < 14 mm": the favourable outcome rate in these subgroups amounted to 25 (39.1%) and 17 (38.6%), respectively; the differences were not statistically significant, p=0.24, with OR=0.98; 95% CI (0.18-1.77). Comparative analysis of RFO outcomes between the clinics: the favourable outcome rate in the first clinic was 50 (92.6%), in the second 34 (87.2%), and in the third 13 (86.6%), the differences being statistically insignificant, p=0.7. In the cluster analysis of pain dynamics after the intervention, the clusters with moderate pain were composed of patients after phlebectomy; these clusters showed an association of pain intensity with increased BMI and greater vein diameter. Conclusions: 1) For large-diameter veins, RFO proved superior to classical phlebectomy in terms of a favourable outcome by the composite end point (CEP). 2) For RFO, the incidence of a favourable outcome by the CEP does not depend on the target vein's diameter. 3) A pronounced pain syndrome after phlebectomy was associated with excess body weight or obesity and a greater vein diameter.
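For the headline comparison, an odds ratio follows directly from the 2×2 table implied by the reported percentages (61/64 favourable for RFO versus 20/65 for phlebectomy; these counts are reconstructed, not quoted). The sketch below computes the OR with the textbook Woolf (log-scale) interval; note that this standard interval is much wider than the one reported in the abstract.

```python
import math

def odds_ratio_ci(a, b, c, d, alpha=0.05):
    """Odds ratio for a 2x2 table (a,b = good/poor in group 1;
    c,d = good/poor in group 2) with a Woolf log-scale 95% CI."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    z = 1.959963984540054   # 97.5th percentile of the standard normal
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Counts consistent with the abstract's percentages for veins >= 14 mm:
print(odds_ratio_ci(61, 3, 20, 45))   # OR ~ 45.8
```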
An Introduction to Confidence Intervals for Both Statistical Estimates and Effect Sizes.
ERIC Educational Resources Information Center
Capraro, Mary Margaret
This paper summarizes methods of estimating confidence intervals, including classical intervals and intervals for effect sizes. The recent American Psychological Association (APA) Task Force on Statistical Inference report suggested that confidence intervals should always be reported, and the fifth edition of the APA "Publication Manual"…
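A worked example of the second kind of interval the paper summarizes: a confidence interval for an effect size such as Cohen's d can be approximated from its large-sample standard error (exact intervals invert the noncentral t distribution). The formula below is the standard Hedges-Olkin approximation; the numbers are invented.

```python
import math

def cohens_d_ci(mean1, mean2, sd_pooled, n1, n2):
    """Cohen's d with an approximate (large-sample) 95% confidence
    interval based on the Hedges-Olkin standard error."""
    d = (mean1 - mean2) / sd_pooled
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    z = 1.959963984540054
    return d, (d - z * se, d + z * se)

print(cohens_d_ci(105.0, 100.0, 10.0, 40, 40))   # d = 0.5 plus its CI
```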
ERIC Educational Resources Information Center
Sevilla, F. J.; Olivares-Quiroz, L.
2012-01-01
In this work, we address the concept of the chemical potential μ in classical and quantum gases, towards the calculation of the equation of state μ = μ(n, T), where n is the particle density and T the absolute temperature, using the methods of equilibrium statistical mechanics. Two cases seldom discussed in elementary textbooks are…
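In the classical limit discussed here, the equation of state has the closed form μ = k_B T ln(nλ³), with λ the thermal de Broglie wavelength; quantum corrections become important once nλ³ is of order one. A minimal numerical sketch, with illustrative values for helium-4:

```python
import numpy as np

kB = 1.380649e-23     # J/K
h  = 6.62607015e-34   # J s
m  = 6.6464731e-27    # kg (helium-4 atom)

def mu_classical(n, T):
    """Classical ideal-gas chemical potential mu = kB*T*ln(n*lambda^3)."""
    lam = h / np.sqrt(2 * np.pi * m * kB * T)   # thermal wavelength
    return kB * T * np.log(n * lam**3)

n = 2.5e25            # particles per m^3 (roughly 1 atm at 300 K)
print(mu_classical(n, 300.0) / kB)   # mu expressed in kelvin units
```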
Recurrence theorems: A unified account
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wallace, David, E-mail: david.wallace@balliol.ox.ac.uk
I discuss classical and quantum recurrence theorems in a unified manner, treating both as generalisations of the fact that a system with a finite state space only has so many places to go. Along the way, I prove versions of the recurrence theorem applicable to dynamics on linear and metric spaces and make some comments about applications of the classical recurrence theorem in the foundations of statistical mechanics.
Fisher information as a generalized measure of coherence in classical and quantum optics.
Luis, Alfredo
2012-10-22
We show that metrological resolution in the detection of small phase shifts provides a suitable generalization of the degrees of coherence and polarization. Resolution is estimated via Fisher information. Besides the standard two-beam Gaussian case, this approach also provides good results for multiple field components and non-Gaussian statistics. It works equally well in quantum and classical optics.
Quantum approach to classical statistical mechanics.
Somma, R D; Batista, C D; Ortiz, G
2007-07-20
We present a new approach to study the thermodynamic properties of d-dimensional classical systems by reducing the problem to the computation of ground state properties of a d-dimensional quantum model. This classical-to-quantum mapping allows us to extend the scope of standard optimization methods by unifying them under a general framework. The quantum annealing method is naturally extended to simulate classical systems at finite temperatures. We derive the rates to assure convergence to the optimal thermodynamic state using the adiabatic theorem of quantum mechanics. For simulated and quantum annealing, we obtain the asymptotic rates T(t) ≈ pN/(k_B log t) and γ(t) ≈ (Nt)^(−c/N) for the temperature and magnetic field, respectively. Other annealing strategies are also discussed.
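The temperature rate quoted above is the familiar logarithmic annealing schedule. As a hedged sketch (k_B = 1 and the prefactor p is a placeholder, not the paper's constant), the following applies such a schedule to Metropolis annealing of a small Ising chain; it illustrates the classical simulated-annealing side of the mapping, not the quantum algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32                     # number of spins
J = 1.0                    # ferromagnetic coupling
spins = rng.choice([-1, 1], size=N)

def temperature(t, p=0.05):
    # Logarithmic schedule T(t) ~ pN / log t (offset avoids log(0)).
    return p * N / np.log(t + 2.0)

energy = -J * np.sum(spins * np.roll(spins, 1))   # periodic Ising chain
for t in range(200_000):
    T = temperature(t)
    i = rng.integers(N)
    # Energy change from flipping spin i (its two bonds).
    dE = 2 * J * spins[i] * (spins[i - 1] + spins[(i + 1) % N])
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        spins[i] = -spins[i]
        energy += dE
print("final energy per spin:", energy / N)   # approaches -J as T falls
```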
Karyomorphometric analysis of Fritillaria montana group in Greece.
Samaropoulou, Sofia; Bareka, Pepy; Kamari, Georgia
2016-01-01
Fritillaria Linnaeus, 1753 (Liliaceae) is a genus of geophytes, represented in Greece by 29 taxa. Most of the Greek species are endemic to the country and/or threatened. Although classical cytotaxonomic studies of them have already been presented, no karyomorphometric analysis has ever been given. In the present study, the cytological results for the Fritillaria montana Hoppe ex W.D.J. Koch, 1832 group, which includes Fritillaria epirotica Turrill ex Rix, 1975 and Fritillaria montana, are statistically evaluated for the first time. Indices of interchromosomal and intrachromosomal asymmetry are also given. A new population of Fritillaria epirotica is investigated, while for Fritillaria montana a diploid individual was found in a population previously known as triploid. Paired t-tests and PCoA analysis were applied to compare the two species.
Colours of minor bodies in the outer solar system. II. A statistical analysis revisited
NASA Astrophysics Data System (ADS)
Hainaut, O. R.; Boehnhardt, H.; Protopapa, S.
2012-10-01
We present an update of the visible and near-infrared colour database of Minor Bodies in the Outer Solar System (MBOSSes), which now includes over 2000 measurement epochs of 555 objects, extracted from over 100 articles. The list is fairly complete as of December 2011. The database is now large enough to enable any dataset with a large dispersion to be safely identified and rejected from the analysis. The selection method used is quite insensitive to individual outliers. Most of the rejected datasets were observed during the early days of MBOSS photometry. The individual measurements are combined in a way that avoids possible rotational artifacts. The spectral gradient over the visible range is derived from the colours, as well as the R absolute magnitude M(1,1). The average colours, absolute magnitude, and spectral gradient are listed for each object, as well as the physico-dynamical classes, using a classification adapted from Gladman and collaborators. Colour-colour diagrams, histograms, and various other plots are presented to illustrate and investigate class characteristics and trends with other parameters, whose significances are evaluated using standard statistical tests. Except for a small discrepancy in the J-H colour, the largest objects, with M(1,1) < 5, are indistinguishable from the smaller ones; the larger ones are slightly bluer than the smaller ones in J-H. Short-period comets, Plutinos and other resonant objects, hot classical disk objects, scattered disk objects and detached disk objects have similar properties in the visible, while the cold classical disk objects and the Jupiter Trojans form two separate groups in terms of their spectral properties in the visible wavelength range. The well-known colour bimodality of Centaurs is confirmed. The hot classical disk objects with large inclinations or large orbital excitations are found to be bluer than the others, confirming a previously known result. Additionally, the hot classical disk objects with a smaller perihelion distance are bluer than those that do not come as close to the Sun. The bluer hot classical disk objects and resonant objects have fainter absolute magnitudes than the redder ones of the same class. Finally, we discuss possible scenarios for the origin of the colour diversity observed in MBOSSes, i.e. colouration caused by evolutionary or formation processes. The colour tables and all plots are also available on the MBOSS colour web page, which will be updated when new measurements are published. Full Tables 2 and 3 are only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/546/A115
Limit Theorems for Dispersing Billiards with Cusps
NASA Astrophysics Data System (ADS)
Bálint, P.; Chernov, N.; Dolgopyat, D.
2011-12-01
Dispersing billiards with cusps are deterministic dynamical systems with a mild degree of chaos, exhibiting "intermittent" behavior that alternates between regular and chaotic patterns. Their statistical properties are therefore weak and delicate. They are characterized by a slow (power-law) decay of correlations, and as a result the classical central limit theorem fails. We prove that a non-classical central limit theorem holds, with a scaling factor of √(n log n) replacing the standard √n. We also derive the respective Weak Invariance Principle, and we identify the class of observables for which the classical CLT still holds.
Rotolo, Federico; Paoletti, Xavier; Michiels, Stefan
2018-03-01
Surrogate endpoints are attractive for use in clinical trials instead of well-established endpoints because of practical convenience. To validate a surrogate endpoint, two important measures can be estimated in a meta-analytic context when individual patient data are available: the R²_indiv or Kendall's τ at the individual level, and the R²_trial at the trial level. We aimed at providing an R implementation of classical and well-established as well as more recent statistical methods for surrogacy assessment with failure time endpoints. We also intended to incorporate utilities for model checking and visualization and the data-generating methods described in the literature to date. In the case of failure time endpoints, the classical approach is based on two steps. First, Kendall's τ is estimated as a measure of individual-level surrogacy using a copula model. Then, the R²_trial is computed via a linear regression on the estimated treatment effects; at this second step, the estimation uncertainty can be accounted for via a measurement-error model or via weights. In addition to the classical approach, we recently developed an approach based on bivariate auxiliary Poisson models with individual random effects to measure Kendall's τ and treatment-by-trial interactions to measure the R²_trial. The most common data simulation models described in the literature are based on copula models, mixed proportional hazard models, and mixtures of half-normal and exponential random variables. The R package surrosurv implements the classical two-step method with Clayton, Plackett, and Hougaard copulas. It also allows the second-step linear regression to be optionally adjusted for measurement error. The mixed Poisson approach is implemented with different reduced models in addition to the full model. We present the package functions for estimating the surrogacy models, for checking their convergence, for performing leave-one-trial-out cross-validation, and for plotting the results. We illustrate their use in practice on individual patient data from a meta-analysis of 4069 patients with advanced gastric cancer from 20 trials of chemotherapy. The surrosurv package provides an R implementation of classical and recent statistical methods for surrogacy assessment of failure time endpoints. Flexible simulation functions are available to generate data according to the methods described in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.
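surrosurv itself is an R package; as a language-agnostic illustration of the two-step idea (individual-level Kendall's τ, then trial-level R² from a regression on estimated treatment effects), here is a schematic Python sketch on simulated meta-analytic data. All distributions and effect sizes are invented for illustration and do not reproduce the package's copula machinery.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Schematic per-trial treatment effects (log hazard ratios) on the
# surrogate (alpha) and the true endpoint (beta), with estimation noise.
n_trials = 20
alpha = rng.normal(-0.3, 0.25, n_trials)
beta_true = 0.9 * alpha + rng.normal(0, 0.08, n_trials)
beta_obs = beta_true + rng.normal(0, 0.05, n_trials)

# Step 2 (trial level): R^2 from regressing beta on alpha.
slope, intercept, r, p, se = stats.linregress(alpha, beta_obs)
print(f"R^2_trial = {r**2:.3f}")

# Step 1 (individual level): Kendall's tau between paired event times
# (simulated here; the package estimates it through a copula model).
s_times = rng.exponential(1.0, 500)
t_times = 0.5 * s_times + rng.exponential(0.5, 500)
tau, _ = stats.kendalltau(s_times, t_times)
print(f"Kendall's tau = {tau:.3f}")
```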
MetaboLyzer: A Novel Statistical Workflow for Analyzing Post-Processed LC/MS Metabolomics Data
Mak, Tytus D.; Laiakis, Evagelia C.; Goudarzi, Maryam; Fornace, Albert J.
2014-01-01
Metabolomics, the global study of small molecules in a particular system, has in the last few years risen to become a primary -omics platform for the study of metabolic processes. With the ever-increasing pool of quantitative data yielded from metabolomic research, specialized methods and tools with which to analyze and extract meaningful conclusions from these data are becoming more and more crucial. Furthermore, the depth of knowledge and expertise required to undertake a metabolomics oriented study is a daunting obstacle to investigators new to the field. As such, we have created a new statistical analysis workflow, MetaboLyzer, which aims both to simplify analysis for investigators new to metabolomics and to provide experienced investigators the flexibility to conduct sophisticated analysis. MetaboLyzer's workflow is specifically tailored to the unique characteristics and idiosyncrasies of post-processed liquid chromatography/mass spectrometry (LC/MS) based metabolomic datasets. It utilizes a wide gamut of statistical tests, procedures, and methodologies that belong to classical biostatistics, as well as several novel statistical techniques that we have developed specifically for metabolomics data. Furthermore, MetaboLyzer conducts rapid putative ion identification and putative biologically relevant analysis via incorporation of four major small molecule databases: KEGG, HMDB, Lipid Maps, and BioCyc. MetaboLyzer incorporates these aspects into a comprehensive workflow that outputs easy-to-understand, statistically significant, and potentially biologically relevant information in the form of heatmaps, volcano plots, 3D visualization plots, correlation maps, and metabolic pathway hit histograms. For demonstration purposes, a urine metabolomics data set from a previously reported radiobiology study in which samples were collected from mice exposed to gamma radiation was analyzed. MetaboLyzer was able to identify 243 statistically significant ions out of a total of 1942. Numerous putative metabolites and pathways were found to be biologically significant from the putative ion identification workflow. PMID:24266674
Quantum probabilistic logic programming
NASA Astrophysics Data System (ADS)
Balu, Radhakrishnan
2015-05-01
We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples that combine statistical ensembles and predicates of first-order logic to reason about situations involving uncertainty.
Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong
2014-06-04
Quality of life (QOL) for patients with coronary heart disease (CHD) is now a worldwide concern, yet disease-specific instruments are scarce and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it by both classical test theory and Generalizability Theory. The QLICD-CHD was developed based on programmed decision procedures with multiple nominal-group and focus-group discussions, in-depth interviews, pre-testing, and quantitative statistical procedures. 146 inpatients with CHD provided data measuring QOL three times, before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests, and the G studies and D studies of Generalizability Theory. Multi-trait scaling analysis, correlation and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. The internal consistency α and test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80, respectively. The overall instrument and all domains except the social domain showed statistically significant changes after treatment, with moderate effect sizes (standardized response mean, SRM) ranging from 0.32 to 0.67. G-coefficients and indexes of dependability (Φ coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity and reliability and moderate responsiveness, and can be used as a quality-of-life instrument for patients with CHD. However, in order to obtain better reliability, the number of items in the social domain should be increased or the items' quality, rather than quantity, should be improved.
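Two of the reliability and responsiveness statistics used here have compact closed forms. The sketch below implements Cronbach's α and the standardized response mean (SRM) from first principles; it is a generic implementation under classical test theory assumptions, not the study's analysis code.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def srm(pre, post):
    """Standardized response mean: mean change / SD of the change."""
    change = np.asarray(post, float) - np.asarray(pre, float)
    return change.mean() / change.std(ddof=1)
```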
A quantum–quantum Metropolis algorithm
Yung, Man-Hong; Aspuru-Guzik, Alán
2012-01-01
The classical Metropolis sampling method is a cornerstone of many statistical modeling applications that range from physics, chemistry, and biology to economics. This method is particularly suitable for sampling the thermal distributions of classical systems. The challenge of extending this method to the simulation of arbitrary quantum systems is that, in general, eigenstates of quantum Hamiltonians cannot be obtained efficiently with a classical computer. However, this challenge can be overcome by quantum computers. Here, we present a quantum algorithm which fully generalizes the classical Metropolis algorithm to the quantum domain. The meaning of quantum generalization is twofold: The proposed algorithm is not only applicable to both classical and quantum systems, but also offers a quantum speedup relative to the classical counterpart. Furthermore, unlike the classical method of quantum Monte Carlo, this quantum algorithm does not suffer from the negative-sign problem associated with fermionic systems. Applications of this algorithm include the study of low-temperature properties of quantum systems, such as the Hubbard model, and preparing the thermal states of sizable molecules to simulate, for example, chemical reactions at an arbitrary temperature. PMID:22215584
NASA Astrophysics Data System (ADS)
Oberlack, Martin; Rosteck, Andreas; Avsarkisov, Victor
2013-11-01
Textbook knowledge proclaims that Lie symmetries such as the Galilean transformation lie at the heart of fluid dynamics. These important properties also carry over to the statistical description of turbulence, i.e. to the Reynolds stress transport equations and their generalization, the multi-point correlation equations (MPCE). Interestingly, the MPCE admit a much larger set of symmetries, in fact infinite-dimensional, subsequently named statistical symmetries. Most importantly, these new symmetries have important consequences for our understanding of turbulent scaling laws. The symmetries form the essential foundation for constructing exact solutions to the infinite set of MPCE, which in turn are identified as classical and new turbulent scaling laws. Examples of various classical and new shear-flow scaling laws, including higher-order moments, will be presented. New scaling laws have even been forecast from these symmetries and in turn validated by DNS. Turbulence modellers have implicitly recognized at least one of the statistical symmetries, as this is the basis for the usual log-law which has been employed for calibrating essentially all engineering turbulence models. An obvious conclusion is to make turbulence models generally consistent with the new statistical symmetries.
Universal Recurrence Time Statistics of Characteristic Earthquakes
NASA Astrophysics Data System (ADS)
Goltz, C.; Turcotte, D. L.; Abaimov, S.; Nadeau, R. M.
2006-12-01
Characteristic earthquakes are defined to occur quasi-periodically on major faults. Do recurrence time statistics of such earthquakes follow a particular statistical distribution? If so, which one? The answer is fundamental and has important implications for hazard assessment. The problem cannot be solved by comparing the goodness of statistical fits as the available sequences are too short. The Parkfield sequence of M ≈ 6 earthquakes, one of the most extensive reliable data sets available, has grown to merely seven events with the last earthquake in 2004, for example. Recently, however, advances in seismological monitoring and improved processing methods have unveiled so-called micro-repeaters, micro-earthquakes which recur exactly in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Micro-repeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. Due to their recent discovery, however, available sequences contain less than 20 events at present. In this paper we present results for the analysis of recurrence times for several micro-repeater sequences from Parkfield and adjacent regions. To improve the statistical significance of our findings, we combine several sequences into one by rescaling the individual sets by their respective mean recurrence intervals and Weibull exponents. This novel approach of rescaled combination yields the most extensive data set possible. We find that the resulting statistics can be fitted well by an exponential distribution, confirming the universal applicability of the Weibull distribution to characteristic earthquakes. A similar result is obtained from rescaled combination, however, with regard to the lognormal distribution.
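A simplified version of the rescaled-combination idea (rescaling by the mean only, omitting the Weibull-exponent rescaling the abstract also applies) can be sketched as follows; the synthetic sequences are placeholders for real micro-repeater catalogs.

```python
import numpy as np
from scipy import stats

def rescaled_combination(sequences):
    """Combine several recurrence-interval sequences after rescaling
    each by its own mean, then fit a two-parameter Weibull."""
    pooled = np.concatenate([np.asarray(s, float) / np.mean(s)
                             for s in sequences])
    shape, loc, scale = stats.weibull_min.fit(pooled, floc=0)
    return pooled, shape, scale

# Synthetic micro-repeater sequences with different mean intervals.
rng = np.random.default_rng(3)
seqs = [rng.weibull(1.6, 15) * m for m in (0.8, 2.0, 5.1)]
pooled, shape, scale = rescaled_combination(seqs)
print(f"Weibull shape ~ {shape:.2f}, scale ~ {scale:.2f}")
```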
1982-02-15
[Report fragment; only the figure discussion survives:] The quantity of interest is plotted as a function of the doping density at 300 and 77 K for classical Boltzmann statistics in the depletion approximation (solid line) and for approximate Fermi-Dirac statistics (dotted line, equation (19)). The comparison demonstrates that the deviation from Boltzmann statistics is quite noticeable and that tunneling Schottky barriers cannot be obtained at these doping levels when Boltzmann statistics are used.
Reliability of a Measure of Institutional Discrimination against Minorities
1979-12-01
Two methods of dealing with the problem of reliability of the measure in small samples are presented: the first is based upon classical statistical theory, and the second derives from a series of computer-generated Monte Carlo simulations. Properties of the statistical measure of institutional discrimination are discussed.
Quantum vertex model for reversible classical computing.
Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C
2017-05-12
Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.
Statistical methods for biodosimetry in the presence of both Berkson and classical measurement error
NASA Astrophysics Data System (ADS)
Miller, Austin
In radiation epidemiology, the true dose received by those exposed cannot be assessed directly. Physical dosimetry uses a deterministic function of the source term, distance and shielding to estimate dose. For the atomic bomb survivors, the physical dosimetry system is well established. The classical measurement errors plaguing the location and shielding inputs to the physical dosimetry system are well known. Adjusting for the associated biases requires an estimate for the classical measurement error variance, for which no data-driven estimate exists. In this case, an instrumental variable solution is the most viable option to overcome the classical measurement error indeterminacy. Biological indicators of dose may serve as instrumental variables. Specification of the biodosimeter dose-response model requires identification of the radiosensitivity variables, for which we develop statistical definitions and variables. More recently, researchers have recognized Berkson error in the dose estimates, introduced by averaging assumptions for many components in the physical dosimetry system. We show that Berkson error induces a bias in the instrumental variable estimate of the dose-response coefficient, and then address the estimation problem. This model is specified by developing an instrumental variable mixed measurement error likelihood function, which is then maximized using a Monte Carlo EM Algorithm. These methods produce dose estimates that incorporate information from both physical and biological indicators of dose, as well as the first instrumental variable based data-driven estimate for the classical measurement error variance.
Molenaar, Peter C M
2007-03-01
I am in general agreement with Toomela's (Integrative Psychological and Behavioral Science doi:10.1007/s12124-007-9004-0, 2007) plea for an alternative psychological methodology inspired by his description of the German-Austrian orientation. I will argue, however, that this alternative methodology has to be based on the classical ergodic theorems, using state-of-the-art statistical time series analysis of intra-individual variation as its main tool. Some more specific points made by Toomela will be criticized, while for others a more extreme elaboration along the lines indicated by Toomela is proposed.
Rolling-Bearing Service Life Based on Probable Cause for Removal: A Tutorial
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Branzai, Emanuel V.
2017-01-01
In 1947 and 1952, Gustaf Lundberg and Arvid Palmgren developed what is now referred to as the Lundberg-Palmgren Model for Rolling Bearing Life Prediction based on classical rolling-element fatigue. Today, bearing fatigue probably accounts for less than 5 percent of bearings removed from service for cause. A bearing service life prediction methodology and tutorial indexed to eight probable causes for bearing removal, including fatigue, are presented, which incorporate strict series reliability; Weibull statistical analysis; available published field data from the Naval Air Rework Facility; and 224,000 rolling-element bearings removed for rework from commercial aircraft engines.
J. Grabinsky; A. Aldama; A. Chacalo; H. J. Vazquez
2000-01-01
Inventory data on Mexico City's street trees were studied using classical statistical approaches from both arboriculture and ecology. Multivariate techniques were applied to both. The results did not differ substantially and were complementary. It was possible to reduce the inventory data and to group species, boroughs, blocks, and variables.
Seed Dispersal Near and Far: Patterns Across Temperate and Tropical Forests
James S. Clark; Miles Silman; Ruth Kern; Eric Macklin; Janneke HilleRisLambers
1999-01-01
Dispersal affects community dynamics and vegetation response to global change. Understanding these effects requires descriptions of dispersal at local and regional scales and statistical models that permit estimation. Classical models of dispersal describe local or long-distance dispersal, but not both. The lack of statistical methods means that models have rarely been...
Teaching Bayesian Statistics to Undergraduate Students through Debates
ERIC Educational Resources Information Center
Stewart, Sepideh; Stewart, Wayne
2014-01-01
This paper describes a lecturer's approach to teaching Bayesian statistics to students who were only exposed to the classical paradigm. The study shows how the lecturer extended himself by making use of ventriloquist dolls to grab hold of students' attention and embed important ideas in revealing the differences between the Bayesian and classical…
Martin, Daniel R; Matyushov, Dmitry V
2012-08-30
We show that electrostatic fluctuations of the protein-water interface are globally non-Gaussian. The electrostatic component of the optical transition energy (energy gap) in a hydrated green fluorescent protein is studied here by classical molecular dynamics simulations. The distribution of the energy gap displays a high excess in the breadth of electrostatic fluctuations over the prediction of the Gaussian statistics. The energy gap dynamics include a nanosecond component. When simulations are repeated with frozen protein motions, the statistics shifts to the expectations of linear response and the slow dynamics disappear. We therefore suggest that both the non-Gaussian statistics and the nanosecond dynamics originate largely from global, low-frequency motions of the protein coupled to the interfacial water. The non-Gaussian statistics can be experimentally verified from the temperature dependence of the first two spectral moments measured at constant-volume conditions. Simulations at different temperatures are consistent with other indicators of the non-Gaussian statistics. In particular, the high-temperature part of the energy gap variance (second spectral moment) scales linearly with temperature and extrapolates to zero at a temperature characteristic of the protein glass transition. This result, violating the classical limit of the fluctuation-dissipation theorem, leads to a non-Boltzmann statistics of the energy gap and corresponding non-Arrhenius kinetics of radiationless electronic transitions, empirically described by the Vogel-Fulcher-Tammann law.
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Phoenix, S. Leigh; Grimes-Ledesma, Lorie
2010-01-01
Stress rupture failure of Carbon Composite Overwrapped Pressure Vessels (COPVs) is of serious concern to the Science Mission and Constellation programs, since a number of COPVs on board space vehicles store gases under high pressure for long durations. It has become customary to establish the reliability of these vessels using so-called classic models. The classic models are based on Weibull statistics fitted to observed stress rupture data. These stochastic models cannot account for any additional damage due to the complex pressure-time histories characteristic of COPVs being supplied for NASA missions; in particular, the classic model cannot account for damage due to the proof test, which every flight vessel undergoes, and it is suspected that the effects of the proof test could significantly reduce the stress rupture lifetime of COPVs. The focus of this paper is an analytical appraisal of a model that incorporates damage due to the proof test. The model examined here is based on physical mechanisms, coupling micromechanics-based load-sharing concepts with creep rupture and Weibull statistics. The paper compares the current model to the classic model through a number of examples. In addition, several applications of the model to current ISS and Constellation program issues are examined.
Advances in studies of disease-navigating webs: Sarcoptes scabiei as a case study
2014-01-01
The discipline of epidemiology is the study of the patterns, causes and effects of health and disease conditions in defined animal populations. It is the key to evidence-based medicine, which is one of the cornerstones of public health. One of the important facets of epidemiology is disease-navigating webs (disease-NW), through which zoonotic and multi-host parasites in general move from one host to another. Epidemiology in this context includes (i) classical epidemiological approaches based on the statistical analysis of disease prevalence and distribution and, more recently, (ii) genetic approaches based on the population genetics of the disease agent. Both approaches, classical epidemiology and population genetics, are useful for studying disease-NW. However, both have strengths and weaknesses when applied separately, which, unfortunately, is too often current practice. In this paper, we use Sarcoptes scabiei mite epidemiology as a case study to show how important an integrated approach can be in understanding disease-NW and subsequent disease control. PMID:24406101
Rhythm histograms and musical meter: A corpus study of Malian percussion music.
London, Justin; Polak, Rainer; Jacoby, Nori
2017-04-01
Studies of musical corpora have given empirical grounding to the various features that characterize particular musical styles and genres. Palmer & Krumhansl (1990) found that in Western classical music the likeliest places for a note to occur are the most strongly accented beats in a measure, and this was also found in subsequent studies using both Western classical and folk music corpora (Huron & Ommen, 2006; Temperley, 2010). We present a rhythmic analysis of a corpus of 15 performances of percussion music from Bamako, Mali. In our corpus, the relative frequency of note onsets in a given metrical position does not correspond to patterns of metrical accent, though there is a stable relationship between onset frequency and metrical position. The implications of this non-congruence between simple statistical likelihood and metrical structure for the ways in which meter and metrical accent may be learned and understood are discussed, along with the importance of cross-cultural studies for psychological research.
NASA Technical Reports Server (NTRS)
Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.
2010-01-01
The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
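The two competing designs are easy to contrast numerically. In the sketch below, classical calibration fits readings on standards and inverts the fitted line, while reverse calibration regresses standards on readings directly; the simulated instrument is a placeholder. As the paper notes, the reverse fit is intuitively appealing but treats an error-contaminated quantity as the regressor, violating a basic regression assumption.

```python
import numpy as np

rng = np.random.default_rng(5)
standards = np.linspace(1.0, 10.0, 10)                     # known values
readings = 2.0 * standards + 0.5 + rng.normal(0, 0.2, 10)  # instrument

# Classical calibration: regress readings on standards, then invert.
b1, b0 = np.polyfit(standards, readings, 1)
def classical_estimate(y_new):
    return (y_new - b0) / b1            # inverse of the fitted line

# Reverse calibration: regress standards on readings directly.
c1, c0 = np.polyfit(readings, standards, 1)
def reverse_estimate(y_new):
    return c1 * y_new + c0

y = 12.3                                # a new instrument reading
print(classical_estimate(y), reverse_estimate(y))
```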
Ankle plantarflexion strength in rearfoot and forefoot runners: a novel cluster-analytic approach.
Liebl, Dominik; Willwacher, Steffen; Hamill, Joseph; Brüggemann, Gert-Peter
2014-06-01
The purpose of the present study was to test for differences in ankle plantarflexion strengths of habitually rearfoot and forefoot runners. In order to approach this issue, we revisit the problem of classifying different footfall patterns in human runners. A dataset of 119 subjects running shod and barefoot (speed 3.5 m/s) was analyzed. The footfall patterns were clustered by a novel statistical approach, which is motivated by advances in the statistical literature on functional data analysis. We explain the novel statistical approach in detail and compare it to the classically used strike index of Cavanagh and Lafortune (1980). The two groups found by the new cluster approach are well interpretable as forefoot and rearfoot footfall groups. The subsequent comparison study of the clustered subjects reveals that runners with a forefoot footfall pattern are capable of producing significantly higher joint moments in a maximum voluntary contraction (MVC) of their ankle plantarflexor muscle-tendon units; difference in means: 0.28 Nm/kg. This effect remains significant after controlling for an additional gender effect and for differences in training levels. Our analysis confirms the hypothesis that forefoot runners have a higher mean MVC plantarflexion strength than rearfoot runners. Furthermore, we demonstrate that our proposed stochastic cluster analysis provides a robust and useful framework for clustering foot strikes. Copyright © 2014 Elsevier B.V. All rights reserved.
Much Polyphony but Little Harmony: Otto Sackur's Groping for a Quantum Theory of Gases
NASA Astrophysics Data System (ADS)
Badino, Massimiliano; Friedrich, Bretislav
2013-09-01
The endeavor of Otto Sackur (1880-1914) was driven, on the one hand, by his interest in Nernst's heat theorem, statistical mechanics, and the problem of chemical equilibrium and, on the other hand, by his goal to shed light on classical mechanics from the quantum vantage point. Inspired by the interplay between classical physics and quantum theory, Sackur chanced to expound his personal take on the role of the quantum in the changing landscape of physics in the turbulent 1910s. We tell the story of this enthusiastic practitioner of the old quantum theory and early contributor to quantum statistical mechanics, whose scientific ontogenesis provides a telling clue about the phylogeny of his contemporaries.
Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors
NASA Astrophysics Data System (ADS)
Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay
2017-11-01
Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b²/N = α²/N for large matrix dimensionality N. As d increases, there is a transition from Poisson to classical random matrix statistics.
Quantum gas-liquid condensation in an attractive Bose gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koh, Shun-ichiro
Gas-liquid condensation (GLC) in an attractive Bose gas is studied on the basis of statistical mechanics. Using some results in combinatorial mathematics, the following are derived. (1) With decreasing temperature, Bose-statistical coherence grows in the many-body wave function, which gives rise to the divergence of the grand partition function prior to Bose-Einstein condensation. It is a quantum-mechanical analogue of the GLC in a classical gas (quantum GLC). (2) This GLC is triggered by the bosons with zero momentum. Compared with the classical GLC, an incomparably weaker attractive force creates it. For a system showing the quantum GLC, we discuss a cold helium-4 gas at sufficiently low pressure.
Alay, Asli; Usta, Taner A; Ozay, Pinar; Karadugan, Ozgur; Ates, Ugur
2014-05-01
The objective of this study was to compare classical blind endometrial tissue sampling with hysteroscopic biopsy sampling following methylene blue dyeing in premenopausal and postmenopausal patients with abnormal uterine bleeding. A prospective case-control study was carried out in the Office Hysteroscopy Unit. Fifty-four patients with complaints of abnormal uterine bleeding were evaluated. Data of 38 patients were included in the statistical analysis. Three groups were compared by examining samples obtained through hysteroscopic biopsy before and after methylene blue dyeing, and classical blind endometrial tissue sampling. First, uterine cavity was evaluated with office hysteroscopy. Methylene blue dye was administered through the hysteroscopic inlet. Tissue samples were obtained from stained and non-stained areas. Blind endometrial sampling was performed in the same patients immediately after the hysteroscopy procedure. The results of hysteroscopic biopsy from methylene blue stained and non-stained areas and blind biopsy were compared. No statistically significant differences were determined in the comparison of biopsy samples obtained from methylene-blue stained, non-stained areas and blind biopsy (P > 0.05). We suggest that chromohysteroscopy is not superior to endometrial sampling in cases of abnormal uterine bleeding. Further studies with greater sample sizes should be performed to assess the validity of routine use of endometrial dyeing. © 2014 The Authors. Journal of Obstetrics and Gynaecology Research © 2014 Japan Society of Obstetrics and Gynecology.
Electronic Nose for Identification of Lung Diseases
NASA Astrophysics Data System (ADS)
Ogorodnik, V.; Kleperis, J.; Taivans, I.; Jurka, N.; Bukovskis, M.
2008-01-01
In this paper, the authors analyze preliminary results of testing a classical gas-sensing instrument, the electronic nose (a metal-oxide transistor sensor of chemical substances), in a hospital where patients with different lung diseases are treated. To reveal the correlation between the amplitudes of the sensor's responses and the patients' diagnoses, different statistical analysis methods have been used. It is shown that lung cancer can easily be discriminated from other lung diseases if a short breath sampling and analysis time (less than 1 min) is used in the test. Volatiles obtained from a breath sample of a patient with lung cancer give the major contribution to the responses of the different e-nose sensors, so in these cases highly precise identification could be achieved.
NASA Technical Reports Server (NTRS)
Moore, B., III; Kaufmann, R.; Reinhold, C.
1981-01-01
Systems analysis and control theory considerations are given to simulations of both individual components and total systems, in order to develop a reliable control strategy for a Controlled Ecological Life Support System (CELSS) which includes complex biological components. Because of the numerous nonlinearities and tight coupling within the biological component, classical control theory may be inadequate and the statistical analysis of factorial experiments more useful. The range in control characteristics of particular species may simplify the overall task by providing an appropriate balance of stability and controllability to match species function in the overall design. The ultimate goal of this research is the coordination of biological and mechanical subsystems in order to achieve a self-supporting environment.
Nodal domains of a non-separable problem—the right-angled isosceles triangle
NASA Astrophysics Data System (ADS)
Aronovitch, Amit; Band, Ram; Fajman, David; Gnutzmann, Sven
2012-03-01
We study the nodal set of eigenfunctions of the Laplace operator on the right-angled isosceles triangle. A local analysis of the nodal pattern provides an algorithm for computing the number νn of nodal domains for any eigenfunction. In addition, an exact recursive formula for the number of nodal domains is found to reproduce all existing data. Finally, we use the recursion formula to analyse a large sequence of nodal counts statistically. Our analysis shows that the distribution of nodal counts for this triangular shape has a much richer structure than the known cases of regular separable shapes or completely irregular shapes. Furthermore, we demonstrate that the nodal count sequence contains information about the periodic orbits of the corresponding classical ray dynamics.
Large-scale quantitative analysis of painting arts.
Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong
2014-12-11
Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
Faris, Callum; van der Eerden, Paul; Vuyk, Hade
2015-01-01
This study clarifies the pedicle geometry and vascular supply of a midline forehead flap for nasal reconstruction. It reports on the vascular reliability of this flap and its ability to reduce hair transposition to the nose, a major complicating factor of previous forehead flap designs. The objectives were to compare the vascular reliability of 3 different pedicle designs of the forehead flap in nasal reconstruction (classic paramedian, glabellar paramedian, and central artery flap designs) and to evaluate hair transposition rates and aesthetic results. Retrospective analysis of patient data and outcomes retrieved from computer files generated at the time of surgery, supplemented by data from the patient medical records and photographic documentation, from a tertiary referral nasal reconstructive practice within a secondary-care hospital setting. The study population included all consecutive patients over a 19-year period who underwent primary forehead flap repair of nasal defects, with more than 3 months of postoperative follow-up and photographic documentation. Three sequential forehead flap patterns were used (classic paramedian flap, glabella flap, and central artery flap) for nasal reconstruction over the study duration. Data collected included patient characteristics, method of repair, complications, functional outcome, and patient satisfaction score. For cosmetic outcome, photographic documentation was scored by a medical juror. No forehead flap had vascular compromise in the first stage. Partial flap necrosis was reported in subsequent stages in 4 patients (1%), with no statistical difference in the rate of vascular compromise between the 3 flap designs. Hair transposition to the nose was lower with the central artery forehead flap (7%) compared with the classic paramedian (23%) and glabellar paramedian (13%) flaps (P < .05). Photographic evaluation in 227 patients showed that brow position (98%) and color match (83%) were good in the majority of patients. In this series, the central artery forehead flap was as reliable (in terms of vascularity) as the glabellar and classic paramedian forehead flaps, and its use resulted in a statistically significant reduction in transfer of hair to the nose. Level of evidence: 3.
NASA Astrophysics Data System (ADS)
Jordan, Andrew Noble
2002-09-01
In this dissertation, we study the quantum mechanics of classically chaotic dynamical systems. We begin by considering the decoherence effects a quantum chaotic system has on a simple quantum few state system. Typical time evolution of a quantum system whose classical limit is chaotic generates structures in phase space whose size is much smaller than Planck's constant. A naive application of Heisenberg's uncertainty principle indicates that these structures are not physically relevant. However, if we take the quantum chaotic system in question to be an environment which interacts with a simple two state quantum system (qubit), we show that these small phase-space structures cause the qubit to generically lose quantum coherence if and only if the environment has many degrees of freedom, such as a dilute gas. This implies that many-body environments may be crucial for the phenomenon of quantum decoherence. Next, we turn to an analysis of statistical properties of time correlation functions and matrix elements of quantum chaotic systems. A semiclassical evaluation of matrix elements of an operator indicates that the dominant contribution will be related to a classical time correlation function over the energy surface. For a highly chaotic class of dynamics, these correlation functions may be decomposed into sums of Ruelle resonances, which control exponential decay to the ergodic distribution. The theory is illustrated both numerically and theoretically on the Baker map. For this system, we are able to isolate individual Ruelle modes. We further consider dynamical systems whose approach to ergodicity is given by a power law rather than an exponential in time. We propose a billiard with diffusive boundary conditions, whose classical solution may be calculated analytically. We go on to compare the exact solution with an approximation scheme, as well as to calculate asymptotic corrections. Quantum spectral statistics are calculated assuming the validity of the Agam, Altshuler, and Andreev ansatz. We find singular behavior of the two point spectral correlator in the limit of small spacing. Finally, we analyse the effect that slow decay to ergodicity has on the structure of the quantum propagator, as well as wavefunction localization. We introduce a statistical quantum description of systems that are composed of both an orderly region and a random region. By averaging over the random region only, we find that measures of localization in momentum space semiclassically diverge with the dimension of the Hilbert space. We illustrate this numerically with quantum maps and suggest various other systems where this behavior should be important.
The potential effects of pH and buffering capacity on dental erosion.
Owens, Barry M
2007-01-01
Soft drink pH (initial pH) has been shown to be a causative factor--but not necessarily the primary initiating factor--of dental erosion. The titratable acidity or buffering capacity has been acknowledged as playing a significant role in the etiology of these lesions. This in vitro study sought to evaluate five different soft drinks (Coca-Cola Classic, Diet Coke, Gatorade sports drink, Red Bull high-energy drink, Starbucks Frappuccino coffee drink) and tap water (control) in terms of initial pH and buffering capacity. Initial pH was measured in triplicate for the six beverages. The buffering capacity of each beverage was assessed by measuring the weight (in grams) of 0.10 M sodium hydroxide necessary for titration to pH levels of 5.0, 6.0, 7.0, and 8.3. Coca-Cola Classic produced the lowest mean pH, while Starbucks Frappuccino produced the highest pH of any of the drinks except for tap water. Based on statistical analysis using ANOVA and Fisher's post hoc tests at a P < 0.05 level of significance, Red Bull had the highest mean buffering capacity (indicating the strongest potential for erosion of enamel), followed by Gatorade, Coca-Cola Classic, Diet Coke, and Starbucks Frappuccino.
A new method for correlation analysis of compositional (environmental) data - a worked example.
Reimann, C; Filzmoser, P; Hron, K; Kynčlová, P; Garrett, R G
2017-12-31
Most data in environmental sciences and geochemistry are compositional. The very units used to report the data (e.g., μg/l, mg/kg, wt%) imply that the analytical results for each element are not free to vary independently of the other measured variables. This is often neglected in statistical analysis, where a simple log-transformation of the individual variables is insufficient to put the data into an acceptable geometry. This matters for bivariate data analysis and for correlation analysis, for which the data need to be appropriately log-ratio transformed. A new approach based on the isometric log-ratio (ilr) transformation, leading to so-called symmetric coordinates, is presented here. Summarizing the correlations in a heat-map gives a powerful tool for bivariate data analysis. An application of the new method is demonstrated using a data set from a regional geochemical mapping project based on soil O and C horizon samples. Differences from 'classical' correlation analysis based on log-transformed data are highlighted. The fact that some expected strong positive correlations appear and remain unchanged even following a log-ratio transformation has probably led to the misconception that the special nature of compositional data can be ignored when working with trace elements. The example dataset is employed to demonstrate that using 'classical' correlation analysis and plotting XY diagrams (scatterplots) based on the original or simply log-transformed data can easily lead to severe misinterpretations of the relationships between elements. Copyright © 2017 Elsevier B.V. All rights reserved.
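As a concrete (if simplified) illustration of why log-ratio thinking changes correlations, the sketch below compares naive log-transform correlations with correlations computed after a centered log-ratio (clr) transform; the paper's symmetric ilr coordinates differ in detail, and the synthetic composition is an assumption:

```python
# Minimal sketch: correlations of compositional data after a log-ratio
# transform (clr shown here; the paper's symmetric ilr coordinates differ
# in detail). Synthetic data stand in for real geochemical analyses.
import numpy as np

rng = np.random.default_rng(0)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(100, 4))    # 100 samples, 4 "elements"
X = X / X.sum(axis=1, keepdims=True)                     # close to a composition

clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)  # centered log-ratio
corr_clr = np.corrcoef(clr, rowvar=False)                # log-ratio based correlations
corr_raw = np.corrcoef(np.log(X), rowvar=False)          # naive log-transform correlations
print(np.round(corr_clr, 2))
print(np.round(corr_raw, 2))
```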
A Regression Framework for Effect Size Assessments in Longitudinal Modeling of Group Differences
Feingold, Alan
2013-01-01
The use of growth modeling analysis (GMA)--particularly multilevel analysis and latent growth modeling--to test the significance of intervention effects has increased exponentially in prevention science, clinical psychology, and psychiatry over the past 15 years. Model-based effect sizes for differences in means between two independent groups in GMA can be expressed in the same metric (Cohen’s d) commonly used in classical analysis and meta-analysis. This article first reviews conceptual issues regarding calculation of d for findings from GMA and then introduces an integrative framework for effect size assessments that subsumes GMA. The new approach uses the structure of the linear regression model, from which effect sizes for findings from diverse cross-sectional and longitudinal analyses can be calculated with familiar statistics, such as the regression coefficient, the standard deviation of the dependent measure, and study duration. PMID:23956615
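One widely cited variant of such a model-based effect size divides the group-by-time slope difference, multiplied by study duration, by the raw-score standard deviation of the outcome; the article's integrative framework generalizes this, so the function and numbers below are an illustrative assumption rather than the full method:

```python
# Hedged sketch of a model-based effect size for growth modeling analysis:
# the group-by-time coefficient times study duration, scaled by the
# raw-score SD of the outcome.
def gma_effect_size(beta_group_by_time: float, duration: float, sd_outcome: float) -> float:
    """Model-based d = (slope difference x study duration) / SD of outcome."""
    return beta_group_by_time * duration / sd_outcome

# Example: groups diverge by 0.25 points per month over 12 months; SD = 6.
print(gma_effect_size(0.25, 12.0, 6.0))  # d = 0.5
```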
NASA Astrophysics Data System (ADS)
Lorenzi, Marco; Simpson, Ivor J.; Mendelson, Alex F.; Vos, Sjoerd B.; Cardoso, M. Jorge; Modat, Marc; Schott, Jonathan M.; Ourselin, Sebastien
2016-04-01
The joint analysis of brain atrophy measured with magnetic resonance imaging (MRI) and hypometabolism measured with positron emission tomography with fluorodeoxyglucose (FDG-PET) is of primary importance in developing models of pathological changes in Alzheimer’s disease (AD). Most of the current multimodal analyses in AD assume a local (spatially overlapping) relationship between MR and FDG-PET intensities. However, it is well known that atrophy and hypometabolism are prominent in different anatomical areas. The aim of this work is to describe the relationship between atrophy and hypometabolism by means of a data-driven statistical model of non-overlapping intensity correlations. For this purpose, FDG-PET and MRI signals are jointly analyzed through a computationally tractable formulation of partial least squares regression (PLSR). The PLSR model is estimated and validated on a large clinical cohort of 1049 individuals from the ADNI dataset. Results show that the proposed non-local analysis outperforms classical local approaches in terms of predictive accuracy while providing a plausible description of disease dynamics: early AD is characterised by non-overlapping temporal atrophy and temporo-parietal hypometabolism, while the later disease stages show overlapping brain atrophy and hypometabolism spread in temporal, parietal and cortical areas.
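For readers who want to experiment, a minimal sketch of cross-modality prediction with off-the-shelf PLS regression follows; sklearn's PLSRegression stands in for the paper's computationally tractable PLSR formulation, and the arrays and component count are synthetic assumptions:

```python
# Minimal sketch of cross-modality prediction with PLS regression.
# Synthetic arrays stand in for voxelwise MRI and FDG-PET features.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
mri = rng.normal(size=(200, 50))                              # 200 subjects x 50 MRI features
pet = mri @ rng.normal(size=(50, 30)) * 0.1 + rng.normal(size=(200, 30))

pls = PLSRegression(n_components=5)                           # component count is arbitrary here
pls.fit(mri, pet)
pet_hat = pls.predict(mri)                                    # predicted hypometabolism pattern
print(pet_hat.shape)                                          # (200, 30)
```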
Vortex Analysis of Intra-Aneurismal Flow in Cerebral Aneurysms
Sunderland, Kevin; Haferman, Christopher; Chintalapani, Gouthami; Jiang, Jingfeng
2016-01-01
This study aims to develop an alternative vortex analysis method by measuring the structure of intracranial aneurysm (IA) flow vortexes across the cardiac cycle, to quantify the temporal stability of aneurysmal flow. Hemodynamics were modeled in “patient-specific” geometries, using computational fluid dynamics (CFD) simulations. Modified versions of the known λ2 and Q-criterion methods identified vortex regions; the regions were then segmented out using the classical marching cube algorithm. Temporal stability was measured by the degree of vortex overlap (DVO) at each step of a cardiac cycle against a cycle-averaged vortex and by the change in the number of cores over the cycle. No statistical differences exist in DVO or number of vortex cores between 5 terminal IAs and 5 sidewall IAs. No strong correlation exists between vortex core characteristics and geometric or hemodynamic characteristics of IAs. Statistical independence suggests this proposed method may provide novel IA information. However, the threshold values used to determine the vortex core regions and the resolution of the velocity data influenced analysis outcomes and have to be addressed in future studies. In conclusion, preliminary results show that the proposed methodology may help give novel insight into aneurysmal flow characteristics and help in future risk assessment given further development. PMID:27891172
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogan, Craig
It is argued by extrapolation of general relativity and quantum mechanics that a classical inertial frame corresponds to a statistically defined observable that rotationally fluctuates due to Planck scale indeterminacy. Physical effects of exotic nonlocal rotational correlations on large scale field states are estimated. Their entanglement with the strong interaction vacuum is estimated to produce a universal, statistical centrifugal acceleration that resembles the observed cosmological constant.
Re'class'ification of 'quant'ified classical simulated annealing
NASA Astrophysics Data System (ADS)
Tanaka, Toshiyuki
2009-12-01
We discuss a classical reinterpretation of the quantum-mechanics-based analysis of classical Markov chains with detailed balance, which is based on the quantum-classical correspondence. The classical reinterpretation is then used to demonstrate that it successfully reproduces a sufficient condition on the cooling schedule in classical simulated annealing, which has the inverse-logarithmic scaling.
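A minimal sketch of the annealing schedule in question, assuming a toy one-dimensional objective and illustrative constants (the abstract's result concerns the sufficiency of exactly this inverse-logarithmic scaling):

```python
# Sketch of classical simulated annealing with the inverse-logarithmic
# cooling schedule T(t) = c / log(t + t0). Objective and constants are
# illustrative assumptions.
import math
import random

def energy(x: float) -> float:
    return x**4 - 3 * x**2 + 0.5 * x           # toy double-well objective

random.seed(0)
x, c, t0 = 2.0, 1.0, 2.0
for t in range(1, 20001):
    T = c / math.log(t + t0)                   # inverse-logarithmic schedule
    x_new = x + random.gauss(0.0, 0.3)         # propose a local move
    dE = energy(x_new) - energy(x)
    if dE < 0 or random.random() < math.exp(-dE / T):
        x = x_new                              # Metropolis acceptance (detailed balance)
print(round(x, 3), round(energy(x), 3))        # settles near the deeper well
```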
Woldegebriel, Michael; Vivó-Truyols, Gabriel
2016-10-04
A novel method for compound identification in liquid chromatography-high resolution mass spectrometry (LC-HRMS) is proposed. The method, based on Bayesian statistics, accommodates all possible uncertainties involved, from instrumentation up to data analysis into a single model yielding the probability of the compound of interest being present/absent in the sample. This approach differs from the classical methods in two ways. First, it is probabilistic (instead of deterministic); hence, it computes the probability that the compound is (or is not) present in a sample. Second, it answers the hypothesis "the compound is present", opposed to answering the question "the compound feature is present". This second difference implies a shift in the way data analysis is tackled, since the probability of interfering compounds (i.e., isomers and isobaric compounds) is also taken into account.
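A toy numerical illustration of the probabilistic reading (all probabilities below are invented for illustration): Bayes' rule converts likelihoods of the observed LC-HRMS evidence under "present" and "absent" into a posterior probability that the compound is in the sample:

```python
# Toy Bayes' rule illustration of presence/absence identification.
# All numbers are assumptions, not values from the paper.
prior_present = 0.10   # prior probability the compound is in the sample
lik_present = 0.80     # P(observed evidence | compound present)
lik_absent = 0.05      # P(observed evidence | absent, e.g. an isobaric interferent)

posterior = (lik_present * prior_present) / (
    lik_present * prior_present + lik_absent * (1 - prior_present)
)
print(round(posterior, 3))  # ~0.64: evidence supports presence, short of certainty
```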
NASA Astrophysics Data System (ADS)
Knudsen, Steven; Golubovic, Leonardo
Prospects to build Space Elevator (SE) systems have become realistic with ultra-strong materials such as carbon nano-tubes and diamond nano-threads. At cosmic length-scales, space elevators can be modeled as polymer-like floppy strings of tethered mass beads. A new avenue in SE science has emerged with the introduction of the Rotating Space Elevator (RSE) concept, supported by novel algorithms discussed in this presentation. An RSE is a loopy string reaching into outer space. Unlike the classical geostationary SE concepts of Tsiolkovsky, Artsutanov, and Pearson, our RSE exhibits an internal rotation. Thanks to this, objects sliding along the RSE loop spontaneously oscillate between two turning points, one of which is close to the Earth whereas the other is in outer space. The RSE concept thus solves a major problem in SE technology: how to supply energy to the climbers moving along space elevator strings. The investigation of the classical and statistical mechanics of a floppy string interacting with objects sliding along it required the development of subtle computational algorithms described in this presentation.
Vajargah, Kianoush Fathi; Sadeghi-Bazargani, Homayoun; Mehdizadeh-Esfanjani, Robab; Savadi-Oskouei, Daryoush; Farhoudi, Mehdi
2012-01-01
The objective of the present study was to assess the comparable applicability of orthogonal projections to latent structures (OPLS) statistical modeling vs traditional linear regression in order to investigate the role of transcranial Doppler (TCD) sonography in predicting ischemic stroke prognosis. The study was conducted on 116 ischemic stroke patients admitted to a specialty neurology ward. The Unified Neurological Stroke Scale was used once for clinical evaluation in the first week of admission and again six months later. All data were primarily analyzed using simple linear regression and later considered for multivariate analysis using PLS/OPLS models through the SIMCA P+12 statistical software package. The linear regression analysis results used for the identification of TCD predictors of stroke prognosis were confirmed through the OPLS modeling technique. Moreover, in comparison to linear regression, the OPLS model appeared to have higher sensitivity in detecting the predictors of ischemic stroke prognosis and detected several more predictors. Applying the OPLS model made it possible to use both single TCD measures/indicators and arbitrarily dichotomized measures of TCD single vessel involvement, as well as the overall TCD result. In conclusion, the authors recommend PLS/OPLS methods as complementary rather than alternative to the available classical regression models such as linear regression.
NASA Technical Reports Server (NTRS)
Jasperson, W. H.; Nastron, G. D.; Davis, R. E.; Holdeman, J. D.
1984-01-01
Summary studies are presented for the entire cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP). Studies are also presented for GASP particle-concentration data gathered concurrently with the cloud observations. Cloud encounters are shown on about 15 percent of the data samples overall, but the probability of cloud encounter is shown to vary significantly with altitude, latitude, and distance from the tropopause. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover, and the cloud-encounter statistics are shown to be consistent with the classical mid-latitude cyclone model. Observations of clouds spaced more closely than 90 minutes are shown to be statistically dependent. The statistics for cloud and particle encounter are utilized to estimate the frequency of cloud encounter on long-range airline routes, and to assess the probability and extent of laminar flow loss due to cloud or particle encounter by aircraft utilizing laminar flow control (LFC). It is shown that the probability of extended cloud encounter is too low, of itself, to make LFC impractical. This report is presented in two volumes. Volume I contains the narrative, analysis, and conclusions. Volume II contains five supporting appendixes.
Lehoucq, R B; Sears, Mark P
2011-09-01
The purpose of this paper is to derive the energy and momentum conservation laws of the peridynamic nonlocal continuum theory using the principles of classical statistical mechanics. The peridynamic laws allow the consideration of discontinuous motion, or deformation, by relying on integral operators. These operators sum forces and power expenditures separated by a finite distance and so represent nonlocal interaction. The integral operators replace the differential divergence operators conventionally used, thereby obviating special treatment at points of discontinuity. The derivation presented employs a general multibody interatomic potential, avoiding the standard assumption of a pairwise decomposition. The integral operators are also expressed in terms of a stress tensor and heat flux vector under the assumption that these fields are differentiable, demonstrating that the classical continuum energy and momentum conservation laws are consequences of the more general peridynamic laws. An important conclusion is that nonlocal interaction is intrinsic to continuum conservation laws when derived using the principles of statistical mechanics.
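For orientation, a commonly quoted bond-based form of the peridynamic balance of linear momentum is shown below; the paper itself works with more general multibody potentials, so this is context, not the derivation's starting point:

```latex
\rho(\mathbf{x})\,\ddot{\mathbf{u}}(\mathbf{x},t)
  = \int_{\mathcal{H}_{\mathbf{x}}}
      \mathbf{f}\bigl(\mathbf{u}(\mathbf{x}',t)-\mathbf{u}(\mathbf{x},t),\,
                      \mathbf{x}'-\mathbf{x}\bigr)\, dV_{\mathbf{x}'}
    + \mathbf{b}(\mathbf{x},t)
```

Here the integral over the horizon \(\mathcal{H}_{\mathbf{x}}\) replaces the divergence of a stress tensor, which is why discontinuous deformation needs no special treatment.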
NASA Astrophysics Data System (ADS)
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
NASA Astrophysics Data System (ADS)
Siegel, Z.; Siegel, Edward Carl-Ludwig
2011-03-01
RANDOMNESS of Numbers cognitive-semantics DEFINITION VIA Cognition (QUERY: WHAT???, NOT HOW???) VS. computer-"science" mindLESS number-crunching (Harrel-Sipser-...) algorithmics Goldreich "PSEUDO-randomness" [Not. AMS (02)] mea-culpa is ONLY via MAXWELL-BOLTZMANN CLASSICAL-STATISTICS (NOT FDQS!!!) "hot-plasma" REPULSION VERSUS Newcomb (1881)-Weyl (1914; 1916)-Benford (1938) "NeWBe" logarithmic-law digit-CLUMPING/CLUSTERING NON-Randomness simple Siegel [AMS Joint Mtg. (02) Abs. #973-60-124] algebraic-inversion to THE QUANTUM and ONLY BEQS preferentially SEQUENTIALLY lower-DIGITS CLUMPING/CLUSTERING with d = 0 BEC, is ONLY VIA Siegel-Baez FUZZYICS=CATEGORYICS (SON OF TRIZ)/"Category-Semantics" (C-S), the latter the intersection/union of [Lawvere (1964)-Siegel (1964)] category-theory (matrix: MORPHISMS V FUNCTORS) "+" cognitive-semantics (matrix: ANTONYMS V SYNONYMS), yielding Siegel-Baez FUZZYICS=CATEGORYICS/C-S tabular list-format matrix truth-table analytics: MBCS RANDOMNESS TRUTH/EMET!!!
Reversibility in Quantum Models of Stochastic Processes
NASA Astrophysics Data System (ADS)
Gier, David; Crutchfield, James; Mahoney, John; James, Ryan
Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However, ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ ≥ Cq ≥ E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.
A pedagogical approach to the Boltzmann factor through experiments and simulations
NASA Astrophysics Data System (ADS)
Battaglia, O. R.; Bonura, A.; Sperandeo-Mineo, R. M.
2009-09-01
The Boltzmann factor is the basis of a huge amount of thermodynamic and statistical physics, both classical and quantum. It governs the behaviour of all systems in nature that are exchanging energy with their environment. Understanding why the expression has this specific form involves a deep mathematical analysis, whose flow of logic is hard to see and is beyond the level of high school or college students' preparation. We here present some experiments and simulations aimed at directly deriving its mathematical expression and illustrating the fundamental concepts on which it is grounded. The experiments use easily available apparatuses, and the simulations are developed in the NetLogo environment which, besides having a user-friendly interface, allows easy interaction with the algorithm. The approach supplies pedagogical support for the introduction of the Boltzmann factor at the undergraduate level to students without a background in statistical mechanics.
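In the spirit of the simulations described, the sketch below lets many agents exchange unit quanta of energy at random; the single-agent energy histogram relaxes toward the exponential Boltzmann form (agent counts and step numbers are arbitrary choices, and this is a generic model, not the paper's NetLogo code):

```python
# Energy-exchange toy model: repeated random transfers of unit quanta
# drive the single-agent energy distribution toward exp(-E/kT).
import random
from collections import Counter

random.seed(0)
N, quanta_per_agent, steps = 5000, 5, 500000
energy = [quanta_per_agent] * N
for _ in range(steps):
    i, j = random.randrange(N), random.randrange(N)
    if energy[i] > 0:            # move one quantum from agent i to agent j
        energy[i] -= 1
        energy[j] += 1

counts = Counter(energy)
for e in range(6):
    print(e, counts[e] / N)      # successive bin ratios ~ constant: the Boltzmann factor
```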
Nogueira, F; Couto, E G; Bernardi, C J
2002-11-01
The Pantanal of Mato Grosso presents distinct landscape units: permanently, occasionally, and periodically flooded areas. In the last of these, sampling is especially difficult due to the high heterogeneity occurring inter- and intrastrata. This paper presents a comparison of different methodological approaches, showing that they can decisively influence our knowledge of organic matter distribution dynamics. In order to understand the role of the flood pulse in the distribution dynamics of organic matter in such a wetland at the Pantanal, we assumed that there is spatial dependence between sampling points. This assumption contradicts the classical statistical principle of randomness (independence between samples), and it allowed a larger volume of information to be obtained from a smaller sampling effort, i.e., better performance with savings in time and money.
Investigation of Particle Sampling Bias in the Shear Flow Field Downstream of a Backward Facing Step
NASA Technical Reports Server (NTRS)
Meyers, James F.; Kjelgaard, Scott O.; Hepner, Timothy E.
1990-01-01
The flow field about a backward facing step was investigated to determine the characteristics of particle sampling bias in the various flow phenomena. The investigation used the calculation of the velocity:data rate correlation coefficient as a measure of statistical dependence and thus the degree of velocity bias. While the investigation found negligible dependence within the free stream region, increased dependence was found within the boundary and shear layers. Full classic correction techniques over-compensated the data since the dependence was weak, even in the boundary layer and shear regions. The paper emphasizes the necessity to determine the degree of particle sampling bias for each measurement ensemble and not use generalized assumptions to correct the data. Further, it recommends the calculation of the velocity:data rate correlation coefficient become a standard statistical calculation in the analysis of all laser velocimeter data.
Dependence of exponents on text length versus finite-size scaling for word-frequency distributions
NASA Astrophysics Data System (ADS)
Corral, Álvaro; Font-Clos, Francesc
2017-08-01
Some authors have recently argued that a finite-size scaling law for the text-length dependence of word-frequency distributions cannot be conceptually valid. Here we give solid quantitative evidence for the validity of this scaling law, using both careful statistical tests and analytical arguments based on the generalized central-limit theorem applied to the moments of the distribution (and obtaining a novel derivation of Heaps' law as a by-product). We also find that the picture of word-frequency distributions with power-law exponents that decrease with text length [X. Yan and P. Minnhagen, Physica A 444, 828 (2016), 10.1016/j.physa.2015.10.082] does not stand up to rigorous statistical analysis. Instead, we show that the distributions are perfectly described by power-law tails with stable exponents, whose values are close to 2, in agreement with the classical Zipf's law. Some misconceptions about scaling are also clarified.
Asteroid orbital error analysis: Theory and application
NASA Technical Reports Server (NTRS)
Muinonen, K.; Bowell, Edward
1992-01-01
We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation does give the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
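The linearized propagation step invoked here has the standard form (notation assumed): if \(\Phi(t,t_0)\) denotes the matrix of partial derivatives of the state at epoch \(t\) with respect to the orbital elements at epoch \(t_0\), the covariance maps as

```latex
\Sigma(t) = \Phi(t, t_0)\, \Sigma(t_0)\, \Phi(t, t_0)^{\mathsf{T}}
```

so a Gaussian element density at one epoch yields Gaussian positional uncertainty ellipsoids at any other, within the linear approximation.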
Investigating Student Understanding for a Statistical Analysis of Two Thermally Interacting Solids
NASA Astrophysics Data System (ADS)
Loverude, Michael E.
2010-10-01
As part of an ongoing research and curriculum development project for upper-division courses in thermal physics, we have developed a sequence of tutorials in which students apply statistical methods to examine the behavior of two interacting Einstein solids. In the sequence, students begin with simple results from probability and develop a means for counting the states in a single Einstein solid. The students then consider the thermal interaction of two solids, and observe that the classical equilibrium state corresponds to the most probable distribution of energy between the two solids. As part of the development of the tutorial sequence, we have developed several assessment questions to probe student understanding of various aspects of this system. In this paper, we describe the strengths and weaknesses of student reasoning, both qualitative and quantitative, to assess the readiness of students for one tutorial in the sequence.
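The counting at the heart of the tutorial sequence is compact enough to state in code: the multiplicity of an Einstein solid with N oscillators and q quanta is the binomial coefficient C(q+N-1, q), and the product multiplicity of two interacting solids peaks at the classical equilibrium split (the solid sizes below are arbitrary):

```python
# Multiplicity counting for two thermally interacting Einstein solids.
from math import comb

def multiplicity(N: int, q: int) -> int:
    """Number of microstates of an Einstein solid: C(q + N - 1, q)."""
    return comb(q + N - 1, q)

N_A, N_B, q_total = 300, 200, 100
omega = [multiplicity(N_A, qA) * multiplicity(N_B, q_total - qA)
         for qA in range(q_total + 1)]
q_star = max(range(q_total + 1), key=lambda qA: omega[qA])
print(q_star)  # ~60: energy splits in proportion to oscillator numbers (equilibrium)
```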
Physics of Electronic Materials
NASA Astrophysics Data System (ADS)
Rammer, Jørgen
2017-03-01
1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.
New efficient optimizing techniques for Kalman filters and numerical weather prediction models
NASA Astrophysics Data System (ADS)
Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis
2016-06-01
The need for accurate local environmental predictions and simulations beyond the classical meteorological forecasts has been increasing in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazards early warning systems, and questions on global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of the model bias and the reduction of the error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work lies in the solid mathematical background adopted, making use of Information Geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
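A minimal sketch of one standard role for a Kalman filter in this setting, assuming the common simplification that the model's systematic error is a slowly varying scalar bias (the noise variances are illustrative, not the paper's values):

```python
# Scalar Kalman filter tracking the slowly varying bias of a forecast
# model; the corrected forecast is raw forecast plus the bias estimate.
import numpy as np

rng = np.random.default_rng(0)
true_bias = 1.5
obs_minus_forecast = true_bias + rng.normal(0.0, 1.0, size=200)  # daily forecast errors

x, P = 0.0, 10.0        # bias estimate and its variance
Q, R = 0.01, 1.0        # process and observation noise variances (assumptions)
for y in obs_minus_forecast:
    P += Q              # predict: bias modeled as near-constant
    K = P / (P + R)     # Kalman gain
    x += K * (y - x)    # update with today's observed error
    P *= (1 - K)
print(round(x, 2))      # ~1.5, the systematic error to subtract from new forecasts
```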
A.N. Kolmogorov’s defence of Mendelism
Stark, Alan; Seneta, Eugene
2011-01-01
In 1939 N.I. Ermolaeva published the results of an experiment which repeated parts of Mendel’s classical experiments. On the basis of her experiment she concluded that Mendel’s principle that self-pollination of hybrid plants gave rise to segregation proportions 3:1 was false. The great probability theorist A.N. Kolmogorov reviewed Ermolaeva’s data using a test, now referred to as Kolmogorov’s, or Kolmogorov-Smirnov, test, which he had proposed in 1933. He found, contrary to Ermolaeva, that her results clearly confirmed Mendel’s principle. This paper shows that there were methodological flaws in Kolmogorov’s statistical analysis and presents a substantially adjusted approach, which confirms his conclusions. Some historical commentary on the Lysenko-era background is given, to illuminate the relationship of the disciplines of genetics and statistics in the struggle against the prevailing politically-correct pseudoscience in the Soviet Union. There is a Brazilian connection through the person of Th. Dobzhansky. PMID:21734813
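The flavor of Kolmogorov's check can be reproduced in a few lines: standardize each family's count of dominant phenotypes against the 3:1 expectation and test the standardized deviations for normality with the Kolmogorov-Smirnov test (the data below are simulated under Mendel's law, not Ermolaeva's actual counts):

```python
# KS-style check of 3:1 segregation across families, on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_family = rng.integers(20, 60, size=100)        # family sizes
dominant = rng.binomial(n_per_family, 0.75)          # counts under the 3:1 ratio
z = (dominant - 0.75 * n_per_family) / np.sqrt(n_per_family * 0.75 * 0.25)

print(stats.kstest(z, "norm"))                       # large p-value: consistent with 3:1
```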
Universality classes of fluctuation dynamics in hierarchical complex systems
NASA Astrophysics Data System (ADS)
Macêdo, A. M. S.; González, Iván R. Roa; Salazar, D. S. P.; Vasconcelos, G. L.
2017-03-01
A unified approach is proposed to describe the statistics of the short-time dynamics of multiscale complex systems. The probability density function of the relevant time series (signal) is represented as a statistical superposition of a large time-scale distribution weighted by the distribution of certain internal variables that characterize the slowly changing background. The dynamics of the background is formulated as a hierarchical stochastic model whose form is derived from simple physical constraints, which in turn restrict the dynamics to only two possible classes. The probability distributions of both the signal and the background have simple representations in terms of Meijer G functions. The two universality classes for the background dynamics manifest themselves in the signal distribution as two types of tails: power law and stretched exponential, respectively. A detailed analysis of empirical data from classical turbulence and financial markets shows excellent agreement with the theory.
Atlas of the Light Curves and Phase Plane Portraits of Selected Long-Period Variables
NASA Astrophysics Data System (ADS)
Kudashkina, L. S.; Andronov, I. L.
2017-12-01
For a group of Mira-type stars, semi-regular variables, and some RV Tau-type stars, the limit cycles were computed and plotted using phase plane diagrams. As generalized coordinates x and x′, we used the brightness of the star and its phase derivative. We used mean phase light curves based on observations of various authors from the AAVSO, AFOEV, VSOLJ, and ASAS databases, approximated with a trigonometric polynomial of statistically optimal degree. For a simple sine-like light curve, the limit cycle is a simple ellipse. For a more complicated light curve, in which harmonics are statistically significant, the limit cycle deviates from an ellipse. In addition to the classical analysis, we use the error estimates of the smoothing function and its derivative to constrain an "error corridor" in the phase plane.
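A minimal sketch of the phase plane construction, assuming a synthetic light curve in place of the AAVSO/AFOEV/VSOLJ/ASAS data: fit a trigonometric polynomial, differentiate it analytically, and pair brightness with its phase derivative to trace the limit cycle:

```python
# Trigonometric-polynomial fit of a phased light curve and its analytic
# derivative; (mag, dmag) points trace the limit cycle in the phase plane.
import numpy as np

phase = np.linspace(0.0, 1.0, 200, endpoint=False)
mag = 8.0 + 1.2 * np.sin(2 * np.pi * phase) + 0.3 * np.sin(4 * np.pi * phase + 0.5)

degree = 4                                            # "statistically optimal" in practice
k = np.arange(1, degree + 1)
A = np.hstack([np.ones((phase.size, 1)),
               np.cos(2 * np.pi * np.outer(phase, k)),
               np.sin(2 * np.pi * np.outer(phase, k))])
coef, *_ = np.linalg.lstsq(A, mag, rcond=None)

a, b = coef[1:degree + 1], coef[degree + 1:]
dA = np.hstack([-np.sin(2 * np.pi * np.outer(phase, k)) * 2 * np.pi * k,
                np.cos(2 * np.pi * np.outer(phase, k)) * 2 * np.pi * k])
dmag = dA @ np.concatenate([a, b])                    # d(mag)/d(phase)
cycle = np.column_stack([mag, dmag])                  # points of the limit cycle
print(cycle[:3])
```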
Harrison, Jay M; Breeze, Matthew L; Harrigan, George G
2011-08-01
Statistical comparisons of compositional data generated on genetically modified (GM) crops and their near-isogenic conventional (non-GM) counterparts typically rely on classical significance testing. This manuscript presents an introduction to Bayesian methods for compositional analysis along with recommendations for model validation. The approach is illustrated using protein and fat data from two herbicide tolerant GM soybeans (MON87708 and MON87708×MON89788) and a conventional comparator grown in the US in 2008 and 2009. Guidelines recommended by the US Food and Drug Administration (FDA) in conducting Bayesian analyses of clinical studies on medical devices were followed. This study is the first Bayesian approach to GM and non-GM compositional comparisons. The evaluation presented here supports a conclusion that a Bayesian approach to analyzing compositional data can provide meaningful and interpretable results. We further describe the importance of method validation and approaches to model checking if Bayesian approaches to compositional data analysis are to be considered viable by scientists involved in GM research and regulation. Copyright © 2011 Elsevier Inc. All rights reserved.
Evaluation of Adherence to Nutritional Intervention Through Trajectory Analysis.
Sevilla-Villanueva, B; Gibert, K; Sanchez-Marre, M; Fito, M; Covas, M I
2017-05-01
Classical pre-post intervention studies are often analyzed using traditional statistics. Nevertheless, nutritional interventions have small effects on the metabolism, and traditional statistics are not always sufficient to detect these subtle nutrient effects. Generally, this kind of study assumes that the participants adhere to the assigned dietary intervention and directly analyzes its effects on the target parameters; thus, the evaluation of adherence is generally omitted, although participants sometimes do not effectively adhere to the assigned dietary guidelines. For this reason, the trajectory map is proposed as a visual tool in which the dietary patterns of individuals can be followed during the intervention and related to the nutritional prescriptions. Trajectory analysis is also proposed, allowing both analyses: 1) adherence to the intervention, and 2) intervention effects. The analysis is made by projecting the differences in the target parameters over the resulting trajectories between states at different time-stamps, which may be considered either individually or by groups. The proposal has been applied to a real nutritional study, showing that some individuals adhere better than others and that some individuals in the control group modify their habits during the intervention. In addition, the intervention effects differ depending on the type of individual; some subgroups even have opposite responses to the same intervention.
The ambiguity of simplicity in quantum and classical simulation
NASA Astrophysics Data System (ADS)
Aghamohammadi, Cina; Mahoney, John R.; Crutchfield, James P.
2017-04-01
A system's perceived simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Here, we associate simplicity with small model-memory. We see that the notions of absolute physical simplicity at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham's Razor or to the 'elegance' of competing theories, may be fundamentally subjective. Recent rapid progress in quantum computation and quantum simulation suggests that the ambiguity of simplicity will strongly impact statistical inference and, in particular, model selection.
Chance, determinism and the classical theory of probability.
Vasudevan, Anubav
2018-02-01
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.
Power-law distributions for a trapped ion interacting with a classical buffer gas.
DeVoe, Ralph G
2009-02-13
Classical collisions with an ideal gas generate non-Maxwellian distribution functions for a single ion in a radio frequency ion trap. The distributions have power-law tails whose exponent depends on the ratio of buffer gas to ion mass. This provides a statistical explanation for the previously observed transition from cooling to heating. Monte Carlo results approximate a Tsallis distribution over a wide range of parameters and have ab initio agreement with experiment.
NASA Astrophysics Data System (ADS)
Gao, Zhi-yu; Kang, Yu; Li, Yan-shuai; Meng, Chao; Pan, Tao
2018-04-01
Elevated-temperature flow behavior of a novel Ni-Cr-Mo-B ultra-heavy-plate steel was investigated by conducting hot compressive deformation tests on a Gleeble-3800 thermo-mechanical simulator at a temperature range of 1123 K–1423 K, with a strain rate range from 0.01 s⁻¹ to 10 s⁻¹ and a height reduction of 70%. Based on the experimental results, a classic strain-compensated Arrhenius-type model, a new revised strain-compensated Arrhenius-type model, and a classic modified Johnson-Cook constitutive model were developed for predicting the high-temperature deformation behavior of the steel. The predictability of these models was comparatively evaluated in terms of statistical parameters including the correlation coefficient (R), average absolute relative error (AARE), average root mean square error (RMSE), normalized mean bias error (NMBE), and relative error. The statistical results indicate that the new revised strain-compensated Arrhenius-type model accurately predicts the elevated-temperature flow stress of the steel over the entire range of process conditions. The predictions of the classic modified Johnson-Cook model, however, did not agree well with the experimental values; the classic strain-compensated Arrhenius-type model tracked the deformation behavior more accurately than the modified Johnson-Cook model, but less accurately than the new revised strain-compensated Arrhenius-type model. In addition, the reasons for the differences in predictability of these models are discussed in detail.
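The statistical parameters named above have common (though not universal) definitions; the sketch below computes them for a predicted versus experimental series under those assumed definitions, with placeholder data:

```python
# Common definitions of R, AARE, RMSE and NMBE for flow-stress model
# evaluation; exact conventions vary between papers.
import numpy as np

def metrics(y_exp: np.ndarray, y_pred: np.ndarray) -> dict:
    r = np.corrcoef(y_exp, y_pred)[0, 1]                     # correlation coefficient R
    aare = np.mean(np.abs((y_exp - y_pred) / y_exp)) * 100   # AARE, %
    rmse = np.sqrt(np.mean((y_exp - y_pred) ** 2))           # RMSE
    nmbe = np.sum(y_pred - y_exp) / np.sum(y_exp) * 100      # NMBE, %
    return {"R": r, "AARE%": aare, "RMSE": rmse, "NMBE%": nmbe}

y_exp = np.array([120.0, 95.0, 80.0, 60.0])                  # placeholder stresses, MPa
y_pred = np.array([118.0, 97.0, 78.5, 62.0])
print(metrics(y_exp, y_pred))
```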
Balcı, Nilay Comuk; Yuruk, Zeliha Ozlem; Zeybek, Aslican; Gulsen, Mustafa; Tekindal, Mustafa Agah
2016-01-01
[Purpose] The aim of our study was to compare the initial effects of scapular proprioceptive neuromuscular facilitation techniques and classic exercise interventions with physiotherapy modalities on pain, scapular dyskinesis, range of motion, and function in adhesive capsulitis. [Subjects and Methods] Fifty-three subjects were allocated to 3 groups: scapular proprioceptive neuromuscular facilitation exercises and physiotherapy modalities, classic exercise and physiotherapy modalities, and physiotherapy modalities only. The intervention was applied in a single session. The Visual Analog Scale, Lateral Scapular Slide Test, range of motion, and Simple Shoulder Test were evaluated before and just after the one-hour intervention in the same session. [Results] All of the groups showed significant differences in shoulder flexion and abduction range of motion and Simple Shoulder Test scores. There were statistically significant differences in Visual Analog Scale scores in the proprioceptive neuromuscular facilitation and control groups, and no treatment method had a significant effect on the Lateral Scapular Slide Test results. There were no statistically significant differences between the groups before and after the intervention. [Conclusion] Proprioceptive neuromuscular facilitation, classic exercise, and physiotherapy modalities had immediate effects on adhesive capsulitis in our study. However, there was no additional benefit of exercises in one session over physiotherapy modalities. Also, an effective treatment regimen for shoulder rehabilitation of adhesive capsulitis patients should include scapular exercises. PMID:27190456
NASA Astrophysics Data System (ADS)
Barette, Florian; Poppe, Sam; Smets, Benoît; Benbakkar, Mhammed; Kervyn, Matthieu
2017-10-01
We present an integrated, spatially-explicit database of existing geochemical major-element analyses available from (post-) colonial scientific reports, PhD Theses and international publications for the Virunga Volcanic Province, located in the western branch of the East African Rift System. This volcanic province is characterised by alkaline volcanism, including silica-undersaturated, alkaline and potassic lavas. The database contains a total of 908 geochemical analyses of eruptive rocks for the entire volcanic province with a localisation for most samples. A preliminary analysis of the overall consistency of the database, using statistical techniques on sets of geochemical analyses with contrasted analytical methods or dates, demonstrates that the database is consistent. We applied a principal component analysis and cluster analysis on whole-rock major element compositions included in the database to study the spatial variation of the chemical composition of eruptive products in the Virunga Volcanic Province. These statistical analyses identify spatially distributed clusters of eruptive products. The known geochemical contrasts are highlighted by the spatial analysis, such as the unique geochemical signature of Nyiragongo lavas compared to other Virunga lavas, the geochemical heterogeneity of the Bulengo area, and the trachyte flows of Karisimbi volcano. Most importantly, we identified separate clusters of eruptive products which originate from primitive magmatic sources. These lavas of primitive composition are preferentially located along NE-SW inherited rift structures, often at distance from the central Virunga volcanoes. Our results illustrate the relevance of a spatial analysis on integrated geochemical data for a volcanic province, as a complement to classical petrological investigations. This approach indeed helps to characterise geochemical variations within a complex of magmatic systems and to identify specific petrologic and geochemical investigations that should be tackled within a study area.
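A sketch of the statistical workflow described, with a random array standing in for the 908 major-element analyses and the number of components and clusters chosen arbitrarily:

```python
# Standardize major-element compositions, reduce with PCA, cluster the
# scores; cluster labels would then be mapped back to sample locations.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
oxides = rng.normal(size=(908, 10))          # placeholder for SiO2, TiO2, Al2O3, ... columns

scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(oxides))
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))                   # cluster sizes, to be examined spatially
```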
Semenov, Alexander V; Elsas, Jan Dirk; Glandorf, Debora C M; Schilthuizen, Menno; Boer, Willem F
2013-01-01
To fulfill existing guidelines, applicants that aim to place their genetically modified (GM) insect-resistant crop plants on the market are required to provide data from field experiments that address the potential impacts of the GM plants on nontarget organisms (NTOs). Such data may be based on varied experimental designs. The recent EFSA guidance document for environmental risk assessment (2010) does not provide clear and structured suggestions that address the statistics of field trials on effects on NTOs. This review examines existing practices in GM plant field testing such as the way of randomization, replication, and pseudoreplication. Emphasis is placed on the importance of design features used for the field trials in which effects on NTOs are assessed. The importance of statistical power and the positive and negative aspects of various statistical models are discussed. Equivalence and difference testing are compared, and the importance of checking the distribution of experimental data is stressed to decide on the selection of the proper statistical model. While for continuous data (e.g., pH and temperature) classical statistical approaches – for example, analysis of variance (ANOVA) – are appropriate, for discontinuous data (counts) only generalized linear models (GLM) are shown to be efficient. There is no golden rule as to which statistical test is the most appropriate for any experimental situation. In particular, in experiments in which block designs are used and covariates play a role, GLMs should be used. The combination of decision trees and a checklist for field trials, which are provided, will help in the interpretation of the statistical analyses of field trials and in assessing whether such analyses were correctly applied. We offer generic advice to risk assessors and applicants that will help in both the setting up of field testing and the interpretation and data analysis of the data obtained in field testing. PMID:24567836
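To make the GLM recommendation for count data concrete, a minimal sketch follows (the treatment effect, plot numbers, and abundances are simulated assumptions, not data from any trial):

```python
# Poisson GLM for NTO counts on GM vs. non-GM plots, via statsmodels,
# instead of ANOVA on raw counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
gm = np.repeat([0, 1], 40)                          # 40 non-GM plots, 40 GM plots
counts = rng.poisson(lam=np.where(gm == 1, 4.0, 5.0))

X = sm.add_constant(gm.astype(float))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.summary().tables[1])                      # coefficient = log rate ratio for GM
```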
Design of off-statistics axial-flow fans by means of vortex law optimization
NASA Astrophysics Data System (ADS)
Lazari, Andrea; Cattanei, Andrea
2014-12-01
Off-statistics input data sets are common in axial-flow fan design and may easily result in some violation of the requirements of a good aerodynamic blade design. In order to circumvent this problem, in the present paper a solution to the radial equilibrium equation is found which minimizes the outlet kinetic energy and fulfills the aerodynamic constraints, thus ensuring that the resulting blade has acceptable aerodynamic performance. The presented method is based on the optimization of a three-parameter vortex law and of the meridional channel size. The aerodynamic quantities to be employed as constraints are identified, and suitable ranges of variation are proposed. The method is validated by means of a design with critical input data values and CFD analysis. Then, by means of systematic computations with different input data sets, correlations and charts are obtained which are analogous to classic correlations based on statistical investigations of existing machines. Such new correlations help size a fan of given characteristics as well as study the feasibility of a given design.
Information flow and quantum cryptography using statistical fluctuations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Home, D.; Whitaker, M.A.B.
2003-02-01
A procedure is formulated, using the quantum teleportation arrangement, that communicates knowledge of an apparatus setting between the wings of the experiment, using statistical fluctuations in a sequence of measurement results. It requires an entangled state, and transmission of classical information totally unrelated to the apparatus setting actually communicated. Our procedure has conceptual interest, and has applications to quantum cryptography.
Feynman graphs and the large dimensional limit of multipartite entanglement
NASA Astrophysics Data System (ADS)
Di Martino, Sara; Facchi, Paolo; Florio, Giuseppe
2018-01-01
In this paper, we extend the analysis of multipartite entanglement, based on techniques from classical statistical mechanics, to a system composed of n d-level parties (qudits). We introduce a suitable partition function at a fictitious temperature with the average local purity of the system as Hamiltonian. In particular, we analyze the high-temperature expansion of this partition function, prove the convergence of the series, and study its asymptotic behavior as d → ∞. We make use of a diagrammatic technique, classify the graphs, and study their degeneracy. We are thus able to evaluate their contributions and estimate the moments of the distribution of the local purity.
A Random Variable Approach to Nuclear Targeting and Survivability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Undem, Halvor A.
We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C; Downing, James R; Lamba, Jatinder
2009-08-15
In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org.
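The test statistic is simple to sketch: compute one association statistic per endpoint, project onto the vector of biologically most interesting values, and calibrate by permutation (the data, the interest vector, and the use of Pearson correlations below are illustrative assumptions, not the published implementation):

```python
# PROMISE-style projection statistic with a permutation p-value.
import numpy as np

rng = np.random.default_rng(0)
n = 60
gene = rng.normal(size=n)                              # genomic variable
endpoints = np.column_stack([0.5 * gene + rng.normal(size=n),
                             -0.4 * gene + rng.normal(size=n)])
interest = np.array([1.0, -1.0])                       # desired pattern of associations

def promise_stat(g, E):
    r = np.array([np.corrcoef(g, E[:, j])[0, 1] for j in range(E.shape[1])])
    return r @ interest                                # projection onto interest vector

obs = promise_stat(gene, endpoints)
perm = np.array([promise_stat(rng.permutation(gene), endpoints)
                 for _ in range(2000)])
print(obs, (np.sum(perm >= obs) + 1) / (perm.size + 1))  # one-sided permutation p-value
```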
Large-Scale Quantitative Analysis of Painting Arts
Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong
2014-01-01
Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of art paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increment of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances. PMID:25501877
Wigner surmises and the two-dimensional homogeneous Poisson point process.
Sakhr, Jamal; Nieminen, John M
2006-04-01
We derive a set of identities that relate the higher-order interpoint spacing statistics of the two-dimensional homogeneous Poisson point process to the Wigner surmises for the higher-order spacing distributions of eigenvalues from the three classical random matrix ensembles. We also report a remarkable identity that equates the second-nearest-neighbor spacing statistics of the points of the Poisson process and the nearest-neighbor spacing statistics of complex eigenvalues from Ginibre's ensemble of 2 x 2 complex non-Hermitian random matrices.
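For context, the Wigner surmise for nearest-neighbor spacings in the Gaussian orthogonal ensemble, and the corresponding Poisson result, are

```latex
P_{\mathrm{GOE}}(s) = \frac{\pi s}{2}\, e^{-\pi s^{2}/4},
\qquad
P_{\mathrm{Poisson}}(s) = e^{-s}
```

(with the mean spacing normalized to one); the identities reported in the paper relate higher-order analogues of such distributions across the two settings.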
A novel bi-level meta-analysis approach: applied to biological pathway analysis.
Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin
2016-02-01
The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases: acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
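The additive combination invoked here is easy to sketch: under the null, independent p-values are Uniform(0,1), so by the Central Limit Theorem their sum is approximately Normal(n/2, n/12); the p-values below are invented for illustration:

```python
# Additive p-value combination with a CLT normal approximation.
import numpy as np
from scipy import stats

p = np.array([0.08, 0.21, 0.04, 0.15, 0.30, 0.11, 0.02])  # illustrative p-values
n = p.size
z = (p.sum() - n / 2) / np.sqrt(n / 12)     # standardized sum under the null
combined = stats.norm.cdf(z)                # small sums (small p-values) -> small combined p
print(round(combined, 4))
```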
Novel hyperspectral prediction method and apparatus
NASA Astrophysics Data System (ADS)
Kemeny, Gabor J.; Crothers, Natalie A.; Groth, Gard A.; Speck, Kathy A.; Marbach, Ralf
2009-05-01
Both the power and the challenge of hyperspectral technologies lie in the very large amount of data produced by spectral cameras. While off-line methodologies allow the collection of gigabytes of data, extended data analysis sessions are required to convert the data into useful information. In contrast, real-time monitoring, such as on-line process control, requires that compression of spectral data and analysis occur at a sustained full camera data rate. Efficient, high-speed practical methods for calibration and prediction are therefore sought to optimize the value of hyperspectral imaging. A novel method of matched filtering known as science based multivariate calibration (SBC) was developed for hyperspectral calibration. Classical (MLR) and inverse (PLS, PCR) methods are combined by spectroscopically measuring the spectral "signal" and by statistically estimating the spectral "noise." The accuracy of the inverse model is thus combined with the easy interpretability of the classical model. The SBC method is optimized for hyperspectral data in the Hyper-Cal™ software used for the present work. The prediction algorithms can then be downloaded into a dedicated FPGA-based High-Speed Prediction Engine™ module. Spectral pretreatments and calibration coefficients are stored on interchangeable SD memory cards, and predicted compositions are produced on a USB interface at real-time camera output rates. Applications include minerals, pharmaceuticals, food processing and remote sensing.
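To make the matched-filter idea concrete, the sketch below implements the generic SBC-style regression vector b = S⁻¹s / (sᵀS⁻¹s), which combines a measured analyte signal spectrum with a statistically estimated noise covariance. It is a hedged illustration of the general formula, not the Hyper-Cal™ implementation, and all data are toy values:

```python
import numpy as np

def sbc_regression_vector(signal, noise_cov):
    # Matched filter: b = S^{-1} s / (s^T S^{-1} s), so that b.s = 1
    # (unit response to the analyte) while the response to the modeled
    # spectral noise is minimized.
    s = np.asarray(signal, dtype=float)
    Sinv_s = np.linalg.solve(np.asarray(noise_cov, dtype=float), s)
    return Sinv_s / (s @ Sinv_s)

# Toy usage: a 5-channel analyte spectrum, white noise plus a baseline term.
s = np.array([0.1, 0.4, 1.0, 0.5, 0.1])
S = 0.01 * np.eye(5) + 0.05 * np.outer(np.ones(5), np.ones(5))
b = sbc_regression_vector(s, S)
print(b @ s)  # -> 1.0, unit analyte response by construction
```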
Quantum and classical behavior in interacting bosonic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertzberg, Mark P.
It is understood that in free bosonic theories, the classical field theory accurately describes the full quantum theory when the occupancy numbers of systems are very large. However, the situation is less understood in interacting theories, especially on time scales longer than the dynamical relaxation time. Recently there have been claims that the quantum theory deviates spectacularly from the classical theory on this time scale, even if the occupancy numbers are extremely large. Furthermore, it is claimed that the quantum theory quickly thermalizes while the classical theory does not. The evidence for these claims comes from noticing a spectacular difference in the time evolution of expectation values of quantum operators compared to the classical micro-state evolution. If true, this would have dramatic consequences for many important phenomena, including laboratory studies of interacting BECs, dark matter axions, preheating after inflation, etc. In this work we critically examine these claims. We show that in fact the classical theory can describe the quantum behavior in the high occupancy regime, even when interactions are large. The connection is that the expectation values of quantum operators in a single quantum micro-state are approximated by a corresponding classical ensemble average over many classical micro-states. Furthermore, by the ergodic theorem, a classical ensemble average of local fields with statistical translation invariance is the spatial average of a single micro-state. So the correlation functions of the quantum and classical field theories of a single micro-state approximately agree at high occupancy, even in interacting systems. Furthermore, both quantum and classical field theories can thermalize, when appropriate coarse graining is introduced, with the classical case requiring a cutoff on low occupancy UV modes. We discuss applications of our results.
Bookstein, Fred L; Domjanić, Jacqueline
2014-09-01
The relationship of geometric morphometrics (GMM) to functional analysis of the same morphological resources is currently a topic of active interest among functional morphologists. Although GMM is typically advertised as free of prior assumptions about shape features or morphological theories, it is common for GMM findings to be concordant with findings from studies based on a-priori lists of shape features whenever prior insights or theories have been properly accounted for in the study design. The present paper demonstrates this happy possibility by revisiting a previously published GMM analysis of footprint outlines for which there is also functionally relevant information in the form of a-priori foot measurements. We show how to convert the conventional measurements into the language of shape, thereby affording two parallel statistical analyses. One is the classic multivariate analysis of "shape features", the other the equally classic GMM of semilandmark coordinates. In this example, the two data sets, analyzed by protocols that are remarkably different in both their geometry and their algebra, nevertheless result in one common biometrical summary: wearing high heels is bad for women inasmuch as it leads to the need for orthotic devices to treat the consequently flattened arch. This concordance bears implications for other branches of applied anthropology. To carry out a good biomedical analysis of applied anthropometric data it may not matter whether one uses GMM or instead an adequate assortment of conventional measurements. What matters is whether the conventional measurements have been selected in order to match the natural spectrum of functional variation.
Rasch analysis on OSCE data: An illustrative example.
Tor, E; Steketee, C
2011-01-01
The Objective Structured Clinical Examination (OSCE) is a widely used tool for the assessment of clinical competence in health professional education. The goal of the OSCE is to make reproducible decisions on pass/fail status as well as students' levels of clinical competence according to their demonstrated abilities based on the scores. This paper explores the use of the polytomous Rasch model in evaluating the psychometric properties of OSCE scores through a case study. The authors analysed an OSCE data set (comprising 11 stations) for 80 fourth-year medical students based on the polytomous Rasch model in an effort to answer two research questions: (1) Do the clinical tasks assessed in the 11 OSCE stations map onto a common underlying construct, namely clinical competence? (2) What other insights can Rasch analysis offer in terms of scaling, item analysis and instrument validation over and above the conventional analysis based on classical test theory? The OSCE data set demonstrated a sufficient degree of fit to the Rasch model (χ² = 17.060, df = 22, p = 0.76), indicating that the 11 OSCE station scores have sufficient psychometric properties to form a measure of a common underlying construct, i.e. clinical competence. Individual OSCE station scores with good fit to the Rasch model (p > 0.1 for all χ² statistics) further corroborated the unidimensionality of the OSCE scale for clinical competence. A Person Separation Index (PSI) of 0.704 indicates a sufficient level of reliability for the OSCE scores. Other useful findings from the Rasch analysis that provide insights over and above the analysis based on classical test theory are also exemplified using the data set. The polytomous Rasch model provides a useful and supplementary approach to the calibration and analysis of OSCE examination data.
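For readers unfamiliar with the model, the polytomous Rasch (partial credit) model used in such analyses is conventionally written as follows; this is the standard textbook formulation, not a derivation from the paper:

```latex
% Partial credit (polytomous Rasch) model: person v with ability \theta_v,
% station i with thresholds \delta_{ik} and maximum score m_i:
P(X_{vi} = x) =
  \frac{\exp\!\big(\sum_{k=0}^{x} (\theta_v - \delta_{ik})\big)}
       {\sum_{h=0}^{m_i} \exp\!\big(\sum_{k=0}^{h} (\theta_v - \delta_{ik})\big)},
\qquad x = 0, 1, \ldots, m_i,
% with the usual convention that the k = 0 term of each sum is zero.
```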
Statistics of indicated pressure in combustion engine.
NASA Astrophysics Data System (ADS)
Sitnik, L. J.; Andrych-Zalewska, M.
2016-09-01
The paper presents the classic form of pressure waveforms in the combustion chamber of a diesel engine, set on a strict analytical basis that corrects for the displacement volume. The pressure measurements were obtained with the engine running on a dynamometer stand. The study was conducted using the 13-phase ESC test (European Stationary Cycle). In each test phase, 90 pressure waveforms were archived. Extensive statistical analysis showed that while the engine is idling, the distribution of the 90 pressure values at any crankshaft angle can be described by a uniform distribution. At each point of the engine characteristic corresponding to the individual phases of the ESC test, the 90 pressure values at any crankshaft angle can be described by a normal distribution. These relationships were verified using the Shapiro-Wilk, Jarque-Bera, Lilliefors and Anderson-Darling tests. Subsequently, for each value of the crank angle, descriptive statistics of the pressure data were obtained. In essence, this constitutes a new way to approach pressure-waveform analysis in the combustion chamber of an engine. The new method can support further analysis, especially of the combustion process in the engine. For example, very large variances of pressure were found near the transition from the compression to the expansion stroke. This lack of stationarity of the process can be important both for exhaust gas emissions and for the fuel consumption of the engine.
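The battery of normality tests cited above is available off the shelf in scipy and statsmodels; a minimal sketch, with synthetic stand-in data in place of the archived 90 pressure samples per crank angle, might look like this:

```python
import numpy as np
from scipy import stats
from statsmodels.stats.diagnostic import lilliefors

rng = np.random.default_rng(0)
# Stand-in data: 90 pressure samples at one crank angle (the paper's
# per-phase archive); replace with measured values in Pa.
pressure = rng.normal(loc=5.2e6, scale=0.08e6, size=90)

print("Shapiro-Wilk:   ", stats.shapiro(pressure))
print("Jarque-Bera:    ", stats.jarque_bera(pressure))
print("Lilliefors:     ", lilliefors(pressure, dist="norm"))
print("Anderson-Darling:", stats.anderson(pressure, dist="norm"))
```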
Cooper, Emily A.; Norcia, Anthony M.
2015-01-01
The nervous system has evolved in an environment with structure and predictability. One of the ubiquitous principles of sensory systems is the creation of circuits that capitalize on this predictability. Previous work has identified predictable non-uniformities in the distributions of basic visual features in natural images that are relevant to the encoding tasks of the visual system. Here, we report that the well-established statistical distributions of visual features -- such as visual contrast, spatial scale, and depth -- differ between bright and dark image components. Following this analysis, we go on to trace how these differences in natural images translate into different patterns of cortical input that arise from the separate bright (ON) and dark (OFF) pathways originating in the retina. We use models of these early visual pathways to transform natural images into statistical patterns of cortical input. The models include the receptive fields and non-linear response properties of the magnocellular (M) and parvocellular (P) pathways, with their ON and OFF pathway divisions. The results indicate that there are regularities in visual cortical input beyond those that have previously been appreciated from the direct analysis of natural images. In particular, several dark/bright asymmetries provide a potential account for recently discovered asymmetries in how the brain processes visual features, such as violations of classic energy-type models. On the basis of our analysis, we expect that the dark/bright dichotomy in natural images plays a key role in the generation of both cortical and perceptual asymmetries. PMID:26020624
Penco, Silvana; Buscema, Massimo; Patrosso, Maria Cristina; Marocchi, Alessandro; Grossi, Enzo
2008-05-30
Few genetic factors predisposing to the sporadic form of amyotrophic lateral sclerosis (ALS) have been identified, but the pathology itself seems to be a true multifactorial disease in which complex interactions between environmental and genetic susceptibility factors take place. The purpose of this study was to approach genetic data with an innovative statistical method, artificial neural networks, to identify a possible genetic background predisposing to the disease. A DNA multiarray panel was applied to genotype more than 60 polymorphisms within 35 genes selected from pathways of lipid and homocysteine metabolism, regulation of blood pressure, coagulation, inflammation, cellular adhesion and matrix integrity, in 54 sporadic ALS patients and 208 controls. Advanced intelligent systems based on a novel coupling of artificial neural networks and evolutionary algorithms were applied, and the results were compared with those derived from standard neural networks and classical statistical analysis. The analysis unexpectedly revealed a strong genetic background in sporadic ALS. The predictive accuracy obtained with Linear Discriminant Analysis and standard artificial neural networks ranged from 70% to 79% (average 75.31%) and from 69.1% to 86.2% (average 76.6%), respectively. The corresponding value obtained with the advanced intelligent systems reached an average of 96.0% (range 94.4% to 97.6%). This latter approach allowed the identification of seven genetic variants essential to differentiate cases from controls: apolipoprotein E arg158cys; hepatic lipase -480 C/T; endothelial nitric oxide synthase 690 C/T and glu298asp; vitamin K-dependent coagulation factor seven arg353glu; glycoprotein Ia/IIa 873 G/A; and E-selectin ser128arg. This study provides an alternative and reliable method to approach complex diseases. Indeed, the application of a novel artificial intelligence-based method offers new insight into genetic markers of sporadic ALS, pointing out the existence of a strong genetic background.
Cullis, B R; Smith, A B; Beeck, C P; Cowling, W A
2010-11-01
Exploring and exploiting variety by environment (V × E) interaction is one of the major challenges facing plant breeders. In paper I of this series, we presented an approach to modelling V × E interaction in the analysis of complex multi-environment trials using factor analytic models. In this paper, we develop a range of statistical tools which explore V × E interaction in this context. These tools include graphical displays such as heat-maps of genetic correlation matrices as well as so-called E-scaled uniplots that are a more informative alternative to the classical biplot for large plant breeding multi-environment trials. We also present a new approach to prediction for multi-environment trials that include pedigree information. This approach allows meaningful selection indices to be formed either for potential new varieties or potential parents.
Omics integrating physical techniques: aged Piedmontese meat analysis.
Lana, Alessandro; Longo, Valentina; Dalmasso, Alessandra; D'Alessandro, Angelo; Bottero, Maria Teresa; Zolla, Lello
2015-04-01
Piedmontese meat tenderness increases when the ageing period after slaughter is extended up to 44 days. Classical physical analyses only partially explain this evidence, so in order to discover the reason for the potential beneficial effects of prolonged ageing, we performed omic analyses of the Longissimus thoracis muscle, examining the main biochemical changes through mass spectrometry-based metabolomics and proteomics. We observed a progressive decline in myofibrillar structural integrity (underpinning meat tenderness) and impaired energy metabolism. Markers of autophagic responses (e.g. serine and glutathione metabolism) and nitrogen metabolism (urea cycle intermediates) accumulated until the end of the assayed period. Key metabolites such as glutamate, a mediator of the appreciated umami taste of the meat, were found to accumulate steadily until day 44. Finally, statistical analyses revealed that glutamate, serine and arginine could serve as good predictors of ultimate meat quality parameters, even though further studies are mandatory. Copyright © 2014 Elsevier Ltd. All rights reserved.
Hearing Outcome With the Use of Glass Ionomer Cement as an Alternative to Crimping in Stapedotomy.
Elzayat, Saad; Younes, Ahmed; Fouad, Ayman; Erfan, Fatthe; Mahrous, Ali
2017-10-01
To evaluate early hearing outcomes using glass ionomer cement to fix the Teflon piston prosthesis onto the long process of the incus, to minimize residual conductive hearing loss after stapedotomy. Original report of a prospective randomized controlled study. Tertiary referral center. A total of 80 consecutive patients with otosclerosis were randomized into two groups. Group A was a control group in which 40 patients underwent small fenestra stapedotomy using the classic technique. Group B included 40 patients who underwent small fenestra stapedotomy with fixation of the incus-prosthesis junction with glass ionomer bone cement. Stapedotomy with the classical technique in group A and the alternative technique in group B. The audiometric results before and after surgery. Analysis of the results was performed using the paired t test to compare pre- and postoperative results. The χ² test was used to compare the results of the two groups. A p value less than 0.05 was considered statistically significant. Significant postoperative improvements in both pure-tone air conduction thresholds and air-bone gaps were reported in the two studied groups. The postoperative average residual air-bone gap and hearing gain were significantly better in group B (p < 0.05) compared with group A. The use of glass ionomer bone cement in primary otosclerosis surgery using the aforementioned prosthesis and surgical technique is of significant value in producing maximal closure of the air-bone gap and better audiological outcomes.
QSAR as a random event: modeling of nanoparticles uptake in PaCa2 cancer cells.
Toropov, Andrey A; Toropova, Alla P; Puzyn, Tomasz; Benfenati, Emilio; Gini, Giuseppina; Leszczynska, Danuta; Leszczynski, Jerzy
2013-06-01
Quantitative structure-property/activity relationships (QSPRs/QSARs) are a tool to predict various endpoints for various substances. The "classic" QSPR/QSAR analysis is based on the representation of the molecular structure by the molecular graph. However, the simplified molecular input-line entry system (SMILES) has gradually become the most popular representation of molecular structure in databases available on the Internet. Under such circumstances, the development of molecular descriptors calculated directly from SMILES becomes an attractive alternative to "classic" descriptors. The CORAL software (http://www.insilico.eu/coral) provides SMILES-based optimal molecular descriptors aimed to correlate with various endpoints. We analyzed a data set on nanoparticle uptake in PaCa2 pancreatic cancer cells. The data set includes 109 nanoparticles with the same core but different surface modifiers (small organic molecules). The concept of a QSAR as a random event is suggested, in opposition to "classic" QSARs, which are based on only one distribution of the available data into the training and validation sets. In other words, five random splits into the "visible" training set and the "invisible" validation set were examined. The SMILES-based optimal descriptors (obtained by the Monte Carlo technique) for these splits were calculated with the CORAL software. The statistical quality of all these models is good. Copyright © 2013 Elsevier Ltd. All rights reserved.
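To give a flavor of the Monte Carlo optimal-descriptor idea, here is a deliberately simplified sketch in the CORAL spirit: each SMILES character receives a correlation weight, the descriptor is the sum of the weights, and random-search Monte Carlo tunes the weights to maximize correlation with the endpoint. All data and the character-level attribute choice are illustrative only; CORAL itself uses richer SMILES attributes:

```python
import numpy as np

rng = np.random.default_rng(4)

smiles = ["CCO", "CCCO", "CCN", "CCCCO", "CNC", "CCOC"]
endpoint = np.array([1.2, 1.8, 0.9, 2.4, 0.7, 1.6])   # toy activities

alphabet = sorted(set("".join(smiles)))
cw = dict.fromkeys(alphabet, 1.0)                     # correlation weights

def dcw(s):
    # Optimal descriptor: sum of the correlation weights of the characters.
    return sum(cw[ch] for ch in s)

def corr():
    d = np.array([dcw(s) for s in smiles])
    return np.corrcoef(d, endpoint)[0, 1]

best = corr()
for _ in range(5000):                  # Monte Carlo weight updates
    ch = rng.choice(alphabet)
    old = cw[ch]
    cw[ch] = old + rng.normal(scale=0.1)
    new = corr()
    if new > best:
        best = new                     # accept the move
    else:
        cw[ch] = old                   # reject the move
print(round(best, 3))
```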
Quantum-Classical Correspondence Principle for Work Distributions
NASA Astrophysics Data System (ADS)
Jarzynski, Christopher; Quan, H. T.; Rahav, Saar
2015-07-01
For closed quantum systems driven away from equilibrium, work is often defined in terms of projective measurements of initial and final energies. This definition leads to statistical distributions of work that satisfy nonequilibrium work and fluctuation relations. While this two-point measurement definition of quantum work can be justified heuristically by appeal to the first law of thermodynamics, its relationship to the classical definition of work has not been carefully examined. In this paper, we employ semiclassical methods, combined with numerical simulations of a driven quartic oscillator, to study the correspondence between classical and quantal definitions of work in systems with 1 degree of freedom. We find that a semiclassical work distribution, built from classical trajectories that connect the initial and final energies, provides an excellent approximation to the quantum work distribution when the trajectories are assigned suitable phases and are allowed to interfere. Neglecting the interferences between trajectories reduces the distribution to that of the corresponding classical process. Hence, in the semiclassical limit, the quantum work distribution converges to the classical distribution, decorated by a quantum interference pattern. We also derive the form of the quantum work distribution at the boundary between classically allowed and forbidden regions, where this distribution tunnels into the forbidden region. Our results clarify how the correspondence principle applies in the context of quantum and classical work distributions and contribute to the understanding of work and nonequilibrium work relations in the quantum regime.
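The two-point measurement definition referred to above is conventionally written as follows (standard form, stated here for orientation):

```latex
% Two-point measurement work distribution: energy E_n^0 measured at t = 0,
% E_m^\tau at t = \tau after driving; p_n^0 is the initial occupation
% probability and p^\tau_{m|n} the transition probability:
P(W) \;=\; \sum_{n,m} p_n^{0}\, p^{\tau}_{m|n}\;
           \delta\!\big(W - (E_m^{\tau} - E_n^{0})\big).
```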
Hierarchical multivariate covariance analysis of metabolic connectivity.
Carbonell, Felix; Charil, Arnaud; Zijdenbos, Alex P; Evans, Alan C; Bedell, Barry J
2014-12-01
Conventional brain connectivity analysis is typically based on the assessment of interregional correlations. Given that correlation coefficients are derived from both covariance and variance, group differences in covariance may be obscured by differences in the variance terms. To facilitate a comprehensive assessment of connectivity, we propose a unified statistical framework that interrogates the individual terms of the correlation coefficient. We have evaluated the utility of this method for metabolic connectivity analysis using [18F]2-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET) data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) study. As an illustrative example of the utility of this approach, we examined metabolic connectivity in angular gyrus and precuneus seed regions of mild cognitive impairment (MCI) subjects with low and high β-amyloid burdens. This new multivariate method allowed us to identify alterations in the metabolic connectome, which would not have been detected using classic seed-based correlation analysis. Ultimately, this novel approach should be extensible to brain network analysis and broadly applicable to other imaging modalities, such as functional magnetic resonance imaging (fMRI).
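The motivation is transparent from the definition of the correlation coefficient; the following simply restates the decomposition the framework interrogates:

```latex
% The interregional correlation mixes covariance and variance terms:
\rho_{xy} \;=\; \frac{\operatorname{Cov}(x,y)}
                     {\sqrt{\operatorname{Var}(x)\,\operatorname{Var}(y)}},
% so a group difference in Cov(x,y) can be masked (or mimicked) by group
% differences in Var(x) or Var(y); the framework tests the terms separately.
```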
Statistical properties of the ice particle distribution in stratiform clouds
NASA Astrophysics Data System (ADS)
Delanoe, J.; Tinel, C.; Testud, J.
2003-04-01
This paper presents an extensive analysis of several microphysical databases (CEPEX, EUCREX, CLARE and CARL) to determine statistical properties of the Particle Size Distribution (PSD). The databases cover different types of stratiform clouds: tropical cirrus (CEPEX), mid-latitude cirrus (EUCREX) and mid-latitude cirrus and stratus (CARL, CLARE). The approach for analysis uses the concept of normalisation of the PSD developed by Testud et al. (2001). The normalisation aims at isolating three independent characteristics of the PSD: its "intrinsic" shape, the "average size" of the spectrum and the ice water content IWC. By "average size" is meant the mean mass-weighted diameter. It is shown that concentration should be normalised by N_0^*, proportional to IWC/D_m^4. The "intrinsic" shape is defined as F(D_eq/D_m) = N(D_eq)/N_0^*, where D_eq is the equivalent melted diameter. The "intrinsic" shape is found to be very stable.
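In compact form, the normalisation described above reads as follows (following the cited Testud et al. scheme; the proportionality constant is omitted here):

```latex
% Normalised PSD: intrinsic shape F, mean mass-weighted diameter D_m and
% scaling concentration N_0^*:
N(D_{eq}) = N_0^{*}\, F\!\left(\frac{D_{eq}}{D_m}\right), \qquad
D_m = \frac{\int N(D_{eq})\, D_{eq}^{4}\, \mathrm{d}D_{eq}}
           {\int N(D_{eq})\, D_{eq}^{3}\, \mathrm{d}D_{eq}}, \qquad
N_0^{*} \propto \frac{\mathrm{IWC}}{D_m^{4}}.
```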
Mutz, Rüdiger; Daniel, Hans-Dieter
2013-06-01
It is often claimed that psychology students' attitudes towards research methods and statistics affect course enrollment, persistence, achievement, and course climate. However, inter-institutional variability has been widely neglected in the research on students' attitudes towards research methods and statistics, although it is important for didactic purposes (heterogeneity of the student population). The paper presents a scale based on findings of the social psychology of attitudes (a polar and emotion-based concept) in conjunction with a method for capturing beginning university students' attitudes towards research methods and statistics and identifying the proportion of students having positive attitudes at the institutional level. The study is based on a re-analysis of a nationwide survey in Germany in August 2000 of all psychology students that enrolled in fall 1999/2000 (N = 1,490) at N = 44 universities. Using multilevel latent-class analysis (MLLCA), the aim was to group students into different attitude types and at the same time to obtain university segments based on the incidences of the different student attitude types. Four latent student clusters were found that can be ranked on a bipolar attitude dimension. Membership in a cluster was predicted by age, grade point average (GPA) on the school-leaving exam, and personality traits. In addition, two university segments were found: universities with an average proportion of students with positive attitudes and universities with a high proportion of students with positive attitudes (excellent segment). As psychology students make up a very heterogeneous group, the use of multiple learning activities as opposed to the classical lecture course is required. © 2011 The British Psychological Society.
Analysis of classical guitars' vibrational behavior based on scanning laser vibrometer measurements
NASA Astrophysics Data System (ADS)
Czajkowska, Marzena
2012-06-01
One of the main goals in musical acoustics research is to link measurable, physical properties of a musical instrument with subjective assessments of its tone quality. The aim of the research discussed in this paper was to observe the structural vibrations of classical guitars of different classes in relation to their quality. This work focuses on mid-low- and low-class classical (nylon-stringed) guitars. The main sources of guitar body vibrations are the top and back plate vibrations; these were therefore the objects of structural mode measurements and analysis. Sixteen classical guitars were investigated, nine with cedar and seven with spruce top plates. Structural modes of the top and back plates were measured with the aid of a scanning laser vibrometer, with the instruments excited by a chirp signal transferred through a bone vibrator. Issues related to exciter selection are discussed. Correlation and descriptive statistics of the top and back plate measurement results were investigated in relation to guitar quality. The frequency range of 300 Hz to 5 kHz, as well as selected narrowed frequency bands, was analyzed for cedar and spruce guitars. Furthermore, the influence of the top plate wood type on vibration characteristics was observed in three pairs of guitars; the instruments were of the same model but differed in top plate material. Determination and visualization of both guitar plates' modal patterns in relation to frequency are a significant attainment of the research. Scanning laser vibrometer measurements allow particular modes to be observed and therefore identified, as opposed to sound pressure response measurements. When correlating the vibration characteristics of top and back plates, it appears that the Pearson product-moment correlation coefficient is not a parameter that associates with guitar quality. However, for the best instruments with cedar tops, the top-back correlation coefficient has a relatively greater value in the 1-2 kHz band and a lower one in the range of 2.5-5 kHz in comparison with low-class instruments. The study showed that variance, a measure of statistical dispersion, is a relevant parameter: the better the quality of the guitar, the greater the variance value in the 1-2 kHz band. It was observed that higher-quality instruments are characterized by stronger structural resonances of the top plate in the range of 4-5 kHz, which means that luthiers should pay special attention to top plate vibrational behaviour in this particular frequency band. Additionally, the result analysis shows that the best spruce guitars are distinguished by greater top plate vibrations in the range of 2-3 kHz, in contrast to the best cedar instruments. It can be assumed that top plate vibrations in this particular frequency band may be associated with a subjective impression of tone brightness, as common opinion holds that guitars with spruce tops sound relatively "brighter" than cedar-top instruments.
Makarov, D V; Kon'kov, L E; Uleysky, M Yu; Petrov, P S
2013-01-01
The problem of sound propagation in a randomly inhomogeneous oceanic waveguide is considered. An underwater sound channel in the Sea of Japan is taken as an example. Our attention is concentrated on the domains of finite-range ray stability in phase space and their influence on wave dynamics. These domains can be found by means of the one-step Poincaré map. To study manifestations of finite-range ray stability, we introduce the finite-range evolution operator (FREO) describing transformation of a wave field in the course of propagation along a finite segment of a waveguide. Carrying out statistical analysis of the FREO spectrum, we estimate the contribution of regular domains and explore their evanescence with increasing length of the segment. We utilize several methods of spectral analysis: analysis of eigenfunctions by expanding them over modes of the unperturbed waveguide, approximation of level-spacing statistics by means of the Berry-Robnik distribution, and the procedure used by A. Relano and coworkers [Relano et al., Phys. Rev. Lett. 89, 244102 (2002); Relano, Phys. Rev. Lett. 100, 224101 (2008)]. Comparing the results obtained with different methods, we find that the method based on the statistical analysis of FREO eigenfunctions is the most favorable for estimating the contribution of regular domains. It allows one to find directly the waveguide modes whose refraction is regular despite the random inhomogeneity. For example, it is found that near-axial sound propagation in the Sea of Japan preserves stability even over distances of hundreds of kilometers due to the presence of a shearless torus in the classical phase space. Increasing the acoustic wavelength weakens scattering, resulting in recovery of eigenfunction localization near periodic orbits of the one-step Poincaré map.
Generalized classical and quantum signal theories
NASA Astrophysics Data System (ADS)
Rundblad, E.; Labunets, V.; Novak, P.
2005-05-01
In this paper we develop two topics and show their inter- and cross-relation. The first centers on general notions of the generalized classical signal theory on finite Abelian hypergroups. The second concerns the generalized quantum hyperharmonic analysis of quantum signals (Hermitean operators associated with classical signals). We study classical and quantum generalized convolution hypergroup algebras of classical and quantum signals.
Quantum probability, choice in large worlds, and the statistical structure of reality.
Ross, Don; Ladyman, James
2013-06-01
Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.
Serum Metabolomic Profiling of Piglets Infected with Virulent Classical Swine Fever Virus
Gong, Wenjie; Jia, Junjie; Zhang, Bikai; Mi, Shijiang; Zhang, Li; Xie, Xiaoming; Guo, Huancheng; Shi, Jishu; Tu, Changchun
2017-01-01
Classical swine fever (CSF) is a highly contagious swine infectious disease and causes significant economic losses for the pig industry worldwide. The objective of this study was to determine whether small molecule metabolites contribute to the pathogenesis of CSF. Briefly, serum metabolomic profiles of CSFV Shimen strain-infected piglets were analyzed by ultraperformance liquid chromatography/electrospray ionization time-of-flight mass spectrometry (UPLC/ESI-Q-TOF/MS) in combination with multivariate statistical analysis. In CSFV-infected piglets, changes were found at days 3 and 7 post-infection in metabolites associated with several key metabolic pathways, including tryptophan catabolism and the kynurenine pathway, phenylalanine metabolism, fatty acid and lipid metabolism, the tricarboxylic acid and urea cycles, branched-chain amino acid metabolism, and nucleotide metabolism. Several pathways involved in energy metabolism, including fatty acid biosynthesis and β-oxidation, branched-chain amino acid metabolism, and the tricarboxylic acid cycle, were significantly inhibited. Changes were also observed in several metabolites exclusively associated with gut microbiota. The metabolomic profiles indicate that CSFV-host gut microbiome interactions play a role in the development of CSF. PMID:28496435
Prognostic scores in oesophageal or gastric variceal bleeding.
Ohmann, C; Stöltzing, H; Wins, L; Busch, E; Thon, K
1990-05-01
Numerous scoring systems have been developed for the prediction of outcome of variceal bleeding; however, only a few have been evaluated adequately. The object of this study was to improve the classical Child-Pugh score (CPS) and to test other scores from the literature. Patients (n = 82) with endoscopically confirmed variceal bleeding and long-term sclerotherapy were included in the study. Linear logistic regression (LR) was applied to different sets of prognostic variables with regard to 30-day mortality. In addition, scores from the literature were evaluated on the data set. Performance was measured by the accuracy and receiver-operating characteristic curves. The application of LR to all five CPS variables (accuracy, 80%) was superior to the classical CPS (70%). LR with selection from the CPS variables or from other sets of variables resulted in no improvement. Compared with CPS only three scores from the literature, mainly based on subsets of the CPS variables, showed an improved accuracy. It is concluded that CPS is still a good scoring system; however, it can be improved by statistical analysis using the same variables.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paavola, Janika; Hall, Michael J. W.; Paris, Matteo G. A.
The transition from quantum to classical, in the case of a quantum harmonic oscillator, is typically identified with the transition from a quantum superposition of macroscopically distinguishable states, such as the Schroedinger-cat state, into the corresponding statistical mixture. This transition is commonly characterized by the asymptotic loss of the interference term in the Wigner representation of the cat state. In this paper we show that the quantum-to-classical transition has different dynamical features depending on the measure for nonclassicality used. Measures based on an operatorial definition have well-defined physical meaning and allow a deeper understanding of the quantum-to-classical transition. Our analysis shows that, for most nonclassicality measures, the Schroedinger-cat state becomes classical after a finite time. Moreover, our results challenge the prevailing idea that more macroscopic states are more susceptible to decoherence in the sense that the transition from quantum to classical occurs faster. Since nonclassicality is a prerequisite for entanglement generation, our results also bridge the gap between decoherence, which is lost only asymptotically, and entanglement, which may show a "sudden death". In fact, whereas the loss of coherences still remains asymptotic, we emphasize that the transition from quantum to classical can indeed occur at a finite time.
Tsallis non-extensive statistics and solar wind plasma complexity
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.
2015-03-01
This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26 September 2011. Solar wind plasma is a typical case of a stochastic spatiotemporal distribution of physical state variables such as force fields (B, E) and matter fields (particle and current densities or bulk plasma distributions). This study clearly shows the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inadequacy of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian, non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion and transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
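For orientation, the non-extensive formalism invoked here rests on the Tsallis entropy and its associated q-Gaussian distributions; these are the standard definitions, not results of the paper:

```latex
% Tsallis non-extensive entropy (recovers Boltzmann-Gibbs as q -> 1) and
% the associated q-Gaussian, whose power-law tails for q > 1 model the
% observed heavy-tailed probability distribution functions:
S_q = k\,\frac{1 - \sum_i p_i^{q}}{q - 1}, \qquad
p_q(x) \;\propto\; \big[\,1 - (1-q)\,\beta x^{2}\,\big]^{\frac{1}{1-q}}.
```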
NASA Astrophysics Data System (ADS)
Larrañeta, M.; Moreno-Tejera, S.; Lillo-Bravo, I.; Silva-Pérez, M. A.
2018-02-01
Many of the available solar radiation databases provide only global horizontal irradiance (GHI), while there is a growing need for extensive databases of direct normal irradiance (DNI), mainly for the development of concentrated solar power and concentrated photovoltaic technologies. In the present work, we propose a methodology for the generation of synthetic hourly DNI data from hourly average GHI values by dividing the irradiance into a deterministic and a stochastic component, intending to emulate the dynamics of solar radiation. The deterministic component is modeled through a simple classical model. The stochastic component is fitted to measured data in order to maintain the consistency of the synthetic data with the state of the sky, generating statistically significant DNI data with a cumulative frequency distribution very similar to that of the measured data. The adaptation and application of the model to the location of Seville shows significant improvements in terms of frequency distribution over the classical models. Applied to other locations with different climatological characteristics, the proposed methodology yields better results than the classical models in terms of frequency distribution, reaching a 50% reduction in the Finkelstein-Schafer (FS) and Kolmogorov-Smirnov test integral (KSI) statistics.
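The two goodness-of-fit statistics used for validation compare empirical cumulative distributions; a small sketch with our own function names and synthetic stand-in data could be:

```python
import numpy as np

def fs_ksi(measured, synthetic, bins=50):
    # Compare empirical CDFs of measured vs synthetic DNI: the FS statistic
    # averages the absolute CDF differences over the evaluation grid; the
    # KSI integrates them (trapezoidal rule) over the common data range.
    lo = min(measured.min(), synthetic.min())
    hi = max(measured.max(), synthetic.max())
    x = np.linspace(lo, hi, bins)
    cdf_m = np.searchsorted(np.sort(measured), x) / measured.size
    cdf_s = np.searchsorted(np.sort(synthetic), x) / synthetic.size
    diff = np.abs(cdf_m - cdf_s)
    ksi = np.sum(0.5 * (diff[1:] + diff[:-1]) * np.diff(x))
    return diff.mean(), ksi

rng = np.random.default_rng(5)
dni_meas = rng.gamma(shape=2.0, scale=300.0, size=8760)   # stand-in W/m2 series
dni_synth = rng.gamma(shape=2.1, scale=290.0, size=8760)
print(fs_ksi(dni_meas, dni_synth))
```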
Classical and sequential limit analysis revisited
NASA Astrophysics Data System (ADS)
Leblond, Jean-Baptiste; Kondo, Djimédo; Morin, Léo; Remmal, Almahdi
2018-04-01
Classical limit analysis applies to ideal plastic materials, and within a linearized geometrical framework implying small displacements and strains. Sequential limit analysis was proposed as a heuristic extension to materials exhibiting strain hardening, and within a fully general geometrical framework involving large displacements and strains. The purpose of this paper is to study and clearly state the precise conditions permitting such an extension. This is done by comparing the evolution equations of the full elastic-plastic problem, the equations of classical limit analysis, and those of sequential limit analysis. The main conclusion is that, whereas classical limit analysis applies to materials exhibiting elasticity - in the absence of hardening and within a linearized geometrical framework -, sequential limit analysis, to be applicable, strictly prohibits the presence of elasticity - although it tolerates strain hardening and large displacements and strains. For a given mechanical situation, the relevance of sequential limit analysis therefore essentially depends upon the importance of the elastic-plastic coupling in the specific case considered.
Using of bayesian networks to estimate the probability of "NATECH" scenario occurrence
NASA Astrophysics Data System (ADS)
Dobes, Pavel; Dlabka, Jakub; Jelšovská, Katarína; Polorecká, Mária; Baudišová, Barbora; Danihelka, Pavel
2015-04-01
In the twentieth century, Bayesian statistics and probability were not much used (perhaps not a preferred approach) in the area of natural and industrial risk analysis and management, nor within the analysis of so-called NATECH accidents (chemical accidents triggered by natural events such as earthquakes, floods or lightning; ref. E. Krausmann, 2011, doi:10.5194/nhess-11-921-2011). From the beginning, the main role was played by so-called "classical" frequentist probability (ref. Neyman, 1937), which relies, up to now, on the right/false results of experiments and monitoring and does not allow expert beliefs, expectations and judgements to be counted on (which is, on the other hand, once again one of the well-known pillars of the Bayesian approach to probability). In the last 20 or 30 years it has been possible to observe, through publications and conferences, a renaissance of Bayesian statistics in many scientific disciplines, including various branches of the geosciences. Is the necessity of a certain level of trust in expert judgment within risk analysis back? After several decades of development in this field, the following hypothesis can be proposed (to be checked): we cannot estimate the probabilities of complex crisis situations and their TOP events (many NATECH events could be classified as crisis situations or emergencies) by the classical frequentist approach alone, but also need the Bayesian approach (i.e., with the help of a pre-staged Bayesian network including expert belief and expectation as well as classical frequentist inputs), because there is not always enough quantitative information from the monitoring of historical emergencies, there may be several dependent or independent variables to consider, and, in general, every emergency situation always runs a slightly different course. On this topic, the team of authors presents its proposal of a pre-staged, typified Bayesian network model for a specified NATECH scenario (heavy rainfall AND/OR melting snow OR earthquake -> landslides AND/OR floods -> major chemical accident), comparing it with the "black box" approach and with the so-called "bow-tie" approach (ref. C. A. Brebbia, Risk Analysis VIII, p. 103-111, WIT Press, 2012), i.e. a visualisation of the development of the scenario with the possibility of calculating frequencies (the TOP event of the scenario, developed both downwards to initiating events and upwards to end accidental events, using Fault Tree Analysis and Event Tree Analysis methods). The model can also include a possible terrorist attack on the chemical facility with the potential for a major release of chemicals into the environmental compartments (water, soil, air), aimed at threatening environmental safety in the specific area. The study was supported by project no. VG20132015128 "Increasing of the Environmental Safety & Security by the Prevention of Industrial Chemicals Misuse to the Terrorism", supported by the Ministry of the Interior of the Czech Republic through the Security Research Programme, 2013-2015.
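As an illustration of what such a pre-staged network might look like, here is a toy three-node chain in Python using the open-source pgmpy library (class and method names as in recent pgmpy releases; treat the exact API as an assumption, and note that all conditional probabilities are illustrative placeholders, not values from the study):

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy NATECH chain: heavy rainfall -> flood -> major chemical accident.
model = BayesianNetwork([("Rainfall", "Flood"), ("Flood", "Accident")])

cpd_rain = TabularCPD("Rainfall", 2, [[0.9], [0.1]])   # P(normal), P(heavy)
cpd_flood = TabularCPD("Flood", 2,
                       [[0.95, 0.4],    # no flood | rainfall = normal, heavy
                        [0.05, 0.6]],   # flood    | rainfall = normal, heavy
                       evidence=["Rainfall"], evidence_card=[2])
cpd_acc = TabularCPD("Accident", 2,
                     [[0.999, 0.9],     # no accident | no flood, flood
                      [0.001, 0.1]],    # accident    | no flood, flood
                     evidence=["Flood"], evidence_card=[2])
model.add_cpds(cpd_rain, cpd_flood, cpd_acc)

# Expert judgement enters through the placeholder CPDs; inference then
# yields the TOP-event probability conditional on the trigger.
infer = VariableElimination(model)
print(infer.query(["Accident"], evidence={"Rainfall": 1}))
```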
Statistical and linguistic features of DNA sequences
NASA Technical Reports Server (NTRS)
Havlin, S.; Buldyrev, S. V.; Goldberger, A. L.; Mantegna, R. N.; Peng, C. K.; Simons, M.; Stanley, H. E.
1995-01-01
We present evidence supporting the idea that the DNA sequence in genes containing noncoding regions is correlated, and that the correlation is remarkably long range: indeed, bases thousands of base pairs apart are correlated. We do not find such a long-range correlation in the coding regions of the gene. We resolve the problem of the "non-stationary" feature of the sequence of base pairs by applying a new algorithm called Detrended Fluctuation Analysis (DFA). We address the claim of Voss that there is no difference in the statistical properties of coding and noncoding regions of DNA by systematically applying the DFA algorithm, as well as standard FFT analysis, to all eukaryotic DNA sequences (33 301 coding and 29 453 noncoding) in the entire GenBank database. We describe a simple model to account for the presence of long-range power-law correlations which is based upon a generalization of the classic Levy walk. Finally, we describe briefly some recent work showing that the noncoding sequences have certain statistical features in common with natural languages. Specifically, we adapt to DNA the Zipf approach to analyzing linguistic texts, and the Shannon approach to quantifying the "redundancy" of a linguistic text in terms of a measurable entropy function. We suggest that noncoding regions in plants and invertebrates may display a smaller entropy and larger redundancy than coding regions, further supporting the possibility that noncoding regions of DNA may carry biological information.
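DFA itself is compact enough to sketch. The version below uses first-order detrending and a toy purine/pyrimidine mapping of our own choosing to illustrate how the scaling exponent α is estimated; a power law F(n) ~ n^α with α > 0.5 indicates long-range correlation:

```python
import numpy as np

def dfa(x, scales):
    # Detrended Fluctuation Analysis (first-order): integrate the signal,
    # split into windows of size n, subtract the local linear trend, and
    # return the root-mean-square fluctuation F(n) per window size.
    y = np.cumsum(x - np.mean(x))
    F = []
    for n in scales:
        nseg = len(y) // n
        ms = []
        for i in range(nseg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)            # local linear trend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

# Toy usage: map a random DNA string to a +/-1 purine/pyrimidine walk.
rng = np.random.default_rng(1)
seq = rng.choice(list("ACGT"), size=4096)
walk = np.where(np.isin(seq, ["A", "G"]), 1, -1)
scales = [8, 16, 32, 64, 128, 256]
F = dfa(walk, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(alpha, 2))  # ~0.5 for an uncorrelated sequence
```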
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
Urban catchments are typically characterised by a more flashy nature of the hydrological response compared to natural catchments. Predicting flow changes associated with urbanisation is not straightforward, as they are influenced by interactions between impervious cover, basin size, drainage connectivity and stormwater management infrastructure. In this study, we present an alternative approach to statistical analysis of hydrological response variability and basin flashiness, based on the distribution of inter-amount times. We analyse inter-amount time distributions of high-resolution streamflow time series for 17 (semi-)urbanised basins in North Carolina, USA, ranging from 13 to 238 km2 in size. We show that in the inter-amount-time framework, sampling frequency is tuned to the local variability of the flow pattern, resulting in a different representation and weighting of high and low flow periods in the statistical distribution. This leads to important differences in the way the distribution quantiles, mean, coefficient of variation and skewness vary across scales and results in lower mean intermittency and improved scaling. Moreover, we show that inter-amount-time distributions can be used to detect regulation effects on flow patterns, identify critical sampling scales and characterise flashiness of hydrological response. The possibility to use both the classical approach and the inter-amount-time framework to identify minimum observable scales and analyse flow data opens up interesting areas for future research.
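The core construction, sampling times at fixed cumulative amounts instead of amounts at fixed times, is easy to sketch. The function below is hypothetical and operates on a toy flow series; it is a coarse discrete approximation that can yield zero inter-amount times when one time step spans several amount levels:

```python
import numpy as np

def inter_amount_times(flow, dt, amount):
    # Times needed to accumulate successive fixed 'amount's of flow:
    # build the cumulative volume, find the first time step at which each
    # fixed-amount level is reached, and difference the crossing times.
    cum = np.cumsum(flow) * dt                     # cumulative volume
    targets = np.arange(amount, cum[-1], amount)   # fixed-amount levels
    idx = np.searchsorted(cum, targets)            # first step reaching level
    times = (idx + 1) * dt
    return np.diff(np.concatenate(([0.0], times)))

flow = np.array([0.2, 0.1, 2.5, 3.0, 0.4, 0.1, 0.1, 1.2])   # m3/s
iat = inter_amount_times(flow, dt=60.0, amount=100.0)        # 100 m3 blocks
print(iat)   # short inter-amount times flag high-flow (flashy) periods
```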
Symmetric and asymmetric ternary fission of hot nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siwek-Wilczynska, K.; Wilczynski, J.; Leegte, H.K.W.
1993-07-01
Emission of α particles accompanying fusion-fission processes in the ⁴⁰Ar + ²³²Th reaction at E(⁴⁰Ar) = 365 MeV was studied in a wide range of in-fission-plane and out-of-plane angles. The exact determination of the emission angles of both fission fragments combined with the time-of-flight measurements allowed us to reconstruct the complete kinematics of each ternary event. The coincident energy spectra of α particles were analyzed by using predictions of the energy spectra of the statistical code CASCADE. The analysis clearly demonstrates emission from the composite system prior to fission, emission from fully accelerated fragments after fission, and also emission during scission. The analysis is presented for both symmetric and asymmetric fission. The results have been analyzed using a time-dependent statistical decay code and confronted with dynamical calculations based on a classical one-body dissipation model. The observed near-scission emission is consistent with evaporation from a dinuclear system just before scission and evaporation from separated fragments just after scission. The analysis suggests that the time scale of fission of the hot composite systems is long (about 7×10⁻²⁰ s) and the motion during the descent to scission almost completely damped.
Yassi, Annalee; Tate, Robert B; Routledge, Michael
2003-07-01
This study is an extension of a previously published analysis of cancer mortality in a transformer manufacturing plant where there had been extensive use of mineral oil transformer fluid. The objectives of the present study were to update the mortality analysis to include deaths from the past 6 years and to analyze the cancer incidence of the cohort. A cohort of 2,222 males working at a transformer manufacturing plant between 1946 and 1975 was constructed. Using a classical historical cohort study design, cancer incidence and mortality were determined through record linkage with Canadian provincial and national registries. The rates of cancer incidence and mortality experienced by this cohort were compared to those of the Canadian male population. A statistically significant increased risk of developing and dying of pancreatic cancer was found, but no increase in overall cancer mortality. This was consistent with the previous report from this group. Interestingly, the cohort demonstrated a statistically significant increase in overall cancer incidence and a specific increase in the incidence of gallbladder cancer. This study contributes further evidence to the growing body of literature indicating the carcinogenic properties of mineral oils used in occupational settings, in particular those used prior to the 1970s. Copyright 2003 Wiley-Liss, Inc.
An Update on Statistical Boosting in Biomedicine.
Mayr, Andreas; Hofner, Benjamin; Waldmann, Elisabeth; Hepp, Tobias; Meyer, Sebastian; Gefeller, Olaf
2017-01-01
Statistical boosting algorithms have triggered a lot of research during the last decade. They combine a powerful machine learning approach with classical statistical modelling, offering various practical advantages like automated variable selection and implicit regularization of effect estimates. They are extremely flexible, as the underlying base-learners (regression functions defining the type of effect for the explanatory variables) can be combined with any kind of loss function (target function to be optimized, defining the type of regression setting). In this review article, we highlight the most recent methodological developments on statistical boosting regarding variable selection, functional regression, and advanced time-to-event modelling. Additionally, we provide a short overview on relevant applications of statistical boosting in biomedicine.
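To make the mechanism concrete, here is a minimal component-wise L2-boosting sketch; this is the generic textbook algorithm, not any specific package such as mboost, and all data are synthetic:

```python
import numpy as np

def l2_boost(X, y, steps=200, nu=0.1):
    # Component-wise L2 boosting: each iteration fits every single-variable
    # least-squares base-learner to the current residuals and updates only
    # the best-fitting one, shrunk by the learning rate nu, which yields
    # implicit regularization plus automated variable selection.
    n, p = X.shape
    beta = np.zeros(p)
    offset = y.mean()
    resid = y - offset
    for _ in range(steps):
        slopes = X.T @ resid / (X ** 2).sum(axis=0)   # per-variable LS fits
        sse = [np.sum((resid - X[:, j] * slopes[j]) ** 2) for j in range(p)]
        j = int(np.argmin(sse))
        beta[j] += nu * slopes[j]
        resid = y - offset - X @ beta
    return offset, beta

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))
y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=200)
offset, beta = l2_boost(X, y)
print(np.round(beta, 2))   # coefficient mass concentrates on variables 0 and 3
```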
Statistical ultrasonics: the influence of Robert F. Wagner
NASA Astrophysics Data System (ADS)
Insana, Michael F.
2009-02-01
An important ongoing question for higher education is how to successfully mentor the next generation of scientists and engineers. It has been my privilege to have been mentored by one of the best, Dr Robert F. Wagner and his colleagues at the CDRH/FDA during the mid 1980s. Bob introduced many of us in medical ultrasonics to statistical imaging techniques. These ideas continue to broadly influence studies on adaptive aperture management (beamforming, speckle suppression, compounding), tissue characterization (texture features, Rayleigh/Rician statistics, scatterer size and number density estimators), and fundamental questions about how limitations of the human eye-brain system for extracting information from textured images can motivate image processing. He adapted the classical techniques of signal detection theory to coherent imaging systems that, for the first time in ultrasonics, related common engineering metrics for image quality to task-based clinical performance. This talk summarizes my wonderfully-exciting three years with Bob as I watched him explore topics in statistical image analysis that formed a rational basis for many of the signal processing techniques used in commercial systems today. It is a story of an exciting time in medical ultrasonics, and of how a sparkling personality guided and motivated the development of junior scientists who flocked around him in admiration and amazement.
Ideal proportions in full face front view, contemporary versus antique.
Mommaerts, M Y; Moerenhout, B A M M L
2011-03-01
To compare the facial proportions of contemporary harmonious faces with those of antiquity, to validate classical canons and to determine new ones useful in orthofacial surgery planning. Contemporary beautiful faces were retrieved from yearly polls of People Magazine and FHM. Selected B/W frontal facial photographs of 31 men and 74 women were ranked by 20 patients who had to undergo orthofacial surgery. The top-15 female faces and the top-10 male faces were analyzed with Scion Image software. The classical facial index, the Bruges facial index, the ratio of lower facial height to total facial height and the vertical tri-partite of the lower face were calculated. The same analysis was done on pictures of classical sculptures representing seven goddesses and 12 gods. Harmonious contemporary female faces have a significantly lower classical facial index, indicating that facial height is smaller or facial width larger than in male faces and even than in antique female faces. The Bruges index indicates a similar difference between ideal contemporary female and male faces. The contemporary male has a higher lower face (48%) relative to total facial height than the contemporary female (45%), although this is statistically not significant (P=0.08). The lower facial thirds index has remained quite stable for 2500 years, without gender difference. A good canon for both sexes today is stomion-gnathion being 70% of subnasale-stomion. The average ideal contemporary female face is shorter than the male face, given the fact that interpupillary distance is similar. The Vitruvian thirds in the lower face have to be adjusted to a 30% upper lip, 70% lower lip-chin proportion. The contemporary ideal ratios are suitable to be implemented in an orthofacial planning concept. Copyright © 2010 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Dynamic Network-Based Epistasis Analysis: Boolean Examples
Azpeitia, Eugenio; Benítez, Mariana; Padilla-Longoria, Pablo; Espinosa-Soto, Carlos; Alvarez-Buylla, Elena R.
2011-01-01
In this article we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the inference of gene regulatory networks. Here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions; however, as originally proposed by Bateson, epistasis is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus (herein, classical epistasis). Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets become available, the analysis of classical epistasis is being complemented with computer science tools and systems biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, the access to relevant information and the correct inference of gene interaction topologies is hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, rely on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from the classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our article complements previous accounts not only by focusing on the implications of the hierarchical and single-path assumptions, but also by demonstrating the importance of considering temporal dynamics, specifically introducing the usefulness of Boolean network models and reviewing some key properties of network approaches. PMID:22645556
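A toy example of the kind of Boolean-network reasoning advocated here, with hypothetical regulatory rules chosen purely for illustration (not a network from the article): A activates B and C, while B represses C, so wild-type C is only transiently ON, and the dynamics of a B knockout reveal the interaction that a static readout could miss.

```python
# Hypothetical rules: A' = A, B' = A, C' = A AND NOT B.
def step(state, ko=None):
    a, b, c = state
    nxt = {"A": a, "B": a, "C": int(a and not b)}
    if ko:                         # knockout forces a gene permanently OFF
        nxt[ko] = 0
    return (nxt["A"], nxt["B"], nxt["C"])

def attractor(state, ko=None):
    # Iterate the synchronous update until a previously seen state recurs.
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state, ko)
    return state

print("wild type: ", attractor((1, 0, 0)))           # C ends up OFF
print("B knockout:", attractor((1, 0, 0), ko="B"))   # C stays ON
```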
NASA Astrophysics Data System (ADS)
Most, S.; Nowak, W.; Bijeljic, B.
2014-12-01
Transport processes in porous media are frequently simulated as particle movement. This process can be formulated as a stochastic process of particle position increments. At the pore scale, the geometry and micro-heterogeneities prohibit the commonly made assumption of independent and normally distributed increments to represent dispersion. Many recent particle methods seek to loosen this assumption. Recent experimental data suggest that the need to generalize has not yet been exhausted, because particle increments show statistical dependency beyond linear correlation and over many time steps. The goal of this work is to better understand the validity regions of commonly made assumptions. We investigate after what transport distances we can observe: (1) a statistical dependence between increments, modelled as an order-k Markov process, that reduces to order 1 (this would be the Markovian distance for the process, where the validity of yet-unexplored non-Gaussian-but-Markovian random walks would start); (2) a bivariate statistical dependence that simplifies to a multi-Gaussian dependence based on simple linear correlation (validity of correlated PTRW); (3) complete absence of statistical dependence (validity of classical PTRW/CTRW). The approach is to derive a statistical model for pore-scale transport from a powerful experimental data set via copula analysis. The model is formulated as a non-Gaussian, mutually dependent Markov process of higher order, which allows us to investigate the validity ranges of simpler models.
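A minimal sketch of the kind of dependence being probed (synthetic data; the study itself fits copulas to experimental pore-scale data, which is not reproduced here): increments with order-1 Markov memory and non-Gaussian margins, where rank correlation tracks the copula-level dependence that plain linear correlation understates:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, phi = 50_000, 0.8                 # series length and AR(1) memory, invented
eps = rng.standard_normal(n)
dx = np.empty(n)
dx[0] = eps[0]
for t in range(1, n):                # order-1 Markov process of increments
    dx[t] = phi * dx[t - 1] + np.sqrt(1 - phi**2) * eps[t]
dx = np.exp(dx)                      # nonlinear transform: non-Gaussian margins

for lag in (1, 5, 10, 20):
    r = stats.pearsonr(dx[:-lag], dx[lag:])[0]       # linear correlation
    rho = stats.spearmanr(dx[:-lag], dx[lag:])[0]    # rank (copula) correlation
    print(f"lag {lag:2d}: Pearson {r:+.3f}, Spearman {rho:+.3f}")
```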
On information, negentropy and H-theorem
NASA Astrophysics Data System (ADS)
Chakrabarti, C. G.; Sarker, N. G.
1983-09-01
The paper deals with the importance of the Kullback discrimination information in the statistical characterization of the negentropy of a non-equilibrium state and the irreversibility of a classical dynamical system. The theory, based on the Kullback discrimination information as the H-function, gives new insight into the interrelation between the concepts of coarse-graining and the principle of sufficiency, leading to an important statistical characterization of the thermal equilibrium of a closed system.
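For concreteness, a small numerical sketch (not from the paper) of the Kullback discrimination information K(p||q) of a non-equilibrium distribution p relative to an equilibrium distribution q, the quantity used above as an H-function:

```python
import numpy as np

def kullback(p, q):
    """K(p||q) = sum_i p_i ln(p_i/q_i); nonnegative, zero iff p == q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # 0 * ln 0 contributes nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

q = np.full(4, 0.25)                  # equilibrium: uniform over 4 states
p = np.array([0.7, 0.1, 0.1, 0.1])    # a non-equilibrium state
print(kullback(p, q))                 # > 0, and decreases to 0 as p relaxes to q
```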
Minimum Uncertainty Coherent States Attached to Nondegenerate Parametric Amplifiers
NASA Astrophysics Data System (ADS)
Dehghani, A.; Mojaveri, B.
2015-06-01
Exact analytical solutions for the two-mode nondegenerate parametric amplifier have been obtained by using the transformation from the two-dimensional harmonic oscillator Hamiltonian. Some important physical properties, such as the quantum statistics and quadrature squeezing of the corresponding states, are investigated. In addition, these states carry classical features such as Poissonian statistics, and they minimize the Heisenberg uncertainty relation for the coordinate and momentum operators.
Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R.; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C.; Downing, James R.; Lamba, Jatinder
2009-01-01
Motivation: In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Results: Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Availability: Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org. Contact: stanley.pounds@stjude.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19528086
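A schematic Python re-implementation of the PROMISE statistic as described above (the documented R routines at the URLs in the abstract are the reference implementation; the data and pattern vector below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
genomic = rng.standard_normal(n)                    # one genomic variable
endpoints = np.column_stack([                       # two endpoint variables
    0.5 * genomic + rng.standard_normal(n),         # positively associated
    -0.4 * genomic + rng.standard_normal(n),        # negatively associated
])
pattern = np.array([+1.0, -1.0])                    # biologically interesting pattern

def promise_stat(g, E, pattern):
    # per-endpoint association statistics (here Pearson correlations) ...
    r = np.array([np.corrcoef(g, E[:, j])[0, 1] for j in range(E.shape[1])])
    # ... projected onto the vector of most interesting values
    return float(r @ pattern)

obs = promise_stat(genomic, endpoints, pattern)
perms = np.array([promise_stat(rng.permutation(genomic), endpoints, pattern)
                  for _ in range(2000)])
p_value = np.mean(np.abs(perms) >= abs(obs))        # permutation significance
print(f"PROMISE statistic {obs:.3f}, permutation p = {p_value:.4f}")
```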
Jiang, Xiaoye; Yao, Yuan; Liu, Han; Guibas, Leonidas
2014-01-01
Modern data acquisition routinely produces massive amounts of network data. Though many methods and models have been proposed to analyze such data, research on network data is largely disconnected from the classical theory of statistical learning and signal processing. In this paper, we present a new framework for modeling network data, which connects two seemingly different areas: network data analysis and compressed sensing. From a nonparametric perspective, we model an observed network using a large dictionary. In particular, we consider the network clique detection problem and show connections between our formulation and a new algebraic tool, namely Radon basis pursuit in homogeneous spaces. Such a connection allows us to identify rigorous recovery conditions for clique detection problems. Though this paper is mainly conceptual, we also develop practical approximation algorithms for solving empirical problems and demonstrate their usefulness on real-world datasets. PMID:25620806
Scala, Giovanni; Affinito, Ornella; Palumbo, Domenico; Florio, Ermanno; Monticelli, Antonella; Miele, Gennaro; Chiariotti, Lorenzo; Cocozza, Sergio
2016-11-25
CpG sites in an individual molecule may exist in a binary state (methylated or unmethylated), and each individual DNA molecule, containing a certain number of CpGs, is a combination of these states defining an epihaplotype. Classic quantification-based approaches to studying DNA methylation are intrinsically unable to fully represent the complexity of the underlying methylation substrate. Epihaplotype-based approaches, on the other hand, allow methylation profiles of cell populations to be studied at the single-molecule level. For such investigations, next-generation sequencing techniques can be used, both for quantitative and for epihaplotype analysis. Currently available tools for methylation analysis lack output formats that explicitly report CpG methylation profiles at the single-molecule level and lack statistical tools suited to their interpretation. Here we present ampliMethProfiler, a Python-based pipeline for the extraction and statistical epihaplotype analysis of amplicons from targeted deep bisulfite sequencing of multiple DNA regions. The ampliMethProfiler tool provides an easy and user-friendly way to extract and analyze the epihaplotype composition of reads from targeted bisulfite sequencing experiments. ampliMethProfiler is written in Python and requires a local installation of BLAST and (optionally) QIIME tools. It can be run on Linux and OS X platforms. The software is open source and freely available at http://amplimethprofiler.sourceforge.net .
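Not a substitute for the pipeline itself, but a toy illustration of the epihaplotype idea it implements: each bisulfite read is a binary methylation string over the CpGs of one amplicon, and the single-molecule profile retains information that per-CpG averages discard (the reads below are made up):

```python
from collections import Counter

reads = ["1101", "1101", "0000", "1001", "0000", "1101"]  # invented reads

# Quantification-based view: one mean methylation level per CpG position.
per_cpg_mean = [sum(int(r[i]) for r in reads) / len(reads)
                for i in range(len(reads[0]))]

# Epihaplotype-based view: the distribution of whole molecule-level patterns.
epihaplotypes = Counter(reads)

print("per-CpG methylation :", per_cpg_mean)
print("epihaplotype profile:", dict(epihaplotypes))
```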
A Photon Interference Detector with Continuous Display.
ERIC Educational Resources Information Center
Gilmore, R. S.
1978-01-01
Describes an apparatus which attempts to give a direct visual impression of the random detection of individual photons, coupled with the recognition of the classical intensity distribution as a result of fairly high photon statistics. (Author/GA)
Colors of Inner Disk Classical Kuiper Belt Objects
NASA Astrophysics Data System (ADS)
Romanishin, W.; Tegler, S. C.; Consolmagno, G. J.
2010-07-01
We present new optical broadband colors, obtained with the Keck 1 and Vatican Advanced Technology telescopes, for six objects in the inner classical Kuiper Belt. Objects in the inner classical Kuiper Belt are of interest as they may represent the surviving members of the primordial Kuiper Belt that formed interior to the current position of the 3:2 resonance with Neptune, the current position of the plutinos, or, alternatively, they may be objects formed at a different heliocentric distance that were then moved to their present locations. The six new colors, combined with four previously published, show that the ten inner belt objects with known colors form a neutral clump and a reddish clump in B-R color. Nonparametric statistical tests show no significant difference between the B-R color distribution of the inner disk objects compared to the color distributions of Centaurs, plutinos, or scattered disk objects. However, the B-R color distribution of the inner classical Kuiper Belt Objects does differ significantly from the distribution of colors in the cold (low inclination) main classical Kuiper Belt. The cold main classical objects are predominantly red, while the inner classical belt objects are a mixture of neutral and red. The color difference may reveal the existence of a gradient in the composition and/or surface processing history in the primordial Kuiper Belt, or indicate that the inner disk objects are not dynamically analogous to the cold main classical belt objects.
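For illustration, the flavor of nonparametric two-sample comparison used above, applied to invented B-R color samples (the paper's measured colors are not reproduced here):

```python
import numpy as np
from scipy import stats

# Hypothetical B-R colors: a neutral+red mixture vs. a mostly red population.
inner = np.array([1.1, 1.2, 1.5, 1.6, 1.7, 1.1, 1.8, 1.2, 1.6, 1.3])
cold = np.array([1.6, 1.7, 1.8, 1.7, 1.9, 1.8, 1.6, 1.75])

ks = stats.ks_2samp(inner, cold)      # Kolmogorov-Smirnov two-sample test
print(f"KS statistic {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
```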
NASA Astrophysics Data System (ADS)
Rotter, Stefan; Aigner, Florian; Burgdörfer, Joachim
2007-03-01
We investigate the statistical distribution of transmission eigenvalues in phase-coherent transport through quantum dots. In ab initio simulations of both clean and disordered two-dimensional cavities, we find markedly different quantum-to-classical crossover scenarios for these two cases. In particular, we observe the emergence of “noiseless scattering states” in clean cavities, irrespective of sharp-edged entrance and exit lead mouths. We find the onset of these “classical” states to be largely independent of the cavity’s classical chaoticity, but very sensitive to bulk disorder. Our results suggest that for weakly disordered cavities, the transmission eigenvalue distribution is determined both by scattering at the disorder potential and at the cavity walls. To properly account for this intermediate parameter regime, we introduce a hybrid crossover scheme, which combines previous models that are valid in the ballistic and the stochastic limit, respectively.
Interconnect fatigue design for terrestrial photovoltaic modules
NASA Technical Reports Server (NTRS)
Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.
1982-01-01
The results of a comprehensive investigation of interconnect fatigue, which has led to the definition of useful reliability-design and life-prediction algorithms, are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To address this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous--i.e., quantitative--interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
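A hedged sketch of this approach, with invented constants: a Coffin-Manson-type mean strain-cycle curve widened into statistical fatigue curves by assuming lognormal scatter in cycles to failure, so that each curve carries a failure probability:

```python
import numpy as np
from scipy import stats

C, b = 0.3, -0.5          # hypothetical Coffin-Manson constants: strain = C * N**b
sigma_lnN = 0.8           # assumed lognormal scatter of cycles to failure

def cycles_to_failure(strain, prob):
    """Cycles at which a fraction `prob` of interconnects has failed."""
    n_mean = (strain / C) ** (1.0 / b)     # mean fatigue curve, inverted for N
    z = stats.norm.ppf(prob)               # quantile of the assumed scatter
    return n_mean * np.exp(z * sigma_lnN)  # statistical fatigue curve

for p in (0.01, 0.50, 0.99):
    print(f"P_fail={p:4.2f}: N at 0.5% strain = {cycles_to_failure(0.005, p):,.0f}")
```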
Turbulent statistics and intermittency enhancement in coflowing superfluid 4He
NASA Astrophysics Data System (ADS)
Biferale, L.; Khomenko, D.; L'vov, V.; Pomyalov, A.; Procaccia, I.; Sahoo, G.
2018-02-01
The large-scale turbulent statistics of mechanically driven superfluid 4He was shown experimentally to follow its classical counterpart. In this paper, we use direct numerical simulations to study the whole range of scales at temperatures T ∈ [1.3, 2.1] K. The numerics employ self-consistent and nonlinearly coupled normal and superfluid components. The main results are that (i) the velocity fluctuations of the normal and super components are well correlated in the inertial range of scales, but decorrelate at small scales; (ii) the energy transfer by mutual friction between components is particularly efficient in the temperature range between 1.8 and 2 K, leading to enhancement of small-scale intermittency for these temperatures; and (iii) at low T and close to Tλ, the scaling properties of the energy spectra and structure functions of the two components approach those of classical hydrodynamic turbulence.
Quantum chaos: an introduction via chains of interacting spins-1/2
NASA Astrophysics Data System (ADS)
Gubin, Aviva; Santos, Lea
2012-02-01
We discuss aspects of quantum chaos by focusing on spectral statistical properties and structures of eigenstates of quantum many-body systems. Quantum systems whose classical counterparts are chaotic have properties that differ from those of quantum systems whose classical counterparts are regular. One of the main signatures of what became known as quantum chaos is a spectrum showing repulsion of the energy levels. We show how level repulsion may develop in one-dimensional systems of interacting spins-1/2 which are devoid of random elements and involve only two-body interactions. We present a simple recipe to unfold the spectrum and emphasize the importance of taking into account the symmetries of the system. In addition to the statistics of eigenvalues, we also analyze how the structure of the eigenstates may indicate chaos. This is done by computing quantities that measure the level of delocalization of the eigenstates.
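A compact numerical illustration of the level-repulsion diagnostic (a plain GOE random matrix with a crude local unfolding, not the spin-chain recipe of the article):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
a = rng.standard_normal((n, n))
h = (a + a.T) / 2                        # GOE-like real symmetric matrix
levels = np.sort(np.linalg.eigvalsh(h))

bulk = levels[n // 4: 3 * n // 4]        # keep the bulk of the spectrum
s = np.diff(bulk)
s /= s.mean()                            # crude unfolding: unit mean spacing

# Level repulsion: P(s) -> 0 as s -> 0, unlike the Poisson case exp(-s) -> 1.
hist, _ = np.histogram(s, bins=20, range=(0, 3), density=True)
print("P(s) in the first bin:", hist[0], "(Wigner-Dyson ~ s at small s)")
```

A proper analysis would unfold with a smooth fit to the cumulative level density and, as the abstract stresses, separate symmetry sectors first; mixing sectors washes out the repulsion.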
Finite-size effect on optimal efficiency of heat engines.
Tajima, Hiroyasu; Hayashi, Masahito
2017-07-01
The optimal efficiency of quantum (or classical) heat engines whose heat baths are n-particle systems is given by the strong large deviation. We give the optimal work extraction process as a concrete energy-preserving unitary time evolution among the heat baths and the work storage. We show that our optimal work extraction turns the disordered energy of the heat baths to the ordered energy of the work storage, by evaluating the ratio of the entropy difference to the energy difference in the heat baths and the work storage, respectively. By comparing the statistical mechanical optimal efficiency with the macroscopic thermodynamic bound, we evaluate the accuracy of the macroscopic thermodynamics with finite-size heat baths from the statistical mechanical viewpoint. We also evaluate the quantum coherence effect on the optimal efficiency of the cycle processes without restricting their cycle time by comparing the classical and quantum optimal efficiencies.
Nano-swimmers in biological membranes and propulsion hydrodynamics in two dimensions.
Huang, Mu-Jie; Chen, Hsuan-Yi; Mikhailov, Alexander S
2012-11-01
Active protein inclusions in biological membranes can represent nano-swimmers and propel themselves in lipid bilayers. A simple model of an active inclusion with three particles (domains) connected by variable elastic links is considered. First, the membrane is modeled as a two-dimensional viscous fluid and propulsion behavior in two dimensions is examined. After that, an example of a microscopic dynamical simulation is presented, where the lipid bilayer structure of the membrane is resolved and the solvent effects are included by multiparticle collision dynamics. Statistical analysis of data reveals ballistic motion of the swimmer, in contrast to the classical diffusion behavior found in the absence of active transitions between the states.
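A sketch of the statistical analysis mentioned at the end, on synthetic trajectories: the mean-squared displacement (MSD) separates ballistic swimming, MSD ~ t^2, from passive diffusion, MSD ~ t:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(1, 200)

# 500 two-dimensional random walks (diffusive) vs. straight runs (ballistic).
diff_traj = np.cumsum(rng.standard_normal((500, len(t), 2)), axis=1)
ball_traj = t[None, :, None] * rng.standard_normal((500, 1, 2)) * 0.3

for name, traj in [("diffusive", diff_traj), ("ballistic", ball_traj)]:
    msd = np.mean(np.sum(traj**2, axis=2), axis=0)      # ensemble-averaged MSD
    slope = np.polyfit(np.log(t), np.log(msd), 1)[0]    # log-log scaling exponent
    print(f"{name}: MSD ~ t^{slope:.2f}")
```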
NASA Astrophysics Data System (ADS)
Nazarenko, Sergey
2015-07-01
Wave turbulence is the statistical mechanics of random waves with a broadband spectrum interacting via non-linearity. To understand its difference from non-random, well-tuned coherent waves, one could compare the sound of thunder to a piece of classical music. Wave turbulence is surprisingly common and important in a great variety of physical settings, from the most familiar ocean waves to waves at quantum scales or much longer waves in astrophysics. We provide a basic overview of wave turbulence ideas, approaches and main results, emphasising the physics of the phenomena and using qualitative descriptions that avoid, whenever possible, involved mathematical derivations. In particular, dimensional analysis is used to obtain the key scaling solutions in wave turbulence, the Kolmogorov-Zakharov (KZ) spectra.
Horizon Entropy from Quantum Gravity Condensates.
Oriti, Daniele; Pranzetti, Daniele; Sindoni, Lorenzo
2016-05-27
We construct condensate states encoding the continuum spherically symmetric quantum geometry of a horizon in full quantum gravity, i.e., without any classical symmetry reduction, in the group field theory formalism. Tracing over the bulk degrees of freedom, we show how the resulting reduced density matrix manifestly exhibits a holographic behavior. We derive a complete orthonormal basis of eigenstates for the reduced density matrix of the horizon and use it to compute the horizon entanglement entropy. By imposing consistency with the horizon boundary conditions and semiclassical thermodynamical properties, we recover the Bekenstein-Hawking entropy formula for any value of the Immirzi parameter. Our analysis supports the equivalence between the von Neumann (entanglement) entropy interpretation and the Boltzmann (statistical) one.
Statistics in biomedical laboratory and clinical science: applications, issues and pitfalls.
Ludbrook, John
2008-01-01
This review is directed at biomedical scientists who want to gain a better understanding of statistics: what tests to use, when, and why. In my view, even during the planning stage of a study it is very important to seek the advice of a qualified biostatistician. When designing and analyzing a study, it is important to construct and test global hypotheses, rather than to make multiple tests on the data. If the latter cannot be avoided, it is essential to control the risk of making false-positive inferences by applying multiple comparison procedures. For comparing two means or two proportions, it is best to use exact permutation tests rather than the better-known classical ones. For comparing many means, analysis of variance, often of a complex type, is the most powerful approach. The correlation coefficient should never be used to compare the performances of two methods of measurement, or two measures, because it does not detect bias. Instead, the Altman-Bland method of differences or least-products linear regression analysis should be preferred. Finally, the educational value to investigators of interaction with a biostatistician, before, during and after a study, cannot be overemphasized. (c) 2007 S. Karger AG, Basel.
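As a worked example of the advice on permutation tests, here is a Monte-Carlo permutation test for the difference of two means on made-up measurements (an exact test would enumerate all reassignments instead of sampling them):

```python
import numpy as np

rng = np.random.default_rng(4)
group_a = np.array([4.1, 5.0, 4.8, 5.6, 4.9, 5.3])   # invented data
group_b = np.array([5.8, 6.1, 5.5, 6.4, 6.0])

obs = group_b.mean() - group_a.mean()                # observed mean difference
pooled = np.concatenate([group_a, group_b])
n_perm, count = 100_000, 0
for _ in range(n_perm):
    rng.shuffle(pooled)                              # random group reassignment
    diff = pooled[len(group_a):].mean() - pooled[:len(group_a)].mean()
    count += abs(diff) >= abs(obs)
print(f"two-sided permutation p = {count / n_perm:.4f}")
```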
Grdanoska, Tatjana; Zafirovska, Planinka; Jaglikovski, Branko; Pavlovska, Irina; Zafirova, Beti; Tosheska-Trajkovska, Katerina; Trajkovska-Dokic, Elena; Petrovska, Milena; Cekovska, Zhaklina; Kondova-Topuzovska, Irena; Georgievska-Ismail, Ljubica; Panovski, Nikola
2012-01-01
Background: Chronic infections in CHD are due to one or both of the organisms Chlamydia pneumoniae and Helicobacter pylori. Aim: To examine the association between serum markers of Chlamydia pneumoniae and Helicobacter pylori infection and markers of myocardial damage in patients with acute coronary syndrome (ACS), in patients with chronic coronary artery disease (CAD) and in a control group. Material and methods: Sera were taken from a total of 153 subjects. Subjects were divided into three groups: 64 patients with ACS, 53 patients with CAD and a group of 35 conditionally healthy individuals. Analysis of patients’ sera for IgG antibodies to H. pylori and markers of myocardial damage was done on the Immulite system. The presence of specific IgG and IgA antibodies to C. pneumoniae was determined with MIF, Sero FIA (Savyon Diagnostics, Israel). Statistical analysis of the data was done using the statistical program SPSS (Statistical Package for the Social Sciences), version 13. Results and discussion: There was a highly significant difference in troponin levels between the three groups of subjects (p=0.0000). Levels of creatine kinase isoenzyme (CK-MB) were highest in the ACS group (500.0 ng/mL). There was a statistically significant difference between control group subjects and ACS patients due to more frequent detection of antichlamydial IgA antibodies in patients with acute coronary syndrome. Positive serum immune response for Helicobacter pylori was 17 (53.1%) and 29 (80.6%), respectively. Conclusion: Increased IgA antibody titers for C. pneumoniae, increased CRP values as well as classic markers of myocardial damage are risk factors for coronary events. PMID:23922522
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gevorkyan, A. S., E-mail: g-ashot@sci.am; Sahakyan, V. V.
We study classical 1D Heisenberg spin glasses in the framework of the nearest-neighbor model. Based on the Hamilton equations, we obtain a system of recurrence equations which allows node-by-node calculation of a spin chain. It is shown that calculation from the first principles of classical mechanics leads to an NP-hard problem, which, however, in the limit of statistical equilibrium can be solved by a polynomial-time (P) algorithm. For the partition function of the ensemble, a new representation is offered in the form of a one-dimensional integral over the spin chains' energy distribution.
A classical density-functional theory for describing water interfaces.
Hughes, Jessica; Krebs, Eric J; Roundy, David
2013-01-14
We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.
Quantum communication complexity advantage implies violation of a Bell inequality
Buhrman, Harry; Czekaj, Łukasz; Grudka, Andrzej; Horodecki, Michał; Horodecki, Paweł; Markiewicz, Marcin; Speelman, Florian; Strelchuk, Sergii
2016-01-01
We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that violate some Bell inequality. Our main tool is port-based teleportation. If the gap between quantum and classical communication complexity can grow arbitrarily large, the ratio of the quantum value to the classical value of the Bell quantity becomes unbounded with the increase in the number of inputs and outputs. PMID:26957600
Perceptual basis of evolving Western musical styles
Rodriguez Zivic, Pablo H.; Shifres, Favio; Cecchi, Guillermo A.
2013-01-01
The brain processes temporal statistics to predict future events and to categorize perceptual objects. These statistics, called expectancies, are found in music perception, and they span a variety of different features and time scales. Specifically, there is evidence that music perception involves strong expectancies regarding the distribution of a melodic interval, namely, the distance between two consecutive notes within the context of another. The recent availability of a large Western music dataset, consisting of the historical record condensed as melodic interval counts, has opened new possibilities for data-driven analysis of musical perception. In this context, we present an analytical approach that, based on cognitive theories of music expectation and machine learning techniques, recovers a set of factors that accurately identifies historical trends and stylistic transitions between the Baroque, Classical, Romantic, and Post-Romantic periods. We also offer a plausible musicological and cognitive interpretation of these factors, allowing us to propose them as data-driven principles of melodic expectation. PMID:23716669
Carreno-Quintero, Natalia; Acharjee, Animesh; Maliepaard, Chris; Bachem, Christian W.B.; Mumm, Roland; Bouwmeester, Harro; Visser, Richard G.F.; Keurentjes, Joost J.B.
2012-01-01
Recent advances in -omics technologies such as transcriptomics, metabolomics, and proteomics along with genotypic profiling have permitted dissection of the genetics of complex traits represented by molecular phenotypes in nonmodel species. To identify the genetic factors underlying variation in primary metabolism in potato (Solanum tuberosum), we have profiled primary metabolite content in a diploid potato mapping population, derived from crosses between S. tuberosum and wild relatives, using gas chromatography-time of flight-mass spectrometry. In total, 139 polar metabolites were detected, of which we identified metabolite quantitative trait loci for approximately 72% of the detected compounds. In order to obtain an insight into the relationships between metabolic traits and classical phenotypic traits, we also analyzed statistical associations between them. The combined analysis of genetic information through quantitative trait locus coincidence and the application of statistical learning methods provide information on putative indicators associated with the alterations in metabolic networks that affect complex phenotypic traits. PMID:22223596
Censored data treatment using additional information in intelligent medical systems
NASA Astrophysics Data System (ADS)
Zenkova, Z. N.
2015-11-01
Statistical procedures are a very important and significant part of modern intelligent medical systems. They are used for the processing, mining and analysis of different types of data about patients and their diseases; they help to make various decisions regarding diagnosis, treatment, medication, surgery, etc. In many cases the data can be censored or incomplete. It is a well-known fact that censoring considerably reduces the efficiency of statistical procedures. In this paper the author gives a brief review of approaches which allow the procedures to be improved using additional information, and describes a modified estimator of an unknown cumulative distribution function involving additional information about a quantile which is known exactly. The additional information is used by applying a projection of a classical estimator onto a set of estimators with certain properties. The Kaplan-Meier estimator is considered as the estimator of the unknown cumulative distribution function, and the properties of the modified estimator are investigated for the case of single right censoring by means of simulations.
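For orientation, a bare-bones Kaplan-Meier estimator for right-censored data (the paper's contribution, projecting this estimator using an exactly known quantile, is not reproduced here; the data are invented):

```python
import numpy as np

times = np.array([2, 3, 3, 5, 7, 8, 8, 11])     # follow-up times, made up
event = np.array([1, 1, 0, 1, 0, 1, 1, 0])      # 1 = event, 0 = right-censored

surv = 1.0
for t in np.unique(times[event == 1]):          # distinct event times only
    d = np.sum((times == t) & (event == 1))     # events at time t
    n = np.sum(times >= t)                      # at risk just before t
    surv *= 1 - d / n                           # product-limit update
    print(f"t = {t}: S(t) = {surv:.3f}")
```

Censored observations never trigger an update of S(t) themselves, but they do shrink the risk set, which is exactly where the efficiency loss from censoring enters.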
Exploring Attitudes of Indian Classical Singers Toward Seeking Vocal Health Care.
Gunjawate, Dhanshree R; Aithal, Venkataraja U; Guddattu, Vasudeva; Kishore, Amrutha; Bellur, Rajashekhar
2016-11-01
The attitude of Indian classical singers toward seeking vocal health care is a dimension yet to be explored. The current study aimed to determine the attitudes of these singers toward seeking vocal health care and to understand the influence of age and gender. Cross-sectional. A 10-item self-report questionnaire adapted from a study on contemporary commercial music singers was used. An additional question was added asking whether the singer was aware of the profession and role of speech-language pathologists (SLPs). The questionnaire was administered to 55 randomly selected self-identified trained Indian classical singers, who rated the items using a five-point Likert scale. Demographic variables were summarized using descriptive statistics, and the t test was used to compare mean scores between genders and age groups. Of the singers, 78.2% were likely to see a doctor for health-related problems, whereas 81.8% were unlikely to seek medical care for voice-related problems; the difference was statistically significant (P < 0.001). Responses to the questions assessing attitudes toward findings from a medical examination by a specialist revealed a statistically significant difference (P = 0.02) between the genders. Age did not have a significant influence on the responses. Only 23.6% of the respondents were aware of the profession and role of SLPs. The findings are in tune with Western literature reporting hesitation of singers toward seeking vocal health care, and they draw attention to the need for SLPs to promote their role in vocal health awareness and management. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Quantum-Like Representation of Non-Bayesian Inference
NASA Astrophysics Data System (ADS)
Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.
2013-01-01
This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are some experimental studies, and these statistical data cannot be described by classical probability theory. The process of decision making generating these data cannot be reduced to classical Bayesian inference. For this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented classical Bayesian inference in a natural way in the framework of quantum mechanics. Using this representation, in this paper we discuss the non-Bayesian (irrational) inference that is biased by effects like quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of usual Bayesian inference.
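For readers unfamiliar with the formalism, a tiny numerical illustration of the generic quantum-like ingredient (following the general quantum-like literature, not this paper's specific model): the law of total probability acquires an interference term with a phase theta, and theta = pi/2 recovers the classical Bayesian answer:

```python
import numpy as np

p_a = np.array([0.6, 0.4])           # P(A_i): two hypotheses/contexts, invented
p_b_given_a = np.array([0.7, 0.2])   # P(B | A_i), invented

classical = float(p_a @ p_b_given_a)            # classical total probability
for theta in (np.pi / 2, np.pi / 3, 2 * np.pi / 3):
    # interference term: 2 cos(theta) * sqrt(prod_i P(A_i) P(B|A_i))
    interference = 2 * np.cos(theta) * np.sqrt(np.prod(p_a * p_b_given_a))
    print(f"theta = {theta:.2f}: P(B) = {classical + interference:.3f}")
```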
Non-classical Signature of Parametric Fluorescence and its Application in Metrology
NASA Astrophysics Data System (ADS)
Hamar, M.; Michálek, V.; Pathak, A.
2014-08-01
The article provides a short theoretical background on what non-classical light means. We applied the criterion for the existence of non-classical effects derived by C.T. Lee to parametric fluorescence. The criterion was originally derived for the study of two light beams with one mode per beam. We checked through numerical simulations whether the criterion still works for two multimode beams of parametric down-conversion. The theoretical results were tested by measuring the photon number statistics of twin beams emitted by a nonlinear BBO crystal pumped by an intense femtosecond UV pulse. We used an ICCD camera as the photon detector in both beams. It appears that the criterion can be used for the measurement of the quantum efficiencies of ICCD cameras.
Statistical modelling of networked human-automation performance using working memory capacity.
Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja
2014-01-01
This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
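A schematic comparison in the spirit of two of the three modelling approaches (synthetic data standing in for the experimental UAV-supervision performance and measured WM capacity; the Bayesian-network arm is omitted for brevity):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
# Invented predictors: task load in [1, 10], WM capacity in [0, 1].
X = rng.uniform([1, 0], [10, 1], size=(120, 2))
# Invented performance: load hurts, high WM capacity buffers the load effect.
y = 0.9 - 0.04 * X[:, 0] * (1.2 - X[:, 1]) + 0.05 * rng.standard_normal(120)

lin = LinearRegression().fit(X[:80], y[:80])
gp = GaussianProcessRegressor(RBF() + WhiteKernel()).fit(X[:80], y[:80])

for name, model in [("linear", lin), ("GP", gp)]:
    mse = np.mean((model.predict(X[80:]) - y[80:]) ** 2)
    print(f"{name}: held-out MSE = {mse:.4f}")
```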
Analysis of obsidian from Moho Cay, Belize: new evidence on Classic Maya trade routes.
Healy, P F; McKillop, H I; Walsh, B
1984-07-27
Trace element analysis of obsidian artifacts from Moho Cay, Belize, reveals that the obsidian derives primarily from the El Chayal outcrop in highland Guatemala and not from the Ixtepeque source. This is contrary to the widely accepted obsidian trade route model for Classic Maya civilization and suggests that Classic Maya obsidian trade was a more complex economic phenomenon than has been recognized.
Hearing the shape of the Ising model with a programmable superconducting-flux annealer.
Vinci, Walter; Markström, Klas; Boixo, Sergio; Roy, Aidan; Spedalieri, Federico M; Warburton, Paul A; Severini, Simone
2014-07-16
Two objects can be distinguished if they have different measurable properties. Thus, distinguishability depends on the physics of the objects. In considering graphs, we revisit the Ising model as a framework to define physically meaningful spectral invariants. In this context, we introduce a family of refinements of the classical spectrum and consider the quantum partition function. We demonstrate that the energy spectrum of the quantum Ising Hamiltonian is a stronger invariant than the classical one without refinements. For the purpose of implementing the related physical systems, we perform experiments on a programmable annealer with superconducting flux technology. Departing from the paradigm of adiabatic computation, we take advantage of a noisy evolution of the device to generate statistics of low energy states. The graphs considered in the experiments have the same classical partition functions, but different quantum spectra. The data obtained from the annealer distinguish non-isomorphic graphs via information contained in the classical refinements of the functions but not via the differences in the quantum spectra.
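A brute-force sketch of the classical ingredient (illustrative only, not the experiment): the Ising partition function of a small graph by exhaustive enumeration. The two invented test graphs below, a path and a star, are non-isomorphic trees with equal classical partition functions at every temperature, which is exactly the kind of degeneracy that refinements or the quantum spectrum must break:

```python
from itertools import product
from math import exp

def ising_partition(edges, n, beta=1.0):
    """Z = sum over spin configurations of exp(-beta * H), H = -sum_{(i,j)} s_i s_j."""
    z = 0.0
    for spins in product((-1, 1), repeat=n):
        energy = -sum(spins[i] * spins[j] for i, j in edges)
        z += exp(-beta * energy)
    return z

path = [(0, 1), (1, 2), (2, 3)]   # 4-vertex path graph
star = [(0, 1), (0, 2), (0, 3)]   # 4-vertex star graph
# Both are trees with 3 edges, so Z = 2**4 * cosh(beta)**3 for each:
print(ising_partition(path, 4), ising_partition(star, 4))
```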
NASA Astrophysics Data System (ADS)
Zhou, Chi-Chun; Dai, Wu-Sheng
2018-02-01
In statistical mechanics, for a system with a fixed number of particles, e.g. a finite-size system, strictly speaking, thermodynamic quantities need to be calculated in the canonical ensemble. Nevertheless, the calculation of the canonical partition function is difficult. In this paper, based on the mathematical theory of symmetric functions, we suggest a method for the calculation of the canonical partition function of ideal quantum gases, including ideal Bose, Fermi, and Gentile gases. Moreover, we express the canonical partition functions of interacting classical and quantum gases, given by the classical and quantum cluster expansion methods, in terms of the Bell polynomial. The virial coefficients of ideal Bose, Fermi, and Gentile gases are calculated from the exact canonical partition function. The virial coefficients of interacting classical and quantum gases are calculated from the canonical partition function by using the expansion of the Bell polynomial, rather than from the grand canonical potential.
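Not the Bell-polynomial machinery of the paper, but the standard textbook recursion it systematizes, shown here for orientation: the canonical partition function of N ideal bosons or fermions built from single-particle levels (toy spectrum):

```python
import numpy as np

def canonical_Z(eps, N, beta, fermi=False):
    """Z_N = (1/N) * sum_{k=1..N} s_k * z(k*beta) * Z_{N-k}, with Z_0 = 1,
    z(k*beta) = sum_i exp(-k*beta*eps_i), s_k = (-1)**(k+1) for fermions, 1 for bosons."""
    eps = np.asarray(eps, float)
    z_k = [np.sum(np.exp(-k * beta * eps)) for k in range(N + 1)]
    Z = [1.0]
    for n in range(1, N + 1):
        sign = (lambda k: (-1) ** (k + 1)) if fermi else (lambda k: 1.0)
        Z.append(sum(sign(k) * z_k[k] * Z[n - k] for k in range(1, n + 1)) / n)
    return Z[N]

eps = np.arange(10) * 0.5                 # toy equally spaced spectrum
print("bosons  :", canonical_Z(eps, 4, beta=1.0))
print("fermions:", canonical_Z(eps, 4, beta=1.0, fermi=True))
```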
Tagliabue, Anna; Ferraris, Cinzia; Uggeri, Francesca; Trentani, Claudia; Bertoli, Simona; de Giorgis, Valentina; Veggiotti, Pierangelo; Elli, Marina
2017-02-01
The classical ketogenic diet (KD) is a high-fat, very low-carbohydrate normocaloric diet used for drug-resistant epilepsy and Glucose Transporter 1 Deficiency Syndrome (GLUT1 DS). In animal models, a high-fat diet induces large alterations in the microbiota, producing deleterious effects on gut health. We carried out a pilot study on patients treated with the KD, comparing their microbiota composition before and after three months on the diet. Six patients affected by GLUT1 DS were asked to collect fecal samples before and after three months on the diet. RT-PCR analysis was performed in order to quantify Firmicutes, Bacteroidetes, Bifidobacterium spp., Lactobacillus spp., Clostridium perfringens, Enterobacteriaceae, Clostridium cluster XIV, Desulfovibrio spp. and Faecalibacterium prausnitzii. Compared with baseline, there were no statistically significant differences at 3 months in Firmicutes and Bacteroidetes. However, fecal microbial profiles revealed a statistically significant increase in Desulfovibrio spp. (p = 0.025), a bacterial group thought to be involved in the exacerbation of the inflammatory condition of the gut mucosa associated with the consumption of fats of animal origin. A future prospective study on the changes in gut microbiota of all children with epilepsy started on a KD is warranted. In patients with dysbiosis demonstrated by fecal samples, it may be reasonable to consider an empiric trial of pre- or probiotics to potentially restore the "ecological balance" of the intestinal microbiota. Copyright © 2016 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.
Classical methods and modern analysis for studying fungal diversity
John Paul Schmit
2005-01-01
In this chapter, we examine the use of classical methods to study fungal diversity. Classical methods rely on the direct observation of fungi, rather than sampling fungal DNA. We summarize a wide variety of classical methods, including direct sampling of fungal fruiting bodies, incubation of substrata in moist chambers, culturing of endophytes, and particle plating. We...
Classic Maya civilization collapse associated with reduction in tropical cyclone activity
NASA Astrophysics Data System (ADS)
Medina, M. A.; Polanco-Martinez, J. M.; Lases-Hernández, F.; Bradley, R. S.; Burns, S. J.
2013-12-01
In light of the increased destructiveness of tropical cyclones observed over recent decades, one might assume that an increase, and not a decrease, in tropical cyclone activity would lead to societal stress and perhaps collapse of ancient cultures. In this study we present evidence that a reduction in the frequency and intensity of tropical Atlantic cyclones could have contributed to the collapse of the Maya civilization during the Terminal Classic Period (TCP, A.D. 800-950). Statistical comparisons of a quantitative precipitation record from the Yucatan Peninsula (YP) Maya lowlands, based on the stalagmite known as Chaac (after the Mayan god of rain and agriculture), with environmental proxy records of El Niño/Southern Oscillation (ENSO), tropical Atlantic sea surface temperatures (SSTs), and tropical Atlantic cyclone counts suggest that these records share significant coherent variability during the TCP, and that summer rainfall reductions of between 30 and 50% in the Maya lowlands occurred in association with decreased Atlantic tropical cyclones. Analysis of modern instrumental hydrological data suggests cyclone rainfall contributions to the YP equivalent to the range of rainfall deficits associated with decreased tropical cyclone activity during the collapse of the Maya civilization. Cyclone-driven precipitation variability during the TCP implies that climate change may have triggered Maya civilization collapse via freshwater scarcity for domestic use, without significant detriment to agriculture. [Figure captions: Pyramid in Tikal, the most prominent Maya kingdom that collapsed during the Terminal Classic Period (circa C.E. 800-950); rainfall feeding stalagmites inside the Rio Secreto cave system, Yucatan, Mexico.]
NASA Astrophysics Data System (ADS)
Li, Ziyi
2017-12-01
The generalized uncertainty principle (GUP), also known as the generalized uncertainty relation, is a modified form of the classical Heisenberg uncertainty principle that applies in special cases. When quantum gravity theories such as string theory are applied, the theoretical results suggest that there should be a "minimum observable length", of about the Planck scale (10^-35 m). Taking this fundamental scale into account, a corrected common form of the Heisenberg uncertainty principle is needed in thermodynamic systems, together with effective corrections to statistical-physics questions concerning the quantum density of states. Especially at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics, yet the present theory of the femtosecond laser is still built on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, in this work we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser. We designed three typical systems, from micro to macro size, to assess the feasibility of our theoretical model and method, in chemical solution, crystal lattice and nuclear fission reactor conditions, respectively.
Dielectric properties of classical and quantized ionic fluids.
Høye, Johan S
2010-06-01
We study time-dependent correlation functions of classical and quantum gases using methods of equilibrium statistical mechanics for systems of uniform as well as nonuniform densities. The basis for our approach is the path integral formalism of quantum mechanical systems. With this approach the statistical mechanics of a quantum mechanical system becomes the equivalent of a classical polymer problem in four dimensions, where imaginary time is the fourth dimension. Several nontrivial results for quantum systems have been obtained earlier by this analogy. Here, we focus upon the presence of a time-dependent electromagnetic pair interaction, in which the electromagnetic vector potential, which depends upon currents, is present. Thus both density and current correlations are needed to evaluate the influence of this interaction. We then utilize the fact that densities and currents can be expressed by polarizations, by which the ionic fluid can be regarded as a dielectric one for which a nonlocal susceptibility is found. This nonlocality has as a consequence that we find no contribution from a possible transverse electric zero-frequency mode to the Casimir force between metallic plates. Further, we establish expressions for a leading correction to ab initio calculations of the energies of the quantized electrons of molecules, where retardation effects are now also taken into account.
Analysis of clinically relevant values of Ki-67 labeling index in Japanese breast cancer patients.
Tamaki, Kentaro; Ishida, Takanori; Tamaki, Nobumitsu; Kamada, Yoshihiko; Uehara, Kanou; Miyashita, Minoru; Amari, Masakazu; Tadano-Sato, Akiko; Takahashi, Yayoi; Watanabe, Mika; McNamara, Keely; Ohuchi, Noriaki; Sasano, Hironobu
2014-05-01
It has become important to standardize the methods of Ki-67 evaluation in breast cancer patients, especially those used in the interpretation and scoring of immunoreactivity. Therefore, in this study, we examined the Ki-67 immunoreactivity of breast cancer surgical specimens processed and stained in the same manner in one single Japanese institution, counting nuclear immunoreactivity in the same fashion. We examined 408 Japanese breast cancers with invasive ductal carcinoma and studied the correlation between the Ki-67 labeling index and ER/HER2 status and histological grade. We also analyzed overall survival (OS) and disease-free survival (DFS) of these patients according to individual Ki-67 labeling index. There were statistically significant differences in Ki-67 labeling index between the ER positive/HER2 negative group and the ER positive/HER2 positive, ER negative/HER2 positive and ER negative/HER2 negative groups, and between the ER positive/HER2 positive and ER negative/HER2 negative groups (all P < 0.001). There were also statistically significant differences in Ki-67 labeling index among the histological grades (P < 0.001 for each). In multivariate analyses, the Ki-67 labeling index was strongly associated with OS (HR 39.12, P = 0.031) and DFS (HR 10.85, P = 0.011) in ER positive and HER2 negative breast cancer patients. In addition, a statistically significant difference was noted between the classical luminal A group and the "20% luminal A" group in DFS (P = 0.039), but not between the classical luminal A group and the "25% luminal A" group (P = 0.105). A significant positive correlation was detected between the Ki-67 labeling index and ER/HER2 status and histological grade in the cases examined in our study. The suggested optimal cutoff point for the Ki-67 labeling index is between 20 and 25% in ER positive and HER2 negative breast cancer patients.
Playing-related disabling musculoskeletal disorders in young and adult classical piano students.
Bruno, S; Lorusso, A; L'Abbate, N
2008-07-01
To determine the prevalence of instrument-related musculoskeletal problems in classical piano students and to investigate piano-specific risk factors. A specially developed four-part questionnaire was administered to classical piano students of two Apulian conservatories in southern Italy. A cross-sectional design was used. Prevalences of playing-related musculoskeletal disorders (MSDs) were calculated, and cases were compared with non-cases. A total of 195 of the 224 piano students responded (87%). Among the 195 responders, 75 (38.4%) were considered affected according to the pre-established criteria. Disabling MSDs showed similar prevalence rates for the neck (29.3%), thoracic spine (21.3%) and upper limbs (from 20.0 to 30.4%) in the affected group. Univariate analyses showed statistically significant differences in mean age, number of hours per week spent playing, more than 60 min of continuous playing without breaks, lack of sport practice and acceptance of the "No pain, no gain" criterion in students with music-related pain compared with pianists not affected. A statistical correlation was found only between upper limb disorders in pianists and hand size. No correlation with the model of piano played was found in the affected group. Multivariate analyses performed by logistic regression confirmed the independent correlation of the risk factors age, lack of sport practice and acceptance of the "No pain, no gain" criterion. Our study showed MSDs to be a common problem among classical piano students. At variance with several reported studies, older students appeared to be more frequently affected by disabling MSDs, and no difference in the prevalence rate of the disorders was found for females.
ERIC Educational Resources Information Center
Miller, Ann M.
A lexical representational analysis of Classical Arabic is proposed that captures a generalization that McCarthy's (1979, 1981) autosegmental analysis misses, namely that idiosyncratic characteristics of the derivational binyanim in Arabic are lexical, not morphological. This analysis captures that generalization by treating all the idiosyncrasies…
Linear and Non-linear Information Flows In Rainfall Field
NASA Astrophysics Data System (ADS)
Molini, A.; La Barbera, P.; Lanza, L. G.
The rainfall process is the result of a complex framework of non-linear dynamical interactions between the different components of the atmosphere. It preserves the complexity and the intermittent features of the generating system in space and time, as well as the strong dependence of these properties on the scale of observation. Understanding and quantifying how the non-linearity of the generating process comes to influence single rain events are relevant research issues in the field of hydro-meteorology, especially in those applications where timely and effective forecasting of heavy rain events can reduce the risk of failure. This work focuses on the characterization of the non-linear properties of the observed rain process and on the influence of these features on hydrological models. Among the goals of such a survey are the search for regular structures in the rainfall phenomenon and the study of the information flows within the rain field. The research focuses on three basic evolution directions for the system: in time, in space and between the different scales. In fact, the information flows that force the system to evolve represent in general a connection between the different locations in space, the different instants in time and, unless the hypothesis of scale invariance is verified a priori, the different characteristic scales. A first phase of the analysis is carried out by means of classic statistical methods, then a survey of the information flows within the field is developed by means of techniques borrowed from Information Theory, and finally an analysis of the rain signal in the time and frequency domains is performed, with particular reference to its intermittent structure. The methods adopted in this last part of the work are both classic techniques of statistical inference and a few procedures for the detection of non-linear and non-stationary features within the process starting from measured data.
Barbieri, Antoine; Anota, Amélie; Conroy, Thierry; Gourgou-Bourgade, Sophie; Juzyna, Beata; Bonnetain, Franck; Lavergne, Christian; Bascoul-Mollevi, Caroline
2016-07-01
A new longitudinal statistical approach was compared to the classical methods currently used to analyze health-related quality-of-life (HRQoL) data. The comparison was made using data from patients with metastatic pancreatic cancer. Three hundred forty-two patients from the PRODIGE4/ACCORD 11 study were randomly assigned to FOLFIRINOX versus gemcitabine regimens. HRQoL was evaluated using the European Organization for Research and Treatment of Cancer (EORTC) QLQ-C30. The classical analysis uses a linear mixed model (LMM), considering an HRQoL score as a good representation of the true value of HRQoL, following EORTC recommendations. In contrast, built on item response theory (IRT), our approach considers HRQoL as a latent variable directly estimated from the raw data. For polytomous items, we extended the partial credit model to a longitudinal analysis (longitudinal partial credit model [LPCM]), thereby modeling the latent trait as a function of time and other covariates. Both models gave the same conclusions on 11 of 15 HRQoL dimensions. HRQoL evolution was similar between the 2 treatment arms, except for the symptoms of pain. Indeed, under the LPCM, pain perception was significantly less important in the FOLFIRINOX arm than in the gemcitabine arm. For most of the scales, HRQoL changes over time, and no difference was found between treatments in terms of HRQoL. The use of an LMM to study the HRQoL score does not seem appropriate: it is an easy-to-use model, but its basic statistical assumptions are not met. Our IRT model may be more complex, but it shows the same qualities and gives similar results, with the additional advantage of being more precise and suitable because of its direct use of the raw data. © The Author(s) 2015.
Kim, Thomas S; Caruso, Joseph M; Christensen, Heidi; Torabinejad, Mahmoud
2010-07-01
The purpose of this investigation was to assess the ability of cone-beam computed tomography (CBCT) scanning to measure distances from the apices of selected posterior teeth to the mandibular canal. Measurements were taken from the apices of all posterior teeth that were superior to the mandibular canal. A pilot study was performed to determine the scanning parameters that produced the most diagnostic image and the best dissection technique. Twelve human hemimandibles with posterior teeth were scanned at .20 voxels on an I-CAT Classic CBCT device (Imaging Sciences International, Hatfield, PA), and the scans were exported in Digital Imaging and Communications in Medicine (DICOM) format. The scans were examined in InVivo Dental software (Anatomage, San Jose, CA), and measurements were taken from the apex of each root along its long axis to the upper portion of the mandibular canal. The specimens were dissected under a dental operating microscope, and analogous direct measurements were taken with a Boley gauge. All measurements were taken in triplicate at least 1 week apart by one individual (TSK). The results were averaged and the data separated into matching pairs for statistical analysis. There was no statistically significant difference (alpha = .05) between the methods of measurement according to the Wilcoxon matched pairs test (p = 0.676). For the anatomic measurements, the intra-rater correlation coefficient (ICC) was .980, and for the CBCT it was .949, indicating that both methods were highly reproducible. Both measurement methods were highly predictive of and highly correlated with each other according to regression and correlation analysis, respectively. Based on the results of this study, the I-CAT Classic can be used to measure distances from the apices of the posterior teeth to the mandibular canal as accurately as direct anatomic dissection. Copyright 2010 American Association of Endodontists. All rights reserved.
A Gaussian wave packet phase-space representation of quantum canonical statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coughtrie, David J.; Tew, David P.
2015-07-28
We present a mapping of quantum canonical statistical averages onto a phase-space average over thawed Gaussian wave-packet (GWP) parameters, which is exact for harmonic systems at all temperatures. The mapping invokes an effective potential surface, experienced by the wave packets, and a temperature-dependent phase-space integrand, to correctly transition from the GWP average at low temperature to classical statistics at high temperature. Numerical tests on weakly and strongly anharmonic model systems demonstrate that thermal averages of the system energy and geometric properties are accurate to within 1% of the exact quantum values at all temperatures.
Fisher, Neyman, and Bayes at FDA.
Rubin, Donald B
2016-01-01
The wise use of statistical ideas in practice essentially requires some Bayesian thinking, in contrast to the classical rigid frequentist dogma. This dogma too often has seemed to influence the applications of statistics, even at agencies like the FDA. Greg Campbell was one of the most important advocates there for more nuanced modes of thought, especially Bayesian statistics. Because two brilliant statisticians, Ronald Fisher and Jerzy Neyman, are often credited with instilling the traditional frequentist approach in current practice, I argue that both men were actually seeking very Bayesian answers, and neither would have endorsed the rigid application of their ideas.
Classical boson sampling algorithms with superior performance to near-term experiments
NASA Astrophysics Data System (ADS)
Neville, Alex; Sparrow, Chris; Clifford, Raphaël; Johnston, Eric; Birchall, Patrick M.; Montanaro, Ashley; Laing, Anthony
2017-12-01
It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of linear optics, which has sparked interest as a rapid way to demonstrate such quantum supremacy. Photon statistics are governed by intractable matrix functions, which suggests that sampling from the distribution obtained by injecting photons into a linear optical network could be solved more quickly by a photonic experiment than by a classical computer. The apparently low resource requirements for large boson sampling experiments have raised expectations of a near-term demonstration of quantum supremacy by boson sampling. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. Our classical algorithm, based on Metropolised independence sampling, allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. Compared to current experiments, a demonstration of quantum supremacy over a successful implementation of these classical methods on a supercomputer would require the number of photons and experimental components to increase by orders of magnitude, while tackling exponentially scaling photon loss.
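The "intractable matrix functions" governing photon statistics are matrix permanents. As a compact illustration of why exact classical evaluation scales exponentially, here is a generic Ryser-formula sketch; it is not the authors' Metropolised independence sampler, which avoids computing full distributions.

```python
# Ryser's exact permanent formula, O(2^n) subsets of columns. Fine for
# small n; the cost explosion with n is what boson sampling exploits.
import itertools
import numpy as np

def permanent_ryser(a: np.ndarray) -> float:
    n = a.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            rowsums = a[:, cols].sum(axis=1)
            total += (-1) ** r * np.prod(rowsums)
    return (-1) ** n * total

a = np.ones((4, 4))
print(permanent_ryser(a))  # permanent of the all-ones n x n matrix is n! = 24
```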
Universal scaling for the quantum Ising chain with a classical impurity
NASA Astrophysics Data System (ADS)
Apollaro, Tony J. G.; Francica, Gianluca; Giuliano, Domenico; Falcone, Giovanni; Palma, G. Massimo; Plastina, Francesco
2017-10-01
We study finite-size scaling for the magnetic observables of an impurity residing at the end point of an open quantum Ising chain in a transverse magnetic field, realized by locally rescaling the field by a factor μ ≠ 1. In the homogeneous chain limit at μ = 1, we find the expected finite-size scaling for the longitudinal impurity magnetization, with no specific scaling for the transverse magnetization. By contrast, in the classical impurity limit μ = 0, we recover finite-size scaling for the longitudinal magnetization, while the transverse one basically does not scale. We provide approximate analytic expressions for the magnetization and the susceptibility, as well as numerical evidence for the scaling behavior. At intermediate values of μ, finite-size scaling is violated, and we provide a possible explanation of this result in terms of the appearance of a second, impurity-related length scale. Finally, by following the standard quantum-to-classical mapping between statistical models, we derive the classical counterpart of the quantum Ising chain with an end-point impurity as a classical Ising model on a square lattice wrapped on a half-infinite cylinder, with the links along the first circle modified as a function of μ.
Statistical speed of quantum states: Generalized quantum Fisher information and Schatten speed
NASA Astrophysics Data System (ADS)
Gessner, Manuel; Smerzi, Augusto
2018-02-01
We analyze families of measures for the quantum statistical speed which include as special cases the quantum Fisher information, the trace speed, i.e., the quantum statistical speed obtained from the trace distance, and more general quantifiers obtained from the family of Schatten norms. These measures quantify the statistical speed under generic quantum evolutions and are obtained by maximizing classical measures over all possible quantum measurements. We discuss general properties, optimal measurements, and upper bounds on the speed of separable states. We further provide a physical interpretation for the trace speed by linking it to an analog of the quantum Cramér-Rao bound for median-unbiased quantum phase estimation.
Quantum work in the Bohmian framework
NASA Astrophysics Data System (ADS)
Sampaio, R.; Suomela, S.; Ala-Nissila, T.; Anders, J.; Philbin, T. G.
2018-01-01
At nonzero temperature classical systems exhibit statistical fluctuations of thermodynamic quantities arising from the variation of the system's initial conditions and its interaction with the environment. The fluctuating work, for example, is characterized by the ensemble of system trajectories in phase space and, by including the probabilities for various trajectories to occur, a work distribution can be constructed. However, without phase-space trajectories, the task of constructing a work probability distribution in the quantum regime has proven elusive. Here we use quantum trajectories in phase space and define fluctuating work as power integrated along the trajectories, in complete analogy to classical statistical physics. The resulting work probability distribution is valid for any quantum evolution, including cases with coherences in the energy basis. We demonstrate the quantum work probability distribution and its properties with an exactly solvable example of a driven quantum harmonic oscillator. An important feature of the work distribution is its dependence on the initial statistical mixture of pure states, which is reflected in higher moments of the work. The proposed approach introduces a fundamentally different perspective on quantum thermodynamics, allowing full thermodynamic characterization of the dynamics of quantum systems, including the measurement process.
Hierarchical multivariate covariance analysis of metabolic connectivity
Carbonell, Felix; Charil, Arnaud; Zijdenbos, Alex P; Evans, Alan C; Bedell, Barry J
2014-01-01
Conventional brain connectivity analysis is typically based on the assessment of interregional correlations. Given that correlation coefficients are derived from both covariance and variance, group differences in covariance may be obscured by differences in the variance terms. To facilitate a comprehensive assessment of connectivity, we propose a unified statistical framework that interrogates the individual terms of the correlation coefficient. We have evaluated the utility of this method for metabolic connectivity analysis using [18F]2-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET) data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) study. As an illustrative example of the utility of this approach, we examined metabolic connectivity in angular gyrus and precuneus seed regions of mild cognitive impairment (MCI) subjects with low and high β-amyloid burdens. This new multivariate method allowed us to identify alterations in the metabolic connectome, which would not have been detected using classic seed-based correlation analysis. Ultimately, this novel approach should be extensible to brain network analysis and broadly applicable to other imaging modalities, such as functional magnetic resonance imaging (MRI). PMID:25294129
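The motivating point, that correlation conflates covariance with variance, can be shown in a few lines. This toy demonstration uses simulated Gaussian data, not ADNI FDG-PET values:

```python
# Two groups with identical covariance but different variances yield
# different correlations, since r = cov / (sd_x * sd_y).
import numpy as np

rng = np.random.default_rng(1)

def sample_group(var, cov_xy, n=100_000):
    cov = np.array([[var, cov_xy], [cov_xy, var]])
    return rng.multivariate_normal([0, 0], cov, size=n).T

for var in (1.0, 4.0):                      # same covariance, different variance
    x, y = sample_group(var, cov_xy=0.5)
    r = np.cov(x, y)[0, 1] / (x.std() * y.std())
    print(f"var={var}: covariance~0.5, correlation={r:.2f}")
# The high-variance group shows the weaker correlation, so a group
# difference in r may reflect variance alone, as the paper argues.
```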
Using Bayes' theorem for free energy calculations
NASA Astrophysics Data System (ADS)
Rogers, David M.
Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energy from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner-shell and hard-sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
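To make the multinomial-counting idea concrete, here is a hedged sketch (not the dissertation's actual code): a Dirichlet posterior over occupancy probabilities, with posterior uncertainty propagated into a free-energy-like component of the form -kT ln p. The counts and temperature are invented.

```python
# Bayesian multinomial counting: Dirichlet posterior over coordination-state
# probabilities, then the posterior of -kT ln p0 for one state. All numbers
# are hypothetical stand-ins for simulation counts.
import numpy as np

rng = np.random.default_rng(2)
counts = np.array([120, 310, 410, 140, 20])   # hypothetical occupancy counts
alpha_prior = np.ones_like(counts)            # uniform Dirichlet prior
kT = 0.596                                    # kcal/mol near 300 K

posterior = rng.dirichlet(alpha_prior + counts, size=20_000)
free_energy = -kT * np.log(posterior[:, 0])   # component for state n = 0
print(f"-kT ln p0 = {free_energy.mean():.3f} "
      f"+/- {free_energy.std():.3f} kcal/mol")
```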
Downside Risk analysis applied to the Hedge Funds universe
NASA Astrophysics Data System (ADS)
Perelló, Josep
2007-09-01
Hedge Funds are considered one of the fastest-growing sectors of portfolio management over the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the simplicity of the CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
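One of the most popular indicators in this family is the downside deviation below the investor's goal, and the resulting Sortino ratio. A minimal sketch with simulated fat-tailed returns (not the Credit Suisse/Tremont data):

```python
# Downside deviation and Sortino ratio for simulated monthly returns.
import numpy as np

rng = np.random.default_rng(3)
returns = rng.standard_t(df=4, size=1200) * 0.02 + 0.005  # fat tails
target = 0.0                                              # investor's goal

shortfall = np.minimum(returns - target, 0.0)
downside_dev = np.sqrt(np.mean(shortfall ** 2))
sortino = (returns.mean() - target) / downside_dev
print(f"downside deviation={downside_dev:.4f}, Sortino ratio={sortino:.2f}")
# Unlike the Sharpe ratio, only returns below the goal are penalized, which
# matters for strongly non-Gaussian return distributions.
```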
Guided SAR image despeckling with probabilistic non local weights
NASA Astrophysics Data System (ADS)
Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny
2017-12-01
SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction difficult. Non-local despeckling techniques with probabilistic similarity measures have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non Local Weights) replaces the heuristic parametric constants of the GGF-BNLM method with values derived dynamically from the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive, and as a result a significant improvement is achieved in terms of performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.
Katsarov, Plamen; Gergov, Georgi; Alin, Aylin; Pilicheva, Bissera; Al-Degs, Yahya; Simeonov, Vasil; Kassarova, Margarita
2018-03-01
The predictive power of partial least squares (PLS) and multivariate curve resolution-alternating least squares (MCR-ALS) methods has been studied for the simultaneous quantitative analysis of the binary drug combination of doxylamine succinate and pyridoxine hydrochloride. Analysis of first-order UV overlapped spectra was performed using different PLS models - classical PLS1 and PLS2 as well as partial robust M-regression (PRM). These linear models were compared to MCR-ALS with equality and correlation constraints (MCR-ALS-CC). All techniques operated within the full spectral region and extracted maximum information for the drugs analysed. The developed chemometric methods were validated on external sample sets and were applied to the analyses of pharmaceutical formulations. The obtained statistical parameters were satisfactory for both calibration and validation sets. All developed methods can be successfully applied for simultaneous spectrophotometric determination of doxylamine and pyridoxine, both in laboratory-prepared mixtures and commercial dosage forms.
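As a hedged sketch of the PLS2 idea, the following predicts two analyte concentrations simultaneously from overlapped spectra. The spectra are simulated Gaussians, not the paper's UV data, and scikit-learn's PLSRegression stands in for the chemometric software used:

```python
# Simultaneous two-analyte calibration from overlapped spectra with PLS.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
wl = np.linspace(220, 320, 101)
s1 = np.exp(-((wl - 260) / 12) ** 2)          # pure spectrum, drug 1
s2 = np.exp(-((wl - 275) / 15) ** 2)          # pure spectrum, drug 2 (overlaps)

C = rng.uniform(0.1, 1.0, size=(40, 2))       # training concentrations
X = C @ np.vstack([s1, s2]) + rng.normal(0, 0.005, (40, 101))

pls = PLSRegression(n_components=4).fit(X, C)
x_new = 0.3 * s1 + 0.7 * s2                   # "unknown" mixture spectrum
print(pls.predict(x_new.reshape(1, -1)))      # approximately [0.3, 0.7]
```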
Testing for detailed balance in a financial market
NASA Astrophysics Data System (ADS)
Fiebig, H. R.; Musgrove, D. P.
2015-06-01
We test a historical price time series from a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to the usage in prevalent economic theory, the term equilibrium here is tied to the returns, rather than to the price time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set; S is then analyzed by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
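A drastically simplified version of the condition being tested: discretize returns into states, estimate the empirical transition matrix, and compare the probability flows in both directions. This is only the raw detailed-balance check, not the paper's action-functional-plus-annealing machinery, and the returns here are simulated:

```python
# Empirical detailed-balance check on a discretized return series:
# under detailed balance, pi_i * P[i, j] == pi_j * P[j, i] for all i, j.
import numpy as np

rng = np.random.default_rng(5)
returns = rng.normal(0, 1, 50_000)            # stand-in for index returns
states = np.digitize(returns, [-0.5, 0.5])    # 3 states: down / flat / up

counts = np.zeros((3, 3))
for i, j in zip(states[:-1], states[1:]):
    counts[i, j] += 1
P = counts / counts.sum(axis=1, keepdims=True)   # transition probabilities
pi = counts.sum(axis=1) / counts.sum()           # empirical occupation

flow = pi[:, None] * P
print(np.abs(flow - flow.T).max())               # ~0 under detailed balance
```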
Classical subjective expected utility.
Cerreia-Vioglio, Simone; Maccheroni, Fabio; Marinacci, Massimo; Montrucchio, Luigi
2013-04-23
We consider decision makers who know that payoff-relevant observations are generated by a process that belongs to a given class M, as postulated in Wald [Wald A (1950) Statistical Decision Functions (Wiley, New York)]. We incorporate this Waldean piece of objective information within an otherwise subjective setting à la Savage [Savage LJ (1954) The Foundations of Statistics (Wiley, New York)] and show that this leads to a two-stage subjective expected utility model that accounts for both state and model uncertainty.
Specificity and timescales of cortical adaptation as inferences about natural movie statistics.
Snow, Michoel; Coen-Cagli, Ruben; Schwartz, Odelia
2016-10-01
Adaptation is a phenomenological umbrella term under which a variety of temporal contextual effects are grouped. Previous models have shown that some aspects of visual adaptation reflect optimal processing of dynamic visual inputs, suggesting that adaptation should be tuned to the properties of natural visual inputs. However, the link between natural dynamic inputs and adaptation is poorly understood. Here, we extend a previously developed Bayesian modeling framework for spatial contextual effects to the temporal domain. The model learns temporal statistical regularities of natural movies and links these statistics to adaptation in primary visual cortex via divisive normalization, a ubiquitous neural computation. In particular, the model divisively normalizes the present visual input by the past visual inputs only to the degree that these are inferred to be statistically dependent. We show that this flexible form of normalization reproduces classical findings on how brief adaptation affects neuronal selectivity. Furthermore, prior knowledge acquired by the Bayesian model from natural movies can be modified by prolonged exposure to novel visual stimuli. We show that this updating can explain classical results on contrast adaptation. We also simulate the recent finding that adaptation maintains population homeostasis, namely, a balanced level of activity across a population of neurons with different orientation preferences. Consistent with previous disparate observations, our work further clarifies the influence of stimulus-specific and neuronal-specific normalization signals in adaptation.
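The core computation, dividing the present drive by a history-dependent normalization pool, fits in a few lines. In this toy version a fixed constant rho stands in for the model's inferred statistical dependence between past and present inputs:

```python
# Toy divisive normalization by stimulus history (adaptation).
import numpy as np

def normalized_response(drive_now, drive_past, rho, sigma=1.0):
    """Divide the present drive by recent history.

    rho in [0, 1]: degree to which past inputs are treated as statistically
    dependent on the present one (the Bayesian model infers this).
    """
    pool = sigma ** 2 + rho * np.mean(np.square(drive_past))
    return np.square(drive_now) / pool

past = np.array([2.0, 2.5, 1.8])                 # strong recent adaptor
print(normalized_response(2.0, past, rho=0.0))   # no inferred dependence
print(normalized_response(2.0, past, rho=1.0))   # full normalization: adapted
```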
Quantum walks with tuneable self-avoidance in one dimension
Camilleri, Elizabeth; Rohde, Peter P.; Twamley, Jason
2014-01-01
Quantum walks exhibit many unique characteristics compared to classical random walks. In the classical setting, self-avoiding random walks have been studied as a variation on the usual classical random walk. Here the walker has memory of its previous locations and preferentially avoids stepping back to locations where it has previously resided. Classical self-avoiding random walks have found numerous algorithmic applications, most notably in the modelling of protein folding. We consider the analogous problem in the quantum setting – a quantum walk in one dimension with tunable levels of self-avoidance. We complement a quantum walk with a memory register that records where the walker has previously resided. The walker is then able to avoid returning back to previously visited sites or apply more general memory conditioned operations to control the walk. We characterise this walk by examining the variance of the walker's distribution against time, the standard metric for quantifying how quantum or classical a walk is. We parameterise the strength of the memory recording and the strength of the memory back-action on the walker, and investigate their effect on the dynamics of the walk. We find that by manipulating these parameters, which dictate the degree of self-avoidance, the walk can be made to reproduce ideal quantum or classical random walk statistics, or a plethora of more elaborate diffusive phenomena. In some parameter regimes we observe a close correspondence between classical self-avoiding random walks and the quantum self-avoiding walk. PMID:24762398
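The diagnostic named above, position variance against time, separates the two limits cleanly. This sketch simulates an ideal discrete-time Hadamard walk and compares it with the classical benchmark; the memory register and self-avoidance are omitted, so it reproduces only the two limiting statistics the authors tune between:

```python
# Ballistic (quantum) vs. diffusive (classical) variance scaling in 1D.
import numpy as np

steps, size = 100, 256
center = size // 2
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard coin

psi = np.zeros((size, 2), dtype=complex)
psi[center] = [1 / np.sqrt(2), 1j / np.sqrt(2)]  # symmetric initial coin state

for _ in range(steps):
    psi = psi @ H.T                              # coin toss at every site
    shifted = np.zeros_like(psi)
    shifted[1:, 0] = psi[:-1, 0]                 # coin 0 steps right
    shifted[:-1, 1] = psi[1:, 1]                 # coin 1 steps left
    psi = shifted

x = np.arange(size) - center
p = (np.abs(psi) ** 2).sum(axis=1)
var_q = (p * x**2).sum() - (p * x).sum() ** 2
print(f"quantum variance after {steps} steps: {var_q:.0f} (ballistic, ~t^2)")
print(f"classical variance after {steps} steps: {steps} (diffusive, = t)")
```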
Fractional Transport in Strongly Turbulent Plasmas.
Isliker, Heinz; Vlahos, Loukas; Constantinescu, Dana
2017-07-28
We analyze statistically the energization of particles in a large-scale environment of strong turbulence that is fragmented into a large number of distributed current filaments. The turbulent environment is generated through strongly perturbed, 3D, resistive magnetohydrodynamics simulations, and it emerges naturally from the nonlinear evolution, without a specific reconnection geometry being set up. Based on test-particle simulations, we estimate the transport coefficients in energy space for use in the classical Fokker-Planck (FP) equation, and we show that the latter fails to reproduce the simulation results. The reason is that transport in energy space is highly anomalous (strange): the particles perform Lévy flights, and the energy distributions show extended power-law tails. We then motivate the use of a fractional transport equation (FTE) and derive its specific form, determine its parameters and the order of the fractional derivatives from the simulation data, and show that the FTE is able to reproduce the high-energy part of the simulation data very well. The procedure for determining the FTE parameters also makes clear that it is the analysis of the simulation data that allows us to decide whether a classical FP equation or an FTE is appropriate.
Photoacoustic discrimination of vascular and pigmented lesions using classical and Bayesian methods
NASA Astrophysics Data System (ADS)
Swearingen, Jennifer A.; Holan, Scott H.; Feldman, Mary M.; Viator, John A.
2010-01-01
Discrimination of pigmented and vascular lesions in skin can be difficult due to factors such as size, subungual location, and the nature of lesions containing both melanin and vascularity. Misdiagnosis may lead to precancerous or cancerous lesions not receiving proper medical care. To aid in the rapid and accurate diagnosis of such pathologies, we develop a photoacoustic system to determine the nature of skin lesions in vivo. By irradiating skin with two laser wavelengths, 422 and 530 nm, we induce photoacoustic responses, and the relative response at these two wavelengths indicates whether the lesion is pigmented or vascular. This response is due to the distinct absorption spectrum of melanin and hemoglobin. In particular, pigmented lesions have ratios of photoacoustic amplitudes of approximately 1.4 to 1 at the two wavelengths, while vascular lesions have ratios of about 4.0 to 1. Furthermore, we consider two statistical methods for conducting classification of lesions: standard multivariate analysis classification techniques and a Bayesian-model-based approach. We study 15 human subjects with eight vascular and seven pigmented lesions. Using the classical method, we achieve a perfect classification rate, while the Bayesian approach has an error rate of 20%.
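A minimal illustration of the classification step on the single ratio feature, using linear discriminant analysis as one standard multivariate technique; the class means follow the ratios quoted in the abstract (about 1.4 for pigmented, 4.0 for vascular), but the individual samples and spreads are invented:

```python
# LDA on simulated photoacoustic amplitude ratios (422 nm / 530 nm).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(6)
pigmented = rng.normal(1.4, 0.3, 7)
vascular = rng.normal(4.0, 0.6, 8)

X = np.concatenate([pigmented, vascular]).reshape(-1, 1)
y = np.array([0] * 7 + [1] * 8)          # 0 = pigmented, 1 = vascular

clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.predict([[1.5], [3.8]]))       # -> [0 1]
```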
Interacting scales and energy transfer in isotropic turbulence
NASA Technical Reports Server (NTRS)
Zhou, YE
1993-01-01
The dependence of the energy transfer process on the disparity of the interacting scales is investigated in the inertial and far-dissipation ranges of isotropic turbulence. The strategy for generating the simulated flow fields and the choice of a disparity parameter to characterize the scaling of the interactions are discussed. The inertial range is found to be dominated by relatively local interactions, in agreement with the Kolmogorov assumption. The far-dissipation range is found to be dominated by relatively non-local interactions, supporting the classical notion that the far-dissipation range is slaved to the Kolmogorov scales. The measured energy transfer is compared with the classical models of Heisenberg and Obukhov, and with the more detailed analysis of Tennekes and Lumley. The energy transfer statistics measured in the numerically simulated flows are found to be nearly self-similar for wave numbers in the inertial range. Using the self-similar form measured within the limited scale range of the simulation, an 'ideal' energy transfer function and the corresponding energy flux rate for an inertial range of infinite extent are constructed. From this flux rate, the Kolmogorov constant is calculated to be 1.5, in excellent agreement with experiments.
The complexity of classical music networks
NASA Astrophysics Data System (ADS)
Rolla, Vitor; Kestenberg, Juliano; Velho, Luiz
2018-02-01
Previous works suggest that musical networks often present the scale-free and the small-world properties. From a musician's perspective, the most important aspect missing in those studies was harmony. In addition to that, the previous works made use of outdated statistical methods. Traditionally, least-squares linear regression is utilised to fit a power law to a given data set. However, according to Clauset et al. such a traditional method can produce inaccurate estimates for the power law exponent. In this paper, we present an analysis of musical networks which considers the existence of chords (an essential element of harmony). Here we show that only 52.5% of music in our database presents the scale-free property, while 62.5% of those pieces present the small-world property. Previous works argue that music is highly scale-free; consequently, it sounds appealing and coherent. In contrast, our results show that not all pieces of music present the scale-free and the small-world properties. In summary, this research is focused on the relationship between musical notes (Do, Re, Mi, Fa, Sol, La, Si, and their sharps) and accompaniment in classical music compositions. More information about this research project is available at https://eden.dei.uc.pt/~vitorgr/MS.html.
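The "outdated" method criticized here is least-squares fitting on a log-log plot; the Clauset et al. alternative is maximum likelihood. For continuous data above a fixed xmin, the MLE has the closed form alpha = 1 + n / sum(ln(x_i / xmin)), sketched below on synthetic Pareto data:

```python
# Continuous power-law exponent via the Clauset et al. MLE (fixed xmin).
import numpy as np

def powerlaw_alpha_mle(x, xmin):
    tail = np.asarray(x, dtype=float)
    tail = tail[tail >= xmin]
    return 1.0 + len(tail) / np.log(tail / xmin).sum()

rng = np.random.default_rng(7)
# Inverse-CDF sampling from a Pareto density p(x) ~ x^(-2.5), x >= 1.
samples = (1.0 - rng.random(5000)) ** (-1.0 / (2.5 - 1.0))
print(powerlaw_alpha_mle(samples, xmin=1.0))   # close to the true 2.5
```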
Principal Curves on Riemannian Manifolds.
Hauberg, Soren
2016-09-01
Euclidean statistics are often generalized to Riemannian manifolds by replacing straight-line interpolations with geodesic ones. While these Riemannian models are familiar-looking, they are restricted by the inflexibility of geodesics, and they rely on constructions which are optimal only in Euclidean domains. We consider extensions of Principal Component Analysis (PCA) to Riemannian manifolds. Classic Riemannian approaches seek a geodesic curve passing through the mean that optimizes a criterion of interest. The requirements that the solution both is geodesic and must pass through the mean tend to imply that the methods only work well when the manifold is mostly flat within the support of the generating distribution. We argue that instead of generalizing linear Euclidean models, it is more fruitful to generalize non-linear Euclidean models. Specifically, we extend the classic Principal Curves of Hastie & Stuetzle to data residing on a complete Riemannian manifold. We show that for elliptical distributions in the tangent space of spaces of constant curvature, the standard principal geodesic is a principal curve. The proposed model is simple to compute and avoids many of the pitfalls of traditional geodesic approaches. We empirically demonstrate the effectiveness of the Riemannian principal curves on several manifolds and datasets.
2013-01-01
Background The synthesis of information across microarray studies has been performed by combining statistical results of individual studies (as in a mosaic), or by combining data from multiple studies into a large pool to be analyzed as a single data set (as in a melting pot of data). Specific issues relating to data heterogeneity across microarray studies, such as differences within and between labs or differences among experimental conditions, could lead to equivocal results in a melting pot approach. Results We applied statistical theory to determine the specific effect of different means and heteroskedasticity across 19 groups of microarray data on the sign and magnitude of gene-to-gene Pearson correlation coefficients obtained from the pool of 19 groups. We quantified the biases of the pooled coefficients and compared them to the biases of correlations estimated by an effect-size model. Mean differences across the 19 groups were the main factor determining the magnitude and sign of the pooled coefficients, which showed largest values of bias as they approached ±1. Only heteroskedasticity across the pool of 19 groups resulted in less efficient estimations of correlations than did a classical meta-analysis approach of combining correlation coefficients. These results were corroborated by simulation studies involving either mean differences or heteroskedasticity across a pool of N > 2 groups. Conclusions The combination of statistical results is best suited for synthesizing the correlation between expression profiles of a gene pair across several microarray studies. PMID:23822712
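The main finding, that mean differences across pooled groups dominate the sign and magnitude of the pooled correlation, is easy to reproduce in miniature. This toy simulation is not the paper's 19-group data set:

```python
# Two genes that are uncorrelated within each group appear strongly
# correlated once groups with different means are pooled ("melting pot").
import numpy as np

rng = np.random.default_rng(8)
g1, g2 = [], []
for mean in (0.0, 3.0):                    # group-specific mean expression
    a = rng.normal(mean, 1.0, 200)         # gene A, independent of gene B
    b = rng.normal(mean, 1.0, 200)
    print("within-group r:", round(np.corrcoef(a, b)[0, 1], 2))  # ~0
    g1.append(a); g2.append(b)

pooled_r = np.corrcoef(np.concatenate(g1), np.concatenate(g2))[0, 1]
print("pooled r:", round(pooled_r, 2))     # inflated by the mean shift alone
```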
Identifying Node Role in Social Network Based on Multiple Indicators
Huang, Shaobin; Lv, Tianyang; Zhang, Xizhe; Yang, Yange; Zheng, Weimin; Wen, Chao
2014-01-01
It is a classic topic of social network analysis to evaluate the importance of nodes and identify the node that takes on the role of core or bridge in a network. Because a single indicator is not sufficient to analyze multiple characteristics of a node, it is a natural solution to apply multiple indicators that should be selected carefully. An intuitive idea is to select some indicators with weak correlations to efficiently assess different characteristics of a node. However, this paper shows that it is much better to select the indicators with strong correlations. Because indicator correlation is based on the statistical analysis of a large number of nodes, the particularity of an important node will stand out if its indicator relationship does not comply with the statistical correlation. Therefore, the paper selects the multiple indicators of degree, ego-betweenness centrality and eigenvector centrality to evaluate the importance and the role of a node. The importance of a node is equal to the normalized sum of its three indicators. A candidate for core or bridge is selected from the high-degree nodes or the nodes with high ego-betweenness centrality, respectively. Then, the role of a candidate is determined according to the difference between its indicators' relationship and the statistical correlation of the overall network. Based on 18 real networks and 3 kinds of model networks, the experimental results show that the proposed methods perform quite well in evaluating the importance of nodes and in identifying the node role. PMID:25089823
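A minimal sketch of the three indicators and the normalized-sum importance score, using networkx and Zachary's karate club as a stand-in for a real social network (the paper's candidate-selection and role-assignment steps are omitted):

```python
# Degree, eigenvector and ego-betweenness centrality, combined into a
# normalized-sum importance score.
import networkx as nx
import numpy as np

G = nx.karate_club_graph()

degree = nx.degree_centrality(G)
eigen = nx.eigenvector_centrality(G)
# Ego-betweenness: betweenness of a node inside its own ego network.
ego_btw = {v: nx.betweenness_centrality(nx.ego_graph(G, v))[v] for v in G}

def normalized(d):
    vals = np.array(list(d.values()))
    return {k: (v - vals.min()) / (vals.max() - vals.min())
            for k, v in d.items()}

nd, ne, nb = (normalized(d) for d in (degree, eigen, ego_btw))
importance = {v: nd[v] + ne[v] + nb[v] for v in G}
print("most important node:", max(importance, key=importance.get))
```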
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wild, M.; Rouhani, S.
1995-02-01
A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. Geostatistics was developed by the mining industry to estimate ore concentrations, and the same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart; this can guide sampling strategies and help characterize complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
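The "accurate estimates at unsampled locations" come from kriging. Below is a bare-bones ordinary kriging sketch with an assumed exponential variogram (in practice the variogram model is fitted to the sample data); coordinates and concentrations are invented:

```python
# Ordinary kriging at one unsampled location, variogram form:
# solve [[Gamma, 1], [1^T, 0]] [w; mu] = [gamma0; 1], estimate = w . z
import numpy as np

def gamma(h, sill=1.0, rng_=50.0):           # exponential variogram model
    return sill * (1.0 - np.exp(-h / rng_))

pts = np.array([[10.0, 20.0], [40.0, 25.0], [30.0, 60.0], [70.0, 45.0]])
z = np.array([1.2, 2.3, 0.8, 1.9])           # contaminant concentrations
x0 = np.array([35.0, 35.0])                  # unsampled location

n = len(pts)
A = np.ones((n + 1, n + 1))
A[:n, :n] = gamma(np.linalg.norm(pts[:, None] - pts[None, :], axis=2))
A[n, n] = 0.0                                # Lagrange-multiplier row/column
b = np.append(gamma(np.linalg.norm(pts - x0, axis=1)), 1.0)

w = np.linalg.solve(A, b)                    # kriging weights + multiplier
print("estimate:", w[:n] @ z)
print("kriging variance:", w @ b)            # quantifies local uncertainty
```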
A hybrid method in combining treatment effects from matched and unmatched studies.
Byun, Jinyoung; Lai, Dejian; Luo, Sheng; Risser, Jan; Tung, Betty; Hardy, Robert J
2013-12-10
The most common data structures in biomedical studies have been matched or unmatched designs. Data structures resulting from a hybrid of the two may create challenges for statistical inference. The question may arise whether to use parametric or nonparametric methods on the hybrid data structure. The Early Treatment for Retinopathy of Prematurity study was a multicenter clinical trial sponsored by the National Eye Institute. The design produced data requiring a statistical method of a hybrid nature. An infant in this multicenter randomized clinical trial had high-risk prethreshold retinopathy of prematurity that was eligible for treatment in one or both eyes at entry into the trial. During follow-up, recognition visual acuity was assessed for both eyes. Data from both eyes (matched) and from only one eye (unmatched) were eligible to be used in the trial. The new hybrid nonparametric method is a meta-analysis based on combining the Hodges-Lehmann estimates of treatment effects from the Wilcoxon signed rank and rank sum tests. To compare the new method, we used the classic meta-analysis with the t-test method to combine estimates of treatment effects from the paired and two-sample t-tests. We used simulations to calculate the empirical size and power of the test statistics, as well as the bias, mean square error and confidence interval width of the corresponding estimators. The proposed method provides an effective tool to evaluate data from clinical trials and similar comparative studies. Copyright © 2013 John Wiley & Sons, Ltd.
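A sketch of the two Hodges-Lehmann estimators being combined: the median of Walsh averages of paired differences for the matched data, and the median of all pairwise differences for the unmatched data. The simple sample-size weighting at the end is an assumption for illustration, not the paper's exact combination scheme, and all values are simulated:

```python
# Hodges-Lehmann estimates from matched and unmatched data, then pooled.
import numpy as np

rng = np.random.default_rng(9)

# Matched part: a paired difference per infant (both eyes treated/followed).
diff = rng.normal(0.6, 1.0, 30)
i, j = np.triu_indices(len(diff))
hl_matched = np.median((diff[i] + diff[j]) / 2.0)       # Walsh averages

# Unmatched part: one eligible eye per infant, two independent groups.
treated = rng.normal(0.6, 1.0, 25)
control = rng.normal(0.0, 1.0, 20)
hl_unmatched = np.median((treated[:, None] - control[None, :]).ravel())

w_m, w_u = 30, 20                                       # crude precision proxies
combined = (w_m * hl_matched + w_u * hl_unmatched) / (w_m + w_u)
print(f"matched={hl_matched:.3f}, unmatched={hl_unmatched:.3f}, "
      f"combined={combined:.3f}")
```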
Modal analysis of a classical guitar
NASA Astrophysics Data System (ADS)
Cohen, David; Rossing, Thomas D.
2002-11-01
Using holographic interferometry, we have determined the modes of vibration of a classical guitar built by the first author, which has an asymmetrically braced top plate and a cross-braced back of unique design. The vibrational modes and acoustical properties are compared with those of other classical guitars.
Classical molecular dynamics simulations for non-equilibrium correlated plasmas
NASA Astrophysics Data System (ADS)
Ferri, S.; Calisti, A.; Talin, B.
2017-03-01
A classical molecular dynamics model was recently extended to simulate neutral multi-component plasmas where various charge states of the same atom and electrons coexist. It is used to investigate the plasma effects on the ion charge and on the ionization potential in dense plasmas. Different simulated statistical properties will show that the concept of isolated particles is lost in such correlated plasmas. The charge equilibration is discussed for a carbon plasma at solid density and investigation on the charge distribution and on the ionization potential depression (IPD) for aluminum plasmas is discussed with reference to existing experiments.
Signatures of chaos in the Brillouin zone.
Barr, Aaron; Barr, Ariel; Porter, Max D; Reichl, Linda E
2017-10-01
When the classical dynamics of a particle in a finite two-dimensional billiard undergoes a transition to chaos, the quantum dynamics of the particle also shows manifestations of chaos in the form of scarring of wave functions and changes in energy level spacing distributions. If we "tile" an infinite plane with such billiards, we find that the Bloch states on the lattice undergo avoided crossings, energy level spacing statistics change from Poisson-like to Wigner-like, and energy sheets of the Brillouin zone begin to "mix" as the classical dynamics of the billiard changes from regular to chaotic behavior.
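The level-spacing diagnostic mentioned here can be illustrated quickly: nearest-neighbour spacings of a random GOE matrix (the chaotic limit, Wigner-like repulsion) against independent levels (the regular limit, Poisson-like). The unfolding below is deliberately crude:

```python
# Wigner-like level repulsion vs. Poisson statistics at small spacings.
import numpy as np

rng = np.random.default_rng(10)
n = 2000
M = rng.normal(size=(n, n))
levels = np.sort(np.linalg.eigvalsh((M + M.T) / np.sqrt(2 * n)))

bulk = np.diff(levels[n // 4: 3 * n // 4])    # central part of the spectrum
bulk /= bulk.mean()                           # crude unfolding
poisson = np.diff(np.sort(rng.random(n)))
poisson /= poisson.mean()

# Wigner-like statistics suppress small spacings; Poisson ones do not.
print("P(s < 0.1):  GOE ~", np.mean(bulk < 0.1),
      "  Poisson ~", np.mean(poisson < 0.1))
```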
Use of FTA® classic cards for epigenetic analysis of sperm DNA.
Serra, Olga; Frazzi, Raffaele; Perotti, Alessio; Barusi, Lorenzo; Buschini, Annamaria
2018-02-01
FTA® technologies provide the most reliable method for DNA extraction. Although FTA technologies have been widely used for genetic analysis, there is no literature on their use for epigenetic analysis yet. We present for the first time, a simple method for quantitative methylation assessment based on sperm cells stored on Whatman FTA classic cards. Specifically, elution of seminal DNA from FTA classic cards was successfully tested with an elution buffer and an incubation step in a thermocycler. The eluted DNA was bisulfite converted, amplified by PCR, and a region of interest was pyrosequenced.
Quasi-Static Analysis of Round LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of round LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
Quasi-Static Analysis of LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
ASSESSING THE IMPACTS OF ANTHROPOGENIC STRESSORS ON MACROINVERTEBRATE INDICATORS IN OHIO
In the past few years, there has been increasing interest in using biological community data to provide information about specific anthropogenic factors impacting streams. Previous studies have used statistical approaches that are variants of classical and modern multiple regres...
Performance Characterization of an Instrument.
ERIC Educational Resources Information Center
Salin, Eric D.
1984-01-01
Describes an experiment designed to teach students to apply the same statistical awareness to instrumentation they commonly apply to classical techniques. Uses propagation of error techniques to pinpoint instrumental limitations and breakdowns and to demonstrate capabilities and limitations of volumetric and gravimetric methods. Provides lists of…
Müller, Hans-Peter; Agosta, Federica; Riva, Nilo; Spinelli, Edoardo G; Comi, Giancarlo; Ludolph, Albert C; Filippi, Massimo; Kassubek, Jan
2018-01-01
The criteria for assessing upper motor neuron pathology in pure lower motor neuron disease (LMND) still remain a major issue of debate with respect to the clinical classification as an amyotrophic lateral sclerosis (ALS) variant. The study was designed to investigate white matter damage by a hypothesis-guided, tract-of-interest-based approach in patients with LMND compared with healthy controls and 'classical' ALS patients, in order to identify in vivo brain structural changes according to the neuropathologically defined ALS affectation pattern. Data were pooled from two previous studies at two different study sites (Ulm, Germany and Milano, Italy). DTI-based white matter integrity mapping was performed by voxelwise statistical comparison and by a tractwise analysis of fractional anisotropy (FA) maps according to the ALS-staging pattern for 65 LMND patients (clinically differentiated into fast and slow progressors) vs. 92 matched controls and 101 ALS patients with a 'classical' phenotype to identify white matter structural alterations. The analysis of white matter structural connectivity by regional FA reductions demonstrated the characteristic alteration patterns along the corticospinal tract (CST) and also in frontal and prefrontal brain areas in LMND patients compared to controls and ALS. Fast-progressing LMND showed substantial involvement, as in ALS, while slow progressors showed less severe alterations. In the tract-specific analysis according to the ALS-staging pattern, fast-progressing LMND showed significant alterations of ALS-related tract systems as compared to slow progressors and controls. This study showed an affectation pattern of corticoefferent fibers in LMND with fast disease progression as defined for ALS, thereby confirming the hypothesis that fast-progressing LMND is a phenotypical variant of ALS.
Bayesian change-point analysis reveals developmental change in a classic theory of mind task.
Baker, Sara T; Leslie, Alan M; Gallistel, C R; Hood, Bruce M
2016-12-01
Although learning and development reflect changes situated in an individual brain, most discussions of behavioral change are based on the evidence of group averages. Our reliance on group-averaged data creates a dilemma. On the one hand, we need to use traditional inferential statistics. On the other hand, group averages are highly ambiguous when we need to understand change in the individual; the average pattern of change may characterize all, some, or none of the individuals in the group. Here we present a new method for statistically characterizing developmental change in each individual child we study. Using false-belief tasks, fifty-two children in two cohorts were repeatedly tested for varying lengths of time between 3 and 5 years of age. Using a novel Bayesian change point analysis, we determined both the presence and, just as importantly, the absence of change in individual longitudinal cumulative records. Whenever the analysis supports a change conclusion, it identifies in that child's record the most likely point at which change occurred. Results show striking variability in patterns of change and stability across individual children. We then group the individuals by their various patterns of change or no change. The resulting patterns provide scarce support for sudden changes in competence and shed new light on the concepts of "passing" and "failing" in developmental studies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
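A minimal conjugate version of the idea: score every candidate change point in one child's pass/fail record by a Beta-Binomial marginal likelihood, and compare against a no-change model. This is an illustrative skeleton with invented outcomes, not the authors' analysis:

```python
# Bayesian change-point detection on a single Bernoulli (pass/fail) record.
import numpy as np
from scipy.special import betaln

outcomes = np.array([0, 0, 1, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1])

def log_ml(x):
    """Beta(1,1)-Binomial marginal likelihood of one segment."""
    k, n = x.sum(), len(x)
    return betaln(k + 1, n - k + 1) - betaln(1, 1)

log_no_change = log_ml(outcomes)
log_cp = np.array([log_ml(outcomes[:t]) + log_ml(outcomes[t:])
                   for t in range(1, len(outcomes))])
post = np.exp(log_cp - log_cp.max()); post /= post.sum()

print("most likely change point:", 1 + post.argmax())
# Bayes factor: change model (uniform prior over change points) vs. none.
print("log Bayes factor:",
      np.logaddexp.reduce(log_cp) - np.log(len(log_cp)) - log_no_change)
```

A positive log Bayes factor supports a change; a clearly negative one supports the stability conclusion the authors emphasize as equally informative.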
FTree query construction for virtual screening: a statistical analysis.
Gerlach, Christof; Broughton, Howard; Zaliani, Andrea
2008-02-01
FTrees (FT) is a known chemoinformatic tool able to condense molecular descriptions into a graph object and to search for actives in large databases using graph similarity. The query graph is classically derived from a known active molecule, or a set of actives, for which a similar compound has to be found. Recently, FT similarity has been extended to fragment space, widening its capabilities. If a user were able to build a knowledge-based FT query from information other than a known active structure, the similarity search could be combined with other, normally separate, fields like de-novo design or pharmacophore searches. With this aim in mind, we performed a comprehensive analysis of several databases in terms of FT description and provide a basic statistical analysis of the FT spaces so far at hand. Vendors' catalogue collections and MDDR as a source of potential or known "actives", respectively, have been used. With the results reported herein, a set of ranges, mean values and standard deviations for several query parameters are presented in order to set a reference guide for the users. Applications on how to use this information in FT query building are also provided, using a newly built 3D-pharmacophore from 57 5HT-1F agonists and a published one which was used for virtual screening for tRNA-guanine transglycosylase (TGT) inhibitors.
Statistical inference for noisy nonlinear ecological dynamic systems.
Wood, Simon N
2010-08-26
Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
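A skeleton of the synthetic-likelihood recipe on a noisy Ricker map: reduce each simulated series to phase-insensitive summary statistics, fit a Gaussian to their simulated distribution, and score the observed statistics. The statistic set, noise levels, and parameter values are simplified choices for the sketch:

```python
# Synthetic likelihood for a stochastic Ricker model (simplified).
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(11)

def simulate(log_r, n=100):
    x, out = 1.0, np.empty(n)
    for t in range(n):
        x = x * np.exp(log_r - x + rng.normal(0, 0.3))  # process noise
        out[t] = rng.poisson(10 * x)                    # observation error
    return out

def summaries(y):   # phase-insensitive summaries (deliberately small set)
    return np.array([y.mean(), y.std(), np.corrcoef(y[:-1], y[1:])[0, 1]])

s_obs = summaries(simulate(log_r=3.8))      # "observed" data

def synthetic_loglik(log_r, reps=200):
    S = np.array([summaries(simulate(log_r)) for _ in range(reps)])
    return multivariate_normal(S.mean(0), np.cov(S.T)).logpdf(s_obs)

for lr in (3.0, 3.8, 4.5):
    print(lr, round(synthetic_loglik(lr), 1))  # peaks near the true value
```

In the full method this synthetic likelihood is explored with an MCMC sampler over the model parameters rather than on a small grid.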
Validation of Bayesian analysis of compartmental kinetic models in medical imaging.
Sitek, Arkadiusz; Li, Quanzheng; El Fakhri, Georges; Alpert, Nathaniel M
2016-10-01
Kinetic compartmental analysis is frequently used to compute physiologically relevant quantitative values from time series of images. In this paper, a new approach based on Bayesian analysis to obtain information about these parameters is presented and validated. The closed form of the posterior distribution of kinetic parameters is derived with a hierarchical prior to model the standard deviation of normally distributed noise. Markov chain Monte Carlo methods are used for numerical estimation of the posterior distribution. Computer simulations of the kinetics of F18-fluorodeoxyglucose (FDG) are used to demonstrate drawing statistical inferences about kinetic parameters and to validate the theory and implementation. Additionally, point estimates of kinetic parameters and the covariance of those estimates are determined using the classical non-linear least squares approach. Posteriors obtained using the methods proposed in this work are accurate, as no significant deviation from the expected shape of the posterior was found (one-sided P > 0.08). It is demonstrated that the results obtained by the standard non-linear least-squares methods fail to provide accurate estimation of uncertainty for the same data set (P < 0.0001). The results of this work validate the new methods using computer simulations of FDG kinetics. They show that in situations where the classical approach fails to estimate uncertainty accurately, Bayesian estimation provides accurate information about the uncertainties in the parameters. Although a particular example of FDG kinetics was used in the paper, the methods can be extended for different pharmaceuticals and imaging modalities. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Salas, Desirée; Le Gall, Antoine; Fiche, Jean-Bernard; Valeri, Alessandro; Ke, Yonggang; Bron, Patrick; Bellot, Gaetan
2017-01-01
Superresolution light microscopy allows the imaging of labeled supramolecular assemblies at a resolution surpassing the classical diffraction limit. A serious limitation of the superresolution approach is sample heterogeneity and the stochastic character of the labeling procedure. To increase the reproducibility and the resolution of the superresolution results, we apply multivariate statistical analysis methods and 3D reconstruction approaches originally developed for cryogenic electron microscopy of single particles. These methods allow for the reference-free 3D reconstruction of nanomolecular structures from two-dimensional superresolution projection images. Since these 2D projection images all show the structure in high-resolution directions of the optical microscope, the resulting 3D reconstructions have the best possible isotropic resolution in all directions. PMID:28811371
Chronological analysis of architectural and acoustical indices in music performance halls.
Kwon, Youngmin; Siebein, Gary W
2007-05-01
This study aims to identify the changes in architectural and acoustical indices in halls for music performance built in the 18th through the 20th Centuries. Seventy-one halls are classified in five specific periods from the Classical Period (1751-1820) to the Contemporary Period (1981-2000) based on chronology in music and architectural acoustics. Architectural indices such as room shape, seating capacity, room volume, balcony configuration, and the like as well as acoustical indices such as RT, EDT, G, C80, IACC, and the like for the halls found in the literature are chronologically tabulated and statistically analyzed to identify trends and relationships in architectural and acoustical design for each of the historical periods identified. Some indices appear correlated with each other.
Parametric interactions in presence of different size colloids in semiconductor quantum plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanshpal, R., E-mail: ravivanshpal@gmail.com; Sharma, Uttam; Dubey, Swati
2015-07-31
The present work is an attempt to investigate the effect of colloids of different sizes on parametric interaction in semiconductor quantum plasma. The quantum effect is included in this analysis through a quantum correction term in the classical hydrodynamic model of homogeneous semiconductor plasma. The effect is of purely quantum origin, arising through the quantum Bohm potential and quantum statistics. The colloidal size and the quantum correction term modify the parametric dispersion characteristics of the ion-implanted semiconductor plasma medium. It is found that the quantum effect on colloids is inversely proportional to their size. Moreover, the critical size of implanted colloids for effective quantum correction is determined and found to be equal to the lattice spacing of the crystal.
Frenetic Bounds on the Entropy Production
NASA Astrophysics Data System (ADS)
Maes, Christian
2017-10-01
We give a systematic derivation of positive lower bounds for the expected entropy production (EP) rate in classical statistical mechanical systems obeying a dynamical large deviation principle. The logic is the same for the return to thermodynamic equilibrium as it is for steady nonequilibria working under the condition of local detailed balance. We recover there recently studied "uncertainty" relations for the EP, appearing in studies about the effectiveness of mesoscopic machines. In general our refinement of the positivity of the expected EP rate is obtained in terms of a positive and even function of the expected current(s) which measures the dynamical activity in the system, a time-symmetric estimate of the changes in the system's configuration. Also underdamped diffusions can be included in the analysis.
Bludau, Sebastian; Bzdok, Danilo; Gruber, Oliver; Kohn, Nils; Riedl, Valentin; Sorg, Christian; Palomero-Gallagher, Nicola; Müller, Veronika I.; Hoffstaedter, Felix; Amunts, Katrin; Eickhoff, Simon B.
2017-01-01
Objective The heterogeneous human frontal pole has been identified as a node in the dysfunctional network of major depressive disorder. The contribution of the medial (socio-affective) versus lateral (cognitive) frontal pole to major depression pathogenesis is currently unclear. The present study performs morphometric comparison of the microstructurally informed subdivisions of human frontal pole between depressed patients and controls using both uni- and multivariate statistics. Methods Multi-site voxel- and region-based morphometric MRI analysis of 73 depressed patients and 73 matched controls without psychiatric history. Frontal pole volume was first compared between depressed patients and controls by subdivision-wise classical morphometric analysis. In a second approach, frontal pole volume was compared by subdivision-naive multivariate searchlight analysis based on support vector machines. Results Subdivision-wise morphometric analysis found a significantly smaller medial frontal pole in depressed patients with a negative correlation of disease severity and duration. Histologically uninformed multivariate voxel-wise statistics provided converging evidence for structural aberrations specific to the microstructurally defined medial area of the frontal pole in depressed patients. Conclusions Across disparate methods, we demonstrated subregion specificity in the left medial frontal pole volume in depressed patients. Indeed, the frontal pole was shown to structurally and functionally connect to other key regions in major depression pathology like the anterior cingulate cortex and the amygdala via the uncinate fasciculus. Present and previous findings consolidate the left medial portion of the frontal pole as particularly altered in major depression. PMID:26621569
An A Priori Multiobjective Optimization Model of a Search and Rescue Network
1992-03-01
sequences. Classical sensitivity analysis and tolerance analysis were used to analyze the frequency assignments generated by the different weight...function for excess coverage of a frequency. Sensitivity analysis is used to investigate the robustness of the frequency assignments produced by the...interest. The linear program solution is used to produce classical sensitivity analysis for the weight ranges. 17 III. Model Formulation This chapter
Applications of modern statistical methods to analysis of data in physical science
NASA Astrophysics Data System (ADS)
Wicker, James Eric
Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plagues this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcomes the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic algorithm based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We will showcase how these algorithms can be used to process multivariate data from astronomical observations.
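As a concrete illustration of the information-scoring alternative to stepwise selection described above, here is a minimal Python sketch (all names and data are hypothetical, not taken from the dissertation) that scores every small predictor subset with the Akaike information criterion rather than relying on stepwise entry/exit thresholds:

```python
import numpy as np
from itertools import combinations

def rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def aic(rss_val, n, k):
    """Gaussian AIC: n*log(RSS/n) + 2k, with k the number of fitted parameters."""
    return n * np.log(rss_val / n) + 2 * k

def best_subset(X, y, max_size=3):
    """Score every predictor subset by AIC; no entry/exit thresholds and
    no dependence on the order in which variables are considered."""
    n, p = X.shape
    best_score, best_cols = np.inf, ()
    for size in range(1, max_size + 1):
        for cols in combinations(range(p), size):
            design = np.column_stack([np.ones(n), X[:, list(cols)]])
            score = aic(rss(design, y), n, size + 1)
            if score < best_score:
                best_score, best_cols = score, cols
    return best_score, best_cols

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 6))
y = 2.0 * X[:, 1] - 1.0 * X[:, 4] + rng.normal(size=80)
print(best_subset(X, y))  # expected to select columns (1, 4)
```

Because every subset is scored on a common footing, the order in which variables are examined no longer influences the selected model, which addresses the stepwise criticisms raised above (at exponential cost, so the exhaustive search only suits modest numbers of candidate terms).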
NASA Astrophysics Data System (ADS)
Eltom, Hassan A.; Abdullatif, Osman M.; Makkawi, Mohammed H.; Eltoum, Isam-Eldin A.
2017-03-01
The interpretation of depositional environments provides important information for understanding facies distribution and geometry. The classical approach to interpreting depositional environments relies principally on the analysis of lithofacies, biofacies and stratigraphic data, among others. An alternative method, based on geochemical data (chemical element data), is advantageous because it can simply, reproducibly and efficiently refine the interpretation of the depositional environment of carbonate strata. Here we geochemically analyze and statistically model carbonate samples (n = 156) from seven sections of the Arab-D reservoir outcrop analog of central Saudi Arabia, to determine whether the elemental signatures (major, trace and rare earth elements [REEs]) can be effectively used to predict depositional environments. We find that lithofacies associations of the studied outcrop (peritidal to open marine depositional environments) possess altered REE signatures, and that this trend increases stratigraphically from bottom to top, corresponding to an upward shallowing of depositional environments. The relationship between REEs and major, minor and trace elements indicates that contamination by detrital materials is the principal source of REEs, whereas redox conditions and marine and diagenetic processes have minimal impact on the relative distribution of REEs in the lithofacies. In a statistical model (factor analysis and logistic regression), REEs, major and trace elements cluster together and serve as markers to differentiate between peritidal and open marine facies, and to differentiate between intertidal and subtidal lithofacies within the peritidal facies. The results indicate that statistical modelling of the elemental composition of carbonate strata can be used as a quantitative method to predict depositional environments and regional paleogeography. The significance of this study lies in offering new assessments of the relationships between lithofacies and geochemical elements by using advanced statistical analysis, a method that could be used elsewhere to interpret depositional environments and refine facies models.
Analytic Methods for Adjusting Subjective Rating Schemes.
ERIC Educational Resources Information Center
Cooper, Richard V. L.; Nelson, Gary R.
Statistical and econometric techniques of correcting for supervisor bias in models of individual performance appraisal were developed, using a variant of the classical linear regression model. Location bias occurs when individual performance is systematically overestimated or underestimated, while scale bias results when raters either exaggerate…
Polymer Principles in the Undergraduate Physical Chemistry Course. Part 2.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1985
1985-01-01
Part 1 (SE 538 305) covered the application of classical thermodynamics, polymer crystallinity, and phase diagrams to teaching physical chemistry. This part covers statistical thermodynamics, conformation, molecular weights, rubber elasticity and viscoelasticity, and kinetics of polymerization. Eight polymer-oriented, multiple-choice test questions…
Semi-classical statistical description of Fröhlich condensation.
Preto, Jordane
2017-06-01
Fröhlich's model equations describing phonon condensation in open systems of biological relevance are reinvestigated within a semi-classical statistical framework. The main assumptions needed to deduce Fröhlich's rate equations are identified, and it is shown how they lead to an appropriate form of the corresponding master equation. It is shown how solutions of the master equation can be computed numerically and can highlight typical features of the condensation effect. Our approach provides much more information than existing ones, as it allows one to investigate the time evolution of the probability density function instead of following single averaged quantities. The current work is also motivated, on the one hand, by recent experimental evidence of long-lived excited modes in the protein structure of hen egg-white lysozyme, which were reported as a consequence of the condensation effect, and, on the other hand, by a growing interest in investigating long-range effects of electromagnetic origin and their influence on the dynamics of biochemical reactions.
NASA Astrophysics Data System (ADS)
Shenker, Orly R.
2004-09-01
In 1867, James Clerk Maxwell proposed a perpetuum mobile of the second kind, that is, a counterexample to the Second Law of thermodynamics, which came to be known as "Maxwell's Demon." Unlike any other perpetual motion machine, this one escaped attempts by the best scientists and philosophers to show that the Second Law or its statistical mechanical counterparts are universal after all. "Maxwell's demon lives on. After more than 130 years of uncertain life and at least two pronouncements of death, this fanciful character seems more vibrant than ever." These words of Harvey Leff and Andrew Rex (1990), which open their introduction to Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing (hereafter MD2), are very true: the Demon is as challenging and as intriguing as ever, and forces us to think and rethink about the foundations of thermodynamics and of statistical mechanics.
Comparison of Iranian National Medical Library with digital libraries of selected countries.
Zare-Farashbandi, Firoozeh; Najafi, Nayere Sadat Soleimanzade; Atashpour, Bahare
2014-01-01
The important role of information and communication technologies and their influence on methods of storing and retrieving information in digital libraries have not only changed the meaning of classic library activities but have also created great changes in their services. However, it seems that not all digital libraries provide their users with similar services, and only some of them are successful in fulfilling their role in the digital environment. The Iranian National Medical Library is among those that appear to fall short compared to other digital libraries around the world. By knowing the different services provided by digital libraries worldwide, one can evaluate the services provided by the Iranian National Medical Library. The goal of this study is a comparison between the Iranian National Medical Library and digital libraries of selected countries. This is an applied study and uses a descriptive-survey method. The statistical population is the digital libraries around the world which were actively providing library services between October and December 2011 and were selected by using the keyword "Digital Library" in the Google search engine. The data-gathering tool was direct access to the websites of these digital libraries. The statistical study is descriptive, and Excel software was used for data analysis and plotting of the charts. The findings showed that among the 33 digital libraries investigated worldwide, most provided Browse (87.87%), Search (84.84%), and Electronic information retrieval (57.57%) services. "Help" in public services (48.48%) and "Interlibrary Loan" in traditional services (27.27%) had the highest frequency. The Iranian National Medical Library provides more digital services compared to other libraries but has fewer classic and public services, offering less than half of the possible public services. Other than the Iranian National Medical Library, among the 33 libraries investigated, the leaders in providing different services are the Library of the University of California in classic services, the Countway Library of Medicine in digital services, and the Library of Finland in public services. The results of this study show that among the digital libraries investigated, most provided similar public, digital, and classic services, and the Iranian National Medical Library has been somewhat successful in providing these services compared to other digital libraries. One can also conclude that the differences in services are at least in part due to differences in environments, information needs, and users. The Iranian National Medical Library has been somewhat successful in providing library services in the digital environment and needs to identify the services which are valuable to its users by identifying the users' needs and the special characteristics of its environment.
Strongly magnetized classical plasma models
NASA Technical Reports Server (NTRS)
Montgomery, D. C.
1972-01-01
The class of plasma processes for which the so-called Vlasov approximation is inadequate is investigated. Results from the equilibrium statistical mechanics of two-dimensional plasmas are derived. These results are independent of the presence of an external dc magnetic field. The nonequilibrium statistical mechanics of the electrostatic guiding-center plasma, a two-dimensional plasma model, is discussed. This model is then generalized to three dimensions. The guiding-center model is relaxed to include finite Larmor radius effects for a two-dimensional plasma.
Classical statistical mechanics approach to multipartite entanglement
NASA Astrophysics Data System (ADS)
Facchi, P.; Florio, G.; Marzolino, U.; Parisi, G.; Pascazio, S.
2010-06-01
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over balanced bipartitions. We search for maximally multipartite entangled states, whose average purity is minimal, and recast this optimization problem into a problem of statistical mechanics, by introducing a cost function, a fictitious temperature and a partition function. By investigating the high-temperature expansion, we obtain the first three moments of the distribution. We find that the problem exhibits frustration.
Dimensionally regularized Tsallis' statistical mechanics and two-body Newton's gravitation
NASA Astrophysics Data System (ADS)
Zamora, J. D.; Rocca, M. C.; Plastino, A.; Ferri, G. L.
2018-05-01
Typical quantifiers of Tsallis' statistical mechanics, such as the partition function Z and the mean energy ⟨U⟩, exhibit poles. The poles appear at distinctive values of Tsallis' characteristic real parameter q, a numerable set of rational numbers on the q-line. These poles are dealt with using dimensional regularization resources. The physical effects of these poles on the specific heats are studied here for the two-body classical gravitational potential.
de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique
2016-10-01
The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and the Rasch model in terms of bias, control of the type I error, and power of the test of time effect. The type I error was controlled for both classical test theory and the Rasch model whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the interest of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.
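To see concretely why Rasch-family models accommodate intermittent missing items, consider the dichotomous Rasch item response function: the response-vector likelihood simply omits missing entries, so no imputation is needed. A minimal sketch (illustrative only, not the authors' simulation code):

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch model: P(positive response) for ability theta and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def log_likelihood(theta, b, responses):
    """Log-likelihood of one person's response vector; NaN entries (missing
    items) are skipped, which is how Rasch-family models handle intermittent
    missingness without imputation."""
    observed = ~np.isnan(responses)
    p = rasch_prob(theta, b[observed])
    x = responses[observed]
    return float(np.sum(x * np.log(p) + (1 - x) * np.log(1 - p)))

b = np.array([-1.0, 0.0, 0.5, 1.5])       # item difficulties
resp = np.array([1.0, 1.0, np.nan, 0.0])  # third item not answered
print(log_likelihood(0.3, b, resp))
```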
NASA Astrophysics Data System (ADS)
Oblow, E. M.
1982-10-01
An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.
Hoyer, Dirk; Leder, Uwe; Hoyer, Heike; Pompe, Bernd; Sommer, Michael; Zwiener, Ulrich
2002-01-01
The heart rate variability (HRV) is related to several mechanisms of complex autonomic functioning, such as respiratory heart rate modulation and phase dependencies between heart beat cycles and breathing cycles. The underlying processes are basically nonlinear. In order to understand and quantitatively assess those physiological interactions, an adequate coupling analysis is necessary. We hypothesized that nonlinear measures of HRV and cardiorespiratory interdependencies are superior to the standard HRV measures in classifying patients after acute myocardial infarction. We introduced mutual information measures, which provide access to nonlinear interdependencies, as a counterpart to the classically linear correlation analysis. The nonlinear statistical autodependencies of HRV were quantified by auto mutual information, and the respiratory heart rate modulation by cardiorespiratory cross mutual information. The phase interdependencies between heart beat cycles and breathing cycles were assessed based on the histograms of the frequency ratios of the instantaneous heart beat and respiratory cycles. Furthermore, the relative duration of phase-synchronized intervals was acquired. We investigated 39 patients after acute myocardial infarction versus 24 controls. The discrimination of these groups was improved by cardiorespiratory cross mutual information measures and phase interdependence measures in comparison to the linear standard HRV measures. This result was statistically confirmed by means of logistic regression models of particular variable subsets and their receiver operating characteristics.
The influence of dopants on the nucleation of semiconductor nanocrystals from homogeneous solution.
Bryan, J Daniel; Schwartz, Dana A; Gamelin, Daniel R
2005-09-01
The influence of Co²⁺ ions on the homogeneous nucleation of ZnO is examined. Using electronic absorption spectroscopy as a dopant-specific in-situ spectroscopic probe, Co²⁺ ions are found to be quantitatively excluded from the ZnO critical nuclei but incorporated nearly statistically in the subsequent growth layers, resulting in crystallites with pure ZnO cores and Zn₁₋ₓCoₓO shells. Strong inhibition of ZnO nucleation by Co²⁺ ions is also observed. These results are explained using the classical nucleation model. Statistical analysis of nucleation inhibition data allows estimation of the critical nucleus size as 25 ± 4 Zn²⁺ ions. Bulk calorimetric data allow the activation barrier for ZnO nucleation containing a single Co²⁺ impurity to be estimated as 5.75 kcal/mol cluster greater than that of pure ZnO, corresponding to a 1.5 × 10⁴-fold reduction in the ZnO nucleation rate constant upon introduction of a single Co²⁺ impurity. These data and analysis offer a rare view into the role of composition in homogeneous nucleation processes, and specifically address recent experiments targeting the formation of semiconductor quantum dots containing single magnetic impurity ions at their precise centers.
NASA Astrophysics Data System (ADS)
Ignatyev, A. V.; Ignatyev, V. A.; Onischenko, E. V.
2017-11-01
This article continues the authors' work on the development of algorithms that implement the finite element method in the form of a classical mixed method for the analysis of geometrically nonlinear bar systems [1-3]. The paper describes an improved algorithm for forming the system of nonlinear governing equations for flexible plane frames and bars with large displacements of nodes, based on the finite element method in its classical mixed form and the use of a step-by-step loading procedure. An example of the analysis is given.
Quantum Behavior of an Autonomous Maxwell Demon
NASA Astrophysics Data System (ADS)
Chapman, Adrian; Miyake, Akimasa
2015-03-01
A Maxwell Demon is an agent that can exploit knowledge of a system's microstate to perform useful work. The second law of thermodynamics is only recovered upon taking into account the work required to irreversibly update the demon's memory, bringing information theoretic concepts into a thermodynamic framework. Recently, there has been interest in modeling a classical Maxwell demon as an autonomous physical system to study this information-work tradeoff explicitly. Motivated by the idea that states with non-local entanglement structure can be used as a computational resource, we ask whether these states have thermodynamic resource quality as well by generalizing a particular classical autonomous Maxwell demon to the quantum regime. We treat the full quantum description using a matrix product operator formalism, which allows us to handle quantum and classical correlations in a unified framework. Applying this, together with techniques from statistical mechanics, we are able to approximate nonlocal quantities such as the erasure performed on the demon's memory register when correlations are present. Finally, we examine how the demon may use these correlations as a resource to outperform its classical counterpart.
NASA Astrophysics Data System (ADS)
Kreis, Karsten; Kremer, Kurt; Potestio, Raffaello; Tuckerman, Mark E.
2017-12-01
Path integral-based methodologies play a crucial role for the investigation of nuclear quantum effects by means of computer simulations. However, these techniques are significantly more demanding than corresponding classical simulations. To reduce this numerical effort, we recently proposed a method, based on a rigorous Hamiltonian formulation, which restricts the quantum modeling to a small but relevant spatial region within a larger reservoir where particles are treated classically. In this work, we extend this idea and show how it can be implemented along with state-of-the-art path integral simulation techniques, including path-integral molecular dynamics, which allows for the calculation of quantum statistical properties, and ring-polymer and centroid molecular dynamics, which allow the calculation of approximate quantum dynamical properties. To this end, we derive a new integration algorithm that also makes use of multiple time-stepping. The scheme is validated via adaptive classical-path-integral simulations of liquid water. Potential applications of the proposed multiresolution method are diverse and include efficient quantum simulations of interfaces as well as complex biomolecular systems such as membranes and proteins.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoilova, N. I.
Generalized quantum statistics, such as paraboson and parafermion statistics, are characterized by triple relations which are related to Lie (super)algebras of type B. The correspondence of the Fock spaces of parabosons and parafermions, as well as the Fock space of a combined system of parafermions and parabosons, to irreducible representations of (super)algebras of type B will be pointed out. An example of generalized quantum statistics connected to the basic classical Lie superalgebra B(1|1) ≡ osp(3|2), with interesting physical properties such as noncommutative coordinates, will be given. The article therefore focuses on the question, addressed already in 1950 by Wigner: do the equations of motion determine the quantum mechanical commutation relations?
Semi-classical analysis and pseudo-spectra
NASA Astrophysics Data System (ADS)
Davies, E. B.
We prove an approximate spectral theorem for non-self-adjoint operators and investigate its applications to second-order differential operators in the semi-classical limit. This leads to the construction of a twisted FBI transform. We also investigate the connections between pseudo-spectra and boundary conditions in the semi-classical limit.
Turning Points in the Development of Classical Musicians
ERIC Educational Resources Information Center
Gabor, Elena
2011-01-01
This qualitative study investigated the vocational socialization turning points in families of classical musicians. I sampled and interviewed 20 parent-child dyads, for a total of 46 interviews. Data analysis revealed that classical musicians' experiences were marked by 11 turning points that affected their identification with the occupation:…
Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)
NASA Astrophysics Data System (ADS)
De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.
1993-01-01
The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra when neither the number of components nor the peak shapes of the component spectra are known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information is extracted from Auger depth profiles by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products formed when natural rubber is vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.
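The contrast between the two techniques can be explored on synthetic data. The sketch below is a toy example; scikit-learn's maximum-likelihood FactorAnalysis is used as a rough stand-in for MLCFA, which is an assumption rather than an equivalence. It builds mixtures of two overlapping Gaussian "component spectra" and decomposes them with both methods:

```python
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(0)

# Two overlapping Gaussian component spectra on a common energy axis
x = np.linspace(0.0, 10.0, 200)
s1 = np.exp(-((x - 4.0) / 0.8) ** 2)
s2 = np.exp(-((x - 5.5) / 0.8) ** 2)

# Mixture spectra with depth-dependent weights plus noise (rows = sputter depths)
weights = rng.uniform(0.0, 1.0, size=(50, 2))
spectra = weights @ np.vstack([s1, s2]) + 0.01 * rng.normal(size=(50, 200))

# Maximum-likelihood factor analysis vs. PCA, neither told the true peak shapes
fa = FactorAnalysis(n_components=2).fit(spectra)
pca = PCA(n_components=2).fit(spectra)
print(fa.components_.shape, pca.components_.shape)  # (2, 200) each
```

Comparing `fa.components_` and `pca.components_` against `s1` and `s2` gives a feel for how each method recovers, or distorts, the underlying peaks.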
Karasik, Avshalom; Rahimi, Oshrit; David, Michal; Weiss, Ehud; Drori, Elyashiv
2018-04-25
Grapevine (Vitis vinifera L.) is one of the classical fruits of the Old World. Among the thousands of domesticated grapevine varieties and variable wild sylvestris populations, the range of variation in pip morphology is very wide. In this study we scanned representative samples of grape pip populations, in an attempt to probe the possibility of using the 3D tool for grape variety identification. The scanning was followed by mathematical and statistical analysis using innovative algorithms from the field of computer sciences. Using selected Fourier coefficients, a very clear separation was obtained between most of the varieties, with only very few overlaps. These results show that this method enables the separation between different Vitis vinifera varieties. Interestingly, when using the 3D approach to analyze couples of varieties, considered synonyms by the standard 22 SSR analysis approach, we found that the varieties in two of the considered synonym couples were clearly separated by the morphological analysis. This work, therefore, suggests a new systematic tool for high resolution variety discrimination.
Meta-analysis of thirty-two case-control and two ecological radon studies of lung cancer.
Dobrzynski, Ludwik; Fornalski, Krzysztof W; Reszczynska, Joanna
2018-03-01
A re-analysis has been carried out of thirty-two case-control and two ecological studies concerning the influence of radon, a radioactive gas, on the risk of lung cancer. Three mathematically simplest dose-response relationships (models) were tested: constant (zero health effect), linear, and parabolic (linear-quadratic). Health effect end-points reported in the analysed studies are odds ratios or relative risk ratios, related either to morbidity or mortality. In our preliminary analysis, we show that the results of dose-response fitting are qualitatively (within uncertainties, given as error bars) the same, whichever of these health effect end-points are applied. Therefore, we deemed it reasonable to aggregate all response data into the so-called Relative Health Factor and jointly analysed such mixed data, to obtain better statistical power. In the second part of our analysis, robust Bayesian and classical methods of analysis were applied to this combined dataset. In this part of our analysis, we selected different subranges of radon concentrations. In view of substantial differences between the methodology used by the authors of case-control and ecological studies, the mathematical relationships (models) were applied mainly to the thirty-two case-control studies. The degree to which the two ecological studies, analysed separately, affect the overall results when combined with the thirty-two case-control studies, has also been evaluated. In all, as a result of our meta-analysis of the combined cohort, we conclude that the analysed data concerning radon concentrations below ~1000 Bq/m3 (~20 mSv/year of effective dose to the whole body) do not support the thesis that radon may be a cause of any statistically significant increase in lung cancer incidence.
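The three candidate dose-response shapes can be fitted with weighted least squares in a few lines. The sketch below uses made-up illustrative numbers, not the study's data, and compares the models by their weighted residual chi-square:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical (radon concentration [Bq/m3], Relative Health Factor, std. error)
dose = np.array([25.0, 50.0, 100.0, 200.0, 400.0, 800.0])
rhf = np.array([1.01, 0.99, 1.02, 0.98, 1.00, 1.03])
se = np.array([0.05, 0.04, 0.05, 0.06, 0.08, 0.12])

models = {
    "constant (zero effect)": lambda d, a: np.full_like(d, a),
    "linear": lambda d, a, b: a + b * d,
    "linear-quadratic": lambda d, a, b, c: a + b * d + c * d ** 2,
}

for name, f in models.items():
    n_par = f.__code__.co_argcount - 1  # parameters after the dose argument
    popt, _ = curve_fit(f, dose, rhf, sigma=se, absolute_sigma=True,
                        p0=[1.0] + [0.0] * (n_par - 1))
    resid = (rhf - f(dose, *popt)) / se
    print(f"{name}: chi2 = {float(resid @ resid):.2f}, params = {popt}")
```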
Prospective randomized clinical trial: single and weekly viscosupplementation
Zóboli, Alejandro Agustin Carri; de Rezende, Márcia Uchôa; de Campos, Gustavo Constantino; Pasqualin, Thiago; Frucchi, Renato; de Camargo, Olavo Pires
2013-01-01
OBJECTIVE: To compare two different dosages of an intermediate-molecular-weight sodium hyaluronate (HA) (Osteonil®-TRB Pharma), assessing whether a single 6 ml application of this HA has the same effectiveness as the classical three weekly 2 ml doses. METHODS: 108 patients with knee osteoarthritis were randomized into two groups of 54 patients each, designated "single" (S) and "weekly" (W). Patients in group S underwent viscosupplementation by a single application of 6 ml of sodium hyaluronate and 1 ml of triamcinolone hexacetonide. Patients in group W underwent viscosupplementation through three applications of 2 ml of sodium hyaluronate at one-week intervals, with the first application also including the infiltration of 1 ml (20 mg) of triamcinolone hexacetonide. Both groups were assessed before, at one month, and at three months after application, by responding to the WOMAC, Lequesne, IKDC and VAS questionnaires. RESULTS: There was no statistical difference between the single application of 6 ml of sodium hyaluronate and the classic application of three weekly injections. However, only the classical regime showed statistically significant improvement in pain from baseline (WOMAC pain and VAS). CONCLUSION: Our results suggest that both application schemes improve function, but the three weekly 2 ml injections were more effective in reducing pain. Level of Evidence I, Prospective Randomized Clinical Trial. PMID:24453681
Juárez, M; Polvillo, O; Contò, M; Ficco, A; Ballico, S; Failla, S
2008-05-09
Four different extraction-derivatization methods commonly used for fatty acid analysis in meat (in situ or one-step method, saponification method, classic method, and a combination of classic extraction and saponification derivatization) were tested. The in situ method had low recovery and variation. The saponification method showed the best balance between recovery, precision, repeatability and reproducibility. The classic method had high recovery and acceptable variation values, except for the polyunsaturated fatty acids, which showed higher variation than with the former methods. The combination of extraction and methylation steps had high recovery values, but its precision, repeatability and reproducibility were not acceptable. Therefore the saponification method would be most convenient for polyunsaturated fatty acid analysis, whereas the in situ method would be an alternative for fast analysis. However, the classic method would be the method of choice for the determination of the different lipid classes.
SU-E-I-98: PET/CT's Most-Cited 50 Articles since 2000: A Bibliometric Analysis of Classics.
Sayed, G
2012-06-01
Despite its relatively recent introduction to clinical practice, PET/CT has gained wide acceptance in both diagnostic and therapeutic applications. Scientific publication in PET/CT has also experienced significant development since its introduction. Bibliometric analyses allow an understanding of how this publication trend has developed at an aggregated level. Citation analysis is one of the most widely used bibliometric tools of scientometrics. Analysis of classics, defined as articles with 100 or more citations, is common in the biomedical sciences, as it reflects an article's influence in its professional and scientific community. Our objective was to identify the 50 most frequently cited classic articles in PET/CT in the past 10 years. The 50 most-cited PET/CT articles were identified by searching ISI's Web of Knowledge and PubMed databases for all related publications from 2000 through 2010. Articles were evaluated for several characteristics such as author(s), institution, country of origin, publication year, type, and number of citations. An unadjusted categorical analysis was performed to compare all articles published in the search period. The search yielded a cumulative total of 22,554 entries for the publication period, of which 15,943 were original research articles. The 50 most-cited articles were identified from the latter sum and selected out of 73 classics. The number of citations for the top 50 classics ranged from 114 to 700. PET/CT classics appeared in three general and 12 core journals. The majority of the classics concerned oncologic applications of PET/CT (62%). Articles on diagnostic topics accounted for 6%; the rest focused on physics and instrumentation (24%) and other basic sciences (16%). Despite its relatively short history, PET/CT accumulated 73 classic articles in a decade. Such information is of importance to researchers and those who wish to study the scientific development of the field. © 2012 American Association of Physicists in Medicine.
Eigensystem analysis of classical relaxation techniques with applications to multigrid analysis
NASA Technical Reports Server (NTRS)
Lomax, Harvard; Maksymiuk, Catherine
1987-01-01
Classical relaxation techniques are related to numerical methods for solution of ordinary differential equations. Eigensystems for Point-Jacobi, Gauss-Seidel, and SOR methods are presented. Solution techniques such as eigenvector annihilation, eigensystem mixing, and multigrid methods are examined with regard to the eigenstructure.
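A compact numerical companion to this eigensystem analysis (a standard textbook construction, not code from the report) builds the Point-Jacobi, Gauss-Seidel and SOR iteration matrices for the 1-D Poisson model problem and computes their spectral radii, which determine asymptotic convergence:

```python
import numpy as np

def iteration_matrices(A, omega=1.5):
    """Split A = D - L - U and form the classical iteration matrices."""
    D = np.diag(np.diag(A))
    L = -np.tril(A, -1)   # strictly lower part (sign convention A = D - L - U)
    U = -np.triu(A, 1)    # strictly upper part
    jacobi = np.linalg.solve(D, L + U)
    gauss_seidel = np.linalg.solve(D - L, U)
    sor = np.linalg.solve(D - omega * L, (1 - omega) * D + omega * U)
    return jacobi, gauss_seidel, sor

def spectral_radius(M):
    """The iteration converges iff the spectral radius of its matrix is < 1."""
    return float(np.max(np.abs(np.linalg.eigvals(M))))

# 1-D Poisson model problem: tridiagonal [-1, 2, -1]
n = 20
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
for name, M in zip(("Jacobi", "Gauss-Seidel", "SOR"), iteration_matrices(A)):
    print(name, spectral_radius(M))
```

The eigenvalues of these iteration matrices are exactly what multigrid analysis examines: smooth error modes (eigenvalues near 1) decay slowly under relaxation and must be handled on coarser grids.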
Salo, Oleksandr V; Ries, Marco; Medema, Marnix H; Lankhorst, Peter P; Vreeken, Rob J; Bovenberg, Roel A L; Driessen, Arnold J M
2015-11-14
Penicillium chrysogenum is a filamentous fungus that is employed as an industrial producer of β-lactams. The high β-lactam titers of current strains are the result of a classical strain improvement (CSI) program that started with a wild-type-like strain more than six decades ago. This involved extensive mutagenesis and strain selection for improved β-lactam titers and growth characteristics. However, the impact of the CSI on secondary metabolism in general remains unknown. To examine this impact, a comparative genomic analysis of β-lactam-producing strains was carried out by genome sequencing of three P. chrysogenum strains that are part of a lineage of the CSI (NRRL1951, Wisconsin 54-1255 and DS17690) and the derived penicillin biosynthesis cluster-free strain DS68530. CSI has resulted in a wide spread of mutations that, statistically, did not result in an over- or underrepresentation of specific gene classes. However, in this set of mutations, 8 out of 31 secondary metabolite genes (20 polyketide synthases and 11 non-ribosomal peptide synthetases) were targeted, with a corresponding and progressive loss in the production of a range of secondary metabolites unrelated to β-lactam production. Additionally, key Velvet complex proteins (LaeA and VelA) involved in the global regulation of secondary metabolism have been repeatedly targeted for mutagenesis during CSI. Using comparative metabolic profiling, the polyketide synthase gene cluster responsible for the biosynthesis of sorbicillinoids, a group of yellow-colored metabolites abundantly produced by early production strains of P. chrysogenum, was identified. The classical industrial strain improvement of P. chrysogenum has had a broad mutagenic impact on metabolism and has resulted in the silencing of specific secondary metabolite genes with the concomitant diversion of metabolism towards the production of β-lactams.
The Gibbs paradox and the physical criteria for indistinguishability of identical particles
NASA Astrophysics Data System (ADS)
Unnikrishnan, C. S.
2016-08-01
The Gibbs paradox in the context of statistical mechanics addresses the issue of the additivity of the entropy of mixing gases. The usual discussion attributes the paradoxical situation to the classical distinguishability of identical particles and credits quantum theory with enabling the indistinguishability of identical particles that solves the problem. We argue that indistinguishability of identical particles is already a feature in classical mechanics, and this is clearly brought out when the problem is treated in the language of information and its associated entropy. We pinpoint the physical criteria for indistinguishability that are crucial for the treatment of the Gibbs problem and the consistency of its solution with conventional thermodynamics. Quantum mechanics provides a quantitative criterion, not possible in the classical picture, for the degree of indistinguishability in terms of the visibility of quantum interference, or the overlap of the states as pointed out by von Neumann, thereby endowing the entropy expression with mathematical continuity and physical reasonableness.
Nonclassicality Criteria in Multiport Interferometry
NASA Astrophysics Data System (ADS)
Rigovacca, L.; Di Franco, C.; Metcalf, B. J.; Walmsley, I. A.; Kim, M. S.
2016-11-01
Interference lies at the heart of the behavior of classical and quantum light. It is thus crucial to understand the boundaries between which interference patterns can be explained by a classical electromagnetic description of light and which, on the other hand, can only be understood with a proper quantum mechanical approach. While the case of two-mode interference has received a lot of attention, the multimode case has not yet been fully explored. Here we study a general scenario of intensity interferometry: we derive a bound on the average correlations between pairs of output intensities for the classical wavelike model of light, and we show how it can be violated in a quantum framework. As a consequence, this violation acts as a nonclassicality witness, able to detect the presence of sources with sub-Poissonian photon-number statistics. We also develop a criterion that can certify the impossibility of dividing a given interferometer into two independent subblocks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saint-Michel, B.; Aix Marseille Université, CNRS, Centrale Marseille, IRPHE UMR 7342, 13384 Marseille; Herbert, E.
2014-12-15
We report measurements of the dissipation in the Superfluid helium high REynolds number von Kármán flow experiment for different forcing conditions. Statistically steady flows are reached; they display a hysteretic behavior similar to what has been observed in a 1:4 scale water experiment. Our macroscopic measurements indicate no noticeable difference between classical and superfluid flows, thereby providing evidence of the same dissipation scaling laws in the two phases. A detailed study of the evolution of the hysteresis cycle with the Reynolds number supports the idea that the stability of the steady states of classical turbulence in this closed flow is partly governed by the dissipative scales. It also supports the idea that the normal and the superfluid components at these temperatures (1.6 K) are locked down to the dissipative length scale.
ERIC Educational Resources Information Center
Griffiths, Thomas L.; Tenenbaum, Joshua B.
2009-01-01
Inducing causal relationships from observations is a classic problem in scientific inference, statistics, and machine learning. It is also a central part of human learning, and a task that people perform remarkably well given its notorious difficulties. People can learn causal structure in various settings, from diverse forms of data: observations…
Linking Performance Measures to Resource Allocation: Exploring Unmapped Terrain.
ERIC Educational Resources Information Center
Ewell, Peter T.
1999-01-01
Examination of how (and whether) particular types of institutional performance measures can be beneficially used in making resource allocation decisions finds that only easily verifiable "hard" statistics should be used in classic performance funding approaches, although surveys and the use of good practices by institutions may…
Spear Phishing Attack Detection
2011-03-24
the insider amongst senior leaders of an organization [Mes08], the undercover detective within a drug cartel, or the classic secret agent planted in...to a mimicry attack that shapes the embedded malware to have a statistical distribution similar to "normal" or benign behavior. 2.3.1.3
On the analysis of Canadian Holstein dairy cow lactation curves using standard growth functions.
López, S; France, J; Odongo, N E; McBride, R A; Kebreab, E; AlZahal, O; McBride, B W; Dijkstra, J
2015-04-01
Six classical growth functions (monomolecular, Schumacher, Gompertz, logistic, Richards, and Morgan) were fitted to individual and average (by parity) cumulative milk production curves of Canadian Holstein dairy cows. The data analyzed consisted of approximately 91,000 daily milk yield records corresponding to 122 first, 99 second, and 92 third parity individual lactation curves. The functions were fitted using nonlinear regression procedures, and their performance was assessed using goodness-of-fit statistics (coefficient of determination, residual mean squares, Akaike information criterion, and the correlation and concordance coefficients between observed and adjusted milk yields at several days in milk). Overall, all the growth functions evaluated showed an acceptable fit to the cumulative milk production curves, with the Richards equation ranking first (smallest Akaike information criterion) followed by the Morgan equation. Differences among the functions in their goodness-of-fit were enlarged when fitted to average curves by parity, where the sigmoidal functions with a variable point of inflection (Richards and Morgan) outperformed the other 4 equations. All the functions provided satisfactory predictions of milk yield (calculated from the first derivative of the functions) at different lactation stages, from early to late lactation. The Richards and Morgan equations provided the most accurate estimates of peak yield and total milk production per 305-d lactation, whereas the least accurate estimates were obtained with the logistic equation. In conclusion, classical growth functions (especially sigmoidal functions with a variable point of inflection) proved to be feasible alternatives to fit cumulative milk production curves of dairy cows, resulting in suitable statistical performance and accurate estimates of lactation traits. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
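As an illustration of the fitting procedure (a sketch on synthetic data, not the Canadian Holstein records), the Gompertz function can be fitted to a cumulative-yield curve with nonlinear least squares, daily yield recovered from the first derivative, and AIC used as the ranking statistic:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, b, k):
    """Cumulative yield: W(t) = A * exp(-b * exp(-k * t))."""
    return A * np.exp(-b * np.exp(-k * t))

def gompertz_rate(t, A, b, k):
    """Daily yield dW/dt, used to estimate milk yield at any lactation stage."""
    return A * b * k * np.exp(-k * t) * np.exp(-b * np.exp(-k * t))

t = np.arange(5.0, 305.0, 5.0)  # days in milk
y = gompertz(t, 9000.0, 4.0, 0.02) \
    + np.random.default_rng(1).normal(0.0, 50.0, t.size)  # synthetic records

popt, _ = curve_fit(gompertz, t, y, p0=(8000.0, 3.0, 0.01))
n, k_par = t.size, 3
rss = float(np.sum((y - gompertz(t, *popt)) ** 2))
aic = n * np.log(rss / n) + 2 * k_par  # criterion used to rank the functions
print(popt, aic, gompertz_rate(50.0, *popt))  # parameters, fit score, day-50 yield
```

The same pattern applies to the other five functions; the variable-inflection-point models (Richards, Morgan) simply carry one extra shape parameter.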
NASA Astrophysics Data System (ADS)
Kjellander, Roland
2006-04-01
It is shown that the nature of the non-electrostatic part of the pair interaction potential in classical Coulomb fluids can have a profound influence on the screening behaviour. Two cases are compared: (i) when the non-electrostatic part equals an arbitrary finite-ranged interaction and (ii) when a dispersion r-6 interaction potential is included. A formal analysis is done in exact statistical mechanics, including an investigation of the bridge function. It is found that the Coulombic r-1 and the dispersion r-6 potentials are coupled in a very intricate manner as regards the screening behaviour. The classical one-component plasma (OCP) is a particularly clear example due to its simplicity and is investigated in detail. When the dispersion r-6 potential is turned on, the screened electrostatic potential from a particle goes from a monotonic exponential decay, exp(-κr)/r, to a power-law decay, r-8, for large r. The pair distribution function acquire, at the same time, an r-10 decay for large r instead of the exponential one. There still remains exponentially decaying contributions to both functions, but these contributions turn oscillatory when the r-6 interaction is switched on. When the Coulomb interaction is turned off but the dispersion r-6 pair potential is kept, the decay of the pair distribution function for large r goes over from the r-10 to an r-6 behaviour, which is the normal one for fluids of electroneutral particles with dispersion interactions. Differences and similarities compared to binary electrolytes are pointed out.
On the streaming model for redshift-space distortions
NASA Astrophysics Data System (ADS)
Kuruvilla, Joseph; Porciani, Cristiano
2018-06-01
The streaming model describes the mapping between real and redshift space for 2-point clustering statistics. Its key element is the probability density function (PDF) of line-of-sight pairwise peculiar velocities. Following a kinetic-theory approach, we derive the fundamental equations of the streaming model for ordered and unordered pairs. In the first case, we recover the classic equation while we demonstrate that modifications are necessary for unordered pairs. We then discuss several statistical properties of the pairwise velocities for DM particles and haloes by using a suite of high-resolution N-body simulations. We test the often used Gaussian ansatz for the PDF of pairwise velocities and discuss its limitations. Finally, we introduce a mixture of Gaussians which is known in statistics as the generalised hyperbolic distribution and show that it provides an accurate fit to the PDF. Once inserted in the streaming equation, the fit yields an excellent description of redshift-space correlations at all scales that vastly outperforms the Gaussian and exponential approximations. Using a principal-component analysis, we reduce the complexity of our model for large redshift-space separations. Our results increase the robustness of studies of anisotropic galaxy clustering and are useful for extending them towards smaller scales in order to test theories of gravity and interacting dark-energy models.
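For reference, the classic streaming equation recovered here for ordered pairs is commonly written (with line-of-sight peculiar velocities expressed in comoving-distance units) as

$$
1 + \xi_s(s_\perp, s_\parallel) \,=\, \int_{-\infty}^{\infty} \bigl[1 + \xi_r(r)\bigr]\, \mathcal{P}\bigl(v_\parallel = s_\parallel - r_\parallel \,\big|\, \mathbf{r}\bigr)\, \mathrm{d}r_\parallel, \qquad r = \sqrt{s_\perp^2 + r_\parallel^2},
$$

where $\xi_r$ and $\xi_s$ are the real- and redshift-space correlation functions and $\mathcal{P}(v_\parallel \mid \mathbf{r})$ is the line-of-sight pairwise-velocity PDF. The Gaussian ansatz tested in the paper replaces $\mathcal{P}$ with a Gaussian of scale-dependent mean and variance; the generalised hyperbolic fit replaces that Gaussian in the same integral.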
Exact test-based approach for equivalence test with parameter margin.
Cassie Dong, Xiaoyu; Bian, Yuanyuan; Tsong, Yi; Wang, Tianhua
2017-01-01
The equivalence test has a wide range of applications in pharmaceutical statistics, where we need to test for the similarity between two groups. In recent years, the equivalence test has been used in assessing the analytical similarity between a proposed biosimilar product and a reference product. More specifically, the mean values of the two products for a given quality attribute are compared against an equivalence margin of the form ±f × σ_R, a function of the reference variability. In practice, this margin is unknown and is estimated from the sample as ±f × S_R. If we use this estimated margin with the classic t-test statistic in the equivalence test for the means, both Type I and Type II error rates may inflate. To resolve this issue, we develop an exact-based test method and compare it with other proposed methods, such as the Wald test, the constrained Wald test, and the Generalized Pivotal Quantity (GPQ), in terms of Type I error rate and power. Application of these methods to data analysis is also provided in this paper. This work focuses on the development and discussion of the general statistical methodology and is not limited to analytical similarity applications.
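For orientation, the naive procedure whose error-rate inflation motivates the exact-based method can be sketched as two one-sided t-tests against the estimated margin ±f × S_R treated as if it were fixed (a hypothetical illustration, not the authors' exact test):

```python
import numpy as np
from scipy import stats

def naive_equivalence(test, ref, f=1.5, alpha=0.05):
    """Two one-sided tests of |mean(test) - mean(ref)| < f * S_R.
    Plugging in the estimated margin as if it were known is exactly the
    practice that can inflate Type I and Type II error rates."""
    nT, nR = len(test), len(ref)
    margin = f * np.std(ref, ddof=1)  # estimated margin f * S_R
    diff = np.mean(test) - np.mean(ref)
    se = np.sqrt(np.var(test, ddof=1) / nT + np.var(ref, ddof=1) / nR)
    df = nT + nR - 2  # simple degrees-of-freedom choice for the sketch
    p_lower = 1.0 - stats.t.cdf((diff + margin) / se, df)  # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)        # H0: diff >= +margin
    return max(p_lower, p_upper) < alpha  # equivalence iff both tests reject

rng = np.random.default_rng(0)
print(naive_equivalence(rng.normal(0.0, 1.0, 10), rng.normal(0.0, 1.0, 10)))
```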
The Singing Rod (in the Modern Age)
ERIC Educational Resources Information Center
Lasby, B.; O'Meara, J. M.; Williams, M.
2014-01-01
This is a classic classroom demonstration of resonance, nodes, anti-nodes, and standing waves that has been described elsewhere. The modern age twist that we are advocating is the coupling of this classic demo with free (or relatively inexpensive) sound analysis software, thereby allowing for quantitative analysis of resonance while experimenting…
Data from: Retrospective analysis of a classical biological control programme
USDA-ARS?s Scientific Manuscript database
This database contains the raw data for the publication entitled Naranjo, S.E. 2018. Retrospective analysis of a classical biological control programme. Journal of Applied Ecology https://doi.org/10.1111/1365-2664.13163. Specific data include field-based, partial life table data for immature stage...
Laban Movement Analysis Approach to Classical Ballet Pedagogy
ERIC Educational Resources Information Center
Whittier, Cadence
2006-01-01
As a Certified Laban Movement Analyst and a classically trained ballet dancer, I consistently weave the Laban Movement Analysis/Bartenieff Fundamentals (LMA/BF) theories and philosophies into the ballet class. This integration assists in: (1) Identifying the qualitative movement elements both in the art of ballet and in the students' dancing…
Royle, J. Andrew; Dorazio, Robert M.
2008-01-01
A guide to data collection, modeling and inference strategies for biological survey data using Bayesian and classical statistical methods. This book describes a general and flexible framework for modeling and inference in ecological systems based on hierarchical models, with a strict focus on the use of probability models and parametric inference. Hierarchical models represent a paradigm shift in the application of statistics to ecological inference problems because they combine explicit models of ecological system structure or dynamics with models of how ecological systems are observed. The principles of hierarchical modeling are developed and applied to problems in population, metapopulation, community, and metacommunity systems. The book provides the first synthetic treatment of many recent methodological advances in ecological modeling and unifies disparate methods and procedures. The authors apply principles of hierarchical modeling to ecological problems, including:
* occurrence or occupancy models for estimating species distribution
* abundance models based on many sampling protocols, including distance sampling
* capture-recapture models with individual effects
* spatial capture-recapture models based on camera trapping and related methods
* population and metapopulation dynamic models
* models of biodiversity, community structure and dynamics
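As a flavor of the framework, the first model class listed, site occupancy with imperfect detection, couples a state model (is the site occupied, probability psi?) with an observation model (is the species detected on a visit, probability p?). A minimal classical maximum-likelihood sketch, illustrative rather than drawn from the book:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, det_counts, n_visits):
    """Zero-inflated binomial likelihood of the basic occupancy model.
    Logit-scale parameters keep psi and p inside (0, 1)."""
    psi = 1.0 / (1.0 + np.exp(-params[0]))
    p = 1.0 / (1.0 + np.exp(-params[1]))
    occupied = psi * p ** det_counts * (1.0 - p) ** (n_visits - det_counts)
    never = (det_counts == 0) * (1.0 - psi)  # a never-detected site may be unoccupied
    return -np.sum(np.log(occupied + never))

# Simulated detection histories: 100 sites, 5 visits, psi = 0.6, p = 0.4
rng = np.random.default_rng(4)
occ = rng.random(100) < 0.6
det_counts = rng.binomial(5, 0.4 * occ)
fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(det_counts, 5))
print(1.0 / (1.0 + np.exp(-fit.x)))  # estimates of (psi, p)
```

The hierarchical structure is explicit here: the observation model sits on top of the latent ecological state, which is the organizing idea the book develops for all the model classes listed above.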
Direct Breakthrough Curve Prediction From Statistics of Heterogeneous Conductivity Fields
NASA Astrophysics Data System (ADS)
Hansen, Scott K.; Haslauer, Claus P.; Cirpka, Olaf A.; Vesselinov, Velimir V.
2018-01-01
This paper presents a methodology to predict the shape of solute breakthrough curves in heterogeneous aquifers at early times and/or under high degrees of heterogeneity, both cases in which the classical macrodispersion theory may not be applicable. The methodology relies on the observation that breakthrough curves in heterogeneous media are generally well described by lognormal distributions, and mean breakthrough times can be predicted analytically. The log-variance of solute arrival is thus sufficient to completely specify the breakthrough curves, and this is calibrated as a function of aquifer heterogeneity and dimensionless distance from a source plane by means of Monte Carlo analysis and statistical regression. Using the ensemble of simulated groundwater flow and solute transport realizations employed to calibrate the predictive regression, reliability estimates for the prediction are also developed. Additional theoretical contributions include heuristics for the time until an effective macrodispersion coefficient becomes applicable, and also an expression for its magnitude that applies in highly heterogeneous systems. It is seen that the results here represent a way to derive continuous time random walk transition distributions from physical considerations rather than from empirical field calibration.
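Once the analytic mean breakthrough time and the regression-calibrated log-variance are in hand, the full curve follows from the lognormal form. A minimal sketch (parameter values illustrative; the regression mapping heterogeneity statistics to the log-variance is the paper's calibrated ingredient and is not reproduced here):

```python
import numpy as np

def breakthrough_curve(t, mean_arrival, log_var):
    """Lognormal arrival-time PDF with prescribed mean and log-variance.

    mean_arrival: analytically predicted mean breakthrough time
    log_var:      variance of log arrival time (from the heterogeneity regression)
    """
    mu = np.log(mean_arrival) - log_var / 2.0  # ensures E[T] = exp(mu + sigma^2/2)
    return (np.exp(-(np.log(t) - mu) ** 2 / (2.0 * log_var))
            / (t * np.sqrt(2.0 * np.pi * log_var)))

t = np.linspace(0.1, 10.0, 500)  # dimensionless time, illustrative
btc = breakthrough_curve(t, mean_arrival=2.0, log_var=0.5)
print(float(np.sum(btc) * (t[1] - t[0])))  # ~1: the PDF integrates to unity
```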
Microcirculation of the fingers in Raynaud's syndrome: (99m)Tc-DTPA imaging.
Csiki, Z; Garai, I; Varga, J; Szücs, G; Galajda, Z; András, C; Zeher, M; Galuska, L
2005-02-01
We investigated the circulatory characteristics of patients suffering from primary and secondary Raynaud's syndrome. We examined 106 patients presenting with the classical symptoms of Raynaud's syndrome (47 primary, 59 secondary) by hand perfusion scintigraphy developed by our Department of Nuclear Medicine. After visual evaluation we analyzed the images semiquantitatively, using the finger-to-palm ratio. We statistically compared the patients with primary and those with secondary Raynaud's syndrome. By visual evaluation we observed regional perfusion disturbances in 42 of 59 patients with secondary Raynaud's syndrome. However, this was observed in only 3 of 47 patients with the primary form of the disease. This difference was statistically significant (p<0.001). Semiquantitative analysis showed that the finger/palm ratios (FPR) were significantly lower (p<0.05) for the patients with primary Raynaud's syndrome. No differences in FPR values were found with respect to sex or between the right and left hands. Hand perfusion scintigraphy with (99m)Tc-DTPA is a noninvasive, cost-effective diagnostic tool which objectively reflects the global and regional microcirculatory abnormalities of the hands and provides quantitative data for follow-up.
From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''
NASA Astrophysics Data System (ADS)
Bergeron, H.
2001-09-01
Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L²-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both the logical and analytical levels). To obtain this unified framework, we split quantum theory into two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L²(Rⁿ), the Heisenberg rule [p_i, q_j] = -iℏδ_ij with p = -iℏ∇, the free Hamiltonian H = -ℏ²Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate the physical ideas and equations of ordinary classical statistical mechanics. So the question of a "true quantization" with "ℏ" must be seen as an independent physical problem, not directly related to the quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all the operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). Moreover, spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. By this approach we also recover the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].
Monitoring Method of Cow Anthrax Based on Gis and Spatial Statistical Analysis
NASA Astrophysics Data System (ADS)
Li, Lin; Yang, Yong; Wang, Hongbin; Dong, Jing; Zhao, Yujun; He, Jianbin; Fan, Honggang
A geographic information system (GIS) is a computer application system with the ability to manipulate spatial information, and it has been used in many fields related to spatial information management. Many methods and models have been established for analyzing animal disease distribution and temporal-spatial transmission, and great benefits have been gained from the application of GIS in animal disease epidemiology; GIS is now a very important tool in animal disease epidemiological research. The spatial analysis function of GIS can be widened and strengthened by spatial statistical analysis, allowing deeper exploration, analysis, manipulation and interpretation of the spatial pattern and spatial correlation of an animal disease. In this paper, we analyzed the spatial distribution characteristics of cow anthrax in a target district A (called district A because the epidemic data are confidential), based on a GIS of cow anthrax established for this district, in combination with spatial statistical analysis. Cow anthrax is a biogeochemical disease whose geographical distribution is closely related to the environmental factors of its habitats and which has distinct spatial characteristics; a correct analysis of its spatial distribution therefore plays a very important role in the monitoring, prevention and control of anthrax. However, the application of classic statistical methods is very difficult in some areas because of the pastoral nomadic context: the high mobility of livestock and the lack of suitable sampling currently make it nearly impossible to apply rigorous random sampling methods. It is thus necessary to develop an alternative sampling method which can overcome the lack of sampling and meet the requirements for randomness. The GIS software ArcGIS 9.1 was used to overcome the lack of data at sampling sites. Using ArcGIS 9.1 and GeoDa to analyze the spatial distribution of cow anthrax in district A, we reached two conclusions about cow anthrax density: (1) it follows a spatial clustering model, and (2) it shows intense spatial autocorrelation. We established a prediction model to estimate the anthrax distribution based on the spatial characteristics of cow anthrax density. Compared with the true distribution, the prediction model shows good agreement and is feasible in application. The GIS-based method can be implemented effectively in cow anthrax monitoring and investigation, and the spatial-statistics-based prediction model provides a foundation for further studies on spatially related animal diseases.
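The clustering and autocorrelation conclusions are typically backed by a statistic such as global Moran's I, which GeoDa computes. A self-contained sketch with synthetic data (a generic illustration only; the study's epidemic data are confidential):

```python
import numpy as np

def morans_i(values, coords, bandwidth):
    """Global Moran's I with binary distance-band weights.

    values: attribute per site (e.g., case density); coords: (n, 2) locations.
    Values well above E[I] = -1/(n-1) indicate positive spatial autocorrelation.
    """
    n = len(values)
    z = values - values.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    W = ((d > 0) & (d <= bandwidth)).astype(float)  # neighbours within the band
    return (n / W.sum()) * (z @ W @ z) / (z @ z)

rng = np.random.default_rng(2)
coords = rng.uniform(0.0, 100.0, size=(50, 2))
density = rng.poisson(3, 50).astype(float)  # synthetic "case density"
print(morans_i(density, coords, bandwidth=20.0))
```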
Ten reasons why a thermalized system cannot be described by a many-particle wave function
NASA Astrophysics Data System (ADS)
Drossel, Barbara
2017-05-01
It is widely believed that the underlying reality behind statistical mechanics is a deterministic and unitary time evolution of a many-particle wave function, even though this is in conflict with the irreversible, stochastic nature of statistical mechanics. The usual attempts to resolve this conflict, for instance by appealing to decoherence or eigenstate thermalization, are riddled with problems. This paper considers the theoretical physics of thermalized systems as it is done in practice and shows that all approaches to thermalized systems presuppose, in some form, limits to linear superposition and deterministic time evolution. These considerations include, among others, the classical limit, extensivity, the concepts of entropy and equilibrium, and symmetry breaking in phase transitions and quantum measurement. In conclusion, the paper suggests that the irreversibility and stochasticity of statistical mechanics should be taken as a real property of nature. It follows that a gas of a macroscopic number N of atoms in thermal equilibrium is best represented by a collection of N wave packets of a size of the order of the thermal de Broglie wavelength, which behave quantum mechanically below this scale but classically sufficiently far beyond it. In particular, these wave packets must localize again after scattering events, which requires stochasticity and indicates a connection to the measurement process.
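For scale, the thermal de Broglie wavelength that sets the size of these wave packets is λ = h/√(2πmk_BT). A quick sketch (the helium-4 mass and room temperature below are just an illustrative choice):

```python
import math

# Physical constants (SI units).
H = 6.62607015e-34   # Planck constant, J*s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def thermal_de_broglie_wavelength(mass_kg, temperature_k):
    """lambda = h / sqrt(2*pi*m*k_B*T)."""
    return H / math.sqrt(2.0 * math.pi * mass_kg * K_B * temperature_k)

# Helium-4 atom (~6.64e-27 kg) at 300 K: ~5e-11 m, far below the mean
# interparticle distance in a dilute gas, so the wave packets behave
# classically well beyond this scale, as the paper argues.
print(thermal_de_broglie_wavelength(6.64e-27, 300.0))
```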
Friedrich Nietzsche in Basel: An Apology for Classical Studies
ERIC Educational Resources Information Center
Santini, Carlotta
2018-01-01
Alongside his work as a professor of Greek Language and Literature at the University of Basel, Friedrich Nietzsche reflected on the value of classical studies in contemporary nineteenth-century society, starting with a self-analysis of his own classical training and position as a philologist and teacher. Contrary to his well-known aversion to…
Fourier Analysis in Introductory Physics
ERIC Educational Resources Information Center
Huggins, Elisha
2007-01-01
In an after-dinner talk at the fall 2005 meeting of the New England chapter of the AAPT, Professor Robert Arns drew an analogy between classical physics and Classic Coke. To generations of physics teachers and textbook writers, classical physics was the real thing. Modern physics, which in introductory textbooks "appears in one or more extra…
NASA Astrophysics Data System (ADS)
Kbaier Ben Ismail, Dhouha; Lazure, Pascal; Puillat, Ingrid
2016-10-01
In marine sciences, many fields display high variability over a large range of spatial and temporal scales, from seconds to thousands of years. The long time series now recorded in this field, at ever higher sampling frequencies, are often nonlinear, nonstationary, multiscale, and noisy. Their analysis faces new challenges and thus requires adequate, specific methods. The objective of this paper is to bring time series analysis methods already applied in econometrics, signal processing, health sciences, etc. to the environmental marine domain, to assess their advantages and drawbacks, and to compare classical techniques with more recent ones. Temperature, turbidity, and salinity are important quantities for ecosystem studies. The authors consider the fluctuations of sea level, salinity, turbidity, and temperature recorded by the MAREL Carnot system of Boulogne-sur-Mer (France), a moored buoy equipped with physico-chemical measuring devices operating continuously and autonomously. In order to perform adequate statistical and spectral analyses, it is necessary to know the nature of the time series considered. For this purpose, the stationarity of the series and the occurrence of unit roots are addressed with Augmented Dickey-Fuller tests. For example, harmonic analysis is not relevant for temperature, turbidity, and salinity because of their nonstationarity; it applies only to the nearly stationary sea level datasets. To identify the dominant frequencies associated with the dynamics, the large number of data points provided by the sensors should enable Fourier spectral analysis; the resulting power spectra show complex variability and reveal the influence of environmental factors such as tides. However, classical spectral analysis, namely the Blackman-Tukey method, requires not only linear and stationary data but also evenly spaced data, and interpolating the time series introduces numerous artifacts. The Lomb-Scargle algorithm, adapted to unevenly spaced data, is used as an alternative, and its limits are set out: beyond 50% missing measurements, few significant frequencies are detected, several seasonalities are no longer visible, and even a whole range of high frequencies disappears progressively. Furthermore, two time-frequency decomposition methods, wavelets and the Hilbert-Huang Transform (HHT), are applied to the entire dataset. Using the Continuous Wavelet Transform (CWT), some properties of the time series are determined. Then the inertial wave and several low-frequency tidal waves are identified by applying Empirical Mode Decomposition (EMD). Finally, EMD-based Time Dependent Intrinsic Correlation (TDIC) analysis is applied to assess the correlation between two nonstationary time series.
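To illustrate why Lomb-Scargle suits such gappy records, here is a minimal sketch using scipy (the 30-day synthetic series with a single M2 tidal line is invented for illustration; the MAREL data are not reproduced here):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)

# Hypothetical stand-in for a gappy buoy record: an M2 tidal line
# (period 12.42 h) sampled at uneven times, plus noise.
t = np.sort(rng.uniform(0.0, 30.0, 500))   # observation times in days
period = 12.42 / 24.0                       # M2 period in days
y = np.sin(2 * np.pi * t / period) + 0.3 * rng.standard_normal(t.size)

# Frequencies to scan, in cycles/day, converted to rad/day for scipy.
freqs_cpd = np.linspace(0.1, 4.0, 2000)
pgram = lombscargle(t, y - y.mean(), 2 * np.pi * freqs_cpd)

print("peak near %.2f cycles/day" % freqs_cpd[pgram.argmax()])  # ~1.93 (M2)
```

No interpolation onto a regular grid is needed, which is precisely what avoids the artifacts the abstract warns about.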
NASA Astrophysics Data System (ADS)
Lusanna, Luca; Pauri, Massimo
2014-08-01
If the classical structure of space-time is assumed to define an a priori scenario for the formulation of quantum theory (QT), the coordinate representation of the solutions of the Schroedinger equation of a quantum system containing one (N) massive scalar particle(s) has a preferred status. Consider all of the solutions admitting a multipolar expansion of the probability density function (and, more generally, of the Wigner function) around a suitably selected space-time trajectory. For every normalized solution there is a privileged trajectory implying the vanishing of the dipole moment of the multipolar expansion: it is given by the expectation value of the position operator. The special subset of solutions satisfying Ehrenfest's theorem (named thereby Ehrenfest monopole wave functions, EMWF) then has the important property that this privileged classical trajectory is determined by a closed Newtonian equation of motion in which the effective force is the Newtonian force plus non-Newtonian terms (of order ℏ² or higher) depending on the higher multipoles of the probability distribution ρ. Note that the superposition of two EMWFs is not an EMWF, a result to be strongly hoped for, given the possible unwanted implications concerning classical spatial perception. These results can be extended to N-particle systems in such a way that, when N classical trajectories with all dipole moments vanishing and satisfying Ehrenfest's theorem are associated with the normalized wave functions of the N-body system, we get a natural transition from the 3N-dimensional configuration space to space-time. Moreover, these results can be extended to relativistic quantum mechanics. Consequently, in suitable states of N quantum particles which are EMWF, we get the "emergence" of corresponding "classical particles" following Newton-like trajectories in space-time. Note that all this holds true in the standard framework of quantum mechanics, i.e., assuming, in particular, the validity of Born's rule and the individual-system interpretation of the wave function (no ensemble interpretation). These results are valid without any approximation (such as ℏ → 0, large quantum numbers, etc.). Moreover, we do not commit ourselves to any specific ontological interpretation of quantum theory (such as, e.g., the Bohmian one). We will argue that, in substantial agreement with Bohr's viewpoint, the macroscopic description of the preparation, certain intermediate steps, and the detection of the final outcome of experiments involving massive particles are dominated by these classical "effective" trajectories. This approach can be applied to the decoherence point of view in the case of a diagonal reduced density matrix ρ_red (an improper mixture) depending on the position variables of a massive particle and of a pointer. When both the particle and the pointer wave functions appearing in ρ_red are EMWF, the expectation value of the particle and pointer position variables becomes a statistical average over a classical ensemble. In these cases an improper quantum mixture becomes a classical statistical one, thus providing a particular answer to an open problem of decoherence about the emergence of classicality.
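For orientation, Ehrenfest's theorem and the kind of closed Newton-like equation the abstract describes can be sketched as follows (a standard statement, written for a smooth potential V; the notation Σ_ij for the second central moment of ρ is ours, not the authors'):

```latex
% Ehrenfest's theorem for H = p^2/2m + V(x):
\frac{d\langle \hat{x}_k \rangle}{dt} = \frac{\langle \hat{p}_k \rangle}{m},
\qquad
\frac{d\langle \hat{p}_k \rangle}{dt}
  = -\,\bigl\langle \partial_k V(\hat{x}) \bigr\rangle .

% Expanding the right-hand side around <x> (the dipole term vanishes
% for an EMWF) gives a closed equation with multipole corrections:
m \, \frac{d^2 \langle \hat{x}_k \rangle}{dt^2}
  = -\,\partial_k V(\langle \hat{x} \rangle)
    \;-\; \tfrac{1}{2}\, \partial_i \partial_j \partial_k
          V(\langle \hat{x} \rangle)\, \Sigma_{ij} \;+\; \cdots
```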
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs are typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied to the usual least squares estimators of central tendency and variability, and the Welch test applied to robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it with the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
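For concreteness, a minimal sketch of Welch's heteroscedastic one-way test, plus a crude trimmed variant (the data are synthetic; note that the robust procedure discussed above pairs trimmed means with Winsorized variances, which this sketch only approximates by trimming the samples):

```python
import numpy as np
from scipy import stats

def welch_anova(groups):
    """Welch's heteroscedastic one-way ANOVA; returns (F*, df1, df2, p)."""
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                                   # precision weights
    mw = (w * m).sum() / w.sum()                # weighted grand mean
    a = (w * (m - mw) ** 2).sum() / (k - 1)
    c = ((1 - w / w.sum()) ** 2 / (n - 1)).sum()
    f = a / (1 + 2 * (k - 2) / (k ** 2 - 1) * c)
    df2 = (k ** 2 - 1) / (3 * c)
    return f, k - 1, df2, stats.f.sf(f, k - 1, df2)

rng = np.random.default_rng(1)
# Skewed, heteroscedastic toy groups (hypothetical data).
groups = [rng.lognormal(0.0, s, n) for s, n in [(0.5, 20), (1.0, 15), (1.5, 25)]]
# 20%-trimmed samples as a rough stand-in for the robust variant.
trimmed = [np.sort(g)[int(0.2 * len(g)):len(g) - int(0.2 * len(g))] for g in groups]

print(welch_anova(groups))
print(welch_anova(trimmed))
```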
An Integrated Approach to Thermodynamics in the Introductory Physics Course.
ERIC Educational Resources Information Center
Alonso, Marcelo; Finn, Edward J.
1995-01-01
Presents an approach combining the empirical perspective of classical thermodynamics with the structural perspective of statistical mechanics. Topics covered include dynamical foundation of the first law; mechanical work, heat, radiation, and the first law; thermal equilibrium; thermal processes; thermodynamic probability; entropy; the second law;…
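The hinge between the two perspectives listed above is the Boltzmann relation tying entropy to thermodynamic probability, with the first law kept in its empirical form (sign convention assumed here: W is work done by the system):

```latex
S = k_B \ln W, \qquad dU = \delta Q - \delta W
```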
A New Challenge for Compression Algorithms: Genetic Sequences.
ERIC Educational Resources Information Center
Grumbach, Stephane; Tahi, Fariza
1994-01-01
Analyzes the properties of genetic sequences that cause the failure of classical algorithms used for data compression. A lossless algorithm, which compresses the information contained in DNA and RNA sequences by detecting regularities such as palindromes, is presented. This algorithm combines substitutional and statistical methods and appears to…
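As a toy illustration of the palindrome detection mentioned above (this is not the authors' algorithm, which also combines substitutional and statistical coding; the sequence and window length are invented): in DNA, a "palindrome" reads as the reverse complement of an earlier segment, so it can be encoded as a back-reference instead of stored verbatim.

```python
# Map each base to its Watson-Crick complement.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(s):
    return s.translate(COMPLEMENT)[::-1]

def find_palindromic_repeats(seq, min_len=4):
    """Yield (copy_pos, original_pos, length) for windows whose reverse
    complement occurred earlier, i.e. candidates for back-references."""
    seen = {}
    hits = []
    for i in range(len(seq) - min_len + 1):
        word = seq[i:i + min_len]
        rc = reverse_complement(word)
        if rc in seen and seen[rc] + min_len <= i:  # non-overlapping match
            hits.append((i, seen[rc], min_len))
        seen.setdefault(word, i)
    return hits

print(find_palindromic_repeats("AAGCTTCCGGAAGCTT"))
```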
Spacecraft Formation Control and Estimation Via Improved Relative Motion Dynamics
2017-03-30
statistical (e.g., batch least-squares or Extended Kalman Filter) estimator. In addition, the IROD approach can be applied to classical (ground-based…covariance. Test the viability of IROD solutions by injecting them into precise orbit determination schemes (e.g., various strains of Kalman filters)
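For reference, the batch least-squares estimator named above, in its most stripped-down linear form (the measurement model H and the two-component state are invented placeholders; a real orbit determination scheme would use nonlinear dynamics and measurement partials):

```python
import numpy as np

rng = np.random.default_rng(2)

# Given measurements y = H x + noise, solve for the state x in the
# normal-equations sense and report the (unit-noise) covariance.
x_true = np.array([7000.0, 7.5])           # hypothetical state, e.g. range, range-rate
H = np.column_stack([np.ones(50), np.linspace(0.0, 10.0, 50)])
y = H @ x_true + rng.standard_normal(50)   # noisy observations

x_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
P = np.linalg.inv(H.T @ H)                 # estimate covariance for unit noise
print(x_hat, np.sqrt(np.diag(P)))
```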