Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.
1983-09-01
Research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly... probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U.S... Deformation, Dynamic Response Analysis, Seepage, Soil Permeability and Piping, Earthquake Engineering, Seismology, Settlement and Heave, Seismic Risk Analysis
Statistical Analysis on the Mechanical Properties of Magnesium Alloys
Liu, Ruoyu; Jiang, Xianquan; Zhang, Hongju; Zhang, Dingfei; Wang, Jingfeng; Pan, Fusheng
2017-01-01
Knowledge of the statistical characteristics of mechanical properties is very important for the practical application of structural materials. Unfortunately, the scatter characteristics of the mechanical properties of magnesium alloys have remained poorly understood until now. In this study, the mechanical reliability of magnesium alloys is systematically estimated using Weibull statistical analysis. Interestingly, the Weibull modulus, m, of strength for magnesium alloys is as high as that for aluminum alloys and steels, confirming the very high reliability of magnesium alloys. The high predictability of the tensile strength of magnesium alloys represents the capability of preventing catastrophic premature failure during service, which is essential for safety and reliability assessment. PMID:29113116
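As an illustration of the Weibull statistical analysis referred to above, the following minimal Python sketch estimates a Weibull modulus m and characteristic strength from a set of measured strengths by linearized rank regression; the strength values are hypothetical and not taken from the study.

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m and characteristic strength sigma_0
    by linear regression on the linearized Weibull CDF."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    F = (np.arange(1, n + 1) - 0.5) / n      # plotting positions F_i = (i - 0.5) / n
    x = np.log(s)                            # ln(strength)
    y = np.log(-np.log(1.0 - F))             # ln ln [1 / (1 - F)]
    m, c = np.polyfit(x, y, 1)               # slope = Weibull modulus m
    sigma0 = np.exp(-c / m)                  # characteristic strength
    return m, sigma0

# hypothetical tensile strengths (MPa) for one alloy condition
strengths = [248, 252, 255, 259, 261, 263, 266, 268, 271, 275]
m, s0 = weibull_modulus(strengths)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {s0:.1f} MPa")
```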
APPLICATION OF STATISTICAL ENERGY ANALYSIS TO VIBRATIONS OF MULTI-PANEL STRUCTURES.
cylindrical shell are compared with predictions obtained from statistical energy analysis. Generally good agreement is observed. The flow of mechanical... the coefficients of proportionality between power flow and average modal energy difference, which one must know in order to apply statistical energy analysis. No
The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes
ERIC Educational Resources Information Center
Cartier, Stephen F.
2011-01-01
A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
Experimental Quiet Sprocket Design and Noise Reduction in Tracked Vehicles
1981-04-01
Track and Suspension Noise Reduction, Statistical Energy Analysis, Mechanical Impedance Measurement, Finite Element Modal Analysis, Noise Sources 2... shape and idler attachment are different. These differences were investigated using the concepts of statistical energy analysis for hull generated noise... element calculated from Statistical Energy Analysis. Such an approach will be valid within reasonable limits for frequencies of about 200 Hz and
A κ-generalized statistical mechanics approach to income analysis
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2009-02-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit the data on personal income for the United States remarkably well, and the analysis of inequality performed in terms of its parameters proves very powerful.
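For readers who want to experiment with the κ-generalized framework, the sketch below implements the κ-exponential together with the density and survival function in the form usually quoted in the κ-generalized income literature; the parameter values are illustrative assumptions, and the exact parametrization should be checked against the paper.

```python
import numpy as np

def exp_kappa(x, kappa):
    """kappa-exponential: (sqrt(1 + k^2 x^2) + k x)^(1/k); reduces to exp(x) as k -> 0."""
    if kappa == 0:
        return np.exp(x)
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

def kgen_sf(x, alpha, beta, kappa):
    """Survival function P(X > x) = exp_k(-beta * x^alpha)."""
    return exp_kappa(-beta * x ** alpha, kappa)

def kgen_pdf(x, alpha, beta, kappa):
    """Density f(x) = -d/dx exp_k(-beta * x^alpha)."""
    u = beta * x ** alpha
    return alpha * beta * x ** (alpha - 1) * exp_kappa(-u, kappa) / np.sqrt(1.0 + kappa**2 * u**2)

# illustrative parameters (hypothetical, not fitted to any data set)
x = np.linspace(0.01, 10, 500)
pdf = kgen_pdf(x, alpha=2.0, beta=0.5, kappa=0.7)
print("probability mass on [0.01, 10] ~", float(np.sum(pdf) * (x[1] - x[0])))
```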
NASA Astrophysics Data System (ADS)
Shinzato, Takashi
2016-12-01
The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotic behaviors using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.
A Mechanical Power Flow Capability for the Finite Element Code NASTRAN
1989-07-01
perimental methods: statistical energy analysis, the finite element method, and a finite element analogy using heat conduction equations. Experimental... weights and inertias of the transducers attached to an experimental structure may produce accuracy problems. Statistical energy analysis (SEA) is a... 405-422 (1987). 8. Lyon, R.H., Statistical Energy Analysis of Dynamical Systems, The M.I.T. Press (1975). 9. Mickol, J.D., and R.J. Bernhard, "An
Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin
We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data, provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.
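The statistical deconvolution step described here is commonly implemented as a mixture-model fit to the grid-indentation results. The sketch below, using hypothetical modulus values rather than the study's data, separates two phases with a two-component Gaussian mixture.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# hypothetical grid-indentation moduli (GPa): a compliant binder/matrix phase
# and a stiff active-particle phase, mixed in unknown proportions
moduli = np.concatenate([
    rng.normal(8.0, 2.0, 300),     # matrix-like phase
    rng.normal(140.0, 20.0, 200),  # particle-like phase
]).reshape(-1, 1)

# two-component Gaussian mixture as a simple statistical deconvolution
gmm = GaussianMixture(n_components=2, random_state=0).fit(moduli)
for w, mu, var in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    print(f"phase fraction {w:.2f}: E = {mu:.1f} +/- {np.sqrt(var):.1f} GPa")
```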
Analysis of Longitudinal Outcome Data with Missing Values in Total Knee Arthroplasty.
Kang, Yeon Gwi; Lee, Jang Taek; Kang, Jong Yeal; Kim, Ga Hye; Kim, Tae Kyun
2016-01-01
We sought to determine the influence of missing data on the statistical results, and to determine which statistical method is most appropriate for the analysis of longitudinal outcome data of TKA with missing values among repeated measures ANOVA, generalized estimating equations (GEE) and mixed effects model repeated measures (MMRM). Data sets with missing values were generated with different proportions of missing data, sample sizes and missing-data generation mechanisms. Each data set was analyzed with the three statistical methods. The influence of missing data was greater with a higher proportion of missing data and a smaller sample size. MMRM tended to show the least change in the statistics. When missing values were generated by a 'missing not at random' mechanism, no statistical method could fully avoid deviations in the results. Copyright © 2016 Elsevier Inc. All rights reserved.
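A comparison of this kind can be prototyped in Python with statsmodels, as in the hedged sketch below: simulated longitudinal data with random dropout are analyzed with a GEE and with a random-intercept mixed model (a simplification of a full MMRM, which would use an unstructured covariance over visits). All variable names and parameter values are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# simulate a simple longitudinal outcome: n subjects, 4 visits, subject-level random intercept
n, visits = 100, 4
subject = np.repeat(np.arange(n), visits)
time = np.tile(np.arange(visits), n)
b = rng.normal(0, 2.0, n)                       # random intercepts
y = 60 + 2.5 * time + b[subject] + rng.normal(0, 3.0, n * visits)
df = pd.DataFrame({"id": subject, "time": time, "y": y})

# impose missing-completely-at-random dropout on ~20% of post-baseline visits
drop = (df["time"] > 0) & (rng.random(len(df)) < 0.2)
obs = df.loc[~drop]

# GEE with an exchangeable working correlation
gee = smf.gee("y ~ time", groups="id", data=obs,
              cov_struct=sm.cov_struct.Exchangeable()).fit()

# mixed-effects model (random intercept), a simple stand-in for a full MMRM
mix = smf.mixedlm("y ~ time", data=obs, groups="id").fit()

print(gee.params, mix.params, sep="\n")
```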
The Shock and Vibration Digest. Volume 13. Number 7
1981-07-01
Richards, ISVR, University of Southampton, Presidential Address "A Structural Dynamicist Looks at Statistical Energy Analysis," Professor B.L... excitation and for random and sine sweep mechanical excitation. Test data were used to assess prediction methods, in particular a statistical energy analysis method
EFFECTS OF COMPOSITION ON THE MECHANICAL PROPERTIES OF NI-CR-MO-CO FILLER METALS.
STEEL, WELDING RODS), CHEMICAL ANALYSIS, CARBON ALLOYS, COBALT ALLOYS, CHROMIUM ALLOYS, MOLYBDENUM ALLOYS, NICKEL ALLOYS, MARAGING STEELS... ALUMINUM COMPOUNDS, TITANIUM, NONMETALS, SHIP HULLS, SHIP PLATES, SUBMARINE HULLS, WELDING, WELDS, MECHANICAL PROPERTIES, STATISTICAL ANALYSIS, MICROSTRUCTURE.
ERIC Educational Resources Information Center
Findley, Bret R.; Mylon, Steven E.
2008-01-01
We introduce a computer exercise that bridges spectroscopy and thermodynamics using statistical mechanics and the experimental data taken from the commonly used laboratory exercise involving the rotational-vibrational spectrum of HCl. Based on the results from the analysis of their HCl spectrum, students calculate bulk thermodynamic properties…
Mechanical properties of silicate glasses exposed to a low-Earth orbit
NASA Technical Reports Server (NTRS)
Wiedlocher, David E.; Tucker, Dennis S.; Nichols, Ron; Kinser, Donald L.
1992-01-01
The effects of a 5.8 year exposure to low earth orbit environment upon the mechanical properties of commercial optical fused silica, low iron soda-lime-silica, Pyrex 7740, Vycor 7913, BK-7, and the glass ceramic Zerodur were examined. Mechanical testing employed the ASTM-F-394 piston on 3-ball method in a liquid nitrogen environment. Samples were exposed on the Long Duration Exposure Facility (LDEF) in two locations. Impacts were observed on all specimens except Vycor. Weibull analysis as well as a standard statistical evaluation were conducted. The Weibull analysis revealed no differences between control samples and the two exposed samples. We thus concluded that radiation components of the Earth orbital environment did not degrade the mechanical strength of the samples examined within the limits of experimental error. The upper bound of strength degradation for meteorite impacted samples based upon statistical analysis and observation was 50 percent.
A statistical mechanics approach to autopoietic immune networks
NASA Astrophysics Data System (ADS)
Barra, Adriano; Agliari, Elena
2010-07-01
In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.
Probabilistic/Fracture-Mechanics Model For Service Life
NASA Technical Reports Server (NTRS)
Watkins, T., Jr.; Annis, C. G., Jr.
1991-01-01
Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
The effects of multiple repairs on Inconel 718 weld mechanical properties
NASA Technical Reports Server (NTRS)
Russell, C. K.; Nunes, A. C., Jr.; Moore, D.
1991-01-01
Inconel 718 weldments were repaired 3, 6, 9, and 13 times using the gas tungsten arc welding process. The welded panels were machined into mechanical test specimens, postweld heat treated, and nondestructively tested. Tensile properties and high cycle fatigue life were evaluated and the results compared to unrepaired weld properties. Mechanical property data were analyzed using the statistical methods of difference in means for tensile properties and difference in log means and Weibull analysis for high cycle fatigue properties. Statistical analysis performed on the data did not show a significant decrease in tensile or high cycle fatigue properties due to the repeated repairs. Some degradation was observed in all properties; however, it was minimal.
Anomalous heat transfer modes of nanofluids: a review based on statistical analysis
NASA Astrophysics Data System (ADS)
Sergis, Antonis; Hardalupas, Yannis
2011-05-01
This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular mechanisms proposed in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932
NASA Astrophysics Data System (ADS)
Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.
2018-02-01
In this paper, we perform statistical analysis of time series deriving from Earth's climate. The time series are concerned with Geopotential Height (GH) and correspond to temporal and spatial components of the global distribution of monthly average values during the period 1948-2012. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of Tsallis' q-triplet, namely {qstat, qsens, qrel}, the reconstructed phase space, and the estimation of the correlation dimension and the Hurst exponent from rescaled range analysis (R/S). The deviation of the Tsallis q-triplet from unity indicates a non-Gaussian (Tsallis q-Gaussian) non-extensive character with heavy-tailed probability density functions (PDFs), multifractal behavior and long-range dependence for all time series considered. Noticeable differences in the q-triplet estimates were also found between time series at distinct spatial or temporal regions. Moreover, the reconstructed phase space revealed a lower-dimensional fractal set in the GH dynamical phase space (strong self-organization), and the estimation of the Hurst exponent indicated multifractality, non-Gaussianity and persistence. The analysis provides significant information for identifying and characterizing the dynamical characteristics of the Earth's climate.
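Of the quantities mentioned here, the Hurst exponent from rescaled range (R/S) analysis is the most straightforward to reproduce; the sketch below is a minimal, generic R/S estimator applied to synthetic series (it is not the authors' code, and the window-halving scheme is a simplifying assumption).

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent of a 1-D series by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = x.size
    sizes, rs = [], []
    size = n
    while size >= min_chunk:
        values = []
        for start in range(0, n - size + 1, size):
            seg = x[start:start + size]
            dev = np.cumsum(seg - seg.mean())
            r = dev.max() - dev.min()            # range of cumulative deviations
            s = seg.std(ddof=1)                  # standard deviation of the segment
            if s > 0:
                values.append(r / s)
        if values:
            sizes.append(size)
            rs.append(np.mean(values))
        size //= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope                                  # Hurst exponent H

rng = np.random.default_rng(0)
noise = rng.normal(size=4096)
print("white noise H ~", round(hurst_rs(noise), 2))            # expect ~0.5
print("random walk H ~", round(hurst_rs(np.cumsum(noise)), 2))  # expect close to 1 (persistent)
```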
THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.
Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil
2016-10-01
In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed that Cluster Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors. PMID:28042512
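The Cluster Shade feature highlighted in this abstract can be computed from a gray-level co-occurrence matrix in a few lines; the sketch below uses scikit-image for the GLCM and evaluates Cluster Shade on a random stand-in image (the quantization to 64 levels and the single offset are illustrative choices, not the study's settings).

```python
import numpy as np
from skimage.feature import graycomatrix  # named 'greycomatrix' in scikit-image < 0.19

def cluster_shade(image, levels=64, distance=1, angle=0.0):
    """Compute the GLCM Cluster Shade feature from an 8-bit grayscale image."""
    # quantize to a modest number of gray levels so the GLCM is well populated
    img = (image.astype(float) / 256.0 * levels).astype(np.uint8)
    glcm = graycomatrix(img, distances=[distance], angles=[angle],
                        levels=levels, symmetric=True, normed=True)[:, :, 0, 0]
    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    mu_i = np.sum(i * glcm)
    mu_j = np.sum(j * glcm)
    return np.sum((i + j - mu_i - mu_j) ** 3 * glcm)

rng = np.random.default_rng(0)
texture = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)  # stand-in for a projection image
print("Cluster Shade:", cluster_shade(texture))
```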
Confounding in statistical mediation analysis: What it is and how to address it.
Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P
2017-11-01
Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
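As a concrete companion to the traditional regression approach reviewed first in this abstract, the sketch below computes the mediated effect as the product of coefficients a*b with a percentile-bootstrap confidence interval on simulated data; the counseling dataset is not reproduced here, so the data, effect sizes and variable names are all hypothetical, and no confounder adjustment is included.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# simulated trial: randomized treatment X, mediator M, outcome Y (hypothetical data)
n = 300
x = rng.integers(0, 2, n)                       # 0/1 treatment assignment
m = 0.5 * x + rng.normal(size=n)                # mediator affected by treatment (path a)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)      # outcome affected by mediator (path b)

def ab_estimate(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                        # X -> M
    b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[2]  # M -> Y given X
    return a * b

# percentile bootstrap confidence interval for the mediated (indirect) effect a*b
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(ab_estimate(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"a*b = {ab_estimate(x, m, y):.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```

The potential-outcomes estimators discussed in the abstract go beyond this by adjusting for measured mediator-outcome confounders and probing unmeasured ones via sensitivity analysis.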
Statistical evidence of strain induced breaking of metallic point contacts
NASA Astrophysics Data System (ADS)
Alwan, Monzer; Candoni, Nadine; Dumas, Philippe; Klein, Hubert R.
2013-06-01
Scanning tunneling microscopy in the break-junction regime and a mechanically controllable break junction are used to acquire thousands of conductance-elongation curves by repeatedly stretching Au junctions until they break and then re-connecting them. From a robust statistical analysis performed on large sets of experiments, parameters such as lifetime, elongation and occurrence probabilities are extracted. The analysis of results obtained for different stretching speeds of the electrodes indicates that the breaking mechanism of di- and mono-atomic junctions is identical, and that the junctions undergo atomic rearrangement during their stretching and at the moment of breaking.
NASA Astrophysics Data System (ADS)
Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia
2014-02-01
How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both growth behaviors observed. The linear theory, by ignoring the possibility of interaction effects, would instead underestimate the quantifiers by up to 30% when immigrant densities are low, and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.
Generalized statistical mechanics approaches to earthquakes and tectonics.
Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios
2016-12-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
Statistical models and NMR analysis of polymer microstructure
USDA-ARS?s Scientific Manuscript database
Statistical models can be used in conjunction with NMR spectroscopy to study polymer microstructure and polymerization mechanisms. Thus, Bernoullian, Markovian, and enantiomorphic-site models are well known. Many additional models have been formulated over the years for additional situations. Typica...
Statistical Analysis Tools for Learning in Engineering Laboratories.
ERIC Educational Resources Information Center
Maher, Carolyn A.
1990-01-01
Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…
Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements
NASA Technical Reports Server (NTRS)
Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.
1988-01-01
The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining the second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties and crack length, orientation, and location.
Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge
2016-05-04
Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study's objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zubko, I. Yu., E-mail: zoubko@list.ru; Kochurov, V. I.
2015-10-27
With the aim of controlling crystal temperature, a computational-statistical approach to studying the thermo-mechanical properties of finite-sized crystals is presented. The approach is based on the combination of high-performance computational techniques and statistical analysis of the crystal response to external thermo-mechanical actions for specimens with a statistically small number of atoms (for instance, nanoparticles). The thermal motion of atoms is imitated in the statics approach by including independent degrees of freedom for the atoms associated with their oscillations. We found that under heating the response of the graphene material is nonsymmetric.
A Three Dimensional Kinematic and Kinetic Study of the Golf Swing
Nesbit, Steven M.
2005-01-01
This paper discusses the three-dimensional kinematics and kinetics of a golf swing as performed by 84 male and one female amateur subjects of various skill levels. The analysis was performed using a variable full-body computer model of a human coupled with a flexible model of a golf club. Data to drive the model was obtained from subject swings recorded using a multi-camera motion analysis system. Model output included club trajectories, golfer/club interaction forces and torques, work and power, and club deflections. These data formed the basis for a statistical analysis of all subjects, and a detailed analysis and comparison of the swing characteristics of four of the subjects. The analysis generated much new data concerning the mechanics of the golf swing. It revealed that a golf swing is a highly coordinated and individual motion and subject-to-subject variations were significant. The study highlighted the importance of the wrists in generating club head velocity and orienting the club face. The trajectory of the hands and the ability to do work were the factors most closely related to skill level. Key Points: Full-body model of the golf swing. Mechanical description of the golf swing. Statistical analysis of golf swing mechanics. Comparisons of subject swing mechanics. PMID:24627665
NASA Astrophysics Data System (ADS)
Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle
2017-08-01
In order to obtain cassava starch films with improved mechanical properties relative to the synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol and modified clay contents. Modified bentonite clay was used as a filler material in the biofilm. Glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested against the experimental data through the following statistical analysis: a Pareto chart. The modified clay was the factor of greatest statistical significance for the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
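A 2³ factorial analysis of this kind can be reproduced with ordinary least squares on coded ±1 factors; the sketch below (with invented response values, not the study's measurements) fits the full model with all interactions and ranks the absolute effect estimates in the spirit of a Pareto chart.

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical coded 2^3 design: starch, glycerol, clay at -1/+1 levels,
# two replicates, with made-up tensile-strength responses (MPa) for illustration only
levels = list(itertools.product([-1, 1], repeat=3))
df = pd.DataFrame(levels * 2, columns=["starch", "glycerol", "clay"])
df["strength"] = [3.1, 3.4, 2.8, 3.0, 4.0, 4.6, 3.5, 4.2,
                  3.0, 3.5, 2.9, 3.1, 4.1, 4.4, 3.6, 4.3]

# full factorial model with all two- and three-way interactions
fit = smf.ols("strength ~ starch * glycerol * clay", data=df).fit()

# Pareto-style ranking of effect magnitudes (effect = 2 x coded coefficient)
effects = (2.0 * fit.params.drop("Intercept")).abs().sort_values(ascending=False)
print(effects)
```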
Improving Markov Chain Models for Road Profiles Simulation via Definition of States
2012-04-01
wavelet transform in pavement profile analysis," Vehicle System Dynamics: International Journal of Vehicle Mechanics and Mobility, vol. 47, no. 4... "Estimating Markov Transition Probabilities from Micro-Unit Data," Journal of the Royal Statistical Society, Series C (Applied Statistics), pp. 355-371
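Estimating Markov transition probabilities from a discretized road-profile state sequence, as discussed in this report, reduces to counting observed transitions and normalizing each row; the sketch below does this on a randomly generated sequence (the four-state discretization is an arbitrary illustrative choice).

```python
import numpy as np

def estimate_transition_matrix(states, n_states):
    """Maximum-likelihood estimate of a first-order Markov transition matrix
    from an observed sequence of integer state labels 0..n_states-1."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0          # avoid division by zero for unvisited states
    return counts / row_sums

# hypothetical sequence of discretized road-elevation states
rng = np.random.default_rng(0)
seq = rng.integers(0, 4, size=1000)
P = estimate_transition_matrix(seq, 4)
print(np.round(P, 3))
```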
SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliopoulos, AS; Sun, X; Floros, D
Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
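The idea of a multi-scale pyramid of local statistics driving an otherwise global filter can be sketched as follows; this is not the authors' CPU/GPU implementation, only a simplified stand-in that computes local mean/variance at several window sizes and uses them in a Lee/Wiener-style locally adaptive filter on a synthetic image with spatially varying noise.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_stats(img, size):
    """Local mean and variance in a size x size window."""
    mean = uniform_filter(img, size)
    var = uniform_filter(img * img, size) - mean * mean
    return mean, np.clip(var, 0, None)

def adaptive_wiener(img, size=7, noise_var=None):
    """Locally adaptive (Lee/Wiener-style) filter driven by local statistics."""
    mean, var = local_stats(img, size)
    if noise_var is None:
        noise_var = np.median(var)          # crude global noise estimate
    gain = np.clip(var - noise_var, 0, None) / np.maximum(var, 1e-12)
    return mean + gain * (img - mean)

rng = np.random.default_rng(0)
clean = np.outer(np.linspace(0, 1, 256), np.ones(256))          # smooth ramp "projection"
noisy = clean + rng.normal(0, 0.05 * (1 + clean), clean.shape)  # spatially varying noise

# pyramid of local statistics at several spatial scales
for s in (3, 7, 15, 31):
    _, var = local_stats(noisy, s)
    print(f"window {s:2d}: median local std = {np.sqrt(np.median(var)):.3f}")

filtered = adaptive_wiener(noisy, size=7)
print("residual std after filtering:", np.std(filtered - clean))
```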
Balanced mechanical resonator for powder handling device
NASA Technical Reports Server (NTRS)
Sarrazin, Philippe C. (Inventor); Brunner, Will M. (Inventor)
2012-01-01
A system incorporating a balanced mechanical resonator and a method for vibration of a sample composed of granular material to generate motion of a powder sample inside the sample holder for obtaining improved analysis statistics, without imparting vibration to the sample holder support.
Methodologies for the Statistical Analysis of Memory Response to Radiation
NASA Astrophysics Data System (ADS)
Bosser, Alexandre L.; Gupta, Viyas; Tsiligiannis, Georgios; Frost, Christopher D.; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigné, Frédéric; Virtanen, Ari; Wrobel, Frédéric; Dilillo, Luigi
2016-08-01
Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
Acoustic emission spectral analysis of fiber composite failure mechanisms
NASA Technical Reports Server (NTRS)
Egan, D. M.; Williams, J. H., Jr.
1978-01-01
The acoustic emission of graphite fiber polyimide composite failure mechanisms was investigated with emphasis on frequency spectrum analysis. Although visual examination of spectral densities could not distinguish among fracture sources, a paired-sample t statistical analysis of mean normalized spectral densities did provide quantitative discrimination among acoustic emissions from 10-deg, 90-deg, and (±45, ±45)s specimens. Comparable discrimination was not obtained for 0-deg specimens.
Statistical mechanics of economics I
NASA Astrophysics Data System (ADS)
Kusmartsev, F. V.
2011-02-01
We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the economics in the USA between the years 1996 and 2008 and observe that over that time the income of a major part of the population is well described by the Bose-Einstein distribution, whose parameters are different for each year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
Statistical analysis of early failures in electromigration
NASA Astrophysics Data System (ADS)
Gall, M.; Capasso, C.; Jawarani, D.; Hernandez, R.; Kawasaki, H.; Ho, P. S.
2001-07-01
The detection of early failures in electromigration (EM) and the complicated statistical nature of this important reliability phenomenon have been difficult issues to treat in the past. A satisfactory experimental approach for the detection and the statistical analysis of early failures has not yet been established. This is mainly due to the rare occurrence of early failures and difficulties in testing of large sample populations. Furthermore, experimental data on the EM behavior as a function of varying number of failure links are scarce. In this study, a technique utilizing large interconnect arrays in conjunction with the well-known Wheatstone Bridge is presented. Three types of structures with a varying number of Ti/TiN/Al(Cu)/TiN-based interconnects were used, starting from a small unit of five lines in parallel. A serial arrangement of this unit enabled testing of interconnect arrays encompassing 480 possible failure links. In addition, a Wheatstone Bridge-type wiring using four large arrays in each device enabled simultaneous testing of 1920 interconnects. In conjunction with a statistical deconvolution to the single interconnect level, the results indicate that the electromigration failure mechanism studied here follows perfect lognormal behavior down to the four sigma level. The statistical deconvolution procedure is described in detail. Over a temperature range from 155 to 200 °C, a total of more than 75 000 interconnects were tested. None of the samples have shown an indication of early, or alternate, failure mechanisms. The activation energy of the EM mechanism studied here, namely the Cu incubation time, was determined to be Q=1.08±0.05 eV. We surmise that interface diffusion of Cu along the Al(Cu) sidewalls and along the top and bottom refractory layers, coupled with grain boundary diffusion within the interconnects, constitutes the Cu incubation mechanism.
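The deconvolution from array-level failures back to the single-interconnect level rests on the weakest-link relation F_1(t) = 1 - (1 - F_N(t))^(1/N); the sketch below demonstrates this transform on simulated lognormal failure times (the lognormal parameters are illustrative, the 480-link array size is taken from the abstract, and the actual study used Wheatstone-bridge wiring and a more detailed procedure).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# simulate "true" single-interconnect lognormal failure times (hypothetical parameters)
mu, sigma, n_links = np.log(100.0), 0.4, 480
t_single = rng.lognormal(mu, sigma, size=(2000, n_links))

# an array fails at its earliest (weakest-link) interconnect failure
t_array = t_single.min(axis=1)

# deconvolve the empirical array-level CDF back to the single-link level:
# F_1(t) = 1 - (1 - F_N(t))**(1/N)
t_sorted = np.sort(t_array)
F_array = (np.arange(1, t_sorted.size + 1) - 0.5) / t_sorted.size
F_single = 1.0 - (1.0 - F_array) ** (1.0 / n_links)

# check lognormality of the deconvolved distribution on a normal probability scale
z = stats.norm.ppf(F_single)
slope, intercept, r, *_ = stats.linregress(z, np.log(t_sorted))
print(f"deconvolved sigma ~ {slope:.2f} (true {sigma}), median ~ {np.exp(intercept):.0f}")
```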
Lepesqueur, Laura Soares; de Figueiredo, Viviane Maria Gonçalves; Ferreira, Leandro Lameirão; Sobrinho, Argemiro Soares da Silva; Massi, Marcos; Bottino, Marco Antônio; Nogueira Junior, Lafayette
2015-01-01
To determine the effect of maintaining torque after mechanical cycling of abutment screws that are coated with diamondlike carbon and coated with diamondlike carbon doped with diamond nanoparticles, with external and internal hex connections. Sixty implants were divided into six groups according to the type of connection (external or internal hex) and the type of abutment screw (uncoated, coated with diamondlike carbon, and coated with diamondlike carbon doped with diamond nanoparticles). The implants were inserted into polyurethane resin and crowns of nickel chrome were cemented on the implants. The crowns had a hole for access to the screw. The initial torque and the torque after mechanical cycling were measured. The torque values maintained (in percentages) were evaluated. Statistical analysis was performed using one-way analysis of variance and the Tukey test, with a significance level of 5%. The largest torque value was maintained in uncoated screws with external hex connections, a finding that was statistically significant (P = .0001). No statistically significant differences were seen between the groups with and without coating in maintaining torque for screws with internal hex connections (P = .5476). After mechanical cycling, the diamondlike carbon with and without diamond doping on the abutment screws showed no improvement in maintaining torque in external and internal hex connections.
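A one-way ANOVA followed by a Tukey HSD test, as used in this study, can be run with scipy and statsmodels as in the sketch below; the maintained-torque percentages are simulated placeholders, not the study's measurements.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)

# hypothetical percentages of maintained torque after cycling for three screw coatings
uncoated = rng.normal(92, 4, 10)
dlc      = rng.normal(86, 4, 10)
dlc_nano = rng.normal(87, 4, 10)

# one-way ANOVA across the three groups
f, p = stats.f_oneway(uncoated, dlc, dlc_nano)
print(f"ANOVA: F = {f:.2f}, p = {p:.4f}")

# Tukey HSD post-hoc pairwise comparisons at alpha = 0.05
values = np.concatenate([uncoated, dlc, dlc_nano])
labels = ["uncoated"] * 10 + ["DLC"] * 10 + ["DLC+nano"] * 10
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```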
Markov Random Fields, Stochastic Quantization and Image Analysis
1990-01-01
Markov random fields based on the lattice Z² have been extensively used in image analysis in a Bayesian framework as a priori models for the... of Image Analysis can be given some fundamental justification, then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.
ERIC Educational Resources Information Center
Yeager, Joseph; Sommer, Linda
2007-01-01
Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…
Fundamentals of poly(lactic acid) microstructure, crystallization behavior, and properties
NASA Astrophysics Data System (ADS)
Kang, Shuhui
Poly(lactic acid) is an environmentally-benign biodegradable and sustainable thermoplastic material, which has found broad applications as food packaging films and as non-woven fibers. The crystallization and deformation mechanisms of the polymer are largely determined by the distribution of conformation and configuration. Knowledge of these mechanisms is needed to understand the mechanical and thermal properties on which processing conditions mainly depend. In conjunction with laser light scattering, Raman spectroscopy and normal coordinate analysis are used in this thesis to elucidate these properties. Vibrational spectroscopic theory, Flory's rotational isomeric state (RIS) theory, Gaussian chain statistics and statistical mechanics are used to relate experimental data to molecular chain structure. A refined RIS model is proposed, chain rigidity recalculated and chain statistics discussed. A Raman spectroscopic characterization method for crystalline and amorphous phase orientation has been developed. A shrinkage model is also proposed to interpret the dimensional stability for fibers and uni- or biaxially stretched films. A study of stereocomplexation formed by poly(l-lactic acid) and poly(d-lactic acid) is also presented.
Liu, Wei; Ding, Jinhui
2018-04-01
The application of the intention-to-treat (ITT) principle to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
19 CFR 201.21 - Availability of specific records.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., graphs, notes, charts, tabulations, data analysis, statistical or information accumulations, records of meetings and conversations, film impressions, magnetic tapes, and sound or mechanical reproductions; the...
NASA Astrophysics Data System (ADS)
Jiang, Quan; Zhong, Shan; Cui, Jie; Feng, Xia-Ting; Song, Leibo
2016-12-01
We investigated the statistical characteristics and probability distributions of the mechanical parameters of natural rock using triaxial compression tests. Twenty cores of Jinping marble were tested at each of five levels of confining stress (i.e., 5, 10, 20, 30, and 40 MPa). From these full stress-strain data, we summarized the numerical characteristics and determined the probability distribution form of several important mechanical parameters, including deformational parameters, characteristic strength, characteristic strains, and failure angle. The statistical evidence relating to the mechanical parameters of rock provided new information about the marble's probabilistic distribution characteristics. The normal and log-normal distributions were appropriate for describing the random strengths of rock; the coefficients of variation of the peak strengths had no relationship to the confining stress; the only acceptable random distribution for both Young's elastic modulus and Poisson's ratio was the log-normal function; and the cohesive strength had a different probability distribution pattern than the frictional angle. The triaxial tests and statistical analysis also provided experimental evidence for deciding the minimum reliable number of experimental samples and for picking appropriate parameter distributions to use in reliability calculations for rock engineering.
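Deciding between normal and log-normal descriptions of a strength sample, as done here, is easy to prototype with scipy; the sketch below fits both candidates to simulated peak strengths (the values and sample size are placeholders) and compares them by Kolmogorov-Smirnov statistic and log-likelihood. Note that KS p-values computed with fitted parameters are optimistic and would need a Lilliefors-type correction for formal testing.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# hypothetical peak strengths (MPa) of 20 marble cores at one confining stress
strength = rng.normal(160, 12, 20)

# candidate distributions: normal, and lognormal with location fixed at zero
norm_params = stats.norm.fit(strength)
logn_params = stats.lognorm.fit(strength, floc=0)

for name, dist, params in [("normal", stats.norm, norm_params),
                           ("lognormal", stats.lognorm, logn_params)]:
    ks = stats.kstest(strength, dist.cdf, args=params)
    ll = np.sum(dist.logpdf(strength, *params))
    print(f"{name:9s}  KS p = {ks.pvalue:.3f}  log-likelihood = {ll:.1f}")
```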
Generalising the logistic map through the q-product
NASA Astrophysics Data System (ADS)
Pessoa, R. W. S.; Borges, E. P.
2011-03-01
We investigate a generalisation of the logistic map, x_{n+1} = 1 - a x_n ⊗_{q_map} x_n (-1 ≤ x_n ≤ 1, 0 < a ≤ 2), where ⊗_q stands for a generalisation of the ordinary product, known as the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q → 1. The tent map is also a particular case, for q_map → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and the rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.
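For reference, a minimal sketch of the q-product as it is commonly defined for positive arguments (x ⊗_q y = [x^(1-q) + y^(1-q) - 1]^(1/(1-q)), taken as zero when the bracket is non-positive) is given below in Python. It only illustrates the operator and its q → 1 limit; it is not a reproduction of the authors' numerical study of the deformed map.

```python
import numpy as np

def q_product(x, y, q):
    """q-product for positive x, y (Borges 2004 form).

    Reduces to the ordinary product x*y in the limit q -> 1.
    """
    if abs(q - 1.0) < 1e-12:
        return x * y
    base = x**(1.0 - q) + y**(1.0 - q) - 1.0
    # The q-product is taken as zero when the bracket is non-positive.
    return base**(1.0 / (1.0 - q)) if base > 0.0 else 0.0

# Sanity check: q close to 1 recovers the usual product 0.7 * 0.7 = 0.49.
for q in (1.0 + 1e-6, 1.2, 2.0):
    print(f"q = {q:.6f}: q-product = {q_product(0.7, 0.7, q):.4f}, "
          f"ordinary product = {0.7 * 0.7:.4f}")
```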
Spatio-temporal analysis of aftershock sequences in terms of Non Extensive Statistical Physics.
NASA Astrophysics Data System (ADS)
Chochlaki, Kalliopi; Vallianatos, Filippos
2017-04-01
Earth's seismicity is considered an extremely complicated process in which long-range interactions and fracturing exist (Vallianatos et al., 2016). For this reason, in order to analyze it, we use an innovative methodological approach, introduced by Tsallis (Tsallis, 1988; 2009), named Non-Extensive Statistical Physics. This approach introduces a generalization of Boltzmann-Gibbs statistical mechanics and is based on the definition of the Tsallis entropy Sq which, when maximized, leads to the so-called q-exponential function as the probability distribution that maximizes Sq. In the present work, we utilize the concepts of Non-Extensive Statistical Physics in order to analyze the spatiotemporal properties of several aftershock series. Marekova (2014) suggested that the probability densities of the inter-event distances between successive aftershocks follow a beta distribution. Using the same data set, we analyze the inter-event distance distributions of several aftershock sequences in different geographic regions by calculating the non-extensive parameters that determine the behavior of the system and by fitting the q-exponential function, which expresses the degree of non-extensivity of the investigated system. Furthermore, the inter-event time distributions of the aftershocks as well as the frequency-magnitude distributions have been analyzed. The results support the applicability of Non-Extensive Statistical Physics ideas to aftershock sequences, where a strong correlation exists along with memory effects. References: Tsallis, C., Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52, 479-487 (1988), doi:10.1007/BF01016429. Tsallis, C., Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World, Springer (2009), doi:10.1007/978-0-387-85359-8. Marekova, E., Analysis of the spatial distribution between successive earthquakes in aftershock series, Annals of Geophysics 57(5) (2014), doi:10.4401/ag-6556. Vallianatos, F., Papadakis, G., Michas, G., Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A 472, 20160497 (2016).
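A minimal sketch of fitting the q-exponential to an empirical inter-event distance distribution is shown below (Python). The functional form e_q(-x/x0) = [1 - (1-q) x/x0]^(1/(1-q)) is the standard Tsallis q-exponential, the fit is made to the empirical survival function, and the synthetic heavy-tailed distances are placeholders rather than an actual aftershock catalogue.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_exponential(x, q, x0):
    """Tsallis q-exponential survival form, e_q(-x/x0)."""
    arg = 1.0 - (1.0 - q) * np.asarray(x) / x0
    return np.where(arg > 0.0, np.clip(arg, 1e-12, None)**(1.0 / (1.0 - q)), 0.0)

# Synthetic inter-event distances (km) standing in for a real catalogue.
rng = np.random.default_rng(0)
distances = rng.pareto(3.0, 2000) * 5.0

# Empirical complementary CDF (survival function).
x = np.sort(distances)
ccdf = 1.0 - np.arange(1, x.size + 1) / x.size

# Fit the non-extensive parameter q and the characteristic distance x0.
(q_fit, x0_fit), _ = curve_fit(q_exponential, x[:-1], ccdf[:-1],
                               p0=(1.3, 5.0), maxfev=10000)
print(f"q = {q_fit:.3f}, x0 = {x0_fit:.2f} km")
```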
New Statistics for Testing Differential Expression of Pathways from Microarray Data
NASA Astrophysics Data System (ADS)
Siu, Hoicheong; Dong, Hua; Jin, Li; Xiong, Momiao
Exploring biological meaning from microarray data is very important but remains a great challenge. Here, we developed three new statistics: a linear combination test, a quadratic test and a de-correlation test, to identify differentially expressed pathways from gene expression profiles. We apply our statistics to two rheumatoid arthritis datasets. Notably, our results reveal three significant pathways and 275 genes in common between the two datasets. The pathways we found are meaningful for uncovering the disease mechanisms of rheumatoid arthritis, which implies that our statistics are a powerful tool in the functional analysis of gene expression data.
Oral cancer associated with chronic mechanical irritation of the oral mucosa.
Piemonte, E; Lazos, J; Belardinelli, P; Secchi, D; Brunotto, M; Lanfranchi-Tizeira, H
2018-03-01
Most of the studies dealing with chronic mechanical irritation (CMI) and oral cancer (OC) have considered prosthetic and dental variables only separately, and functional CMI factors are not registered. Thus, the aim of this study was to assess OC risk in individuals with dental, prosthetic and functional CMI. We also examined CMI presence in relation to tumor size. A case-control study was carried out from 2009 to 2013. The study group comprised squamous cell carcinoma cases; the control group comprised patients seeking dental treatment at the same institution. In total, 153 patients were studied (study group n=53, control group n=100). CMI reproducibility displayed a correlation coefficient of 1 (p<0.0001). Bivariate analysis showed statistically significant associations for all variables (age, gender, tobacco and alcohol consumption, and CMI). Multivariate analysis exhibited statistical significance for age, alcohol, and CMI, but not for gender or tobacco. The relationship between CMI and tumor size showed no statistically significant difference. CMI could be regarded as a risk factor for oral cancer. In individuals with other OC risk factors, proper treatment of the mechanically injuring factors (dental, prosthetic and functional) could be an important measure to reduce the risk of oral cancer.
NASA Technical Reports Server (NTRS)
Chao, Luen-Yuan; Shetty, Dinesh K.
1992-01-01
Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. Fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
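The fracture-mechanics step in this kind of analysis is essentially the relation sigma_f = K_Ic / (Y * sqrt(pi * a)) applied to the extreme (largest) pore in each stressed volume. The sketch below (Python) illustrates the idea with assumed values for the fracture toughness, geometry factor, and pore-size population; it is a schematic Monte Carlo illustration, not a reconstruction of the silicon nitride data.

```python
import numpy as np

K_IC = 5.0e6      # fracture toughness, Pa*sqrt(m) (assumed for illustration)
Y = 1.12          # crack geometry factor (assumed)
N_PORES = 5000    # pores sampled per specimen volume

rng = np.random.default_rng(1)

def specimen_strength():
    # Log-normal population of pore radii (m); the extreme (largest) pore
    # is taken as the strength-controlling flaw.
    radii = rng.lognormal(mean=np.log(5e-6), sigma=0.4, size=N_PORES)
    a_max = radii.max()
    return K_IC / (Y * np.sqrt(np.pi * a_max))

strengths = np.array([specimen_strength() for _ in range(200)]) / 1e6  # MPa
print(f"median predicted strength: {np.median(strengths):.0f} MPa")

# Linearized Weibull coordinates: ln(-ln(1-F)) vs ln(strength).
s = np.sort(strengths)
F = (np.arange(1, s.size + 1) - 0.5) / s.size
weibull_y = np.log(-np.log(1.0 - F))
m_est = np.polyfit(np.log(s), weibull_y, 1)[0]
print(f"apparent Weibull modulus: {m_est:.1f}")
```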
NASA Astrophysics Data System (ADS)
Jokhio, Gul A.; Syed Mohsin, Sharifah M.; Gul, Yasmeen
2018-04-01
It has been established that Adobe, in addition to being sustainable and economical, provides better indoor air quality than modern synthetic materials without requiring extensive amounts of energy. The material, however, suffers from weak structural behaviour when subjected to adverse loading conditions. A wide range of mechanical properties has been reported in the literature owing to a lack of research and standardization. The present paper presents a statistical analysis of the results obtained from compressive and flexural tests on Adobe samples. Adobe specimens with and without wire mesh reinforcement were tested and the results are reported, and the statistical analysis of these results is discussed. It was found that the compressive strength of Adobe increases by about 43% after adding a single layer of wire mesh reinforcement, and this increase is statistically significant. The flexural response of Adobe also improved with the addition of wire mesh reinforcement; however, the statistical significance of this improvement could not be established.
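A check of whether a strength increase like the 43% reported above is statistically significant would typically be made with a two-sample test. The sketch below (Python, SciPy) shows such a comparison on hypothetical compressive-strength values; the numbers are placeholders, not the measured Adobe data, and Welch's t-test is just one reasonable choice of test.

```python
import numpy as np
from scipy import stats

# Hypothetical compressive strengths (MPa) for illustration only.
plain      = np.array([1.10, 1.25, 0.98, 1.15, 1.05, 1.20])
reinforced = np.array([1.62, 1.55, 1.70, 1.48, 1.66, 1.58])

gain = (reinforced.mean() - plain.mean()) / plain.mean() * 100
t, p = stats.ttest_ind(plain, reinforced, equal_var=False)  # Welch's t-test

print(f"mean gain: {gain:.0f}%  t = {t:.2f}  p = {p:.4f}")
# A small p-value (e.g. < 0.05) would support calling the gain significant.
```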
Investigation of Pre-Earthquake Ionospheric Disturbances by 3D Tomographic Analysis
NASA Astrophysics Data System (ADS)
Yagmur, M.
2016-12-01
Ionospheric variations before earthquakes are a widely discussed phenomenon in ionospheric studies. Clarifying the source and mechanism of these phenomena is highly important for earthquake forecasting. To better understand the mechanical and physical processes behind pre-seismic ionospheric anomalies, which might even be related to Lithosphere-Atmosphere-Ionosphere-Magnetosphere coupling, both statistical and 3D modeling analyses are needed. For this purpose, we first investigated the relation between ionospheric TEC anomalies and potential source mechanisms such as space weather activity and lithospheric phenomena like positive surface electric charges. To distinguish their effects on ionospheric TEC, we focused on pre-seismically active days. We then analyzed statistical data for 54 earthquakes of M ≥ 6 between 2000 and 2013, as well as the 2011 Tohoku and the 2016 Kumamoto earthquakes in Japan. By comparing TEC anomalies with solar activity via the Dst index, we found 28 events that might be related to earthquake activity. Following the statistical analysis, we also investigated the lithospheric effect on TEC change on selected days. Among those days, we chose two case studies, the 2011 Tohoku and the 2016 Kumamoto earthquakes, for which 3D reconstructed images were produced using a 3D tomography technique with neural networks. The results will be shown in our presentation. Keywords: Earthquake, 3D Ionospheric Tomography, Positive and Negative Anomaly, Geomagnetic Storm, Lithosphere
Course of Weaning from Prolonged Mechanical Ventilation after Cardiac Surgery
Herlihy, James P.; Koch, Stephen M.; Jackson, Robert; Nora, Hope
2006-01-01
In order to determine the temporal pattern of weaning from mechanical ventilation for patients undergoing prolonged mechanical ventilation after cardiac surgery, we performed a retrospective review of 21 patients' weaning courses at our long-term acute care hospital. Using multiple regression analysis of an estimate of individual patients' percentage of mechanical ventilator support per day (%MVSD), we determined that 14 of 21 patients (67%) showed a statistically significant quadratic or cubic relationship between time and %MVSD. These patients showed little or no improvement in their ventilator dependence until a point in time when, abruptly, they began to make rapid progress (a “wean turning point”), after which they progressed to discontinuation of mechanical ventilation in a relatively short period of time. The other 7 patients appeared to have a similar weaning pattern, although the data were not statistically significant. Most patients in the study group weaned from the ventilator through a specific temporal pattern that is newly described herein. Data analysis suggested that the mechanism for the development of a wean turning point was improvement of pulmonary mechanics rather than improvement in gas exchange or respiratory load. Although these observations need to be confirmed by a prospective trial, they may have implications for weaning cardiac surgery patients from prolonged mechanical ventilation, and possibly for weaning a broader group of patients who require prolonged mechanical ventilation. PMID:16878611
AIDS Education for Tanzanian Youth: A Mediation Analysis
ERIC Educational Resources Information Center
Stigler, Melissa H.; Kugler, K. C.; Komro, K. A.; Leshabari, M. T.; Klepp, K. I.
2006-01-01
Mediation analysis is a statistical technique that can be used to identify mechanisms by which intervention programs achieve their effects. This paper presents the results of a mediation analysis of Ngao, an acquired immunodeficiency syndrome (AIDS) education program that was implemented with school children in Grades 6 and 7 in Tanzania in the…
Impact resistance of fiber composites - Energy-absorbing mechanisms and environmental effects
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.
1985-01-01
Energy absorbing mechanisms were identified by several approaches. The energy absorbing mechanisms considered are those in unidirectional composite beams subjected to impact. The approaches used include: mechanic models, statistical models, transient finite element analysis, and simple beam theory. Predicted results are correlated with experimental data from Charpy impact tests. The environmental effects on impact resistance are evaluated. Working definitions for energy absorbing and energy releasing mechanisms are proposed and a dynamic fracture progression is outlined. Possible generalizations to angle-plied laminates are described.
Impact resistance of fiber composites: Energy absorbing mechanisms and environmental effects
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.
1983-01-01
Energy absorbing mechanisms were identified by several approaches. The energy absorbing mechanisms considered are those in unidirectional composite beams subjected to impact. The approaches used include: mechanic models, statistical models, transient finite element analysis, and simple beam theory. Predicted results are correlated with experimental data from Charpy impact tests. The environmental effects on impact resistance are evaluated. Working definitions for energy absorbing and energy releasing mechanisms are proposed and a dynamic fracture progression is outlined. Possible generalizations to angle-plied laminates are described.
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969) are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young in Individual strategy and social structure: an evolutionary theory of institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012); Barmpalias et al. in: 55th annual IEEE symposium on foundations of computer science, Philadelphia, 2014, J Stat Phys 158:806-852, 2015), has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012).
Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu
2014-05-01
The compatibility of traditional Chinese medicine (TCM) formulae, which contain enormous amounts of information, is a complex component system. Applications of mathematical statistics methods to compatibility research on traditional Chinese medicine formulae have great significance for promoting the modernization of traditional Chinese medicines and for improving clinical efficacies and the optimization of formulae. As a tool for quantitative analysis, data inference and exploring the inherent rules of substances, mathematical statistics methods can be used to reveal the working mechanisms of the compatibility of traditional Chinese medicine formulae both qualitatively and quantitatively. By reviewing studies based on the application of mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacies and changes of chemical components, and the rules of incompatibility and contraindication of formulae, and will provide references for further studying and revealing the working mechanisms and connotations of traditional Chinese medicines.
NASA Astrophysics Data System (ADS)
Ulyanov, Sergey S.; Ulianova, Onega V.; Zaytsev, Sergey S.; Saltykov, Yury V.; Feodorova, Valentina A.
2018-04-01
The transformation mechanism for a nucleotide sequence of the Chlamydia trachomatis gene into a speckle pattern has been considered. The first and second-order statistics of gene-based speckles have been analyzed. It has been demonstrated that gene-based speckles do not obey Gaussian statistics and belong to the class of speckles with a small number of scatterers. It has been shown that gene polymorphism can be easily detected through analysis of the statistical characteristics of gene-based speckles.
Statistical testing of association between menstruation and migraine.
Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G
2015-02-01
To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are wanted, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To our best knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operator characteristic curve analysis. Quick reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
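The core computation described here is Fisher's exact test with a mid-p correction applied to a 2x2 table of attack days versus perimenstrual days. A minimal sketch of that calculation is given below (Python, SciPy); the table values are invented for illustration, and the mid-p correction is implemented in the usual way, by subtracting half the probability of the observed table from the exact p-value.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table from a headache diary:
# rows: migraine attack day (yes/no), columns: perimenstrual day (yes/no).
table = np.array([[8, 12],
                  [10, 60]])

# Standard two-sided Fisher's exact test.
_, p_exact = stats.fisher_exact(table, alternative="two-sided")

# Probability of the observed table under the hypergeometric null,
# used for the mid-p correction.
a = table[0, 0]
row1, col1, n = table[0].sum(), table[:, 0].sum(), table.sum()
p_obs = stats.hypergeom.pmf(a, n, col1, row1)

p_mid = p_exact - 0.5 * p_obs
print(f"Fisher exact p = {p_exact:.4f}, mid-p = {p_mid:.4f}")
```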
Mechanics, Waves and Thermodynamics
NASA Astrophysics Data System (ADS)
Ranjan Jain, Sudhir
2016-05-01
Figures; Preface; Acknowledgement; 1. Energy, mass, momentum; 2. Kinematics, Newton's laws of motion; 3. Circular motion; 4. The principle of least action; 5. Work and energy; 6. Mechanics of a system of particles; 7. Friction; 8. Impulse and collisions; 9. Central forces; 10. Dimensional analysis; 11. Oscillations; 12. Waves; 13. Sound of music; 14. Fluid mechanics; 15. Water waves; 16. The kinetic theory of gases; 17. Concepts and laws of thermodynamics; 18. Some applications of thermodynamics; 19. Basic ideas of statistical mechanics; Bibliography; Index.
NASA Technical Reports Server (NTRS)
Crowe, D. R.; Henricks, W.
1983-01-01
The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.
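Under the stationary-Gaussian assumption for the acoustic component, combining it with a deterministic mechanical transient amounts to a time-varying mean with a fixed random spread. The sketch below (Python) illustrates that probability calculation; the PSD, the transient, and the design level are invented placeholders standing in for the actual STS cargo-bay inputs, and the first-passage refinement mentioned above is not included.

```python
import numpy as np
from scipy.stats import norm

# Assumed one-sided PSD of the acoustically induced load (units^2/Hz)
# on a coarse frequency grid -- placeholder values, not STS data.
freq = np.linspace(20.0, 2000.0, 200)
psd = 1.0e-2 * np.exp(-freq / 500.0)
df = freq[1] - freq[0]
sigma_a = np.sqrt(np.sum(psd) * df)   # RMS of the stationary acoustic load

# Deterministic mechanically induced transient (same load units).
t = np.linspace(0.0, 2.0, 401)
mech = 3.0 * sigma_a * np.exp(-t / 0.5) * np.sin(2.0 * np.pi * 5.0 * t)

# Probability that the combined load is at or below a design level L
# at each instant, treating the acoustic part as zero-mean Gaussian.
L = 4.0 * sigma_a
p_below = norm.cdf((L - mech) / sigma_a)
print(f"worst-case instantaneous P(load <= L): {p_below.min():.4f}")
```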
Advanced statistical energy analysis
NASA Astrophysics Data System (ADS)
Heron, K. H.
1994-09-01
A high-frequency theory, advanced statistical energy analysis (ASEA), is developed which takes account of the mechanism of tunnelling; it uses a ray-theory approach to track the power flowing around a plate or beam network and then uses statistical energy analysis (SEA) to take care of any residual power. ASEA divides the energy of each sub-system into energy that is freely available for transfer to other sub-systems and energy that is fixed within the sub-system, and it accounts for power flow between sub-systems that are physically separate. ASEA can be interpreted as a series of mathematical models, the first of which is identical to standard SEA, while subsequent higher-order models converge on an accurate prediction. Using a structural assembly of six rods as an example, ASEA is shown to converge onto the exact results, while SEA is shown to overpredict by up to 60 dB.
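For contrast with ASEA, the standard SEA step mentioned above is just a linear power balance among sub-system energies. A minimal two-subsystem sketch is shown below (Python); the loss factors, coupling loss factors, and input power are made-up illustrative values, not parameters of the six-rod assembly.

```python
import numpy as np

# Two coupled sub-systems (e.g. two plates), analysed at one band
# centre frequency. All loss factors are illustrative values.
omega = 2.0 * np.pi * 1000.0   # band centre frequency, rad/s
eta = np.array([0.01, 0.02])   # internal (damping) loss factors
eta12, eta21 = 0.004, 0.003    # coupling loss factors (1->2, 2->1)
P_in = np.array([1.0, 0.0])    # external input power (W): subsystem 1 driven

# Power balance: P_i = omega * [ (eta_i + sum_j eta_ij) E_i - sum_j eta_ji E_j ]
A = omega * np.array([[eta[0] + eta12, -eta21],
                      [-eta12,          eta[1] + eta21]])
E = np.linalg.solve(A, P_in)   # band-averaged sub-system energies (J)

print(f"E1 = {E[0]:.3e} J, E2 = {E[1]:.3e} J, E2/E1 = {E[1]/E[0]:.3f}")
```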
Using Bayes' theorem for free energy calculations
NASA Astrophysics Data System (ADS)
Rogers, David M.
Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for the setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner-shell and hard-sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
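One of the free-energy calculations alluded to above, estimating a free energy difference from samples of interaction-energy differences, reduces to an exponential (Zwanzig-type) average, dF = -kT ln < exp(-dU/kT) >. The short sketch below (Python) shows that estimate; the Gaussian-sampled energies and temperature are placeholders, and the bootstrap spread is only a crude stand-in for the fuller Bayesian uncertainty treatment developed in the dissertation.

```python
import numpy as np

kT = 0.593          # kcal/mol at roughly 298 K
rng = np.random.default_rng(7)

# Placeholder samples of the interaction-energy difference dU (kcal/mol),
# as would be collected from simulation snapshots.
dU = rng.normal(loc=2.0, scale=1.5, size=5000)

def free_energy(samples):
    # Exponential average: dF = -kT * ln < exp(-dU / kT) >
    return -kT * np.log(np.mean(np.exp(-samples / kT)))

dF = free_energy(dU)

# Crude bootstrap spread as a stand-in for a posterior uncertainty.
boot = np.array([free_energy(rng.choice(dU, dU.size)) for _ in range(200)])
print(f"dF = {dF:.2f} kcal/mol (bootstrap sd {boot.std():.2f})")
```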
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
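The maximum-entropy argument used here says, for instance, that if only the mean velocity is constrained (on non-negative support), the least-biased choice of distribution is the exponential. The sketch below (Python, SciPy) illustrates that claim numerically by comparing the differential entropy of an exponential against two alternative non-negative distributions constrained to the same mean; the mean value is an assumed number, not a measurement from the imaging data.

```python
import numpy as np
from scipy import stats

MEAN_V = 0.12  # mean particle velocity (m/s), an assumed value

# Candidate non-negative distributions, all constrained to the same mean.
expon = stats.expon(scale=MEAN_V)                            # max-entropy candidate
gamma2 = stats.gamma(a=2.0, scale=MEAN_V / 2.0)              # same mean, shape 2
halfnorm = stats.halfnorm(scale=MEAN_V / np.sqrt(2 / np.pi)) # same mean

for name, dist in [("exponential", expon),
                   ("gamma(k=2)", gamma2),
                   ("half-normal", halfnorm)]:
    print(f"{name:12s} mean = {dist.mean():.3f}  "
          f"differential entropy = {dist.entropy():.3f} nats")
# The exponential shows the largest entropy, consistent with the
# maximum-entropy argument for velocities constrained only by their mean.
```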
Guidelines for the Investigation of Mediating Variables in Business Research.
MacKinnon, David P; Coxe, Stefany; Baraldi, Amanda N
2012-03-01
Business theories often specify the mediating mechanisms by which a predictor variable affects an outcome variable. In the last 30 years, investigations of mediating processes have become more widespread with corresponding developments in statistical methods to conduct these tests. The purpose of this article is to provide guidelines for mediation studies by focusing on decisions made prior to the research study that affect the clarity of conclusions from a mediation study, the statistical models for mediation analysis, and methods to improve interpretation of mediation results after the research study. Throughout this article, the importance of a program of experimental and observational research for investigating mediating mechanisms is emphasized.
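The single-mediator model at the heart of these guidelines is usually estimated with two regressions, the a-path (X to M) and the b-path (M to Y controlling for X), with the indirect effect taken as the product a*b. A bare-bones sketch on simulated data is shown below (Python); the Sobel standard error used here is one common choice among several discussed in the mediation literature, and the data-generating coefficients are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Simulated data with a true indirect effect: X -> M -> Y.
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)            # a-path ~ 0.5
Y = 0.4 * M + 0.2 * X + rng.normal(size=n)  # b-path ~ 0.4, direct ~ 0.2

def ols(y, design):
    """Least-squares coefficients and their standard errors."""
    Xd = np.column_stack([np.ones(len(y))] + design)
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    sigma2 = resid @ resid / (len(y) - Xd.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xd.T @ Xd)))
    return beta, se

(_, a), (_, se_a) = ols(M, [X])
(_, b, _), (_, se_b, _) = ols(Y, [M, X])

indirect = a * b
sobel_se = np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
print(f"a = {a:.3f}, b = {b:.3f}, indirect effect = {indirect:.3f} "
      f"(Sobel z = {indirect / sobel_se:.2f})")
```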
NASA Astrophysics Data System (ADS)
Rani Rana, Sandhya; Pattnaik, A. B.; Patnaik, S. C.
2018-03-01
In the present work, the wear behavior and mechanical properties of as-cast Al6082 and Al6082-T6 were compared and analyzed using statistical analysis. The as-cast Al6082 alloy was solutionized at 550°C, quenched, and artificially aged at 170°C for 8 h. Metallographic examination and XRD analysis revealed the presence of the intermetallic compound Al6Mn. The hardness of the heat-treated Al6082 was found to be higher than that of the as-cast sample. Wear tests were carried out using a pin-on-disc wear testing machine according to a Taguchi L9 orthogonal array. Experiments were conducted under normal loads of 10-30 N, sliding speeds of 1-3 m/s, and sliding distances of 400, 800, and 1200 m, respectively. Sliding speed was found to be the dominant factor for wear in both the as-cast and the aged Al6082 alloy. Wear rate increases with sliding distance up to 800 m and thereafter decreases.
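In a Taguchi L9 analysis of wear, each run's response is typically converted to a smaller-the-better signal-to-noise ratio, S/N = -10 log10(mean(y^2)), and the factor main effects are compared. The sketch below (Python) shows that computation on an invented L9 wear-rate table; the factor levels and responses are placeholders, not the measured Al6082 data.

```python
import numpy as np

# Taguchi L9 orthogonal array: columns are factor levels (1..3) for
# load, sliding speed and sliding distance; one run per row.
L9 = np.array([[1, 1, 1], [1, 2, 2], [1, 3, 3],
               [2, 1, 2], [2, 2, 3], [2, 3, 1],
               [3, 1, 3], [3, 2, 1], [3, 3, 2]])

# Hypothetical wear rates (mm^3/m) for the nine runs -- placeholders.
wear = np.array([2.1, 3.4, 4.8, 2.5, 4.1, 2.9, 3.0, 2.2, 3.9])

# Smaller-the-better signal-to-noise ratio for each run.
sn = -10.0 * np.log10(wear**2)

factors = ["load", "speed", "distance"]
for j, name in enumerate(factors):
    means = [sn[L9[:, j] == level].mean() for level in (1, 2, 3)]
    delta = max(means) - min(means)          # factor effect range
    print(f"{name:9s} mean S/N by level: "
          + ", ".join(f"{m:6.2f}" for m in means)
          + f"   delta = {delta:.2f}")
# The factor with the largest delta is taken as dominant for wear.
```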
A statistical physics viewpoint on the dynamics of the bouncing ball
NASA Astrophysics Data System (ADS)
Chastaing, Jean-Yonnel; Géminard, Jean-Christophe; Bertin, Eric
2016-06-01
We compute, from a statistical physics perspective, the dynamics of a bouncing ball maintained in a chaotic regime through collisions with a plate undergoing aperiodic vibration. We analyze in detail the energy exchanges between the bead and the vibrating plate, and show that the coupling between the bead and the plate can be modeled in terms of both a dissipative process and an injection mechanism by an energy reservoir. An analysis of the injection statistics in terms of a fluctuation relation is also provided.
Physics of Electronic Materials
NASA Astrophysics Data System (ADS)
Rammer, Jørgen
2017-03-01
1. Quantum mechanics; 2. Quantum tunneling; 3. Standard metal model; 4. Standard conductor model; 5. Electric circuit theory; 6. Quantum wells; 7. Particle in a periodic potential; 8. Bloch currents; 9. Crystalline solids; 10. Semiconductor doping; 11. Transistors; 12. Heterostructures; 13. Mesoscopic physics; 14. Arithmetic, logic and machines; Appendix A. Principles of quantum mechanics; Appendix B. Dirac's delta function; Appendix C. Fourier analysis; Appendix D. Classical mechanics; Appendix E. Wave function properties; Appendix F. Transfer matrix properties; Appendix G. Momentum; Appendix H. Confined particles; Appendix I. Spin and quantum statistics; Appendix J. Statistical mechanics; Appendix K. The Fermi-Dirac distribution; Appendix L. Thermal current fluctuations; Appendix M. Gaussian wave packets; Appendix N. Wave packet dynamics; Appendix O. Screening by symmetry method; Appendix P. Commutation and common eigenfunctions; Appendix Q. Interband coupling; Appendix R. Common crystal structures; Appendix S. Effective mass approximation; Appendix T. Integral doubling formula; Bibliography; Index.
Measuring Circulation Desk Activities Using a Random Alarm Mechanism.
ERIC Educational Resources Information Center
Mosborg, Stella Frank
1980-01-01
Reports a job analysis methodology to gather meaningful data related to circulation desk activity. The technique is designed to give librarians statistical data on actual time expenditures for complex and varying activities. (Author/RAA)
Genser, Bernd; Fischer, Joachim E; Figueiredo, Camila A; Alcântara-Neves, Neuza; Barreto, Mauricio L; Cooper, Philip J; Amorim, Leila D; Saemann, Marcus D; Weichhart, Thomas; Rodrigues, Laura C
2016-05-20
Immunologists often measure several correlated immunological markers, such as concentrations of different cytokines produced by different immune cells and/or measured under different conditions, to draw insights from complex immunological mechanisms. Although there have been recent methodological efforts to improve the statistical analysis of immunological data, a framework is still needed for the simultaneous analysis of multiple, often correlated, immune markers. This framework would allow the immunologists' hypotheses about the underlying biological mechanisms to be integrated. We present an analytical approach for statistical analysis of correlated immune markers, such as those commonly collected in modern immuno-epidemiological studies. We demonstrate i) how to deal with interdependencies among multiple measurements of the same immune marker, ii) how to analyse association patterns among different markers, iii) how to aggregate different measures and/or markers to immunological summary scores, iv) how to model the inter-relationships among these scores, and v) how to use these scores in epidemiological association analyses. We illustrate the application of our approach to multiple cytokine measurements from 818 children enrolled in a large immuno-epidemiological study (SCAALA Salvador), which aimed to quantify the major immunological mechanisms underlying atopic diseases or asthma. We demonstrate how to aggregate systematically the information captured in multiple cytokine measurements to immunological summary scores aimed at reflecting the presumed underlying immunological mechanisms (Th1/Th2 balance and immune regulatory network). We show how these aggregated immune scores can be used as predictors in regression models with outcomes of immunological studies (e.g. specific IgE) and compare the results to those obtained by a traditional multivariate regression approach. The proposed analytical approach may be especially useful to quantify complex immune responses in immuno-epidemiological studies, where investigators examine the relationship among epidemiological patterns, immune response, and disease outcomes.
A Primer on Bayesian Analysis for Experimental Psychopathologists
Krypotos, Angelos-Miltiadis; Blanken, Tessa F.; Arnaudova, Inna; Matzke, Dora; Beckers, Tom
2016-01-01
The principal goals of experimental psychopathology (EPP) research are to offer insights into the pathogenic mechanisms of mental disorders and to provide a stable ground for the development of clinical interventions. The main message of the present article is that those goals are better served by the adoption of Bayesian statistics than by the continued use of null-hypothesis significance testing (NHST). In the first part of the article we list the main disadvantages of NHST and explain why those disadvantages limit the conclusions that can be drawn from EPP research. Next, we highlight the advantages of Bayesian statistics. To illustrate, we then pit NHST and Bayesian analysis against each other using an experimental data set from our lab. Finally, we discuss some challenges when adopting Bayesian statistics. We hope that the present article will encourage experimental psychopathologists to embrace Bayesian statistics, which could strengthen the conclusions drawn from EPP research. PMID:28748068
NASA Astrophysics Data System (ADS)
Huang, Haiping
2017-05-01
Revealing hidden features in unlabeled data is called unsupervised feature learning, which plays an important role in pretraining a deep neural network. Here we provide a statistical mechanics analysis of unsupervised learning in a restricted Boltzmann machine with binary synapses. A message passing equation to infer the hidden feature is derived, and furthermore, variants of this equation are analyzed. A statistical analysis by replica theory describes the thermodynamic properties of the model. Our analysis confirms an entropy crisis preceding the non-convergence of the message passing equation, suggesting a discontinuous phase transition as a key characteristic of the restricted Boltzmann machine. A continuous phase transition is also confirmed, depending on the embedded feature strength in the data. The mean-field result under the replica symmetric assumption agrees with that obtained by running message passing algorithms on single instances of finite sizes. Interestingly, in an approximate Hopfield model, the entropy crisis is absent, and a continuous phase transition is observed instead. We also develop an iterative equation to infer the hyper-parameter (temperature) hidden in the data, which in physics corresponds to iteratively imposing the Nishimori condition. Our study provides insights towards understanding the thermodynamic properties of restricted Boltzmann machine learning, and moreover an important theoretical basis for building simplified deep networks.
Hersche, Sepp; Sifakakis, Iosif; Zinelis, Spiros; Eliades, Theodore
2017-02-01
The purpose of the present study was to investigate the elemental composition, the microstructure, and selected mechanical properties of high-gold orthodontic brackets after intraoral aging. Thirty Incognito™ (3M Unitek, Bad Essen, Germany) lingual brackets were studied: 15 brackets as received (control group) and 15 brackets retrieved from different patients after orthodontic treatment. The surface of the wing area was examined by scanning electron microscopy (SEM). Backscattered electron imaging (BEI) was performed, and the elemental composition was determined by X-ray EDS analysis (EDX). After appropriate metallographic preparation, the mechanical properties tested were Martens hardness (HM), indentation modulus (EIT), elastic index (ηIT), and Vickers hardness (HV). These properties were determined employing instrumented indentation testing (IIT) with a Vickers indenter. The results were statistically analyzed by unpaired t-test (α=0.05). No statistically significant differences in surface morphology or elemental content were evidenced between the control and the experimental groups. Moreover, the mean values of HM, EIT, ηIT, and HV did not differ significantly between the groups (p>0.05). Under the limitations of this study, it may be concluded that the surface elemental content and microstructure, as well as the evaluated mechanical properties, of the Incognito™ lingual brackets remain unaffected by intraoral aging.
Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B
2017-08-01
After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time to event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.
NASA Astrophysics Data System (ADS)
Trifoniuk, L. I.; Ushenko, Yu. A.; Sidor, M. I.; Minzer, O. P.; Gritsyuk, M. V.; Novakovskaya, O. Y.
2014-08-01
This work presents the results of an investigation of the diagnostic efficiency of a new azimuthally stable Mueller-matrix method for analyzing the coordinate distributions of laser autofluorescence of histological sections of biological tissues. A new model of the generalized optical anisotropy of the protein networks of biological tissues is proposed in order to describe the processes of laser autofluorescence. The influence of the complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and the different mechanisms of optical anisotropy are determined. A statistical analysis of the coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the 1st to the 4th order) for differentiating histological sections of uterine wall tumors, group 1 (dysplasia) and group 2 (adenocarcinoma), are estimated.
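The quantitative criteria referred to above are the first four statistical moments of a coordinate distribution (for example, of a Mueller-matrix rotation invariant sampled over an image). A minimal sketch of that computation is given below (Python, SciPy), applied to a random array standing in for a real polarimetric map.

```python
import numpy as np
from scipy import stats

# Placeholder 2D map of a Mueller-matrix rotation invariant
# (in practice this would come from a polarimetric measurement).
rng = np.random.default_rng(3)
invariant_map = rng.gamma(shape=2.0, scale=0.1, size=(256, 256))

values = invariant_map.ravel()
m1 = values.mean()                 # 1st-order moment (mean)
m2 = values.std()                  # 2nd-order moment (dispersion)
m3 = stats.skew(values)            # 3rd-order moment (skewness)
m4 = stats.kurtosis(values)        # 4th-order moment (excess kurtosis)

print(f"M1 = {m1:.3f}, M2 = {m2:.3f}, M3 = {m3:.3f}, M4 = {m4:.3f}")
# Group differentiation would compare these moments between the two
# sets of samples (e.g. with a significance test on each moment).
```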
NASA Astrophysics Data System (ADS)
Ushenko, Yu. O.; Pashkovskaya, N. V.; Marchuk, Y. F.; Dubolazov, O. V.; Savich, V. O.
2015-08-01
This work presents the results of an investigation of the diagnostic efficiency of a new azimuthally stable Mueller-matrix method for analyzing the coordinate distributions of laser autofluorescence of layers of biological liquids. A new model of the generalized optical anisotropy of the protein networks of biological tissues is proposed in order to describe the processes of laser autofluorescence. The influence of the complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and the different mechanisms of optical anisotropy are determined. A statistical analysis of the coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the 1st to the 4th order) for differentiating polycrystalline layers of human urine are estimated, for the purpose of diagnosing and differentiating cholelithiasis with underlying chronic cholecystitis (group 1) and diabetes mellitus of degree II (group 2).
NASA Astrophysics Data System (ADS)
Ushenko, Yuriy A.; Koval, Galina D.; Ushenko, Alexander G.; Dubolazov, Olexander V.; Ushenko, Vladimir A.; Novakovskaia, Olga Yu.
2016-07-01
This research presents the results of an investigation of the diagnostic efficiency of an azimuthally stable Mueller-matrix method for analyzing the laser autofluorescence of polycrystalline films of dried uterine cavity peritoneal fluid. A model of the generalized optical anisotropy of films of dried peritoneal fluid is proposed in order to describe the processes of laser autofluorescence. The influence of the complex mechanisms of both phase (linear and circular birefringence) and amplitude (linear and circular dichroism) anisotropies is taken into consideration. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and the different mechanisms of optical anisotropy are determined. A statistical analysis of the coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the first to the fourth order) for differentiating polycrystalline films of dried peritoneal fluid, group 1 (healthy donors) and group 2 (uterine endometriosis patients), are determined.
NASA Astrophysics Data System (ADS)
Ushenko, A. G.; Dubolazov, O. V.; Ushenko, Vladimir A.; Ushenko, Yu. A.; Sakhnovskiy, M. Yu.; Prydiy, O. G.; Lakusta, I. I.; Novakovskaya, O. Yu.; Melenko, S. R.
2016-12-01
This research presents the results of an investigation of the diagnostic efficiency of a new azimuthally stable Mueller-matrix method for analyzing the coordinate distributions of laser autofluorescence of dried polycrystalline films of uterine cavity peritoneal fluid. A new model of the generalized optical anisotropy of the protein networks of biological tissues is proposed in order to describe the processes of laser autofluorescence. The influence of the complex mechanisms of both phase anisotropy (linear birefringence and optical activity) and linear (circular) dichroism is taken into account. The interconnections between the azimuthally stable Mueller-matrix elements characterizing laser autofluorescence and the different mechanisms of optical anisotropy are determined. A statistical analysis of the coordinate distributions of such Mueller-matrix rotation invariants is proposed. Thereupon the quantitative criteria (statistical moments of the 1st to the 4th order) for differentiating dried polycrystalline films of peritoneal fluid, group 1 (healthy donors) and group 2 (uterine endometriosis patients), are estimated.
Addressing the statistical mechanics of planet orbits in the solar system
NASA Astrophysics Data System (ADS)
Mogavero, Federico
2017-10-01
The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with Mars PDFs and that of Mercury inclination. The eccentricity of Mercury demands in contrast a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.
Shuai, Wang; Yongrui, Bao; Shanshan, Guan; Bo, Liu; Lu, Chen; Lei, Wang; Xiaorong, Ran
2014-01-01
Metabolomics, the systematic analysis of potential metabolites in a biological specimen, has been increasingly applied to discovering biomarkers, identifying perturbed pathways, measuring therapeutic targets, and discovering new drugs. By analyzing and verifying significant differences in metabolic profiles and changes in metabolite biomarkers, metabolomics enables us to better understand substance metabolic pathways, which can clarify the mechanisms of Traditional Chinese Medicines (TCM). Corydalis yanhusuo alkaloid (CA) is a major component of the Qizhiweitong (QZWT) prescription, which has been used for treating gastric ulcer for centuries, yet its mechanism remains incompletely understood. Metabolite profiling was performed by high-performance liquid chromatography combined with time-of-flight mass spectrometry (HPLC/ESI-TOF-MS) in conjunction with multivariate data analysis and pathway analysis. The statistical software Mass Profiler Professional (MPP) and statistical methods including ANOVA and principal component analysis (PCA) were used to discover novel potential biomarkers and to clarify the mechanism of CA in treating acid-injected rats with gastric ulcer. The changes in metabolic profiling were restored to their baseline values after CA treatment according to the PCA score plots. Ten different potential biomarkers and seven key metabolic pathways contributing to the treatment of gastric ulcer were discovered and identified. Among the pathways, sphingophospholipid metabolism and the fatty acid metabolism related network were acutely perturbed. Quantitative real-time polymerase chain reaction (RT-PCR) analyses were performed to evaluate the expression of genes related to the two pathways in order to verify the above results. The results show that the changed biomarkers and pathways may provide evidence and insight into drug action mechanisms and enable us to increase research productivity toward metabolomics-based drug discovery. PMID:24454691
Finite Element Analysis of Reverberation Chambers
NASA Technical Reports Server (NTRS)
Bunting, Charles F.; Nguyen, Duc T.
2000-01-01
The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved, there were four primary focus areas: (1) the eigenvalue problem for the source-free case; (2) the development of an efficient complex eigensolver; (3) the application of a source for the TE and TM fields for statistical characterization; and (4) the examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low-frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed owing to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low-frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low-frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.
Improta, Roberto; Vitagliano, Luigi; Esposito, Luciana
2015-11-01
The elucidation of the mutual influence between peptide bond geometry and local conformation has important implications for protein structure refinement, validation, and prediction. To gain insights into the structural determinants and the energetic contributions associated with protein/peptide backbone plasticity, we here report an extensive analysis of the variability of the peptide bond angles by combining statistical analyses of protein structures and quantum mechanics calculations on small model peptide systems. Our analyses demonstrate that all the backbone bond angles strongly depend on the peptide conformation and unveil the existence of regular trends as function of ψ and/or φ. The excellent agreement of the quantum mechanics calculations with the statistical surveys of protein structures validates the computational scheme here employed and demonstrates that the valence geometry of protein/peptide backbone is primarily dictated by local interactions. Notably, for the first time we show that the position of the H(α) hydrogen atom, which is an important parameter in NMR structural studies, is also dependent on the local conformation. Most of the trends observed may be satisfactorily explained by invoking steric repulsive interactions; in some specific cases the valence bond variability is also influenced by hydrogen-bond like interactions. Moreover, we can provide a reliable estimate of the energies involved in the interplay between geometry and conformations. © 2015 Wiley Periodicals, Inc.
Statistics for Time-Series Spatial Data: Applying Survival Analysis to Study Land-Use Change
ERIC Educational Resources Information Center
Wang, Ninghua Nathan
2013-01-01
Traditional spatial analysis and data mining methods fall short of extracting temporal information from data. This inability makes their use difficult to study changes and the associated mechanisms of many geographic phenomena of interest, for example, land-use. On the other hand, the growing availability of land-change data over multiple time…
The need for conducting forensic analysis of decommissioned bridges.
DOT National Transportation Integrated Search
2014-01-01
A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration : mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting : models based upon ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jing; Ackerman, David M.; Lin, Victor S.-Y.
2013-04-02
Statistical mechanical modeling is performed of a catalytic conversion reaction within a functionalized nanoporous material to assess the effect of varying the reaction product-pore interior interaction from attractive to repulsive. A strong enhancement in reactivity is observed not just due to the shift in reaction equilibrium towards completion but also due to enhanced transport within the pore resulting from reduced loading. The latter effect is strongest for highly restricted transport (single-file diffusion), and applies even for irreversible reactions. The analysis is performed utilizing a generalized hydrodynamic formulation of the reaction-diffusion equations which can reliably capture the complex interplay between reaction and restricted transport.
Xu, Zhongwei; Chen, Tingmei; Luo, Jiao; Ding, Shijia; Gao, Sichuan; Zhang, Jian
2017-04-07
Osteophyte is one of the inevitable consequences of progressive osteoarthritis, with the main characteristics of cartilage degeneration and endochondral ossification. The pathogenesis of osteophyte formation is not fully understood to date. In this work, metabolomic approaches were employed to explore potential mechanisms of osteophyte formation by detecting metabolic variations between extracts of osteophyte cartilage tissues (n = 32) and uninvolved control cartilage tissues (n = 34), based on the platform of ultraperformance liquid chromatography tandem quadrupole time-of-flight mass spectrometry, together with multivariate and univariate statistical analysis. The osteophyte group was significantly separated from the control group in the orthogonal partial least-squares discriminant analysis models, indicating that the metabolic state of osteophyte cartilage had been changed. In total, 28 metabolic variations, further validated by mass spectrum (MS) match, tandem mass spectrum (MS/MS) match, and standards match, mainly included amino acids, sulfonic acids, glycerophospholipids, and fatty acyls. These metabolites were related to some specific physiological or pathological processes (collagen dissolution, destruction of boundary layers, triggering of self-restoration, etc.) which might be associated with the process of osteophyte formation. Pathway analysis showed that phenylalanine metabolism (PI = 0.168, p = 0.004) was highly correlated with this degenerative process. Our findings provide a direction for targeted metabolomic studies and insight for further revealing the molecular mechanisms of osteophyte formation.
NASA Astrophysics Data System (ADS)
Takabe, Satoshi; Hukushima, Koji
2016-05-01
Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α = 2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c = e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c = 1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α ≥ 3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c = e/(α - 1), where the replica symmetry is broken.
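As a concrete illustration of the LP relaxation discussed here, the sketch below (Python, scipy.optimize.linprog) relaxes the min-VC integrality constraint x_i in {0, 1} to 0 <= x_i <= 1 on a small Erdös-Rényi graph. It is meant only to show the relaxation itself, with arbitrary graph parameters, and not the statistical-mechanics analysis of its typical behavior.

```python
import numpy as np
from scipy.optimize import linprog

# Small Erdos-Renyi random graph G(n, p).
rng = np.random.default_rng(5)
n, p = 12, 0.3
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if rng.random() < p]

# LP relaxation of minimum vertex cover:
#   minimize sum_i x_i  subject to  x_i + x_j >= 1 for each edge (i, j),
#   with the integrality constraint x_i in {0, 1} relaxed to 0 <= x_i <= 1.
c = np.ones(n)
A_ub = np.zeros((len(edges), n))
for k, (i, j) in enumerate(edges):
    A_ub[k, i] = A_ub[k, j] = -1.0       # -(x_i + x_j) <= -1
b_ub = -np.ones(len(edges))

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n, method="highs")
print(f"LP relaxation value: {res.fun:.2f}")
print("fractional solution:", np.round(res.x, 2))
# Half-integral values (0, 1/2, 1) are characteristic of this relaxation;
# when the LP value matches the integer optimum, the relaxation is tight.
```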
Gautestad, Arild O
2012-09-07
Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the 'power law in disguise' paradox-from a composite Brownian motion consisting of a superposition of independent movement processes at different scales-may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated.
Bi, Xiaohong; Grafe, Ingo; Ding, Hao; Flores, Rene; Munivez, Elda; Jiang, Ming Ming; Dawson, Brian; Lee, Brendan; Ambrose, Catherine G
2017-02-01
Osteogenesis imperfecta (OI) is a group of genetic disorders characterized by brittle bones that are prone to fracture. Although previous studies in animal models investigated the mechanical properties and material composition of OI bone, little work has been conducted to statistically correlate these parameters to identify key compositional contributors to the impaired bone mechanical behaviors in OI. Further, although increased TGF-β signaling has been demonstrated as a contributing mechanism to the bone pathology in OI models, the relationship between mechanical properties and bone composition after anti-TGF-β treatment in OI has not been studied. Here, we performed follow-up analyses of femurs collected in an earlier study from OI mice with and without anti-TGF-β treatment from both recessive (Crtap-/-) and dominant (Col1a2+/P.G610C) OI mouse models and WT mice. Mechanical properties were determined using three-point bending tests and evaluated for statistical correlation with molecular composition in bone tissue assessed by Raman spectroscopy. Statistical regression analysis was conducted to determine significant compositional determinants of mechanical integrity. Interestingly, we found differences in the relationships between bone composition and mechanical properties and in the response to anti-TGF-β treatment. Femurs of both OI models exhibited increased brittleness, which was associated with reduced collagen content and carbonate substitution. In the Col1a2+/P.G610C femurs, reduced hydroxyapatite crystallinity was also found to be associated with increased brittleness, and increased mineral-to-collagen ratio was correlated with increased ultimate strength, elastic modulus, and bone brittleness. In both models of OI, regression analysis demonstrated that collagen content was an important predictor of the increased brittleness. In summary, this work provides new insights into the relationships between bone composition and material properties in models of OI, identifies key bone compositional parameters that correlate with the impaired mechanical integrity of OI bone, and explores the effects of anti-TGF-β treatment on bone-quality parameters in these models. © 2016 American Society for Bone and Mineral Research.
Twenty-five years of maximum-entropy principle
NASA Astrophysics Data System (ADS)
Kapur, J. N.
1983-04-01
The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.
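As a small worked illustration of the formalism (an editorial addition; the support and mean constraint are assumed, in the spirit of Jaynes's die example rather than taken from Kapur's paper), the maximum-entropy distribution on {1, ..., 6} with a prescribed mean reduces to solving for a single Lagrange multiplier:

# Sketch: maximum-entropy distribution on {1,...,6} subject to a mean constraint (illustrative only).
import numpy as np
from scipy.optimize import brentq

values = np.arange(1, 7)
target_mean = 4.5                           # assumed constraint

def mean_given_lambda(lam):
    weights = np.exp(lam * values)          # maxent weights p_i proportional to exp(lam * i)
    p = weights / weights.sum()
    return p @ values

lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -10.0, 10.0)   # solve <i> = target_mean
p = np.exp(lam * values)
p /= p.sum()
print("maxent probabilities:", np.round(p, 4))
print("entropy:", round(float(-np.sum(p * np.log(p))), 4))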
JOURNAL SCOPE GUIDELINES: Paper classification scheme
NASA Astrophysics Data System (ADS)
2005-06-01
This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org). 1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory. 2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models. 3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods. 4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics. 5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials. 6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.
Histological analysis of effects of 24% EDTA gel for nonsurgical treatment of periodontal tissues.
de Vasconcellos, Luana Marotta Reis; Ricardo, Lucilene Hernandes; Balducci, Ivan; de Vasconcellos, Luis Gustavo Oliveira; Carvalho, Yasmin Rodarte
2006-12-01
The aim of this study was to investigate, by means of histological and histomorphometric analysis, the effects of 24% ethylenediaminetetraacetic acid (EDTA) gel in periodontal tissue when used in combination with conventional periodontal treatment. Periodontitis was induced in the 2nd upper left permanent molars of 45 male Wistar rats by means of ligature. After 5 weeks, the ligature was removed and debridement was performed. The animals were then randomly divided into 3 groups; group 1: mechanical treatment, group 2: mechanical treatment and EDTA gel application for 2 min, and group 3: mechanical treatment and placebo gel application for 2 min. After the treatment, rinsing was done with 0.9% saline solution for 1 min in all cases, followed by root notching in the deepest part of the pocket. After 4, 10, and 28 days the animals were sacrificed. The averages obtained were evaluated by means of two-way analysis of variance (ANOVA) and Tukey's test (P < 0.05). The results showed that with respect to the type of treatment employed, there were no statistically significant differences in the vitality of the periodontal tissue. It was concluded that 24% EDTA gel did not interfere with periodontal tissue repair when used in combination with conventional periodontal treatment.
Epidemics in Ming and Qing China: Impacts of changes of climate and economic well-being.
Pei, Qing; Zhang, David D; Li, Guodong; Winterhalder, Bruce; Lee, Harry F
2015-07-01
We investigated the mechanism of epidemics under the impacts of climate change and socio-economic fluctuations in the Ming and Qing Dynasties in China (AD 1368-1901). Using long-term and high-quality datasets, this study is the first quantitative research to verify the 'climate change → economy → epidemics' mechanism in historical China by statistical methods that include correlation analysis, Granger causality analysis, ARX, and Poisson-ARX modeling. The analysis provides evidence that climate change is only the underlying driver of epidemic spread and occurrence, whereas depressed economic well-being is the direct trigger at the national and long-term scale in historical China. Moreover, statistical modeling shows that economic well-being is more important than population pressure in the mechanism of epidemics. However, population pressure remains a key element in determining social vulnerability to epidemics under climate change. Notably, the findings not only support adaptation theories but also strengthen our confidence that climatic shocks can be addressed if economic buffering capacity is promoted steadily. The findings can serve as a basis for scientists and policymakers in addressing global and regional environmental changes. Copyright © 2015 Elsevier Ltd. All rights reserved.
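For readers who want to see the shape of such a test, a minimal Granger-causality sketch on synthetic data is given below (an illustrative addition; the series, lags, and differencing choices are assumptions and do not reproduce the historical datasets used above):

# Sketch: does an "economy" series Granger-cause an "epidemics" series? (synthetic data, illustrative only)
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
T = 300
economy = rng.standard_normal(T).cumsum()                 # stand-in for an economic well-being index
epidemics = np.zeros(T)
for t in range(1, T):                                     # epidemics respond to last period's economy
    epidemics[t] = 0.5 * epidemics[t - 1] - 0.4 * economy[t - 1] + rng.standard_normal()

# First-difference both series to reduce non-stationarity before testing.
data = pd.DataFrame({"epidemics": np.diff(epidemics), "economy": np.diff(economy)})
results = grangercausalitytests(data[["epidemics", "economy"]], maxlag=3)
for lag, res in results.items():
    f_stat, p_value = res[0]["ssr_ftest"][:2]
    print(f"lag {lag}: F = {f_stat:.2f}, p = {p_value:.4f}")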
Bayesian Sensitivity Analysis of Statistical Models with Missing Data
ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG
2013-01-01
Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable, not-missing-at-random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718
Fusco, Diana; Barnum, Timothy J.; Bruno, Andrew E.; Luft, Joseph R.; Snell, Edward H.; Mukherjee, Sayan; Charbonneau, Patrick
2014-01-01
X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis. PMID:24988076
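To make the modeling idea concrete, here is a minimal Gaussian-process classification sketch on synthetic stand-ins for the physico-chemical features mentioned above (an editorial addition; the feature names, sample size and decision rule are invented and the consortium data are not used):

# Sketch: Gaussian-process classification of crystallization propensity (synthetic features, illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 182                                          # echoes the number of proteins above; data are synthetic
side_chain_entropy = rng.standard_normal(n)      # hypothetical predictor
surface_charge = rng.standard_normal(n)          # hypothetical predictor
X = np.column_stack([side_chain_entropy, surface_charge])
# Invented nonlinear "ground truth": low entropy OR strong electrostatics favours crystallization.
y = ((side_chain_entropy < -0.3) | (np.abs(surface_charge) > 1.2)).astype(int)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
scores = cross_val_score(gpc, X, y, cv=5)
print("cross-validated accuracy:", round(float(scores.mean()), 3))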
Radiation from quantum weakly dynamical horizons in loop quantum gravity.
Pranzetti, Daniele
2012-07-06
We provide a statistical mechanical analysis of quantum horizons near equilibrium in the grand canonical ensemble. By matching the description of the nonequilibrium phase in terms of weakly dynamical horizons with a local statistical framework, we implement loop quantum gravity dynamics near the boundary. The resulting radiation process provides a quantum gravity description of the horizon evaporation. For large black holes, the spectrum we derive presents a discrete structure which could be potentially observable.
The Statistical Basis of Chemical Equilibria.
ERIC Educational Resources Information Center
Hauptmann, Siegfried; Menger, Eva
1978-01-01
Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)
Sieve analysis in HIV-1 vaccine efficacy trials
Edlefsen, Paul T.; Gilbert, Peter B.; Rolland, Morgane
2013-01-01
Purpose of review: The genetic characterization of HIV-1 breakthrough infections in vaccine and placebo recipients offers new ways to assess vaccine efficacy trials. Statistical and sequence analysis methods provide opportunities to mine the mechanisms behind the effect of an HIV vaccine. Recent findings: The release of results from two HIV-1 vaccine efficacy trials, Step/HVTN-502 and RV144, led to numerous studies in the last five years, including efforts to sequence HIV-1 breakthrough infections and compare viral characteristics between the vaccine and placebo groups. Novel genetic and statistical analysis methods uncovered features that distinguished founder viruses isolated from vaccinees from those isolated from placebo recipients, and identified HIV-1 genetic targets of vaccine-induced immune responses. Summary: Studies of HIV-1 breakthrough infections in vaccine efficacy trials can provide an independent confirmation to correlates of risk studies, as they take advantage of vaccine/placebo comparisons while correlates of risk analyses are limited to vaccine recipients. Through the identification of viral determinants impacted by vaccine-mediated host immune responses, sieve analyses can shed light on potential mechanisms of vaccine protection. PMID:23719202
Sieve analysis in HIV-1 vaccine efficacy trials.
Edlefsen, Paul T; Gilbert, Peter B; Rolland, Morgane
2013-09-01
The genetic characterization of HIV-1 breakthrough infections in vaccine and placebo recipients offers new ways to assess vaccine efficacy trials. Statistical and sequence analysis methods provide opportunities to mine the mechanisms behind the effect of an HIV vaccine. The release of results from two HIV-1 vaccine efficacy trials, Step/HVTN-502 (HIV Vaccine Trials Network-502) and RV144, led to numerous studies in the last 5 years, including efforts to sequence HIV-1 breakthrough infections and compare viral characteristics between the vaccine and placebo groups. Novel genetic and statistical analysis methods uncovered features that distinguished founder viruses isolated from vaccinees from those isolated from placebo recipients, and identified HIV-1 genetic targets of vaccine-induced immune responses. Studies of HIV-1 breakthrough infections in vaccine efficacy trials can provide an independent confirmation to correlates of risk studies, as they take advantage of vaccine/placebo comparisons, whereas correlates of risk analyses are limited to vaccine recipients. Through the identification of viral determinants impacted by vaccine-mediated host immune responses, sieve analyses can shed light on potential mechanisms of vaccine protection.
Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.
2015-01-01
Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has also become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
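A bare-bones sketch of MI followed by ROC analysis is shown below (an editorial addition; the data, missingness mechanism, number of imputations, and use of scikit-learn's iterative imputer are all assumptions, not the authors' simulation design):

# Sketch: multiple imputation followed by ROC analysis of a biomarker (synthetic data, illustrative only).
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (enables IterativeImputer)
from sklearn.impute import IterativeImputer
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 500
disease = rng.integers(0, 2, size=n)
marker = disease + rng.standard_normal(n)                  # biomarker, higher in diseased subjects
covariate = 0.5 * marker + rng.standard_normal(n)          # auxiliary variable available to the imputer

# Impose missingness on the marker that depends only on the observed covariate (a MAR-type mechanism).
marker_obs = marker.copy()
marker_obs[rng.random(n) < 1.0 / (1.0 + np.exp(-covariate))] = np.nan

aucs = []
for m in range(20):                                        # 20 imputations
    imputer = IterativeImputer(sample_posterior=True, random_state=m)
    completed = imputer.fit_transform(np.column_stack([marker_obs, covariate, disease]))
    aucs.append(roc_auc_score(disease, completed[:, 0]))
print("pooled AUC estimate (mean over imputations):", round(float(np.mean(aucs)), 3))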
Carpe-Carpe, Bienvenida; Hernando-Arizaleta, Lauro; Ibáñez-Pérez, M Carmen; Palomar-Rodríguez, Joaquín A; Esquinas-Rodríguez, Antonio M
2013-08-01
Noninvasive mechanical ventilation (NIV) appeared in the 1980s as an alternative to invasive mechanical ventilation (IMV) in patients with acute respiratory failure. We evaluated the introduction of NIV and the results in patients with acute exacerbation of chronic obstructive pulmonary disease in the Region of Murcia (Spain). A retrospective observational study based on the minimum basic hospital discharge data of all patients hospitalised for this pathology in all public hospitals in the region between 1997 and 2010. We performed a time trend analysis on hospital attendance, the use of each ventilatory intervention and hospital mortality through joinpoint regression. We identified 30,027 hospital discharges. Joinpoint analysis: downward trend in attendance (annual percentage change [APC]=-3.4, 95% CI: -4.8; -2.0, P<.05) and in the group without ventilatory intervention (APC=-4.2%, -5.6; -2.8, P<.05); upward trend in the use of NIV (APC=16.4, 12.0; 20.9, P<.05), and downward trend that was not statistically significant in IMV (APC=-4.5%, -10.3; 1.7). We observed an upward trend without statistical significance in overall mortality (APC=0.5, -1.3; 2.4) and in the group without intervention (APC=0.1, -1.6; 1.9); downward trend with statistical significance in the NIV group (APC=-7.1, -11.7; -2.2, P<.05) and not statistically significant in the IMV group (APC=-0.8, -6.1; 4.8). The mean stay did not change substantially. The introduction of NIV has reduced the group of patients not receiving assisted ventilation. No improvement in results was found in terms of mortality or length of stay. Copyright © 2012 SEPAR. Published by Elsevier España. All rights reserved.
Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.
Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L
2016-02-09
Cluster randomized trials (CRTs) randomize participants in groups, rather than as individuals and are key tools used to assess interventions in health research where treatment contamination is likely or if individual randomization is not feasible. Two potential major pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) trials reported some missing outcome data. Of those reporting missing data, the median percent of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling missing data in practice remains suboptimal. Researchers and applied statisticians should carry out appropriate missing data methods, which are valid under plausible assumptions in order to increase statistical power in trials and reduce the possibility of bias. Sensitivity analysis should be performed, with weakened assumptions regarding the missing data mechanism to explore the robustness of results reported in the primary analysis.
A climatology of total ozone mapping spectrometer data using rotated principal component analysis
NASA Astrophysics Data System (ADS)
Eder, Brian K.; Leduc, Sharon K.; Sickles, Joseph E.
1999-02-01
The spatial and temporal variability of total column ozone (Ω) obtained from the total ozone mapping spectrometer (TOMS version 7.0) during the period 1980-1992 was examined through the use of a multivariate statistical technique called rotated principal component analysis. Utilization of Kaiser's varimax orthogonal rotation led to the identification of 14, mostly contiguous subregions that together accounted for more than 70% of the total Ω variance. Each subregion displayed statistically unique Ω characteristics that were further examined through time series and spectral density analyses, revealing significant periodicities on semiannual, annual, quasi-biennial, and longer term time frames. This analysis facilitated identification of the probable mechanisms responsible for the variability of Ω within the 14 homogeneous subregions. The mechanisms were either dynamical in nature (i.e., advection associated with baroclinic waves, the quasi-biennial oscillation, or El Niño-Southern Oscillation) or photochemical in nature (i.e., production of odd oxygen (O or O3) associated with the annual progression of the Sun). The analysis has also revealed that the influence of a data retrieval artifact, found in equatorial latitudes of version 6.0 of the TOMS data, has been reduced in version 7.0.
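For orientation, a minimal rotated-PCA sketch on a synthetic space-time field is given below (an editorial addition; the grid, the number of retained components, and the hand-rolled varimax routine are assumptions, not the TOMS processing chain):

# Sketch: principal component analysis with varimax rotation on a synthetic space-time field (illustrative only).
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    # Return a varimax-rotated copy of a loadings matrix (variables x components).
    p, k = loadings.shape
    rotation = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated**3 - (gamma / p) * rotated @ np.diag(np.sum(rotated**2, axis=0))))
        rotation = u @ vt
        if s.sum() < var_old * (1 + tol):
            break
        var_old = s.sum()
    return loadings @ rotation

rng = np.random.default_rng(4)
field = rng.standard_normal((156, 60))          # e.g. 13 years x 12 months of data on 60 grid cells (fake)
field -= field.mean(axis=0)                     # remove the time mean at each grid cell
u, s, vt = np.linalg.svd(field, full_matrices=False)
k = 14                                          # retain 14 components, echoing the 14 subregions above
loadings = vt[:k].T * s[:k]                     # unrotated loadings
rotated_loadings = varimax(loadings)
print("rotated loadings shape:", rotated_loadings.shape)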
10 CFR 431.17 - Determination of efficiency.
Code of Federal Regulations, 2010 CFR
2010-01-01
... method or methods used; the mathematical model, the engineering or statistical analysis, computer... accordance with § 431.16 of this subpart, or by application of an alternative efficiency determination method... must be: (i) Derived from a mathematical model that represents the mechanical and electrical...
Non Kolmogorov Probability Models Outside Quantum Mechanics
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2009-03-01
This paper is devoted to analysis of main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, Heisenberg principle, "deterministic" and "exact" theories, laws of chance, notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR-chameleon experiment.
Rodríguez-Arias, Miquel Angel; Rodó, Xavier
2004-03-01
Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies because only a few statistical techniques appear to detect temporary features accurately enough. We introduce here the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. SDC analysis, thanks to the combination of conventional procedures and simple well-known statistical techniques, becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example, full of transitory features, to compare SDC and wavelet analysis, and finally we analyze some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO is the primary climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights about the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
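A simplified sketch of the idea, computing windowed correlations at several observation scales, is given below (an editorial addition; it is not the published SDC implementation, and the series, window lengths, and coupling interval are invented):

# Sketch: windowed correlations at several scales, a crude time-domain analogue of SDC (illustrative only).
import numpy as np

def windowed_correlations(x, y, window):
    # Pearson correlation of x and y inside every sliding window of the given length.
    n = len(x) - window + 1
    return np.array([np.corrcoef(x[i:i + window], y[i:i + window])[0, 1] for i in range(n)])

rng = np.random.default_rng(5)
t = np.arange(600)
driver = np.sin(2 * np.pi * t / 48) + 0.3 * rng.standard_normal(len(t))   # e.g. an ENSO-like index
response = 0.5 * rng.standard_normal(len(t))
response[200:320] += driver[200:320]                                      # transitory coupling only

for window in (12, 24, 48):                                               # observation scales, in time steps
    r = windowed_correlations(driver, response, window)
    best = int(np.nanargmax(np.abs(r)))
    print(f"window {window}: max |r| = {abs(r[best]):.2f} at window start t = {best}")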
Probabilistic finite elements for fatigue and fracture analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Liu, Wing Kam
1992-01-01
Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of a curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.
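To illustrate output type (2) in plain form, a minimal Monte Carlo failure-probability sketch for a simple limit state is given below (an editorial addition; the distributions, their parameters, and the limit state are assumed, and the sketch is not the PFEM or the stochastic boundary element tool):

# Sketch: Monte Carlo estimate of a failure probability for the limit state g = R - S (illustrative only).
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000
resistance = rng.normal(loc=320.0, scale=25.0, size=n)    # assumed strength distribution (MPa)
load_effect = rng.normal(loc=240.0, scale=30.0, size=n)   # assumed stress distribution (MPa)

g = resistance - load_effect                              # failure when g < 0
pf = float(np.mean(g < 0.0))
beta = float(g.mean() / g.std())                          # reliability index for this linear limit state
print(f"estimated failure probability: {pf:.2e}")
print(f"reliability index beta: {beta:.2f}")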
NASA Astrophysics Data System (ADS)
Zhang, Ning; Shahsavari, Rouzbeh
2016-11-01
As the most widely used manufactured material on Earth, concrete poses serious societal and environmental concerns which call for innovative strategies to develop greener concrete with improved strength and toughness, properties that are usually mutually exclusive in man-made materials. Herein, we focus on calcium silicate hydrate (C-S-H), the major binding phase of all Portland cement concretes, and study how engineering its nanovoids and portlandite particle inclusions can impart a balance of strength, toughness and stiffness. By performing an extensive set of more than 600 molecular dynamics simulations coupled with statistical analysis tools, our results provide new evidence of ductile fracture mechanisms in C-S-H - reminiscent of crystalline alloys and ductile metals - decoding the interplay between the crack growth, nanovoid/particle inclusions, and stoichiometry, which dictates the crystalline versus amorphous nature of the underlying matrix. We found that introduction of voids and portlandite particles can significantly increase toughness and ductility, especially in C-S-H with more amorphous matrices, mainly owing to competing mechanisms of crack deflection, void coalescence, internal necking, accommodation, and geometry alteration of individual voids/particles, which together regulate toughness versus strength. Furthermore, utilizing a comprehensive global sensitivity analysis on random configuration-property relations, we show that the mean diameter of voids/particles is the most critical statistical parameter influencing the mechanical properties of C-S-H, irrespective of stoichiometry or the crystalline or amorphous nature of the matrix. This study provides new fundamental insights, design guidelines, and de novo strategies to turn the brittle C-S-H into a ductile material, impacting modern engineering of strong and tough concrete infrastructures and potentially other complex brittle materials.
Fairchild, Amanda J.; Abara, Winston E.; Gottschall, Amanda C.; Tein, Jenn-Yun; Prinz, Ronald J.
2015-01-01
The purpose of this article is to introduce and describe a statistical model that researchers can use to evaluate underlying mechanisms of behavioral onset and other event occurrence outcomes. Specifically, the article develops a framework for estimating mediation effects with outcomes measured in discrete-time epochs by integrating the statistical mediation model with discrete-time survival analysis. The methodology has the potential to help strengthen health research by targeting prevention and intervention work more effectively as well as by improving our understanding of discretized periods of risk. The model is applied to an existing longitudinal data set to demonstrate its use, and programming code is provided to facilitate its implementation. PMID:24296470
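The person-period setup that underlies such a model can be sketched in a few lines (an editorial addition using synthetic data; the effect sizes, the logistic hazard specification, and the product-of-coefficients summary are assumptions, not the authors' estimator):

# Sketch: discrete-time survival mediation via person-period expansion and a logistic hazard model
# (synthetic data, illustrative only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n, epochs = 800, 6
treat = rng.integers(0, 2, size=n)
mediator = 0.8 * treat + rng.standard_normal(n)            # a-path: treatment -> mediator

rows = []
for i in range(n):
    for t in range(1, epochs + 1):
        logit = -3.0 + 0.6 * mediator[i] + 0.1 * treat[i]  # b-path plus a small direct effect
        event = rng.random() < 1.0 / (1.0 + np.exp(-logit))
        rows.append({"id": i, "epoch": t, "treat": treat[i], "mediator": mediator[i], "event": int(event)})
        if event:
            break                                          # no person-period records after the event
pp = pd.DataFrame(rows)

a_path = smf.ols("mediator ~ treat", data=pp.drop_duplicates("id")).fit()
hazard = smf.logit("event ~ C(epoch) + treat + mediator", data=pp).fit(disp=0)
indirect = a_path.params["treat"] * hazard.params["mediator"]   # product-of-coefficients estimate
print("indirect (mediated) effect on the logit-hazard scale:", round(float(indirect), 3))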
NASA Astrophysics Data System (ADS)
Wu, Y.; Chen, G. L.; Hui, X. D.; Liu, C. T.; Lin, Y.; Shang, X. C.; Lu, Z. P.
2009-10-01
Based on mechanical instability of individual shear transformation zones (STZs), a quantitative link between the microplastic instability and macroscopic deformation behavior of metallic glasses was proposed. Our analysis confirms that macroscopic metallic glasses comprise a statistical distribution of STZ embryos with distributed values of activation energy, and the microplastic instability of all the individual STZs dictates the macroscopic deformation behavior of amorphous solids. The statistical model presented in this paper can successfully reproduce the macroscopic stress-strain curves determined experimentally and readily be used to predict strain-rate effects on the macroscopic responses with the availability of the material parameters at a certain strain rate, which offer new insights into understanding the actual deformation mechanism in amorphous solids.
Gong, Anmin; Liu, Jianping; Chen, Si; Fu, Yunfa
2018-01-01
To study the physiologic mechanism of the brain during different motor imagery (MI) tasks, the authors employed a method of brain-network modeling based on time-frequency cross mutual information obtained from 4-class (left hand, right hand, feet, and tongue) MI tasks recorded as brain-computer interface (BCI) electroencephalography data. The authors explored the brain network revealed by these MI tasks using statistical analysis and the analysis of topologic characteristics, and observed significant differences in the reaction level, reaction time, and activated target during 4-class MI tasks. There was a great difference in the reaction level between the execution and resting states during different tasks: the reaction level of the left-hand MI task was the greatest, followed by that of the right-hand, feet, and tongue MI tasks. The reaction time required to perform the tasks also differed: during the left-hand and right-hand MI tasks, the brain networks of subjects reacted promptly and strongly, but there was a delay during the feet and tongue MI task. Statistical analysis and the analysis of network topology revealed the target regions of the brain network during different MI processes. In conclusion, our findings suggest a new way to explain the neural mechanism behind MI.
Lei, Tianli; Chen, Shifeng; Wang, Kai; Zhang, Dandan; Dong, Lin; Lv, Chongning; Wang, Jing; Lu, Jincai
2018-02-01
Bupleuri Radix is a commonly used herb in clinic, and raw and vinegar-baked Bupleuri Radix are both documented in the Pharmacopoeia of People's Republic of China. According to the theories of traditional Chinese medicine, Bupleuri Radix possesses different therapeutic effects before and after processing. However, the chemical mechanism of this processing is still unknown. In this study, ultra-high-performance liquid chromatography with quadrupole time-of-flight mass spectrometry coupled with multivariate statistical analysis including principal component analysis and orthogonal partial least squares-discriminant analysis was developed to holistically compare the difference between raw and vinegar-baked Bupleuri Radix for the first time. As a result, 50 peaks in raw and processed Bupleuri Radix were detected, respectively, and a total of 49 chemical compounds were identified. Saikosaponin a, saikosaponin d, saikosaponin b3, saikosaponin e, saikosaponin c, saikosaponin b2, saikosaponin b1, 4''-O-acetyl-saikosaponin d, hyperoside and 3',4'-dimethoxy quercetin were explored as potential markers of raw and vinegar-baked Bupleuri Radix. This study has been successfully applied for the global analysis of raw and vinegar-processed samples. Furthermore, the underlying hepatoprotective mechanism of Bupleuri Radix was predicted, which was related to the changes in chemical profiling. Copyright © 2017 John Wiley & Sons, Ltd.
Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander
2012-01-01
We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics in six time orders, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even more chaotic—mixing. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion with no influence of other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily implemented to other data sets to enable quick and accurate analysis of their statistical characteristics. PMID:23199912
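For readers who want to try the first step of such an analysis, a minimal sketch of the time-averaged mean square displacement and its scaling exponent is given below (an editorial addition; a plain random walk stands in for the measured telomere trajectories, so the expected exponent here is 1 rather than the subdiffusive values reported above):

# Sketch: time-averaged MSD and anomalous-diffusion exponent for a 2D trajectory (illustrative only).
import numpy as np

def time_averaged_msd(track, max_lag):
    # Time-averaged MSD of a trajectory (shape: steps x 2) for lags 1..max_lag.
    lags = np.arange(1, max_lag + 1)
    msd = [np.mean(np.sum((track[lag:] - track[:-lag])**2, axis=1)) for lag in lags]
    return lags, np.array(msd)

rng = np.random.default_rng(8)
track = np.cumsum(rng.standard_normal((5000, 2)), axis=0)    # stand-in trajectory

lags, msd = time_averaged_msd(track, max_lag=200)
alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)          # slope of the log-log MSD curve
print(f"estimated scaling exponent alpha = {alpha:.2f}  (alpha < 1 would indicate subdiffusion)")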
Frazier, Thomas W; Ratliff, Kristin R; Gruber, Chris; Zhang, Yi; Law, Paul A; Constantino, John N
2014-01-01
Understanding the factor structure of autistic symptomatology is critical to the discovery and interpretation of causal mechanisms in autism spectrum disorder. We applied confirmatory factor analysis and assessment of measurement invariance to a large (N = 9635) accumulated collection of reports on quantitative autistic traits using the Social Responsiveness Scale, representing a broad diversity of age, severity, and reporter type. A two-factor structure (corresponding to social communication impairment and restricted, repetitive behavior) as elaborated in the updated Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) criteria for autism spectrum disorder exhibited acceptable model fit in confirmatory factor analysis. Measurement invariance was appreciable across age, sex, and reporter (self vs other), but somewhat less apparent between clinical and nonclinical populations in this sample comprised of both familial and sporadic autism spectrum disorders. The statistical power afforded by this large sample allowed relative differentiation of three factors among items encompassing social communication impairment (emotion recognition, social avoidance, and interpersonal relatedness) and two factors among items encompassing restricted, repetitive behavior (insistence on sameness and repetitive mannerisms). Cross-trait correlations remained extremely high, that is, on the order of 0.66-0.92. These data clarify domains of statistically significant factorial separation that may relate to partially, but not completely, overlapping biological mechanisms, contributing to variation in human social competency. Given such robust intercorrelations among symptom domains, understanding their co-emergence remains a high priority in conceptualizing common neural mechanisms underlying autistic syndromes.
Cumulative (Dis)Advantage and the Matthew Effect in Life-Course Analysis
Bask, Miia; Bask, Mikael
2015-01-01
To foster a deeper understanding of the mechanisms behind inequality in society, it is crucial to work with well-defined concepts associated with such mechanisms. The aim of this paper is to define cumulative (dis)advantage and the Matthew effect. We argue that cumulative (dis)advantage is an intra-individual micro-level phenomenon, that the Matthew effect is an inter-individual macro-level phenomenon and that an appropriate measure of the Matthew effect focuses on the mechanism or dynamic process that generates inequality. The Matthew mechanism is, therefore, a better name for the phenomenon, where we provide a novel measure of the mechanism, including a proof-of-principle analysis using disposable personal income data. Finally, because socio-economic theory should be able to explain cumulative (dis)advantage and the Matthew mechanism when they are detected in data, we discuss the types of models that may explain the phenomena. We argue that interactions-based models in the literature traditions of analytical sociology and statistical mechanics serve this purpose. PMID:26606386
NASA Technical Reports Server (NTRS)
1981-01-01
The application of statistical methods to recorded ozone measurements is described. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis serves a checks-and-balances function: time series analysis separates variations into systematic and random parts, verifies that errors are uncorrelated, and identifies significant phase-lag dependencies. The use of time series modeling to enhance the capability of detecting trends is discussed.
Laser Velocimeter Measurements and Analysis in Turbulent Flows with Combustion. Part 2.
1983-07-01
...sampling error for this sample size. Mean velocities and turbulence intensities were found to be statistically accurate to ±1% and 13%, respectively... Although the statistical error was found to be rather small (±1% for mean velocities and 13% for turbulence intensities), there can be additional...
Principle of maximum entropy for reliability analysis in the design of machine components
NASA Astrophysics Data System (ADS)
Zhang, Yimin
2018-03-01
We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
Hayes, Andrew F; Rockwood, Nicholas J
2017-11-01
There have been numerous treatments in the clinical research literature about various design, analysis, and interpretation considerations when testing hypotheses about mechanisms and contingencies of effects, popularly known as mediation and moderation analysis. In this paper we address the practice of mediation and moderation analysis using linear regression in the pages of Behaviour Research and Therapy and offer some observations and recommendations, debunk some popular myths, describe some new advances, and provide an example of mediation, moderation, and their integration as conditional process analysis using the PROCESS macro for SPSS and SAS. Our goal is to nudge clinical researchers away from historically significant but increasingly old school approaches toward modifications, revisions, and extensions that characterize more modern thinking about the analysis of the mechanisms and contingencies of effects. Copyright © 2016 Elsevier Ltd. All rights reserved.
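As a bare-bones analogue of what such tools automate, the sketch below estimates an indirect effect with a percentile bootstrap and a moderation effect with an interaction term on synthetic data (an editorial addition; it is not the PROCESS macro, and the model, effect sizes, and bootstrap settings are assumptions):

# Sketch: regression-based mediation (a*b with a percentile bootstrap) and moderation (x:w interaction)
# on synthetic data (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 400
df = pd.DataFrame({"x": rng.standard_normal(n), "w": rng.standard_normal(n)})
df["m"] = 0.5 * df["x"] + rng.standard_normal(n)                                  # a-path
df["y"] = 0.4 * df["m"] + 0.2 * df["x"] + 0.3 * df["x"] * df["w"] + rng.standard_normal(n)

# Mediation: indirect effect a*b with a percentile bootstrap confidence interval.
boot = []
for _ in range(1000):
    s = df.sample(n, replace=True)
    a = smf.ols("m ~ x", data=s).fit().params["x"]
    b = smf.ols("y ~ m + x", data=s).fit().params["m"]
    boot.append(a * b)
print("indirect effect a*b, 95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]).round(3))

# Moderation: the x:w coefficient tests whether the effect of x on y depends on w.
moderation = smf.ols("y ~ x * w + m", data=df).fit()
print(moderation.params[["x", "w", "x:w"]].round(3))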
Blow molding electric drives of Mechanical Engineering
NASA Astrophysics Data System (ADS)
Bukhanov, S. S.; Ramazanov, M. A.; Tsirkunenko, A. T.
2018-03-01
The article analyses the new possibilities offered by the use of adjustable electric drives for the blowing mechanisms of plastics production. The use of new semiconductor converters makes it possible not only to compensate for the instability of the supply network by using special dynamic voltage regulators, but also to improve (correct) the power factor. The calculation of the economic efficiency of controlled electric drives for blowing mechanisms is given. On the basis of statistical analysis, the reliability parameters of the elements of the regulated electric drives under consideration are calculated. It is shown that the reliability of adjustable electric drives can be increased both by oversizing the installed power of the electric drive and by using simpler schemes with pulse-vector control.
Matsumoto, Takao; Ishikawa, Ryo; Tohei, Tetsuya; Kimura, Hideo; Yao, Qiwen; Zhao, Hongyang; Wang, Xiaolin; Chen, Dapeng; Cheng, Zhenxiang; Shibata, Naoya; Ikuhara, Yuichi
2013-10-09
A state-of-the-art spherical aberration-corrected STEM was fully utilized to directly visualize the multiferroic domain structure in a hexagonal YMnO3 single crystal at atomic scale. With the aid of multivariate statistical analysis (MSA), we obtained unbiased and quantitative maps of ferroelectric domain structures with atomic resolution. Such a statistical image analysis of the transition region between opposite polarizations has confirmed atomically sharp transitions of ferroelectric polarization both in antiparallel (uncharged) and tail-to-tail 180° (charged) domain boundaries. Through the analysis, a correlated subatomic image shift of Mn-O layers with that of Y layers, exhibiting a double-arc shape of reversed curvatures, has been elucidated. The image shift in Mn-O layers along the c-axis is statistically significant despite being as small as 0.016 nm, roughly one-third of the evident image shift of 0.048 nm in Y layers. Interestingly, a careful analysis has shown that such a subatomic image shift in Mn-O layers vanishes at the tail-to-tail 180° domain boundaries. Furthermore, taking advantage of the annular bright field (ABF) imaging technique combined with MSA, the tilting of MnO5 bipyramids, the very core mechanism of multiferroicity of the material, is evaluated.
Modelling multiple sources of dissemination bias in meta-analysis.
Bowden, Jack; Jackson, Dan; Thompson, Simon G
2010-03-30
Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
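For context, the most common single-mechanism diagnostic that such multiple-bias models generalize is Egger's regression test for funnel-plot asymmetry; a minimal sketch on synthetic study results follows (an editorial addition; the effect sizes, standard errors, and crude selection rule are invented):

# Sketch: Egger's regression test for funnel-plot asymmetry on synthetic meta-analysis data (illustrative only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
k = 40
se = rng.uniform(0.05, 0.5, size=k)                        # study standard errors
effect = 0.2 + rng.normal(scale=se)                        # true effect 0.2 plus sampling error
published = (effect / se > 1.0) | (rng.random(k) < 0.4)    # crude selection toward "significant" results
effect, se = effect[published], se[published]

# Egger test: regress the standardized effect on precision; a nonzero intercept suggests asymmetry.
precision = 1.0 / se
model = sm.OLS(effect / se, sm.add_constant(precision)).fit()
print(f"Egger intercept = {model.params[0]:.2f}, p = {model.pvalues[0]:.3f}")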
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of the change in the frequency of ECM categories; before 1986, the appearance of ECMs was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to categorical climatic time series opens up new potential insight into climate variability and change studies that have to be performed in the future. PMID:27116375
Scout trajectory error propagation computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1982-01-01
Since 1969, flight experience has been used as the basis for predicting Scout orbital accuracy. The data used for calculating the accuracy consists of errors in the trajectory parameters (altitude, velocity, etc.) at stage burnout as observed on Scout flights. Approximately 50 sets of errors are used in Monte Carlo analysis to generate error statistics in the trajectory parameters. A covariance matrix is formed which may be propagated in time. The mechanization of this process resulted in computer program Scout Trajectory Error Propagation (STEP) and is described herein. Computer program STEP may be used in conjunction with the Statistical Orbital Analysis Routine to generate accuracy in the orbit parameters (apogee, perigee, inclination, etc.) based upon flight experience.
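The covariance-forming and propagation step can be sketched in a few lines (an editorial addition; the error magnitudes and the Jacobian mapping burnout-state errors to orbit-parameter errors are invented placeholders, not STEP's actual models):

# Sketch: sample covariance of burnout-state errors and linearized propagation to orbit parameters
# (illustrative only).
import numpy as np

rng = np.random.default_rng(11)
# Stand-in for ~50 sets of observed errors in (altitude [m], velocity [m/s], flight-path angle [deg]).
errors = rng.multivariate_normal(mean=np.zeros(3),
                                 cov=np.diag([900.0**2, 6.0**2, 0.05**2]), size=50)
P = np.cov(errors, rowvar=False)                          # sample covariance of burnout-state errors

# Linearized propagation P_orbit = J P J^T with an assumed Jacobian of orbit parameters w.r.t. state.
J = np.array([[1.8, 120.0, 400.0],                        # apogee sensitivities (assumed values)
              [0.9,  80.0, 250.0],                        # perigee sensitivities
              [0.0,   0.0,   1.1]])                       # inclination sensitivities
P_orbit = J @ P @ J.T
print("1-sigma orbit-parameter errors:", np.sqrt(np.diag(P_orbit)).round(2))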
Observational Word Learning: Beyond Propose-But-Verify and Associative Bean Counting.
Roembke, Tanja; McMurray, Bob
2016-04-01
Learning new words is difficult. In any naming situation, there are multiple possible interpretations of a novel word. Recent approaches suggest that learners may solve this problem by tracking co-occurrence statistics between words and referents across multiple naming situations (e.g. Yu & Smith, 2007), overcoming the ambiguity in any one situation. Yet, there remains debate around the underlying mechanisms. We conducted two experiments in which learners acquired eight word-object mappings using cross-situational statistics while eye-movements were tracked. These addressed four unresolved questions regarding the learning mechanism. First, eye-movements during learning showed evidence that listeners maintain multiple hypotheses for a given word and bring them all to bear in the moment of naming. Second, trial-by-trial analyses of accuracy suggested that listeners accumulate continuous statistics about word/object mappings, over and above prior hypotheses they have about a word. Third, consistent, probabilistic context can impede learning, as false associations between words and highly co-occurring referents are formed. Finally, a number of factors not previously considered in prior analysis impact observational word learning: knowledge of the foils, spatial consistency of the target object, and the number of trials between presentations of the same word. This evidence suggests that observational word learning may derive from a combination of gradual statistical or associative learning mechanisms and more rapid real-time processes such as competition, mutual exclusivity and even inference or hypothesis testing.
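A toy version of the associative account the experiments probe can be written in a few lines (an editorial addition; the vocabulary size, trial structure, and counting rule are assumptions, not the experimental paradigm or a full model of the competition and mutual-exclusivity effects discussed above):

# Sketch: a bare-bones cross-situational learner that accumulates word-object co-occurrence counts
# across ambiguous naming trials (illustrative only).
import numpy as np

rng = np.random.default_rng(12)
n_words = n_objects = 8
true_mapping = rng.permutation(n_objects)          # hidden word -> referent assignment
counts = np.zeros((n_words, n_objects))

for trial in range(60):
    words = rng.choice(n_words, size=4, replace=False)      # four words uttered per trial
    objects = true_mapping[words]                           # their referents are present...
    rng.shuffle(objects)                                    # ...but in no informative order
    for w in words:                                         # credit every word-object pairing on the trial
        counts[w, objects] += 1

learned = counts.argmax(axis=1)                             # pick the most frequent co-occurrer per word
print("proportion of words mapped correctly:", float(np.mean(learned == true_mapping)))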
NASA Astrophysics Data System (ADS)
Schneider, Markus P. A.
This dissertation contributes to two areas in economics: the understanding of the distribution of earned income and Bayesian analysis of distributional data. Recently, physicists claimed that the distribution of earned income is exponential (see Yakovenko, 2009). The first chapter explores the perspective that the economy is a statistical mechanical system and critically considers the implications for labor market outcomes. The robustness of the empirical results that led to the physicists' claims, the significance of the exponential distribution in statistical mechanics, and the case for a conservation law in economics are discussed. The conclusion reached is that the physicists' conception of the economy is too narrow even within their chosen framework, but that their overall approach is insightful. The dual labor market theory of segmented labor markets is invoked to understand why the observed distribution may be a mixture of distributional components, corresponding to different generating mechanisms described in Reich et al. (1973). The application of informational entropy in chapter II connects this work to Bayesian analysis and maximum entropy econometrics. The analysis follows E. T. Jaynes's treatment of Wolf's dice data, but is applied to the distribution of earned income based on CPS data. The results are calibrated to account for rounded survey responses using a simple simulation, and respond to the graphical analyses by physicists. The results indicate that neither the income distribution of all respondents nor that of the subpopulation used by physicists appears to be exponential. The empirics do support the claim that a mixture with exponential and log-normal distributional components fits the data. In the final chapter, a log-linear model is used to fit the exponential to the earned income distribution. Separating the CPS data by gender and marital status reveals that the exponential is only an appropriate model for a limited number of subpopulations, namely the never married and women. The estimated parameter for never-married men's incomes is significantly different from the parameter estimated for never-married women, implying that either the combined distribution is not exponential or that the individual distributions are not exponential. However, it substantiates the existence of a persistent gender income gap among the never-married. References: Reich, M., D. M. Gordon, and R. C. Edwards (1973). A Theory of Labor Market Segmentation. Quarterly Journal of Economics 63, 359-365. Yakovenko, V. M. (2009). Econophysics, Statistical Mechanics Approach to. In R. A. Meyers (Ed.), Encyclopedia of Complexity and System Science. Springer.
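A minimal version of the log-linear check described in the final chapter can be sketched as follows (an editorial addition; the synthetic exponential-plus-log-normal mixture stands in for the CPS microdata, which are not reproduced here):

# Sketch: log-linear check of the exponential hypothesis on synthetic "income" data (illustrative only).
import numpy as np

rng = np.random.default_rng(13)
income = np.concatenate([rng.exponential(scale=40_000, size=8_000),         # exponential bulk
                         rng.lognormal(mean=11.0, sigma=0.5, size=2_000)])  # log-normal component

counts, edges = np.histogram(income, bins=50, range=(0, 250_000))
centers = 0.5 * (edges[:-1] + edges[1:])
keep = counts > 0
# Under a pure exponential, log(counts) is linear in income; curvature signals departure from it.
slope, intercept = np.polyfit(centers[keep], np.log(counts[keep]), 1)
fitted_log = intercept + slope * centers[keep]
print("implied exponential scale ('temperature'):", round(-1.0 / slope))
print("max absolute deviation of log(counts) from the log-linear fit:",
      round(float(np.max(np.abs(np.log(counts[keep]) - fitted_log))), 2))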
NASA Astrophysics Data System (ADS)
Fakir, Rachid; Barka, Noureddine; Brousseau, Jean
2018-03-01
This paper proposes a statistical approach to analyze the mechanical properties of a standard test specimen, of cylindrical geometry and in steel 4340, with a diameter of 6 mm, heat-treated and quenched in three different fluids. Samples were evaluated in standard tensile tests to assess their characteristic quantities: hardness, modulus of elasticity, yield strength, tensile strength and ultimate deformation. The proposed approach is built up progressively through (a) a presentation of the experimental device, (b) a presentation of the experimental plan and the results of the mechanical tests, (c) an analysis of variance (ANOVA) and a representation of the output responses using the response surface method (RSM), and (d) an analysis of the results and discussion. The feasibility and effectiveness of the proposed approach lead to a precise and reliable model capable of predicting the variation of mechanical properties as a function of the tempering temperature, the tempering time and the cooling capacity of the quenching medium.
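To make the response-surface step concrete, the following minimal Python sketch fits a second-order response surface to hypothetical tempering data. The variable names and all numerical values are illustrative assumptions, not data or models from the paper.

```python
import numpy as np

# Hypothetical tempering experiments: temperature (C), time (h), measured tensile strength (MPa).
# These values are illustrative placeholders, not data from the study.
temp = np.array([200., 200., 300., 300., 400., 400., 500., 500., 600.])
time = np.array([1., 2., 1., 2., 1., 2., 1., 2., 1.5])
strength = np.array([1700., 1680., 1550., 1520., 1350., 1320., 1150., 1120., 980.])

# Second-order response surface: y = b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
X = np.column_stack([np.ones_like(temp), temp, time, temp**2, time**2, temp * time])
coeffs, residuals, rank, _ = np.linalg.lstsq(X, strength, rcond=None)

def predict(T, t):
    """Predict tensile strength for a tempering temperature T and time t."""
    return coeffs @ np.array([1.0, T, t, T**2, t**2, T * t])

print("fitted coefficients:", coeffs)
print("predicted strength at 350 C, 1.5 h:", predict(350.0, 1.5))
```

A full treatment would add the quenching medium as a categorical factor and check the ANOVA significance of each term before using the surface for prediction.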
Statistical mechanics framework for static granular matter.
Henkes, Silke; Chakraborty, Bulbul
2009-06-01
The physical properties of granular materials have been extensively studied in recent years. So far, however, there exists no theoretical framework which can explain the observations in a unified manner beyond the phenomenological jamming diagram. This work focuses on the case of static granular matter, for which we have constructed a statistical ensemble which mirrors equilibrium statistical mechanics. This ensemble, which is based on the conservation properties of the stress tensor, is distinct from the original Edwards ensemble and applies to packings of deformable grains. We combine it with a field theoretical analysis of the packings, where the field is the Airy stress function derived from the force and torque balance conditions. In this framework, Point J is characterized by a diverging stiffness of the pressure fluctuations. Separately, we present a phenomenological mean-field theory of the jamming transition, which incorporates the mean contact number as a variable. We link both approaches in the context of the marginal rigidity picture proposed by Wyart and others.
Non-equilibrium statistical mechanics theory for the large scales of geophysical flows
NASA Astrophysics Data System (ADS)
Eric, S.; Bouchet, F.
2010-12-01
The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed: for the 2D Navier-Stokes equations with stochastic forces, time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component show non-equilibrium phase transitions, with bistability between dipole and unidirectional flows, as predicted by statistical mechanics. F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO. F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports.
Statistical mechanics based on fractional classical and quantum mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. At the first stage, we present the thermodynamical properties of the classical ideal gas and of a system of N classical oscillators. In both cases, the Hamiltonian contains fractional exponents of the phase space variables (position and momentum). At the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamical properties of black body radiation and study Bose-Einstein statistics, with the related problem of condensation, and Fermi-Dirac statistics.
Patankar, Ravindra
2003-10-01
Statistical fatigue life of a ductile alloy specimen is traditionally divided into three stages, namely, crack nucleation, small crack growth, and large crack growth. Crack nucleation and small crack growth show a wide variation and hence a large spread on the cycles-versus-crack-length graph. By comparison, large crack growth shows less variation. Therefore, different models are fitted to the different stages of the fatigue evolution process, thus treating different stages as different phenomena. With these independent models, it is impossible to predict one phenomenon based on the information available about the other. Experimentally, it is easier to carry out crack length measurements of large cracks than of nucleating and small cracks. Thus, it is easier to collect statistical data for large crack growth than to undertake the painstaking effort it would take to collect statistical data for crack nucleation and small crack growth. This article presents a fracture mechanics-based stochastic model of fatigue crack growth in ductile alloys that are commonly encountered in mechanical structures and machine components. The model has been validated by Ray (1998) for crack propagation against various statistical fatigue data. Based on the model, this article proposes a technique to predict statistical information on fatigue crack nucleation and small crack growth properties using the statistical properties of large crack growth under constant amplitude stress excitation. The statistical properties of large crack growth under constant amplitude stress excitation can be obtained via experiments.
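As an illustration of how a stochastic crack-growth model can generate statistical life information, the sketch below randomizes the coefficient of a simple Paris law and integrates it by Monte Carlo. All parameter values and the lognormal scatter model are assumptions for illustration; this is not the Ray (1998) model referenced above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Paris-law parameters (illustrative, not from the article): da/dN = C * (dK)^m,
# with the coefficient C randomized log-normally to mimic specimen-to-specimen scatter.
m = 3.0
C_median = 1e-11            # m/cycle per (MPa*sqrt(m))^m
sigma_logC = 0.3            # scatter of log10(C)
delta_sigma = 100.0         # constant-amplitude stress range, MPa
a0, a_crit = 1e-3, 25e-3    # initial and critical crack lengths, m

def cycles_to_failure(C, dN=1000):
    """Integrate the Paris law numerically until the crack reaches a_crit."""
    a, N = a0, 0
    while a < a_crit:
        dK = delta_sigma * np.sqrt(np.pi * a)   # simple geometry factor Y = 1
        a += C * dK**m * dN
        N += dN
    return N

samples = [cycles_to_failure(10**(np.log10(C_median) + sigma_logC * rng.standard_normal()))
           for _ in range(500)]
print("median life:", np.median(samples), "cycles; cov:", np.std(samples) / np.mean(samples))
```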
Many-Body Localization and Thermalization in Quantum Statistical Mechanics
NASA Astrophysics Data System (ADS)
Nandkishore, Rahul; Huse, David A.
2015-03-01
We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.
Cortical mechanisms for the segregation and representation of acoustic textures.
Overath, Tobias; Kumar, Sukhbinder; Stewart, Lauren; von Kriegstein, Katharina; Cusack, Rhodri; Rees, Adrian; Griffiths, Timothy D
2010-02-10
Auditory object analysis requires two fundamental perceptual processes: the definition of the boundaries between objects, and the abstraction and maintenance of an object's characteristic features. Although it is intuitive to assume that the detection of the discontinuities at an object's boundaries precedes the subsequent precise representation of the object, the specific underlying cortical mechanisms for segregating and representing auditory objects within the auditory scene are unknown. We investigated the cortical bases of these two processes for one type of auditory object, an "acoustic texture," composed of multiple frequency-modulated ramps. In these stimuli, we independently manipulated the statistical rules governing (1) the frequency-time space within individual textures (comprising ramps with a given spectrotemporal coherence) and (2) the boundaries between textures (adjacent textures with different spectrotemporal coherences). Using functional magnetic resonance imaging, we show mechanisms defining boundaries between textures with different coherences in primary and association auditory cortices, whereas texture coherence is represented only in association cortex. Furthermore, participants' superior detection of boundaries across which texture coherence increased (as opposed to decreased) was reflected in a greater neural response in auditory association cortex at these boundaries. The results suggest a hierarchical mechanism for processing acoustic textures that is relevant to auditory object analysis: boundaries between objects are first detected as a change in statistical rules over frequency-time space, before a representation that corresponds to the characteristics of the perceived object is formed.
Rotenone and paraquat perturb dopamine metabolism: a computational analysis of pesticide toxicity
Qi, Zhen; Miller, Gary W.; Voit, Eberhard O.
2014-01-01
Pesticides, such as rotenone and paraquat, are suspected in the pathogenesis of Parkinson’s disease (PD), whose hallmark is the progressive loss of dopaminergic neurons in the substantia nigra pars compacta. Thus, compounds expected to play a role in the pathogenesis of PD will likely impact the function of dopaminergic neurons. To explore the relationship between pesticide exposure and dopaminergic toxicity, we developed a custom-tailored mathematical model of dopamine metabolism and utilized it to infer potential mechanisms underlying the toxicity of rotenone and paraquat, asking how these pesticides perturb specific processes. We performed two types of analyses, which are conceptually different and complement each other. The first analysis, a purely algebraic reverse engineering approach, analytically and deterministically computes the altered profile of enzyme activities that characterize the effects of a pesticide. The second method consists of large-scale Monte Carlo simulations that statistically reveal possible mechanisms of pesticides. The results from the reverse engineering approach show that rotenone and paraquat exposures lead to distinctly different flux perturbations. Rotenone seems to affect all fluxes associated with dopamine compartmentalization, whereas paraquat exposure perturbs fluxes associated with dopamine and its breakdown metabolites. The statistical results of the Monte-Carlo analysis suggest several specific mechanisms. The findings are interesting, because no a priori assumptions are made regarding specific pesticide actions, and all parameters characterizing the processes in the dopamine model are treated in an unbiased manner. Our results show how approaches from computational systems biology can help identify mechanisms underlying the toxicity of pesticide exposure. PMID:24269752
Applied Mathematical Methods in Theoretical Physics
NASA Astrophysics Data System (ADS)
Masujima, Michio
2005-04-01
All there is to know about functional analysis, integral equations and calculus of variations in a single volume. This advanced textbook is divided into two parts: the first on integral equations and the second on the calculus of variations. It begins with a short introduction to functional analysis, including a short review of complex analysis, before continuing with a systematic discussion of different types of equations, such as Volterra integral equations, singular integral equations of Cauchy type, and integral equations of the Fredholm type, with a special emphasis on Wiener-Hopf integral equations and Wiener-Hopf sum equations. After a few remarks on the historical development, the second part starts with an introduction to the calculus of variations and the relationship between integral equations and applications of the calculus of variations. It further covers applications of the calculus of variations developed in the second half of the 20th century in the fields of quantum mechanics, quantum statistical mechanics and quantum field theory. Throughout the book, the author presents over 150 problems and exercises -- many from such branches of physics as quantum mechanics, quantum statistical mechanics, and quantum field theory -- together with outlines of the solutions in each case. Detailed solutions are given, supplementing the materials discussed in the main text, allowing problems to be solved making direct use of the method illustrated. The original references are given for difficult problems. The result is complete coverage of the mathematical tools and techniques used by physicists and applied mathematicians. Intended for senior undergraduates and first-year graduates in science and engineering, this is equally useful as a reference and self-study guide.
NASA Astrophysics Data System (ADS)
Yousif, Dilon
The purpose of this study was to improve the Quality Assurance (QA) system at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al alloy(s) using IRC techniques. The impact of low melting point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).
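A minimal sketch of the control-limit recalculation described above, assuming a hypothetical set of Si readings taken under stable furnace conditions; the three-sigma individuals-chart limits are a generic choice, not necessarily the plant's exact procedure.

```python
import numpy as np

# Hypothetical Si content readings (wt%) taken under stable furnace conditions;
# values are placeholders, not actual W319 plant data.
si = np.array([7.45, 7.52, 7.48, 7.55, 7.50, 7.47, 7.53, 7.49, 7.51, 7.46])

mean = si.mean()
sigma = si.std(ddof=1)

# Three-sigma limits for an individuals chart on alloy composition.
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
print(f"center line = {mean:.3f} wt%, LCL = {lcl:.3f}, UCL = {ucl:.3f}")
```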
Evidence of the non-extensive character of Earth's ambient noise.
NASA Astrophysics Data System (ADS)
Koutalonis, Ioannis; Vallianatos, Filippos
2017-04-01
Investigation of the dynamical features of ambient seismic noise is an important scientific and practical research challenge. At the same time there is growing interest in approaches to Earth physics based on the science of complex systems and non-extensive statistical mechanics, a generalization of Boltzmann-Gibbs statistical physics (Vallianatos et al., 2016). This seems to be a promising framework for studying complex systems exhibiting phenomena such as long-range interactions and memory effects. In this work we use non-extensive statistical mechanics and signal analysis methods to explore the nature of ambient noise as measured at the stations of the HSNC in the South Aegean (Chatzopoulos et al., 2016). We analyzed the de-trended increment time series of ambient seismic noise X(t), in time windows of 20 minutes to 10 seconds, within "calm time zones" where human-induced noise is at a minimum. Following the non-extensive statistical physics approach, the probability distribution function of the increments of ambient noise is investigated. Analysis of the probability density function (PDF) p(X), normalized to zero mean and unit variance, shows that the fluctuations of Earth's ambient noise follow a q-Gaussian distribution as defined in the frame of non-extensive statistical mechanics, indicating the possible existence of memory effects in Earth's ambient noise. References: F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics. Proc. R. Soc. A, 472, 20160497, 2016. G. Chatzopoulos, I. Papadopoulos, F. Vallianatos, The Hellenic Seismological Network of Crete (HSNC): Validation and results of the 2013 aftershock, Advances in Geosciences, 41, 65-72, 2016.
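The q-Gaussian fit described above can be sketched as follows. The increments here are a heavy-tailed synthetic stand-in (Student's t), not HSNC data, and the unnormalised q-Gaussian form, starting values, and binning are all assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, A, beta, q):
    """Unnormalised q-Gaussian: p(x) = A * [1 + (q - 1) * beta * x**2] ** (-1 / (q - 1))."""
    return A * (1.0 + (q - 1.0) * beta * x**2) ** (-1.0 / (q - 1.0))

# Placeholder for the de-trended, normalised noise increments (zero mean, unit variance).
rng = np.random.default_rng(1)
increments = rng.standard_t(df=5, size=20000)          # heavy-tailed stand-in for real data
increments = (increments - increments.mean()) / increments.std()

density, edges = np.histogram(increments, bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

popt, _ = curve_fit(q_gaussian, centers, density, p0=[0.4, 1.0, 1.3])
print("estimated q (q = 1 recovers the Gaussian):", popt[2])
```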
SPATIO-TEMPORAL ANALYSIS OF TOTAL NITRATE CONCENTRATIONS USING DYNAMIC STATISTICAL MODELS
Atmospheric concentrations of total nitrate (TNO3), defined here as gas-phase nitric acid plus particle-phase nitrate, are difficult to simulate in numerical air quality models due to the presence of a variety of formation pathways and loss mechanisms, some of which ar...
Experimental Analysis of Cell Function Using Cytoplasmic Streaming
ERIC Educational Resources Information Center
Janssens, Peter; Waldhuber, Megan
2012-01-01
This laboratory exercise investigates the phenomenon of cytoplasmic streaming in the fresh water alga "Nitella". Students use the fungal toxin cytochalasin D, an inhibitor of actin polymerization, to investigate the mechanism of streaming. Students use simple statistical methods to analyze their data. Typical student data are provided. (Contains 3…
Entropy for Mechanically Vibrating Systems
NASA Astrophysics Data System (ADS)
Tufano, Dante
The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy context of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is stated, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis develops upon the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes. The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy under a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled.
In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.
NASA Technical Reports Server (NTRS)
Oravec, Heather Ann; Daniels, Christopher C.
2014-01-01
The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but must also perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred, which may have influenced the mechanical performance of the seal. This knowledge improves the chance that this and future space seals will satisfy or exceed design specifications.
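A hedged sketch of the kind of statistical process control screen described above, using robust (median-based) individuals-chart limits so that the unusual articles themselves do not inflate the limits. The load values and the specific chart type are assumptions, not the NASA procedure.

```python
import numpy as np

# Hypothetical peak compression loads (N) for successive seal test articles;
# illustrative values only, not the NASA test data.
loads = np.array([1180., 1184., 1179., 1186., 1182., 1650., 1181., 1185., 1620., 1183.])

# Robust individuals-chart limits: median centre line and a sigma estimate from the
# median moving range (0.954 is the normal-distribution unbiasing constant for n = 2),
# so that the out-of-control articles do not widen the limits.
center = np.median(loads)
sigma_hat = np.median(np.abs(np.diff(loads))) / 0.954
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

flagged = np.where((loads > ucl) | (loads < lcl))[0]
print(f"centre = {center:.0f} N, limits = [{lcl:.0f}, {ucl:.0f}] N, flagged articles: {flagged}")
```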
Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng
2007-01-01
The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223
Tsallis statistics and neurodegenerative disorders
NASA Astrophysics Data System (ADS)
Iliopoulos, Aggelos C.; Tsolaki, Magdalini; Aifantis, Elias C.
2016-08-01
In this paper, we perform statistical analysis of time series deriving from four neurodegenerative disorders, namely epilepsy, amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD), Huntington's disease (HD). The time series are concerned with electroencephalograms (EEGs) of healthy and epileptic states, as well as gait dynamics (in particular stride intervals) of the ALS, PD and HDs. We study data concerning one subject for each neurodegenerative disorder and one healthy control. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of Tsallis q-triplet, namely {qstat, qsen, qrel}. The deviation of Tsallis q-triplet from unity indicates non-Gaussian statistics and long-range dependencies for all time series considered. In addition, the results reveal the efficiency of Tsallis statistics in capturing differences in brain dynamics between healthy and epileptic states, as well as differences between ALS, PD, HDs from healthy control subjects. The results indicate that estimations of Tsallis q-indices could be used as possible biomarkers, along with others, for improving classification and prediction of epileptic seizures, as well as for studying the gait complex dynamics of various diseases providing new insights into severity, medications and fall risk, improving therapeutic interventions.
Teaching Classical Statistical Mechanics: A Simulation Approach.
ERIC Educational Resources Information Center
Sauer, G.
1981-01-01
Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
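In the same spirit as the simulation described above, the minimal Python sketch below samples velocities of a one-dimensional ideal gas uniformly on the constant-kinetic-energy (microcanonical) sphere, so single-particle energies can be compared with the Boltzmann expectation. This is an assumed implementation for illustration, not the article's program.

```python
import numpy as np

rng = np.random.default_rng(6)

# Microcanonical sampling for a 1D ideal gas: N velocities distributed uniformly on the
# constant-kinetic-energy sphere (normalised Gaussian vectors give uniform directions).
N, m, E_total, n_samples = 100, 1.0, 100.0, 5000
v = rng.standard_normal((n_samples, N))
v *= np.sqrt(2.0 * E_total / m) / np.linalg.norm(v, axis=1, keepdims=True)

# Single-particle kinetic energies approach the Boltzmann distribution for large N.
e1 = 0.5 * m * v[:, 0] ** 2
print("mean single-particle energy:", e1.mean(), "(expected E_total/N =", E_total / N, ")")
```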
Sirota, Miroslav; Kostovičová, Lenka; Juanchich, Marie
2014-08-01
Knowing which properties of visual displays facilitate statistical reasoning bears practical and theoretical implications. Therefore, we studied the effect of one property of visual displays - iconicity (i.e., the resemblance of a visual sign to its referent) - on Bayesian reasoning. Two main accounts of statistical reasoning predict different effects of iconicity on Bayesian reasoning. The ecological-rationality account predicts a positive iconicity effect, because more highly iconic signs resemble more individuated objects, which tap better into an evolutionarily designed frequency-coding mechanism that, in turn, facilitates Bayesian reasoning. The nested-sets account predicts a null iconicity effect, because iconicity does not affect the salience of a nested-sets structure - the factor facilitating Bayesian reasoning processed by a general reasoning mechanism. In two well-powered experiments (N = 577), we found no support for a positive iconicity effect across different iconicity levels that were manipulated in different visual displays (meta-analytical overall effect: log OR = -0.13, 95% CI [-0.53, 0.28]). A Bayes factor analysis provided strong evidence in favor of the null hypothesis - the null iconicity effect. Thus, these findings corroborate the nested-sets rather than the ecological-rationality account of statistical reasoning.
Statistics and classification of the microwave zebra patterns associated with solar flares
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Baolin; Tan, Chengming; Zhang, Yin
2014-01-10
The microwave zebra pattern (ZP) is the most interesting, intriguing, and complex spectral structure frequently observed in solar flares. A comprehensive statistical study will certainly help us to understand the formation mechanism, which is not exactly clear now. This work presents a comprehensive statistical analysis of a large sample of 202 ZP events collected from observations at the Chinese Solar Broadband Radio Spectrometer at Huairou and the Ondřejov Radiospectrograph in the Czech Republic at frequencies of 1.00-7.60 GHz from 2000 to 2013. After investigating the parameter properties of ZPs, such as the occurrence in flare phase, frequency range, polarization degree, duration, etc., we find that the variation of zebra stripe frequency separation with respect to frequency is the best indicator for a physical classification of ZPs. Microwave ZPs can be classified into three types: equidistant ZPs, variable-distant ZPs, and growing-distant ZPs, possibly corresponding to mechanisms of the Bernstein wave model, whistler wave model, and double plasma resonance model, respectively. This statistical classification may help us to clarify the controversies between the existing various theoretical models and understand the physical processes in the source regions.
Mechanical properties of experimental composites with different calcium phosphates fillers.
Okulus, Zuzanna; Voelkel, Adam
2017-09-01
Calcium phosphates (CaPs)-containing composites have already shown good properties from the point of view of dental restorative materials. The purpose of this study was to examine the crucial mechanical properties of twelve hydroxyapatite- or tricalcium phosphate-filled composites. The raw and surface-treated forms of both CaP fillers were applied. As reference materials, two experimental glass-containing composites and one commercial dental restorative composite were used. Nano-hardness, elastic modulus, compressive, flexural and diametral tensile strength of all studied materials were determined. Application of statistical methods (one-way analysis of variance and agglomerative cluster analysis) allowed for assessing the similarities between examined materials according to the values of the studied parameters. The obtained results show that in almost all cases the mechanical properties of the experimental CaPs composites are comparable to or even better than the mechanical properties of the examined reference materials. Copyright © 2017 Elsevier B.V. All rights reserved.
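A brief sketch of the statistical treatment described above (one-way ANOVA followed by agglomerative clustering), using hypothetical flexural-strength values; the group names and all numbers are placeholders, not the study's measurements.

```python
import numpy as np
from scipy import stats
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical flexural-strength measurements (MPa) for three composites;
# illustrative numbers, not the data from the study.
hydroxyapatite = np.array([92., 88., 95., 90., 91.])
tricalcium = np.array([85., 83., 88., 86., 84.])
commercial = np.array([98., 101., 97., 99., 100.])

# One-way ANOVA: does the filler type affect mean flexural strength?
f_stat, p_value = stats.f_oneway(hydroxyapatite, tricalcium, commercial)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Agglomerative clustering of materials on their mean property values
# (a single property is used here to keep the sketch short).
means = np.array([[hydroxyapatite.mean()], [tricalcium.mean()], [commercial.mean()]])
labels = fcluster(linkage(means, method="average"), t=2, criterion="maxclust")
print("cluster assignment (HA, TCP, commercial):", labels)
```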
Chorioamnionitis and chronic lung disease of prematurity: a path analysis of causality.
Dessardo, Nada Sindičić; Mustać, Elvira; Dessardo, Sandro; Banac, Srđan; Peter, Branimir; Finderle, Aleksandar; Marić, Marinko; Haller, Herman
2012-02-01
Current evidence suggests that additional pathogenetic factors could play a role in the development of chronic lung disease of prematurity, other than mechanical ventilation and free radical injury. The introduction of the concept of "fetal inflammatory response syndrome" offers a new perspective on the pathogenesis of chronic lung disease of prematurity. New statistical approaches could be useful tools in evaluating causal relationships in the development of chronic morbidity in preterm infants. The aim of this study was to test a new statistical framework incorporating path analysis to evaluate causality between exposure to chorioamnionitis and fetal inflammatory response syndrome and the development of chronic lung disease of prematurity. We designed a prospective cohort study that included consecutively born premature infants less than 32 weeks of gestation whose placentas were collected for histological analysis. Histological chorioamnionitis, clinical data, and neonatal outcomes were related to chronic lung disease. Along with standard statistical methods, a path analysis was performed to test the relationship between histological chorioamnionitis, gestational age, mechanical ventilation, and development of chronic lung disease of prematurity. Among the newborns enrolled in the study, 69/189 (36%) had histological chorioamnionitis. Of those with histological chorioamnionitis, 28/69 (37%) were classified as having fetal inflammatory response syndrome, according to the presence of severe chorioamnionitis and funisitis. Histological chorioamnionitis was associated with a lower birth weight, shorter gestation, higher frequency of patent ductus arteriosus, greater use of surfactant, and higher frequency of chronic lung disease of prematurity. Severe chorioamnionitis and funisitis were significantly associated with lower birth weight, lower gestational age, lower Apgar score at 5 minutes, more frequent use of mechanical ventilatory support and surfactant, as well as higher frequency of patent ductus arteriosus and chronic lung disease. The results of the path analysis showed that fetal inflammatory response syndrome has a significant direct (0.66), indirect (0.11), and overall (0.77) effect on chronic lung disease. This study demonstrated a strong positive correlation between exposure of the fetus to a severe inflammatory response and the development of chronic lung disease of prematurity. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.
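A simplified sketch of how direct and indirect path effects can be obtained from two regressions, assuming standardised simulated variables. It is not the authors' model specification, and a real path analysis would add significance tests and model fit indices.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200

# Simulated, standardised variables (illustrative only): FIRS exposure, days of
# mechanical ventilation (mediator), and a continuous CLD severity score.
firs = rng.standard_normal(n)
ventilation = 0.5 * firs + rng.standard_normal(n)
cld = 0.6 * firs + 0.3 * ventilation + rng.standard_normal(n)

def ols(y, X):
    """Ordinary least squares with an intercept column; returns slope coefficients."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

# Path coefficients: a (FIRS -> ventilation), then b and c' from cld ~ firs + ventilation.
a = ols(ventilation, firs)[0]
c_direct, b = ols(cld, np.column_stack([firs, ventilation]))

indirect = a * b
print(f"direct = {c_direct:.2f}, indirect = {indirect:.2f}, total = {c_direct + indirect:.2f}")
```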
Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander
2012-11-07
We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics in six time orders, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even more chaotic--mixing. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion with no influence of other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily implemented to other data sets to enable quick and accurate analysis of their statistical characteristics. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
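The ensemble-averaged MSD analysis can be sketched as follows. The trajectories here are ordinary Brownian stand-ins (alpha near 1), whereas the telomere data in the paper yield alpha < 1; the array shape is an assumption about how tracks would be stored.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in trajectories: ordinary Brownian paths. Real telomere tracks, loaded as an
# array of shape (n_trajectories, n_frames), would replace `paths`.
n_traj, n_steps = 50, 500
paths = np.cumsum(rng.standard_normal((n_traj, n_steps)), axis=1)

def ensemble_msd(x, max_lag=100):
    """Ensemble-averaged mean square displacement as a function of time lag."""
    return np.array([np.mean((x[:, lag:] - x[:, :-lag]) ** 2) for lag in range(1, max_lag + 1)])

msd = ensemble_msd(paths)
lags = np.arange(1, len(msd) + 1)

# Anomalous-diffusion exponent alpha from the log-log slope of MSD vs lag.
alpha = np.polyfit(np.log(lags), np.log(msd), 1)[0]
print(f"estimated alpha = {alpha:.2f}  (alpha < 1 indicates subdiffusion)")
```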
Understanding amyloid aggregation by statistical analysis of atomic force microscopy images
NASA Astrophysics Data System (ADS)
Adamcik, Jozef; Jung, Jin-Mi; Flakowski, Jérôme; de Los Rios, Paolo; Dietler, Giovanni; Mezzenga, Raffaele
2010-06-01
The aggregation of proteins is central to many aspects of daily life, including food processing, blood coagulation, eye cataract formation disease and prion-related neurodegenerative infections. However, the physical mechanisms responsible for amyloidosis-the irreversible fibril formation of various proteins that is linked to disorders such as Alzheimer's, Creutzfeldt-Jakob and Huntington's diseases-have not yet been fully elucidated. Here, we show that different stages of amyloid aggregation can be examined by performing a statistical polymer physics analysis of single-molecule atomic force microscopy images of heat-denatured β-lactoglobulin fibrils. The atomic force microscopy analysis, supported by theoretical arguments, reveals that the fibrils have a multistranded helical shape with twisted ribbon-like structures. Our results also indicate a possible general model for amyloid fibril assembly and illustrate the potential of this approach for investigating fibrillar systems.
Goudouri, Ourania-Menti; Kontonasaki, Eleana; Papadopoulou, Lambrini; Manda, Marianthi; Kavouras, Panagiotis; Triantafyllidis, Konstantinos S; Stefanidou, Maria; Koidis, Petros; Paraskevopoulos, Konstantinos M
2017-02-01
The aim of this study was the evaluation of the textural characteristics of an experimental sol-gel derived feldspathic dental ceramic, which has already been proven bioactive, and the investigation of its flexural strength through Weibull statistical analysis. The null hypothesis was that the flexural strength of the experimental and the commercial dental ceramic would be of the same order, resulting in a dental ceramic with apatite-forming ability and adequate mechanical integrity. Although the flexural strength of the experimental ceramic was not statistically significantly different from that of the commercial one, the amount of blind pores due to processing was greater. The textural characteristics of the experimental ceramic were in accordance with the standard low porosity levels reported for dental ceramics used for fixed prosthetic restorations. Feldspathic dental ceramics with typical textural characteristics and advanced mechanical properties, as well as enhanced apatite-forming ability, can be synthesized through the sol-gel method. Copyright © 2016 Elsevier Ltd. All rights reserved.
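A minimal sketch of the Weibull statistical analysis mentioned above, estimating the Weibull modulus from hypothetical flexural-strength values with median-rank probabilities. The numbers are placeholders, and maximum-likelihood fitting would be an alternative to the linearised regression used here.

```python
import numpy as np

# Hypothetical flexural-strength data (MPa) for one ceramic group; placeholders only.
strength = np.sort(np.array([68., 72., 75., 79., 81., 84., 86., 90., 93., 97.]))
n = len(strength)

# Median-rank estimate of failure probability and the linearised Weibull form:
# ln(-ln(1 - F)) = m * ln(sigma) - m * ln(sigma_0)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
y = np.log(-np.log(1.0 - F))
x = np.log(strength)

m, intercept = np.polyfit(x, y, 1)
sigma_0 = np.exp(-intercept / m)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma_0:.1f} MPa")
```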
Zackay, Arie; Steinhoff, Christine
2010-12-15
Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org.
2010-01-01
Background Exploration of DNA methylation and its impact on various regulatory mechanisms has become a very active field of research. Simultaneously there is an arising need for tools to process and analyse the data together with statistical investigation and visualisation. Findings MethVisual is a new application that enables exploratory analysis and intuitive visualization of DNA methylation data as typically generated by bisulfite sequencing. The package allows the import of DNA methylation sequences, aligns them and performs quality control comparison. It comprises basic analysis steps such as lollipop visualization, co-occurrence display of methylation of neighbouring and distant CpG sites, summary statistics on methylation status, clustering and correspondence analysis. The package has been developed for methylation data but can also be used for other data types for which binary coding can be inferred. The application of the package, as well as a comparison to existing DNA methylation analysis tools and its workflow based on two datasets, is presented in this paper. Conclusions The R package MethVisual offers various analysis procedures for data that can be binarized, in particular for bisulfite sequenced methylation data. R/Bioconductor has become one of the most important environments for statistical analysis of various types of biological and medical data. Therefore, any data analysis within R that allows the integration of various data types as provided from different technological platforms is convenient. It is the first and so far the only specific package for DNA methylation analysis, in particular for bisulfite sequenced data, available in the R/Bioconductor environment. The package is available for free at http://methvisual.molgen.mpg.de/ and from the Bioconductor Consortium http://www.bioconductor.org. PMID:21159174
Introduction of statistical information in a syntactic analyzer for document image recognition
NASA Astrophysics Data System (ADS)
Maroneze, André O.; Coüasnon, Bertrand; Lemaitre, Aurélie
2011-01-01
This paper presents an improvement to document layout analysis systems, offering a possible solution to Sayre's paradox (which states that an element "must be recognized before it can be segmented; and it must be segmented before it can be recognized"). This improvement, based on stochastic parsing, allows integration of statistical information, obtained from recognizers, during syntactic layout analysis. We present how this fusion of numeric and symbolic information in a feedback loop can be applied to syntactic methods to improve document description expressiveness. To limit combinatorial explosion during exploration of solutions, we devised an operator that allows optional activation of the stochastic parsing mechanism. Our evaluation on 1250 handwritten business letters shows this method allows the improvement of global recognition scores.
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
NASA Technical Reports Server (NTRS)
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of the Weibull methods using samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The methods documented previously were supplemented by adding computer calculations of approximate (using iterative methods) confidence intervals for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
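A small sketch of the random-censoring model described above: failure times are drawn from a Weibull distribution and censoring times from a uniform distribution, and the number of observed failures per sample is tallied. All parameter values are assumptions for illustration, not SSME figures.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative parameters (not SSME values): Weibull shape and scale for failure times,
# with censoring times drawn from a uniform distribution as in a random-censoring model.
shape, scale = 2.0, 1000.0
n_units, censor_max = 20, 800.0

def one_sample():
    """Generate one censored sample: observed times and a failure/censoring indicator."""
    failure = scale * rng.weibull(shape, n_units)
    censor = rng.uniform(0.0, censor_max, n_units)
    observed = np.minimum(failure, censor)
    failed = failure <= censor
    return observed, failed

n_failures = [one_sample()[1].sum() for _ in range(1000)]
print("mean number of observed failures per sample:", np.mean(n_failures))
```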
The Ups and Downs of Repeated Cleavage and Internal Fragment Production in Top-Down Proteomics.
Lyon, Yana A; Riggs, Dylan; Fornelli, Luca; Compton, Philip D; Julian, Ryan R
2018-01-01
Analysis of whole proteins by mass spectrometry, or top-down proteomics, has several advantages over methods relying on proteolysis. For example, proteoforms can be unambiguously identified and examined. However, from a gas-phase ion-chemistry perspective, proteins are enormous molecules that present novel challenges relative to peptide analysis. Herein, the statistics of cleaving the peptide backbone multiple times are examined to evaluate the inherent propensity for generating internal versus terminal ions. The raw statistics reveal an inherent bias favoring production of terminal ions, which holds true regardless of protein size. Importantly, even if the full suite of internal ions is generated by statistical dissociation, terminal ions are predicted to account for at least 50% of the total ion current, regardless of protein size, if there are three backbone dissociations or fewer. Top-down analysis should therefore be a viable approach for examining proteins of significant size. Comparison of the purely statistical analysis with actual top-down data derived from ultraviolet photodissociation (UVPD) and higher-energy collisional dissociation (HCD) reveals that terminal ions account for much of the total ion current in both experiments. Terminal ion production is more favored in UVPD relative to HCD, which is likely due to differences in the mechanisms controlling fragmentation. Importantly, internal ions are not found to dominate from either the theoretical or experimental point of view.
The Ups and Downs of Repeated Cleavage and Internal Fragment Production in Top-Down Proteomics
NASA Astrophysics Data System (ADS)
Lyon, Yana A.; Riggs, Dylan; Fornelli, Luca; Compton, Philip D.; Julian, Ryan R.
2018-01-01
Analysis of whole proteins by mass spectrometry, or top-down proteomics, has several advantages over methods relying on proteolysis. For example, proteoforms can be unambiguously identified and examined. However, from a gas-phase ion-chemistry perspective, proteins are enormous molecules that present novel challenges relative to peptide analysis. Herein, the statistics of cleaving the peptide backbone multiple times are examined to evaluate the inherent propensity for generating internal versus terminal ions. The raw statistics reveal an inherent bias favoring production of terminal ions, which holds true regardless of protein size. Importantly, even if the full suite of internal ions is generated by statistical dissociation, terminal ions are predicted to account for at least 50% of the total ion current, regardless of protein size, if there are three backbone dissociations or fewer. Top-down analysis should therefore be a viable approach for examining proteins of significant size. Comparison of the purely statistical analysis with actual top-down data derived from ultraviolet photodissociation (UVPD) and higher-energy collisional dissociation (HCD) reveals that terminal ions account for much of the total ion current in both experiments. Terminal ion production is more favored in UVPD relative to HCD, which is likely due to differences in the mechanisms controlling fragmentation. Importantly, internal ions are not found to dominate from either the theoretical or experimental point of view.
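The terminal-versus-internal bias can be illustrated with a deliberately simplified counting model: each backbone cleavage of a linear chain adds one fragment, of which exactly two retain a terminus. This ignores ionisation and detection efficiencies and is not the authors' full statistical treatment.

```python
def terminal_fraction(n_cleavages):
    """Fragments from n_cleavages backbone cleavages of a linear chain:
    2 retain a terminus, n_cleavages - 1 are internal (simple counting model,
    ignoring relative ionisation/detection efficiencies)."""
    n_fragments = n_cleavages + 1
    return 2 / n_fragments

for k in (1, 2, 3, 4, 5):
    print(f"{k} cleavages: terminal fraction = {terminal_fraction(k):.2f}")
```

With three or fewer cleavages the terminal fraction stays at or above one half, consistent with the threshold quoted in the abstract.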
Association analysis of multiple traits by an approach of combining P values.
Chen, Lili; Wang, Yong; Zhou, Yajing
2018-03-01
Increasing evidence shows that one variant can affect multiple traits, a widespread phenomenon in complex diseases. Joint analysis of multiple traits can increase the statistical power of association analysis and uncover the underlying genetic mechanism. Although there are many statistical methods to analyse multiple traits, most of these methods are usually suitable for detecting common variants associated with multiple traits. However, because of the low minor allele frequencies of rare variants, these methods are not optimal for rare variant association analysis. In this paper, we extend an adaptive combination of P values method (termed ADA) for a single trait to test association between multiple traits and rare variants in a given region. For a given region, we use a reverse regression model to test each rare variant for association with multiple traits and obtain the P value of the single-variant test. Further, we take the weighted combination of these P values as the test statistic. Extensive simulation studies show that our approach is more powerful than several other comparison methods in most cases and is robust to the inclusion of a high proportion of neutral variants and to different directions of effects of causal variants.
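A simplified stand-in for the weighted P-value combination described above; the truncation threshold, the weights, and the use of -log P are assumptions, and in an ADA-style procedure significance would be assessed by permutation rather than read off directly.

```python
import numpy as np

def combined_stat(p_values, weights, threshold=0.1):
    """Weighted combination of per-variant P values, truncated at a threshold so that
    only the more promising variants contribute (a simplified stand-in for the ADA
    statistic; in practice significance is assessed by permutation)."""
    p_values = np.asarray(p_values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    keep = p_values < threshold
    if not np.any(keep):
        return 0.0
    return float(np.sum(weights[keep] * -np.log(p_values[keep])))

# Example: P values from per-variant reverse regressions of multiple traits on genotype,
# weighted here by hypothetical allele-frequency-based weights (both arrays are made up).
p = [0.002, 0.30, 0.04, 0.75, 0.08]
w = [1.5, 1.0, 1.2, 0.8, 1.1]
print("combined statistic:", combined_stat(p, w))
```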
Nonlinear dynamic mechanism of vocal tremor from voice analysis and model simulations
NASA Astrophysics Data System (ADS)
Zhang, Yu; Jiang, Jack J.
2008-09-01
Nonlinear dynamic analysis and model simulations are used to study the nonlinear dynamic characteristics of vocal folds with vocal tremor, which can typically be characterized by low-frequency modulation and aperiodicity. Tremor voices from patients with disorders such as paresis, Parkinson's disease, hyperfunction, and adductor spasmodic dysphonia show low-dimensional characteristics, differing from random noise. Correlation dimension analysis statistically distinguishes tremor voices from normal voices. Furthermore, a nonlinear tremor model is proposed to study the vibrations of the vocal folds with vocal tremor. Fractal dimensions and positive Lyapunov exponents demonstrate the evidence of chaos in the tremor model, where amplitude and frequency play important roles in governing vocal fold dynamics. Nonlinear dynamic voice analysis and vocal fold modeling may provide a useful set of tools for understanding the dynamic mechanism of vocal tremor in patients with laryngeal diseases.
NASA Astrophysics Data System (ADS)
Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui
2016-06-01
The energy transfer ratio is the basic factor affecting the level of pipe damage during an impact between a dropped object and a submarine pipe. For the purpose of studying the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and the normal distribution of the energy transfer ratio caused by lateral impact on pipes is presented by statistical analysis of the experimental results based on the effective yield strength, which provides an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulation. In addition, impact contact area and impact time are shown to be the major factors influencing energy transfer, according to a sensitivity analysis of the finite element simulation.
A five year review of paediatric burns and social deprivation: Is there a link?
Richards, Helen; Kokocinska, Maria; Lewis, Darren
2017-09-01
To establish if there is a correlation between burn incidence and social deprivation in order to formulate a more effective burns prevention strategy. A quantitative retrospective review of the International Burn Injury Database (IBID) was carried out over the period from 2006 to 2011 to obtain data for children referred to our burns centre in the West Midlands. Social deprivation scores for geographical areas were obtained from the Office of National Statistics (ONS). Statistical analysis was carried out using GraphPad Prism. 1688 children were reviewed at our burns centre. Statistical analysis using the Pearson correlation coefficient showed a slight association between social deprivation and increasing burn incidence (r² = 0.1268, 95% confidence interval 0.018-0.219, p value < 0.0001). There was a slight male preponderance (58%). The most common mechanism of injury was scalding (61%). The most commonly affected age group was 1-2 year olds (38%). There were statistically significant differences in the ethnicity of children, with significantly more children from Asian and African backgrounds being referred compared to Caucasian children. We found that appropriate first aid was administered in 67% of cases overall. We did not find a statistically significant link between first aid provision and social deprivation score. There was only a slight positive correlation between social deprivation and burn incidence. However, there did not seem to be any change in the mechanism of burn in the most deprived groups compared to the overall pattern, nor was there a significant difference in appropriate first aid provision. It would seem that dissemination of burn prevention strategies and first aid advice needs to be improved across all geographical areas, as this was uniformly lacking and the increased burn incidence in more socially deprived groups, although present, was not statistically significant. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.
ERIC Educational Resources Information Center
1971
Computers have effected a comprehensive transformation of chemistry. Computers have greatly enhanced the chemist's ability to do model building, simulations, data refinement and reduction, analysis of data in terms of models, on-line data logging, automated control of experiments, quantum chemistry and statistical and mechanical calculations, and…
Coherent instability in wall-bounded turbulence
NASA Astrophysics Data System (ADS)
Hack, M. J. Philipp
2017-11-01
Hairpin vortices are commonly considered one of the major classes of coherent fluid motions in shear layers, even as their significance in the grand scheme of turbulence has remained an openly debated question. The statistical prevalence of the dynamic process that gives rise to the hairpins across different types of flows suggests an origin in a robust common mechanism triggered by conditions widespread in wall-bounded shear layers. This study seeks to shed light on the physical process which drives the generation of hairpin vortices. It is primarily facilitated through an algorithm based on concepts developed in the field of computer vision which allows the topological identification and analysis of coherent flow processes across multiple scales. Application to direct numerical simulations of boundary layers enables the time-resolved sampling and exploration of the hairpin process in natural flow. The analysis yields rich statistical results which lead to a refined characterization of the hairpin process. Linear stability theory offers further insight into the flow physics and especially into the connection between the hairpin and exponential amplification mechanisms. The results also provide a sharpened understanding of the underlying causality of events.
Fracture mechanics concepts in reliability analysis of monolithic ceramics
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.; Gyekenyesi, John P.
1987-01-01
Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
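For reference, a standard form of the Weibull failure-probability model alluded to above, written with a volume-scaling term and a threshold strength; this is the generic three-parameter form, not necessarily the exact expression used in the cited analysis.

```latex
P_f(\sigma) \;=\; 1 - \exp\!\left[-\,\frac{V}{V_0}\left(\frac{\sigma - \sigma_u}{\sigma_0}\right)^{m}\right],
\qquad \sigma \ge \sigma_u, \qquad R \;=\; 1 - P_f ,
```

where m is the Weibull modulus, sigma_0 the scale parameter, sigma_u the threshold strength (the third Weibull parameter, often taken as zero), and V/V_0 accounts for the stressed volume.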
Markov Logic Networks in the Analysis of Genetic Data
Sakhanenko, Nikita A.
2010-01-01
Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249
PREFACE: Mathematical Aspects of Generalized Entropies and their Applications
NASA Astrophysics Data System (ADS)
Suyari, Hiroki; Ohara, Atsumi; Wada, Tatsuaki
2010-01-01
Amid the recent growth of interest in power-law behaviors beyond the usual exponential ones, there have been several concrete attempts in statistical physics to generalize the standard Boltzmann-Gibbs statistics. Among such generalizations, nonextensive statistical mechanics has been well studied for about the last two decades, with many modifications and refinements. The generalization has provided not only a theoretical framework but also many applications, such as chaos, multifractals, complex systems, nonequilibrium statistical mechanics, biophysics, econophysics, information theory and so on. Alongside these developments in the generalization of statistical mechanics, the corresponding mathematical structures have also been required and uncovered. In particular, some deep connections to mathematical sciences such as q-analysis, information geometry, information theory and quantum probability theory have been revealed recently. These results clearly indicate the existence of a generalized mathematical structure that includes the mathematical framework for the exponential family as a special case, but the whole structure is still unclear. In order to provide an opportunity for scientists in many fields to discuss the mathematical structure induced from generalized entropies, the international workshop 'Mathematical Aspects of Generalized Entropies and their Applications' was held on 7-9 July 2009 at Kyoto TERRSA, Kyoto, Japan. This volume is the proceedings of the workshop, which consisted of 6 invited speakers, 14 oral presenters, 7 poster presenters and 63 other participants. The topics of the workshop cover nonextensive statistical mechanics, chaos, cosmology, information geometry, divergence theory, econophysics, materials engineering, molecular dynamics and entropy theory, information theory and so on. The workshop was organized as the first attempt to discuss these mathematical aspects with leading experts in each area. We would like to express special thanks to all the invited speakers, the contributors and the participants of the workshop. We are also grateful to RIMS (Research Institute for Mathematical Sciences) at Kyoto University and to the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (B), 18300003, 2009, for their support. Organizing Committee and Editors of the Proceedings: Hiroki Suyari (Chiba University, Japan), Atsumi Ohara (Osaka University, Japan) and Tatsuaki Wada (Ibaraki University, Japan).
Continuum radiation from active galactic nuclei: A statistical study
NASA Technical Reports Server (NTRS)
Isobe, T.; Feigelson, E. D.; Singh, K. P.; Kembhavi, A.
1986-01-01
The physics of the continuum spectrum of active galactic nuclei (AGNs) was examined using a large data set and rigorous statistical methods. A data base was constructed for 469 objects, which include radio-selected quasars, optically selected quasars, X-ray-selected AGNs, BL Lac objects, and optically unidentified compact radio sources. Each object has measurements of its radio, optical, and X-ray core continuum luminosities, though many of them are upper limits. Since many radio sources have extended components, the core component was carefully separated from the total radio luminosity. With survival analysis statistical methods, which can treat upper limits correctly, these data can yield better statistical results than those previously obtained. A variety of statistical tests are performed, such as the comparison of the luminosity functions in different subsamples, and linear regressions of luminosities in different bands. Interpretation of the results leads to the following tentative conclusions: the main emission mechanism of optically selected quasars and X-ray-selected AGNs is thermal, while that of BL Lac objects is synchrotron; radio-selected quasars may have two different emission mechanisms in the X-ray band; BL Lac objects appear to be special cases of the radio-selected quasars; some compact radio sources show the possibility of synchrotron self-Compton (SSC) emission in the optical band; and the spectral index between the optical and the X-ray bands depends on the optical luminosity.
Mechanical testing and finite element analysis of orthodontic teardrop loop.
Coimbra, Maria Elisa Rodrigues; Penedo, Norman Duque; de Gouvêa, Jayme Pereira; Elias, Carlos Nelson; de Souza Araújo, Mônica Tirre; Coelho, Paulo Guilherme
2008-02-01
Understanding how teeth move in response to mechanical loads is an important aspect of orthodontic treatment. Treatment planning should include consideration of the appliances that will deliver the desired loading of the teeth to result in optimized treatment outcomes. The purpose of this study was to evaluate the use of computer simulation to predict the force and the torsion obtained after the activation of teardrop loops of 3 heights. Seventy-five retraction loops were divided into 3 groups according to height (6, 7, and 8 mm). The loops were subjected to tensile load through displacements of 0.5, 1.0, 1.5, and 2.0 mm, and the resulting forces and torques were recorded. The loops were designed in AutoCAD software (2005; Autodesk Systems, Alpharetta, GA), and finite element analysis was performed with Ansys software (version 7.0; Swanson Analysis System, Canonsburg, PA). Statistical analysis of the mechanical experiment results was performed with ANOVA and the Tukey post-hoc test (P < .01). The correlation test and the paired t test (P < .05) were used to compare the computer simulation with the mechanical experiment. The computer simulation accurately predicted the experimentally determined mechanical behavior of teardrop loops of different heights and should be considered an alternative for designing orthodontic appliances before treatment.
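A minimal sketch of the statistics named above (one-way ANOVA followed by Tukey's post-hoc test), using scipy and statsmodels on invented force data for the three loop heights; it is not the authors' analysis script.

```python
# Minimal sketch (hypothetical data): one-way ANOVA across the three loop heights
# followed by a Tukey HSD post-hoc comparison.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
force_6mm = rng.normal(3.0, 0.3, 25)   # simulated activation forces (N) for 6 mm loops
force_7mm = rng.normal(2.6, 0.3, 25)   # 7 mm loops
force_8mm = rng.normal(2.2, 0.3, 25)   # 8 mm loops

F, p = stats.f_oneway(force_6mm, force_7mm, force_8mm)
print(f"ANOVA: F = {F:.2f}, p = {p:.3g}")

forces = np.concatenate([force_6mm, force_7mm, force_8mm])
groups = ["6 mm"] * 25 + ["7 mm"] * 25 + ["8 mm"] * 25
print(pairwise_tukeyhsd(forces, groups, alpha=0.01))
```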
Statistical Model Analysis of (n,p) Cross Sections and Average Energy For Fission Neutron Spectrum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Odsuren, M.; Khuukhenkhuu, G.
2011-06-28
Investigation of charged particle emission reaction cross sections for fast neutrons is important to both nuclear reactor technology and the understanding of nuclear reaction mechanisms. In particular, the study of (n,p) cross sections is necessary to estimate radiation damage due to hydrogen production, nuclear heating and transmutations in the structural materials of fission and fusion reactors. On the other hand, it is often necessary in practice to evaluate the neutron cross sections of nuclides for which no experimental data are available. Because of this, we carried out a systematic analysis of known experimental (n,p) and (n,α) cross sections for fast neutrons and observed a systematic regularity in the wide energy interval of 6-20 MeV and for a broad mass range of target nuclei. To explain this effect, some formulae were deduced using the compound, pre-equilibrium and direct reaction mechanisms. In this paper, known experimental (n,p) cross sections averaged over the thermal fission neutron spectrum of U-235 are analyzed in the framework of the statistical model. It was shown that the experimental data are satisfactorily described by the statistical model. Also, in the case of (n,p) cross sections, the effective average neutron energy for the fission spectrum of U-235 was found to be around 3 MeV.
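A minimal sketch of a fission-spectrum-averaged cross section, assuming a Maxwellian U-235 spectrum (temperature around 1.32 MeV) and a toy threshold-type (n,p) excitation function; both are illustrative stand-ins rather than the evaluated data used in the paper.

```python
# Minimal sketch: spectrum-averaged cross section <sigma> = int(sigma*chi)/int(chi)
# over an assumed Maxwellian fission spectrum, with an invented (n,p) cross section.
import numpy as np

T = 1.32                                   # assumed Maxwellian temperature, MeV
E = np.linspace(0.0, 20.0, 4000)           # neutron energy grid, MeV
chi = np.sqrt(E) * np.exp(-E / T)          # unnormalized fission spectrum

def sigma_np(E, E_th=2.0, sigma_sat=80.0):
    """Toy (n,p) cross section (mb): rises above a threshold, then saturates."""
    return sigma_sat * (1.0 - np.exp(-np.maximum(E - E_th, 0.0)))

avg_sigma = np.trapz(sigma_np(E) * chi, E) / np.trapz(chi, E)
print(f"spectrum-averaged (n,p) cross section ~ {avg_sigma:.1f} mb")
```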
Lian, Bin; Xia, Jinjun; Yang, Xun; Zhou, Chanjuan; Gong, Xue; Gui, Siwen; Mao, Qiang; Wang, Ling; Li, Pengfei; Huang, Cheng; Qi, Xunzhong; Xie, Peng
2018-06-13
In the present study, we used a gas chromatography-mass spectrometry-based metabolomics method to evaluate the effects of ketamine on mouse hippocampi. Multivariate statistical analysis and ingenuity pathway analysis were then used to identify and explore the potential mechanisms and biofunctions of ketamine. Compared with the control (CON) group, 14 differential metabolites involved in amino acid metabolism, energy metabolism, and oxidative stress metabolism were identified. After combination with 2,3-dihydroxy-6-nitro-7-sulfamoyl-benzo[f]quinoxaline-2,3-dione (NBQX) administration, six of the 14 metabolites remained significantly differentially expressed between the ketamine (KET) and KET+NBQX groups, including glycine, alanine, glutamine, aspartic acid, myoinositol, and ascorbate, whereas no difference was found in the levels of the other eight metabolites between the KET and KET+NBQX groups, including phosphate, 4-aminobutyric acid, urea, creatine, L-malic acid, galactinol, inosine, and aminomalonic acid. Our findings indicate that ketamine exerts antidepressant effects through both an α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA) receptor inhibition-dependent mechanism and a mechanism not affected by AMPA receptor inhibition, which provides further insight into the therapeutic mechanisms of ketamine in the hippocampus.
Origin of the correlations between exit times in pedestrian flows through a bottleneck
NASA Astrophysics Data System (ADS)
Nicolas, Alexandre; Touloupas, Ioannis
2018-01-01
Robust statistical features have emerged from the microscopic analysis of dense pedestrian flows through a bottleneck, notably with respect to the time gaps between successive passages. We pinpoint the mechanisms at the origin of these features thanks to simple models that we develop and analyse quantitatively. We disprove the idea that anticorrelations between successive time gaps (i.e. an alternation between shorter ones and longer ones) are a hallmark of a zipper-like intercalation of pedestrian lines and show that they simply result from the possibility that pedestrians from distinct ‘lines’ or directions cross the bottleneck within a short time interval. A second feature concerns the bursts of escapes, i.e. egresses that come in fast succession. Despite the ubiquity of exponential distributions of burst sizes, entailed by a Poisson process, we argue that anomalous (power-law) statistics arise if the bottleneck is nearly congested, albeit only in a tiny portion of parameter space. The generality of the proposed mechanisms implies that similar statistical features should also be observed for other types of particulate flows.
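A minimal sketch of the Poisson-process picture invoked above: exit-time gaps are drawn from an exponential distribution, bursts are defined as runs of gaps below a threshold, and the burst-size distribution decays exponentially. The rate and threshold are arbitrary illustrative values, and this is not the authors' simulation code.

```python
# Minimal sketch: exit-time gaps of a Poisson egress process and the resulting
# burst-size statistics (a burst is a run of successive short gaps).
import numpy as np

rng = np.random.default_rng(2)
rate = 1.5                                  # assumed mean egress rate (pedestrians per second)
gaps = rng.exponential(1.0 / rate, 10_000)  # time gaps of a Poisson process

threshold = 0.5                             # gaps shorter than this join a burst
burst_sizes, current = [], 1
for g in gaps:
    if g < threshold:
        current += 1
    else:
        burst_sizes.append(current)
        current = 1
burst_sizes = np.array(burst_sizes)

sizes, counts = np.unique(burst_sizes, return_counts=True)
print("burst size -> frequency (expected to decay exponentially for a Poisson process):")
for s, c in zip(sizes[:8], counts[:8]):
    print(s, c / counts.sum())
```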
NASA Astrophysics Data System (ADS)
Kimball, Jorja; Cole, Bryan; Hobson, Margaret; Watson, Karan; Stanley, Christine
This paper reports findings on gender that were part of a larger study reviewing time to completion of course work that includes the first two semesters of calculus, chemistry, and physics, which are often considered the stumbling points or "barrier courses" to an engineering baccalaureate degree. Texas A&M University terms these courses core body of knowledge (CBK), and statistical analysis was conducted on two cohorts of first-year enrolling engineering students at the institution. Findings indicate that gender is statistically significantly related to completion of CBK with female engineering students completing required courses faster than males at the .01 level (p = 0.008). Statistical significance for gender and ethnicity was found between white male and white female students at the .01 level (p = 0.008). Descriptive analysis indicated that of the five majors studied (chemical, civil, computer, electrical, and mechanical engineering), women completed CBK faster than men, and African American and Hispanic women completed CBK faster than males of the same ethnicity.
The development of ensemble theory. A new glimpse at the history of statistical mechanics
NASA Astrophysics Data System (ADS)
Inaba, Hajime
2015-12-01
This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.
Fan, Yannan; Siklenka, Keith; Arora, Simran K.; Ribeiro, Paula; Kimmins, Sarah; Xia, Jianguo
2016-01-01
MicroRNAs (miRNAs) can regulate nearly all biological processes and their dysregulation is implicated in various complex diseases and pathological conditions. Recent years have seen a growing number of functional studies of miRNAs using high-throughput experimental technologies, which have produced a large amount of high-quality data regarding miRNA target genes and their interactions with small molecules, long non-coding RNAs, epigenetic modifiers, disease associations, etc. These rich sets of information have enabled the creation of comprehensive networks linking miRNAs with various biologically important entities to shed light on their collective functions and regulatory mechanisms. Here, we introduce miRNet, an easy-to-use web-based tool that offers statistical, visual and network-based approaches to help researchers understand miRNA functions and regulatory mechanisms. The key features of miRNet include: (i) a comprehensive knowledge base integrating high-quality miRNA-target interaction data from 11 databases; (ii) support for differential expression analysis of data from microarray, RNA-seq and quantitative PCR; (iii) implementation of a flexible interface for data filtering, refinement and customization during network creation; (iv) a powerful fully featured network visualization system coupled with enrichment analysis. miRNet offers a comprehensive tool suite to enable statistical analysis and functional interpretation of various data generated from current miRNA studies. miRNet is freely available at http://www.mirnet.ca. PMID:27105848
NASA Technical Reports Server (NTRS)
Ellis, David L.
2007-01-01
Room temperature tensile testing of Chemically Pure (CP) Titanium Grade 2 was conducted for as-received commercially produced sheet and following thermal exposure at 550 and 650 K for times up to 5,000 h. No significant changes in microstructure or failure mechanism were observed. A statistical analysis of the data was performed. Small statistical differences were found, but all properties were well above minimum values for CP Ti Grade 2 as defined by ASTM standards and likely would fall within normal variation of the material.
Linguistic Analysis of the Human Heartbeat Using Frequency and Rank Order Statistics
NASA Astrophysics Data System (ADS)
Yang, Albert C.-C.; Hseu, Shu-Shya; Yien, Huey-Wen; Goldberger, Ary L.; Peng, C.-K.
2003-03-01
Complex physiologic signals may carry unique dynamical signatures that are related to their underlying mechanisms. We present a method based on rank order statistics of symbolic sequences to investigate the profile of different types of physiologic dynamics. We apply this method to heart rate fluctuations, the output of a central physiologic control system. The method robustly discriminates patterns generated from healthy and pathologic states, as well as aging. Furthermore, we observe increased randomness in the heartbeat time series with physiologic aging and pathologic states and also uncover nonrandom patterns in the ventricular response to atrial fibrillation.
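A minimal sketch of the symbolic rank-order idea described above: an interbeat-interval series is mapped to binary increase/decrease symbols, fixed-length words are counted, and the words are ranked by frequency. The surrogate series stands in for real heart-rate data, and the word length is an arbitrary choice.

```python
# Minimal sketch: symbolize an interbeat-interval series as 1 (interval increased)
# or 0 (decreased), count m-symbol words, and rank them by frequency.
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)
rr = 0.8 + 0.05 * np.cumsum(rng.normal(0, 0.1, 2000))   # surrogate RR intervals (s)

symbols = (np.diff(rr) > 0).astype(int)
m = 8
words = ["".join(map(str, symbols[i:i + m])) for i in range(len(symbols) - m + 1)]
ranked = Counter(words).most_common()

print("top-ranked 8-symbol words and their relative frequencies:")
for word, count in ranked[:5]:
    print(word, count / len(words))
```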
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
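A minimal sketch of the waveform statistics mentioned above (statistical moments and a spectral estimate), applied to a synthetic signal standing in for a turbomachinery vibration measurement.

```python
# Minimal sketch: central-moment statistics (rms, skewness, kurtosis) and a Welch
# power spectral density for a synthetic tone-plus-noise vibration signal.
import numpy as np
from scipy import signal, stats

fs = 10_000.0                                    # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(4)
x = np.sin(2 * np.pi * 120 * t) + 0.5 * rng.normal(size=t.size)

print("rms      :", np.sqrt(np.mean(x ** 2)))
print("skewness :", stats.skew(x))
print("kurtosis :", stats.kurtosis(x, fisher=False))

f, Pxx = signal.welch(x, fs=fs, nperseg=2048)    # averaged periodogram (PSD)
print("dominant frequency:", f[np.argmax(Pxx)], "Hz")
```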
Improving information retrieval in functional analysis.
Rodriguez, Juan C; González, Germán A; Fresno, Cristóbal; Llera, Andrea S; Fernández, Elmer A
2016-12-01
Transcriptome analysis is essential to understand the mechanisms regulating key biological processes and functions. The first step usually consists of identifying candidate genes; to find out which pathways are affected by those genes, however, functional analysis (FA) is mandatory. The most frequently used strategies for this purpose are Gene Set and Singular Enrichment Analysis (GSEA and SEA) over Gene Ontology. Several statistical methods have been developed and compared in terms of computational efficiency and/or statistical appropriateness. However, whether their results are similar or complementary, the sensitivity to parameter settings, or possible bias in the analyzed terms has not been addressed so far. Here, two GSEA and four SEA methods and their parameter combinations were evaluated in six datasets by comparing two breast cancer subtypes with well-known differences in genetic background and patient outcomes. We show that GSEA and SEA lead to different results depending on the chosen statistic, model and/or parameters. Both approaches provide complementary results from a biological perspective. Hence, an Integrative Functional Analysis (IFA) tool is proposed to improve information retrieval in FA. It provides a common gene expression analytic framework that grants a comprehensive and coherent analysis. Only a minimal user parameter setting is required, since the best SEA/GSEA alternatives are integrated. IFA utility was demonstrated by evaluating four prostate cancer and the TCGA breast cancer microarray datasets, which showed its biological generalization capabilities. Copyright © 2016 Elsevier Ltd. All rights reserved.
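A minimal sketch of a single SEA-style enrichment test, phrased as a one-sided Fisher exact (hypergeometric) test for over-representation of one Gene Ontology term in a candidate gene list; all counts are invented for illustration, and this is not the IFA implementation.

```python
# Minimal sketch: is the candidate gene list enriched for members of one GO term?
from scipy import stats

N = 20000        # genes in the background
K = 300          # genes annotated to the GO term
n = 500          # genes in the candidate list
k = 25           # candidate genes annotated to the term

table = [[k, n - k], [K - k, N - K - (n - k)]]
odds_ratio, p_value = stats.fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, enrichment p-value = {p_value:.2e}")
```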
Digital morphogenesis via Schelling segregation
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2018-04-01
Schelling’s model of segregation seeks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists, and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, with prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
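A minimal sketch of a Schelling-type dynamics on a one-dimensional ring, a toy variant in which two mutually unhappy agents of opposite types swap places; the window radius, tolerance threshold and update rule are illustrative choices and do not reproduce the exact protocol analysed in the paper.

```python
# Minimal sketch of a toy Schelling-type dynamics on a ring: an agent is unhappy
# when the fraction of like-typed agents in its window falls below tau, and two
# mutually unhappy agents of opposite types swap positions.
import numpy as np

rng = np.random.default_rng(5)
n, w, tau, steps = 500, 4, 0.5, 200_000
state = rng.integers(0, 2, n)                    # two agent types arranged on a ring

def unhappy(i):
    window = np.arange(i - w, i + w + 1) % n     # neighbourhood of radius w (includes i)
    return np.mean(state[window] == state[i]) < tau

for _ in range(steps):
    i, j = rng.integers(0, n, 2)
    if state[i] != state[j] and unhappy(i) and unhappy(j):
        state[i], state[j] = state[j], state[i]  # swap two mutually unhappy agents

blocks = int((np.diff(state) != 0).sum()) + 1
print("homogeneous blocks:", blocks, " mean block length:", n / blocks)
```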
Inverse tissue mechanics of cell monolayer expansion.
Kondo, Yohei; Aoki, Kazuhiro; Ishii, Shin
2018-03-01
Living tissues undergo deformation during morphogenesis. In this process, cells generate mechanical forces that drive the coordinated cell motion and shape changes. Recent advances in experimental and theoretical techniques have enabled in situ measurement of the mechanical forces, but the characterization of mechanical properties that determine how these forces quantitatively affect tissue deformation remains challenging, and this represents a major obstacle for the complete understanding of morphogenesis. Here, we proposed a non-invasive reverse-engineering approach for the estimation of the mechanical properties, by combining tissue mechanics modeling and statistical machine learning. Our strategy is to model the tissue as a continuum mechanical system and to use passive observations of spontaneous tissue deformation and force fields to statistically estimate the model parameters. This method was applied to the analysis of the collective migration of Madin-Darby canine kidney cells, and the tissue flow and force were simultaneously observed by the phase contrast imaging and traction force microscopy. We found that our monolayer elastic model, whose elastic moduli were reverse-engineered, enabled a long-term forecast of the traction force fields when given the tissue flow fields, indicating that the elasticity contributes to the evolution of the tissue stress. Furthermore, we investigated the tissues in which myosin was inhibited by blebbistatin treatment, and observed a several-fold reduction in the elastic moduli. The obtained results validate our framework, which paves the way to the estimation of mechanical properties of living tissues during morphogenesis.
Statistical mechanics explanation for the structure of ocean eddies and currents
NASA Astrophysics Data System (ADS)
Venaille, A.; Bouchet, F.
2010-12-01
The equilibrium statistical mechanics of two-dimensional and geostrophic flows predicts the outcome for the large scales of the flow resulting from turbulent mixing. This theory has been successfully applied to describe detailed properties of Jupiter's Great Red Spot. We discuss the range of applicability of this theory to ocean dynamics. It is able to reproduce mesoscale structures such as ocean rings. It explains, from statistical mechanics, the westward drift of rings at the speed of non-dispersive baroclinic waves, and the recently observed (Chelton and colleagues) slower northward drift of cyclonic eddies and southward drift of anticyclonic eddies. We also uncover relations between strong eastward mid-basin inertial jets, like the Kuroshio Extension and the Gulf Stream, and statistical equilibria. We explain under which conditions such strong mid-basin jets can be understood as statistical equilibria. We claim that these results are complementary to the classical Sverdrup-Munk theory: they explain the inertial part of the basin dynamics, and the structure and location of the jets, using very simple theoretical arguments. References: A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO; F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, arXiv ..., submitted to Physics Reports; P. Berloff, A. M. Hogg and W. Dewar, The Turbulent Oscillator: A Mechanism of Low-Frequency Variability of the Wind-Driven Ocean Gyres, Journal of Physical Oceanography 37 (2007) 2363; D. B. Chelton, M. G. Schlax, R. M. Samelson and R. A. de Szoeke, Global observations of large oceanic eddies, Geophys. Res. Lett. 34 (2007) 15606. [Figure caption: (a) streamfunction predicted by statistical mechanics; (b), (c) snapshots of streamfunction and potential vorticity (red: positive values; blue: negative values) in the upper layer of a three-layer quasi-geostrophic model of a mid-latitude ocean basin (from Berloff and co-workers). Even in an out-of-equilibrium situation such as this one, equilibrium statistical mechanics predicts the overall qualitative flow structure remarkably well. A second figure shows the observed westward drift of ocean eddies, and the slower northward drift of cyclones and southward drift of anticyclones, reported by Chelton and co-workers, which we explain from statistical mechanics.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, J.F.
Research in the biomedical sciences at PNL is described. Activities reported include: inhaled plutonium in dogs; national radiobiology archives; statistical analysis of data from animal studies; genotoxicity of inhaled energy effluents; molecular events during tumor initiation; biochemistry of free radical induced DNA damage; radon hazards in homes; mechanisms of radon injury; genetics of radon induced lung cancer; and in vivo/in vitro radon induced cellular damage.
Pan-Cancer Analysis of Mutation Hotspots in Protein Domains.
Miller, Martin L; Reznik, Ed; Gauthier, Nicholas P; Aksoy, Bülent Arman; Korkut, Anil; Gao, Jianjiong; Ciriello, Giovanni; Schultz, Nikolaus; Sander, Chris
2015-09-23
In cancer genomics, recurrence of mutations in independent tumor samples is a strong indicator of functional impact. However, rare functional mutations can escape detection by recurrence analysis owing to lack of statistical power. We enhance statistical power by extending the notion of recurrence of mutations from single genes to gene families that share homologous protein domains. Domain mutation analysis also sharpens the functional interpretation of the impact of mutations, as domains more succinctly embody function than entire genes. By mapping mutations in 22 different tumor types to equivalent positions in multiple sequence alignments of domains, we confirm well-known functional mutation hotspots, identify uncharacterized rare variants in one gene that are equivalent to well-characterized mutations in another gene, detect previously unknown mutation hotspots, and provide hypotheses about molecular mechanisms and downstream effects of domain mutations. With the rapid expansion of cancer genomics projects, protein domain hotspot analysis will likely provide many more leads linking mutations in proteins to the cancer phenotype. Copyright © 2015 Elsevier Inc. All rights reserved.
Step-stress analysis for predicting dental ceramic reliability
Borba, Márcia; Cesar, Paulo F.; Griggs, Jason A.; Bona, Álvaro Della
2013-01-01
Objective To test the hypothesis that step-stress analysis is effective to predict the reliability of an alumina-based dental ceramic (VITA In-Ceram AL blocks) subjected to a mechanical aging test. Methods Bar-shaped ceramic specimens were fabricated, polished to 1 µm finish and divided into 3 groups (n=10): (1) step-stress accelerating test; (2) flexural strength - control; (3) flexural strength - mechanical aging. Specimens from group 1 were tested in an electromagnetic actuator (MTS Evolution) using a three-point flexure fixture (frequency: 2 Hz; R=0.1) in a 37°C water bath. Each specimen was subjected to an individual stress profile, and the number of cycles to failure was recorded. A cumulative damage model with an inverse power law lifetime-stress relation and a Weibull lifetime distribution were used to fit the fatigue data. The data were used to predict the stress level and number of cycles for mechanical aging (group 3). Groups 2 and 3 were tested for three-point flexural strength (σ) in a universal testing machine at a 1.0 MPa/s stress rate, in 37°C water. Data were statistically analyzed using the Mann-Whitney Rank Sum test. Results Step-stress data analysis showed that the profile most likely to weaken the specimens without causing fracture during aging (95% CI: 0–14% failures) was: 80 MPa stress amplitude and 10^5 cycles. The median σ values (MPa) for groups 2 (493±54) and 3 (423±103) were statistically different (p=0.009). Significance The aging profile determined by step-stress analysis was effective to reduce alumina ceramic strength as predicted by the reliability estimate, confirming the study hypothesis. PMID:23827018
Step-stress analysis for predicting dental ceramic reliability.
Borba, Márcia; Cesar, Paulo F; Griggs, Jason A; Della Bona, Alvaro
2013-08-01
To test the hypothesis that step-stress analysis is effective to predict the reliability of an alumina-based dental ceramic (VITA In-Ceram AL blocks) subjected to a mechanical aging test. Bar-shaped ceramic specimens were fabricated, polished to 1 μm finish and divided into 3 groups (n=10): (1) step-stress accelerating test; (2) flexural strength-control; (3) flexural strength-mechanical aging. Specimens from group 1 were tested in an electromagnetic actuator (MTS Evolution) using a three-point flexure fixture (frequency: 2 Hz; R=0.1) in a 37°C water bath. Each specimen was subjected to an individual stress profile, and the number of cycles to failure was recorded. A cumulative damage model with an inverse power law lifetime-stress relation and Weibull lifetime distribution were used to fit the fatigue data. The data were used to predict the stress level and number of cycles for mechanical aging (group 3). Groups 2 and 3 were tested for three-point flexural strength (σ) in a universal testing machine with a 1.0 MPa/s stress rate, in 37°C water. Data were statistically analyzed using the Mann-Whitney Rank Sum test. Step-stress data analysis showed that the profile most likely to weaken the specimens without causing fracture during aging (95% CI: 0-14% failures) was: 80 MPa stress amplitude and 10^5 cycles. The median σ values (MPa) for groups 2 (493±54) and 3 (423±103) were statistically different (p=0.009). The aging profile determined by step-stress analysis was effective to reduce alumina ceramic strength as predicted by the reliability estimate, confirming the study hypothesis. Copyright © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
2012-01-01
Background It is known from recent studies that more than 90% of human multi-exon genes are subject to Alternative Splicing (AS), a key molecular mechanism in which multiple transcripts may be generated from a single gene. It is widely recognized that a breakdown in AS mechanisms plays an important role in cellular differentiation and pathologies. Polymerase Chain Reactions, microarrays and sequencing technologies have been applied to the study of transcript diversity arising from alternative expression. Last generation Affymetrix GeneChip Human Exon 1.0 ST Arrays offer a more detailed view of the gene expression profile providing information on the AS patterns. The exon array technology, with more than five million data points, can detect approximately one million exons, and it allows performing analyses at both gene and exon level. In this paper we describe BEAT, an integrated user-friendly bioinformatics framework to store, analyze and visualize exon arrays datasets. It combines a data warehouse approach with some rigorous statistical methods for assessing the AS of genes involved in diseases. Meta statistics are proposed as a novel approach to explore the analysis results. BEAT is available at http://beat.ba.itb.cnr.it. Results BEAT is a web tool which allows uploading and analyzing exon array datasets using standard statistical methods and an easy-to-use graphical web front-end. BEAT has been tested on a dataset with 173 samples and tuned using new datasets of exon array experiments from 28 colorectal cancer and 26 renal cell cancer samples produced at the Medical Genetics Unit of IRCCS Casa Sollievo della Sofferenza. To highlight all possible AS events, alternative names, accession Ids, Gene Ontology terms and biochemical pathways annotations are integrated with exon and gene level expression plots. The user can customize the results choosing custom thresholds for the statistical parameters and exploiting the available clinical data of the samples for a multivariate AS analysis. Conclusions Despite exon array chips being widely used for transcriptomics studies, there is a lack of analysis tools offering advanced statistical features and requiring no programming knowledge. BEAT provides a user-friendly platform for a comprehensive study of AS events in human diseases, displaying the analysis results with easily interpretable and interactive tables and graphics. PMID:22536968
NASA Astrophysics Data System (ADS)
Li, Y.; Robertson, C.
2018-06-01
The influence of irradiation defect dispersions on plastic strain spreading is investigated by means of three-dimensional dislocation dynamics (DD) simulations, accounting for thermally activated slip and cross-slip mechanisms in Fe-2.5%Cr grains. The defect-induced evolutions of the effective screw dislocation mobility are evaluated by means of statistical comparisons, for various defect number density and defect size cases. Each comparison is systematically associated with a quantitative Defect-Induced Apparent Straining Temperature shift (or «ΔDIAT»), calculated without any adjustable parameters. In the investigated cases, the ΔDIAT level associated with a given defect dispersion closely replicates the measured ductile to brittle transition temperature shift (ΔDBTT) due to the same, actual defect dispersion. The results are further analyzed in terms of dislocation-based plasticity mechanisms and their possible relations with the dose-dependent changes of the ductile to brittle transition temperature.
Analysis of surface sputtering on a quantum statistical basis
NASA Technical Reports Server (NTRS)
Wilhelm, H. E.
1975-01-01
Surface sputtering is explained theoretically by means of a 3-body sputtering mechanism involving the ion and two surface atoms of the solid. By means of quantum-statistical mechanics, a formula for the sputtering ratio S(E) is derived from first principles. The sputtering ratio S(E) was found, both theoretically and experimentally, to be proportional to the square of the difference between the incident ion energy and the threshold energy for sputtering of surface atoms at low ion energies. Extrapolation of the theoretical sputtering formula to larger ion energies indicates that S(E) reaches a saturation value and finally decreases at high ion energies. The theoretical sputtering ratios S(E) for wolfram, tantalum, and molybdenum are compared with the corresponding experimental sputtering curves in the low energy region from the threshold sputtering energy to 120 eV above the respective threshold energy. Theory and experiment are shown to be in good agreement.
Conceptual developments of non-equilibrium statistical mechanics in the early days of Japan
NASA Astrophysics Data System (ADS)
Ichiyanagi, Masakazu
1995-11-01
This paper reviews the research in nonequilibrium statistical mechanics made in Japan in the period between 1930 and 1960. Nearly thirty years have passed since the discovery of the exact formula for the electrical conductivity. With the rise of the linear response theory, the methods and results of which are quickly grasped by anyone, its rationale was pushed aside and even at the stage where the formulation was still incomplete some authors hurried to make physical applications. Such an attitude robbed it of most of its interest for the average physicist, who would approach an understanding of some basic concept, not through abstract and logical analysis but by simply increasing his technical experiences with the concept. The purpose of this review is to rescue the linear response theory from being labeled a mathematical tool and to show that it has considerable physical content. Many key papers, originally written in Japanese, are reproduced.
Can the behavioral sciences self-correct? A social epistemic study.
Romero, Felipe
2016-12-01
Advocates of the self-corrective thesis argue that scientific method will refute false theories and find closer approximations to the truth in the long run. I discuss a contemporary interpretation of this thesis in terms of frequentist statistics in the context of the behavioral sciences. First, I identify experimental replications and systematic aggregation of evidence (meta-analysis) as the self-corrective mechanism. Then, I present a computer simulation study of scientific communities that implement this mechanism to argue that frequentist statistics may converge upon a correct estimate or not depending on the social structure of the community that uses it. Based on this study, I argue that methodological explanations of the "replicability crisis" in psychology are limited and propose an alternative explanation in terms of biases. Finally, I conclude suggesting that scientific self-correction should be understood as an interaction effect between inference methods and social structures. Copyright © 2016 Elsevier Ltd. All rights reserved.
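A minimal sketch of the kind of simulation described above (not the author's model): many noisy studies estimate a fixed true effect, publication is biased toward significant results, and a fixed-effect meta-analysis pools whatever is published; under these assumptions the pooled estimate fails to converge on the truth.

```python
# Minimal sketch: replications with publication bias, aggregated by a
# fixed-effect (inverse-variance) meta-analysis.
import numpy as np

rng = np.random.default_rng(6)
true_effect, n_per_study, n_studies = 0.2, 30, 500

published_effects, published_se = [], []
for _ in range(n_studies):
    sample = rng.normal(true_effect, 1.0, n_per_study)
    est, se = sample.mean(), sample.std(ddof=1) / np.sqrt(n_per_study)
    significant = abs(est / se) > 1.96
    if significant or rng.random() < 0.3:        # only 30% of null results get published
        published_effects.append(est)
        published_se.append(se)

w = 1.0 / np.square(published_se)                # inverse-variance weights
pooled = np.sum(w * np.array(published_effects)) / np.sum(w)
print(f"true effect = {true_effect}, meta-analytic estimate = {pooled:.3f}")
```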
Landau's statistical mechanics for quasi-particle models
NASA Astrophysics Data System (ADS)
Bannur, Vishnu M.
2014-04-01
Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for the pressure and develops all of thermodynamics from it. It is a general formalism and consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for the energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, which are wrongly called the thermodynamic consistency relation, we recover other formalisms of quasi-particle systems, like that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.
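For reference, the textbook grand-canonical identities behind the "start from the pressure" route (standard relations, not taken from the cited papers): the pressure acts as the thermodynamic potential and the other densities follow by differentiation.

```latex
% Standard grand-canonical relations, with k_B = 1: P(T, mu) is the thermodynamic
% potential; number density, entropy density and energy density follow from it.
\begin{aligned}
  P(T,\mu) &= \frac{T}{V}\,\ln \Xi(T,V,\mu),\\
  n &= \left(\frac{\partial P}{\partial \mu}\right)_{T},\qquad
  s = \left(\frac{\partial P}{\partial T}\right)_{\mu},\\
  \varepsilon &= T\,s + \mu\,n - P .
\end{aligned}
```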
Ito, Reio; Shinoda, Masamichi; Honda, Kuniya; Urata, Kentaro; Lee, Jun; Maruno, Mitsuru; Soma, Kumi; Okada, Shinji; Gionhaku, Nobuhito; Iwata, Koichi
To determine the involvement of tumor necrosis factor alpha (TNFα) signaling in the trigeminal ganglion (TG) in the mechanical hypersensitivity of the masseter muscle during temporomandibular joint (TMJ) inflammation. A total of 55 male Sprague-Dawley rats were used. Following injection of Complete Freund's Adjuvant into the TMJ, the mechanical sensitivities of the masseter muscle and the overlying facial skin were measured. Satellite glial cell (SGC) activation and TNFα expression in the TG were investigated immunohistochemically, and the effects of their inhibition on the mechanical hypersensitivity of the masseter muscle were also examined. Student t test or two-way repeated-measures analysis of variance followed by Bonferroni multiple comparisons test were used for statistical analyses. P < .05 was considered to reflect statistical significance. Mechanical allodynia in the masseter muscle was induced without any inflammatory cell infiltration in the muscle after TMJ inflammation. SGC activation and an increased number of TNFα-immunoreactive cells were induced in the TG following TMJ inflammation. Intra-TG administration of an inhibitor of SGC activity or of TNFα-neutralizing antibody depressed both the increased number of TG cells encircled by activated SGCs and the mechanical hypersensitivity of the masseter following TMJ inflammation. These findings suggest that persistent masseter hypersensitivity associated with TMJ inflammation was mediated by SGC-TG neuron interactions via TNFα signaling in the TG.
A statistical study of EMIC waves observed by Cluster: 1. Wave properties
NASA Astrophysics Data System (ADS)
Allen, R. C.; Zhang, J.-C.; Kistler, L. M.; Spence, H. E.; Lin, R.-L.; Klecker, B.; Dunlop, M. W.; André, M.; Jordanova, V. K.
2015-07-01
Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters are required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In this study, we present a statistical analysis of EMIC wave properties using 10 years (2001-2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. The statistical analysis is presented in two papers. This paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.
NASA Astrophysics Data System (ADS)
Nykyri, K.; Moore, T.; Dimmock, A. P.
2017-12-01
In the Earth's magnetosphere, the magnetotail plasma sheet ions are much hotter than in the shocked solar wind. On the dawn sector, the cold-component ions are more abundant and hotter by 30-40 percent when compared to the dusk sector. Recent statistical studies of the flank magnetopause and magnetosheath have shown that the level of temperature asymmetry of the magnetosheath is unable to account for this, so additional physical mechanisms that contribute to this asymmetry must be at play, either at the magnetopause or in the plasma sheet. In this study, we perform a statistical analysis of the ion-scale wave properties in the three main plasma regimes common to flank magnetopause boundary crossings when the boundary is unstable to KHI: hot and tenuous magnetospheric, cold and dense magnetosheath, and mixed [Hasegawa et al., 2004]. These statistics of ion-scale wave properties are compared to observations of fast magnetosonic wave modes that have recently been linked to Kelvin-Helmholtz vortex-centered ion heating [Moore et al., 2016]. The statistical analysis shows that during KH events there is enhanced non-adiabatic heating, calculated during (temporal) ion-scale wave intervals, when compared to non-KH events.
Ergodic theorem, ergodic theory, and statistical mechanics
Moore, Calvin C.
2015-01-01
This perspective highlights the mean ergodic theorem established by John von Neumann and the pointwise ergodic theorem established by George Birkhoff, proofs of which were published nearly simultaneously in PNAS in 1931 and 1932. These theorems were of great significance both in mathematics and in statistical mechanics. In statistical mechanics they provided a key insight into a 60-y-old fundamental problem of the subject—namely, the rationale for the hypothesis that time averages can be set equal to phase averages. The evolution of this problem is traced from the origins of statistical mechanics and Boltzmann's ergodic hypothesis to the Ehrenfests' quasi-ergodic hypothesis, and then to the ergodic theorems. We discuss communications between von Neumann and Birkhoff in the Fall of 1931 leading up to the publication of these papers and related issues of priority. These ergodic theorems initiated a new field of mathematical research called ergodic theory that has thrived ever since, and we discuss some recent developments in ergodic theory that are relevant for statistical mechanics. PMID:25691697
Wu, Xu; Shao, Chuan; Zhang, Liang; Tu, Jinjing; Xu, Hui; Lin, Zhihui; Xu, Shuguang; Yu, Biyun; Tang, Yaodong; Li, Shanqun
2018-03-01
Chronic obstructive pulmonary disease (COPD) is often accompanied by acute exacerbations. Patients with COPD exacerbations suffering from respiratory failure often need the support of mechanical ventilation. Helium-oxygen can be used to reduce airway resistance during mechanical ventilation. The aim of this study is to evaluate the effect of helium-oxygen-assisted mechanical ventilation on COPD exacerbation through a meta-analysis. A comprehensive literature search through the databases PubMed (1966∼2016), Ovid MEDLINE (1965∼2016), Cochrane EBM (1991∼2016) and EMBASE (1974∼2016) was performed to identify associated studies. Randomized clinical trials that met our inclusion criteria, focusing on helium-oxygen-assisted mechanical ventilation in COPD exacerbation, were included. The quality of the papers was evaluated after inclusion and information was extracted for meta-analysis. Six articles and 392 patients were included in total. Meta-analysis revealed that helium-oxygen-assisted mechanical ventilation reduced the Borg dyspnea scale and increased arterial pH compared with air-oxygen. No statistically significant difference was observed between helium-oxygen and air-oxygen with regard to WOB, PaCO2, OI, tracheal intubation rates and in-hospital mortality. Our study suggests helium-oxygen-assisted mechanical ventilation can help to reduce the Borg dyspnea scale. In view of the small change in pH, its clinical benefit is negligible. There is no conclusive evidence indicating a beneficial effect of helium-oxygen-assisted mechanical ventilation on clinical outcomes or prognosis of COPD exacerbation. © 2017 John Wiley & Sons Ltd.
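A minimal sketch of a random-effects (DerSimonian-Laird) pooling of mean differences, the type of calculation behind a meta-analysis like the one above; the per-trial effects and standard errors are invented placeholders, not the extracted trial data.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of mean differences.
import numpy as np

effect = np.array([-0.8, -0.5, -1.1, -0.3, -0.6, -0.9])   # hypothetical Borg-scale mean differences
se = np.array([0.4, 0.3, 0.5, 0.35, 0.3, 0.45])           # their standard errors

w = 1.0 / se**2                                   # fixed-effect (inverse-variance) weights
fixed = np.sum(w * effect) / np.sum(w)
Q = np.sum(w * (effect - fixed) ** 2)             # Cochran's Q heterogeneity statistic
k = len(effect)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (se**2 + tau2)                       # random-effects weights
pooled = np.sum(w_re * effect) / np.sum(w_re)
ci = 1.96 / np.sqrt(np.sum(w_re))
print(f"pooled mean difference = {pooled:.2f} (95% CI {pooled - ci:.2f} to {pooled + ci:.2f})")
```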
A statistical mechanical approach to restricted integer partition functions
NASA Astrophysics Data System (ADS)
Zhou, Chi-Chun; Dai, Wu-Sheng
2018-05-01
The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions which count the number of partitions—a way of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition functions corresponding to general statistics which is a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of restricted integer partition function is just the symmetric function which is a class of functions being invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
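A minimal sketch of the generating-function idea: the number of partitions of n into parts drawn from a restricted set is the coefficient of x^n in the product of factors 1/(1 - x^a), which is precisely a canonical Bose-type partition function; the dynamic-programming loop below multiplies those factors one part at a time.

```python
# Minimal sketch: coefficient extraction from prod_a 1/(1 - x^a) by dynamic
# programming, counting partitions of n into parts from a restricted set.
def restricted_partitions(n, parts):
    coeff = [0] * (n + 1)
    coeff[0] = 1
    for a in parts:                    # multiply the series by 1/(1 - x^a)
        for total in range(a, n + 1):
            coeff[total] += coeff[total - a]
    return coeff[n]

# partitions of 10 into parts not exceeding 4 -> 23
print(restricted_partitions(10, parts=[1, 2, 3, 4]))
```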
Relationships between microstructure and mechanical properties of Ti-5Al-5Mo-5V-3Cr-1Zr alloy
NASA Astrophysics Data System (ADS)
Li, Z. Y.; Wu, G. Q.; Huang, Z.
2018-03-01
Through a statistical, quantitative analysis of the microstructure of Ti-5Al-5Mo-5V-3Cr-1Zr (Ti55531) alloy, the relationships among microstructure, mechanical properties and heat treatment temperature were investigated. The results show that in the Widmanstätten structure, the β grain size increases greatly with increasing annealing temperature. Static toughness is related to the discontinuity of the grain boundary alpha phase, while the tensile strength is related to the acicular alpha phase interface length and the acicular alpha phase proportion. In the duplex microstructure, the tensile strength is related to the equiaxed alpha proportion. Elongation, static toughness and crack forming work are related to the equiaxed alpha proportion and negatively related to the secondary phase proportion. The microstructure can be described quantitatively, and the mechanical properties can be predicted by analysis of the microstructure.
Partitioning-based mechanisms under personalized differential privacy.
Li, Haoran; Xiong, Li; Ji, Zhanglong; Jiang, Xiaoqian
2017-05-01
Differential privacy has recently emerged in private statistical aggregate analysis as one of the strongest privacy guarantees. A limitation of the model is that it provides the same privacy protection for all individuals in the database. However, it is common that data owners may have different privacy preferences for their data. Consequently, a global differential privacy parameter may provide excessive privacy protection for some users, while insufficient for others. In this paper, we propose two partitioning-based mechanisms, privacy-aware and utility-based partitioning, to handle personalized differential privacy parameters for each individual in a dataset while maximizing utility of the differentially private computation. The privacy-aware partitioning is to minimize the privacy budget waste, while utility-based partitioning is to maximize the utility for a given aggregate analysis. We also develop a t-round partitioning to take full advantage of remaining privacy budgets. Extensive experiments using real datasets show the effectiveness of our partitioning mechanisms.
Partitioning-based mechanisms under personalized differential privacy
Li, Haoran; Xiong, Li; Ji, Zhanglong; Jiang, Xiaoqian
2017-01-01
Differential privacy has recently emerged in private statistical aggregate analysis as one of the strongest privacy guarantees. A limitation of the model is that it provides the same privacy protection for all individuals in the database. However, it is common that data owners may have different privacy preferences for their data. Consequently, a global differential privacy parameter may provide excessive privacy protection for some users, while insufficient for others. In this paper, we propose two partitioning-based mechanisms, privacy-aware and utility-based partitioning, to handle personalized differential privacy parameters for each individual in a dataset while maximizing utility of the differentially private computation. The privacy-aware partitioning is to minimize the privacy budget waste, while utility-based partitioning is to maximize the utility for a given aggregate analysis. We also develop a t-round partitioning to take full advantage of remaining privacy budgets. Extensive experiments using real datasets show the effectiveness of our partitioning mechanisms. PMID:28932827
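A minimal sketch in the spirit of privacy-aware partitioning (not the paper's exact algorithms): records carrying personal privacy budgets are grouped into partitions, a count query is answered per partition with Laplace noise calibrated to the partition's strictest budget, and the partition answers are summed. The partition edges and budget values are illustrative assumptions.

```python
# Minimal sketch: personalized-DP count query answered partition by partition,
# with Laplace noise scaled to the smallest (strictest) epsilon in each partition.
import numpy as np

rng = np.random.default_rng(7)
epsilons = rng.choice([0.1, 0.5, 1.0], size=1000, p=[0.2, 0.5, 0.3])  # per-user budgets

def partitioned_count(epsilons, edges=(0.0, 0.25, 0.75, np.inf)):
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        group = epsilons[(epsilons > lo) & (epsilons <= hi)]
        if group.size == 0:
            continue
        eps_group = group.min()                      # strictest budget in the partition
        noise = rng.laplace(scale=1.0 / eps_group)   # sensitivity of a count is 1
        total += group.size + noise
    return total

print("true count:", len(epsilons), " noisy personalized-DP count:", round(partitioned_count(epsilons)))
```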
NASA Astrophysics Data System (ADS)
Esbrand, C.; Royle, G.; Griffiths, J.; Speller, R.
2009-07-01
The integration of technology with healthcare has undoubtedly propelled the medical imaging sector well into the twenty-first century. The concept of digital imaging, introduced during the 1970s, has since paved the way for established imaging techniques, of which digital mammography, phase contrast imaging and CT imaging are just a few examples. This paper presents a prototype intelligent digital mammography system designed and developed by a European consortium. The final system, the I-ImaS system, utilises CMOS monolithic active pixel sensor (MAPS) technology promoting on-chip data processing, enabling data processing and image acquisition to be achieved simultaneously; consequently, statistical analysis of tissue is achievable in real time for the purpose of x-ray beam modulation via a feedback mechanism during the image acquisition procedure. The imager implements a dual array of twenty 520 pixel × 40 pixel CMOS MAPS sensing devices with a 32 μm pixel size, each individually coupled to a 100 μm thick thallium-doped structured CsI scintillator. This paper presents the first intelligent images of real excised breast tissue obtained from the prototype system, where the x-ray exposure was modulated via the statistical information extracted from the breast tissue itself. Conventional images were experimentally acquired and the statistical analysis of the data was done off-line, resulting in the production of simulated real-time intelligently optimised images. The results obtained indicate that real-time image optimisation using the statistical information extracted from the breast as a feedback mechanism is beneficial and foreseeable in the near future.
NASA Astrophysics Data System (ADS)
Ferro, Carlo Giovanni; Brischetto, Salvatore; Torre, Roberto; Maggiore, Paolo
2016-07-01
The Fused Deposition Modelling (FDM) technology is widely used in rapid prototyping. 3D printers for home desktop applications are usually employed to make non-structural objects. When the mechanical stresses are not excessive, this technology can also be successfully employed to produce structural objects, not only at the prototyping stage but also in the production of series parts. The innovative idea of the present work is the application of this technology, implemented in a desktop 3D printer, to the realization of components for aeronautical use, especially for unmanned aerial systems. For this purpose, the paper is devoted to a statistical study of the performance of a desktop 3D printer, to understand how the process performs and what its acceptance limits are. Mechanical and geometrical properties of ABS (Acrylonitrile Butadiene Styrene) specimens, such as tensile strength and stiffness, have been evaluated. ASTM D638 type specimens have been used. A capability analysis has been applied to both the mechanical and the dimensional performance. Statistically stable limits have been determined using experimentally collected data.
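A minimal sketch of a process capability calculation of the kind referred to above: Cp and Cpk computed from sample statistics against specification limits; the strength data and the limits are illustrative assumptions, not the measured values.

```python
# Minimal sketch: capability indices Cp and Cpk for tensile-strength data against
# assumed specification limits.
import numpy as np

rng = np.random.default_rng(8)
strength = rng.normal(30.0, 1.2, 60)      # hypothetical ABS tensile strengths, MPa
LSL, USL = 26.0, 34.0                     # hypothetical specification limits, MPa

mu, sigma = strength.mean(), strength.std(ddof=1)
Cp = (USL - LSL) / (6 * sigma)
Cpk = min(USL - mu, mu - LSL) / (3 * sigma)
print(f"Cp = {Cp:.2f}, Cpk = {Cpk:.2f}  (values above ~1.33 are commonly taken as capable)")
```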
OSO 8 observational limits to the acoustic coronal heating mechanism
NASA Technical Reports Server (NTRS)
Bruner, E. C., Jr.
1981-01-01
An improved analysis of time-resolved line profiles of the C IV resonance line at 1548 A has been used to test the acoustic wave hypothesis of solar coronal heating. It is shown that the observed motions and brightness fluctuations are consistent with the existence of acoustic waves. Specific account is taken of the effect of photon statistics on the observed velocities, and a test is devised to determine whether the motions represent propagating or evanescent waves. It is found that on average about as much energy is carried upward as downward, such that the net acoustic flux density is statistically consistent with zero. The statistical uncertainty in this null result is three orders of magnitude lower than the flux level needed to heat the corona.
Mars: Noachian hydrology by its statistics and topology
NASA Technical Reports Server (NTRS)
Cabrol, N. A.; Grin, E. A.
1993-01-01
Discrimination between fluvial features generated by surface drainage and subsurface aquifer discharges will provide clues to the understanding of early Mars' climatic history. Our approach is to define the process of formation of the oldest fluvial valleys by statistical and topological analyses. Formation of fluvial valley systems reached its highest statistical concentration during the Noachian Period. Nevertheless, they are a scarce phenomenon in Martian history, localized on the cratered uplands, and subject to latitudinal distribution. They occur sparsely on Noachian geological units with a weak distribution density, and appear in small isolated areas (around 5 x 10^3 sq km), filled by short streams (100-300 km in length). Topological analysis of the internal organization of 71 surveyed Noachian fluvial valley networks also provides information on the mechanisms of formation.
Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.
Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R
2012-08-01
Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival models such as the accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
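To make the censoring idea concrete, the hedged Python sketch below contrasts a rank-sum test on the detected intensities with a maximum-likelihood fit of a log-normal model that treats intensities below a detection limit as left-censored. The data, detection limit, and group labels are simulated for illustration; this is not the authors' R code.

    # Left-censored log-normal fit vs. a naive rank-sum test on simulated peak intensities.
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(1)
    limit = 1.0                                      # hypothetical detection limit
    group_a = rng.lognormal(mean=0.2, sigma=0.8, size=60)
    group_b = rng.lognormal(mean=0.6, sigma=0.8, size=60)

    def censored_lognormal_nll(params, x, limit):
        """Negative log-likelihood; values below `limit` are treated as left-censored."""
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        observed = x[x >= limit]
        n_cens = np.sum(x < limit)
        ll = stats.norm.logpdf(np.log(observed), mu, sigma).sum() - np.log(observed).sum()
        ll += n_cens * stats.norm.logcdf((np.log(limit) - mu) / sigma)
        return -ll

    def fit(x, limit):
        res = optimize.minimize(censored_lognormal_nll, x0=[0.0, 0.0], args=(x, limit))
        return res.x[0], np.exp(res.x[1])

    # Naive baseline: compare only the intensities above the detection limit
    stat, p = stats.mannwhitneyu(group_a[group_a >= limit], group_b[group_b >= limit])
    print("Wilcoxon-Mann-Whitney p-value (detected values only):", p)
    print("Censored log-normal fit, group A (mu, sigma):", fit(group_a, limit))
    print("Censored log-normal fit, group B (mu, sigma):", fit(group_b, limit))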
Memory matters: influence from a cognitive map on animal space use.
Gautestad, Arild O
2011-10-21
A vertebrate individual's cognitive map provides a capacity for site fidelity and long-distance returns to favorable patches. Fractal-geometrical analysis of individual space use based on a collection of telemetry fixes makes it possible to verify the influence of a cognitive map on the spatial scatter of habitat use and also to what extent space use has been of a scale-specific versus a scale-free kind. This approach rests on a statistical mechanical level of system abstraction, where micro-scale details of behavioral interactions are coarse-grained to macro-scale observables like the fractal dimension of space use. In this manner, the magnitude of the fractal dimension becomes a proxy variable for distinguishing between main classes of habitat exploration and site fidelity, such as memory-less (Markovian) Brownian motion and Lévy walk, and memory-enhanced space use like Multi-scaled Random Walk (MRW). In this paper, previous analyses are extended by exploring MRW simulations under three scenarios: (1) central place foraging, (2) behavioral adaptation to resource depletion (avoidance of the latest visited locations) and (3) transition from MRW towards Lévy walk by narrowing memory capacity to a trailing time window. A generalized statistical-mechanical theory with the power to model cognitive map influence on individual space use will be important for statistical analyses of animal habitat preferences and the mechanics behind site fidelity and home ranges. Copyright © 2011 Elsevier Ltd. All rights reserved.
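The fractal dimension used as a proxy variable above can be estimated with a generic box-counting procedure. The Python sketch below applies it to a simulated Brownian-like track standing in for telemetry fixes; it is a textbook estimator, not the paper's MRW implementation, and with a finite sample the estimate depends on the range of scales used.

    # Box-counting estimate of the fractal dimension of a set of 2D "fixes" (simulated).
    import numpy as np

    rng = np.random.default_rng(0)
    fixes = np.cumsum(rng.normal(size=(5000, 2)), axis=0)   # Brownian-like stand-in track

    def box_count_dimension(points, n_scales=8):
        pts = (points - points.min(axis=0)) / np.ptp(points, axis=0).max()  # fit into unit box
        sizes = 1.0 / 2 ** np.arange(1, n_scales + 1)
        counts = [len(np.unique(np.floor(pts / s).astype(int), axis=0)) for s in sizes]
        # slope of log(count) versus log(1/size) estimates the box-counting dimension
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        return slope

    print("Estimated fractal dimension of the track:", round(box_count_dimension(fixes), 2))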
Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis
NASA Astrophysics Data System (ADS)
Xiao, Di; Wang, Jun
2012-10-01
The continuum percolation system is developed to model a random stock price process in this work. Recent empirical research has demonstrated various statistical features of stock price changes; a financial model that aims to understand price fluctuations needs to define a mechanism for the formation of the price, in an attempt to reproduce and explain this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model; here the local interaction or influence among traders is constructed by the continuum percolation, and a cluster of continuum percolation is applied to define the cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of normalized returns of the price model by several analysis methods, including power-law tail distribution analysis, chaotic behavior analysis and Zipf analysis. Moreover, we consider the daily returns of the Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and comparisons of return behaviors between the actual data and the simulation data are exhibited.
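One of the analyses named above, the power-law tail analysis of normalized returns, is commonly carried out with the Hill estimator. The Python sketch below applies it to simulated heavy-tailed returns; it is a generic tail-index estimate under assumed data, not the paper's percolation model or the Shanghai Composite data.

    # Hill estimator of the power-law tail exponent of (simulated) normalized returns.
    import numpy as np

    rng = np.random.default_rng(3)
    returns = rng.standard_t(df=3.5, size=20000)      # heavy-tailed stand-in for returns

    def hill_estimator(x, k=500):
        """Tail-index estimate from the k largest absolute values of x."""
        order = np.sort(np.abs(x))
        threshold = order[-(k + 1)]
        return k / np.sum(np.log(order[-k:] / threshold))

    print("Estimated tail exponent alpha ~", round(hill_estimator(returns), 2))

For Student-t distributed returns with 3.5 degrees of freedom the estimate should come out near 3.5, which is in the neighborhood of values often reported for equity-return tails.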
On-Orbit System Identification
NASA Technical Reports Server (NTRS)
Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.
1987-01-01
Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.
Non-operative management (NOM) of blunt hepatic trauma: 80 cases.
Özoğul, Bünyami; Kısaoğlu, Abdullah; Aydınlı, Bülent; Öztürk, Gürkan; Bayramoğlu, Atıf; Sarıtemur, Murat; Aköz, Ayhan; Bulut, Özgür Hakan; Atamanalp, Sabri Selçuk
2014-03-01
The liver is the most frequently injured organ in abdominal trauma. We present a group of patients with blunt hepatic trauma who were managed without any invasive diagnostic tools and/or surgical intervention. A total of 80 patients with blunt liver injury who were admitted to the general surgery clinic or to other clinics due to concomitant injuries were followed non-operatively. Normally distributed numeric variables were evaluated by Student's t-test or one-way analysis of variance, while non-normally distributed variables were analyzed by the Mann-Whitney U-test or Kruskal-Wallis variance analysis. The chi-square test was also employed for the comparison of categorical variables. Statistical significance was assumed for p<0.05. There was no significant relationship between patients' Hgb level and liver injury grade, outcome, or mechanism of injury. Likewise, there was no statistical relationship between liver injury grade, outcome, or mechanism of injury and ALT or AST levels. There was no mortality in any of the patients. During the last quarter of a century, changes in the diagnosis and treatment of liver injury have been associated with increased survival. NOM of liver injury in patients with stable hemodynamics and hepatic trauma seems to be the gold standard.
Statistical Learning of Phonetic Categories: Insights from a Computational Approach
ERIC Educational Resources Information Center
McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.
2009-01-01
Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…
NASA Technical Reports Server (NTRS)
Speziale, Charles G.
1988-01-01
The invariance of constitutive equations in continuum mechanics is examined from a basic theoretical standpoint. It is demonstrated that constitutive equations which are not form invariant under arbitrary translational accelerations of the reference frame are in violation of the Einstein equivalence principle. Furthermore, by making use of an analysis based on statistical mechanics, it is argued that any frame-dependent terms in constitutive equations must arise from the intrinsic spin tensor and are negligible provided that the ratio of microscopic to macroscopic time scales is extremely small. The consistency of these results with existing constitutive theories is discussed in detail along with possible avenues of future research.
Giannotti, Marina I; Cabeza de Vaca, Israel; Artés, Juan M; Sanz, Fausto; Guallar, Victor; Gorostiza, Pau
2015-09-10
The structural basis of the low reorganization energy of cupredoxins has long been debated. These proteins reconcile a conformationally heterogeneous and exposed metal-chelating site with the highly rigid copper center required for efficient electron transfer. Here we combine single-molecule mechanical unfolding experiments with statistical analysis and computer simulations to show that the metal-binding region of apo-azurin is mechanically flexible and that high mechanical stability is imparted by copper binding. The unfolding pathway of the metal site depends on the pulling residue and suggests that partial unfolding of the metal-binding site could be facilitated by the physical interaction with certain regions of the redox protein.
Chung, Dongjun; Kim, Hang J; Zhao, Hongyu
2017-02-01
Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients through novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging because they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share a common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on a smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.
Demystification of Bell inequality
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2009-08-01
The main aim of this review is to show that the common conclusion that Bell's argument implies that any attempt to proceed beyond quantum mechanics induces a nonlocal model was not totally justified. Our analysis of Bell's argument demonstrates that violation of Bell's inequality implies neither "death of realism" nor nonlocality. This violation is just a sign of the non-Kolmogorovness of statistical data: the impossibility of putting statistical data collected in a few different experiments (corresponding to incompatible settings of polarization beam splitters) into one probability space. This inequality has been well known in theoretical probability since the 19th century (from the works of Boole). We couple the non-Kolmogorovness of data with the design of modern detectors of photons.
Statistical mechanics of broadcast channels using low-density parity-check codes.
Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David
2003-03-01
We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
Playing at Statistical Mechanics
ERIC Educational Resources Information Center
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
Uçar, Yurdanur; Aysan Meriç, İpek; Ekren, Orhun
2018-02-11
To compare the fracture mechanics, microstructure, and elemental composition of lithography-based ceramic manufacturing with pressing and CAD/CAM. Disc-shaped specimens (16 mm diameter, 1.2 mm thick) were used for mechanical testing (n = 10/group). Biaxial flexural strength of three groups (In-Ceram alumina [ICA], lithography-based alumina, ZirkonZahn) was determined using the "piston on 3-ball" technique as suggested in test Standard ISO-6872. Vickers hardness test was performed. Fracture toughness was calculated using fractography. Results were statistically analyzed using Kruskal-Wallis test followed by Dunnett T3 (α = 0.05). Weibull analysis was conducted. Polished and fracture surface characterization was made using scanning electron microscope (SEM). Energy dispersive spectroscopy (EDS) was used for elemental analysis. Biaxial flexural strengths of ICA, LCM alumina (LCMA), and ZirkonZahn were 147 ± 43 MPa, 490 ± 44 MPa, and 709 ± 94 MPa, respectively, and were statistically different (P ≤ 0.05). The Vickers hardness number of ICA was 850 ± 41, whereas hardness values for LCMA and ZirkonZahn were 1581 ± 144 and 1249 ± 57, respectively, and were statistically different (P ≤ 0.05). A statistically significant difference was found between fracture toughness of ICA (2 ± 0.4 MPa·m^1/2), LCMA (6.5 ± 1.5 MPa·m^1/2), and ZirkonZahn (7.7 ± 1 MPa·m^1/2) (P ≤ 0.05). Weibull modulus was highest for LCMA (m = 11.43) followed by ZirkonZahn (m = 8.16) and ICA (m = 5.21). Unlike LCMA and ZirkonZahn groups, a homogeneous microstructure was not observed for ICA. EDS results supported the SEM images. Within the limitations of this in vitro study, it can be concluded that LCM seems to be a promising technique for final ceramic object manufacturing in dental applications. Both the manufacturing method and the material used should be improved. © 2018 by the American College of Prosthodontists.
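The Weibull moduli quoted above are typically obtained by fitting a two-parameter Weibull distribution to the measured strengths. The Python sketch below shows one common estimation route (median-rank probabilities plus linear regression); the strength values are invented for illustration and are not the study's data.

    # Two-parameter Weibull fit for fracture-strength data (hypothetical values, MPa).
    import numpy as np

    strengths = np.array([420., 455., 470., 488., 495., 502., 515., 528., 541., 560.])

    def weibull_fit(sigma):
        """Estimate Weibull modulus m and characteristic strength sigma0 by linear regression."""
        s = np.sort(sigma)
        n = len(s)
        f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank failure probabilities
        x = np.log(s)
        y = np.log(-np.log(1.0 - f))                  # ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0)
        m, c = np.polyfit(x, y, 1)
        return m, np.exp(-c / m)

    m, sigma0 = weibull_fit(strengths)
    print(f"Weibull modulus m = {m:.2f}, characteristic strength = {sigma0:.0f} MPa")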
NASA Technical Reports Server (NTRS)
Goldsmith, Marlana B.; Sankar, Bhavani V.; Haftka, Raphael T.; Goldberg, Robert K.
2013-01-01
The objectives of this paper include identifying important architectural parameters that describe the SiC/SiC five-harness satin weave composite and characterizing the statistical distributions and correlations of those parameters from photomicrographs of various cross sections. In addition, realistic artificial cross sections of a 2D representative volume element (RVE) are generated reflecting the variability found in the photomicrographs, which are used to determine the effects of architectural variability on the thermo-mechanical properties. Lastly, preliminary information is obtained on the sensitivity of thermo-mechanical properties to architectural variations. Finite element analysis is used in combination with a response surface and it is shown that the present method is effective in determining the effects of architectural variability on thermo-mechanical properties.
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
NASA Astrophysics Data System (ADS)
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows the model parameters to be interpreted in terms of physical concepts. We also propose that many organizations presenting Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy associated states.
Statistical methods and neural network approaches for classification of data from multiple sources
NASA Technical Reports Server (NTRS)
Benediktsson, Jon Atli; Swain, Philip H.
1990-01-01
Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A general problem with using conventional multivariate statistical approaches for classification of data of multiple types is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods do not have a mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution free, since no prior knowledge of the statistical distribution of the data is needed. This is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of the problem of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
Humans make efficient use of natural image statistics when performing spatial interpolation.
D'Antona, Anthony D; Perry, Jeffrey S; Geisler, Wilson S
2013-12-16
Visual systems learn through evolution and experience over the lifespan to exploit the statistical structure of natural images when performing visual tasks. Understanding which aspects of this statistical structure are incorporated into the human nervous system is a fundamental goal in vision science. To address this goal, we measured human ability to estimate the intensity of missing image pixels in natural images. Human estimation accuracy is compared with various simple heuristics (e.g., local mean) and with optimal observers that have nearly complete knowledge of the local statistical structure of natural images. Human estimates are more accurate than those of simple heuristics, and they match the performance of an optimal observer that knows the local statistical structure of relative intensities (contrasts). This optimal observer predicts the detailed pattern of human estimation errors and hence the results place strong constraints on the underlying neural mechanisms. However, humans do not reach the performance of an optimal observer that knows the local statistical structure of the absolute intensities, which reflect both local relative intensities and local mean intensity. As predicted from a statistical analysis of natural images, human estimation accuracy is negligibly improved by expanding the context from a local patch to the whole image. Our results demonstrate that the human visual system exploits efficiently the statistical structure of natural images.
Huang, Qi; Lv, Xin; He, Yushuang; Wei, Xing; Ma, Meigang; Liao, Yuhan; Qin, Chao; Wu, Yuan
2017-12-01
Patients with epilepsy (PWE) are more likely to suffer from migraine attacks, and aberrant white matter (WM) organization may be the mechanism underlying this phenomenon. This study aimed to use the diffusion tensor imaging (DTI) technique to quantify WM structural differences in PWE with interictal migraine. Diffusion tensor imaging data were acquired in 13 PWE with migraine and 12 PWE without migraine. Diffusion metrics were analyzed using tract-atlas-based spatial statistics analysis. Atlas-based and tract-based spatial statistical analyses were conducted for robustness analysis. Correlation was explored between altered DTI metrics and clinical parameters. The main results are as follows: (i) Axonal damage plays a key role in PWE with interictal migraine. (ii) Significant diffusion alterations included higher fractional anisotropy (FA) in the fornix, higher mean diffusivity (MD) in the middle cerebellar peduncle (CP), left superior CP, and right uncinate fasciculus, and higher axial diffusivity (AD) in the middle CP and right medial lemniscus. (iii) Diffusion tensor imaging metrics tended to correlate with seizure/migraine type and duration. The results indicate that characteristic structural impairments exist in PWE with interictal migraine. Epilepsy may contribute to migraine by altering WM in the brain stem. White matter tracts in the fornix and right uncinate fasciculus also mediate migraine after epilepsy. This finding may improve our understanding of the pathological mechanisms underlying migraine attack after epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.
Majid, Kamran; Crowder, Terence; Baker, Erin; Baker, Kevin; Koueiter, Denise; Shields, Edward; Herkowitz, Harry N
2011-12-01
One hundred eighteen retrieved 316L stainless steel thoracolumbar plates of 3 different designs, used for fusion in 60 patients, were examined for evidence of corrosion. A medical record review and statistical analysis were also carried out. This study aims to identify types of corrosion and to examine preferential metal ion release and the possibility of statistical correlation with clinical effects. Earlier studies have found that stainless steel spine devices showed evidence of mild-to-severe corrosion; fretting and crevice corrosion were the most commonly reported types. Studies have also shown the toxicity of metal ions released from stainless steel corrosion and how the ions may adversely affect bone formation and/or induce granulomatous foreign body responses. The retrieved plates were visually inspected and graded based on the degree of corrosion. The plates were then analyzed with optical microscopy, scanning electron microscopy, and energy dispersive x-ray spectroscopy. A retrospective medical record review was performed and statistical analysis was carried out to determine any correlations between experimental findings and patient data. More than 70% of the plates exhibited some degree of corrosion. Both fretting and crevice corrosion mechanisms were observed, primarily at the screw-plate interface. Energy dispersive x-ray spectroscopy analysis indicated reductions in nickel content in corroded areas, suggestive of nickel ion release to the surrounding biological environment. The incidence and severity of corrosion were significantly correlated with the design of the implant. Stainless steel thoracolumbar plates show a high incidence of corrosion, with statistical dependence on device design.
Statistical Thermodynamics and Microscale Thermophysics
NASA Astrophysics Data System (ADS)
Carey, Van P.
1999-08-01
Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.
A study of tensile test on open-cell aluminum foam sandwich
NASA Astrophysics Data System (ADS)
Ibrahim, N. A.; Hazza, M. H. F. Al; Adesta, E. Y. T.; Abdullah Sidek, Atiah Bt.; Endut, N. A.
2018-01-01
Aluminum foam sandwich (AFS) panels are one of the growing materials in various industries because of their lightweight behavior. AFS is also known for its excellent stiffness-to-weight ratio and high energy absorption. Due to these advantages, many researchers show an interest in aluminum foam materials for expanding the use of foam structures. However, there is still a gap that needs to be filled in order to develop reliable data on the mechanical behavior of AFS with different parameters and analysis method approaches. Few researchers have focused on open-cell aluminum foam and statistical analysis. Thus, this research was conducted using an open-cell aluminum foam core of grade 6101 with aluminum sheet skins tested under tension. The data are analyzed using a full factorial design in the JMP statistical analysis software (version 11). The ANOVA result shows a significant value of the model of less than 0.500, while the scatter diagram and 3D surface profiler plot show that skin thickness has a significant impact on the stress/strain value compared to core thickness.
Systolic and Diastolic Left Ventricular Mechanics during and after Resistance Exercise.
Stöhr, Eric J; Stembridge, Mike; Shave, Rob; Samuel, T Jake; Stone, Keeron; Esformes, Joseph I
2017-10-01
To improve the current understanding of the impact of resistance exercise on the heart, by examining the acute responses of left ventricular (LV) strain, twist, and untwisting rate ("LV mechanics"). LV echocardiographic images were recorded in systole and diastole before, during and immediately after (7-12 s) double-leg press exercise at two intensities (30% and 60% of maximum strength, one-repetition maximum). Speckle tracking analysis generated LV strain, twist, and untwisting rate data. Additionally, beat-by-beat blood pressure was recorded and systemic vascular resistance (SVR) and LV wall stress were calculated. Responses in both exercise trials were statistically similar (P > 0.05). During effort, stroke volume decreased, whereas SVR and LV wall stress increased (P < 0.05). Immediately after effort, stroke volume returned to baseline, whereas SVR and wall stress decreased (P < 0.05). Similarly, acute exercise was accompanied by a significant decrease in systolic parameters of LV muscle mechanics (P < 0.05). However, diastolic parameters, including LV untwisting rate, were statistically unaltered (P > 0.05). Immediately after exercise, systolic LV mechanics returned to baseline levels (P < 0.05) but LV untwisting rate increased significantly (P < 0.05). A single, acute bout of double-leg press resistance exercise transiently reduces systolic LV mechanics, but increases diastolic mechanics after exercise, suggesting that resistance exercise has a differential impact on systolic and diastolic heart muscle function. The findings may explain why acute resistance exercise has been associated with reduced stroke volume but chronic exercise training may result in increased LV volumes.
Study of pre-seismic kHz EM emissions by means of complex systems
NASA Astrophysics Data System (ADS)
Balasis, Georgios; Papadimitriou, Constantinos; Eftaxias, Konstantinos
2010-05-01
The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. A corollary is that transferring ideas and results between investigators in hitherto disparate areas will cross-fertilize and lead to important new results. It is well known that Boltzmann-Gibbs statistical mechanics works best in dealing with systems composed of subsystems that are either independent or interacting via short-range forces, and whose subsystems can access all the available phase space. For systems exhibiting long-range correlations, memory, or fractal properties, non-extensive Tsallis statistical mechanics becomes the most appropriate mathematical framework. As mentioned, a central property of the magnetic storm, solar flare, and earthquake preparation process is the possible occurrence of coherent large-scale collective behavior with a very rich structure, resulting from the repeated nonlinear interactions among its constituents. Consequently, non-extensive statistical mechanics is an appropriate framework to investigate universality, if any, in magnetic storm, solar flare, earthquake and pre-failure EM emission occurrence. A model for earthquake dynamics derived from a non-extensive Tsallis formulation, starting from first principles, has recently been introduced. This approach leads to a Gutenberg-Richter type law for the magnitude distribution of earthquakes which provides an excellent fit to seismicities generated in various large geographic areas usually identified as "seismic regions". We examine whether the Gutenberg-Richter law corresponding to non-extensive Tsallis statistics is able to describe the distribution of amplitudes of earthquakes, pre-seismic kHz EM emissions (electromagnetic earthquakes), solar flares, and magnetic storms. The analysis shows that the introduced non-extensive model provides an excellent fit to the experimental data, incorporating the characteristics of universality by means of non-extensive statistics into the extreme events under study.
Studies in Non-Equilibrium Statistical Mechanics.
1982-09-01
in the formalism, and this is used to simulate the effects of rotational states and collisions. At each stochastic step the energy changes in the...uses of this method. 10. A Scaling Theoretical Analysis of Vibrational Relaxation Experiments: Rotational Effects and Long-Range Collisions ...include rotational effects through the rotational energy gaps and the rotational distributions. The variables in this theory are a fundamental set
Multispecies, Integrative GWAS for Focal Segmental Glomerulosclerosis
2017-09-01
is a frequent cause of end-stage renal disease (ESRD). We investigated the genetic basis of FSGS and recruited a heterogeneous population of...understanding the complex genetic mechanisms of FSGS. ...disease (MCD). Using a variety of statistical and genetic approaches, including genome-wide association analysis and rare copy number variations (CNVs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hallerman, G.; Gray, R.J.
An instrument for crushing-strength determinations of uncoated and pyrolytic-carbon-coated fuel particles (50 to 500 μm in diameter) was developed to relate the crushing strength of the particles to their fabricability. The instrument consists of a loading mechanism, load cell, and a power supply-readout unit. The information that can be obtained by statistical methods of the data analysis is illustrated by results on two batches of fuel particles. (auth)
NASA Technical Reports Server (NTRS)
Mixson, John S.; Wilby, John F.
1991-01-01
The generation and control of flight vehicle interior noise is discussed. Emphasis is placed on the mechanisms of transmission through airborne and structure-borne paths and the control of cabin noise by path modification. Techniques for identifying the relative contributions of the various source-path combinations are also discussed along with methods for the prediction of aircraft interior noise such as those based on the general modal theory and statistical energy analysis.
ERIC Educational Resources Information Center
Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.
2010-01-01
Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…
Johnson, Quentin R; Lindsay, Richard J; Shen, Tongye
2018-02-21
A computational method which extracts the dominant motions from an ensemble of biomolecular conformations via a correlation analysis of residue-residue contacts is presented. The algorithm first renders the structural information into contact matrices, then constructs the collective modes based on the correlated dynamics of a selected set of dynamic contacts. Associated programs can bridge the results for further visualization using graphics software. The aim of this method is to provide an analysis of conformations of biopolymers from the contact viewpoint. It may assist in systematically uncovering conformational switching mechanisms in proteins and biopolymer systems in general through statistical analysis of simulation snapshots. In contrast to conventional correlation analyses of Cartesian coordinates (such as distance covariance analysis and Cartesian principal component analysis), this program also provides an alternative way to locate essential collective motions in general. Herein, we detail the algorithm in a stepwise manner and comment on the importance of the method as applied to decoding allosteric mechanisms. © 2018 Wiley Periodicals, Inc.
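A minimal sketch of the kind of contact-based correlation analysis described here is given below: each snapshot is converted into a binary residue-residue contact matrix, and principal components of the contact fluctuations are taken as the dominant collective modes. The coordinates are random placeholders and the 8 Å cutoff is an assumption; this is an illustration of the general idea, not the authors' program.

    # PCA over fluctuating residue-residue contacts (random stand-in coordinates).
    import numpy as np

    rng = np.random.default_rng(7)
    n_frames, n_res = 200, 40
    coords = rng.normal(scale=5.0, size=(n_frames, n_res, 3))   # stand-in C-alpha coordinates

    def contact_vectors(xyz, cutoff=8.0):
        """Flatten the upper triangle of each frame's binary contact matrix."""
        iu = np.triu_indices(xyz.shape[1], k=1)
        vecs = []
        for frame in xyz:
            d = np.linalg.norm(frame[:, None, :] - frame[None, :, :], axis=-1)
            vecs.append((d < cutoff)[iu].astype(float))
        return np.array(vecs)

    contacts = contact_vectors(coords)
    fluct = contacts - contacts.mean(axis=0)
    u, s, vt = np.linalg.svd(fluct, full_matrices=False)   # principal contact modes
    explained = s**2 / np.sum(s**2)
    print("Variance explained by the first three contact modes:", np.round(explained[:3], 3))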
Huang, J; Du, P; Ao, C; Ho, M; Lei, M; Zhao, D; Wang, Z
2007-12-01
Statistical analysis of stormwater runoff data enables general identification of runoff characteristics. Six catchments with different urban surface types, including roofs, roadway, park, and residential/commercial, in Macau were selected for sampling and study during the period from June 2005 to September 2006. Based on univariate statistical analysis of the sampled data, the major pollutants discharged from different urban surface types were identified. For iron roof runoff, Zn is the most significant pollutant. The major pollutants from urban roadway runoff are TSS and COD. Stormwater runoff from the commercial/residential and park catchments shows high levels of COD, TN, and TP concentrations. Principal component analysis was further performed to identify linkages between stormwater quality and urban surface types. Two potential pollution sources were identified for the study catchments with different urban surface types. The first is related to nutrient losses, soil losses and organic pollutant discharges; the second is related to heavy metal losses. PCA proved to be a viable tool for explaining the types of pollution sources and their mechanisms for catchments with different urban surface types.
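The principal component analysis step can be illustrated with a few lines of Python: standardize the water-quality variables, diagonalize their correlation matrix, and inspect the loadings of the leading components. The data matrix below is made up (rows = runoff samples, columns = TSS, COD, TN, TP, Zn); it only demonstrates the mechanics of the method, not the study's results.

    # PCA on a hypothetical runoff water-quality matrix.
    import numpy as np

    rng = np.random.default_rng(42)
    X = rng.lognormal(mean=1.0, sigma=0.5, size=(30, 5))   # 30 samples x 5 variables (invented)

    corr = np.corrcoef(X, rowvar=False)                    # PCA on the correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    print("Fraction of variance per component:", np.round(eigvals / eigvals.sum(), 2))
    print("Loadings of PC1:", np.round(eigvecs[:, 0], 2))

In a real application, components whose loadings group nutrients and organics versus heavy metals would be interpreted as the two pollution-source types identified in the abstract.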
A statistical study of EMIC waves observed by Cluster. 1. Wave properties. EMIC Wave Properties
Allen, R. C.; Zhang, J. -C.; Kistler, L. M.; ...
2015-07-23
Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters are required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In our study, we present a statistical analysis of EMIC wave properties using 10 years (2001–2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. Thus, the statistical analysis is presented in two papers. Our paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.
Statistical physics of the symmetric group.
Williams, Mobolaji
2017-04-01
Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.
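A toy numerical counterpart to this kind of model is easy to set up: take permutations of an ordered list as the states, define the energy as the number of elements out of their "correct" position, and sample with a Metropolis Monte Carlo scheme over transpositions. The sketch below is a generic illustration of statistical mechanics on the symmetric group, not the paper's mean-field model.

    # Metropolis sampling over permutations with energy = number of displaced elements.
    import numpy as np

    rng = np.random.default_rng(0)
    N, beta, n_sweeps = 20, 1.0, 4000

    def energy(perm):
        return np.count_nonzero(perm != np.arange(len(perm)))

    perm = rng.permutation(N)
    energies = []
    for _ in range(n_sweeps):
        i, j = rng.integers(0, N, size=2)
        proposal = perm.copy()
        proposal[i], proposal[j] = proposal[j], proposal[i]   # propose a transposition
        dE = energy(proposal) - energy(perm)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            perm = proposal
        energies.append(energy(perm))

    print("Mean energy at beta =", beta, ":", np.mean(energies[1000:]))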
Namani, Ravi; Wood, Matthew D.; Sakiyama-Elbert, Shelly E.; Bayly, Philip V.
2009-01-01
The anisotropic mechanical properties of magnetically aligned fibrin gels were measured by magnetic resonance elastography (MRE) and by a standard mechanical test: unconfined compression. Soft anisotropic biomaterials are notoriously difficult to characterize, especially in vivo. MRE is well-suited for efficient, non-invasive, and nondestructive assessment of shear modulus. Direction-dependent differences in shear modulus were found to be statistically significant for gels polymerized at magnetic fields of 11.7T and 4.7T compared to control gels. Mechanical anisotropy was greater in the gels polymerized at the higher magnetic field. These observations were consistent with results from unconfined compression tests. Analysis of confocal microscopy images of gels showed measurable alignment of fibrils in gels polymerized at 11.7T. This study provides direct, quantitative measurements of the anisotropy in mechanical properties that accompanies fibril alignment in fibrin gels. PMID:19656516
Hagenfeld, Daniel; Koch, Raphael; Jünemann, Sebastian; Prior, Karola; Harks, Inga; Eickholz, Peter; Hoffmann, Thomas; Kim, Ti-Sun; Kocher, Thomas; Meyle, Jörg; Kaner, Doğan; Schlagenhauf, Ulrich; Ehmke, Benjamin; Harmsen, Dag
2018-01-01
Empiric antibiotics are often used in combination with mechanical debridement to treat patients suffering from periodontitis and to eliminate disease-associated pathogens. Until now, only a few next generation sequencing 16S rDNA amplicon based publications with rather small sample sizes studied the effect of those interventions on the subgingival microbiome. Therefore, we studied subgingival samples of 89 patients with chronic periodontitis (solely non-smokers) before and two months after therapy. Forty-seven patients received mechanical periodontal therapy only, whereas 42 patients additionally received oral administered amoxicillin plus metronidazole (500 and 400 mg, respectively; 3x/day for 7 days). Samples were sequenced with Illumina MiSeq 300 base pairs paired end technology (V3 and V4 hypervariable regions of the 16S rDNA). Inter-group differences before and after therapy of clinical variables (percentage of sites with pocket depth ≥ 5mm, percentage of sites with bleeding on probing) and microbiome variables (diversity, richness, evenness, and dissimilarity) were calculated, a principal coordinate analysis (PCoA) was conducted, and differential abundance of agglomerated ribosomal sequence variants (aRSVs) classified on genus level was calculated using a negative binomial regression model. We found statistically noticeable decreased richness, and increased dissimilarity in the antibiotic, but not in the placebo group after therapy. The PCoA revealed a clear compositional separation of microbiomes after therapy in the antibiotic group, which could not be seen in the group receiving mechanical therapy only. This difference was even more pronounced on aRSV level. Here, adjunctive antibiotics were able to induce a microbiome shift by statistically noticeably reducing aRSVs belonging to genera containing disease-associated species, e.g., Porphyromonas, Tannerella, Treponema, and Aggregatibacter, and by noticeably increasing genera containing health-associated species. Mechanical therapy alone did not statistically noticeably affect any disease-associated taxa. Despite the difference in microbiome modulation both therapies improved the tested clinical parameters after two months. These results cast doubt on the relevance of the elimination and/or reduction of disease-associated taxa as a main goal of periodontal therapy.
ERIC Educational Resources Information Center
National Centre for Vocational Education Research, Leabrook (Australia).
Statistics regarding Australians participating in apprenticeships and traineeships in the mechanical engineering and fabrication trades in 1995-1999 were reviewed to provide an indication of where skill shortages may be occurring or will likely occur in relation to the following occupations: mechanical engineering trades; fabrication engineering…
NASA Astrophysics Data System (ADS)
Dennison, Andrew G.
Classification of the seafloor substrate can be done with a variety of methods. These methods include visual (dives, drop cameras), mechanical (cores, grab samples), and acoustic (statistical analysis of echosounder returns) approaches. Acoustic methods offer a more powerful and efficient means of collecting useful information about the bottom type. Due to the nature of an acoustic survey, larger areas can be sampled, and combining the collected data with visual and mechanical survey methods provides greater confidence in the classification of a mapped region. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical characteristics of a sonar backscatter mosaic are dependent on bottom type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, e.g., distinguishing a muddy area from a rocky area, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing of high-resolution multibeam data can capture pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. The development of a new classification method is described here. It is based upon the analysis of textural features in conjunction with ground truth sampling. The processing and classification results for two geologically distinct areas in nearshore regions of Lake Superior, off the Lester River, MN and the Amnicon River, WI, are presented here, using the Minnesota Supercomputer Institute's Mesabi computing cluster for initial processing. Processed data are then calibrated using ground truth samples to conduct an accuracy assessment of the surveyed areas. From analysis of the high-resolution bathymetry data collected at both survey sites it was possible to successfully calculate a series of measures that describe textural information about the lake floor. Further processing suggests that the calculated features also capture a significant amount of statistical information about the lake floor terrain. Two sources of error, an anomalous heave and a refraction error, significantly deteriorated the quality of the processed data and the resulting validation results. Ground truth samples used to validate the classification methods for both survey sites resulted in accuracy values ranging from 5-30 percent at the Amnicon River and between 60-70 percent for the Lester River. The final results suggest that this new processing methodology adequately captures textural information about the lake floor and provides an acceptable classification in the absence of significant data quality issues.
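As a small illustration of the texture-feature idea, the Python sketch below derives two simple per-cell measures (local relief and slope magnitude) from a gridded bathymetry patch, the sort of quantities that could feed a substrate classifier. The grid is synthetic and the window size is an arbitrary choice; this is not the thesis processing chain.

    # Simple terrain-texture features from a synthetic bathymetry grid.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(5)
    bathy = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=4)  # synthetic seafloor

    local_std = ndimage.generic_filter(bathy, np.std, size=9)  # local relief / roughness
    gy, gx = np.gradient(bathy)
    slope = np.hypot(gx, gy)                                   # slope magnitude

    features = np.stack([local_std.ravel(), slope.ravel()], axis=1)
    print("Per-cell texture feature matrix:", features.shape)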
GAFFE: a gaze-attentive fixation finding engine.
Rajashekar, U; van der Linde, I; Bovik, A C; Cormack, L K
2008-04-01
The ability to automatically detect visually interesting regions in images has many practical applications, especially in the design of active machine vision and automatic visual surveillance systems. Analysis of the statistics of image features at observers' gaze can provide insights into the mechanisms of fixation selection in humans. Using a foveated analysis framework, we studied the statistics of four low-level local image features: luminance, contrast, and bandpass outputs of both luminance and contrast, and discovered that image patches around human fixations had, on average, higher values of each of these features than image patches selected at random. Contrast-bandpass showed the greatest difference between human and random fixations, followed by luminance-bandpass, RMS contrast, and luminance. Using these measurements, we present a new algorithm that selects image regions as likely candidates for fixation. These regions are shown to correlate well with fixations recorded from human observers.
Multiscale volatility duration characteristics on financial multi-continuum percolation dynamics
NASA Astrophysics Data System (ADS)
Wang, Min; Wang, Jun
A random stock price model based on the multi-continuum percolation system is developed to investigate the nonlinear dynamics of stock price volatility duration, in an attempt to explain various statistical facts found in financial data and to gain a deeper understanding of mechanisms in the financial market. The continuum percolation system is usually referred to as a random coverage process or a Boolean model; it is a member of a class of statistical physics systems. In this paper, the multi-continuum percolation (with different values of radius) is employed to model and reproduce the dispersal of information among the investors. To test the rationality of the proposed model, nonlinear analyses of the return volatility duration series are performed by multifractal detrending moving average analysis and Zipf analysis. The comparative empirical results indicate similar nonlinear behaviors for the proposed model and the actual Chinese stock market.
Quantitative analysis of spatial variability of geotechnical parameters
NASA Astrophysics Data System (ADS)
Fang, Xing
2018-04-01
Geotechnical parameters are the basic parameters of geotechnical engineering design, and they have strong regional characteristics. At the same time, the spatial variability of geotechnical parameters has been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is quantitatively analyzed, and the evaluation of geotechnical parameters and the correlation coefficients between geotechnical parameters are calculated. A residential district of the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 layers of mechanical stratification. The parameters are water content, natural gravity, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and SP index. According to the principle of statistical correlation, the correlation coefficients of the geotechnical parameters are calculated, and from these coefficients the statistical pattern of the geotechnical parameters is obtained.
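A standard geostatistical tool for the spatial variability described here is the empirical semivariogram, which measures how strongly a parameter differs between boreholes as a function of their separation. The Python sketch below computes it for simulated borehole locations and values; the coordinates, the parameter, and the lag bins are all invented for illustration.

    # Empirical semivariogram for a simulated borehole parameter.
    import numpy as np

    rng = np.random.default_rng(11)
    xy = rng.uniform(0, 500, size=(68, 2))                   # hypothetical borehole locations (m)
    z = 0.01 * xy[:, 0] + rng.normal(scale=0.5, size=68)     # e.g., a water-content-like variable

    def empirical_semivariogram(xy, z, bins):
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        sq = 0.5 * (z[:, None] - z[None, :]) ** 2
        iu = np.triu_indices(len(z), k=1)
        d, sq = d[iu], sq[iu]
        return np.array([sq[(d >= lo) & (d < hi)].mean() for lo, hi in zip(bins[:-1], bins[1:])])

    bins = np.linspace(0, 300, 7)                            # lag bins of 50 m
    print("Semivariance per lag bin:", np.round(empirical_semivariogram(xy, z, bins), 3))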
Optimization of factors to obtain cassava starch films with improved mechanical properties
NASA Astrophysics Data System (ADS)
Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle
2017-08-01
In this study, the optimization of the factors that significantly influence the improvement of the mechanical properties of cassava starch films was investigated through a complete 2³ factorial design. The factors analyzed were the cassava starch, glycerol and modified clay contents. A regression model was proposed from the factorial analysis, aiming to estimate the levels of the individual factors investigated at the optimum state of the mechanical properties of the biofilm, using the following statistical tools: the desirability function and the response surface. The response variable that characterizes the improvement of the mechanical property of the biofilm is the tensile strength, and such improvement is obtained by maximizing this response variable. The factorial analysis showed that the best combination of factor settings to reach the best response was 5 g of cassava starch, 10% of glycerol and 5% of modified clay, both percentages relative to the dry mass of starch used. In addition, the starch biofilm showing the lowest response contained 2 g of cassava starch, 0% of modified clay and 30% of glycerol, and was consequently considered the worst biofilm.
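The regression step of a 2³ factorial analysis can be sketched in a few lines: code the factor levels as -1/+1 and fit an intercept plus main effects by least squares. The design matrix and response values below are invented placeholders, not the study's measurements.

    # Main-effects fit for a hypothetical 2^3 full factorial experiment.
    import numpy as np
    from itertools import product

    design = np.array(list(product([-1, 1], repeat=3)), dtype=float)  # 8 runs, coded levels
    tensile = np.array([3.1, 4.0, 2.4, 3.3, 3.8, 5.0, 2.9, 4.1])      # hypothetical MPa values

    X = np.column_stack([np.ones(8), design])        # intercept + starch, glycerol, clay effects
    coef, *_ = np.linalg.lstsq(X, tensile, rcond=None)
    for name, b in zip(["intercept", "starch", "glycerol", "clay"], coef):
        print(f"{name:>9}: {b:+.3f}")

Interaction terms (products of the coded columns) can be appended to X in the same way when the design supports them.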
Kiebish, Michael A.; Yang, Kui; Han, Xianlin; Gross, Richard W.; Chuang, Jeffrey
2012-01-01
The regulation and maintenance of the cellular lipidome through biosynthetic, remodeling, and catabolic mechanisms are critical for biological homeostasis during development, health and disease. These complex mechanisms control the architectures of lipid molecular species, which have diverse yet highly regulated fatty acid chains at both the sn1 and sn2 positions. Phosphatidylcholine (PC) and phosphatidylethanolamine (PE) serve as the predominant biophysical scaffolds in membranes, acting as reservoirs for potent lipid signals and regulating numerous enzymatic processes. Here we report the first rigorous computational dissection of the mechanisms influencing PC and PE molecular architectures from high-throughput shotgun lipidomic data. Using novel statistical approaches, we have analyzed multidimensional mass spectrometry-based shotgun lipidomic data from developmental mouse heart and mature mouse heart, lung, brain, and liver tissues. We show that in PC and PE, sn1 and sn2 positions are largely independent, though for low abundance species regulatory processes may interact with both the sn1 and sn2 chain simultaneously, leading to cooperative effects. Chains with similar biochemical properties appear to be remodeled similarly. We also see that sn2 positions are more regulated than sn1, and that PC exhibits stronger cooperative effects than PE. A key aspect of our work is a novel statistically rigorous approach to determine cooperativity based on a modified Fisher's exact test using Markov Chain Monte Carlo sampling. This computational approach provides a novel tool for developing mechanistic insight into lipidomic regulation. PMID:22662143
Statistical analysis of magnetically soft particles in magnetorheological elastomers
NASA Astrophysics Data System (ADS)
Gundermann, T.; Cremer, P.; Löwen, H.; Menzel, A. M.; Odenbach, S.
2017-04-01
The physical properties of magnetorheological elastomers (MRE) are a complex issue and can be influenced and controlled in many ways, e.g. by applying a magnetic field, by external mechanical stimuli, or by an electric potential. In general, the response of MRE materials to these stimuli is crucially dependent on the distribution of the magnetic particles inside the elastomer. Specific knowledge of the interactions between particles or particle clusters is of high relevance for understanding the macroscopic rheological properties and provides an important input for theoretical calculations. In order to gain a better insight into the correlation between the macroscopic effects and microstructure and to generate a database for theoretical analysis, x-ray micro-computed tomography (X-μCT) investigations as a base for a statistical analysis of the particle configurations were carried out. Different MREs with quantities of 2-15 wt% (0.27-2.3 vol%) of iron powder and different allocations of the particles inside the matrix were prepared. The X-μCT results were edited by an image processing software regarding the geometrical properties of the particles with and without the influence of an external magnetic field. Pair correlation functions for the positions of the particles inside the elastomer were calculated to statistically characterize the distributions of the particles in the samples.
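The pair correlation functions mentioned above can be estimated directly from particle centre positions. The Python sketch below does this for a random (ideal-gas-like) configuration in a periodic box, so g(r) should stay close to 1; the positions, box size and binning are illustrative assumptions, not the tomography data.

    # Radial pair correlation function g(r) for random particle centres in a periodic box.
    import numpy as np

    rng = np.random.default_rng(2)
    L, n = 100.0, 800                                  # box edge (arbitrary units), particle count
    pos = rng.uniform(0, L, size=(n, 3))

    def pair_correlation(pos, box, r_max=20.0, n_bins=40):
        iu = np.triu_indices(len(pos), k=1)
        diff = pos[iu[0]] - pos[iu[1]]
        diff -= box * np.round(diff / box)             # minimum-image convention
        d = np.linalg.norm(diff, axis=1)
        hist, edges = np.histogram(d[d < r_max], bins=n_bins, range=(0, r_max))
        shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
        ideal = 0.5 * len(pos) * (len(pos) / box ** 3) * shell_vol   # expected pair counts
        return 0.5 * (edges[1:] + edges[:-1]), hist / ideal

    r, g = pair_correlation(pos, L)
    print("g(r) near r = 10:", round(float(g[np.argmin(np.abs(r - 10.0))]), 2))

Clustered particle arrangements, such as the chain-like structures that form under a magnetic field, would show up as pronounced peaks in g(r) at the preferred particle spacings.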
A review of failure models for unidirectional ceramic matrix composites under monotonic loads
NASA Technical Reports Server (NTRS)
Tripp, David E.; Hemann, John H.; Gyekenyesi, John P.
1989-01-01
Ceramic matrix composites offer significant potential for improving the performance of turbine engines. In order to achieve their potential, however, improvements in design methodology are needed. In the past most components using structural ceramic matrix composites were designed by trial and error since the emphasis of feasibility demonstration minimized the development of mathematical models. To understand the key parameters controlling response and the mechanics of failure, the development of structural failure models is required. A review of short term failure models with potential for ceramic matrix composite laminates under monotonic loads is presented. Phenomenological, semi-empirical, shear-lag, fracture mechanics, damage mechanics, and statistical models for the fast fracture analysis of continuous fiber unidirectional ceramic matrix composites under monotonic loads are surveyed.
Small Systems and Limitations on the Use of Chemical Thermodynamics
NASA Astrophysics Data System (ADS)
Tovbin, Yu. K.
2018-01-01
Limitations on using chemical thermodynamics to describe small systems are formulated. These limitations follow from statistical mechanics for equilibrium and nonequilibrium processes and reflect (1) differences between characteristic relaxation times in momentum, energy, and mass transfer in different aggregate states of investigated systems; (2) achievements of statistical mechanics that allow us to determine criteria for the size of smallest region in which thermodynamics can be applied and the scale of the emergence of a new phase, along with criteria for the conditions of violating a local equilibrium. Based on this analysis, the main thermodynamic results are clarified: the phase rule for distorted interfaces, the sense and area of applicability of Gibbs's concept of passive forces, and the artificiality of Kelvin's equation as a result of limitations on the thermodynamic approach to considering small bodies. The wrongness of introducing molecular parameters into thermodynamic derivations, and the activity coefficient for an activated complex into the expression for a reaction rate constant, is demonstrated.
Hedden, Sarra L; Woolson, Robert F; Carter, Rickey E; Palesch, Yuko; Upadhyaya, Himanshu P; Malcolm, Robert J
2009-07-01
"Loss to follow-up" can be substantial in substance abuse clinical trials. When extensive losses to follow-up occur, one must cautiously analyze and interpret the findings of a research study. Aims of this project were to introduce the types of missing data mechanisms and describe several methods for analyzing data with loss to follow-up. Furthermore, a simulation study compared Type I error and power of several methods when missing data amount and mechanism varies. Methods compared were the following: Last observation carried forward (LOCF), multiple imputation (MI), modified stratified summary statistics (SSS), and mixed effects models. Results demonstrated nominal Type I error for all methods; power was high for all methods except LOCF. Mixed effect model, modified SSS, and MI are generally recommended for use; however, many methods require that the data are missing at random or missing completely at random (i.e., "ignorable"). If the missing data are presumed to be nonignorable, a sensitivity analysis is recommended.
Learning place cells, grid cells and invariances with excitatory and inhibitory plasticity
2018-01-01
Neurons in the hippocampus and adjacent brain areas show a large diversity in their tuning to location and head direction, and the underlying circuit mechanisms are not yet resolved. In particular, it is unclear why certain cell types are selective to one spatial variable, but invariant to another. For example, place cells are typically invariant to head direction. We propose that all observed spatial tuning patterns – in both their selectivity and their invariance – arise from the same mechanism: Excitatory and inhibitory synaptic plasticity driven by the spatial tuning statistics of synaptic inputs. Using simulations and a mathematical analysis, we show that combined excitatory and inhibitory plasticity can lead to localized, grid-like or invariant activity. Combinations of different input statistics along different spatial dimensions reproduce all major spatial tuning patterns observed in rodents. Our proposed model is robust to changes in parameters, develops patterns on behavioral timescales and makes distinctive experimental predictions. PMID:29465399
Sex differences in mechanical allodynia: how can it be preclinically quantified and analyzed?
Nicotra, Lauren; Tuke, Jonathan; Grace, Peter M.; Rolan, Paul E.; Hutchinson, Mark R.
2014-01-01
Translating promising preclinical drug discoveries to successful clinical trials remains a significant hurdle in pain research. Although animal models have significantly contributed to understanding chronic pain pathophysiology, the majority of research has focused on male rodents using testing procedures that produce sex difference data that do not align well with comparable clinical experiences. Additionally, the use of animal pain models presents ongoing ethical challenges demanding continuing refinement of preclinical methods. To this end, this study sought to test a quantitative allodynia assessment technique and associated statistical analysis in a modified graded nerve injury pain model with the aim to further examine sex differences in allodynia. Graded allodynia was established in male and female Sprague Dawley rats by altering the number of sutures placed around the sciatic nerve and quantified by the von Frey test. Linear mixed effects modeling regressed response on each fixed effect (sex, oestrus cycle, pain treatment). On comparison with other common von Frey assessment techniques, utilizing lower threshold filaments than those ordinarily tested, at 1 s intervals, appropriately and successfully investigated female mechanical allodynia, revealing significant sex and oestrus cycle difference across the graded allodynia that other common behavioral methods were unable to detect. Utilizing this different von Frey approach and graded allodynia model, a single suture inflicting less allodynia was sufficient to demonstrate exaggerated female mechanical allodynia throughout the phases of dioestrus and pro-oestrus. Refining the von Frey testing method, statistical analysis technique and the use of a graded model of chronic pain, allowed for examination of the influences on female mechanical nociception that other von Frey methods cannot provide. PMID:24592221
Vaginal orgasm is associated with less use of immature psychological defense mechanisms.
Brody, Stuart; Costa, Rui Miguel
2008-05-01
Freud implied a link between inability to have a vaginal orgasm and psychosexual immaturity. Since Kinsey, many sexologists have asserted that no such link exists. However, empirical testing of the issue has been lacking. The objective was to determine the relationship between different sexual behavior triggers of female orgasm and use of immature psychological defense mechanisms. Women reported their past month frequency of different sexual behaviors and corresponding orgasm rates and completed the Defense Style Questionnaire (DSQ-40). The association between ability to have vaginal intercourse orgasm (versus clitoral orgasm) and the use of DSQ-40 immature psychological defense mechanisms (associated with various psychopathologies) was examined. In a sample of 94 healthy Portuguese women, vaginal orgasm (triggered solely by penile-vaginal intercourse) was associated with less use of DSQ-40 immature defenses. Vaginal orgasm was associated with less somatization, dissociation, displacement, autistic fantasy, devaluation, and isolation of affect. Orgasm from clitoral stimulation or combined clitoral-intercourse stimulation was not associated with less use of immature defenses, and was associated with more use of some immature defenses. In one regression analysis, more masturbation and less vaginal orgasm consistency made independent contributions to the statistical prediction of immature defenses. In another regression analysis, any use of extrinsic clitoral stimulation for intercourse orgasm, and lack of any vaginal orgasm, made independent contributions to the statistical prediction of immature defenses. Vaginally anorgasmic women had immature defenses scores comparable to those of established (depression, social anxiety disorder, panic disorder, and obsessive-compulsive disorder) outpatient psychiatric groups. Results were not confounded by social desirability responding or relationship quality. The results linking penile-vaginal orgasm with less use of immature psychological defense mechanisms are consistent with both early psychoanalytic personality theory and recent advances in sexual physiology. Implications for diagnosis and sex therapy are noted.
Yin, Jianfei; Hopkins, Carl
2013-04-01
Prediction of structure-borne sound transmission on built-up structures at audio frequencies is well-suited to Statistical Energy Analysis (SEA) although the inclusion of periodic ribbed plates presents challenges. This paper considers an approach using Advanced SEA (ASEA) that can incorporate tunneling mechanisms within a statistical approach. The coupled plates used for the investigation form an L-junction comprising a periodic ribbed plate with symmetric ribs and an isotropic homogeneous plate. Experimental SEA (ESEA) is carried out with input data from Finite Element Methods (FEM). This indicates that indirect coupling is significant at high frequencies where bays on the periodic ribbed plate can be treated as individual subsystems. SEA using coupling loss factors from wave theory leads to significant underestimates in the energy of the bays when the isotropic homogeneous plate is excited. This is due to the absence of tunneling mechanisms in the SEA model. In contrast, ASEA shows close agreement with FEM and laboratory measurements. The errors incurred with SEA rapidly increase as the bays become more distant from the source subsystem. ASEA provides significantly more accurate predictions by accounting for the spatial filtering that leads to non-diffuse vibration fields on these more distant bays.
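For orientation, a minimal sketch of the plain SEA power balance that the paper starts from, with only direct coupling terms (the indirect "tunneling" paths are precisely what ASEA adds). The loss factors and input power below are hypothetical.

```python
import numpy as np

def sea_energies(omega, eta_internal, eta_coupling, power_in):
    """Solve the steady-state SEA power balance  omega * C * E = P_in.

    eta_internal : damping loss factor of each subsystem (length N)
    eta_coupling : eta_coupling[i, j] is the coupling loss factor from i to j
    The matrix C contains only direct coupling terms, so indirect
    ("tunneling") paths are absent by construction.
    """
    eta_internal = np.asarray(eta_internal, dtype=float)
    eta_coupling = np.asarray(eta_coupling, dtype=float)
    C = -eta_coupling.T.copy()                      # off-diagonal terms: -eta_ji
    np.fill_diagonal(C, eta_internal + eta_coupling.sum(axis=1))
    return np.linalg.solve(omega * C, np.asarray(power_in, dtype=float))

# Hypothetical 3-subsystem chain: excited plate -> bay 1 -> bay 2.
omega = 2 * np.pi * 1000.0                          # radian frequency at 1 kHz
eta_int = [0.01, 0.01, 0.01]                        # damping loss factors
eta_cpl = np.array([[0.0, 5e-3, 0.0],               # nearest-neighbour coupling only
                    [3e-3, 0.0, 4e-3],
                    [0.0, 2e-3, 0.0]])
P_in = [1.0, 0.0, 0.0]                              # 1 W injected into subsystem 1

E = sea_energies(omega, eta_int, eta_cpl, P_in)
print("subsystem energies (J):", np.round(E, 6))
```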
On the (In)Validity of Tests of Simple Mediation: Threats and Solutions
Pek, Jolynn; Hoyle, Rick H.
2015-01-01
Mediation analysis is a popular framework for identifying underlying mechanisms in social psychology. In the context of simple mediation, we review and discuss the implications of three facets of mediation analysis: (a) conceptualization of the relations between the variables, (b) statistical approaches, and (c) relevant elements of design. We also highlight the issue of equivalent models that are inherent in simple mediation. The extent to which results are meaningful stem directly from choices regarding these three facets of mediation analysis. We conclude by discussing how mediation analysis can be better applied to examine causal processes, highlight the limits of simple mediation, and make recommendations for better practice. PMID:26985234
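A minimal sketch of the standard simple-mediation estimate (indirect effect a·b with a percentile bootstrap interval) on synthetic data; it illustrates the statistical approach discussed above but does not address the equivalent-model and design issues the authors raise.

```python
import numpy as np

def indirect_effect(x, m, y):
    """a*b estimate for the simple mediation model X -> M -> Y."""
    a = np.polyfit(x, m, 1)[0]                        # slope of M on X
    design = np.column_stack([np.ones_like(x), x, m]) # Y regressed on X and M
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]
    return a * b

rng = np.random.default_rng(0)
n = 300
x = rng.normal(size=n)                  # hypothetical predictor
m = 0.5 * x + rng.normal(size=n)        # mediator
y = 0.4 * m + 0.2 * x + rng.normal(size=n)

point = indirect_effect(x, m, y)
boot = np.array([indirect_effect(x[idx], m[idx], y[idx])
                 for idx in (rng.integers(0, n, n) for _ in range(2000))])
ci = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {point:.3f}, 95% bootstrap CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```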
Alterations of Vertical Jump Mechanics after a Half-Marathon Mountain Running Race
Rousanoglou, Elissavet N.; Noutsos, Konstantinos; Pappas, Achilleas; Bogdanis, Gregory; Vagenas, Georgios; Bayios, Ioannis A.; Boudolos, Konstantinos D.
2016-01-01
The fatiguing effect of long-distance running has been examined in the context of a variety of parameters. However, there is a scarcity of data regarding its effect on vertical jump mechanics. The purpose of this study was to investigate the alterations of countermovement jump (CMJ) mechanics after a half-marathon mountain race. Twenty-seven runners performed CMJs before the race (Pre), immediately after the race (Post 1) and five minutes after Post 1 (Post 2). Instantaneous and ensemble-average analysis focused on jump height and the maximum peaks and time-to-maximum peaks of: Displacement, vertical force (Fz), anterior-posterior force (Fx), Velocity and Power, in the eccentric (tECC) and concentric (tCON) phase of the jump, respectively. Repeated measures ANOVAs were used for statistical analysis (p ≤ 0.05). The jump height decrease was significant in Post 2 (-7.9%) but not in Post 1 (-4.1%). Fx and Velocity decreased significantly in both Post 1 (only in tECC) and Post 2 (both tECC and tCON). A timing shift of the Fz peaks (earlier during tECC and later during tCON) and altered relative peak times (only in tECC) were also observed. Ensemble-average analysis revealed several time intervals of significant post-race alterations and a timing shift in the Fz-Velocity loop. An overall trend of lowered post-race jump output and mechanics was characterised by altered jump timing, restricted anterior-posterior movement and altered force-velocity relations. The specificity of mountain running fatigue to eccentric muscle work appears to be reflected in the different time order of the post-race reductions, with the eccentric phase reductions preceding those of the concentric one. Thus, those who engage in mountain running should particularly consider downhill training to optimise eccentric muscular action. Key points: The 4.1% reduction of jump height immediately after the race is not statistically significant. The eccentric phase alterations of jump mechanics precede those of the concentric ones. Force-velocity alterations present a timing shift rather than a change in force or velocity magnitude. PMID:27274665
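A minimal sketch of a one-way repeated-measures ANOVA of jump height across the three time points, using statsmodels' AnovaRM on synthetic data whose mean changes mimic the percentages quoted above; it is not the authors' dataset or full analysis.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(7)
times = ["Pre", "Post1", "Post2"]
relative_change = {"Pre": 0.0, "Post1": -0.041, "Post2": -0.079}  # changes quoted in the abstract

rows = []
for runner in range(27):
    base = rng.normal(0.32, 0.04)                  # hypothetical jump height (m)
    for t in times:
        rows.append((runner, t, base * (1 + relative_change[t]) + rng.normal(0, 0.015)))
data = pd.DataFrame(rows, columns=["runner", "time", "jump_height"])

# One-way repeated-measures ANOVA with time as the within-subject factor.
result = AnovaRM(data, depvar="jump_height", subject="runner", within=["time"]).fit()
print(result)
```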
NASA Astrophysics Data System (ADS)
Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo
2017-08-01
We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.
Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics
NASA Astrophysics Data System (ADS)
Sugiyama, Masaru
Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated and accordingly the additivity postulate for the thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state. The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader of one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one by Tsallis and Brigatti presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might be completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can uniquely be characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness for generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems.
Finally, Beck presents a novel idea of the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed. Nonextensive statistical mechanics is already a well-studied field, and a number of works are available in the literature. It is recommended that the interested reader visit the URL http: //tsallis.cat.cbpf.br/TEMUCO.pdf. There, one can find a comprehensive list of references to more than one thousand papers including important results that, due to lack of space, have not been mentioned in the present issue. Though there are so many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities not only in the very field of nonextensive statistical mechanics but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who joined up to make this issue. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.
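For readers new to the field, a minimal numerical sketch of the nonadditive (Tsallis) entropy discussed in this issue, including its Boltzmann-Gibbs-Shannon limit as q → 1 and the pseudo-additivity rule for independent subsystems (Boltzmann constant set to 1).

```python
import numpy as np

def tsallis_entropy(p, q):
    """Nonadditive Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k_B = 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))          # Boltzmann-Gibbs-Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.full(8, 1 / 8)                           # uniform distribution over 8 states
for q in (0.5, 0.99, 1.0, 1.5, 2.0):
    print(f"q = {q:4.2f}  S_q = {tsallis_entropy(p, q):.4f}")

# Nonadditivity: for independent subsystems A and B,
# S_q(A+B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B).
pa, pb = np.full(4, 0.25), np.full(2, 0.5)
joint = np.outer(pa, pb).ravel()
q = 1.5
lhs = tsallis_entropy(joint, q)
rhs = (tsallis_entropy(pa, q) + tsallis_entropy(pb, q)
       + (1 - q) * tsallis_entropy(pa, q) * tsallis_entropy(pb, q))
print(np.isclose(lhs, rhs))                     # True
```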
Di Lorenzo, Rosaria; Baraldi, Sara; Ferrara, Maria; Mimmi, Stefano; Rigatelli, Marco
2012-04-01
To analyze physical restraint use in an Italian acute psychiatric ward, where mechanical restraint by belt is highly discouraged but allowed. Data were retrospectively collected from medical and nursing charts, from January 1, 2005, to December 31, 2008. Physical restraint rate and relationships between restraints and selected variables were statistically analyzed. Restraints were statistically significantly more frequent in compulsory or voluntary admissions of patients with an altered state of consciousness, at night, to control aggressive behavior, and in patients with "Schizophrenia and other Psychotic Disorders" during the first 72 hr of hospitalization. Analysis of clinical and organizational factors conditioning restraints may limit its use. © 2011 Wiley Periodicals, Inc.
Optical zone centration: a retrospective analysis of the excimer laser after three years
NASA Astrophysics Data System (ADS)
Vervecken, Filip; Trau, Rene; Mertens, Erik L.; Vanhorenbeeck, R.; Van Aerde, F.; Zen, J.; Haustrate, F.; Tassignon, Marie J.
1996-12-01
The aim of this study was to evaluate the effect of the mechanical factor 'decentration' on the visual outcome after PRK. One hundred eyes of 70 patients were included. The mean decentration was 0.27 +/- 0.18 mm, and decentration was less than 0.5 mm in 84 percent of the cases. The importance of decentration was investigated through the statistical correlation between decentration from the pupil center and the visual outcome. We did not find any statistically significant association for decentrations of less than 1 mm. Our conclusion is that decentration, if less than 1 mm, does not play an important role in the final visual outcome after PRK.
Statistical methods to detect novel genetic variants using publicly available GWAS summary data.
Guo, Bin; Wu, Baolin
2018-03-01
We propose statistical methods to detect novel genetic variants using only genome-wide association studies (GWAS) summary data without access to raw genotype and phenotype data. With more and more summary data being posted for public access in the post GWAS era, the proposed methods are practically very useful to identify additional interesting genetic variants and shed lights on the underlying disease mechanism. We illustrate the utility of our proposed methods with application to GWAS meta-analysis results of fasting glucose from the international MAGIC consortium. We found several novel genome-wide significant loci that are worth further study. Copyright © 2018 Elsevier Ltd. All rights reserved.
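The proposed methods themselves are more elaborate, but the basic summary-statistic ingredients they build on can be sketched as follows: per-variant z-scores and p-values computed from reported effect sizes and standard errors, flagged against the conventional genome-wide threshold. All numbers and SNP names are hypothetical.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical GWAS summary rows: effect size (beta) and standard error per variant.
summary = pd.DataFrame({
    "snp":  ["rs0001", "rs0002", "rs0003", "rs0004"],
    "beta": [0.021, -0.004, 0.055, 0.012],
    "se":   [0.004,  0.006, 0.009, 0.011],
})

summary["z"] = summary.beta / summary.se
summary["p"] = 2 * stats.norm.sf(np.abs(summary.z))   # two-sided p from the z statistic
summary["genome_wide"] = summary.p < 5e-8             # conventional GWAS significance threshold

print(summary)
```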
Souto, R Seoane; Martín-Rodero, A; Yeyati, A Levy
2016-12-23
We analyze the quantum quench dynamics in the formation of a phase-biased superconducting nanojunction. We find that in the absence of an external relaxation mechanism and for very general conditions the system gets trapped in a metastable state, corresponding to a nonequilibrium population of the Andreev bound states. The use of the time-dependent full counting statistics analysis allows us to extract information on the asymptotic population of even and odd many-body states, demonstrating that a universal behavior, dependent only on the Andreev state energy, is reached in the quantum point contact limit. These results shed light on recent experimental observations on quasiparticle trapping in superconducting atomic contacts.
Learning Predictive Statistics: Strategies and Brain Mechanisms.
Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe
2017-08-30
When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to changes in the environment's statistics. We provide evidence for an alternate route for learning complex temporal statistics: extracting the most probable outcome in a given context is implemented by interactions between executive and motor corticostriatal mechanisms compared with visual corticostriatal circuits (including hippocampal cortex) that support learning of the exact temporal statistics. Copyright © 2017 Wang et al.
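A minimal simulation of the two decision strategies contrasted above: "maximizing" (always predict the most probable outcome for the current context) versus "matching" (predict stochastically with the same probabilities as the source). The context probabilities are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
p_context = {"A": [0.8, 0.2], "B": [0.3, 0.7]}    # hypothetical outcome probabilities per context

contexts = rng.choice(list(p_context), size=5000)
outcomes = np.array([rng.choice(2, p=p_context[c]) for c in contexts])

# Maximizing: always predict the most probable outcome for the current context.
max_pred = np.array([int(np.argmax(p_context[c])) for c in contexts])

# Matching: predict stochastically with the same probabilities as the source.
match_pred = np.array([rng.choice(2, p=p_context[c]) for c in contexts])

print("maximizing accuracy:", (max_pred == outcomes).mean())    # expected ~0.75
print("matching accuracy:  ", (match_pred == outcomes).mean())  # expected ~0.63
```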
Portraits of self-organization in fish schools interacting with robots
NASA Astrophysics Data System (ADS)
Aureli, M.; Fiorilli, F.; Porfiri, M.
2012-05-01
In this paper, we propose an enabling computational and theoretical framework for the analysis of experimental instances of collective behavior in response to external stimuli. In particular, this work addresses the characterization of aggregation and interaction phenomena in robot-animal groups through the exemplary analysis of fish schooling in the vicinity of a biomimetic robot. We adapt global observables from statistical mechanics to capture the main features of the shoal collective motion and its response to the robot from experimental observations. We investigate the shoal behavior by using a diffusion mapping analysis performed on these global observables that also informs the definition of relevant portraits of self-organization.
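The paper adapts global observables from statistical mechanics without committing to the specific choices used here; as an assumption, two observables commonly used for collective motion, polarization and a milling (rotation) index, are computed below for synthetic positions and velocities.

```python
import numpy as np

def polarization(velocities):
    """|< v_i / |v_i| >| : close to 1 for an aligned school, close to 0 for disorder."""
    unit = velocities / np.linalg.norm(velocities, axis=1, keepdims=True)
    return np.linalg.norm(unit.mean(axis=0))

def milling(positions, velocities):
    """Normalized angular momentum about the group centroid (2D)."""
    r = positions - positions.mean(axis=0)
    unit_v = velocities / np.linalg.norm(velocities, axis=1, keepdims=True)
    unit_r = r / np.linalg.norm(r, axis=1, keepdims=True)
    cross = unit_r[:, 0] * unit_v[:, 1] - unit_r[:, 1] * unit_v[:, 0]
    return abs(cross.mean())

rng = np.random.default_rng(0)
pos = rng.uniform(0, 1, size=(30, 2))                               # hypothetical fish positions
vel = np.tile([1.0, 0.1], (30, 1)) + rng.normal(0, 0.05, (30, 2))   # mostly aligned swimming
print("polarization:", round(polarization(vel), 3))                 # close to 1
print("milling index:", round(milling(pos, vel), 3))                # close to 0
```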
Six new mechanics corresponding to further shape theories
NASA Astrophysics Data System (ADS)
Anderson, Edward
2016-02-01
In this paper, a suite of relational notions of shape is presented at the level of configuration space geometry, with corresponding new theories of shape mechanics and shape statistics. These further generalize two quite well-known examples: (i) Kendall’s (metric) shape space with his shape statistics and Barbour’s mechanics thereupon. (ii) Leibnizian relational space alias metric scale-and-shape space, to which corresponds Barbour-Bertotti mechanics. This paper’s new theories include, using the invariant and group namings, (iii) Angle alias conformal shape mechanics. (iv) Area ratio alias e shape mechanics. (v) Area alias e scale-and-shape mechanics. (iii)-(v) rest respectively on angle space, area-ratio space, and area space configuration spaces. Probability and statistics applications are also pointed to in outline. (vi) Various supersymmetric counterparts of (i)-(v) are considered. Since supergravity differs considerably from GR-based conceptions of background independence, some of the new supersymmetric shape mechanics are compared with both. These reveal compatibility between supersymmetry and GR-based conceptions of background independence, at least within these simpler model arenas.
Sidhu, Meneka Kaur; Duncan, John S; Sander, Josemir W
2018-05-17
Epilepsy neuroimaging is important for detecting the seizure onset zone, predicting and preventing deficits from surgery and illuminating mechanisms of epileptogenesis. An aspiration is to integrate imaging and genetic biomarkers to enable personalized epilepsy treatments. The ability to detect lesions, particularly focal cortical dysplasia and hippocampal sclerosis, is increased using ultra high-field imaging and postprocessing techniques such as automated volumetry, T2 relaxometry, voxel-based morphometry and surface-based techniques. Statistical analysis of PET and single photon emission computer tomography (STATISCOM) are superior to qualitative analysis alone in identifying focal abnormalities in MRI-negative patients. These methods have also been used to study mechanisms of epileptogenesis and pharmacoresistance.Recent language fMRI studies aim to localize, and also lateralize language functions. Memory fMRI has been recommended to lateralize mnemonic function and predict outcome after surgery in temporal lobe epilepsy. Combinations of structural, functional and post-processing methods have been used in multimodal and machine learning models to improve the identification of the seizure onset zone and increase understanding of mechanisms underlying structural and functional aberrations in epilepsy.
Astephen, J L; Deluzio, K J
2005-02-01
Osteoarthritis of the knee is related to many correlated mechanical factors that can be measured with gait analysis. Gait analysis results in large data sets. The analysis of these data is difficult due to the correlated, multidimensional nature of the measures. A multidimensional model that uses two multivariate statistical techniques, principal component analysis and discriminant analysis, was used to discriminate between the gait patterns of the normal subject group and the osteoarthritis subject group. Nine time varying gait measures and eight discrete measures were included in the analysis. All interrelationships between and within the measures were retained in the analysis. The multidimensional analysis technique successfully separated the gait patterns of normal and knee osteoarthritis subjects with a misclassification error rate of <6%. The most discriminatory feature described a static and dynamic alignment factor. The second most discriminatory feature described a gait pattern change during the loading response phase of the gait cycle. The interrelationships between gait measures and between the time instants of the gait cycle can provide insight into the mechanical mechanisms of pathologies such as knee osteoarthritis. These results suggest that changes in frontal plane loading and alignment and the loading response phase of the gait cycle are characteristic of severe knee osteoarthritis gait patterns. Subsequent investigations earlier in the disease process may suggest the importance of these factors to the progression of knee osteoarthritis.
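A minimal sketch of the two-stage approach described above, principal component analysis followed by discriminant analysis, applied to synthetic gait waveforms with a group difference concentrated in the loading-response phase; the waveform shapes, sample sizes, and misclassification rate are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(11)

# Hypothetical gait waveforms: 50 normal and 50 osteoarthritic subjects, one
# gait measure sampled at 101 points of the gait cycle, with a group offset
# concentrated in the loading-response phase (first ~15% of the cycle).
t = np.linspace(0, 1, 101)
base = np.sin(2 * np.pi * t)
normal = base + rng.normal(0, 0.3, size=(50, 101))
oa = base + 0.6 * np.exp(-((t - 0.1) / 0.05) ** 2) + rng.normal(0, 0.3, size=(50, 101))

X = np.vstack([normal, oa])
y = np.array([0] * 50 + [1] * 50)

model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
accuracy = cross_val_score(model, X, y, cv=5).mean()
print(f"cross-validated classification accuracy: {accuracy:.2f}")
```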
Capture approximations beyond a statistical quantum mechanical method for atom-diatom reactions
NASA Astrophysics Data System (ADS)
Barrios, Lizandra; Rubayo-Soneira, Jesús; González-Lezana, Tomás
2016-03-01
Statistical techniques constitute useful approaches to investigate atom-diatom reactions mediated by insertion dynamics which involves complex-forming mechanisms. Different capture schemes based on energy considerations regarding the specific diatom rovibrational states are suggested to evaluate the corresponding probabilities of formation of such collision species between reactants and products in an attempt to test reliable alternatives for computationally demanding processes. These approximations are tested in combination with a statistical quantum mechanical method for the S + H2(v = 0 ,j = 1) → SH + H and Si + O2(v = 0 ,j = 1) → SiO + O reactions, where this dynamical mechanism plays a significant role, in order to probe their validity.
Prevention of the Posttraumatic Fibrotic Response in Joints
2015-10-01
used on a regular basis. Major Task 4: Evaluating the efficacy of inhibitory chIgG to reduce the consequences of traumatic joint injury. During...the second year of study, we successfully employed all assays needed to evaluate the utility of the inhibitory antibody to reduce the flexion...1. Major Task 5: Task 4. Data analysis and statistical evaluation of results. All data from the mechanical measurements, from the biochemical
1994-06-30
tip Opening Displacement (CTOD) Fracture Toughness Measurement". 48 The method has found application in the elastic-plastic fracture mechanics ( EPFM ...68 6.1 Proposed Material Property Database Format and Hierarchy .............. 68 6.2 Sample Application of the Material Property Database...the E 49.05 sub-committee. The relevant quality indicators applicable to the present program are: source of data, statistical basis of data
Targeted Riluzole Delivery by Antioxidant Nanovectors for Treating Amyotrophic Lateral Sclerosis
2015-06-01
neuronal marker ( choline acetyltransferase) and quantified image analysis. Motoneurons were counted in the anterior horn region of the lumbar spinal...cord (both sides , then averaged). We do not detect a statistical difference in surviving motoneurons between PEG-HCC and vehicle-treated subjects...beyond this particular funding mechanism in order to better develop PEG-HCCs as a novel and effective treatment for ALS. What was the impact on other
Gillespie, Paddy; O'Shea, Eamon; Smith, Susan M; Cupples, Margaret E; Murphy, Andrew W
2016-12-01
Data on health care utilization may be collected using a variety of mechanisms within research studies, each of which may have implications for cost and cost effectiveness. The aim of this observational study is to compare data collected from medical records searches and self-report questionnaires for the cost analysis of a cardiac secondary prevention intervention. Secondary data analysis of the Secondary Prevention of Heart Disease in General Practice (SPHERE) randomized controlled trial (RCT). Resource use data for a range of health care services were collected by research nurse searches of medical records and self-report questionnaires and costs of care estimated for each data collection mechanism. A series of statistical analyses were conducted to compare the mean costs for medical records data versus questionnaire data and to conduct incremental analyses for the intervention and control arms in the trial. Data were available to estimate costs for 95% of patients in the intervention and 96% of patients in the control using the medical records data compared to 65% and 66%, respectively, using the questionnaire data. The incremental analysis revealed a statistically significant difference in mean cost of -€796 (95% CI: -1447, -144; P-value: 0.017) for the intervention relative to the control. This compared to no significant difference in mean cost (95% CI: -1446, 860; P-value: 0.619) for the questionnaire analysis. Our findings illustrate the importance of the choice of health care utilization data collection mechanism for the conduct of economic evaluation alongside randomized trials in primary care. This choice will have implications for the costing methodology employed and potentially, for the cost and cost effectiveness outcomes generated. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
El-Malah, Yasser; Nazzal, Sami
2013-01-01
The objective of this work was to study the dissolution and mechanical properties of fast-dissolving films prepared from a tertiary mixture of pullulan, polyvinylpyrrolidone and hypromellose. Disintegration studies were performed in real time by probe spectroscopy to detect the onset of film disintegration. Tensile strength and elastic modulus of the films were measured by texture analysis. Disintegration time of the films ranged from 21 to 105 seconds, whereas their mechanical properties ranged from approximately 2 to 49 MPa for tensile strength and 1 to 21 MPa% for Young's modulus. After generating polynomial models correlating the variables using a D-Optimal mixture design, an optimal formulation with the desired responses was proposed by the statistical package. For validation, a new film formulation loaded with diclofenac sodium based on the optimized composition was prepared and tested for dissolution and tensile strength. Dissolution of the optimized film was found to commence almost immediately, with 50% of the drug released within one minute. Tensile strength and Young's modulus of the film were 11.21 MPa and 6.78 MPa%, respectively. Real-time spectroscopy in conjunction with statistical design was shown to be very efficient for the optimization and development of non-conventional intraoral delivery systems such as fast-dissolving films.
Senior Computational Scientist | Center for Cancer Research
The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanism regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab’s further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES
The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing biostatistical design, analysis and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have:
- 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training
- Considerable experience with statistical software such as SAS, R and S-Plus
- Sound knowledge and demonstrated experience of theoretical and applied statistics
- The ability to write program code to analyze data using statistical analysis software
- The ability to contribute to the interpretation and publication of research results
The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics
NASA Astrophysics Data System (ADS)
Pavlos, George
2015-04-01
As the solar plasma lives far from equilibrium it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non extensive statistical mechanics as concerns their applications at solar plasma dynamics, especially at sunspot, solar flare and solar wind phenomena. Generally, when a physical system is driven far from equilibrium states some novel characteristics can be observed related to the nonlinear character of dynamics. Generally, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004) the complex character of the space plasma system includes the existence of non-equilibrium (quasi)-stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann - Gibbs (BG) entropy, to describe far from equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general, was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015). Our study includes the analysis of solar plasma time series at three cases: sunspot index, solar flare and solar wind data. The non-linear analysis of the sunspot index is embedded in the non-extensive statistical theory of Tsallis (1988; 2004; 2009). The q-triplet of Tsallis, as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the SVD components of the sunspot index timeseries. Also the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000, 2001). 
Our analysis showed clearly the following: (a) a phase transition process in the solar dynamics from a high dimensional non-Gaussian SOC state to a low dimensional non-Gaussian chaotic state, (b) strong intermittent solar turbulence and an anomalous (multifractal) diffusion solar process, which is strengthened as the solar dynamics makes a phase transition to low dimensional chaos in accordance with the studies of Ruzmaikin, Zelenyi and Milovanov (Zelenyi and Milovanov, 1991; Milovanov and Zelenyi, 1993; Ruzmaikin et al., 1996), (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of: (i) the non-Gaussian probability distribution function P(x), (ii) the multifractal scaling exponent spectrum f(a) and generalized Renyi dimension spectrum D(q), (iii) the exponent spectrum J(p) of the structure functions estimated for the sunspot index and its underlying non-equilibrium solar dynamics. Also, the q-triplet of Tsallis as well as the correlation dimension and the Lyapunov exponent spectrum were estimated for the singular value decomposition (SVD) components of the solar flares timeseries. Also, the multifractal scaling exponent spectrum f(a), the generalized Renyi dimension spectrum D(q) and the spectrum J(p) of the structure function exponents were estimated experimentally and theoretically by using the q-entropy principle included in Tsallis non-extensive statistical theory, following Arimitsu and Arimitsu (2000). Our analysis showed clearly the following: (a) a phase transition process in the solar flare dynamics from a high dimensional non-Gaussian self-organized critical (SOC) state to a low dimensional, also non-Gaussian, chaotic state, (b) strong intermittent solar corona turbulence and an anomalous (multifractal) diffusion solar corona process, which is strengthened as the solar corona dynamics makes a phase transition to low dimensional chaos, (c) faithful agreement of Tsallis non-equilibrium statistical theory with the experimental estimations of the functions: (i) the non-Gaussian probability distribution function P(x), (ii) f(a) and D(q), and (iii) J(p) for the solar flares timeseries and its underlying non-equilibrium solar dynamics, and (d) the solar flare dynamical profile is revealed to be similar to the dynamical profile of the solar corona zone as far as the phase transition process from self-organized criticality (SOC) to a chaotic state is concerned. However, the solar low corona (solar flare) dynamical characteristics can be clearly discriminated from the dynamical characteristics of the solar convection zone. Finally, we present novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which can take place in the solar wind plasma system. The solar wind plasma, as well as the entire solar plasma system, is a typical case of a stochastic spatiotemporal distribution of physical state variables such as force fields and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level.
It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy tails probability distribution functions, which are related to the q-extension of CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992). References 1. T. Arimitsu, N. Arimitsu, Tsallis statistics and fully developed turbulence, J. Phys. A: Math. Gen. 33 (2000) L235. 2. T. Arimitsu, N. Arimitsu, Analysis of turbulence by statistics based on generalized entropies, Physica A 295 (2001) 177-194. 3. T. Chang, Low-dimensional behavior and symmetry braking of stochastic systems near criticality can these effects be observed in space and in the laboratory, IEEE 20 (6) (1992) 691-694. 4. U. Frisch, Turbulence, Cambridge University Press, Cambridge, UK, 1996, p. 310. 5. L.P. Karakatsanis, G.P. Pavlos, M.N. Xenakis, Tsallis non-extensive statistics, intermittent turbulence, SOC and chaos in the solar plasma. Part two: Solar flares dynamics, Physica A 392 (2013) 3920-3944. 6. A.V. Milovanov, Topological proof for the Alexander-Orbach conjecture, Phys. Rev. E 56 (3) (1997) 2437-2446. 7. A.V. Milovanov, L.M. Zelenyi, Fracton excitations as a driving mechanism for the self-organized dynamical structuring in the solar wind, Astrophys. Space Sci. 264 (1-4) (1999) 317-345. 8. A.V. Milovanov, Stochastic dynamics from the fractional Fokker-Planck-Kolmogorov equation: large-scale behavior of the turbulent transport coefficient, Phys. Rev. E 63 (2001) 047301. 9. G.P. Pavlos, et al., Universality of non-extensive Tsallis statistics and time series analysis: Theory and applications, Physica A 395 (2014) 58-95. 10. G.P. Pavlos, et al., Tsallis non-extensive statistics and solar wind plasma complexity, Physica A 422 (2015) 113-135. 11. A.A. Ruzmaikin, et al., Spectral properties of solar convection and diffusion, ApJ 471 (1996) 1022. 12. V.E. Tarasov, Review of some promising fractional physical models, Internat. J. Modern Phys. B 27 (9) (2013) 1330005. 13. C. Tsallis, Possible generalization of BG statistics, J. Stat. Phys. J 52 (1-2) (1988) 479-487. 14. C. Tsallis, Nonextensive statistical mechanics: construction and physical interpretation, in: G.M. Murray, C. Tsallis (Eds.), Nonextensive Entropy-Interdisciplinary Applications, Oxford Univ. Press, 2004, pp. 1-53. 15. C. Tsallis, Introduction to Non-Extensive Statistical Mechanics, Springer, 2009. 16. G.M. Zaslavsky, Chaos, fractional kinetics, and anomalous transport, Physics Reports 371 (2002) 461-580. 17. L.M. Zelenyi, A.V. Milovanov, Fractal properties of sunspots, Sov. Astron. Lett. 17 (6) (1991) 425. 18. L.M. Zelenyi, A.V. 
Milovanov, Fractal topology and strange kinetics: from percolation theory to problems in cosmic electrodynamics, Phys.-Usp. 47 (8), (2004) 749-788.
Econophysical visualization of Adam Smith’s invisible hand
NASA Astrophysics Data System (ADS)
Cohen, Morrel H.; Eliazar, Iddo I.
2013-02-01
Consider a complex system whose macrostate is statistically observable, yet whose operating mechanism is an unknown black box. In this paper we address the problem of inferring, from the system’s macrostate statistics, the system’s intrinsic force yielding the observed statistics. The inference is established via two diametrically opposite approaches which result in the very same intrinsic force: a top-down approach based on the notion of entropy, and a bottom-up approach based on the notion of Langevin dynamics. The general results established are applied to the problem of visualizing the intrinsic socioeconomic force (Adam Smith’s invisible hand) shaping the distribution of wealth in human societies. Our analysis yields quantitative econophysical representations of figurative socioeconomic forces, quantitative definitions of “poor” and “rich”, and a quantitative characterization of the “poor-get-poorer” and the “rich-get-richer” phenomena.
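A minimal sketch of the bottom-up (Langevin) reading of this kind of inference: with a unit diffusion coefficient, the intrinsic force can be read off as the gradient of the log of the observed density. The "wealth" sample below is synthetic (exponential), not empirical data, and the kernel-density estimator is an assumption of this sketch.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical "wealth" sample drawn from an exponential distribution.
rng = np.random.default_rng(5)
wealth = rng.exponential(scale=1.0, size=5000)

# Estimate the observed macrostate statistics p(x), then read off the
# intrinsic force as the gradient of log p (overdamped Langevin picture,
# diffusion coefficient set to 1).
kde = gaussian_kde(wealth)
x = np.linspace(0.05, 5.0, 200)
log_p = np.log(kde(x))
force = np.gradient(log_p, x)

# For an exponential density p(x) ~ exp(-x), the inferred force should be
# roughly constant and equal to -1 (a uniform pull toward lower wealth).
print("mean inferred force:", round(force[(x > 0.5) & (x < 3)].mean(), 2))
```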
Statistical patterns of visual search for hidden objects
Credidio, Heitor F.; Teixeira, Elisângela N.; Reis, Saulo D. S.; Moreira, André A.; Andrade Jr, José S.
2012-01-01
The movement of the eyes has been the subject of intensive research as a way to elucidate inner mechanisms of cognitive processes. A cognitive task that is rather frequent in our daily life is the visual search for hidden objects. Here we investigate through eye-tracking experiments the statistical properties associated with the search of target images embedded in a landscape of distractors. Specifically, our results show that the twofold process of eye movement, composed of sequences of fixations (small steps) intercalated by saccades (longer jumps), displays characteristic statistical signatures. While the saccadic jumps follow a log-normal distribution of distances, which is typical of multiplicative processes, the lengths of the smaller steps in the fixation trajectories are consistent with a power-law distribution. Moreover, the present analysis reveals a clear transition between a directional serial search to an isotropic random movement as the difficulty level of the searching task is increased. PMID:23226829
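A minimal sketch of the two distributional fits mentioned above: a log-normal fit to synthetic saccade amplitudes and a maximum-likelihood (Hill-type) power-law exponent for synthetic fixation step lengths. The data are simulated, not eye-tracking recordings, and the cut-off value is arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Hypothetical step lengths (in degrees of visual angle).
saccades = rng.lognormal(mean=1.0, sigma=0.5, size=3000)     # longer jumps
fixation_steps = (rng.pareto(a=1.8, size=3000) + 1) * 0.05   # small power-law steps

# Log-normal fit to the saccadic amplitudes (location fixed at zero).
shape, loc, scale = stats.lognorm.fit(saccades, floc=0)
print("log-normal sigma and median:", round(shape, 2), round(scale, 2))

# Maximum-likelihood power-law exponent for the fixation steps
# (Hill estimator with a lower cut-off x_min).
x_min = 0.05
x = fixation_steps[fixation_steps >= x_min]
alpha = 1 + len(x) / np.sum(np.log(x / x_min))
print("power-law exponent alpha:", round(alpha, 2))
```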
Statistical mechanics in the context of special relativity.
Kaniadakis, G
2002-11-01
In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function $\exp_\kappa(x)=\left(\sqrt{1+\kappa^2 x^2}+\kappa x\right)^{1/\kappa}$, a statistical mechanics has been constructed which reduces to the ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter $\kappa$ approaches zero. The distribution $f=\exp_\kappa(-\beta E+\beta\mu)$ obtained within this statistical mechanics shows a power-law tail and depends on the nonspecified parameter $\beta$, containing all the information about the temperature of the system. On the other hand, the entropic form $S_\kappa=\int d^3p\,\left(c_\kappa f^{1+\kappa}+c_{-\kappa} f^{1-\kappa}\right)$, which after maximization produces the distribution $f$ and reduces to the standard Boltzmann-Shannon entropy $S_0$ as $\kappa\to 0$, contains the coefficient $c_\kappa$ whose expression involves, beside the Boltzmann constant, another nonspecified parameter $\alpha$. In the present effort we show that $S_\kappa$ is the unique existing entropy obtained by a continuous deformation of $S_0$ that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of $S_\kappa$ make it possible to determine unequivocally the values of the above-mentioned parameters $\beta$ and $\alpha$. Subsequently, we explain the origin of the deformation mechanism introduced by $\kappa$ and show that this deformation emerges naturally within Einstein's special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. Then, we show that it is possible to determine, in a self-consistent scheme within special relativity, the value of the free parameter $\kappa$, which turns out to depend on the light speed $c$ and reduces to zero as $c\to\infty$, recovering in this way the ordinary statistical mechanics and thermodynamics. The statistical mechanics presented here does not contain free parameters, preserves unaltered the mathematical and epistemological structure of the ordinary statistical mechanics, and is suitable to describe a very large class of experimentally observed phenomena in low and high energy physics and in natural, economic, and social sciences. Finally, in order to test the correctness and predictability of the theory, as a working example we consider the cosmic ray spectrum, which spans 13 decades in energy and 33 decades in flux, finding high quality agreement between our predictions and observed data.
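A minimal numerical sketch of the κ-deformed exponential defined above, showing the recovery of the ordinary exponential as κ → 0 and the power-law behaviour at large argument that underlies the heavy-tailed distributions in the abstract.

```python
import numpy as np

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential: (sqrt(1 + kappa^2 x^2) + kappa x)**(1/kappa)."""
    x = np.asarray(x, dtype=float)
    if np.isclose(kappa, 0.0):
        return np.exp(x)                          # ordinary exponential recovered as kappa -> 0
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

x = np.array([-2.0, 0.0, 2.0, 10.0])
for kappa in (0.0, 0.1, 0.5):
    print(f"kappa = {kappa:3.1f}:", np.round(exp_kappa(x, kappa), 4))

# Power-law tail: for large x, exp_kappa(x) behaves like (2*kappa*x)**(1/kappa).
print(exp_kappa(1e3, 0.5), (2 * 0.5 * 1e3) ** (1 / 0.5))
```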
Saffran, Jenny R.; Kirkham, Natasha Z.
2017-01-01
Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812
Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model
NASA Astrophysics Data System (ADS)
Yuan, Zhongda; Deng, Junxiang; Wang, Dawei
2018-02-01
An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. Until now, only the two-parameter and three-parameter Weibull distribution models have been widely used. Because engine failure modes are diverse, a single Weibull distribution model can carry a large error; by contrast, a mixed Weibull distribution model can take a variety of failure modes into account, making it a better statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation-coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimate more accurate, thereby greatly improving the precision of the mixed-distribution reliability model. All of this helps to popularize the mixed Weibull distribution model in engineering applications.
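A minimal sketch of a two-component mixed Weibull fit by direct maximum likelihood on synthetic failure times, using a fixed mixing weight rather than the dynamic weight coefficient or correlation-coefficient optimization discussed above; component parameters, sample sizes, and starting values are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize

def mixture_neg_loglik(params, t):
    """Negative log-likelihood of a two-component Weibull mixture."""
    w, b1, e1, b2, e2 = params
    pdf = lambda t, b, e: (b / e) * (t / e) ** (b - 1) * np.exp(-(t / e) ** b)
    mix = w * pdf(t, b1, e1) + (1 - w) * pdf(t, b2, e2)
    return -np.sum(np.log(mix + 1e-300))

rng = np.random.default_rng(4)
# Hypothetical failure times from two failure modes (early wear-in plus wear-out).
t = np.concatenate([
    600.0 * rng.weibull(1.2, size=120),      # mode 1: shape 1.2, scale 600 h
    2500.0 * rng.weibull(3.0, size=280),     # mode 2: shape 3.0, scale 2500 h
])

res = minimize(
    mixture_neg_loglik, x0=[0.5, 1.0, 500.0, 2.5, 2000.0], args=(t,),
    bounds=[(0.05, 0.95), (0.3, 8), (50, 1e4), (0.3, 8), (50, 1e4)],
    method="L-BFGS-B",
)
w, b1, e1, b2, e2 = res.x
print(f"weight {w:.2f}; mode 1: beta={b1:.2f}, eta={e1:.0f} h; mode 2: beta={b2:.2f}, eta={e2:.0f} h")

# Mixture reliability at 1000 h.
R = w * np.exp(-(1000 / e1) ** b1) + (1 - w) * np.exp(-(1000 / e2) ** b2)
print("R(1000 h) =", round(R, 3))
```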
Patel, Dishant; Bashetty, Kusum; Srirekha, A.; Archana, S.; Savitha, B.; Vijay, R.
2016-01-01
Aim: The aim of this study was to evaluate the influence of manual versus mechanical glide path (GP) preparation on the surface changes of two different nickel-titanium rotary instruments used during root canal therapy in moderately curved root canals. Materials and Methods: Sixty systemically healthy controls were selected for the study and divided randomly into four groups: Group 1: manual GP followed by RaCe rotary instruments; Group 2: manual GP followed by HyFlex rotary instruments; Group 3: mechanical GP followed by RaCe rotary instruments; Group 4: mechanical GP followed by HyFlex rotary instruments. After access opening, the GP was prepared and the rotary instruments were used according to the manufacturer's instructions. All instruments were evaluated for defects under a scanning electron microscope before use and after a single use, and scores were assigned at the apical and middle thirds of the files. Statistical Analysis Used: The Chi-squared test was used. Results: There was no statistically significant difference between any of the groups. Irrespective of the GP and rotary files used, more defects were present in the apical third than in the middle third of the rotary instruments. Conclusion: Within the limitations of this study, it can be concluded that manual versus mechanical GP preparation had no effect on the surface defects of the subsequently used rotary file system. PMID:27994317
Assessment of the mechanics of a tissue-engineered rat trachea in an image-processing environment.
Silva, Thiago Henrique Gomes da; Pazetti, Rogerio; Aoki, Fabio Gava; Cardoso, Paulo Francisco Guerreiro; Valenga, Marcelo Henrique; Deffune, Elenice; Evaristo, Thaiane; Pêgo-Fernandes, Paulo Manuel; Moriya, Henrique Takachi
2014-07-01
Despite the recent success regarding the transplantation of tissue-engineered airways, the mechanical properties of these grafts are not well understood. Mechanical assessment of a tissue-engineered airway graft before implantation may be used in the future as a predictor of function. The aim of this preliminary work was to develop a noninvasive image-processing environment for the assessment of airway mechanics. Decellularized, recellularized and normal tracheas (groups DECEL, RECEL, and CONTROL, respectively) immersed in Krebs-Henseleit solution were ventilated by a small-animal ventilator connected to a Fleisch pneumotachograph and two pressure transducers (differential and gauge). A camera connected to a stereomicroscope captured images of the pulsation of the trachea before instillation of saline solution and after instillation of Krebs-Henseleit solution, followed by instillation with Krebs-Henseleit with methacholine 0.1 M (protocols A, K and KMCh, respectively). The data were post-processed with computer software and statistical comparisons between groups and protocols were performed. There were statistically significant variations in the image measurements of the medial region of the trachea between the groups (two-way analysis of variance [ANOVA], p<0.01) and of the proximal region between the groups and protocols (two-way ANOVA, p<0.01). The technique developed in this study is an innovative method for performing a mechanical assessment of engineered tracheal grafts that will enable evaluation of the viscoelastic properties of neo-tracheas prior to transplantation.
NASA Astrophysics Data System (ADS)
Abid, Najmul; Mirkhalaf, Mohammad; Barthelat, Francois
2018-03-01
Natural materials such as nacre, collagen, and spider silk are composed of staggered stiff and strong inclusions in a softer matrix. This type of hybrid microstructure results in remarkable combinations of stiffness, strength, and toughness and it now inspires novel classes of high-performance composites. However, the analytical and numerical approaches used to predict and optimize the mechanics of staggered composites often neglect statistical variations and inhomogeneities, which may have significant impacts on modulus, strength, and toughness. Here we present an analysis of localization using small representative volume elements (RVEs) and large scale statistical volume elements (SVEs) based on the discrete element method (DEM). DEM is an efficient numerical method which enabled the evaluation of more than 10,000 microstructures in this study, each including about 5,000 inclusions. The models explore the combined effects of statistics, inclusion arrangement, and interface properties. We find that statistical variations have a negative effect on all properties, in particular on the ductility and energy absorption because randomness precipitates the localization of deformations. However, the results also show that the negative effects of random microstructures can be offset by interfaces with large strain at failure accompanied by strain hardening. More specifically, this quantitative study reveals an optimal range of interface properties where the interfaces are the most effective at delaying localization. These findings show how carefully designed interfaces in bioinspired staggered composites can offset the negative effects of microstructural randomness, which is inherent to most current fabrication methods.
Nonplanar KdV and KP equations for quantum electron-positron-ion plasma
NASA Astrophysics Data System (ADS)
Dutta, Debjit
2015-12-01
Nonlinear quantum ion-acoustic waves with the effects of nonplanar cylindrical geometry, quantum corrections, and transverse perturbations are studied. By using the standard reductive perturbation technique, a cylindrical Kadomtsev-Petviashvili equation for ion-acoustic waves is derived by incorporating quantum-mechanical effects. The quantum-mechanical effects via quantum diffraction and quantum statistics and the role of transverse perturbations in cylindrical geometry on the dynamics of this wave are studied analytically. It is found that the dynamics of ion-acoustic solitary waves (IASWs) is governed by a three-dimensional cylindrical Kadomtsev-Petviashvili equation (CKPE). The results could help in a theoretical analysis of astrophysical and laser produced plasmas.
Moral foundations in an interacting neural networks society: A statistical mechanics analysis
NASA Astrophysics Data System (ADS)
Vicente, R.; Susemihl, A.; Jericó, J. P.; Caticha, N.
2014-04-01
The moral foundations theory supports the idea that people, across cultures, tend to consider a small number of dimensions when classifying issues on a moral basis. The data also show that the statistics of the weights attributed to each moral dimension are related to self-declared political affiliation, which in turn has been connected to cognitive learning styles in the recent neuroscience and psychology literature. Inspired by these data, we propose a simple statistical mechanics model of interacting neural networks that classify vectors and learn, from members of their social neighbourhood, their average opinion on a large set of issues. The purpose of learning is to reduce dissension among agents when they disagree. We consider a family of learning algorithms parametrized by δ, which represents the importance given to corroborating (same-sign) opinions. We define an order parameter that quantifies the diversity of opinions in a group with a homogeneous learning style. Using Monte Carlo simulations and a mean-field approximation we find the relation between the order parameter and the learning parameter δ at a temperature we associate with the importance of social influence in a given group. In concordance with the data, groups that rely more strongly on corroborating evidence sustain less opinion diversity. We discuss predictions of the model and propose possible experimental tests.
Atmospheric Convective Organization: Self-Organized Criticality or Homeostasis?
NASA Astrophysics Data System (ADS)
Yano, Jun-Ichi
2015-04-01
Atmospheric convection has a tendency to organize on a hierarchy of scales ranging from the mesoscale to planetary scales, the latter manifested especially by the Madden-Julian oscillation. The present talk examines two major possible mechanisms of self-organization identified in the wider literature from a phenomenological thermodynamic point of view, by analysing a planetary-scale cloud-resolving model simulation. The first mechanism is self-organized criticality: a saturation tendency of the precipitation rate with increasing column-integrated water, reminiscent of critical phenomena, indicates self-organized criticality. The second is a self-regulation mechanism known as homeostasis in biology. A thermodynamic argument suggests that such self-regulation maintains the column-integrated water below a threshold by increasing the precipitation rate. Previous analyses of both observational data and cloud-resolving model (CRM) experiments give mixed results: a satellite data analysis suggests self-organized criticality, some observational data as well as CRM experiments support homeostasis, and other analyses point to a combination of these two interpretations. In this study, a CRM experiment over a planetary-scale domain with a constant sea-surface temperature is analyzed. This analysis shows that the relation between the column-integrated total water and precipitation suggests self-organized criticality, whereas the relation between the column-integrated water vapor and precipitation suggests homeostasis. The concurrent presence of these two mechanisms is further elaborated by detailed statistical and budget analyses. These statistics are scale invariant, reflecting a spatial scaling of precipitation processes. These self-organization mechanisms are most likely best understood theoretically through the energy cycle of the convective systems, consisting of the kinetic energy and the cloud-work function. The author has already investigated the behavior of this cycle system under a zero-dimensional configuration. Preliminary simulations of this cycle system over a two-dimensional domain will be presented.
From Mechanical Motion to Brownian Motion, Thermodynamics and Particle Transport Theory
ERIC Educational Resources Information Center
Bringuier, E.
2008-01-01
The motion of a particle in a medium is dealt with either as a problem of mechanics or as a transport process in non-equilibrium statistical physics. The two kinds of approach are often unrelated as they are taught in different textbooks. The aim of this paper is to highlight the link between the mechanical and statistical treatments of particle…
Network approach towards understanding the crazing in glassy amorphous polymers
NASA Astrophysics Data System (ADS)
Venkatesan, Sudarkodi; Vivek-Ananth, R. P.; Sreejith, R. P.; Mangalapandi, Pattulingam; Hassanali, Ali A.; Samal, Areejit
2018-04-01
We have used molecular dynamics to simulate an amorphous glassy polymer with long chains in order to study the deformation mechanism of crazing and the associated void statistics. The Van der Waals interactions and the entanglements between chains constituting the polymer play a crucial role in crazing. We therefore reconstructed two underlying weighted networks, namely the Van der Waals network and the entanglement network, from polymer configurations extracted from the molecular dynamics simulation. Subsequently, we performed graph-theoretic analysis of the two reconstructed networks to reveal the role each plays in the crazing of polymers. Our analysis captured the various stages of crazing through specific trends in the network measures for the Van der Waals and entanglement networks. To further corroborate the effectiveness of network analysis in unraveling the underlying physics of crazing in polymers, we contrasted the trends in network measures for the two networks with the stress-strain behaviour and void statistics during deformation. We find that the Van der Waals network plays a crucial role in craze initiation and growth. Although the entanglement network maintains its structure during the craze initiation stage, it progressively weakens and undergoes dynamic changes during the hardening and failure stages of crazing. Our work demonstrates the utility of network theory in quantifying the underlying physics of polymer crazing and widens the scope of applications of network science to the characterization of deformation mechanisms in diverse polymers.
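The graph-theoretic bookkeeping used above can be illustrated with a generic sketch (a hypothetical toy graph, not the reconstructed Van der Waals or entanglement networks from the simulation) that computes a few standard weighted network measures with networkx.

```python
import networkx as nx

# toy weighted network standing in for a reconstructed interaction network;
# nodes are chain segments, edge weights are interaction strengths (made up)
G = nx.Graph()
G.add_weighted_edges_from([
    (0, 1, 1.0), (1, 2, 0.8), (2, 3, 0.5),
    (3, 0, 0.9), (1, 3, 0.3), (2, 4, 0.7),
])

# a few network measures of the kind tracked during deformation
print("average weighted degree:",
      sum(d for _, d in G.degree(weight="weight")) / G.number_of_nodes())
print("average clustering:", nx.average_clustering(G, weight="weight"))
print("connected components:", nx.number_connected_components(G))
```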
CRACK GROWTH ANALYSIS OF SOLID OXIDE FUEL CELL ELECTROLYTES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. Bandopadhyay; N. Nagabhushana
2003-10-01
Defects and flaws control the structural and functional properties of ceramics. In determining the reliability and lifetime of ceramic structures it is very important to quantify the crack growth behavior of the ceramics. In addition, because of the high variability of the strength and the relatively low toughness of ceramics, a statistical design approach is necessary. The statistical nature of the strength of ceramics is currently well recognized, and is usually accounted for by utilizing Weibull or similar statistical distributions. Design tools such as CARES, using a combination of strength measurements, stress analysis, and statistics, are available and reasonably well developed. These design codes also incorporate material data such as elastic constants as well as flaw distributions and time-dependent properties. The fast fracture reliability for ceramics is often different from their time-dependent reliability. Further confounding the design complexity, the time-dependent reliability varies with the environment/temperature/stress combination. Therefore, it becomes important to be able to accurately determine the behavior of ceramics under simulated application conditions to provide a better prediction of the lifetime and reliability for a given component. In the present study, yttria-stabilized zirconia (YSZ) of 9.6 mol% yttria composition was procured in the form of tubes of length 100 mm. The composition is of interest for tubular electrolytes for Solid Oxide Fuel Cells. Rings cut from the tubes were characterized for microstructure, phase stability, mechanical strength (Weibull modulus) and fracture mechanisms. The strength at the operating condition of SOFCs (1000 C) decreased to 95 MPa, compared to a room-temperature strength of 230 MPa; however, the Weibull modulus remained relatively unchanged. The slow crack growth (SCG) parameter, n = 17, evaluated at room temperature in air was representative of well-studied brittle materials. Based on the results, further work was planned to evaluate the strength degradation, modulus and failure in a more representative SOFC environment.
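A common way to obtain the Weibull modulus m mentioned above is a linear fit of ln ln(1/(1-F)) against ln(strength) using median-rank-style plotting positions; the sketch below (with made-up strength values, not the YSZ data) shows this generic estimator.

```python
import numpy as np

def weibull_modulus(strengths):
    """Estimate the Weibull modulus m and characteristic strength sigma0
    from fracture strengths via the linearized CDF:
    ln ln(1/(1-F)) = m*ln(sigma) - m*ln(sigma0)."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    F = (np.arange(1, n + 1) - 0.5) / n          # plotting positions
    y = np.log(np.log(1.0 / (1.0 - F)))
    x = np.log(s)
    m, intercept = np.polyfit(x, y, 1)
    sigma0 = np.exp(-intercept / m)
    return m, sigma0

# hypothetical ring-strength data (MPa), not measurements from the report
strengths = [205, 218, 224, 231, 238, 244, 251, 259, 266, 280]
print(weibull_modulus(strengths))
```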
Wang, Cheng; Peng, Jingjin; Kuang, Yanling; Zhang, Jiaqiang; Dai, Luming
2017-01-01
Pleural effusion is a common clinical manifestation with various causes, and current diagnostic and therapeutic methods exhibit numerous limitations. By analyzing dynamic changes in low-molecular-weight catabolites, metabolomics has been widely applied to various types of disease and has provided platforms to distinguish many novel biomarkers. However, to the best of our knowledge, there are few studies of metabolic profiling for pleural effusion. In the current study, 58 pleural effusion samples were collected, among which 20 were malignant pleural effusions, 20 were tuberculous pleural effusions and 18 were transudative pleural effusions. Small-molecule metabolite spectra were obtained using 1H nuclear magnetic resonance technology, and pattern-recognition multivariate statistical analysis was used to screen out differential metabolites. One-way analysis of variance, the Student-Newman-Keuls test and the Kruskal-Wallis test were adopted for statistical analysis. Over 400 metabolites were identified in the untargeted metabolomic analysis, and 26 metabolites differed significantly among tuberculous, malignant and transudative pleural effusions. These metabolites were predominantly involved in the metabolic pathways of amino acid metabolism, glycometabolism and lipid metabolism. Statistical analysis revealed that eight metabolites contributed to the distinction between the three groups: tuberculous, malignant and transudative pleural effusion. In the current study, the feasibility of identifying small-molecule biochemical profiles in different types of pleural effusion was investigated to reveal novel biological insights into the underlying mechanisms. The results provide specific insights into the biology of tuberculous, malignant and transudative pleural effusion and may offer novel strategies for the diagnosis and therapy of associated diseases, including tuberculosis, advanced lung cancer and congestive heart failure. PMID:28627685
Effect of Correlated Rotational Noise
NASA Astrophysics Data System (ADS)
Hancock, Benjamin; Wagner, Caleb; Baskaran, Aparna
The traditional model of a self-propelled particle (SPP) is one in which the body axis along which the particle travels reorients itself through rotational diffusion. If the reorientation process is instead driven by colored noise, rather than the standard Gaussian white noise, the resulting statistical mechanics cannot be accessed through conventional methods. In this talk we present results comparing three methods of deriving the statistical mechanics of an SPP whose reorientation process is driven by colored noise. We illustrate the differences and similarities in the resulting statistical mechanics through their ability to accurately capture the particle's response to external aligning fields.
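For readers unfamiliar with the setup, the sketch below gives a minimal Euler-Maruyama simulation of a single self-propelled particle whose orientation is driven by Ornstein-Uhlenbeck (colored) noise; all parameter values and the chosen noise convention are arbitrary, and this is only an illustration of the model class, not the derivations compared in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

v0, tau, D = 1.0, 0.5, 1.0      # self-propulsion speed, noise correlation time, noise strength
dt, n_steps = 1e-3, 100_000

x = y = theta = 0.0
eta = 0.0                       # Ornstein-Uhlenbeck process driving the orientation
for _ in range(n_steps):
    # colored noise: d(eta) = -(eta/tau) dt + sqrt(2*D/tau**2) dW
    eta += (-eta / tau) * dt + np.sqrt(2.0 * D / tau**2 * dt) * rng.standard_normal()
    theta += eta * dt           # orientation reorients under the correlated noise
    x += v0 * np.cos(theta) * dt
    y += v0 * np.sin(theta) * dt

print("final position:", (round(x, 3), round(y, 3)))
```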
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making it easier for users to access statistical tools. The Statistics Data Analysis application covers various topics in basic statistics together with a parametric statistical data analysis module. The output of the application is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database uses MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation and system maintenance. This statistical data analysis application is expected to support statistics lectures and make it easier for students to understand statistical analysis on mobile devices.
Quantum-mechanical analysis of low-gain free-electron laser oscillators
NASA Astrophysics Data System (ADS)
Fares, H.; Yamada, M.; Chiadroni, E.; Ferrario, M.
2018-05-01
In the previous classical theory of low-gain free-electron laser (FEL) oscillators, the electron is described as a point-like particle, a delta function in position space. On the other hand, in previous quantum treatments, the electron is described as a plane wave with a single momentum state, a delta function in momentum space. In reality, an electron must have statistical uncertainties in both the position and momentum domains; the electron is therefore neither a point-like charge nor a plane wave of a single momentum. In this paper, we rephrase the theory of the low-gain FEL with the interacting electron represented quantum mechanically by a plane wave with a finite spreading length (i.e., a wave packet). Using the concepts of the transformation of reference frames and statistical quantum mechanics, an expression for the single-pass radiation gain is derived. The spectral broadening of the radiation is expressed in terms of the spreading length of an electron, the relaxation time characterizing the energy spread of electrons, and the interaction time. We present a comparison between our results and those obtained in the known classical analyses, showing good agreement between the two. While this correspondence with the classical results is established, novel insights into the electron dynamics and the interaction mechanism are presented.
Insights into Corona Formation through Statistical Analyses
NASA Technical Reports Server (NTRS)
Glaze, L. S.; Stofan, E. R.; Smrekar, S. E.; Baloga, S. M.
2002-01-01
Statistical analysis of an expanded database of coronae on Venus indicates that the populations of Type 1 (with fracture annuli) and Type 2 (without fracture annuli) corona diameters are statistically indistinguishable, and therefore we have no basis for assuming different formation mechanisms. Analysis of the topography and diameters of coronae shows that coronae that are depressions, rimmed depressions, and domes tend to be significantly smaller than those that are plateaus, rimmed plateaus, or domes with surrounding rims. This is consistent with the model of Smrekar and Stofan and inconsistent with predictions of the spreading drop model of Koch and Manga. The diameter range for domes, the initial stage of corona formation, provides a broad constraint on the buoyancy of corona-forming plumes. Coronae are only slightly more likely to be topographically raised than depressed, with Type 1 coronae most frequently occurring as rimmed depressions and Type 2 coronae most frequently occurring with flat interiors and raised rims. Most Type 1 coronae are located along chasmata systems or fracture belts, while Type 2 coronae are found predominantly as isolated features in the plains. Coronae at hotspot rises tend to be significantly larger than coronae in other settings, consistent with a hotter upper mantle at hotspot rises and their active state.
Choroidal Thickness Analysis in Patients with Usher Syndrome Type 2 Using EDI OCT.
Colombo, L; Sala, B; Montesano, G; Pierrottet, C; De Cillà, S; Maltese, P; Bertelli, M; Rossetti, L
2015-01-01
To characterize Usher Syndrome type 2 by analyzing choroidal thickness and comparing the data with those reported in the published literature on RP and healthy subjects. Methods. 20 eyes of 10 patients with clinical signs and a genetic diagnosis of Usher Syndrome type 2 were included. Each patient underwent a complete ophthalmologic examination including Best Corrected Visual Acuity (BCVA), intraocular pressure (IOP), axial length (AL), automated visual field (VF), and EDI OCT. Both retinal and choroidal measures were taken. Statistical analysis was performed to correlate choroidal thickness with age, BCVA, IOP, AL, VF, and RT. Comparison with data from healthy people and nonsyndromic RP patients was performed. Results. Mean subfoveal choroidal thickness (SFCT) was 248.21 ± 79.88 microns. SFCT was statistically significantly correlated with age (correlation coefficient -0.7248179, p < 0.01). No statistically significant correlation was found between SFCT and BCVA, IOP, AL, VF, or RT. SFCT was reduced compared to healthy subjects (p < 0.01). No difference was found when compared to choroidal thickness from nonsyndromic RP patients (p = 0.2138). Conclusions. Our study demonstrated in vivo choroidal thickness reduction in patients with Usher Syndrome type 2. These data are important for understanding the mechanisms of disease and for evaluating therapeutic approaches.
COMPUTATIONAL ANALYSIS OF SWALLOWING MECHANICS UNDERLYING IMPAIRED EPIGLOTTIC INVERSION
Pearson, William G.; Taylor, Brandon K; Blair, Julie; Martin-Harris, Bonnie
2015-01-01
Objective Determine swallowing mechanics associated with the first and second epiglottic movements, that is, movement to horizontal and full inversion respectively, in order to provide a clinical interpretation of impaired epiglottic function. Study Design Retrospective cohort study. Methods A heterogeneous cohort of patients with swallowing difficulties was identified (n=92). Two speech-language pathologists reviewed 5ml thin and 5ml pudding videofluoroscopic swallow studies per subject, and assigned epiglottic component scores of 0=complete inversion, 1=partial inversion, and 2=no inversion forming three groups of videos for comparison. Coordinates mapping minimum and maximum excursion of the hyoid, pharynx, larynx, and tongue base during pharyngeal swallowing were recorded using ImageJ software. A canonical variate analysis with post-hoc discriminant function analysis of coordinates was performed using MorphoJ software to evaluate mechanical differences between groups. Eigenvectors characterizing swallowing mechanics underlying impaired epiglottic movements were visualized. Results Nineteen of 184 video-swallows were rejected for poor quality (n=165). A Goodman-Kruskal index of predictive association showed no correlation between epiglottic component scores and etiologies of dysphagia (λ=.04). A two-way analysis of variance by epiglottic component scores showed no significant interaction effects between sex and age (f=1.4, p=.25). Discriminant function analysis demonstrated statistically significant mechanical differences between epiglottic component scores: 1&2, representing the first epiglottic movement (Mahalanobis distance=1.13, p=.0007); and, 0&1, representing the second epiglottic movement (Mahalanobis distance=0.83, p=.003). Eigenvectors indicate that laryngeal elevation and tongue base retraction underlie both epiglottic movements. Conclusion Results suggest that reduced tongue base retraction and laryngeal elevation underlie impaired first and second epiglottic movements. The styloglossus, hyoglossus and long pharyngeal muscles are implicated as targets for rehabilitation in dysphagic patients with impaired epiglottic inversion. PMID:27426940
Relating triggering processes in lab experiments with earthquakes.
NASA Astrophysics Data System (ADS)
Baro Urbea, J.; Davidsen, J.; Kwiatek, G.; Charalampidou, E. M.; Goebel, T.; Stanchits, S. A.; Vives, E.; Dresen, G.
2016-12-01
Statistical relations such as the Gutenberg-Richter law, the Omori-Utsu law and the productivity of aftershocks were first observed in seismology, but are also common to other physical phenomena exhibiting avalanche dynamics, such as solar flares, rock fracture, structural phase transitions and even stock market transactions. All these examples exhibit spatio-temporal correlations that can be explained as triggering processes: instead of being activated as a response to external driving or fluctuations, some events are a consequence of previous activity. Although plausible explanations have been suggested for each system, the origin of the ubiquity of such statistical laws remains unexplained. The case of rock fracture, however, may have a physical connection with seismology: it has been suggested that some features of seismology have a microscopic origin and are reproducible over a vast range of scales. This hypothesis has motivated mechanical experiments that generate artificial catalogues of earthquakes at the laboratory scale (so-called labquakes) under controlled conditions. Microscopic fractures in lab tests release elastic waves that are recorded as ultrasonic (kHz-MHz) acoustic emission (AE) events by means of piezoelectric transducers. Here, we analyse the statistics of labquakes recorded during the failure of small samples of natural rocks and artificial porous materials under different controlled compression regimes. Temporal and spatio-temporal correlations are identified in certain cases. Specifically, we distinguish between background and triggered events, revealing some differences in their statistical properties. We fit the data to statistical models of seismicity. As a particular case, we explore the branching-process approach simplified in the Epidemic Type Aftershock Sequence (ETAS) model. We evaluate the empirical spatio-temporal kernel of the model and investigate the physical origins of triggering. Our analysis of the focal mechanisms implies that the occurrence of the empirical laws extends well beyond purely frictional sliding events, in contrast to what is often assumed.
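As a minimal illustration of the triggering statistics mentioned above (not the full ETAS fit carried out in the study), the sketch below samples aftershock times from a modified Omori rate λ(t) = K/(c+t)^p with Ogata-style thinning; the parameter values are arbitrary.

```python
import numpy as np

def simulate_omori_aftershocks(K=50.0, c=0.1, p=1.1, T=100.0, seed=0):
    """Sample aftershock times on [0, T] from the modified Omori rate
    lambda(t) = K / (c + t)**p using thinning (valid because the rate is decreasing)."""
    rng = np.random.default_rng(seed)
    times, t = [], 0.0
    while True:
        lam_bound = K / (c + t) ** p            # upper bound on the rate beyond t
        t += rng.exponential(1.0 / lam_bound)   # candidate next event
        if t >= T:
            break
        if rng.random() < (K / (c + t) ** p) / lam_bound:
            times.append(t)                     # accept with probability lambda(t)/bound
    return np.array(times)

aftershocks = simulate_omori_aftershocks()
print(len(aftershocks), "aftershocks; first five times:", np.round(aftershocks[:5], 3))
```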
Neurological Outcomes Following Suicidal Hanging: A Prospective Study of 101 Patients
Jawaid, Mohammed Turab; Amalnath, S. Deepak; Subrahmanyam, D. K. S.
2017-01-01
Context: Survivors of suicidal hanging can have variable neurological outcomes, from complete recovery to irreversible brain damage. Literature on the neurological outcomes in these patients is confined to retrospective studies and case series; hence, this prospective study was carried out. Aims: The aim was to study the neurological outcomes of suicidal hanging. Settings and Design: This was a prospective observational study carried out from July 2014 to July 2016. Subjects and Methods: Consecutive patients admitted to the emergency and medicine wards were included in the study. Details of the clinical and radiological findings, the course in hospital, and status at 1 month postdischarge were analyzed. Statistical Analysis Used: Statistical analysis was performed using IBM SPSS Advanced Statistics 20.0 (SPSS Inc., Chicago, USA). Univariate analysis was performed using the Chi-square test for significance, and odds ratios were calculated. Results: Of the 101 patients, 6 died and 4 had residual neurological deficits. Cervical spine injury was seen in 3 patients. Interestingly, 39 patients could not remember the act of hanging (retrograde amnesia). Hypotension, pulmonary edema, a Glasgow coma scale (GCS) score <8 at admission, the need for mechanical ventilation, and cerebral edema on plain computed tomography were more frequent in those with amnesia than in those with normal memory, and these findings were statistically significant. Conclusions: The majority of patients recovered without any sequelae. Routine imaging of the cervical spine may not be warranted in all patients, even in those with poor GCS. Retrograde amnesia might be more common than previously believed, and further studies are needed to analyze this peculiar feature. PMID:28584409
Chahal, Gurparkash Singh; Chhina, Kamalpreet; Chhabra, Vipin; Bhatnagar, Rakhi; Chahal, Amna
2014-01-01
Background: A surface smear layer consisting of organic and inorganic material is formed on the root surface following mechanical instrumentation and may inhibit the formation of new connective tissue attachment to the root surface. Modification of the tooth surface by root conditioning has resulted in improved connective tissue attachment and has advanced the goal of reconstructive periodontal treatment. Aim: The aim of this study was to compare the effects of citric acid, tetracycline, and doxycycline on instrumented, periodontally involved root surfaces in vitro using a scanning electron microscope. Settings and Design: A total of 45 dentin samples obtained from 15 extracted, scaled, and root-planed teeth were divided into three groups. Materials and Methods: The root conditioning agents were applied with cotton pellets using a passive burnishing technique for 5 minutes. The samples were then examined under the scanning electron microscope. Statistical Analysis Used: The statistical analysis was carried out using the Statistical Package for Social Sciences (SPSS Inc., Chicago, IL, version 15.0 for Windows). For all quantitative variables, means and standard deviations were calculated and compared; ANOVA was applied for comparisons of more than two groups, and post hoc tests with Bonferroni correction were used for multiple comparisons. Results: On statistical analysis, the root conditioning agents used in this study were found to be effective in removing the smear layer, uncovering and widening the dentin tubules, and unmasking the dentin collagen matrix. Conclusion: Tetracycline HCl was found to be the best root conditioner among the three agents used. PMID:24744541
Outbreak of resistant Acinetobacter baumannii- measures and proposal for prevention and control.
Romanelli, Roberta Maia de Castro; Jesus, Lenize Adriana de; Clemente, Wanessa Trindade; Lima, Stella Sala Soares; Rezende, Edna Maria; Coutinho, Rosane Luiza; Moreira, Ricardo Luiz Fontes; Neves, Francelli Aparecida Cordeiro; Brás, Nelma de Jesus
2009-10-01
Acinetobacter baumannii colonization and infection, frequent in Intensive Care Unit (ICU) patients, is commonly associated with high morbidity and mortality. Several outbreaks due to multidrug-resistant (MDR) A. baumannii have been reported, but few of them in Brazil. This study aimed to identify risk factors associated with colonization and infection by MDR and carbapenem-resistant A. baumannii strains isolated from patients admitted to the adult ICU at HC/UFMG. A case-control study was performed from January 2007 to June 2008. Cases were defined as patients colonized or infected by MDR/carbapenem-resistant A. baumannii, and controls were patients without MDR/carbapenem-resistant A. baumannii isolation, in a 1:2 proportion. For statistical analysis, because of changes in infection control guidelines, infection criteria and the notification process, the study was divided into two periods. During the first period analyzed, from January to December 2007, colonization or infection by MDR/carbapenem-resistant A. baumannii was associated with prior infection, invasive device utilization, prior carbapenem use and clinical severity. In the multivariate analysis, prior infection and mechanical ventilation proved to be statistically significant risk factors, and carbapenem use showed a tendency towards a statistical association. During the second study period, from January to June 2008, variables with a significant association with MDR/carbapenem-resistant A. baumannii colonization/infection were catheter utilization, carbapenem and third-generation cephalosporin use, hepatic transplantation, and clinical severity. In the multivariate analysis, only CVC use showed a statistically significant difference, while carbapenem and third-generation cephalosporin use displayed a tendency to be risk factors. Risk factors must be the focus of infection control and prevention measures aimed at limiting A. baumannii dissemination.
Sherrill, Joel T; Sommers, David I; Nierenberg, Andrew A; Leon, Andrew C; Arndt, Stephan; Bandeen-Roche, Karen; Greenhouse, Joel; Guthrie, Donald; Normand, Sharon-Lise; Phillips, Katharine A; Shear, M Katherine; Woolson, Robert
2009-01-01
The authors summarize points for consideration generated in a National Institute of Mental Health (NIMH) workshop convened to provide an opportunity for reviewers from different disciplines-specifically clinical researchers and statisticians-to discuss how their differing and complementary expertise can be well integrated in the review of intervention-related grant applications. A 1-day workshop was convened in October, 2004. The workshop featured panel presentations on key topics followed by interactive discussion. This article summarizes the workshop and subsequent discussions, which centered on topics including weighting the statistics/data analysis elements of an application in the assessment of the application's overall merit; the level of statistical sophistication appropriate to different stages of research and for different funding mechanisms; some key considerations in the design and analysis portions of applications; appropriate statistical methods for addressing essential questions posed by an application; and the role of the statistician in the application's development, study conduct, and interpretation and dissemination of results. A number of key elements crucial to the construction and review of grant applications were identified. It was acknowledged that intervention-related studies unavoidably involve trade-offs. Reviewers are helped when applications acknowledge such trade-offs and provide good rationale for their choices. Clear linkage among the design, aims, hypotheses, and data analysis plan and avoidance of disconnections among these elements also strengthens applications. The authors identify multiple points to consider when constructing intervention-related grant applications. The points are presented here as questions and do not reflect institute policy or comprise a list of best practices, but rather represent points for consideration.
Analysis of Yb3+/Er3+-codoped microring resonator cross-grid matrices
NASA Astrophysics Data System (ADS)
Vallés, Juan A.; Gǎlǎtuş, Ramona
2014-09-01
An analytic model of the scattering response of a highly Yb3+/Er3+-codoped phosphate glass microring resonator matrix is considered to obtain the transfer functions of an M x N cross-grid microring resonator structure. Then a detailed model is used to calculate the pump and signal propagation, including a microscopic statistical formalism to describe the high-concentration induced energy-transfer mechanisms and passive and active features are combined to realistically simulate the performance as a wavelength-selective amplifier or laser. This analysis allows the optimization of these structures for telecom or sensing applications.
Analyzing Data for Systems Biology: Working at the Intersection of Thermodynamics and Data Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cannon, William R.; Baxter, Douglas J.
2012-08-15
Many challenges in systems biology have to do with analyzing data within the framework of molecular phenomena and cellular pathways. How does this relate to thermodynamics that we know govern the behavior of molecules? Making progress in relating data analysis to thermodynamics is essential in systems biology if we are to build predictive models that enable the field of synthetic biology. This report discusses work at the crossroads of thermodynamics and data analysis, and demonstrates that statistical mechanical free energy is a multinomial log likelihood. Applications to systems biology are presented.
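The claimed link can be made explicit with a standard Stirling-approximation argument (a sketch of the general relation, not the report's specific derivation): the log of a multinomial likelihood is, to leading order in the number of counts, proportional to a relative-entropy (free-energy-like) functional.

```latex
\ln P(\{n_i\}) \;=\; \ln\!\left[\frac{N!}{\prod_i n_i!}\,\prod_i p_i^{\,n_i}\right]
\;\approx\; -\,N \sum_i f_i \ln\frac{f_i}{p_i},
\qquad f_i = \frac{n_i}{N},
```

where Stirling's formula ln n! ≈ n ln n − n has been used. The right-hand side is a negative relative entropy (Kullback-Leibler divergence), the same functional form that appears, up to factors of k_B T, in statistical mechanical free-energy differences, which is the sense in which a free energy can be read as a multinomial log likelihood.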
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) which facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and system biology. In this article, we have discussed the existing workflow systems and the trends in applications of workflow based systems.
Attard, Phil
2005-04-15
The concept of second entropy is introduced for the dynamic transitions between macrostates. It is used to develop a theory for fluctuations in velocity, and is exemplified by deriving Onsager reciprocal relations for Brownian motion. The cases of free, driven, and pinned Brownian particles are treated in turn, and Stokes' law is derived. The second entropy analysis is applied to the general case of thermodynamic fluctuations, and the Onsager reciprocal relations for these are derived using the method. The Green-Kubo formulas for the transport coefficients emerge from the analysis, as do Langevin dynamics.
Multi-scale mechanics of granular solids from grain-resolved X-ray measurements
NASA Astrophysics Data System (ADS)
Hurley, R. C.; Hall, S. A.; Wright, J. P.
2017-11-01
This work discusses an experimental technique for studying the mechanics of three-dimensional (3D) granular solids. The approach combines 3D X-ray diffraction and X-ray computed tomography to measure grain-resolved strains, kinematics and contact fabric in the bulk of a granular solid, from which continuum strains, grain stresses, interparticle forces and coarse-grained elasto-plastic moduli can be determined. We demonstrate the experimental approach and analysis of selected results on a sample of 1099 stiff, frictional grains undergoing multiple uniaxial compression cycles. We investigate the inter-particle force network, elasto-plastic moduli and associated length scales, reversibility of mechanical responses during cyclic loading, the statistics of microscopic responses and microstructure-property relationships. This work serves to highlight both the fundamental insight into granular mechanics that is furnished by combined X-ray measurements and describes future directions in the field of granular materials that can be pursued with such approaches.
Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang
2014-10-01
Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA-models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a 2nd step, the associations between mechanisms of change (TSPA) and pre- to postsymptom change were explored. TSPA allowed a prototypical process pattern to be identified, where patient's alliance and self-efficacy were linked by a temporal feedback-loop. Furthermore, therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
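For orientation, a minimal vector auto-regression of order 1 can be estimated by ordinary least squares as sketched below (a generic VAR(1) illustration on simulated data, not the TSPA aggregation procedure itself; the coefficient matrix is made up).

```python
import numpy as np

rng = np.random.default_rng(1)

# simulate a 2-variable VAR(1): y_t = A y_{t-1} + noise (A is a made-up coefficient matrix)
A_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + 0.1 * rng.standard_normal(2)

# OLS estimate of the VAR(1) coefficient matrix: regress y_t on y_{t-1}
X, Y = y[:-1], y[1:]
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("estimated coefficients:\n", np.round(B.T, 3))   # B.T approximates A_true
```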
Kasahara, Kota; Kinoshita, Kengo
2016-01-01
Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
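To show what a graph in GML format looks like in practice (a generic illustration only, not IBiSA_tools output; the state labels and counts below are invented), the snippet builds a tiny weighted directed graph with networkx and writes it to a GML file that viewers such as Cytoscape can import.

```python
import networkx as nx

# toy directed graph standing in for an ion-binding state graph:
# nodes are occupancy states, edge weights are observed transition counts (made up)
G = nx.DiGraph()
G.add_edge("S0-S2", "S1-S3", weight=42)
G.add_edge("S1-S3", "S2-S4", weight=17)
G.add_edge("S2-S4", "S0-S2", weight=8)

nx.write_gml(G, "toy_binding_states.gml")     # standard GML output
print(nx.read_gml("toy_binding_states.gml").edges(data=True))
```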
Micromechanics Fatigue Damage Analysis Modeling for Fabric Reinforced Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Min, J. B.; Xue, D.; Shi, Y.
2013-01-01
A micromechanics analysis modeling method was developed to analyze the damage progression and fatigue failure of fabric reinforced composite structures, especially for the brittle ceramic matrix material composites. A repeating unit cell concept of fabric reinforced composites was used to represent the global composite structure. The thermal and mechanical properties of the repeating unit cell were considered as the same as those of the global composite structure. The three-phase micromechanics, the shear-lag, and the continuum fracture mechanics models were integrated with a statistical model in the repeating unit cell to predict the progressive damages and fatigue life of the composite structures. The global structure failure was defined as the loss of loading capability of the repeating unit cell, which depends on the stiffness reduction due to material slice failures and nonlinear material properties in the repeating unit cell. The present methodology is demonstrated with the analysis results evaluated through the experimental test performed with carbon fiber reinforced silicon carbide matrix plain weave composite specimens.
A Large-Scale Analysis of Variance in Written Language.
Johns, Brendan T; Jamieson, Randall K
2018-01-22
The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers, & Tenenbaum; Jones & Mewhort; Landauer & Dumais; Mikolov, Sutskever, Chen, Corrado, & Dean). The models treat knowledge as an interaction of processing mechanisms and the structure of language experience, but language experience is often treated agnostically. We report a distributional semantic analysis which shows that written language in fiction books varies appreciably between books from different genres, books from the same genre, and even books written by the same author. Given that current theories assume that word knowledge reflects an interaction between processing mechanisms and the language environment, the analysis shows the need for the field to engage in a more deliberate consideration and curation of the corpora used in computational studies of natural language processing. Copyright © 2018 Cognitive Science Society, Inc.
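The kind of statistical redundancy exploited by such models can be illustrated with a bare-bones co-occurrence count and cosine-similarity computation (a toy sketch with a three-sentence corpus, not any of the cited models).

```python
import numpy as np
from itertools import combinations

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "stocks fell on the market",
]

vocab = sorted({w for sent in corpus for w in sent.split()})
index = {w: i for i, w in enumerate(vocab)}

# sentence-level co-occurrence counts as a crude stand-in for richer context windows
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for w1, w2 in combinations(sent.split(), 2):
        counts[index[w1], index[w2]] += 1
        counts[index[w2], index[w1]] += 1

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

print("cat~dog:", round(cosine(counts[index["cat"]], counts[index["dog"]]), 3))
print("cat~market:", round(cosine(counts[index["cat"]], counts[index["market"]]), 3))
```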
2009-07-16
Statistical analysis: coefficient of correlation. $R^2 = \mathrm{SSR}/\mathrm{SSTO} = 1 - \mathrm{SSE}/\mathrm{SSTO}$, where $\mathrm{SSR} = \sum_i (\hat{Y}_i - \bar{Y})^2$ is the regression sum of squares ($\bar{Y}$: mean value, $\hat{Y}_i$: value from the fitted line), $\mathrm{SSE} = \sum_i (Y_i - \hat{Y}_i)^2$ is the error sum of squares, and $\mathrm{SSTO} = \mathrm{SSE} + \mathrm{SSR}$ is the total sum of squares.
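A minimal numerical check of these definitions (with illustrative data only) is sketched below; for a least-squares fit with an intercept, the two expressions for R^2 coincide.

```python
import numpy as np

# toy data and a least-squares line fit
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept

ssr = np.sum((y_hat - y.mean()) ** 2)   # regression sum of squares
sse = np.sum((y - y_hat) ** 2)          # error sum of squares
ssto = np.sum((y - y.mean()) ** 2)      # total sum of squares

print(round(ssr / ssto, 4), round(1.0 - sse / ssto, 4))   # the two R^2 expressions agree
```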
Prevention of the Posttraumatic Fibrotic Response in Joints
2015-10-01
are currently used on a regular basis. Major Task 4: Evaluating the efficacy of inhibitory chIgG to reduce the consequences of traumatic joint...injury. During the second year of study, we successfully employed all assays needed to evaluate the utility of the inhibitory antibody to reduce the...32nd week 1. Major Task 5: Task 4. Data analysis and statistical evaluation of results. All data from the mechanical measurements, from the
Expression Profiling of Nonpolar Lipids in Meibum From Patients With Dry Eye: A Pilot Study
Chen, Jianzhong; Keirsey, Jeremy K.; Green, Kari B.; Nichols, Kelly K.
2017-01-01
Purpose The purpose of this investigation was to characterize differentially expressed lipids in meibum samples from patients with dry eye disease (DED) in order to better understand the underlying pathologic mechanisms. Methods Meibum samples were collected from postmenopausal women with DED (PW-DED; n = 5) and a control group of postmenopausal women without DED (n = 4). Lipid profiles were analyzed by direct infusion full-scan electrospray ionization mass spectrometry (ESI-MS). An initial analysis of 145 representative peaks from four classes of lipids in PW-DED samples revealed that additional manual corrections for peak overlap and isotopes only slightly affected the statistical analysis. Therefore, analysis of uncorrected data, which can be applied to a greater number of peaks, was used to compare more than 500 lipid peaks common to PW-DED and control samples. Statistical analysis of peak intensities identified several lipid species that differed significantly between the two groups. Data from contact lens wearers with DED (CL-DED; n = 5) were also analyzed. Results Many species of the two types of diesters (DE) and very long chain wax esters (WE) were decreased by ∼20% in PW-DED, whereas levels of triacylglycerols were increased by an average of 39% ± 3% in meibum from PW-DED compared to that in the control group. Approximately the same reduction (20%) of similar DE and WE was observed for CL-DED. Conclusions Statistical analysis of peak intensities from direct infusion ESI-MS results identified differentially expressed lipids in meibum from dry eye patients. Further studies are warranted to support these findings. PMID:28426869
A functional U-statistic method for association analysis of sequencing data.
Jadhav, Sneha; Tong, Xiaoran; Lu, Qing
2017-11-01
Although sequencing studies hold great promise for uncovering novel variants predisposing to human diseases, the high dimensionality of sequencing data brings tremendous challenges to data analysis. Moreover, for many complex diseases (e.g., psychiatric disorders) multiple related phenotypes are collected. These phenotypes can be different measurements of an underlying disease, or measurements characterizing multiple related diseases for studying a common genetic mechanism. Although jointly analyzing these phenotypes could potentially increase the power of identifying disease-associated genes, the different types of phenotypes pose challenges for association analysis. To address these challenges, we propose a nonparametric method, the functional U-statistic method (FU), for multivariate analysis of sequencing data. It first constructs smooth functions from individuals' sequencing data, and then tests the association of these functions with multiple phenotypes by using a U-statistic. The method provides a general framework for analyzing various types of phenotypes (e.g., binary and continuous phenotypes) with unknown distributions. Fitting the genetic variants within a gene using a smoothing function also allows us to capture complexities of gene structure (e.g., linkage disequilibrium, LD), which could potentially increase the power of association analysis. Through simulations, we compared our method to the multivariate outcome score test (MOST) and found that our test attained better performance. In a real data application, we applied our method to sequencing data from the Minnesota Twin Study (MTS) and found potential associations of several nicotinic receptor subunit (CHRN) genes, including CHRNB3, with nicotine dependence and/or alcohol dependence. © 2017 WILEY PERIODICALS, INC.
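As background, a second-order U-statistic averages a symmetric kernel over all pairs of observations; the sketch below computes such a statistic for a simple kernel (a generic illustration of the U-statistic idea, not the FU test of the paper).

```python
import numpy as np
from itertools import combinations

def u_statistic(data, kernel):
    """Second-order U-statistic: average of kernel(x_i, x_j) over all pairs i < j."""
    pairs = list(combinations(range(len(data)), 2))
    return sum(kernel(data[i], data[j]) for i, j in pairs) / len(pairs)

rng = np.random.default_rng(0)
x = rng.normal(size=30)

# with this kernel the U-statistic is an unbiased estimator of the variance
var_kernel = lambda a, b: 0.5 * (a - b) ** 2
print(round(u_statistic(x, var_kernel), 4), round(np.var(x, ddof=1), 4))   # should match
```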
Statistical Analysis of the First Passage Path Ensemble of Jump Processes
NASA Astrophysics Data System (ADS)
von Kleist, Max; Schütte, Christof; Zhang, Wei
2018-02-01
The transition mechanism of jump processes between two different subsets in state space reveals important dynamical information of the processes and therefore has attracted considerable attention in the past years. In this paper, we study the first passage path ensemble of both discrete-time and continuous-time jump processes on a finite state space. The main approach is to divide each first passage path into nonreactive and reactive segments and to study them separately. The analysis can be applied to jump processes which are non-ergodic, as well as continuous-time jump processes where the waiting time distributions are non-exponential. In the particular case that the jump processes are both Markovian and ergodic, our analysis elucidates the relations between the study of the first passage paths and the study of the transition paths in transition path theory. We provide algorithms to numerically compute statistics of the first passage path ensemble. The computational complexity of these algorithms scales with the complexity of solving a linear system, for which efficient methods are available. Several examples demonstrate the wide applicability of the derived results across research areas.
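In the Markovian, discrete-time special case, one standard "solve a linear system" computation is the mean first passage time into a target set; the sketch below (with an arbitrary 4-state transition matrix, not an example from the paper) illustrates the idea, though it is not the paper's full path-ensemble algorithm.

```python
import numpy as np

# arbitrary 4-state transition matrix (rows sum to 1); target set B = {3}
P = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.2, 0.5, 0.2],
    [0.0, 0.0, 0.0, 1.0],   # made absorbing for this illustration
])
A = [0, 1, 2]               # states outside the target set

# mean first passage times m satisfy (I - P_AA) m = 1 on A
P_AA = P[np.ix_(A, A)]
m = np.linalg.solve(np.eye(len(A)) - P_AA, np.ones(len(A)))
print("mean first passage times from states 0,1,2 into state 3:", np.round(m, 3))
```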
Physical concepts in the development of constitutive equations
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1985-01-01
Proposed viscoplastic material models include observed material response in their formulation but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses about material response have been made based on first principles, and many of these hypotheses have been tested experimentally. The proposed viscoplastic theories must therefore be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics and quantum mechanics, and the effects of defects, are reviewed for their application to the development of constitutive laws.
Maximum entropy models of ecosystem functioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertram, Jason, E-mail: jason.bertram@anu.edu.au
2014-12-05
Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes' broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense, using a savanna plant ecology model as an example.
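A minimal numerical MaxEnt example (generic, not the savanna model discussed in the report; the states and constraint value are arbitrary) finds the discrete distribution of maximum entropy subject to a prescribed mean by solving for the Lagrange multiplier of the exponential-family solution.

```python
import numpy as np
from scipy.optimize import brentq

states = np.arange(1, 7)          # e.g. six abundance classes (arbitrary)
target_mean = 4.5                 # prescribed constraint on the mean (arbitrary)

def mean_gap(lam):
    w = np.exp(lam * states)      # the MaxEnt solution has the form p_i proportional to exp(lam * x_i)
    p = w / w.sum()
    return p @ states - target_mean

lam = brentq(mean_gap, -10.0, 10.0)           # solve the constraint equation for lam
p = np.exp(lam * states); p /= p.sum()
print("MaxEnt distribution:", np.round(p, 4), " mean:", round(float(p @ states), 3))
```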
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Theory of Financial Risk and Derivative Pricing
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2009-01-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Theory of Financial Risk and Derivative Pricing - 2nd Edition
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2003-12-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Renormalization Group Tutorial
NASA Technical Reports Server (NTRS)
Bell, Thomas L.
2004-01-01
Complex physical systems sometimes have statistical behavior characterized by power-law dependence on the parameters of the system and spatial variability with no particular characteristic scale as the parameters approach critical values. The renormalization group (RG) approach was developed in the fields of statistical mechanics and quantum field theory to derive quantitative predictions of such behavior in cases where conventional methods of analysis fail. Techniques based on these ideas have since been extended to treat problems in many different fields, and in particular, the behavior of turbulent fluids. This lecture will describe a relatively simple but nontrivial example of the RG approach applied to the diffusion of photons out of a stellar medium when the photons have wavelengths near that of an emission line of atoms in the medium.
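As a concrete, hedged illustration of the RG idea (not the stellar-diffusion example of the lecture itself), the classic decimation of every other spin in the one-dimensional Ising chain gives the exact recursion tanh K' = tanh^2 K for the dimensionless coupling K = J/kT; iterating it shows the flow to the trivial K = 0 fixed point, i.e. no finite-temperature transition in 1D:

```python
import numpy as np

def decimate(K):
    """One real-space RG step for the 1D Ising chain (sum over every other spin)."""
    return np.arctanh(np.tanh(K) ** 2)

K = 2.0  # strong initial coupling J/kT, chosen arbitrarily
for step in range(8):
    print(f"step {step}: K = {K:.6f}")
    K = decimate(K)
```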
Desensitized Optimal Filtering and Sensor Fusion Toolkit
NASA Technical Reports Server (NTRS)
Karlgaard, Christopher D.
2015-01-01
Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for non-Gaussian problems with error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distribution as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.
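The toolkit itself is not reproduced here; the following is only a generic linear Kalman filter predict/update cycle for a one-dimensional constant-velocity model, to illustrate the kind of filtering step such a tool builds on. All matrices and measurement values below are illustrative assumptions:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with measurement z
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity dynamics
H = np.array([[1.0, 0.0]])                # position-only measurement
Q = 1e-3 * np.eye(2)                      # process noise (assumed)
R = np.array([[0.25]])                    # measurement noise (assumed)

x, P = np.zeros(2), np.eye(2)
for z in [0.9, 2.1, 2.9, 4.2]:            # synthetic position measurements
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
print(x)
```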
Wu, Zheyang; Zhao, Hongyu
2012-01-01
For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies.
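A minimal sketch of the kind of single-marker power calculation that underlies such comparisons, using a normal approximation to the test statistic and a Bonferroni-adjusted significance threshold; the effect size, sample size, and number of markers below are illustrative placeholders:

```python
from scipy.stats import norm

def single_marker_power(effect, n, n_markers, alpha=0.05):
    """Power of a two-sided single-marker test under Bonferroni correction.

    'effect' is the per-subject noncentrality contribution, so the test
    statistic is approximately N(sqrt(n) * effect, 1) under the alternative.
    """
    z_crit = norm.ppf(1 - alpha / (2 * n_markers))
    mu = n ** 0.5 * effect
    return norm.sf(z_crit - mu) + norm.cdf(-z_crit - mu)

print(single_marker_power(effect=0.1, n=2000, n_markers=500000))
```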
NASA Astrophysics Data System (ADS)
Bovier, Anton
2006-06-01
Our mathematical understanding of the statistical mechanics of disordered systems is going through a period of stunning progress. This self-contained book is a graduate-level introduction for mathematicians and for physicists interested in the mathematical foundations of the field, and can be used as a textbook for a two-semester course on mathematical statistical mechanics. It assumes only basic knowledge of classical physics and, on the mathematics side, a good working knowledge of graduate-level probability theory. The book starts with a concise introduction to statistical mechanics, proceeds to disordered lattice spin systems, and concludes with a presentation of the latest developments in the mathematical understanding of mean-field spin glass models. In particular, recent progress towards a rigorous understanding of the replica symmetry-breaking solutions of the Sherrington-Kirkpatrick spin glass models, due to Guerra, Aizenman-Sims-Starr and Talagrand, is reviewed in some detail. Comprehensive introduction to an active and fascinating area of research Clear exposition that builds to the state of the art in the mathematics of spin glasses Written by a well-known and active researcher in the field
Evidence, temperature, and the laws of thermodynamics.
Vieland, Veronica J
2014-01-01
A primary purpose of statistical analysis in genetics is the measurement of the strength of evidence for or against hypotheses. As with any type of measurement, a properly calibrated measurement scale is necessary if we want to be able to meaningfully compare degrees of evidence across genetic data sets, across different types of genetic studies and/or across distinct experimental modalities. In previous papers in this journal and elsewhere, my colleagues and I have argued that geneticists ought to care about the scale on which statistical evidence is measured, and we have proposed the Kelvin temperature scale as a template for a context-independent measurement scale for statistical evidence. Moreover, we have claimed that, mathematically speaking, evidence and temperature may be one and the same thing. On first blush, this might seem absurd. Temperature is a property of systems following certain laws of nature (in particular, the 1st and 2nd Law of Thermodynamics) involving very physical quantities (e.g., energy) and processes (e.g., mechanical work). But what do the laws of thermodynamics have to do with statistical systems? Here I address that question. © 2014 S. Karger AG, Basel.
Statistical learning of novel graphotactic constraints in children and adults.
Samara, Anna; Caravolas, Markéta
2014-05-01
The current study explored statistical learning processes in the acquisition of orthographic knowledge in school-aged children and skilled adults. Learning of novel graphotactic constraints on the position and context of letter distributions was induced by means of a two-phase learning task adapted from Onishi, Chambers, and Fisher (Cognition, 83 (2002) B13-B23). Following incidental exposure to pattern-embedding stimuli in Phase 1, participants' learning generalization was tested in Phase 2 with legality judgments about novel conforming/nonconforming word-like strings. Test phase performance was above chance, suggesting that both types of constraints were reliably learned even after relatively brief exposure. As hypothesized, signal detection theory d' analyses confirmed that learning permissible letter positions (d'=0.97) was easier than permissible neighboring letter contexts (d'=0.19). Adults were more accurate than children in all but a strict analysis of the contextual constraints condition. Consistent with the statistical learning perspective in literacy, our results suggest that statistical learning mechanisms contribute to children's and adults' acquisition of knowledge about graphotactic constraints similar to those existing in their orthography. Copyright © 2013 Elsevier Inc. All rights reserved.
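For reference, the d' values quoted above are computed from hit and false-alarm rates as the difference of their normal quantiles; a minimal sketch follows, with made-up rates rather than the study's data:

```python
from scipy.stats import norm

def d_prime(hit_rate, fa_rate, eps=1e-3):
    """d' = z(hit rate) - z(false-alarm rate), with rates clipped away from 0 and 1."""
    h = min(max(hit_rate, eps), 1 - eps)
    f = min(max(fa_rate, eps), 1 - eps)
    return norm.ppf(h) - norm.ppf(f)

# Illustrative endorsement rates for conforming vs nonconforming strings
print(d_prime(0.75, 0.40))
```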
Competitive Processes in Cross-Situational Word Learning
Yurovsky, Daniel; Yu, Chen; Smith, Linda B.
2013-01-01
Cross-situational word learning, like any statistical learning problem, involves tracking the regularities in the environment. But the information that learners pick up from these regularities is dependent on their learning mechanism. This paper investigates the role of one type of mechanism in statistical word learning: competition. Competitive mechanisms would allow learners to find the signal in noisy input, and would help to explain the speed with which learners succeed in statistical learning tasks. Because cross-situational word learning provides information at multiple scales – both within and across trials/situations – learners could implement competition at either or both of these scales. A series of four experiments demonstrate that cross-situational learning involves competition at both levels of scale, and that these mechanisms interact to support rapid learning. The impact of both of these mechanisms is then considered from the perspective of a process-level understanding of cross-situational learning. PMID:23607610
Silva, Bruna Larissa Lago; Medeiros, Danila Lima; Soares, Ana Prates; Line, Sérgio Roberto Peres; Pinto, Maria das Graças Farias; Soares, Telma de Jesus; do Espírito Santo, Alexandre Ribeiro
2018-03-01
Type 1 diabetes mellitus (T1DM) largely affects children, and therefore occurs during the same period as deciduous and permanent teeth development. The aim of this work was to investigate birefringence and morphology of the secretory stage enamel organic extracellular matrix (EOECM), and structural and mechanical features of mature enamel, in T1DM rats. Adult Wistar rats were maintained for 56 days after the induction of experimental T1DM with a single dose of streptozotocin (60 mg/kg). After euthanasia of the animals, fixed upper incisors were processed, and the secretory stage EOECM and mature enamel were analyzed by transmitted polarizing and bright field light microscopy (TPLM and BFLM), energy-dispersive x-ray (EDX) analysis, scanning electron microscopy (SEM), and microhardness testing. BFLM and TPLM showed slight morphological changes in the secretory stage EOECM from diabetic rats, which also did not exhibit statistically significant alterations in birefringence brightness when compared to control animals (P > .05). EDX analysis showed that T1DM induced small but statistically significant increases in the amount of calcium and phosphorus in outer mature enamel (P < .01), with preservation of the calcium/phosphorus ratio in that structure (P > .05). T1DM also caused important ultrastructural alterations in mature enamel, as revealed by SEM, and induced a statistically significant reduction of about 13.67% in its microhardness at 80 μm from the dentin-enamel junction (P < .01). This study shows that T1DM may disturb enamel development, leading to alterations in mature enamel ultrastructure and in its mechanical features. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
A quantitative study of nanoparticle skin penetration with interactive segmentation.
Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook
2016-10-01
In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown, and the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to a human skin equivalent model. Captured fluorescence images of this model were used to evaluate degrees of skin penetration. These images underwent interactive segmentation and image processing, followed by statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased in proportion to time. Statistical significance was achieved after 2 days in the negative charge group and after 4 days in the positive charge group, indicating a difference in timing between the groups. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative charge group. Although this quantitative result agrees with results obtained by qualitative assessment, it is meaningful in that it was demonstrated by statistical analysis of quantities derived from image processing. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs. In addition, these results provide important guidance for the design of NPs for biomedical applications.
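A sketch of the per-image parameters listed above (mean, integrated density, skewness, kurtosis, area fraction), computed on a segmented single-channel image; the array and threshold are synthetic stand-ins, not the study's fluorescence data:

```python
import numpy as np
from scipy.stats import skew, kurtosis

def image_parameters(img, threshold):
    """Summary statistics of a single-channel image after segmentation."""
    pixels = img.ravel().astype(float)
    mask = pixels > threshold
    return {
        "mean": pixels.mean(),
        "integrated_density": pixels.sum(),
        "skewness": skew(pixels),
        "kurtosis": kurtosis(pixels),
        "area_fraction": mask.mean(),  # fraction of pixels above threshold
    }

rng = np.random.default_rng(0)
img = rng.gamma(shape=2.0, scale=30.0, size=(256, 256))  # synthetic fluorescence-like image
print(image_parameters(img, threshold=100.0))
```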
Insights into teaching quantum mechanics in secondary and lower undergraduate education
NASA Astrophysics Data System (ADS)
Krijtenburg-Lewerissa, K.; Pol, H. J.; Brinkman, A.; van Joolingen, W. R.
2017-06-01
This study presents a review of the current state of research on teaching quantum mechanics in secondary and lower undergraduate education. A conceptual approach to quantum mechanics is being implemented in more and more introductory physics courses around the world. Because of the differences between the conceptual nature of quantum mechanics and classical physics, research on misconceptions, testing, and teaching strategies for introductory quantum mechanics is needed. For this review, 74 articles were selected and analyzed for the misconceptions, research tools, teaching strategies, and multimedia applications investigated. Outcomes were categorized according to their contribution to the various subtopics of quantum mechanics. Analysis shows that students have difficulty relating quantum physics to physical reality. It also shows that the teaching of complex quantum behavior, such as time dependence, superposition, and the measurement problem, has barely been investigated for the secondary and lower undergraduate level. At the secondary school level, this article shows a need to investigate student difficulties concerning wave functions and potential wells. Investigation of research tools shows the necessity for the development of assessment tools for secondary and lower undergraduate education, which cover all major topics and are suitable for statistical analysis. Furthermore, this article shows the existence of very diverse ideas concerning teaching strategies for quantum mechanics and a lack of research into which strategies promote understanding. This article underlines the need for more empirical research into student difficulties, teaching strategies, activities, and research tools intended for a conceptual approach for quantum mechanics.
A Conditional Curie-Weiss Model for Stylized Multi-group Binary Choice with Social Interaction
NASA Astrophysics Data System (ADS)
Opoku, Alex Akwasi; Edusei, Kwame Owusu; Ansah, Richard Kwame
2018-04-01
This paper proposes a conditional Curie-Weiss model as a model for decision making in a stylized society made up of binary decision makers that face a particular dichotomous choice between two options. Following Brock and Durlauf (Discrete choice with social interactions I: theory, 1995), we set up both socio-economic and statistical mechanical models for the choice problem. We point out when both the socio-economic and the statistical mechanical models give rise to the same self-consistent equilibrium mean choice level(s). The phase diagram of the associated statistical mechanical model and its socio-economic implications are discussed.
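A minimal sketch of the self-consistency condition behind Curie-Weiss-type choice models, m = tanh(beta * (J * m + h)), solved by damped fixed-point iteration; the coupling J, field h, and beta values are illustrative, and the conditional structure of the paper's model is not reproduced here:

```python
import math

def mean_choice(beta, J, h, m0=0.5, tol=1e-10, max_iter=10000):
    """Solve m = tanh(beta * (J * m + h)) by damped fixed-point iteration."""
    m = m0
    for _ in range(max_iter):
        m_new = math.tanh(beta * (J * m + h))
        if abs(m_new - m) < tol:
            return m_new
        m = 0.5 * m + 0.5 * m_new  # damping for stability
    return m

for beta in (0.5, 1.0, 2.0):
    print(beta, mean_choice(beta, J=1.0, h=0.05))
```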
Welch, Kyle J; Hastings-Hauss, Isaac; Parthasarathy, Raghuveer; Corwin, Eric I
2014-04-01
We have constructed a macroscopic driven system of chaotic Faraday waves whose statistical mechanics, we find, are surprisingly simple, mimicking those of a thermal gas. We use real-time tracking of a single floating probe, energy equipartition, and the Stokes-Einstein relation to define and measure a pseudotemperature and diffusion constant and then self-consistently determine a coefficient of viscous friction for a test particle in this pseudothermal gas. Because of its simplicity, this system can serve as a model for direct experimental investigation of nonequilibrium statistical mechanics, much as the ideal gas epitomizes equilibrium statistical mechanics.
Lee, O-Sung; Ahn, Soyeon; Lee, Yong Seuk
2017-07-01
The purpose of this systematic review and meta-analysis was to evaluate the effectiveness and safety of early weight-bearing by comparing clinical and radiological outcomes between early and traditional delayed weight-bearing after OWHTO. A rigorous and systematic approach was used, and the methodological quality of the included studies was assessed. Results that could be compared across two or more articles were presented as forest plots. A 95% confidence interval was calculated for each effect size, and the I² statistic, which gives the percentage of total variation attributable to heterogeneity among studies, was calculated. The random-effects model was used to calculate the effect size. Six articles were included in the final analysis. All case groups were composed of early full weight-bearing within 2 weeks. All control groups were composed of late full weight-bearing between 6 weeks and 2 months. Pooled analysis was possible for the improvement in Lysholm score, but no statistically significant difference was found between groups. Other clinical results were also similar between groups. Four studies reported the mechanical femorotibial angle (mFTA), and pooled analysis of this outcome showed no statistically significant difference between groups. Furthermore, early weight-bearing showed more favorable results in some radiologic outcomes (osseointegration and patellar height) and complications (thrombophlebitis and recurrence). Our analysis supports that early full weight-bearing after OWHTO using a locking plate leads to improvement in outcomes comparable to delayed weight-bearing in terms of clinical and radiological results, while being more favorable with respect to some radiologic parameters and complications.
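A sketch of the pooled-effect machinery described above (Cochran's Q, the I² statistic, and a DerSimonian-Laird random-effects estimate); the per-study effects and variances are illustrative placeholders, not the Lysholm or mFTA data:

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling with Q and I^2 heterogeneity statistics."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    df = len(y) - 1
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    ci = (y_re - 1.96 * se_re, y_re + 1.96 * se_re)
    return {"pooled": y_re, "ci95": ci, "Q": Q, "I2_percent": I2, "tau2": tau2}

print(random_effects_pool(effects=[5.0, 8.0, 2.0, 6.0], variances=[4.0, 9.0, 6.0, 5.0]))
```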
Kawamoto, Taisuke; Ito, Yuichi; Morita, Osamu; Honda, Hiroshi
2017-01-01
Cholestasis is one of the major causes of drug-induced liver injury (DILI), which can result in withdrawal of approved drugs from the market. Early identification of cholestatic drugs is difficult due to the complex mechanisms involved. In order to develop a strategy for mechanism-based risk assessment of cholestatic drugs, we analyzed gene expression data obtained from the livers of rats that had been orally administered with 12 known cholestatic compounds repeatedly for 28 days at three dose levels. Qualitative analyses were performed using two statistical approaches (hierarchical clustering and principal component analysis), in addition to pathway analysis. The transcriptional benchmark dose (tBMD) and tBMD 95% lower limit (tBMDL) were used for quantitative analyses, which revealed three compound sub-groups that produced different types of differential gene expression; these groups of genes were mainly involved in inflammation, cholesterol biosynthesis, and oxidative stress. Furthermore, the tBMDL values for each test compound were in good agreement with the relevant no observed adverse effect level. These results indicate that our novel strategy for drug safety evaluation using mechanism-based classification and tBMDL would facilitate the application of toxicogenomics for risk assessment of cholestatic DILI.
Processing, thermal and mechanical behaviour of PEI/MWCNT/carbon fiber nanostructured laminate
NASA Astrophysics Data System (ADS)
Santos, L. F. P.; Ribeiro, B.; Hein, L. R. O.; Botelho, E. C.; Costa, M. L.
2017-11-01
In this work, nanostructured composites of polyetherimide (PEI) with the addition of functionalized multiwall carbon nanotubes (MWCNT) were processed via solution mixing. After processing, these nanocomposites were evaluated by thermogravimetry (TGA), dynamic-mechanical analysis (DMA), scanning electron microscopy (SEM) and atomic force microscopy (AFM). Subsequently, the nanocomposite was processed with carbon fibers by hot compression molding. In order to evaluate interlaminar fracture strength, the processed laminates were mechanically evaluated by interlaminar shear strength (ILSS) and compression shear tests (CST). The Weibull distribution was employed in the statistical treatment of the data obtained from the mechanical tests, and optical microscopy was used to evaluate the fracture surfaces of the specimens. The addition of 1 wt% of MWCNT to the polymer matrix increased both the thermal stability and the viscoelastic behavior of the material. These improvements positively impacted the mechanical properties, generating a 16% and 58% increase in the short-beam strength and apparent interlaminar shear strength, respectively. In addition, morphological analysis of the fracture surfaces showed a change in the failure mode of the laminate with the incorporation of MWCNT; this behavior was also evident in the CST, where no shear failure by compression was present.
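A sketch of the Weibull treatment mentioned above, estimating the modulus m and characteristic strength by the usual median-rank linearisation ln(-ln(1-F)) = m ln(sigma) - m ln(sigma_0); the strength values below are synthetic, not the ILSS/CST data:

```python
import numpy as np

def weibull_fit(strengths):
    """Estimate Weibull modulus m and characteristic strength sigma_0 from test data."""
    s = np.sort(np.asarray(strengths, float))
    n = len(s)
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)        # median-rank probability estimator
    x = np.log(s)
    y = np.log(-np.log(1.0 - F))
    m, intercept = np.polyfit(x, y, 1)   # slope is the Weibull modulus
    sigma_0 = np.exp(-intercept / m)
    return m, sigma_0

rng = np.random.default_rng(1)
data = 60.0 * rng.weibull(8.0, size=30)  # synthetic shear-strength-like values (MPa)
print(weibull_fit(data))
```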
Pazira, Parvin; Rostami Haji-Abadi, Mahdi; Zolaktaf, Vahid; Sabahi, Mohammadfarzan; Pazira, Toomaj
2016-06-08
In relation to statistical analysis, studies to determine the validity, reliability, objectivity and precision of new measuring devices are usually incomplete, due in part to using only the correlation coefficient and ignoring data dispersion. The aim of this study was to demonstrate the best way to determine the validity, reliability, objectivity and accuracy of an electro-inclinometer or other measuring devices. Another purpose of this study was to answer the question of whether reliability and objectivity represent the accuracy of measuring devices. The validity of an electro-inclinometer was examined by mechanical and geometric methods. The objectivity and reliability of the device were assessed by calculating Cronbach's alpha for repeated measurements by three raters and for measurements on the same person by mechanical goniometer and electro-inclinometer. Measurements were performed for "hip flexion with the extended knee" and "shoulder abduction with the extended elbow." The raters measured every angle three times within an interval of two hours. Three-way ANOVA was used to determine accuracy. The results of the mechanical and geometric analysis showed that the validity of the electro-inclinometer was 1.00 and the level of error was less than one degree. The objectivity and reliability of the electro-inclinometer were 0.999, while the objectivity of the mechanical goniometer was in the range of 0.802 to 0.966 and its reliability was 0.760 to 0.961. For hip flexion, the difference between raters in joint angle measurement by electro-inclinometer and mechanical goniometer was 1.74 and 16.33 degrees (P<0.05), respectively. The differences for shoulder abduction measurement by electro-inclinometer and goniometer were 0.35 and 4.40 degrees (P<0.05). Although both objectivity and reliability were acceptable, the results showed that measurement error was very high for the mechanical goniometer. Therefore, it can be concluded that objectivity and reliability alone cannot determine the accuracy of a device, and it is preferable to use other statistical methods to compare and evaluate the accuracy of these two devices.
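A minimal sketch of the Cronbach's alpha computation used for the repeated measurements above; rows are measurement occasions and columns are raters or trials, and the angle matrix is illustrative, not the study's data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (observations x items) score matrix."""
    X = np.asarray(scores, float)
    k = X.shape[1]                       # number of items (e.g. raters or trials)
    item_vars = X.var(axis=0, ddof=1)
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

angles = np.array([                      # illustrative repeated angle measurements (degrees)
    [118.0, 118.3, 117.9],
    [121.5, 121.2, 121.6],
    [115.2, 115.4, 115.1],
    [119.8, 120.1, 119.7],
])
print(cronbach_alpha(angles))
```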
Ion-Nedelcu, Niculae; Iordăchescu, Corina; Gherasim, Patricia; Mihailovici, Rodica; Dragomirescu, Cornelia; Dumitrache-Marian, Ruxanda; Moculescu, Cristina
2009-01-01
Analysis of risk factors for developing clinically overt hepatitis B and hepatitis C in the population of Bucharest municipality: a retrospective and descriptive study on a hospital patient cohort. Cases: all acute viral hepatitis B and C cases confirmed by the two infectious diseases university clinics of Bucharest municipality during 2001-2008, among residents of the municipality, were enrolled in the study. Infection risk factors: for every case of hepatitis B and hepatitis C with symptom onset during 2001-2008, "the most plausible" risk factor, identified by case investigation, was assigned. To inform control strategies, the risk factors were stratified by mechanism of virus transmission and by age group. The analysis consists mainly of statistical comparison of case prevalence for each etiology by risk factor and mechanism of virus transmission. The patient cohort included 1440 hepatitis B cases and 227 hepatitis C cases. The most prevalent individual risk factor in hepatitis B was sexual contact with multiple partners (51.0%), while in hepatitis C it was the use of illegal injectable drugs (46.3%). The prevalences of hepatitis B and hepatitis C cases across the four mechanisms of virus transmission were similar (p = 0.52). For both etiologies, high-risk behaviours represented the principal mechanism of virus transmission (64.1% in hepatitis B and 63.4% in hepatitis C); additionally, for both etiologies the most prevalent mechanisms of virus transmission by age group were identical, namely: (a) consumption of medical services in the age group 55+ years, (b) high-risk behaviours in the age group 13-54 years and (c) contact with a case or virus carrier in the age group 0-12 years. In the period 2001-2008 the distribution of hepatitis B and hepatitis C cases by mechanism of virus transmission in the population of Bucharest municipality was statistically similar, and for both etiologies the most prevalent mechanism (> 60%) was high-risk behaviours. This strongly suggests that, in addition to the current strategies for prevention of infection with hepatitis viruses B and C, the decisive strategy for control of the two infections needs to be extended with an effective education component focused on high-risk groups.
Change Mechanisms of Schema-Centered Group Psychotherapy with Personality Disorder Patients
Tschacher, Wolfgang; Zorn, Peter; Ramseyer, Fabian
2012-01-01
Background: This study addressed the temporal properties of personality disorders and their treatment by schema-centered group psychotherapy. It investigated the change mechanisms of psychotherapy using a novel method by which psychotherapy can be modeled explicitly in the temporal domain. Methodology and Findings: 69 patients were assigned to a specific schema-centered behavioral group psychotherapy, 26 to social skills training as a control condition. The largest diagnostic subgroups were narcissistic and borderline personality disorder. Both treatments offered 30 group sessions of 100 min duration each, at a frequency of two sessions per week. The therapy process was described by components resulting from principal component analysis of patients' session reports, which were obtained after each session. These patient-assessed components were Clarification, Bond, Rejection, and Emotional Activation. The statistical approach focused on time-lagged associations of components using time-series panel analysis. This method provided a detailed quantitative representation of the therapy process. It was found that Clarification played a core role in schema-centered psychotherapy, reducing rejection and regulating the emotion of patients; this was also a change mechanism linked to therapy outcome. Conclusions/Significance: The introduced process-oriented methodology made it possible to highlight the mechanisms by which psychotherapeutic treatment became effective. Additionally, process models depicted the actual patterns that differentiated specific diagnostic subgroups. Time-series analysis explores Granger causality, a non-experimental approximation of causality based on temporal sequences. This methodology, resting upon naturalistic data, can explicate mechanisms of action in psychotherapy research and illustrate the temporal patterns underlying personality disorders. PMID:22745811
Recurrence time statistics of landslide events simulated by a cellular automaton model
NASA Astrophysics Data System (ADS)
Piegari, Ester; Di Maio, Rosa; Avella, Adolfo
2014-05-01
The recurrence time statistics of a cellular automaton modelling landslide events are analyzed by performing a numerical analysis in the parameter space and estimating Fano factor behaviors. The model is an extended version of the OFC model, which is a paradigm for SOC in non-conserved systems, but it works differently from the original OFC model as a finite value of the driving rate is applied. By driving the system to instability with different rates, the model exhibits a smooth transition from a correlated to an uncorrelated regime as the effect of a change in the predominant mechanisms that propagate instability. If the rate at which instability is approached is small, chain processes dominate the landslide dynamics, and power laws govern probability distributions. However, the power-law regime typical of SOC-like systems is found in a range of return intervals that becomes shorter and shorter as the driving rates increase. Indeed, if the rates at which instability is approached are large, domino processes are no longer active in propagating instability, and large events simply occur because a large number of cells simultaneously reach instability. Such a gradual loss of the effectiveness of the chain propagation mechanism causes the system to gradually enter an uncorrelated regime where recurrence time distributions are characterized by Weibull behaviors. Simulation results are qualitatively compared with those from a recent analysis performed by Witt et al. (Earth Surf. Process. Landforms, 35, 1138, 2010) for the first complete databases of landslide occurrences over a period as large as fifty years. From the comparison with the extensive landslide data set, the numerical analysis suggests that the statistics of such landslide data seem to be described by a crossover region between a correlated regime and an uncorrelated regime, where recurrence time distributions are characterized by power-law and Weibull behaviors for short and long return times, respectively. Finally, in such a region of the parameter space, clear indications of temporal correlations and clustering from the Fano factor behaviors support, at least in part, the analysis performed by Witt et al. (2010).
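A sketch of the Fano factor estimate used above to diagnose temporal clustering: counts of events in non-overlapping windows of duration T, with F(T) = variance/mean of the counts. The event times below are synthetic Poisson arrivals, for which F(T) is approximately 1; values well above 1 indicate clustering:

```python
import numpy as np

def fano_factor(event_times, window):
    """Fano factor of event counts in non-overlapping windows of length 'window'."""
    t = np.asarray(event_times, float)
    edges = np.arange(t.min(), t.max() + window, window)
    counts, _ = np.histogram(t, bins=edges)
    return counts.var(ddof=1) / counts.mean()

rng = np.random.default_rng(2)
poisson_times = np.cumsum(rng.exponential(1.0, size=5000))   # uncorrelated (Poisson) events
for T in (1.0, 10.0, 100.0):
    print(T, fano_factor(poisson_times, T))
```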
Brands, H; Maassen, S R; Clercx, H J
1999-09-01
In this paper the applicability of a statistical-mechanical theory to freely decaying two-dimensional (2D) turbulence on a bounded domain is investigated. We consider an ensemble of direct numerical simulations in a square box with stress-free boundaries, with a Reynolds number that is of the same order as in experiments on 2D decaying Navier-Stokes turbulence. The results of these simulations are compared with the corresponding statistical equilibria, calculated from different stages of the evolution. It is shown that the statistical equilibria calculated from early times of the Navier-Stokes evolution do not correspond to the dynamical quasistationary states. At best, the global topological structure is correctly predicted from a relatively late time in the Navier-Stokes evolution, when the quasistationary state has almost been reached. This failure of the (basically inviscid) statistical-mechanical theory is related to viscous dissipation and net leakage of vorticity in the Navier-Stokes dynamics at moderate values of the Reynolds number.
NASA Astrophysics Data System (ADS)
Liu, Zhichao; Zhao, Yunjie; Zeng, Chen; Computational Biophysics Lab Team
As the main protein of the bacterial flagellum, flagellin plays an important role in perception and defense response. The newly discovered locus, FLS2, is ubiquitously expressed. FLS2 encodes a putative receptor kinase and shares many homologies with some plant resistance genes and even with some components of the immune systems of mammals and insects. In Arabidopsis, FLS2 perception is achieved by recognition of the epitope flg22, which induces FLS2 heteromerization with BAK1 and finally plant immunity. Here we use analytical methods such as Direct Coupling Analysis (DCA) together with Molecular Dynamics (MD) simulations to get a better understanding of the defense mechanism of FLS2. This may facilitate a redesign of flg22, or de-novo design for desired specificity and potency, to extend the immune properties of FLS2 to other important crops and vegetables.
Analysis of Failures of High Speed Shaft Bearing System in a Wind Turbine
NASA Astrophysics Data System (ADS)
Wasilczuk, Michał; Gawarkiewicz, Rafał; Bastian, Bartosz
2018-01-01
During the operation of wind turbines with a gearbox of traditional configuration, consisting of one planetary stage and two helical stages, a high failure rate of high-speed shaft bearings is observed. Such a high failure frequency is not reflected in the results of standard calculations of bearing durability; most probably it can be attributed to an atypical failure mechanism. The authors studied problems in 1.5 MW wind turbines of one of the Polish wind farms. The analysis showed that problems of high failure rates are commonly met all over the world and that the statistics for the analysed turbines were very similar. After studying the potential failure mechanism and its possible causes, a modification of the existing bearing system was proposed. Various options, with different bearing types, were investigated. The different versions were examined for expected durability increase, extent of necessary gearbox modifications and the possibility of solving the existing problems in operation.
Negative specific heat with trapped ultracold quantum gases
NASA Astrophysics Data System (ADS)
Strzys, M. P.; Anglin, J. R.
2014-01-01
The second law of thermodynamics normally prescribes that heat tends to disperse, but in certain cases it instead implies that heat will spontaneously concentrate. The spontaneous formation of stars out of cold cosmic nebulae, without which the universe would be dark and dead, is an example of this phenomenon. Here we show that the counter-intuitive thermodynamics of spontaneous heat concentration can be studied experimentally with trapped quantum gases, by using optical lattice potentials to realize weakly coupled arrays of simple dynamical subsystems, so that under the standard assumptions of statistical mechanics, the behavior of the whole system can be predicted from ensemble properties of the isolated components. A naive application of the standard statistical mechanical formalism then identifies the subsystem excitations as heat in this case, but predicts them to share the peculiar property of self-gravitating protostars, of having negative micro-canonical specific heat. Numerical solution of real-time evolution equations confirms the spontaneous concentration of heat in such arrays, with initially dispersed energy condensing quickly into dense ‘droplets’. Analysis of the nonlinear dynamics in adiabatic terms allows it to be related to familiar modulational instabilities. The model thus provides an example of a dictionary mesoscopic system, in which the same non-trivial phenomenon can be understood in both thermodynamical and mechanical terms.
The role of internal duplication in the evolution of multi-domain proteins.
Nacher, J C; Hayashida, M; Akutsu, T
2010-08-01
Many proteins consist of several structural domains. These multi-domain proteins have likely been generated by selective genome growth dynamics during evolution to perform new functions as well as to create structures that fold on a biologically feasible time scale. Domain units frequently evolved through a variety of genetic shuffling mechanisms. Here we examine the protein domain statistics of more than 1000 organisms including eukaryotic, archaeal and bacterial species. The analysis extends earlier findings on asymmetric statistical laws for proteome to a wider variety of species. While proteins are composed of a wide range of domains, displaying a power-law decay, the computation of domain families for each protein reveals an exponential distribution, characterizing a protein universe composed of a thin number of unique families. Structural studies in proteomics have shown that domain repeats, or internal duplicated domains, represent a small but significant fraction of genome. In spite of its importance, this observation has been largely overlooked until recently. We model the evolutionary dynamics of proteome and demonstrate that these distinct distributions are in fact rooted in an internal duplication mechanism. This process generates the contemporary protein structural domain universe, determines its reduced thickness, and tames its growth. These findings have important implications, ranging from protein interaction network modeling to evolutionary studies based on fundamental mechanisms governing genome expansion.
Spin Glass a Bridge Between Quantum Computation and Statistical Mechanics
NASA Astrophysics Data System (ADS)
Ohzeki, Masayuki
2013-09-01
In this chapter, we show two fascinating topics lying between quantum information processing and statistical mechanics. First, we introduce an elaborate technique, the surface code, to prepare a particular quantum state with robustness against decoherence. Interestingly, the theoretical limitation of the surface code for restoring the quantum state, the accuracy threshold, has a close connection with the problem of the phase transition in a special model known as a spin glass, which is one of the most active research areas in statistical mechanics. The phase transition in spin glasses is an intractable problem, since we must deal with a many-body system with complicated interactions whose signs change depending on the distance between spins. Fortunately, recent progress in spin-glass theory enables us to predict the precise location of the critical point at which the phase transition occurs. This means that statistical mechanics can be used to reveal one of the most interesting parts of quantum information processing. We show how to import this special tool of statistical mechanics into the problem of the accuracy threshold in quantum computation. Second, we show another interesting technique that exploits quantum nature: quantum annealing. The purpose of quantum annealing is to search for the most favored solution of a multivariable function, namely to solve an optimization problem. The most typical instance is the traveling salesman problem, which asks for the minimum tour visiting all the cities. In quantum annealing, we introduce quantum fluctuations to drive a system with an artificial Hamiltonian whose ground state represents the optimal solution of the specific problem we wish to solve. Inducing the quantum fluctuations gives rise to the quantum tunneling effect, which allows nontrivial hopping from state to state. We then sketch a strategy to control the quantum fluctuations so as to reach the ground state efficiently. Such a generic framework is called quantum annealing. The most typical instance is quantum adiabatic computation, based on the adiabatic theorem. Quantum adiabatic computation, as discussed in another chapter, unfortunately has a crucial bottleneck for some optimization problems. We here introduce several recent attempts to overcome this weak point by using developments in statistical mechanics. Through both topics, we shed light on the birth of an interdisciplinary field between quantum mechanics and statistical mechanics.
Lankadurai, Brian P.; Furdui, Vasile I.; Reiner, Eric J.; Simpson, André J.; Simpson, Myrna J.
2013-01-01
1H NMR-based metabolomics was used to measure the response of Eisenia fetida earthworms after exposure to sub-lethal concentrations of perfluorooctane sulfonate (PFOS) in soil. Earthworms were exposed to a range of PFOS concentrations (five, 10, 25, 50, 100 or 150 mg/kg) for two, seven and fourteen days. Earthworm tissues were extracted and analyzed by 1H NMR. Multivariate statistical analysis of the metabolic response of E. fetida to PFOS exposure identified time-dependent responses that were comprised of two separate modes of action: a non-polar narcosis type mechanism after two days of exposure and increased fatty acid oxidation after seven and fourteen days of exposure. Univariate statistical analysis revealed that 2-hexyl-5-ethyl-3-furansulfonate (HEFS), betaine, leucine, arginine, glutamate, maltose and ATP are potential indicators of PFOS exposure, as the concentrations of these metabolites fluctuated significantly. Overall, NMR-based metabolomic analysis suggests elevated fatty acid oxidation, disruption in energy metabolism and biological membrane structure and a possible interruption of ATP synthesis. These conclusions obtained from analysis of the metabolic profile in response to sub-lethal PFOS exposure indicates that NMR-based metabolomics is an excellent discovery tool when the mode of action (MOA) of contaminants is not clearly defined. PMID:24958147
Statistical mechanics of multipartite entanglement
NASA Astrophysics Data System (ADS)
Facchi, P.; Florio, G.; Marzolino, U.; Parisi, G.; Pascazio, S.
2009-02-01
We characterize the multipartite entanglement of a system of n qubits in terms of the distribution function of the bipartite purity over all balanced bipartitions. We search for those (maximally multipartite entangled) states whose purity is minimum for all bipartitions and recast this optimization problem into a problem of statistical mechanics.
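A sketch of the quantity being optimised above: the purity Tr(rho_A^2) of the reduced state over a balanced bipartition, averaged over all such bipartitions, evaluated here for a random n-qubit pure state (the test state is simply a random vector, not a maximally multipartite entangled state):

```python
import numpy as np
from itertools import combinations

def bipartite_purity(state, subset, n):
    """Purity Tr(rho_A^2) of the reduced state on the qubit subset A."""
    psi = state.reshape([2] * n)
    rest = [q for q in range(n) if q not in subset]
    psi = np.transpose(psi, list(subset) + rest).reshape(2 ** len(subset), -1)
    rho = psi @ psi.conj().T
    return float(np.real(np.trace(rho @ rho)))

def average_balanced_purity(state, n):
    """Average purity over all balanced bipartitions (n assumed even)."""
    purities = [bipartite_purity(state, s, n) for s in combinations(range(n), n // 2)]
    return np.mean(purities)

n = 6
rng = np.random.default_rng(3)
psi = rng.normal(size=2 ** n) + 1j * rng.normal(size=2 ** n)
psi /= np.linalg.norm(psi)
print(average_balanced_purity(psi, n))   # lower average purity = more multipartite entanglement
```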
An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics
ERIC Educational Resources Information Center
Ellis, Frank B.; Ellis, David C.
2008-01-01
Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…
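A sketch of the underlying two-state statistics: Boltzmann populations and mean energy of a two-level system as a function of temperature, for an illustrative level gap (the gap value below is an assumption, not the apparatus described in the article):

```python
import numpy as np

def two_state_populations(delta_E, T, k_B=1.380649e-23):
    """Ground/excited populations and mean energy for a two-level system at temperature T."""
    x = delta_E / (k_B * np.asarray(T, float))
    p_excited = 1.0 / (1.0 + np.exp(x))   # Boltzmann weight of the upper level
    p_ground = 1.0 - p_excited
    mean_E = p_excited * delta_E
    return p_ground, p_excited, mean_E

delta_E = 1.0e-21   # joules, illustrative gap
for T in (50.0, 300.0, 1000.0):
    print(T, two_state_populations(delta_E, T))
```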
Orbital Roof Fractures as an Indicator for Concomitant Ocular Injury
2017-11-12
between these two groups to indicate a statistically significant difference in mechanism of injury, subjective symptoms, CT and exam findings, and...using Pearson's χ2 test or Fisher's exact test to indicate a statistically significant difference in mechanism of injury, subjective symptoms, CT and
Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.
Venturi, D; Karniadakis, G E
2014-06-08
Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.
Usov, Ivan; Nyström, Gustav; Adamcik, Jozef; Handschin, Stephan; Schütz, Christina; Fall, Andreas; Bergström, Lennart; Mezzenga, Raffaele
2015-01-01
Nanocellulose fibrils are ubiquitous in nature and nanotechnologies but their mesoscopic structural assembly is not yet fully understood. Here we study the structural features of rod-like cellulose nanoparticles on a single particle level, by applying statistical polymer physics concepts on electron and atomic force microscopy images, and we assess their physical properties via quantitative nanomechanical mapping. We show evidence of right-handed chirality, observed on both bundles and on single fibrils. Statistical analysis of contours from microscopy images shows a non-Gaussian kink angle distribution. This is inconsistent with a structure consisting of alternating amorphous and crystalline domains along the contour and supports process-induced kink formation. The intrinsic mechanical properties of nanocellulose are extracted from nanoindentation and persistence length method for transversal and longitudinal directions, respectively. The structural analysis is pushed to the level of single cellulose polymer chains, and their smallest associated unit with a proposed 2 × 2 chain-packing arrangement. PMID:26108282
NASA Astrophysics Data System (ADS)
Pilger, Christoph; Schmidt, Carsten; Bittner, Michael
2013-02-01
The detection of infrasonic signals in temperature time series of the mesopause altitude region (at about 80-100 km) is performed at the German Remote Sensing Data Center of the German Aerospace Center (DLR-DFD) using GRIPS instrumentation (GRound-based Infrared P-branch Spectrometers). Mesopause temperature values with a temporal resolution of up to 10 s are derived from the observation of nocturnal airglow emissions and permit the identification of signals within the long-period infrasound range. Spectral intensities of wave signatures with periods between 2.5 and 10 min are estimated applying the wavelet analysis technique to one minute mean temperature values. Selected events as well as the statistical distribution of 40 months of observation are presented and discussed with respect to resonant modes of the atmosphere. The mechanism of acoustic resonance generated by strong infrasonic sources is a potential explanation of distinct features with periods between 3 and 5 min observed in the dataset.
Quantum signature of chaos and thermalization in the kicked Dicke model
NASA Astrophysics Data System (ADS)
Ray, S.; Ghosh, A.; Sinha, S.
2016-09-01
We study the quantum dynamics of the kicked Dicke model (KDM) in terms of the Floquet operator, and we analyze the connection between chaos and thermalization in this context. The Hamiltonian map is constructed by suitably taking the classical limit of the Heisenberg equation of motion to study the corresponding phase-space dynamics, which shows a crossover from regular to chaotic motion by tuning the kicking strength. The fixed-point analysis and calculation of the Lyapunov exponent (LE) provide us with a complete picture of the onset of chaos in phase-space dynamics. We carry out a spectral analysis of the Floquet operator, which includes a calculation of the quasienergy spacing distribution and structural entropy to show the correspondence to the random matrix theory in the chaotic regime. Finally, we analyze the thermodynamics and statistical properties of the bosonic sector as well as the spin sector, and we discuss how such a periodically kicked system relaxes to a thermalized state in accordance with the laws of statistical mechanics.
Craven, Galen T; Nitzan, Abraham
2018-01-28
Statistical properties of Brownian motion that arise by analyzing, separately, trajectories over which the system energy increases (upside) or decreases (downside) with respect to a threshold energy level are derived. This selective analysis is applied to examine transport properties of a nonequilibrium Brownian process that is coupled to multiple thermal sources characterized by different temperatures. Distributions, moments, and correlation functions of a free particle that occur during upside and downside events are investigated for energy activation and energy relaxation processes and also for positive and negative energy fluctuations from the average energy. The presented results are sufficiently general and can be applied without modification to the standard Brownian motion. This article focuses on the mathematical basis of this selective analysis. In subsequent articles in this series, we apply this general formalism to processes in which heat transfer between thermal reservoirs is mediated by activated rate processes that take place in a system bridging them.
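A simplified stand-in for the selective (upside/downside) analysis described above: simulate a free Brownian particle's velocity as an Ornstein-Uhlenbeck process, take the kinetic energy, and average it separately over time steps above and below a threshold energy. The threshold, friction, and temperature values are illustrative assumptions, and conditioning on instantaneous energy is a simplification of the paper's trajectory segmentation:

```python
import numpy as np

def upside_downside_energy(n_steps=200000, dt=1e-3, gamma=1.0, kT=1.0, m=1.0, seed=4):
    """Mean kinetic energy conditioned on lying above/below the threshold E = kT/2."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n_steps)
    sigma = np.sqrt(2.0 * gamma * kT / m * dt)
    for i in range(1, n_steps):                      # Euler-Maruyama Langevin step
        v[i] = v[i - 1] - gamma * v[i - 1] * dt + sigma * rng.normal()
    E = 0.5 * m * v ** 2
    threshold = 0.5 * kT                             # equipartition average as the threshold
    return E[E > threshold].mean(), E[E <= threshold].mean()

up, down = upside_downside_energy()
print("upside mean energy:", up, "downside mean energy:", down)
```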
Statistical principle and methodology in the NISAN system.
Asano, C
1979-01-01
The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is widely applicable to both statistical situations, confirmatory analysis and exploratory analysis, and is designed to embody statistical wisdom and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594
atBioNet--an integrated network analysis tool for genomics and biomarker discovery.
Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida
2012-07-20
Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. Statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but that this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides systematic insight into protein/gene interactions by examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and for biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
Mechanical Characterization of Polysilicon MEMS: A Hybrid TMCMC/POD-Kriging Approach.
Mirzazadeh, Ramin; Eftekhar Azam, Saeed; Mariani, Stefano
2018-04-17
Microscale uncertainties related to the geometry and morphology of polycrystalline silicon films, constituting the movable structures of micro electro-mechanical systems (MEMS), were investigated through a joint numerical/experimental approach. An on-chip testing device was designed and fabricated to deform a compliant polysilicon beam. In previous studies, we showed that the scattering in the input–output characteristics of the device can be properly described only if statistical features related to the morphology of the columnar polysilicon film and to the etching process adopted to release the movable structure are taken into account. In this work, a high fidelity finite element model of the device was used to feed a transitional Markov chain Monte Carlo (TMCMC) algorithm for the estimation of the unknown parameters governing the aforementioned statistical features. To reduce the computational cost of the stochastic analysis, a synergy of proper orthogonal decomposition (POD) and kriging interpolation was adopted. Results are reported for a batch of nominally identical tested devices, in terms of measurement error-affected probability distributions of the overall Young’s modulus of the polysilicon film and of the overetch depth.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, James W.; Liu, Da-Jiang
We develop statistical mechanical models amenable to analytic treatment for the dissociative adsorption of O2 at hollow sites on fcc(100) metal surfaces. The models incorporate exclusion of nearest-neighbor pairs of adsorbed O. However, corresponding simple site-blocking models, where adsorption requires a large ensemble of available sites, exhibit an anomalously fast initial decrease in sticking. Thus, in addition to blocking, our models also incorporate more facile adsorption via orientational steering and funneling dynamics (features supported by ab initio Molecular Dynamics studies). Behavior for equilibrated adlayers is distinct from those with finite adspecies mobility. We focus on the low-temperature limited-mobility regime where analysis of the associated master equations readily produces exact results for both short- and long-time behavior. Kinetic Monte Carlo simulation is also utilized to provide a more complete picture of behavior. These models capture both the initial decrease and the saturation of the experimentally observed sticking versus coverage, as well as features of non-equilibrium adlayer ordering as assessed by surface-sensitive diffraction.
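The following Python sketch illustrates only the crude site-blocking ingredient of the model above: random sequential dissociative adsorption of O2 on a periodic square lattice of hollow sites, with the two O adatoms placed on diagonal (second-neighbour) sites and nearest-neighbour O-O pairs excluded. It deliberately omits the steering/funneling dynamics and adspecies mobility that the abstract identifies as essential; the lattice size and attempt counts are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(1)
L = 64
occ = np.zeros((L, L), dtype=bool)
nn = [(1, 0), (-1, 0), (0, 1), (0, -1)]           # nearest neighbours
diag = [(1, 1), (1, -1), (-1, 1), (-1, -1)]       # second (diagonal) neighbours

def blocked(i, j):
    """True if site (i, j) is occupied or has an occupied nearest neighbour."""
    if occ[i % L, j % L]:
        return True
    return any(occ[(i + di) % L, (j + dj) % L] for di, dj in nn)

attempts_per_block, n_blocks = 5000, 20
for b in range(n_blocks):
    hits = 0
    for _ in range(attempts_per_block):
        i, j = rng.integers(L), rng.integers(L)
        di, dj = diag[rng.integers(4)]
        k, l = (i + di) % L, (j + dj) % L
        if not blocked(i, j) and not blocked(k, l):
            occ[i, j] = occ[k, l] = True          # place the dissociated pair
            hits += 1
    print(f"coverage = {occ.mean():.3f}   sticking = {hits / attempts_per_block:.3f}")

Tracking the acceptance fraction against coverage in this way reproduces the fast initial drop in sticking that the abstract attributes to pure blocking.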
Analysis of swarm behaviors based on an inversion of the fluctuation theorem.
Hamann, Heiko; Schmickl, Thomas; Crailsheim, Karl
2014-01-01
A grand challenge in the field of artificial life is to find a general theory of emergent self-organizing systems. In swarm systems most of the observed complexity is based on motion of simple entities. Similarly, statistical mechanics focuses on collective properties induced by the motion of many interacting particles. In this article we apply methods from statistical mechanics to swarm systems. We try to explain the emergent behavior of a simulated swarm by applying methods based on the fluctuation theorem. Empirical results indicate that swarms are able to produce negative entropy within an isolated subsystem due to frozen accidents. Individuals of a swarm are able to locally detect fluctuations of the global entropy measure and store them, if they are negative entropy productions. By accumulating these stored fluctuations over time the swarm as a whole is producing negative entropy and the system ends up in an ordered state. We claim that this indicates the existence of an inverted fluctuation theorem for emergent self-organizing dissipative systems. This approach bears the potential of general applicability.
NASA Astrophysics Data System (ADS)
Yoshida, Yuki; Karakida, Ryo; Okada, Masato; Amari, Shun-ichi
2017-04-01
Weight normalization, a newly proposed optimization method for neural networks by Salimans and Kingma (2016), decomposes the weight vector of a neural network into a radial length and a direction vector, and the decomposed parameters follow their steepest descent update. They reported that learning with weight normalization achieves faster convergence in several tasks, including image recognition and reinforcement learning, than learning with the conventional parameterization. However, it remains theoretically unexplained how weight normalization improves the convergence speed. In this study, we applied a statistical mechanical technique to analyze on-line learning in single-layer linear and nonlinear perceptrons with weight normalization. By deriving order parameters of the learning dynamics, we confirmed quantitatively that weight normalization realizes fast convergence by automatically tuning the effective learning rate, regardless of the nonlinearity of the neural network. This property is realized when the initial value of the radial length is near the global minimum; therefore, our theory suggests that it is important to choose the initial value of the radial length appropriately when using weight normalization.
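The reparameterization studied above is compact enough to sketch directly. The Python example below runs on-line learning for a single linear perceptron with squared loss, with the weight vector written as w = g * v/||v|| and gradients taken with respect to g and v; the teacher/student setup, input dimension and learning rate are illustrative assumptions, not the order-parameter analysis of the paper.

import numpy as np

rng = np.random.default_rng(2)
N, eta, n_steps = 100, 0.05, 5000

w_teacher = rng.normal(size=N) / np.sqrt(N)   # target weights
v = rng.normal(size=N)                        # direction parameters
g = 0.1                                       # radial length, deliberately small start

for t in range(n_steps):
    x = rng.normal(size=N)                    # fresh example each step (on-line)
    target = w_teacher @ x
    v_norm = np.linalg.norm(v)
    w = g * v / v_norm                        # weight-normalized parameterization
    err = w @ x - target                      # prediction error
    grad_w = err * x                          # dL/dw for squared loss
    grad_g = grad_w @ (v / v_norm)            # chain rule: dL/dg
    grad_v = (g / v_norm) * (grad_w - grad_g * v / v_norm)   # dL/dv
    g -= eta * grad_g
    v -= eta * grad_v
    if t % 1000 == 0:
        gen_err = 0.5 * np.sum((g * v / np.linalg.norm(v) - w_teacher) ** 2)
        print(f"step {t:5d}   generalization error = {gen_err:.4f}")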
West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael
2017-01-01
Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, The Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group review the various data collection methods available. Our recommendation is the use of the laboratory information management systems as a recording mechanism for preanalytical errors as this provides the easiest and most standardized mechanism of data capture.
Do Turkish patients with lumbar disc herniation know body mechanics?
Topcu, Sacide Yildizeli
2017-01-01
The most common and important cause of low back pain is lumbar disc herniation. Patients with lumbar disc herniation face difficulties in daily activities due to the reduction of physical functions. In order to maintain daily activities without pain and discomfort, patients should be informed about proper positions and body mechanics. The aim of the study was to determine the knowledge and practices of patients with lumbar disc herniation regarding body mechanics. This descriptive study was conducted with 75 patients with lumbar disc herniation in Edirne, Turkey. The population consisted of 75 patients who agreed to participate in the study. Data were collected using a questionnaire developed by the researcher according to the literature. Descriptive statistics, Student's t-test, analysis of variance and correlation analysis were used to assess the data. The significance level was accepted as 0.05. It was found that 53.3% of the patients experienced awful/very severe pain, and that the patients had sufficient information on some points: mobilisation, standing, carrying goods, leaning back while sitting, leaning on something while standing, getting support from the chair when standing up, avoiding sudden position changes, and changing feet frequently while standing. A statistically significant relationship was detected between educational level and knowledge of body mechanics. This study shows that individuals with lumbar disc herniation do not have enough information about body mechanics and that they experience long-term severe pain. Nurses and other health care workers have an important role in explaining the importance of body mechanics to patients and should encourage them to use it in daily life.
Novel Mechanism for Reducing Acute and Chronic Neurodegeneration After Traumatic Brain Injury
2017-07-01
glutamate from the brain. Scope: We will test this novel and powerful neuroprotective treatment in a rat model of repetitive mild (concussive) TBIs...variability. 2. Completed statistical analysis of behavioral experiments examining effects of rGOT and rGOT + OxAc on outcome on rotarod and Morris water ...neuroprotective treatment in a rat model of a single moderate TBI and in a rat model of repetitive mild (concussive) TBIs. Outcome measures include blood and
[Study of beta-turns in globular proteins].
Amirova, S R; Milchevskiĭ, Iu V; Filatov, I V; Esipova, N G; Tumanian, V G
2005-01-01
The formation of beta-turns in globular proteins has been studied by the method of molecular mechanics. The statistical method of discriminant analysis was applied to the calculated energy components and the sequences of oligopeptide segments, from which a prediction of type I beta-turns was derived. The accuracy of true positive prediction is 65%. Components of conformational energy that considerably affect beta-turn formation were delineated: these are the torsional energy, the energy of hydrogen bonds, and the van der Waals energy.
Frequency distribution histograms for the rapid analysis of data
NASA Technical Reports Server (NTRS)
Burke, P. V.; Bullen, B. L.; Poff, K. L.
1988-01-01
The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.
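A minimal Python sketch of the practice described above: report the mean and standard error of a response, then print a frequency distribution histogram so the shape of the individual responses stays visible. The bimodal toy data are an illustrative assumption.

import numpy as np

rng = np.random.default_rng(3)
responses = np.concatenate([rng.normal(10, 1, 60), rng.normal(14, 1, 40)])

mean = responses.mean()
sem = responses.std(ddof=1) / np.sqrt(responses.size)
print(f"mean = {mean:.2f}   standard error = {sem:.2f}")

counts, edges = np.histogram(responses, bins=10)
for lo, hi, c in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:5.1f}-{hi:5.1f} | {'#' * c}")

In this example the mean and standard error alone would hide the two sub-populations, which is exactly the point the abstract makes for the histogram display.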
NASA Astrophysics Data System (ADS)
Benzannache, N.; Bezazi, A.; Bouchelaghem, H.; Boumaaza, M.; Amziane, S.; Scarpa, F.
2018-01-01
The mechanical performance of polymer concrete beams subjected to 3-point bending was investigated. The polymer concrete incorporates marble powder waste and quarry sand. The results showed that the type of sand and the amounts of marble powder and sand aggregate significantly affected the resistance of the polymer concrete beams. The marble waste increased their bending strength by reducing the porosity of the polymer concrete.
Bose-Einstein distribution of money in a free-market economy. II
NASA Astrophysics Data System (ADS)
Kürten, K. E.; Kusmartsev, F. V.
2011-01-01
We discuss the application of methods of statistical mechanics to the free-market economy (Kusmartsev F. V., Phys. Lett. A, 375 (2011) 966) and find that the most general distribution of money or income in a free-market economy has the form of a general Bose-Einstein distribution. The market is thereby described by three parameters: temperature, chemical potential and the space dimensionality. Numerical simulations and a detailed analysis of a generic model confirm this finding.
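For illustration, the Python sketch below evaluates a Bose-Einstein-type income density with the three parameters named in the abstract (temperature T, chemical potential mu and dimensionality d). The specific density-of-states factor m**(d/2 - 1) and the parameter values are assumptions of this sketch, not the paper's fitted model.

import numpy as np

T, mu, d = 5.0, -1.0, 3.0                # mu < 0 keeps the denominator positive
dm = 0.05
m = np.arange(dm, 80.0, dm)              # money/income grid

weight = m ** (d / 2 - 1) / (np.exp((m - mu) / T) - 1.0)
pdf = weight / (weight.sum() * dm)       # normalize by a simple Riemann sum

mean_income = (m * pdf).sum() * dm
print(f"mean income on this grid: {mean_income:.2f}")
for mi in (1.0, 5.0, 20.0):              # sample a few density values
    print(f"P(m = {mi:4.1f}) ~ {np.interp(mi, m, pdf):.4f}")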
Multiobjective Optimal Control Methodology for the Analysis of Certain Sociodynamic Problems
2009-03-01
but less expensive in both time and memory. 137 References [1] R. Albert and A-L Barabasi. Statistical mechanics of complex networks. Reviews of Modern...Review, E(51):4282–4286, 1995. [24] D. Helbing, P. Molnar, and F. Schweitzer . Computer simulation of pedestrian dynamics and trail formation. May 1998...Patterson AFB, OH, 2001. [49] F. Schweitzer . Brownian Agents and Active Particles. Springer, Santa Fe, NM, 2003. [50] P. Sen. Complexities of social
Statistical Analysis of Physiological Signals
NASA Astrophysics Data System (ADS)
Ruiz, María G.; Pérez, Leticia
2003-07-01
In spite of two hundred years of clinical practice, homeopathy still lacks a scientific basis. Its fundamental laws, the similia principle and the activity of the so-called ultra-high dilutions, are controversial issues that fit neither into mainstream medicine nor into the current physical-chemistry field. Aside from its clinical efficacy, the identification of physical-chemistry parameters as markers of the homeopathic effect would allow mathematical models to be constructed [1], which, in turn, could provide clues regarding the mechanism involved.
Hao, Bo; Gao, Di; Tang, Da-Wei; Wang, Xiao-Guang; Liu, Shui-Ping; Kong, Xiao-Ping; Liu, Chao; Huang, Jing-Lu; Bi, Qi-Ming; Quan, Li; Luo, Bin
2012-04-01
To explore how human enterovirus 71 (EV71) invades the brainstem and how intercellular adhesion molecule-1 (ICAM-1) participates, by analyzing the expression and distribution of EV71 and ICAM-1 in the brainstems of infants with brainstem encephalitis. Twenty-two brainstems of infants with brainstem encephalitis were collected as the experimental group, and 10 brainstems from infants with fatal congenital heart disease were selected as the control group. Sections with perivascular cuffing were selected to observe EV71-VP1 expression by the immunohistochemistry method, and ICAM-1 expression was detected in the sections with positive EV71-VP1 expression. Staining image analysis and statistical analysis were performed, and the experimental and control groups were compared. (1) EV71-VP1-positive cells in the experimental group, showing dark brown particles, were mainly astrocytes in the brainstem; the control group was negative. (2) ICAM-1-positive cells stained dark brown. The expression in inflammatory cells (around blood vessels of the brainstem and in glial nodules) and gliocytes increased, and the results showed a statistically significant difference compared with the control group (P < 0.05). Brainstem encephalitis findings can be used to diagnose fatal EV71 infection in infants. EV71 can invade the brainstem via the hematogenous route. ICAM-1 may play an important role in the pathogenic process.
Alhadlaq, Adel; Alkhadra, Thamer; El-Bialy, Tarek
2016-05-01
To compare anchorage in cases in which a transpalatal arch was used to enhance anchorage with both continuous and segmented arch techniques. Twenty cases that required first premolar extraction for orthodontic treatment and a transpalatal arch to enhance anchorage were included in this study. Ten cases were treated using the continuous arch technique, while the other 10 cases were treated using 0.019 × 0.025-inch TMA T-loops with a posterior anchorage bend according to the Burstone and Marcotte description. Lateral cephalometric analysis before and after canine retraction was performed using Ricketts analysis to measure the anteroposterior position of the upper first molar relative to the vertical line from the Pt point. Data were analyzed using an independent-sample t-test. There was a statistically significant forward movement of the upper first molar in cases treated by continuous arch mechanics (4.5 ± 3.0 mm) compared with segmented arch mechanics (-0.7 ± 1.4 mm; P = .01). The posterior anchorage bend added to the T-loop used to retract the maxillary canine can enhance anchorage during maxillary canine retraction.
Femoral venous access is safe in burned children: an analysis of 224 catheters.
Goldstein, A M; Weber, J M; Sheridan, R L
1997-03-01
To document the incidence of septic and mechanical complications associated with femoral venous catheters in a subgroup of patients thought to be at particularly high risk of both: young children with large burns. An analysis of data collected prospectively on all femoral venous catheters placed during a 4-year period at a regional pediatric burn facility. There were 224 femoral catheters placed in 86 children with an average age of 5.3 +/- 5.1 years and an average burn size of 38% +/- 23%. Catheters were left in place for a mean duration of 5.7 days. Catheter-related sepsis occurred with 4.9% of the catheters, and mechanical complications occurred in 3.5% of the patients. There was no statistically significant association between the risk of catheter sepsis and the placement of catheters through burned versus unburned skin. Similarly, the risk of sepsis was equivalent between lines placed over a guide wire and those placed at a new site. Femoral venous catheters are safe in burned children and are associated with a low incidence of infectious and mechanical complications.
[Influence of geomagnetic storms on the balance of autonomic regulatory mechanisms].
Chichinadze, G; Tvildiani, L; Kvachadze, I; Tarkhan-Mouravi, I
2005-09-01
The investigation aimed to evaluate autonomic regulatory mechanisms in practically healthy persons during geomagnetically quiet periods and during geomagnetic storms. The examinations were conducted among volunteer young men (n=64) aged 18-22 years. Autonomic function was studied on the basis of heart rate variability. Periods were considered geomagnetically quiet when the value of the K-index was no more than 2, and a geomagnetic storm was considered when the value of the index was 5 or more. It was ascertained that in both cases the basic statistical indices of the heart rate were identical. The analysis of the R-R interval spectral power made it possible to sort the examined persons into three different groups. The data obtained suggest that geomagnetic storms influence the human organism through excitation of the vagal centers. This phenomenon may be considered a self-regulatory physiologic mechanism of an adaptive character. The analysis of the spectral power of R-R intervals may be considered a sensitive method for the detection of magnetolabile persons.
Quantum mechanics of black holes.
Witten, Edward
2012-08-03
The popular conception of black holes reflects the behavior of the massive black holes found by astronomers and described by classical general relativity. These objects swallow up whatever comes near and emit nothing. Physicists who have tried to understand the behavior of black holes from a quantum mechanical point of view, however, have arrived at quite a different picture. The difference is analogous to the difference between thermodynamics and statistical mechanics. The thermodynamic description is a good approximation for a macroscopic system, but statistical mechanics describes what one will see if one looks more closely.
NASA Astrophysics Data System (ADS)
Descartes, R.; Rota, G.-C.; Euler, L.; Bernoulli, J. D.; Siegel, Edward Carl-Ludwig
2011-03-01
Quantum-statistics Dichotomy: Fermi-Dirac (FDQS) versus Bose-Einstein (BEQS), respectively with contact-repulsion/non-condensation (FDCR) versus attraction/condensation (BEC), are manifestly demonstrated by Taylor-expansion ONLY of their denominator exponential, identified BOTH as Descartes analytic-geometry conic-sections: FDQS as Ellipse (homotopy to the rectangular FDQS distribution-function), VIA a Maxwell-Boltzmann classical-statistics (MBCS) to Parabola MORPHISM, VS. BEQS to Hyperbola, Archimedes' HYPERBOLICITY INEVITABILITY, and as well as generating-functions [Abramowitz-Stegun, Handbook of Mathematical Functions, p. 804], respectively of Euler numbers/functions (via the Riemann zeta-function, domination of quantum-statistics: [Pathria, Statistical Mechanics; Huang, Statistical Mechanics]) VS. Bernoulli numbers/functions. Much can be learned about statistical physics from Euler numbers/functions via Riemann zeta-function(s) VS. Bernoulli numbers/functions [Conway-Guy, Book of Numbers], and about Euler numbers/functions, via the Riemann zeta-function(s) MORPHISM, VS. Bernoulli numbers/functions, and vice versa. Ex.: a Riemann-hypothesis PHYSICS proof PARTLY as BEQS BEC/BEA.
Phase operator problem and macroscopic extension of quantum mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozawa, M.
1997-06-01
To find the Hermitian phase operator of a single-mode electromagnetic field in quantum mechanics, the Schrödinger representation is extended to a larger Hilbert space augmented by states with infinite excitation by nonstandard analysis. The Hermitian phase operator is shown to exist on the extended Hilbert space. This operator is naturally considered as the controversial limit of the approximate phase operators on finite dimensional spaces proposed by Pegg and Barnett. The spectral measure of this operator is a Naimark extension of the optimal probability operator-valued measure for the phase parameter found by Helstrom. Eventually, the two promising approaches to the statistics of the phase in quantum mechanics are synthesized by means of the Hermitian phase operator in the macroscopic extension of the Schrödinger representation. Copyright 1997 Academic Press, Inc.
SurfKin: an ab initio kinetic code for modeling surface reactions.
Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K
2014-10-05
In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
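The transition-state-theory rate expression used by codes of this kind is short enough to sketch. The Python example below evaluates k(T) = (kB*T/h) * (Q_TS/Q_react) * exp(-Ea/(kB*T)); the partition-function ratio and barrier height are made-up numbers for illustration, not SurfKin inputs or outputs.

import math

KB = 1.380649e-23      # Boltzmann constant, J/K
H = 6.62607015e-34     # Planck constant, J*s

def tst_rate(T, q_ratio, ea_joules):
    """Classical TST rate constant (1/s) for a unimolecular surface step."""
    return (KB * T / H) * q_ratio * math.exp(-ea_joules / (KB * T))

ea = 0.75 * 1.602176634e-19        # 0.75 eV barrier converted to joules (assumed)
for T in (500.0, 700.0, 900.0):
    print(f"T = {T:6.1f} K   k = {tst_rate(T, q_ratio=0.1, ea_joules=ea):.3e} 1/s")

The strong temperature dependence of such rate constants is what makes the DFT-based thermochemistry described in the abstract the critical input to the microkinetic mechanism.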
Simulation-based sensitivity analysis for non-ignorably missing data.
Yin, Peng; Shi, Jian Q
2017-01-01
Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) may depend on assumptions or parameters (input) about the missing data, i.e. the missing data mechanism. We call models subject to this uncertainty sensitivity models. To make conventional sensitivity analysis more useful in practice, we need to define some simple and interpretable statistical quantities to assess the sensitivity models and to make evidence-based analysis. In this paper we propose a novel approach that attempts to investigate the plausibility of each missing data mechanism model assumption by comparing the simulated datasets from various MNAR models with the observed data non-parametrically, using the K-nearest-neighbour distances. Some asymptotic theory has also been provided. A key step of this method is to plug in a plausibility evaluation system for each sensitivity parameter, to select plausible values and reject unlikely values, instead of considering all proposed values of sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, the analysis of incomplete longitudinal data and mean estimation with non-ignorable missing data.
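A rough Python sketch of the comparison step described above: for each candidate sensitivity parameter, simulate a dataset under the corresponding assumption and score it against the observed data with average K-nearest-neighbour distances, smaller distances indicating more plausible values. The data-generating mechanism and the way delta shifts the simulated data are assumptions of this sketch, not the authors' models.

import numpy as np

rng = np.random.default_rng(4)
K = 5

def avg_knn_distance(a, b, k=K):
    """Mean distance from each point of a to its k-th nearest neighbour in b (1-D data)."""
    d = np.abs(a[:, None] - b[None, :])       # pairwise distances
    kth = np.sort(d, axis=1)[:, k - 1]
    return kth.mean()

observed = rng.normal(0.5, 1.0, 300)          # stand-in for the observed outcomes

scores = {}
for delta in (0.0, 0.5, 1.0, 2.0):            # candidate sensitivity parameters
    complete = rng.normal(0.0, 1.0, 300)
    simulated = complete + delta              # crude MNAR-style shift of the data
    scores[delta] = avg_knn_distance(observed, simulated)

best = min(scores, key=scores.get)
for delta, s in scores.items():
    print(f"delta = {delta:3.1f}   mean {K}-NN distance = {s:.3f}")
print(f"most plausible sensitivity parameter in this sketch: delta = {best}")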
Statistical Mechanics of Prion Diseases
NASA Astrophysics Data System (ADS)
Slepoy, A.; Singh, R. R.; Pázmándi, F.; Kulkarni, R. V.; Cox, D. L.
2001-07-01
We present a two-dimensional, lattice based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer scale aggregates, while the much narrower incubation time distributions for inoculated lab animals arise from statistical self-averaging. We model ``species barriers'' to prion infection and assess a related treatment protocol.
Physicochemical and microscopic characterization of implant–abutment joints
Lopes, Patricia A.; Carreiro, Adriana F. P.; Nascimento, Rubens M.; Vahey, Brendan R.; Henriques, Bruno; Souza, Júlio C. M.
2018-01-01
Objective: The purpose of this study was to investigate Morse taper implant–abutment joints by chemical, mechanical, and microscopic analysis. Materials and Methods: Surfaces of 10 Morse taper implants and the correlated abutments were inspected by field emission gun-scanning electron microscopy (FEG-SEM) before connection. The implant–abutment connections were tightened at 32 Ncm. For microgap evaluation by FEG-SEM, the systems were embedded in epoxy resin and cross-sectioned at a perpendicular plane of the implant–abutment joint. Furthermore, nanoindentation tests and chemical analysis were performed at the implant–abutment joints. Statistics: Results were statistically analyzed via one-way analysis of variance, with a significance level of P < 0.05. Results: Defects were noticed on different areas of the abutment surfaces. The minimum and maximum size of microgaps ranged from 0.5 μm up to 5.6 μm. Furthermore, defects were detected throughout the implant–abutment joint that can, ultimately, affect the microgap size after connection. Nanoindentation tests revealed a higher hardness (4.2 ± 0.4 GPa) for abutment composed of Ti6Al4V alloy when compared to implant composed of commercially pure Grade 4 titanium (3.2 ± 0.4 GPa). Conclusions: Surface defects produced during the machining of both implants and abutments can increase the size of microgaps and promote a misfit of implant–abutment joints. In addition, the mismatch in mechanical properties between abutment and implant can promote the wear of surfaces, affecting the size of microgaps and consequently the performance of the joints during mastication. PMID:29657532
NASA Astrophysics Data System (ADS)
Queirós, S. M. D.; Tsallis, C.
2005-11-01
The GARCH algorithm is the most renowned generalisation of Engle's original proposal for modelling returns, the ARCH process. Both cases are characterised by presenting a time-dependent and correlated variance or volatility. Besides a memory parameter, b (present in ARCH), and an independent and identically distributed noise, ω, GARCH involves another parameter, c, such that, for c=0, the standard ARCH process is reproduced. In this manuscript we use a generalised noise following a distribution characterised by an index q_n, such that q_n=1 recovers the Gaussian distribution. Matching low statistical moments of the GARCH distribution for returns with a q-Gaussian distribution obtained by maximising the entropy S_q = (1 - Σ_i p_i^q)/(q - 1), the basis of nonextensive statistical mechanics, we obtain a sole analytical connection between q and (b, c, q_n) which turns out to be remarkably good when compared with computational simulations. With this result we also derive an analytical approximation for the stationary distribution of the (squared) volatility. Using a generalised Kullback-Leibler relative entropy form based on S_q, we also analyse the degree of dependence between successive returns, z_t and z_{t+1}, of GARCH(1,1) processes. This degree of dependence is quantified by an entropic index, q_op. Our analysis points to the existence of a unique relation between the three entropic indices q_op, q and q_n of the problem, independent of the value of (b, c).
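A plain GARCH(1,1) simulation sketched from the description above, using ordinary Gaussian noise (q_n = 1) rather than the q-Gaussian noise of the paper; the parameter values are illustrative. The excess kurtosis of the returns shows the fat tails induced by the time-dependent volatility.

import numpy as np

rng = np.random.default_rng(5)
a0, b, c = 0.05, 0.1, 0.85          # constant, ARCH and GARCH coefficients (assumed)
n = 100_000

z = np.empty(n)
sigma2 = a0 / (1.0 - b - c)         # start at the stationary variance
for t in range(n):
    z[t] = np.sqrt(sigma2) * rng.normal()          # return at time t
    sigma2 = a0 + b * z[t] ** 2 + c * sigma2       # volatility recursion

excess_kurtosis = np.mean(z**4) / np.mean(z**2) ** 2 - 3.0
print(f"sample variance  = {z.var():.3f}")
print(f"excess kurtosis  = {excess_kurtosis:.2f}  (0 for a Gaussian)")

Setting c = 0 recovers the ARCH case mentioned in the abstract, and the positive excess kurtosis is the feature the q-Gaussian matching is designed to capture analytically.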
Liu, Hongwei; Wu, Xueping; Zhao, Xiaoning; Zhu, Ping
2016-01-01
Objective To examine if mechanical ventilation with positive end-expiratory pressure (PEEP) combined with intra-aortic balloon pump (IABP) provided a better outcome than IABP alone for the treatment of cardiogenic shock after acute myocardial infarction in patients aged > 60 years. Methods This was a retrospective analysis of data from patients in cardiogenic shock, refractory to pharmacological therapy and treated at a geriatric coronary care unit. Results Sixty-two patients were eligible for study inclusion: 33 received IABP alone; 29 received IABP combined with mechanical ventilation. Patients in the IABP + mechanical ventilation group had lower mean arterial blood pressure (BP), systolic BP and partial pressure of oxygen compared with the IABP group, indicating worse cardiac and pulmonary function. In addition, higher rates of pulmonary infection and renal insufficiency were observed in the IABP + mechanical ventilation group than in the IABP group. A statistically significant improvement of left ventricular function before and after treatment was observed in the IABP + mechanical ventilation group, but not in the IABP group. Pulmonary infection and renal insufficiency were risk factors for all-cause in-hospital mortality; successful revascularization was a negative risk factor. There was no between-group difference in survival. Conclusion Mechanical ventilation with an appropriate level of PEEP appears to enhance the beneficial effects of IABP on left ventricular function for patients in cardiogenic shock. PMID:27020597
Brady, Timothy F; Oliva, Aude
2008-07-01
Recent work has shown that observers can parse streams of syllables, tones, or visual shapes and learn statistical regularities in them without conscious intent (e.g., learn that A is always followed by B). Here, we demonstrate that these statistical-learning mechanisms can operate at an abstract, conceptual level. In Experiments 1 and 2, observers incidentally learned which semantic categories of natural scenes covaried (e.g., kitchen scenes were always followed by forest scenes). In Experiments 3 and 4, category learning with images of scenes transferred to words that represented the categories. In each experiment, the category of the scenes was irrelevant to the task. Together, these results suggest that statistical-learning mechanisms can operate at a categorical level, enabling generalization of learned regularities using existing conceptual knowledge. Such mechanisms may guide learning in domains as disparate as the acquisition of causal knowledge and the development of cognitive maps from environmental exploration.
Mechanical model for simulating the conditioning of air in the respiratory tract.
Bergonse Neto, Nelson; Von Bahten, Luiz Carlos; Moura, Luís Mauro; Coelho, Marlos de Souza; Stori Junior, Wilson de Souza; Bergonse, Gilberto da Fontoura Rey
2007-01-01
To create a mechanical model that could be regulated to simulate the conditioning of inspired and expired air with the same normal values of temperature, pressure, and relative humidity as those of the respiratory system of a healthy young man on mechanical ventilation. Using several types of materials, a mechanical device was built and regulated using normal values of vital capacity, tidal volume, maximal inspiratory pressure, positive end-expiratory pressure, and gas temperature in the system. The device was submitted to mechanical ventilation for a period of 29.8 min. The changes in the temperature of the air circulating in the system were recorded every two seconds. The statistical analysis of the data collected revealed that the device was approximately as efficient in the conditioning of air as is the respiratory system of a human being. By the study endpoint, we had developed a mechanical device capable of simulating the conditioning of air in the respiratory tract. The device mimics the conditions of temperature, pressure, and relative humidity seen in the respiratory system of healthy individuals.
Effects of different mechanized soil fertilization methods on corn nutrient accumulation and yield
NASA Astrophysics Data System (ADS)
Shi, Qingwen; Bai, Chunming; Wang, Huixin; Wu, Di; Song, Qiaobo; Dong, Zengqi; Gao, Depeng; Dong, Qiping; Cheng, Xin; Zhang, Yahao; Mu, Jiahui; Chen, Qinghong; Liao, Wenqing; Qu, Tianru; Zhang, Chunling; Zhang, Xinyu; Liu, Yifei; Han, Xiaori
2017-05-01
Aim: Experiments on mechanized corn soil fertilization were conducted in the Faku demonstration zone. On this basis, we studied the effects of different mechanized soil fertilization measures on corn nutrient accumulation and yield traits in brown soil regions. We also evaluated and optimized the regulation effects of mechanized soil fertilization for the purpose of increasing crop yield and improving production efficiency. Method: Based on a survey of the soil background values in the demonstration zone, we collected plant samples during different corn growth periods for measurement and statistical analysis. Conclusions: Decomposed cow dung, when applied by mechanical broadcasting, remarkably increased the nitrogen and potassium accumulation of corn at the ripe stage. Crushed stalk returning combined with deep tillage remarkably increased the phosphorus accumulation of corn plants. Compared with top application, crushed stalk returning combined with deep tillage remarkably increased corn thousand kernel weight (TKW). Mechanized broadcasting of granular organic fertilizer and crushed stalk returning combined with deep tillage, when compared with surface application, were able to boost corn yield in the demonstration zone.
Facile Fabrication of 100% Bio-Based and Degradable Ternary Cellulose/PHBV/PLA Composites
Wang, Jinwu
2018-01-01
Modifying bio-based degradable polymers such as polylactide (PLA) and poly(hydroxybutyrate-co-hydroxyvalerate) (PHBV) with non-degradable agents will compromise the 100% degradability of their resultant composites. This work developed a facile and solvent-free route in order to fabricate 100% bio-based and degradable ternary cellulose/PHBV/PLA composite materials. The effects of ball milling on the physicochemical properties of pulp cellulose fibers, and the ball-milled cellulose particles on the morphology and mechanical properties of PHBV/PLA blends, were investigated experimentally and statistically. The results showed that more ball-milling time resulted in a smaller particle size and lower crystallinity by way of mechanical disintegration. Filling PHBV/PLA blends with the ball-milled celluloses dramatically increased the stiffness at all of the levels of particle size and filling content, and improved their elongation at the break and fracture work at certain levels of particle size and filling content. It was also found that the high filling content of the ball-milled cellulose particles was detrimental to the mechanical properties for the resultant composite materials. The ternary cellulose/PHBV/PLA composite materials have some potential applications, such as in packaging materials and automobile inner decoration parts. Furthermore, filling content contributes more to the variations of their mechanical properties than particle size does. Statistical analysis combined with experimental tests provide a new pathway to quantitatively evaluate the effects of multiple variables on a specific property, and figure out the dominant one for the resultant composite materials. PMID:29495315
Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E
2015-03-01
Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare different imputation methods on the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical powers of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and the proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
Implicit Statistical Learning and Language Skills in Bilingual Children
ERIC Educational Resources Information Center
Yim, Dongsun; Rudoy, John
2013-01-01
Purpose: Implicit statistical learning in 2 nonlinguistic domains (visual and auditory) was used to investigate (a) whether linguistic experience influences the underlying learning mechanism and (b) whether there are modality constraints in predicting implicit statistical learning with age and language skills. Method: Implicit statistical learning…
Shaikh, Masood Ali
2017-09-01
Assessment of research articles in terms of the study designs used, the statistical tests applied and the use of statistical analysis programmes helps determine the research activity profile and trends in the country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of the study designs used, the application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. The results of this study indicate that the cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and the statistical software programme SPSS were the most common study design, inferential statistical analysis, and statistical analysis software programme, respectively. These results echo a previously published assessment of these two journals for the year 2014.
Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M
2016-01-01
Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. To investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies for all patients across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for the Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
Behavioral pattern identification for structural health monitoring in complex systems
NASA Astrophysics Data System (ADS)
Gupta, Shalabh
Estimation of structural damage and quantification of structural integrity are critical for safe and reliable operation of human-engineered complex systems, such as electromechanical, thermofluid, and petrochemical systems. Damage due to fatigue crack is one of the most commonly encountered sources of structural degradation in mechanical systems. Early detection of fatigue damage is essential because the resulting structural degradation could potentially cause catastrophic failures, leading to loss of expensive equipment and human life. Therefore, for reliable operation and enhanced availability, it is necessary to develop capabilities for prognosis and estimation of impending failures, such as the onset of wide-spread fatigue crack damage in mechanical structures. This dissertation presents information-based online sensing of fatigue damage using the analytical tools of symbolic time series analysis (STSA). Anomaly detection using STSA is a pattern recognition method that has been recently developed based upon a fixed-structure, fixed-order Markov chain. The analysis procedure is built upon the principles of Symbolic Dynamics, Information Theory and Statistical Pattern Recognition. The dissertation demonstrates real-time fatigue damage monitoring based on time series data of ultrasonic signals. Statistical pattern changes are measured using STSA to monitor the evolution of fatigue damage. Real-time anomaly detection is presented as a solution to the forward (analysis) problem and the inverse (synthesis) problem. (1) The forward problem - The primary objective of the forward problem is identification of the statistical changes in the time series data of ultrasonic signals due to gradual evolution of fatigue damage. (2) The inverse problem - The objective of the inverse problem is to infer the anomalies from the observed time series data in real time based on the statistical information generated during the forward problem. A computer-controlled special-purpose fatigue test apparatus, equipped with multiple sensing devices (e.g., ultrasonics and optical microscope) for damage analysis, has been used to experimentally validate the STSA method for early detection of anomalous behavior. The sensor information is integrated with a software module consisting of the STSA algorithm for real-time monitoring of fatigue damage. Experiments have been conducted under different loading conditions on specimens constructed from the ductile aluminium alloy 7075-T6. The dissertation has also investigated the application of the STSA method for early detection of anomalies in other engineering disciplines. Two primary applications include combustion instability in a generic thermal pulse combustor model and whirling phenomenon in a typical misaligned shaft.
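The core STSA idea described above can be sketched compactly: partition the signal range into a small symbol alphabet, estimate the symbol-occurrence probability vector, and use its distance from a nominal-condition vector as an anomaly measure. The synthetic "degrading" signal and the choice of 8 symbols in the Python sketch below are assumptions for illustration, not the dissertation's ultrasonic data or its specific Markov-chain construction.

import numpy as np

rng = np.random.default_rng(6)
n_symbols = 8

def symbol_probabilities(signal, edges):
    """Histogram-based probability vector over the symbol alphabet."""
    symbols = np.digitize(signal, edges)          # values in 0 .. n_symbols-1
    counts = np.bincount(symbols, minlength=n_symbols)
    return counts / counts.sum()

nominal = rng.normal(0.0, 1.0, 5000)                       # healthy-condition signal
edges = np.quantile(nominal, np.linspace(0, 1, n_symbols + 1)[1:-1])
p_nominal = symbol_probabilities(nominal, edges)

for damage in (0.0, 0.2, 0.5, 1.0):                        # growing "damage" level
    degraded = rng.normal(0.3 * damage, 1.0 + damage, 5000)
    p = symbol_probabilities(degraded, edges)
    anomaly = np.linalg.norm(p - p_nominal)                 # anomaly measure
    print(f"damage level {damage:3.1f}   anomaly measure = {anomaly:.3f}")

The anomaly measure grows monotonically with the injected degradation, which is the behavior exploited for real-time fatigue damage monitoring.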
NASA Technical Reports Server (NTRS)
Yeh, Leehwa
1993-01-01
The phase-space-picture approach to quantum non-equilibrium statistical mechanics via the characteristic function of infinite-mode squeezed coherent states is introduced. We use quantum Brownian motion as an example to show how this approach provides an interesting geometrical interpretation of quantum non-equilibrium phenomena.
Quantum Mechanics From the Cradle?
ERIC Educational Resources Information Center
Martin, John L.
1974-01-01
States that the major problem in learning quantum mechanics is often the student's ignorance of classical mechanics and that one conceptual hurdle in quantum mechanics is its statistical nature, in contrast to the determinism of classical mechanics. (MLH)
Rinaldi, Antonio
2011-04-01
Traditional fiber bundle models (FBMs) have been an effective tool to understand brittle heterogeneous systems. However, fiber bundles in modern nano- and bioapplications demand a new generation of FBMs capturing more complex deformation processes in addition to damage. In the context of loose bundle systems, and with reference to time-independent plasticity and soft biomaterials, we formulate a generalized statistical model for ductile fracture and nonlinear elastic problems capable of handling more simultaneous deformation mechanisms by means of two order parameters (as opposed to one). As the first rational FBM for coupled damage problems, it may be the cornerstone for advanced statistical models of heterogeneous systems in nanoscience and materials design, especially to explore hierarchical and bio-inspired concepts in the arena of nanobiotechnology. Finally, applicative examples are provided for illustrative purposes, discussing issues in inverse analysis (i.e., a nonlinear elastic polymer fiber and arrays of ductile Cu submicron bars) and direct design (i.e., strength prediction).
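For orientation, the Python sketch below implements the classical brittle, equal-load-sharing fiber bundle model that the generalized two-order-parameter model above extends. Thresholds are drawn from a uniform distribution; the bundle size and quasi-static loading protocol are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(7)
N = 50_000
thresholds = np.sort(rng.uniform(0.0, 1.0, N))    # fiber strength thresholds

# Quasi-static loading: when the k weakest fibers have failed, the surviving
# N-k fibers each carry the stress thresholds[k], so the bundle stress
# (force per original fiber) at that point is thresholds[k] * (N - k) / N.
k = np.arange(N)
bundle_stress = thresholds * (N - k) / N

print(f"peak bundle strength    = {bundle_stress.max():.3f}")
print(f"fraction failed at peak = {k[np.argmax(bundle_stress)] / N:.3f}")
print("(theory for U(0,1) thresholds: strength 0.25 with 50% of fibers failed)")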
Gautestad, Arild O
2013-03-01
The flow of GPS data on animal space use is challenging old paradigms, such as the issue of the scale-free Lévy walk versus scale-specific Brownian motion. Since these movement classes often require different protocols with respect to ecological analyses, further theoretical development in this field is important. I describe central concepts such as scale-specific versus scale-free movement and the difference between mechanistic and statistical-mechanical levels of analysis. Next, I report how a specific sampling scheme may have produced much confusion: a Lévy walk may be wrongly categorized as Brownian motion if the duration of a move, or bout, is used as a proxy for step length and a move is subjectively defined. Hence, the categorization and recategorization of movement class compliance surrounding the Lévy walk controversy may have been based on a statistical artifact. This issue may be avoided by collecting relocations at a fixed rate at a temporal scale that minimizes over- and undersampling.
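A small Python sketch of the contrast discussed above: step lengths of a Brownian-motion-like walker (thin-tailed) versus a Lévy-walk-like walker (power-law tailed), summarized by the fraction of very long steps. The distribution choices and the exponent are illustrative assumptions, not an analysis of GPS relocation data.

import numpy as np

rng = np.random.default_rng(8)
n = 100_000

brownian_steps = np.abs(rng.normal(0.0, 1.0, n))                  # thin-tailed step lengths
mu = 2.0                                                          # assumed Levy exponent
levy_steps = (1.0 - rng.uniform(size=n)) ** (-1.0 / (mu - 1.0))   # Pareto tail, x_min = 1

for name, steps in (("Brownian-like", brownian_steps), ("Levy-like", levy_steps)):
    long_frac = np.mean(steps > 10 * np.median(steps))
    print(f"{name:14s}  median = {np.median(steps):5.2f}   "
          f"fraction of steps > 10x median = {long_frac:.4f}")

The rare, very long steps of the heavy-tailed walker are exactly what fixed-duration "move" definitions can hide, which is the sampling artifact the abstract warns about.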
Tepedino, Michele; Masedu, Francesco; Chimenti, Claudio
2017-05-30
The aim of the present study was to evaluate the relationship between insertion torque and the stability of miniscrews in terms of resistance against dislocation, comparing a self-tapping screw with a self-drilling one. Insertion torque was measured during the placement of 30 self-drilling and 31 self-tapping stainless steel miniscrews (Leone SpA, Sesto Fiorentino, Italy) in synthetic bone blocks. Then, an increasing pulling force was applied at an angle of 90° and 45°, and the displacement of the miniscrews was recorded. The statistical analysis showed a statistically significant difference between the mean Maximum Insertion Torque (MIT) observed in the two groups, and showed that force angulation and MIT have a statistically significant effect on miniscrew stability. For both miniscrew types, an angle of 90° between the miniscrew and the loading force is preferable in terms of stability. The tested self-drilling orthodontic miniscrews showed a higher MIT and greater resistance against dislocation than the self-tapping ones.
NASA Technical Reports Server (NTRS)
Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.
1984-01-01
A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures the degradation in the test specimen's load carrying ability due to the environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies the depth of attack in the stress corroded specimen by an effective flaw size calculated from the breaking stress and the material strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
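A Python sketch of the extreme-value treatment of breaking-load data mentioned above: fit a two-parameter Weibull distribution to replicate breaking strengths by a linearized (median-rank) regression and report survival probabilities. The synthetic strengths and the median-rank plotting position are assumptions of this sketch, not the report's alloy data or its exact statistical procedure.

import numpy as np

rng = np.random.default_rng(9)
strengths = np.sort(rng.weibull(8.0, 30) * 400.0)      # synthetic breaking loads (MPa)

n = strengths.size
ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)        # Benard median ranks
x = np.log(strengths)
y = np.log(-np.log(1.0 - ranks))                       # Weibull linearization
m, c = np.polyfit(x, y, 1)                             # slope = modulus, intercept
scale = np.exp(-c / m)                                 # characteristic strength

print(f"Weibull modulus m       = {m:.2f}")
print(f"characteristic strength = {scale:.1f} MPa")
for s in (250.0, 300.0, 350.0):
    survival = np.exp(-(s / scale) ** m)
    print(f"P(survive stress {s:5.1f} MPa) = {survival:.3f}")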
Statistical summaries of fatigue data for design purposes
NASA Technical Reports Server (NTRS)
Wirsching, P. H.
1983-01-01
Two methods are discussed for constructing a design curve on the safe side of fatigue data. Both the tolerance interval and equivalent prediction interval (EPI) concepts provide such a curve while accounting for both the distribution of the estimators in small samples and the data scatter. The EPI is also useful as a mechanism for providing the necessary statistics on S-N data for a full reliability analysis which includes uncertainty in all fatigue design factors. Examples of statistical analyses of the general strain-life relationship are presented. The tolerance limit and EPI techniques for defining a design curve are demonstrated. Examples using WASPALOY B and RQC-100 data demonstrate that a reliability model could be constructed by considering the fatigue strength and fatigue ductility coefficients as two independent random variables. A technique is given for establishing the fatigue strength at high cycle lives that relies on extrapolation and also accounts for "runners." A reliability model or design value can be specified.
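The tolerance-interval idea above can be sketched as follows in Python: a one-sided lower tolerance bound intended to cover 99% of the fatigue-strength population with 95% confidence, using a common normal-theory approximation for the k factor (an assumption of this sketch; exact factors use the noncentral t distribution). The synthetic log-life data are also an assumption.

import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(10)
log_life = rng.normal(5.0, 0.15, 15)           # synthetic log10(N) fatigue data

n = log_life.size
xbar, s = log_life.mean(), log_life.std(ddof=1)

p, conf = 0.99, 0.95
zp = NormalDist().inv_cdf(p)
zc = NormalDist().inv_cdf(conf)
a = 1.0 - zc**2 / (2.0 * (n - 1))
b = zp**2 - zc**2 / n
k = (zp + np.sqrt(zp**2 - a * b)) / a          # approximate one-sided tolerance factor

lower_bound = xbar - k * s                     # design-curve value in log10(N)
print(f"n = {n}, mean log-life = {xbar:.3f}, s = {s:.3f}")
print(f"approximate k factor = {k:.3f}")
print(f"99%/95% lower tolerance bound on log10(life) = {lower_bound:.3f}")

The k factor shrinks as the sample size grows, which is how the tolerance-limit design curve accounts for the small-sample uncertainty in the estimators as well as the data scatter.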
Estimating short-run and long-run interaction mechanisms in interictal state.
Ozkaya, Ata; Korürek, Mehmet
2010-04-01
We address the issue of analyzing electroencephalogram (EEG) recordings from seizure patients in order to test, model and determine the statistical properties that distinguish between EEG states (interictal, pre-ictal, ictal) by introducing a new class of time series analysis methods. In the present study we firstly employ statistical methods to determine the non-stationary behavior of focal interictal epileptiform series within very short time intervals; secondly, for intervals that are deemed non-stationary, we suggest the concept of Autoregressive Integrated Moving Average (ARIMA) process modelling, well known in time series analysis. We finally address the question of causal relationships between epileptic states and between brain areas during epileptiform activity. We estimate the interaction between different EEG series (channels) in short time intervals by performing Granger-causality analysis, and estimate such interaction in long time intervals by employing cointegration analysis; both analysis methods are well known in econometrics. Here we find, first, that the causal relationship between neuronal assemblies can be identified according to the duration and the direction of their possible mutual influences; second, that although the bidirectional causality estimated in short time intervals indicates that the neuronal ensembles positively affect each other, in long time intervals neither of them is affected (in terms of increasing amplitudes) by this relationship. Moreover, cointegration analysis of the EEG series enables us to identify whether there is a causal link from the interictal state to the ictal state.
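A compact Python sketch of a lag-1 Granger-causality check of the kind described above, written with plain least squares: does adding the past of series y improve the prediction of series x? The coupled AR(1) toy data are an assumption of this sketch, not EEG recordings, and real analyses would select the lag order and check stationarity first.

import numpy as np

rng = np.random.default_rng(11)
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):                      # y drives x, x does not drive y
    y[t] = 0.6 * y[t - 1] + rng.normal()
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.normal()

def rss(design, target):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    resid = target - design @ beta
    return float(resid @ resid)

target = x[1:]
ones = np.ones(n - 1)
restricted = np.column_stack([ones, x[:-1]])            # x's own past only
unrestricted = np.column_stack([ones, x[:-1], y[:-1]])  # plus y's past

rss_r, rss_u = rss(restricted, target), rss(unrestricted, target)
df_num, df_den = 1, (n - 1) - unrestricted.shape[1]
f_stat = ((rss_r - rss_u) / df_num) / (rss_u / df_den)
print(f"F statistic for 'y Granger-causes x' ({df_num}, {df_den}) = {f_stat:.1f}")
print("values far above ~4 reject the no-causality null at the 5% level")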
The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model
NASA Astrophysics Data System (ADS)
Verkley, Wim; Severijns, Camiel
2014-05-01
Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. Severijns, 2014: The maximum entropy principle applied to a dynamical system proposed by Lorenz, Eur. Phys. J. B, 87:7, http://dx.doi.org/10.1140/epjb/e2013-40681-2 (open access).
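A short Python sketch of the dynamical system discussed above (the Lorenz-96 model): integrate it with a fourth-order Runge-Kutta scheme and record the time-averaged total energy, which is the constraint used in the maximum-entropy analysis. The forcing F = 8 and the integration settings are standard illustrative choices, not necessarily the paper's exact configuration.

import numpy as np

K, F, dt, n_steps = 36, 8.0, 0.01, 20_000

def lorenz96(x):
    """Tendency of the Lorenz-96 model with periodic indexing."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

rng = np.random.default_rng(12)
x = F + 0.1 * rng.normal(size=K)            # perturbed rest state
energies = []
for step in range(n_steps):
    k1 = lorenz96(x)
    k2 = lorenz96(x + 0.5 * dt * k1)
    k3 = lorenz96(x + 0.5 * dt * k2)
    k4 = lorenz96(x + dt * k3)
    x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    if step > n_steps // 4:                 # discard spin-up before averaging
        energies.append(0.5 * np.sum(x * x))

print(f"time-averaged total energy E = {np.mean(energies):.1f}")

This average energy is the kind of single scalar constraint from which the maximum-entropy construction described above attempts to reproduce the full statistics of the variables.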
Kim, Yong Wook; Kim, Hyoung Seop; An, Young-Sil; Im, Sang Hee
2010-10-01
Permanent vegetative state is defined as an impaired level of consciousness lasting longer than 12 months after traumatic causes and 3 months after non-traumatic causes of brain injury. Although many studies have assessed cerebral metabolism in patients with acute and persistent vegetative state after brain injury, few have investigated cerebral metabolism in patients with permanent vegetative state. In this study, we performed a voxel-based analysis of cerebral glucose metabolism and investigated the relationship between regional cerebral glucose metabolism and the severity of impaired consciousness in patients with permanent vegetative state after acquired brain injury. We compared the regional cerebral glucose metabolism demonstrated by F-18 fluorodeoxyglucose positron emission tomography in 12 patients with permanent vegetative state after acquired brain injury with that in 12 control subjects. Additionally, covariance analysis was performed to identify regions where decreases in regional cerebral glucose metabolism significantly correlated with a decreased level of consciousness as measured by the JFK coma recovery scale. Statistical analysis was performed using statistical parametric mapping. Compared with controls, patients with permanent vegetative state demonstrated decreased cerebral glucose metabolism in the left precuneus, both posterior cingulate cortices, and the left superior parietal lobule (P(corrected) < 0.001), and increased cerebral glucose metabolism in both cerebellar hemispheres and the right supramarginal cortex (P(corrected) < 0.001). In the covariance analysis, a decrease in the level of consciousness was significantly correlated with decreased cerebral glucose metabolism in both posterior cingulate cortices (P(uncorrected) < 0.005). Our findings suggest that the posteromedial parietal cortex, which is part of the neural network for consciousness, may be a relevant structure in the pathophysiological mechanism in patients with permanent vegetative state after acquired brain injury.
NASA Astrophysics Data System (ADS)
Adeoye-Akinde, K.; Gudmundsson, A.
2017-12-01
Heterogeneity and anisotropy, especially in layered strata within the same reservoir, make the geometry and permeability of an in-situ fracture network challenging to forecast. This study examines outcrops analogous to reservoir rocks for a better understanding of in-situ fracture networks and permeability, especially fracture formation, propagation, and arrest/deflection. Here, fracture geometry (e.g. length and aperture) from interbedded limestone and shale is combined with statistical and numerical modelling (using the Finite Element Method) to better forecast fracture network properties and permeability. The main aim is to bridge the gap between fracture data obtained at the core level (cm-scale) and at the seismic level (km-scale). The geometric properties of over 250 fractures from the Blue Lias at Nash Point, UK, have been analysed. As fractures propagate, energy is required to keep them growing, and according to the laws of thermodynamics this energy can be linked to entropy. As fractures grow, entropy increases; accordingly, the results show a strong linear correlation between entropy and the scaling exponents of the fracture length and aperture-size distributions. Numerical modelling is used to simulate the stress and fracture behaviour in mechanically dissimilar rocks. Results show that the orientation of the maximum principal compressive stress in the host rock changes as the fracture tip, with its induced stress field, approaches a more compliant (shale) layer. This behaviour can be related to the three mechanisms of fracture arrest/deflection at an interface, namely elastic mismatch, stress barrier and Cook-Gordon debonding. Tensile stress concentrates at the contact between the stratigraphic layers, ahead of and around the propagating fracture. However, as the shale stiffens with time, the stresses concentrated at the contact start to dissipate into it. This can happen in nature through diagenesis and with greater depth of burial. This study also investigates how induced fractures propagate and interact with existing discontinuities in layered rocks using analogue modelling. Further work will introduce the Maximum Entropy Method for more accurate statistical modelling; this method is particularly useful for forecasting likely fracture-size probability distributions from incomplete subsurface information.
Impact of Injury Mechanisms on Patterns and Management of Facial Fractures.
Greathouse, S Travis; Adkinson, Joshua M; Garza, Ramon; Gilstrap, Jarom; Miller, Nathan F; Eid, Sherrine M; Murphy, Robert X
2015-07-01
Mechanisms causing facial fractures have evolved over time and may be predictive of the types of injuries sustained. The objective of this study is to examine the impact of mechanisms of injury on the type and management of facial fractures at our Level 1 Trauma Center. The authors performed an Institutional Review Board-approved review of our network's trauma registry from 2006 to 2010, documenting age, sex, mechanism, Injury Severity Score, Glasgow Coma Scale, facial fracture patterns (nasal, maxillary/malar, orbital, mandible), and reconstructions. Mechanism rates were compared using a Pearson χ2 test. The database identified 23,318 patients, including 1686 patients with facial fractures and a subset of 1505 patients sustaining 2094 fractures by motor vehicle collision (MVC), fall, or assault. Nasal fractures were the most common injuries sustained by all mechanisms. MVCs were most likely to cause nasal and malar/maxillary fractures (P < 0.01). Falls were the least likely and assaults the most likely to cause mandible fractures (P < 0.001), the most common injury leading to surgical intervention (P < 0.001). Although not statistically significant, fractures sustained in MVCs were the most likely overall to undergo surgical intervention. Age, number of fractures, and alcohol level were statistically significant variables associated with operative management. Age and number of fractures sustained were associated with operative intervention. Although there is a statistically significant correlation between mechanism of injury and type of facial fracture sustained, none of the mechanisms evaluated herein are statistically associated with surgical intervention. Clinical Question/Level of Evidence: Therapeutic, III.
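The mechanism-by-fracture comparison described here rests on a Pearson chi-square test of a contingency table; the sketch below shows that calculation on hypothetical counts (the row/column layout and all numbers are invented, not the registry data).

```python
# A minimal sketch of a Pearson chi-square comparison of mechanism rates; counts are hypothetical.
import numpy as np
from scipy.stats import chi2_contingency

# rows: mechanism (MVC, fall, assault); columns: fracture type (nasal, maxillary/malar, orbital, mandible)
counts = np.array([
    [320, 180, 110,  60],
    [250,  90,  80,  40],
    [300, 120,  90, 150],
])
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```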
An experimental study of the temporal statistics of radio signals scattered by rain
NASA Technical Reports Server (NTRS)
Hubbard, R. W.; Hull, J. A.; Rice, P. L.; Wells, P. I.
1973-01-01
A fixed-beam bistatic CW experiment designed to measure the temporal statistics of the volume reflectivity produced by hydrometeors at several selected altitudes, scattering angles, and at two frequencies (3.6 and 7.8 GHz) is described. Surface rain gauge data, local meteorological data, surveillance S-band radar, and great-circle path propagation measurements were also made to describe the general weather and propagation conditions and to distinguish precipitation scatter signals from those caused by ducting and other nonhydrometeor scatter mechanisms. The data analysis procedures were designed to provide an assessment of a one-year sample of data with a time resolution of one minute. The cumulative distributions of the bistatic signals for all of the rainy minutes during this period are presented for the several path geometries.
Colloquium: Statistical mechanics of money, wealth, and income
NASA Astrophysics Data System (ADS)
Yakovenko, Victor M.; Rosser, J. Barkley, Jr.
2009-10-01
This Colloquium reviews statistical models for money, wealth, and income distributions developed in the econophysics literature since the late 1990s. By analogy with the Boltzmann-Gibbs distribution of energy in physics, it is shown that the probability distribution of money is exponential for certain classes of models with interacting economic agents. Alternative scenarios are also reviewed. Data analysis of the empirical distributions of wealth and income reveals a two-class distribution. The majority of the population belongs to the lower class, characterized by the exponential (“thermal”) distribution, whereas a small fraction of the population in the upper class is characterized by the power-law (“superthermal”) distribution. The lower part is very stable, stationary in time, whereas the upper part is highly dynamical and out of equilibrium.
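A minimal sketch of one of the interacting-agent models reviewed in this class of work: conservative random pairwise exchanges of money, which relax toward an approximately exponential (Boltzmann-Gibbs) distribution. The exchange rule, agent count and iteration count are illustrative assumptions, not a specific model taken from the Colloquium.

```python
# Random pairwise re-splitting of a conserved quantity ("money") relaxes to an exponential law.
import numpy as np

rng = np.random.default_rng(0)
n_agents, n_exchanges = 5_000, 500_000
money = np.full(n_agents, 100.0)                 # everyone starts with the same amount

for _ in range(n_exchanges):
    i, j = rng.integers(n_agents, size=2)
    if i == j:
        continue
    total = money[i] + money[j]
    eps = rng.uniform()                          # randomly re-split the pair's total money
    money[i], money[j] = eps * total, (1.0 - eps) * total

# For an exponential distribution, the fraction of agents below the mean is 1 - 1/e ~ 0.632
print("mean money ('temperature'):", round(money.mean(), 2))
print("fraction below the mean   :", round(np.mean(money < money.mean()), 3))
```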
An investigation into the causes of stratospheric ozone loss in the southern Australasian region
NASA Astrophysics Data System (ADS)
Lehmann, P.; Karoly, D. J.; Newmann, P. A.; Clarkson, T. S.; Matthews, W. A.
1992-07-01
Measurements of total ozone at Macquarie Island (55 deg S, 159 deg E) reveal statistically significant reductions of approximately twelve percent during July to September when comparing the mean levels for 1987-90 with those in the seventies. In order to investigate the possibility that these ozone changes may not be a result of dynamic variability of the stratosphere, a simple linear model of ozone was created from statistical analysis of tropopause height and isentropic transient eddy heat flux, which were assumed representative of the dominant dynamic influences. Comparison of measured and modeled ozone indicates that the recent downward trend in ozone at Macquarie Island is not related to stratospheric dynamic variability and therefore suggests another mechanism, possibly changes in photochemical destruction of ozone.
[Study on correlation between ITS sequence of Arctium lappa and quality of Fructus Arctii].
Xu, Liang; Dou, Deqiang; Wang, Bing; Yang, Yanyun; Kang, Tingguo
2011-07-01
To study the correlation between the ITS sequence of Arctium lappa and the quality of Fructus Arctii of different origins. Samples of Fructus Arctii were collected from 26 different producing areas. Their ITS sequences were determined after polymerase chain reaction (PCR), and quality was evaluated by determining arctiin content with HPLC. Genetic diversity, genotype and correlation were analyzed with the ClustalX (1.81), MEGA 4.0 and SPSS 13.0 statistical software. ITS sequences of A. lappa were obtained from the 26 samples and registered in GenBank. The corresponding arctiin content of Fructus Arctii and the 1000-grain weight were determined. Statistical analysis showed that A. lappa genotype correlated with Fructus Arctii quality. The research provides a foundation for revealing the molecular mechanism of Fructus Arctii geoherbs.
Patel, Vikram; Burns, Jonathan K; Dhingra, Monisha; Tarver, Leslie; Kohrt, Brandon A; Lund, Crick
2018-02-01
Most countries have witnessed a dramatic increase of income inequality in the past three decades. This paper addresses the question of whether income inequality is associated with the population prevalence of depression and, if so, the potential mechanisms and pathways which may explain this association. Our systematic review included 26 studies, mostly from high-income countries. Nearly two-thirds of all studies and five out of six longitudinal studies reported a statistically significant positive relationship between income inequality and risk of depression; only one study reported a statistically significant negative relationship. Twelve studies were included in a meta-analysis with dichotomized inequality groupings. The pooled risk ratio was 1.19 (95% CI: 1.07-1.31), demonstrating greater risk of depression in populations with higher income inequality relative to populations with lower inequality. Multiple studies reported subgroup effects, including greater impacts of income inequality among women and low-income populations. We propose an ecological framework, with mechanisms operating at the national level (the neo-material hypothesis), neighbourhood level (the social capital and the social comparison hypotheses) and individual level (psychological stress and social defeat hypotheses) to explain this association. We conclude that policy makers should actively promote actions to reduce income inequality, such as progressive taxation policies and a basic universal income. Mental health professionals should champion such policies, as well as promote the delivery of interventions which target the pathways and proximal determinants, such as building life skills in adolescents and provision of psychological therapies and packages of care with demonstrated effectiveness for settings of poverty and high income inequality. © 2018 World Psychiatric Association.
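The pooled risk ratio quoted above is the kind of quantity produced by inverse-variance meta-analysis of log risk ratios; the sketch below shows a fixed-effect version of that computation on hypothetical study estimates (the individual RRs and CIs are invented, and the published analysis used dichotomized inequality groupings and possibly different weighting).

```python
# A minimal sketch of fixed-effect inverse-variance pooling of log risk ratios (hypothetical data).
import numpy as np

rr      = np.array([1.10, 1.25, 1.05, 1.40, 1.15])   # per-study risk ratios (hypothetical)
ci_low  = np.array([0.95, 1.05, 0.90, 1.10, 0.98])
ci_high = np.array([1.27, 1.49, 1.23, 1.78, 1.35])

log_rr = np.log(rr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # back out standard errors from the CIs
w = 1.0 / se**2                                        # inverse-variance weights

pooled = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```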
Dynamics of Sleep Stage Transitions in Health and Disease
NASA Astrophysics Data System (ADS)
Kishi, Akifumi; Struzik, Zbigniew R.; Natelson, Benjamin H.; Togo, Fumiharu; Yamamoto, Yoshiharu
2007-07-01
Sleep dynamics emerges from complex interactions between neuronal populations in many brain regions. Annotated sleep stages from electroencephalography (EEG) recordings could potentially provide a non-invasive way to obtain valuable insights into the mechanisms of these interactions, and ultimately into the very nature of sleep regulation. However, to date, sleep stage analysis has been restricted, only very recently expanding the scope of the traditional descriptive statistics to more dynamical concepts of the duration of and transitions between vigilance states and temporal evaluation of transition probabilities among different stages. Physiological and/or pathological implications of the dynamics of sleep stage transitions have, to date, not been investigated. Here, we study detailed duration and transition statistics among sleep stages in healthy humans and patients with chronic fatigue syndrome, known to be associated with disturbed sleep. We find that the durations of waking and non-REM sleep, in particular deep sleep (Stages III and IV), during the nighttime, follow a power-law probability distribution function, while REM sleep durations follow an exponential function, suggestive of complex underlying mechanisms governing the onset of light sleep. We also find a substantial number of REM to non-REM transitions in humans, while this transition is reported to be virtually non-existent in rats. Interestingly, the probability of this REM to non-REM transition is significantly lower in the patients than in controls, resulting in a significantly greater REM to awake, together with Stage I to awake, transition probability. This might potentially account for the reported poor sleep quality in the patients because the normal continuation of sleep after either the lightest or REM sleep is disrupted. We conclude that the dynamical transition analysis of sleep stages is useful for elucidating yet-to-be-determined human sleep regulation mechanisms with a pathophysiological implication.
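The contrast drawn here between power-law and exponential duration distributions can be made concrete with maximum-likelihood fits; the sketch below compares the two fits on synthetic duration data (the data, the xmin choice and the likelihood comparison are illustrative assumptions, not the polysomnography analysis of the study).

```python
# A minimal sketch: exponential vs. continuous power-law MLE fits to synthetic stage durations.
import numpy as np

rng = np.random.default_rng(2)
durations = rng.pareto(1.5, size=500) + 1.0      # synthetic heavy-tailed durations (minutes)
xmin = durations.min()

# exponential MLE above xmin: rate = 1 / mean(x - xmin)
lam = 1.0 / np.mean(durations - xmin)
ll_exp = np.sum(np.log(lam) - lam * (durations - xmin))

# continuous power-law MLE (x >= xmin): alpha = 1 + n / sum(ln(x / xmin))
n = durations.size
alpha = 1.0 + n / np.sum(np.log(durations / xmin))
ll_pow = np.sum(np.log((alpha - 1) / xmin) - alpha * np.log(durations / xmin))

print(f"exponential log-likelihood: {ll_exp:.1f}")
print(f"power-law   log-likelihood: {ll_pow:.1f} (alpha = {alpha:.2f})")
```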
Patel, Vikram; Burns, Jonathan K.; Dhingra, Monisha; Tarver, Leslie; Kohrt, Brandon A.; Lund, Crick
2018-01-01
Most countries have witnessed a dramatic increase of income inequality in the past three decades. This paper addresses the question of whether income inequality is associated with the population prevalence of depression and, if so, the potential mechanisms and pathways which may explain this association. Our systematic review included 26 studies, mostly from high‐income countries. Nearly two‐thirds of all studies and five out of six longitudinal studies reported a statistically significant positive relationship between income inequality and risk of depression; only one study reported a statistically significant negative relationship. Twelve studies were included in a meta‐analysis with dichotomized inequality groupings. The pooled risk ratio was 1.19 (95% CI: 1.07‐1.31), demonstrating greater risk of depression in populations with higher income inequality relative to populations with lower inequality. Multiple studies reported subgroup effects, including greater impacts of income inequality among women and low‐income populations. We propose an ecological framework, with mechanisms operating at the national level (the neo‐material hypothesis), neighbourhood level (the social capital and the social comparison hypotheses) and individual level (psychological stress and social defeat hypotheses) to explain this association. We conclude that policy makers should actively promote actions to reduce income inequality, such as progressive taxation policies and a basic universal income. Mental health professionals should champion such policies, as well as promote the delivery of interventions which target the pathways and proximal determinants, such as building life skills in adolescents and provision of psychological therapies and packages of care with demonstrated effectiveness for settings of poverty and high income inequality. PMID:29352539
Carter, Rebecca R; DiFeo, Analisa; Bogie, Kath; Zhang, Guo-Qiang; Sun, Jiayang
2014-01-01
Ovarian cancer is the most lethal gynecologic disease in the United States, with more women dying from this cancer than from all other gynecological cancers combined. Ovarian cancer has been termed the "silent killer" because some patients do not show clear symptoms at an early stage. Currently, there is a lack of approved and effective early diagnostic tools for ovarian cancer. There is also an apparent severe knowledge gap about ovarian cancer in general, and about its indicative symptoms, among both the public and many health professionals. These factors have contributed significantly to the late-stage diagnosis of most ovarian cancer patients (63% are diagnosed at Stage III or above), where the 5-year survival rate is less than 30%. How widespread this lack of knowledge is among the US public, however, has not been quantified. The present investigation examined current public awareness and knowledge about ovarian cancer. The study implemented design strategies to develop an unbiased survey with quality control measures, including the modern application of multiple statistical analyses. The survey assessed a reasonable proxy of the US population by crowdsourcing participants through the online task marketplace Amazon Mechanical Turk, at a fraction of the cost and time of traditional recruitment methods. Knowledge of ovarian cancer was compared to that of breast cancer using repeated measures, bias control, and other quality control measures in the survey design. Analyses included multinomial logistic regression and categorical data analysis procedures such as correspondence analysis, among other statistics. We confirmed the relatively poor public knowledge of ovarian cancer among the US population. The simple yet novel design should serve as an example for designing surveys that obtain quality data via Amazon Mechanical Turk, together with the associated analyses.
Choe, Joshua A; Jana, Soumen; Tefft, Brandon J; Hennessy, Ryan S; Go, Jason; Morse, David; Lerman, Amir; Young, Melissa D
2018-05-10
Fixed pericardial tissue is commonly used for commercially available xenograft valve implants and has proven durability, but it lacks the capability to remodel and grow. Decellularized porcine pericardial tissue has the promise to outperform fixed tissue and remodel, but the decellularization process has been shown to damage the collagen structure and reduce the mechanical integrity of the tissue. Therefore, a comparison of uniaxial tensile properties was performed on decellularized, decellularized-sterilized, fixed, and native porcine pericardial tissue versus native valve leaflet cusps. Non-parametric analysis showed statistically significant differences (p < 0.05) in stiffness between decellularized pericardium and native pericardium, native cusps, and fixed tissue, respectively; decellularized tissue, however, showed large increases in elastic properties. Porosity testing showed no statistical difference between decellularized or decell-sterilized tissue and native cusps (p > 0.05). SEM confirmed that valvular endothelial and interstitial cells colonized the decellularized pericardial surface when seeded and grown for 30 days in static culture. Collagen assays and TEM analysis showed limited reductions in collagen with processing; GAG assays, however, showed great reductions in the processed pericardium relative to native cusps. Decellularized pericardium had comparatively lower mechanical properties among the groups studied, yet its stiffness was similar to that of the native cusps, and it demonstrated a lack of cytotoxicity. Suture retention, accelerated wear, and hydrodynamic testing of prototype decellularized and decell-sterilized valves showed positive functionality. Sterilized tissue could mimic the valvular mechanical environment in vitro, making it a viable candidate for off-the-shelf tissue-engineered valvular applications. Key terms: Decellularization, Sterilization, Pericardial Tissue, Heart Valves, Tissue Engineering, Biomechanics. This article is protected by copyright. All rights reserved.
Analysis of Noise Mechanisms in Cell-Size Control.
Modi, Saurabh; Vargas-Garcia, Cesar Augusto; Ghusinga, Khem Raj; Singh, Abhyudai
2017-06-06
At the single-cell level, noise arises from multiple sources, such as inherent stochasticity of biomolecular processes, random partitioning of resources at division, and fluctuations in cellular growth rates. How these diverse noise mechanisms combine to drive variations in cell size within an isoclonal population is not well understood. Here, we investigate the contributions of different noise sources in well-known paradigms of cell-size control, such as adder (division occurs after adding a fixed size from birth), sizer (division occurs after reaching a size threshold), and timer (division occurs after a fixed time from birth). Analysis reveals that variation in cell size is most sensitive to errors in partitioning of volume among daughter cells, and not surprisingly, this process is well regulated among microbes. Moreover, depending on the dominant noise mechanism, different size-control strategies (or a combination of them) provide efficient buffering of size variations. We further explore mixer models of size control, where a timer phase precedes/follows an adder, as has been proposed in Caulobacter crescentus. Although mixing a timer and an adder can sometimes attenuate size variations, it invariably leads to higher-order moments growing unboundedly over time. This results in a power-law distribution for the cell size, with an exponent that depends inversely on the noise in the timer phase. Consistent with theory, we find evidence of power-law statistics in the tail of C. crescentus cell-size distribution, although there is a discrepancy between the observed power-law exponent and that predicted from the noise parameters. The discrepancy, however, is removed after data reveal that the size added by individual newborns in the adder phase itself exhibits power-law statistics. Taken together, this study provides key insights into the role of noise mechanisms in size homeostasis, and suggests an inextricable link between timer-based models of size control and heavy-tailed cell-size distributions. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
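A minimal sketch of the adder paradigm with noisy added size and noisy partitioning, illustrating how partitioning errors feed into birth-size variation; the noise magnitudes and the single-lineage bookkeeping are assumptions made for illustration, not parameters fitted to C. crescentus data.

```python
# Simulate an "adder" lineage with noisy added size and noisy 50/50 partitioning at division.
import numpy as np

rng = np.random.default_rng(3)
n_generations = 20_000
mean_added, cv_added, cv_partition = 1.0, 0.2, 0.05   # illustrative noise levels

size = 1.0
birth_sizes = []
for _ in range(n_generations):
    added = mean_added * (1.0 + cv_added * rng.normal())           # noisy added size (adder rule)
    division_size = size + max(added, 0.0)
    frac = np.clip(0.5 + cv_partition * rng.normal(), 0.05, 0.95)  # noisy partitioning fraction
    size = frac * division_size                                    # follow one daughter lineage
    birth_sizes.append(size)

birth_sizes = np.array(birth_sizes[1000:])                         # drop the transient
print("CV of birth size:", round(birth_sizes.std() / birth_sizes.mean(), 3))
```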
Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.
Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory
2017-01-01
Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes their use to examine the performance of given testing procedures or associations between investigated factors to be difficult. We turn our focus on the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.
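The EPV/ROC link described here can be illustrated in a toy normal-shift setting: for a one-sided test, the expected p-value equals Pr(T0 >= T1) = 1 - AUC, where T0 and T1 are the test statistic under the null and the alternative. The sketch below checks this by Monte Carlo; the distributions and effect size are assumptions, not the paper's construction.

```python
# Monte Carlo check that EPV = Pr(T0 >= T1) = 1 - AUC for a one-sided normal-shift test.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n_mc, effect = 200_000, 1.0

t0 = rng.normal(0.0, 1.0, n_mc)                  # statistic under H0
t1 = rng.normal(effect, 1.0, n_mc)               # statistic under H1

epv_direct = np.mean(norm.sf(t1))                # average one-sided p-value under H1
one_minus_auc = np.mean(t1 <= t0)                # Pr(T0 >= T1), i.e. 1 - AUC
print("EPV (mean p under H1)            :", round(epv_direct, 4))
print("1 - AUC                          :", round(one_minus_auc, 4))
print("closed form Phi(-effect/sqrt(2)) :", round(norm.cdf(-effect / np.sqrt(2)), 4))
```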
NASA Astrophysics Data System (ADS)
Athiyamaan, V.; Mohan Ganesh, G.
2017-11-01
Self-Compacting Concrete is a special concrete that has the ability to flow and consolidate under its own weight and completely fill the formwork even in the presence of dense reinforcement, whilst maintaining its homogeneity throughout the formwork without any requirement for vibration. Researchers all over the world are developing high-performance concrete by adding various fibers and admixtures in different proportions. Different kinds of fibers, such as glass, steel, carbon, polypropylene and aramid fibers, improve concrete properties like tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability [6]. This work comprises a fundamental study on fiber-reinforced self-compacting concrete with admixtures: its rheological and mechanical properties, together with an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organized into chapters covering: introduction; material properties; review of self-compacting concrete; overview of fiber-reinforced self-compacting concrete containing admixtures; review of design and analysis of experiments as a statistical approach; summary of existing works on FRSCC and statistical modeling; literature review; and conclusion. A knowledge of recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber-reinforced concrete and SCC is essential for effective research on fiber-reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain a complete knowledge of polymer-based self-compacting fiber-reinforced concrete.
Ayers, John W; Hofstetter, C Richard; Hughes, Suzanne C; Irvin, Veronica L; Sim, D Eastern Kang; Hovell, Melbourne F
2009-11-01
This research identifies social reinforcers within religious institutions associated with alcohol consumption among Korean women in California. Data were drawn from telephone interviews with female adults (N = 591) selected from a random sampling of persons in California with Korean surnames during 2007. Approximately 70% of attempted interviews were completed, with 92% conducted in Korean. Respondents were asked about any lifetime drinking (yes/no), drinking rate (typical number of drinks consumed on drinking days among current drinkers), and messages discouraging "excessive drinking" from religious leaders or congregants. Bivariable and multivariable regressions were used for analysis. Approximately 70.4% of women reported any lifetime drinking, and drinkers drank a mean (SD) of 1.10 (1.22) drinks on drinking days. About 30.8% reported any exposure to religious leaders' messages discouraging excessive drinking, and 28.2% reported any exposure to similar messages from congregants. Each congregant's message was statistically significantly associated with a 5.1% lower probability (odds ratio = 0.775, 95% confidence interval [CI]: 0.626, 0.959) of any lifetime drinking. Also, each congregant's message was associated with a 13.8% (B = -0.138; 95% CI: -0.306, 0.029) lower drinking rate, which was statistically significant after adjusting for covariates using a one-tailed test. Exposure to leaders' messages was not statistically significantly associated with any lifetime drinking or drinking rate. Social reinforcement in the form of religious messages may be one mechanism by which religious institutions influence drinking behaviors. For Korean women, messages from congregants had a unique impact beyond the traditional religiosity indicators. These social mechanisms provide public health interventionists with religious pathways to improve drinking behaviors.
Reading biological processes from nucleotide sequences
NASA Astrophysics Data System (ADS)
Murugan, Anand
Cellular processes have traditionally been investigated by techniques of imaging and biochemical analysis of the molecules involved. The recent rapid progress in our ability to manipulate and read nucleic acid sequences gives us direct access to the genetic information that directs and constrains biological processes. While sequence data is being used widely to investigate genotype-phenotype relationships and population structure, here we use sequencing to understand biophysical mechanisms. We present work on two different systems. First, in chapter 2, we characterize the stochastic genetic editing mechanism that produces diverse T-cell receptors in the human immune system. We do this by inferring statistical distributions of the underlying biochemical events that generate T-cell receptor coding sequences from the statistics of the observed sequences. This inferred model quantitatively describes the potential repertoire of T-cell receptors that can be produced by an individual, providing insight into its potential diversity and the probability of generation of any specific T-cell receptor. Then in chapter 3, we present work on understanding the functioning of regulatory DNA sequences in both prokaryotes and eukaryotes. Here we use experiments that measure the transcriptional activity of large libraries of mutagenized promoters and enhancers and infer models of the sequence-function relationship from this data. For the bacterial promoter, we infer a physically motivated 'thermodynamic' model of the interaction of DNA-binding proteins and RNA polymerase determining the transcription rate of the downstream gene. For the eukaryotic enhancers, we infer heuristic models of the sequence-function relationship and use these models to find synthetic enhancer sequences that optimize inducibility of expression. Both projects demonstrate the utility of sequence information in conjunction with sophisticated statistical inference techniques for dissecting underlying biophysical mechanisms.
Ayers, John W.; Hofstetter, C. Richard; Hughes, Suzanne C.; Irvin, Veronica L.; Kang Sim, D. Eastern; Hovell, Melbourne F.
2009-01-01
Objective: This research identifies social reinforcers within religious institutions associated with alcohol consumption among Korean women in California. Method: Data were drawn from telephone interviews with female adults (N = 591) selected from a random sampling of persons in California with Korean surnames during 2007. Approximately 70% of attempted interviews were completed, with 92% conducted in Korean. Respondents were asked about any lifetime drinking (yes/no), drinking rate (typical number of drinks consumed on drinking days among current drinkers), and messages discouraging “excessive drinking” from religious leaders or congregants. Bivariable and multivariable regressions were used for analysis. Results: Approximately 70.4% of women reported any lifetime drinking, and drinkers drank a mean (SD) of 1.10 (1.22) drinks on drinking days. About 30.8% reported any exposure to religious leaders' messages discouraging excessive drinking, and 28.2% reported any exposure to similar messages from congregants. Each congregant's message was statistically significantly associated with a 5.1% lower probability (odds ratio = 0.775, 95% confidence interval [CI]: 0.626, 0.959) of any lifetime drinking. Also, each congregant's message was associated with a 13.8% (B = -0.138; 95% CI: -0.306, 0.029) lower drinking rate, which was statistically significant after adjusting for covariates using a one-tailed test. Exposure to leaders' messages was not statistically significantly associated with any lifetime drinking or drinking rate. Conclusions: Social reinforcement in the form of religious messages may be one mechanism by which religious institutions influence drinking behaviors. For Korean women, messages from congregants had a unique impact beyond the traditional religiosity indicators. These social mechanisms provide public health interventionists with religious pathways to improve drinking behaviors. PMID:19895765
Hitting Is Contagious in Baseball: Evidence from Long Hitting Streaks
Bock, Joel R.; Maewal, Akhilesh; Gough, David A.
2012-01-01
Data analysis is used to test the hypothesis that “hitting is contagious”. A statistical model is described to study the effect of a hot hitter upon his teammates’ batting during a consecutive game hitting streak. Box score data for entire seasons comprising streaks of length games, including a total observations were compiled. Treatment and control sample groups () were constructed from core lineups of players on the streaking batter’s team. The percentile method bootstrap was used to calculate confidence intervals for statistics representing differences in the mean distributions of two batting statistics between groups. Batters in the treatment group (hot streak active) showed statistically significant improvements in hitting performance, as compared against the control. Mean for the treatment group was found to be to percentage points higher during hot streaks (mean difference increased points), while the batting heat index introduced here was observed to increase by points. For each performance statistic, the null hypothesis was rejected at the significance level. We conclude that the evidence suggests the potential existence of a “statistical contagion effect”. Psychological mechanisms essential to the empirical results are suggested, as several studies from the scientific literature lend credence to contagious phenomena in sports. Causal inference from these results is difficult, but we suggest and discuss several latent variables that may contribute to the observed results, and offer possible directions for future research. PMID:23251507
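The percentile-method bootstrap used in this study can be sketched as follows for a difference in mean batting statistics between a treatment (streak-active) and a control group; the numbers are synthetic stand-ins, not the compiled box-score data.

```python
# A minimal sketch of a percentile-method bootstrap CI for a difference in group means.
import numpy as np

rng = np.random.default_rng(5)
treatment = rng.normal(0.270, 0.030, 200)     # synthetic per-player batting averages (streak active)
control   = rng.normal(0.262, 0.030, 200)     # synthetic control group

obs_diff = treatment.mean() - control.mean()
boot = np.array([
    rng.choice(treatment, treatment.size).mean() - rng.choice(control, control.size).mean()
    for _ in range(10_000)                    # resample each group with replacement
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"observed difference in means : {obs_diff:.4f}")
print(f"95% percentile bootstrap CI  : ({lo:.4f}, {hi:.4f})")
```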
Probabilistic models for reactive behaviour in heterogeneous condensed phase media
NASA Astrophysics Data System (ADS)
Baer, M. R.; Gartling, D. K.; DesJardin, P. E.
2012-02-01
This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.
Wang, Ming; Long, Qi
2016-09-01
Prediction models for disease risk and prognosis play an important role in biomedical research, and evaluating their predictive accuracy in the presence of censored data is of substantial interest. The standard concordance (c) statistic has been extended to provide a summary measure of predictive accuracy for survival models. Motivated by a prostate cancer study, we address several issues associated with evaluating survival prediction models based on c-statistic with a focus on estimators using the technique of inverse probability of censoring weighting (IPCW). Compared to the existing work, we provide complete results on the asymptotic properties of the IPCW estimators under the assumption of coarsening at random (CAR), and propose a sensitivity analysis under the mechanism of noncoarsening at random (NCAR). In addition, we extend the IPCW approach as well as the sensitivity analysis to high-dimensional settings. The predictive accuracy of prediction models for cancer recurrence after prostatectomy is assessed by applying the proposed approaches. We find that the estimated predictive accuracy for the models in consideration is sensitive to NCAR assumption, and thus identify the best predictive model. Finally, we further evaluate the performance of the proposed methods in both settings of low-dimensional and high-dimensional data under CAR and NCAR through simulations. © 2016, The International Biometric Society.
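The IPCW idea discussed here weights comparable pairs by the inverse of the estimated censoring survival function. The sketch below implements a bare-bones version in the spirit of Uno's truncated c-statistic on synthetic data; the Kaplan-Meier handling, truncation time and data-generating model are assumptions, and no tie handling, variance estimation or high-dimensional extension is included.

```python
# A minimal sketch of an IPCW-weighted concordance statistic (illustrative only).
import numpy as np

def censoring_survival(time, event):
    """Kaplan-Meier estimate of the censoring survival function G(t) (censoring = event == 0)."""
    order = np.argsort(time)
    t_sorted = time[order]
    cens_sorted = (event[order] == 0).astype(float)
    at_risk = time.size - np.arange(time.size)
    step = np.cumprod(1.0 - cens_sorted / at_risk)

    def G(t):
        idx = np.searchsorted(t_sorted, t, side="right") - 1
        return 1.0 if idx < 0 else float(step[idx])
    return G

def ipcw_cstat(time, event, risk_score, tau):
    """IPCW-weighted concordance restricted to event times below tau (Uno-style sketch)."""
    G = censoring_survival(time, event)
    num = den = 0.0
    for i in range(time.size):
        if event[i] == 1 and time[i] < tau:
            w = 1.0 / G(time[i]) ** 2                       # inverse probability of censoring weight
            comparable = time > time[i]
            num += w * np.sum(comparable & (risk_score[i] > risk_score))   # concordant pairs
            den += w * np.sum(comparable)
    return num / den

rng = np.random.default_rng(6)
n = 400
marker = rng.normal(size=n)                     # single prognostic marker (higher -> higher risk)
event_time = rng.exponential(np.exp(-marker))
cens_time = rng.exponential(2.0, size=n)
time = np.minimum(event_time, cens_time)
event = (event_time <= cens_time).astype(int)

tau = np.quantile(time, 0.8)                    # truncation time for the restricted c-statistic
print("IPCW c-statistic:", round(ipcw_cstat(time, event, marker, tau), 3))
```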
Turking Statistics: Student-Generated Surveys Increase Student Engagement and Performance
ERIC Educational Resources Information Center
Whitley, Cameron T.; Dietz, Thomas
2018-01-01
Thirty years ago, Hubert M. Blalock Jr. published an article in "Teaching Sociology" about the importance of teaching statistics. We honor Blalock's legacy by assessing how using Amazon Mechanical Turk (MTurk) in statistics classes can enhance student learning and increase statistical literacy among social science graduate students. In…
Determining Functional Reliability of Pyrotechnic Mechanical Devices
NASA Technical Reports Server (NTRS)
Bement, Laurence J.; Multhaup, Herbert A.
1997-01-01
This paper describes a new approach for evaluating mechanical performance and predicting the mechanical functional reliability of pyrotechnic devices. Not included are other possible failure modes, such as the initiation of the pyrotechnic energy source. The requirement of hundreds or thousands of consecutive, successful tests on identical components for reliability predictions, using the generally accepted go/no-go statistical approach routinely ignores physics of failure. The approach described in this paper begins with measuring, understanding and controlling mechanical performance variables. Then, the energy required to accomplish the function is compared to that delivered by the pyrotechnic energy source to determine mechanical functional margin. Finally, the data collected in establishing functional margin is analyzed to predict mechanical functional reliability, using small-sample statistics. A careful application of this approach can provide considerable cost improvements and understanding over that of go/no-go statistics. Performance and the effects of variables can be defined, and reliability predictions can be made by evaluating 20 or fewer units. The application of this approach to a pin puller used on a successful NASA mission is provided as an example.
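A highly simplified, hypothetical illustration of the margin idea described above: compare measured energy-required and energy-delivered samples and translate the functional margin into a normal-theory stress-strength reliability estimate. The numbers are invented, and a real small-sample analysis would add tolerance/confidence factors rather than treating the sample statistics as known.

```python
# A minimal stress-strength sketch of "functional margin" -> reliability (hypothetical data).
import numpy as np
from scipy.stats import norm

energy_required  = np.array([10.2, 11.1,  9.8, 10.6, 10.9, 10.4])   # J, measured energy to function
energy_delivered = np.array([18.5, 19.2, 17.8, 18.9, 19.6, 18.1])   # J, from the pyrotechnic source

margin = energy_delivered.mean() - energy_required.mean()
sigma = np.sqrt(energy_delivered.var(ddof=1) + energy_required.var(ddof=1))
z = margin / sigma                                  # margin expressed in combined standard deviations
print(f"functional margin: {margin:.1f} J  (z = {z:.2f})")
print(f"estimated probability that delivered energy exceeds required energy: {norm.cdf(z):.6f}")
```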
Fu, Huichao; Wang, Jiaxing; Zhou, Shenyuan; Cheng, Tao; Zhang, Wen; Wang, Qi; Zhang, Xianlong
2015-11-01
There is a rising interest in the use of patient-specific instrumentation (PSI) during total knee arthroplasty (TKA). The goal of this meta-analysis was to compare PSI with conventional instrumentation (CI) in patients undergoing TKA. A literature search was performed in PubMed, Embase, Springer, Ovid, China National Knowledge Infrastructure, and the Cochrane Library. A total of 10 randomized controlled studies involving 837 knees comparing outcomes of PSI TKAs with CI TKAs were included in the present analysis. Outcomes of interest included component alignment, surgical time, blood loss, and hospital stay. The results presented no significant differences between the two instrumentations in terms of restoring a neutral mechanical axis and femoral component placement. However, their differences have been noted regarding the alignment of the tibial component in coronal and sagittal planes. Also, 3 min less surgical time was used in PSI patients. Based on these findings, PSI appeared not to be superior to CI in terms of the post-operative mechanical axis of the limb or femoral component placement. Despite a statistical difference for operative duration, the benefit of a small reduction in surgical time with PSI is clinically irrelevant. Therapeutic study (systematic review and meta-analysis), Level I.
Analysis of moving surface structures at a laser-induced boiling front
NASA Astrophysics Data System (ADS)
Matti, R. S.; Kaplan, A. F. H.
2014-10-01
Recently, ultra-high-speed imaging has enabled the observation of moving wave patterns on metal melts undergoing laser-induced boiling. In laser materials processing, a vertical laser-induced boiling front governs processes like keyhole laser welding, laser remote fusion cutting, laser drilling and laser ablation. The observed waves originate from temperature variations that are closely related to the melt topology. To improve understanding of the essential front mechanisms and of the front topology, a deeper systematic analysis of the wave patterns was carried out for the first time. Seven geometrical shapes of bright or dark domains were distinguished and categorized, in particular bright peaks of three kinds and dark valleys, often inclined. Two categories describe special flow patterns at the top and bottom of the front. Dynamic and statistical analysis has revealed that the shapes often combine or separate from one category to another when streaming down the front. The brightness of wave peaks typically fluctuates over 20-50 μs. This variety of thermal wave observations is interpreted with respect to the accompanying surface topology of the melt, and in turn with respect to governing local mechanisms like absorption, shadowing, boiling, ablation pressure and melt acceleration. The findings can be of importance for understanding the key process mechanisms and for optimizing laser materials processing.
Transcriptome profile and unique genetic evolution of positively selected genes in yak lungs.
Lan, DaoLiang; Xiong, XianRong; Ji, WenHui; Li, Jian; Mipam, Tserang-Donko; Ai, Yi; Chai, ZhiXin
2018-04-01
The yak (Bos grunniens), a unique bovine breed distributed mainly on the Qinghai-Tibetan Plateau, is considered a good model for studying plateau adaptability in mammals. The lungs are important functional organs that enable animals to adapt to their external environment. However, the genetic mechanism underlying the adaptability of yak lungs to harsh plateau environments remains unknown. To explore the unique evolutionary process and genetic mechanism of yak adaptation to plateau environments, we performed transcriptome sequencing of yak and cattle (Bos taurus) lungs using RNA-Seq technology and a subsequent comparative analysis to identify positively selected genes in the yak. After deep sequencing, a normal yak lung transcriptome profile containing a total of 16,815 expressed genes was obtained, and the characteristics of the yak lung transcriptome were described by functional analysis. Furthermore, Ka/Ks comparison statistics identified 39 strongly positively selected genes in yak lungs. GO and KEGG analyses were then conducted for the functional annotation of these genes. The results of this study provide valuable data for further exploration of the unique evolutionary process of high-altitude hypoxia adaptation of yaks on the Tibetan Plateau and of the underlying genetic mechanism at the molecular level.
Variation Principles and Applications in the Study of Cell Structure and Aging
NASA Technical Reports Server (NTRS)
Economos, Angelos C.; Miquel, Jaime; Ballard, Ralph C.; Johnson, John E., Jr.
1981-01-01
In this report we have attempted to show that "some reality lies concealed in biological variation". This "reality" has its principles, laws, mechanisms, and rules, only a few of which we have sketched. A related idea we pursued was that important information may be lost in the process of ignoring frequency distributions of physiological variables (as is customary in experimental physiology and gerontology). We suggested that it may be advantageous to expand one's "statistical field of vision" beyond simple averages +/- standard deviations. Indeed, frequency distribution analysis may make visible some hidden information not evident from a simple qualitative analysis, particularly when the effect of some external factor or condition (e.g., aging, dietary chemicals) is being investigated. This was clearly illustrated by the application of distribution analysis in the study of variation in mouse liver cellular and fine structure, and may be true of fine structural studies in general. In living systems, structure and function interact in a dynamic way; they are "inseparable," unlike in technological systems or machines. Changes in fine structure therefore reflect changes in function. If such changes do not exceed a certain physiologic range, a quantitative analysis of structure will provide valuable information on quantitative changes in function that may not be possible or easy to measure directly. Because there is a large inherent variation in fine structure of cells in a given organ of an individual and among individuals, changes in fine structure can be analyzed only by studying frequency distribution curves of various structural characteristics (dimensions). Simple averages +/- S.D. do not in general reveal all information on the effect of a certain factor, because often this effect is not uniform; on the contrary, this will be apparent from distribution analysis because the form of the curves will be affected. We have also attempted to show in this chapter that similar general statistical principles and mechanisms may be operative in biological and technological systems. Despite the common belief that most biological and technological characteristics of interest have a symmetric bell-shaped (normal or Gaussian) distribution, we have shown that more often than not, distributions tend to be asymmetric and often resemble a so-called log-normal distribution. We saw that at least three general mechanisms may be operative, i.e., nonadditivity of influencing factors, competition among individuals for a common resource, and existence of an "optimum" value for a studied characteristic; more such mechanisms could exist.
Analysis of Variance: What Is Your Statistical Software Actually Doing?
ERIC Educational Resources Information Center
Li, Jian; Lomax, Richard G.
2011-01-01
Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…
Unlikely Fluctuations and Non-Equilibrium Work Theorems-A Simple Example.
Muzikar, Paul
2016-06-30
An exciting development in statistical mechanics has been the elucidation of a series of surprising equalities involving the work done during a nonequilibrium process. Astumian has presented an elegant example of such an equality, involving a colloidal particle undergoing Brownian motion in the presence of gravity. We analyze this example; its simplicity, and its link to geometric Brownian motion, allows us to clarify the inner workings of the equality. Our analysis explicitly shows the important role played by large, unlikely fluctuations.
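For readers who want to see such a work equality checked numerically, here is a generic sketch for an overdamped Brownian particle in a harmonic trap dragged at constant speed, for which the free-energy change is zero and the Jarzynski average <exp(-W/kT)> should equal 1. This is an analogous illustration under assumed parameters, not Astumian's gravitational/geometric-Brownian-motion example.

```python
# A minimal Jarzynski-equality check: dragged harmonic trap, overdamped Langevin dynamics.
import numpy as np

rng = np.random.default_rng(7)
kT, k_spring, gamma = 1.0, 1.0, 1.0
dt, n_steps, speed, n_traj = 1e-3, 2000, 0.5, 20_000

# start every trajectory in equilibrium in the initial trap (centered at 0)
x = np.sqrt(kT / k_spring) * rng.normal(size=n_traj)
lam = 0.0
work = np.zeros(n_traj)

for _ in range(n_steps):
    lam_new = lam + speed * dt
    # work increment = potential-energy change at fixed particle position when the trap moves
    work += 0.5 * k_spring * ((x - lam_new) ** 2 - (x - lam) ** 2)
    lam = lam_new
    # overdamped Langevin step in the updated trap
    x += -(k_spring / gamma) * (x - lam) * dt + np.sqrt(2.0 * kT * dt / gamma) * rng.normal(size=n_traj)

print("<W>/kT       :", round(work.mean() / kT, 3))                 # positive dissipated work
print("<exp(-W/kT)> :", round(np.mean(np.exp(-work / kT)), 3), "(should be close to 1)")
```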
Investigation of Mechanisms Underlying Odor Recognition.
1984-02-01
have obtained recordings of the EOG from the cribriform plate (through which the olfactory receptor nerves pass from the epithelium to the bulb...preinjection period, postinjection period) by 5 (2-day trial blocks during both periods = 10 total days). The results of this statistical analysis are...
Source        SS       df    MS      F      p
Total         11.296   119
Between Ss     6.230    11
  Groups       0.928     1   0.928   1.750  ns
  Error        5.302    10   0.530
Within Ss      5.066   108
  Injection    0.213     1   0.213   1.507  ns
Eu, Byung Chan
2008-09-07
In the traditional theories of irreversible thermodynamics and fluid mechanics, the specific volume and molar volume have been interchangeably used for pure fluids, but in this work we show that they should be distinguished from each other and given distinctive statistical mechanical representations. In this paper, we present a general formula for the statistical mechanical representation of molecular domain (volume or space) by using the Voronoi volume and its mean value that may be regarded as molar domain (volume) and also the statistical mechanical representation of volume flux. By using their statistical mechanical formulas, the evolution equations of volume transport are derived from the generalized Boltzmann equation of fluids. Approximate solutions of the evolution equations of volume transport provides kinetic theory formulas for the molecular domain, the constitutive equations for molar domain (volume) and volume flux, and the dissipation of energy associated with volume transport. Together with the constitutive equation for the mean velocity of the fluid obtained in a previous paper, the evolution equations for volume transport not only shed a fresh light on, and insight into, irreversible phenomena in fluids but also can be applied to study fluid flow problems in a manner hitherto unavailable in fluid dynamics and irreversible thermodynamics. Their roles in the generalized hydrodynamics will be considered in the sequel.
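The Voronoi-volume construction invoked in this abstract can be illustrated numerically: per-particle Voronoi cell volumes for a random configuration, with their mean playing the role of the "molar domain". In the sketch below, boundary cells are discarded rather than handled with periodic images, so it is only a rough illustration, not the paper's statistical mechanical formulation.

```python
# A minimal sketch of per-particle Voronoi volumes and their mean for a random 3D configuration.
import numpy as np
from scipy.spatial import ConvexHull, Voronoi

rng = np.random.default_rng(8)
points = rng.uniform(0.0, 1.0, size=(400, 3))    # particle positions in a unit box

vor = Voronoi(points)
volumes = []
for region_index in vor.point_region:
    region = vor.regions[region_index]
    if len(region) == 0 or -1 in region:         # skip unbounded boundary cells
        continue
    volumes.append(ConvexHull(vor.vertices[region]).volume)   # Voronoi cells are convex

volumes = np.array(volumes)
print("interior cells kept          :", volumes.size)
print("mean Voronoi ('molar') volume:", round(volumes.mean(), 4))
```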
Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun
2018-01-01
To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented online on top of the database software, using the SQL language and visualization tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each data component online, with interactive connection to the database; and generates export sheets that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements the online statistical analysis function and can provide real-time analysis results to its users.
NASA Astrophysics Data System (ADS)
Zhao, Yong; Yang, Tianhong; Bohnhoff, Marco; Zhang, Penghai; Yu, Qinglei; Zhou, Jingren; Liu, Feiyue
2018-05-01
To quantitatively understand the failure process and failure mechanism of a rock mass during the transformation from open-pit mining to underground mining, the Shirengou Iron Mine was selected as an engineering case study. The study area was determined using the rock mass basic quality classification method and the kinematic analysis method. The rock mass failure process was analyzed from the variations in apparent stress and apparent volume over time. From the temporal and spatial changes of microseismic events in location, energy, apparent stress and displacement, the migration characteristics of rock mass damage were studied. A hybrid moment tensor inversion method was used to determine the rock mass fracture source mechanisms, the fracture orientations, and the fracture scales. The fracture area can be divided into three zones: Zone A, Zone B, and Zone C. A statistical analysis of the fracture-plane orientations was carried out, and four dominant fracture planes were obtained. Finally, the slip tendency analysis method was employed, and the unstable fracture planes were identified. The results show that: (1) microseismic monitoring and hybrid moment tensor analysis can effectively characterize the failure process and failure mechanism of the rock mass; (2) during the transformation from open-pit to underground mining, the failure of the rock mass is mainly shear failure, with tensile failure mostly concentrated in the roofs of the goafs; and (3) the rock mass at the pit bottom and in the upper part of goaf No. 18 may suffer further damage.
Kokolis, John; Chakmakchi, Makdad; Theocharopoulos, Antonios; Prombonas, Anthony
2015-01-01
PURPOSE The mechanical and interfacial characterization of a laser-welded Co-Cr alloy with two different joint designs. MATERIALS AND METHODS Dumbbell cast specimens (n=30) were divided into 3 groups (R, I, K, n=10). Group R consisted of intact specimens, group I of specimens sectioned with a straight cut, and group K of specimens with a 45° bevel made at one welding edge. The microstructure and elemental distributions of the alloy and welding regions were examined by SEM/EDX analysis, and the specimens were then loaded in tension up to fracture. Tensile strength (TS) and elongation (ε) were determined and statistically compared among groups employing one-way ANOVA, the SNK multiple comparison test (α=.05), and Weibull analysis, from which the Weibull modulus m and characteristic strength σ0 were identified. Fractured surfaces were imaged by SEM. RESULTS SEM/EDX analysis showed that the cast alloy consists of two phases differing in mean atomic number contrast, while no such contrast was identified in the welded regions. EDX analysis revealed an increased Cr and Mo content at the alloy-joint interface. All mechanical properties of group I (TS, ε, m and σ0) were inferior to those of R, while group K showed intermediate values without significant differences from R and I, apart from elongation compared with group R. The fractured surfaces of all groups showed an extensive dendritic pattern, although with a finer structure in the welded groups. CONCLUSION The K-shape joint configuration should be preferred over the I, as it demonstrates improved mechanical strength and survival probability. PMID:25722836
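The Weibull analysis mentioned in the methods can be sketched as a simple linearized fit: estimate the modulus m and characteristic strength σ0 from ln ln[1/(1-F)] = m ln σ - m ln σ0. The strength values and the rank-based probability estimator below are hypothetical choices for illustration, not the study's data.

```python
# A minimal sketch of estimating the Weibull modulus m and characteristic strength sigma_0.
import numpy as np

strengths = np.sort(np.array([612., 645., 660., 671., 688., 702., 715., 731., 748., 770.]))  # MPa (hypothetical)
n = strengths.size
F = (np.arange(1, n + 1) - 0.5) / n               # median-rank style failure-probability estimator

y = np.log(-np.log(1.0 - F))                      # linearized Weibull CDF
x = np.log(strengths)
m, intercept = np.polyfit(x, y, 1)                # slope = m, intercept = -m * ln(sigma_0)
sigma_0 = np.exp(-intercept / m)
print(f"Weibull modulus m = {m:.1f}, characteristic strength sigma_0 = {sigma_0:.0f} MPa")
```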
Ramirez, Ivan I; Arellano, Daniel H; Adasme, Rodrigo S; Landeros, Jose M; Salinas, Francisco A; Vargas, Alvaro G; Vasquez, Francisco J; Lobos, Ignacio A; Oyarzun, Magdalena L; Restrepo, Ruben D
2017-02-01
Waveform analysis by visual inspection can be a reliable, noninvasive, and useful tool for detecting patient-ventilator asynchrony. However, it is a skill that requires a properly trained professional. This observational study was conducted in 17 urban ICUs. Health-care professionals (HCPs) working in these ICUs were asked to recognize different types of asynchrony shown in 3 evaluation videos. The health-care professionals were categorized according to years of experience, prior training in mechanical ventilation, profession, and number of asynchronies identified correctly. A total of 366 HCPs were evaluated. Statistically significant differences were found when HCPs with and without prior training in mechanical ventilation (trained vs non-trained HCPs) were compared according to the number of asynchronies detected correctly (of the HCPs who identified 3 asynchronies, 63 [81%] trained vs 15 [19%] non-trained, P < .001; 2 asynchronies, 72 [65%] trained vs 39 [35%] non-trained, P = .034; 1 asynchrony, 55 [47%] trained vs 61 [53%] non-trained, P = .02; 0 asynchronies, 17 [28%] trained vs 44 [72%] non-trained, P < .001). HCPs who had prior training in mechanical ventilation also increased, nearly 4-fold, their odds of identifying ≥2 asynchronies correctly (odds ratio 3.67, 95% CI 1.93-6.96, P < .001). However, neither years of experience nor profession were associated with the ability of HCPs to identify asynchrony. HCPs who have specific training in mechanical ventilation increase their ability to identify asynchrony using waveform analysis. Neither experience nor profession proved to be a relevant factor to identify asynchrony correctly using waveform analysis. Copyright © 2017 by Daedalus Enterprises.
Mechanical properties of dental resin/composite containing urchin-like hydroxyapatite.
Liu, Fengwei; Sun, Bin; Jiang, Xiaoze; Aldeyab, Sultan S; Zhang, Qinghong; Zhu, Meifang
2014-12-01
To investigate the reinforcing effect of urchin-like hydroxyapatite (UHA) in bisphenol A glycidyl methacrylate (Bis-GMA)/triethylene glycol dimethacrylate (TEGDMA) dental resin (without silica nanoparticles) and dental composites (with silica nanoparticles), and to explore the effect of HA filler morphologies and loadings on the mechanical properties. UHA was synthesized by a facile microwave irradiation method and characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), and thermogravimetric analysis (TGA). Mechanical properties of the dental resin composites containing silanized UHA were tested with a universal mechanical testing machine. Analysis of variance was used for the statistical analysis of the acquired data. The fracture morphologies of the tested composites were observed by SEM. Composites with silanized irregular particulate hydroxyapatite (IPHA) and hydroxyapatite whisker (HW) were prepared for comparative studies. Impregnation of lower loadings (5 wt% and 10 wt%) of silanized UHA into dental resin (without silica nanoparticles) substantially improved the mechanical properties; higher UHA loadings (20 wt% and 30 wt%) continued to improve the flexural modulus and microhardness, while the strength did not increase further. Compared with silanized IPHA and HW, silanized UHA, consisting of rods extending radially from the center, was embedded closely into the matrix and well dispersed in the composite, increasing the filler-matrix interfacial contact area and bonding. At higher filler loadings, UHA interlaced tightly without restricting the mobility of monomer inside, which might bear higher loads during fracture of the composite, leading to higher strengths than those of dental resins with IPHA and HW. In addition, impregnation of silanized UHA into dental composites (with silica nanoparticles) significantly improved the strength and modulus. UHA could serve as a novel reinforcing HA filler to improve the mechanical properties of dental resin and dental composite.
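A minimal sketch of the analysis-of-variance step mentioned above, comparing a mechanical property across filler loadings. The group names, loadings and strength values are hypothetical placeholders, not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical flexural strengths (MPa) for three UHA loadings; illustrative values only.
rng = np.random.default_rng(0)
resin_5wt = rng.normal(110, 8, 10)
resin_10wt = rng.normal(120, 8, 10)
resin_20wt = rng.normal(118, 8, 10)

# One-way ANOVA across filler loadings, as in the abstract's statistical analysis.
f_stat, p_value = stats.f_oneway(resin_5wt, resin_10wt, resin_20wt)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```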
Material Phase Causality or a Dynamics-Statistical Interpretation of Quantum Mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koprinkov, I. G.
2010-11-25
The internal phase dynamics of a quantum system interacting with an electromagnetic field is revealed in detail. Theoretical and experimental evidence of a causal relation between the phase of the wave function and the dynamics of the quantum system is presented systematically for the first time. A dynamics-statistical interpretation of quantum mechanics is introduced.
Non-equilibrium dog-flea model
NASA Astrophysics Data System (ADS)
Ackerson, Bruce J.
2017-11-01
We develop the open dog-flea model to serve as a check of proposed non-equilibrium theories of statistical mechanics. The model is developed in detail and then applied to four recent models for non-equilibrium statistical mechanics. Comparison of the dog-flea solution with these different models allows their claims to be checked and provides a concrete example against which the theoretical models can be tested.
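For readers unfamiliar with the model, here is a minimal simulation of the classic (closed) dog-flea, or Ehrenfest urn, model; the open variant studied in the abstract additionally exchanges fleas with the environment, which is not reproduced in this sketch.

```python
import numpy as np

# Closed dog-flea (Ehrenfest) model: N fleas shared between two dogs; at every step one
# flea, chosen uniformly at random, jumps to the other dog.
rng = np.random.default_rng(1)
N = 50                      # total number of fleas
n_on_dog_A = N              # start with all fleas on dog A (far from equilibrium)
steps = 2000
trajectory = np.empty(steps, dtype=int)

for t in range(steps):
    if rng.random() < n_on_dog_A / N:   # the chosen flea sits on dog A
        n_on_dog_A -= 1
    else:
        n_on_dog_A += 1
    trajectory[t] = n_on_dog_A

# Relaxation towards the binomial equilibrium with mean N/2.
print("late-time mean occupation of dog A:", trajectory[steps // 2:].mean())
```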
Equilibration, thermalisation, and the emergence of statistical mechanics in closed quantum systems
NASA Astrophysics Data System (ADS)
Gogolin, Christian; Eisert, Jens
2016-05-01
We review selected advances in the theoretical understanding of complex quantum many-body systems with regard to emergent notions of quantum statistical mechanics. We cover topics such as equilibration and thermalisation in pure state statistical mechanics, the eigenstate thermalisation hypothesis, the equivalence of ensembles, non-equilibration dynamics following global and local quenches as well as ramps. We also address initial state independence, absence of thermalisation, and many-body localisation. We elucidate the role played by key concepts for these phenomena, such as Lieb-Robinson bounds, entanglement growth, typicality arguments, quantum maximum entropy principles and the generalised Gibbs ensembles, and quantum (non-)integrability. We put emphasis on rigorous approaches and present the most important results in a unified language.
Domain generality vs. modality specificity: The paradox of statistical learning
Frost, Ram; Armstrong, Blair C.; Siegelman, Noam; Christiansen, Morten H.
2015-01-01
Statistical learning is typically considered to be a domain-general mechanism by which cognitive systems discover the underlying distributional properties of the input. Recent studies examining whether there are commonalities in the learning of distributional information across different domains or modalities consistently reveal, however, modality and stimulus specificity. An important question is, therefore, how and why a hypothesized domain-general learning mechanism systematically produces such effects. We offer a theoretical framework according to which statistical learning is not a unitary mechanism but a set of domain-general computational principles that operate in different modalities and are therefore subject to the specific constraints characteristic of their respective brain regions. This framework offers testable predictions, and we discuss its computational and neurobiological plausibility. PMID:25631249
Endovascular Treatment of Ischemic Stroke: An Updated Meta-Analysis of Efficacy and Safety.
Vidale, Simone; Agostoni, Elio
2017-05-01
Recent randomized trials demonstrated the superiority of mechanical thrombectomy over the best medical treatment in patients with acute ischemic stroke due to an occlusion of arteries of the proximal anterior circulation. In this updated meta-analysis, we aimed to summarize the overall clinical effects of the treatment, including the latest trials. We performed a literature search of randomized controlled trials (RCTs) published between 2010 and October 2016, comparing intravenous thrombolysis plus mechanical thrombectomy (intervention group) with best medical care alone (control group). We identified 8 trials. Primary outcomes were reduced disability at 90 days from the event and symptomatic intracranial hemorrhage. Statistical analysis was performed by pooling data into the 2 groups and evaluating outcome heterogeneity. The Mantel-Haenszel method was used to calculate odds ratios (ORs). We analyzed data for 1845 patients (interventional group: 911; control group: 934). Mechanical thrombectomy contributed to a significant reduction in disability rate compared to the best medical treatment alone (OR: 2.087; 95% confidence interval [CI]: 1.718-2.535; P < .001). We calculated that for every 100 treated patients, 16 more participants have a good outcome as a result of mechanical treatment. No significant differences between groups were observed concerning the occurrence of symptomatic hemorrhage (OR: 1.021; 95% CI: 0.641-1.629; P = .739). Mechanical thrombectomy significantly increases the functional benefit of intravenous thrombolysis in patients with acute ischemic stroke caused by arterial occlusion of the proximal anterior circulation, without a reduction in safety. These findings are relevant for the optimization of acute stroke management, including the implementation of networks between stroke centers.
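A brief sketch of the Mantel-Haenszel pooling mentioned in the abstract. The per-trial counts below are hypothetical placeholders, not the eight trials actually pooled; the sketch only shows the mechanics of the estimator.

```python
import numpy as np

# Hypothetical per-trial counts (events = good 90-day outcome); illustrative only.
# columns: [events_intervention, n_intervention, events_control, n_control]
trials = np.array([
    [60, 100, 40, 100],
    [55,  98, 35, 102],
    [70, 120, 52, 118],
])

a = trials[:, 0]                       # events, intervention
b = trials[:, 1] - a                   # non-events, intervention
c = trials[:, 2]                       # events, control
d = trials[:, 3] - c                   # non-events, control
n = trials[:, 1] + trials[:, 3]        # per-trial totals

# Mantel-Haenszel pooled odds ratio: sum(a*d/n) / sum(b*c/n).
or_mh = (a * d / n).sum() / (b * c / n).sum()
print(f"Mantel-Haenszel pooled OR = {or_mh:.2f}")
```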
A statistical design for testing apomictic diversification through linkage analysis.
Zeng, Yanru; Hou, Wei; Song, Shuang; Feng, Sisi; Shen, Lin; Xia, Guohua; Wu, Rongling
2014-03-01
The capacity of apomixis to generate maternal clones through seed reproduction has made it a useful characteristic for the fixation of heterosis in plant breeding. It has been observed that apomixis displays pronounced intra- and interspecific diversification, but the genetic mechanisms underlying this diversification remain elusive, obstructing the exploitation of this phenomenon in practical breeding programs. By capitalizing on molecular information in mapping populations, we describe and assess a statistical design that deploys linkage analysis to estimate and test the pattern and extent of apomictic differences at various levels, from genotypes to species. The design is based on two reciprocal crosses between two individuals each chosen from a hermaphrodite or monoecious species. A multinomial distribution likelihood is constructed by combining marker information from the two crosses. The EM algorithm is implemented to estimate the rate of apomixis and to test for its difference between the two plant populations or species used as parents. The design is validated by computer simulation. A real data analysis of two reciprocal crosses between hickory (Carya cathayensis) and pecan (C. illinoensis) demonstrates the utilization and usefulness of the design in practice. The design provides a tool to address fundamental and applied questions related to the evolution and breeding of apomixis.
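A toy EM sketch, under a deliberately simplified single-marker model that is not the authors' reciprocal-cross multinomial likelihood: the mother is assumed heterozygous Aa and the father aa, so apomictic offspring are maternal clones (Aa) while sexual offspring are Aa or aa with probability 1/2 each, making Aa offspring a mixture of the two origins. All counts are hypothetical.

```python
# Toy EM for an apomixis rate v from offspring marker counts (simplified assumptions above).
n_Aa, n_aa = 130, 45          # hypothetical offspring counts
v = 0.5                       # initial guess for the apomixis rate

for _ in range(100):
    # E-step: posterior probability that an Aa offspring is apomictic.
    p_apo_given_Aa = v / (v + (1.0 - v) / 2.0)
    expected_apomicts = n_Aa * p_apo_given_Aa
    # M-step: update the apomixis rate from the expected counts.
    v = expected_apomicts / (n_Aa + n_aa)

print(f"estimated apomixis rate v ≈ {v:.3f}")
```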
A practical and systematic review of Weibull statistics for reporting strengths of dental materials
Quinn, George D.; Quinn, Janet B.
2011-01-01
Objectives To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study Selection The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of “equivalent volumes” to compare measured strengths obtained from different test configurations. Conclusions Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
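To illustrate the "equivalent volume" concept the review discusses for comparing strengths from different test configurations, here is a small sketch of the standard Weibull size-scaling relation; the numerical values are illustrative assumptions.

```python
def scale_strength(sigma_1, m, v_eff_1, v_eff_2):
    """Weibull size scaling between two test configurations:
    sigma_2 = sigma_1 * (V_eff_1 / V_eff_2) ** (1 / m),
    where V_eff are the effective (stressed) volumes and m is the Weibull modulus."""
    return sigma_1 * (v_eff_1 / v_eff_2) ** (1.0 / m)

# Illustrative numbers only: a 400 MPa strength measured with a 5 mm^3 effective volume,
# rescaled to a configuration with a 20 mm^3 effective volume, assuming m = 10.
print(f"{scale_strength(400.0, 10.0, 5.0, 20.0):.1f} MPa")
```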
Synchronized LES for acoustic near-field analysis of a supersonic jet
NASA Astrophysics Data System (ADS)
S, Unnikrishnan; Gaitonde, Datta; The Ohio State University Team
2014-11-01
We develop a novel method using simultaneous, synchronized Large Eddy Simulations (LES) to examine the manner in which the plume of a supersonic jet generates the near acoustic field. Starting from a statistically stationary state, at each time-step, the first LES (Baseline) is used to obtain native perturbations, which are then localized in space, scaled to small values and injected into the second LES (Twin). At any subsequent time, the difference between the two simulations can be processed to discern how disturbances from any particular zone in the jet are modulated and filtered by the non-linear core to form the combined hydrodynamic and acoustic near field and the fully acoustic farfield. Unlike inverse techniques that use correlations between jet turbulence and far-field signals to infer causality, the current forward analysis effectively tags and tracks native perturbations as they are processed by the jet. Results are presented for a Mach 1.3 cold jet. Statistical analysis of the baseline and perturbation boost provides insight into different mechanisms of disturbance propagation, amplification, directivity, generation of intermittent wave-packet like events and the direct and indirect effect of different parts of the jet on the acoustic field. Office of Naval Research.
Kazdal, Hizir; Kanat, Ayhan; Aydin, Mehmet Dumlu; Yazar, Ugur; Guvercin, Ali Riza; Calik, Muhammet; Gundogdu, Betul
2017-01-01
Context: Sudden death from subarachnoid hemorrhage (SAH) is not uncommon. Aims: The goal of this study was to elucidate the effect of the cervical spinal roots and the related dorsal root ganglia (DRGs) on cardiorespiratory arrest following SAH. Settings and Design: This was an experimental study conducted on rabbits. Materials and Methods: The study was conducted on 22 rabbits randomly divided into three groups: control (n = 5), physiologic saline (SS; n = 6), and SAH (n = 11). Experimental SAH was induced. Seven of the 11 rabbits with SAH died within the first 2 weeks. After 20 days, the remaining animals were sacrificed. The anterior spinal arteries, arteriae nervorum of the cervical nerve roots (C6–C8), DRGs, and lungs were histopathologically examined and estimated stereologically. Statistical Analysis Used: Statistical analysis was performed using PASW Statistics 18.0 for Windows (SPSS Inc., Chicago, Illinois, USA). Intergroup differences were assessed using one-way ANOVA. Statistical significance was set at P < 0.05. Results: In the SAH group, severe anterior spinal artery (ASA) and arteriae nervorum vasospasm, axonal and neuronal degeneration, and neuronal apoptosis were observed histopathologically. Vasospasm of the ASA did not occur in the SS and control groups. There was a statistically significant increase in degenerated neuron density in the SAH group compared to the control and SS groups (P < 0.05). Cardiorespiratory disturbances, arrest, and lung edema developed more commonly in animals of the SAH group. Conclusion: Interestingly, C6–C8 DRG degeneration was secondary to vasospasm of the ASA following SAH. Cardiorespiratory disturbances or arrest can be explained by these mechanisms. PMID:28250634
Bayesian approach to inverse statistical mechanics
NASA Astrophysics Data System (ADS)
Habeck, Michael
2014-05-01
Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
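A toy illustration of the Bayesian inverse-statistical-mechanics idea, recovering an inverse temperature from observed microstates of a two-level system; here the partition function is tractable, so a grid posterior suffices, whereas the sequential Monte Carlo scheme of the paper is designed for the intractable case. All values are synthetic.

```python
import numpy as np

# Two-level system with energies {0, eps}; infer beta from observed microstates.
rng = np.random.default_rng(2)
eps, beta_true, n_obs = 1.0, 1.5, 500
p_excited = np.exp(-beta_true * eps) / (1.0 + np.exp(-beta_true * eps))
states = rng.random(n_obs) < p_excited            # True = excited state observed
k = states.sum()

betas = np.linspace(0.01, 5.0, 500)               # grid over the unknown parameter
log_z = np.log1p(np.exp(-betas * eps))            # Z(beta) = 1 + exp(-beta*eps)
log_like = -betas * eps * k - n_obs * log_z       # sum of log Boltzmann probabilities
log_post = log_like - log_like.max()              # flat prior; shift for numerical stability
post = np.exp(log_post)
post /= np.trapz(post, betas)

print(f"posterior mean beta ≈ {np.trapz(betas * post, betas):.2f} (true {beta_true})")
```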
A System-Level Pathway-Phenotype Association Analysis Using Synthetic Feature Random Forest
Pan, Qinxin; Hu, Ting; Malley, James D.; Andrew, Angeline S.; Karagas, Margaret R.; Moore, Jason H.
2015-01-01
As the cost of genome-wide genotyping decreases, the number of genome-wide association studies (GWAS) has increased considerably. However, the transition from GWAS findings to the underlying biology of various phenotypes remains challenging. As a result, due to its system-level interpretability, pathway analysis has become a popular tool for gaining insights on the underlying biology from high-throughput genetic association data. In pathway analyses, gene sets representing particular biological processes are tested for significant associations with a given phenotype. Most existing pathway analysis approaches rely on single-marker statistics and assume that pathways are independent of each other. As biological systems are driven by complex biomolecular interactions, embracing the complex relationships between single-nucleotide polymorphisms (SNPs) and pathways needs to be addressed. To incorporate the complexity of gene-gene interactions and pathway-pathway relationships, we propose a system-level pathway analysis approach, synthetic feature random forest (SF-RF), which is designed to detect pathway-phenotype associations without making assumptions about the relationships among SNPs or pathways. In our approach, the genotypes of SNPs in a particular pathway are aggregated into a synthetic feature representing that pathway via Random Forest (RF). Multiple synthetic features are analyzed using RF simultaneously and the significance of a synthetic feature indicates the significance of the corresponding pathway. We further complement SF-RF with pathway-based Statistical Epistasis Network (SEN) analysis that evaluates interactions among pathways. By investigating the pathway SEN, we hope to gain additional insights into the genetic mechanisms contributing to the pathway-phenotype association. We apply SF-RF to a population-based genetic study of bladder cancer and further investigate the mechanisms that help explain the pathway-phenotype associations using SEN. The bladder cancer associated pathways we found are both consistent with existing biological knowledge and reveal novel and plausible hypotheses for future biological validations. PMID:24535726
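A minimal sketch of the synthetic-feature idea described above: a first random forest per pathway compresses that pathway's SNP genotypes into a single synthetic feature (here the out-of-bag class probability), and a second forest over the synthetic features scores pathway-phenotype associations. The data, pathway names and sizes are simulated; this is not the authors' SF-RF implementation, and the SEN step is omitted.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n_subjects = 300
pathways = {"pathway_A": 15, "pathway_B": 20, "pathway_C": 10}   # SNPs per pathway (hypothetical)
y = rng.integers(0, 2, n_subjects)                               # case/control phenotype

synthetic = np.zeros((n_subjects, len(pathways)))
for j, (name, n_snps) in enumerate(pathways.items()):
    genotypes = rng.integers(0, 3, (n_subjects, n_snps))         # 0/1/2 allele counts
    rf = RandomForestClassifier(n_estimators=200, oob_score=True,
                                random_state=0).fit(genotypes, y)
    synthetic[:, j] = rf.oob_decision_function_[:, 1]            # one synthetic feature per pathway

rf_pathways = RandomForestClassifier(n_estimators=500, random_state=0).fit(synthetic, y)
for name, importance in zip(pathways, rf_pathways.feature_importances_):
    print(f"{name}: importance {importance:.3f}")
```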
Oehr, Lucy; Anderson, Jacqueline
2017-11-01
To undertake a systematic review and meta-analysis of the relationship between microstructural damage and cognitive function after hospitalized mixed-mechanism (HMM) mild traumatic brain injury (mTBI). PsycInfo, EMBASE, and MEDLINE were used to find relevant empirical articles published between January 2002 and January 2016. Studies that examined the specific relationship between diffusion tensor imaging (DTI) and cognitive test performance were included. The final sample comprised previously medically and psychiatrically healthy adults with HMM mTBI. Specific data were extracted including mTBI definitional criteria, descriptive statistics, outcome measures, and specific results of associations between DTI metrics and cognitive test performance. Of the 248 original articles retrieved and reviewed, 8 studies met all inclusion criteria and were included in the meta-analysis. The meta-analysis revealed statistically significant associations between reduced white matter integrity and poor performance on measures of attention (fractional anisotropy [FA]: d=.413, P<.001; mean diffusivity [MD]: d=-.407, P=.001), memory (FA: d=.347, P<.001; MD: d=-.568, P<.001), and executive function (FA: d=.246, P<.05), which persisted beyond 1 month postinjury. The findings from the meta-analysis provide clear support for an association between in vivo markers of underlying neuropathology and cognitive function after mTBI. Furthermore, these results demonstrate clearly for the first time that in vivo markers of structural neuropathology are associated with cognitive dysfunction within the domains of attention, memory, and executive function. These findings provide an avenue for future research to examine the causal relationship between mTBI-related neuropathology and cognitive dysfunction. Furthermore, they have important implications for clinical management of patients with mTBI because they provide a more comprehensive understanding of factors that are associated with cognitive dysfunction after mTBI. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
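For illustration of effect-size pooling of the kind reported above, a small fixed-effect inverse-variance sketch with hypothetical per-study Cohen's d values and variances; the review's actual pooling model is not specified in this abstract.

```python
import math

# Hypothetical per-study effect sizes (Cohen's d) and variances; illustrative only.
studies = [(0.45, 0.030), (0.38, 0.025), (0.52, 0.040), (0.30, 0.020)]

# Fixed-effect inverse-variance pooling.
weights = [1.0 / var for _, var in studies]
d_pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
z = d_pooled / se_pooled
print(f"pooled d = {d_pooled:.3f} ± {1.96 * se_pooled:.3f} (z = {z:.2f})")
```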
Arizpe, Joseph; Kravitz, Dwight J.; Walsh, Vincent; Yovel, Galit; Baker, Chris I.
2016-01-01
The Other-Race Effect (ORE) is the robust and well-established finding that people are generally poorer at facial recognition of individuals of another race than of their own race. Over the past four decades, much research has focused on the ORE because understanding this phenomenon is expected to elucidate fundamental face processing mechanisms and the influence of experience on such mechanisms. Several recent studies of the ORE in which the eye-movements of participants viewing own- and other-race faces were tracked have, however, reported highly conflicting results regarding the presence or absence of differential patterns of eye-movements to own- versus other-race faces. This discrepancy, of course, leads to conflicting theoretical interpretations of the perceptual basis for the ORE. Here we investigate fixation patterns to own- versus other-race (African and Chinese) faces for Caucasian participants using different analysis methods. While we detect statistically significant, though subtle, differences in fixation pattern using an Area of Interest (AOI) approach, we fail to detect significant differences when applying a spatial density map approach. Though there were no significant differences in the spatial density maps, the qualitative patterns matched the results from the AOI analyses reflecting how, in certain contexts, Area of Interest (AOI) analyses can be more sensitive in detecting the differential fixation patterns than spatial density analyses, due to spatial pooling of data with AOIs. AOI analyses, however, also come with the limitation of requiring a priori specification. These findings provide evidence that the conflicting reports in the prior literature may be at least partially accounted for by the differences in the statistical sensitivity associated with the different analysis methods employed across studies. Overall, our results suggest that detection of differences in eye-movement patterns can be analysis-dependent and rests on the assumptions inherent in the given analysis. PMID:26849447
On the Correct Analysis of the Foundations of Theoretical Physics
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2007-04-01
The problem of truth in science -- the most urgent problem of our time -- is discussed. A correct theoretical analysis of the foundations of theoretical physics is proposed. The principle of the unity of formal logic and rational dialectics is the methodological basis of the analysis. The main result is as follows: the generally accepted foundations of theoretical physics (i.e. Newtonian mechanics, Maxwell electrodynamics, thermodynamics, statistical physics and physical kinetics, the theory of relativity, quantum mechanics) contain a set of logical errors. These errors are explained by the existence of a global cause: they are a collateral and inevitable result of the inductive way of cognition of Nature, i.e. the result of movement from the formation of separate concepts to the formation of a system of concepts. Consequently, theoretical physics is entering its greatest crisis. This means that physics, as a science of phenomena, is leaving the stage of progress for a science of essence (information). Acknowledgment: The books ``Surprises in Theoretical Physics'' (1979) and ``More Surprises in Theoretical Physics'' (1991) by Sir Rudolf Peierls stimulated my 25-year work.
A Role for Chunk Formation in Statistical Learning of Second Language Syntax
ERIC Educational Resources Information Center
Hamrick, Phillip
2014-01-01
Humans are remarkably sensitive to the statistical structure of language. However, different mechanisms have been proposed to account for such statistical sensitivities. The present study compared adult learning of syntax and the ability of two models of statistical learning to simulate human performance: Simple Recurrent Networks, which learn by…
Interpreting support vector machine models for multivariate group wise analysis in neuroimaging
Gaonkar, Bilwaj; Shinohara, Russell T; Davatzikos, Christos
2015-01-01
Machine learning based classification algorithms like support vector machines (SVMs) have shown great promise for turning high dimensional neuroimaging data into clinically useful decision criteria. However, tracing imaging based patterns that contribute significantly to classifier decisions remains an open problem. This is an issue of critical importance in imaging studies seeking to determine which anatomical or physiological imaging features contribute to the classifier's decision, thereby allowing users to critically evaluate the findings of such machine learning methods and to understand disease mechanisms. The majority of published work addresses the question of statistical inference for support vector classification using permutation tests based on SVM weight vectors. Such permutation testing ignores the SVM margin, which is critical in SVM theory. In this work we emphasize the use of a statistic that explicitly accounts for the SVM margin and show that the null distributions associated with this statistic are asymptotically normal. Further, our experiments show that this statistic is far less conservative than weight based permutation tests and yet specific enough to tease out multivariate patterns in the data. Thus, we can better understand the multivariate patterns that the SVM uses for neuroimaging based classification. PMID:26210913
Schwiedrzik, J J; Zysset, P K
2015-01-21
Microindentation in bone is a micromechanical testing technique routinely used to extract material properties related to bone quality. As the analysis of microindentation data is based on assumptions about the contact between the indenter and the sample surface, the aim of this study was to quantify the topological variability of indentations in bone and examine its relationship with mechanical properties. Indentations were performed in dry human and ovine bone in axial and transverse directions, and their topology was measured by atomic force microscopy. Statistical shape modeling of the residual imprint allowed a mean shape to be defined and the variability to be described in terms of 21 principal components related to imprint depth, surface curvature and roughness. The indentation profile of bone was found to be highly consistent and free of any pile-up, while differing mostly by depth between species and directions. A few of the topological parameters, in particular depth, showed significant but rather weak and inconsistent correlations with variations in mechanical properties. The mechanical response of bone as well as the residual imprint shape was highly consistent within each category. We could thus verify that bone is rather homogeneous in its micromechanical properties and that indentation results are not strongly influenced by small deviations from an ideally flat surface. Copyright © 2014 Elsevier Ltd. All rights reserved.
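A sketch of the statistical-shape-modeling step described above: each residual imprint is represented as a vector of surface heights on a common grid, the mean shape is the average vector, and principal components capture the main modes of topographic variability. The imprint data, grid size and number of samples here are simulated stand-ins, not the study's AFM measurements.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n_imprints, n_grid_points = 60, 32 * 32
imprints = rng.normal(0.0, 1.0, (n_imprints, n_grid_points))     # stand-in height maps

pca = PCA(n_components=21)                  # 21 components, as in the abstract
scores = pca.fit_transform(imprints)        # per-imprint shape scores

mean_shape = pca.mean_.reshape(32, 32)      # mean imprint topography
explained = pca.explained_variance_ratio_
print("variance explained by first 5 modes:", np.round(explained[:5], 3))
```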
The effects of magnetic and mechanical microstructures on the twinning stress in Ni-Mn-Ga
NASA Astrophysics Data System (ADS)
Faran, Eilon; Benichou, Itamar; Givli, Sefi; Shilo, Doron
2015-12-01
The ferromagnetic 10M Ni-Mn-Ga alloy exhibits complex magnetic and mechanical microstructures, which are expected to form barriers to the motion of macro twin boundaries. Here, the contributions of both microstructures to the magnitude of the twinning stress are investigated experimentally. A series of uniaxial loading-unloading curves is taken under different orientation angles of a constant magnetic field. The different 180° magnetic domain microstructures that form across the twin boundary in each case are visualised using a magneto-optical film. Analysis of the different loading curves and the corresponding magnetic microstructures shows that the latter do not contribute to the barriers for twin boundary motion. Accordingly, the internal resisting stress for twin boundary motion under any magnetic field can be taken as the twinning stress measured in the absence of an external field. In addition, a statistical analysis of the fine features in the loading profiles reveals that the barrier for twinning is associated with a μm-sized characteristic length scale. This length scale corresponds to the typical thickness of the micro-twinning laminates that constitute the mechanical microstructure. These findings indicate that the magnitude of the twinning stress in 10M Ni-Mn-Ga is determined by the characteristic fine twinned mechanical microstructure of this alloy.
Kobayashi, Yutaka; Ohtsuki, Hisashi
2014-03-01
Learning abilities are categorized into social (learning from others) and individual learning (learning on one's own). Despite the typically higher cost of individual learning, there are mechanisms that allow stable coexistence of both learning modes in a single population. In this paper, we investigate by means of mathematical modeling how the effect of spatial structure on evolutionary outcomes of pure social and individual learning strategies depends on the mechanisms for coexistence. We model a spatially structured population based on the infinite-island framework and consider three scenarios that differ in coexistence mechanisms. Using the inclusive-fitness method, we derive the equilibrium frequency of social learners and the genetic load of social learning (defined as average fecundity reduction caused by the presence of social learning) in terms of some summary statistics, such as relatedness, for each of the three scenarios and compare the results. This comparative analysis not only reconciles previous models that made contradictory predictions as to the effect of spatial structure on the equilibrium frequency of social learners but also derives a simple mathematical rule that determines the sign of the genetic load (i.e. whether or not social learning contributes to the mean fecundity of the population). Copyright © 2013 Elsevier Inc. All rights reserved.
A Handbook of Sound and Vibration Parameters
1978-09-18
fixed in space. (Reference 1.) no motion at a node Static Divergence: (See Divergence.) Statistical Energy Analysis (SEA): Statistical energy analysis is...parameters of the circuits come from statistics of the vibrational characteristics of the structure. Statistical energy analysis is uniquely successful
NASA Astrophysics Data System (ADS)
Pavlos, George; Malandraki, Olga; Pavlos, Evgenios; Iliopoulos, Aggelos; Karakatsanis, Leonidas
2017-04-01
As the solar plasma lives far from equilibrium, it is an excellent laboratory for testing non-equilibrium statistical mechanics. In this study, we present the highlights of Tsallis non-extensive statistical mechanics as concerns its applications to solar plasma dynamics, especially solar wind phenomena and the magnetosphere. We present new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near-Earth solar wind environment at L1, as well as their effect on Earth's magnetosphere. The results concern Tsallis non-extensive statistics and in particular the estimation of the Tsallis q-triplet (qstat, qsen, qrel) of SEP time series observed in interplanetary space and of magnetic field time series of the ICME observed at Earth, resulting from the solar eruptive activity of March 7, 2012 at the Sun. For the magnetic field, we used a multi-spacecraft approach based on data from the ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis, different time periods were considered, sorted as "quiet", "shock" and "aftershock", while different space domains, such as interplanetary space (near Earth at L1 and upstream of the Earth's bow shock), the Earth's magnetosheath and the magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the SEP profile in time and of the magnetic field dynamics in both time and space domains during the shock event, in terms of the rate of entropy production, relaxation dynamics and non-equilibrium metastable stationary states. So far, Tsallis non-extensive statistical theory and the extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) have revealed a strong universality character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014, 2015, 2016; Karakatsanis et al. 2013). The Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long spatio-temporal correlations, fractional acceleration and non-equilibrium stationary states (NESS) or non-equilibrium self-organization processes, and non-equilibrium phase transition and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results clearly reveal strong self-organization and the development of macroscopic ordering of the plasma system, related to the strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasma region during the CME event. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.
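A sketch of one ingredient of the q-triplet estimation mentioned above: fitting a q-Gaussian to the distribution of time-series increments to obtain qstat. The series here is synthetic heavy-tailed noise standing in for SEP or magnetic-field data, and the fitting choices (binning, initial guess, bounds) are illustrative assumptions rather than the authors' procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, q, beta, a):
    # q-Gaussian: a * [1 + (q-1)*beta*x^2]^(1/(1-q)); reduces to a Gaussian as q -> 1.
    return a * np.power(1.0 + (q - 1.0) * beta * x**2, 1.0 / (1.0 - q))

rng = np.random.default_rng(5)
series = rng.standard_t(df=4, size=20000)          # heavy-tailed stand-in signal
increments = np.diff(series)
hist, edges = np.histogram(increments, bins=101, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

popt, _ = curve_fit(q_gaussian, centers, hist, p0=[1.3, 1.0, hist.max()],
                    bounds=([1.01, 1e-6, 0.0], [3.0, np.inf, np.inf]))
print(f"q_stat ≈ {popt[0]:.2f}")
```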
Double-Layer Mediated Electromechanical Response of Amyloid Fibrils in Liquid Environment
Nikiforov, M.P.; Thompson, G.L.; Reukov, V.V.; Jesse, S.; Guo, S.; Rodriguez, B.J.; Seal, K.; Vertegel, A.A.; Kalinin, S.V.
2010-01-01
Harnessing electrical bias-induced mechanical motion on the nanometer and molecular scale is a critical step towards understanding the fundamental mechanisms of redox processes and implementation of molecular electromechanical machines. Probing these phenomena in biomolecular systems requires electromechanical measurements be performed in liquid environments. Here we demonstrate the use of band excitation piezoresponse force microscopy for probing electromechanical coupling in amyloid fibrils. The approaches for separating the elastic and electromechanical contributions based on functional fits and multivariate statistical analysis are presented. We demonstrate that in the bulk of the fibril the electromechanical response is dominated by double-layer effects (consistent with shear piezoelectricity of biomolecules), while a number of electromechanically active hot spots possibly related to structural defects are observed. PMID:20088597
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros
1984-01-01
The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.
Progressive Failure And Life Prediction of Ceramic and Textile Composites
NASA Technical Reports Server (NTRS)
Xue, David Y.; Shi, Yucheng; Katikala, Madhu; Johnston, William M., Jr.; Card, Michael F.
1998-01-01
An engineering approach to predict the fatigue life and progressive failure of multilayered composite and textile laminates is presented. Analytical models which account for matrix cracking, statistical fiber failures and nonlinear stress-strain behavior have been developed for both composites and textiles. The analysis method is based on a combined micromechanics, fracture mechanics and failure statistics analysis. Experimentally derived empirical coefficients are used to account for the fiber-matrix interface, fiber strength, and fiber-matrix stiffness reductions. Similar approaches were applied to textiles using Repeating Unit Cells. In the composite fatigue analysis, Walker's equation is applied for matrix fatigue cracking and Heywood's formulation is used for fiber strength fatigue degradation. The analysis has been compared with experiment with good agreement. Comparisons were made with graphite-epoxy, C/SiC and Nicalon/CAS composite materials. For textile materials, comparisons were made with triaxially braided and plain weave materials under biaxial or uniaxial tension. Fatigue predictions were compared with test data obtained from plain weave C/SiC materials tested at AS&M. Computer codes were developed to perform the analysis: Composite Progressive Failure Analysis for Laminates is contained in the code CPFail, and Micromechanics Analysis for Textile Composites in the code MicroTex. Both codes were adapted to run as subroutines of the finite element code ABAQUS, as CPFail-ABAQUS and MicroTex-ABAQUS. A graphical user interface (GUI) was developed to connect CPFail and MicroTex with ABAQUS.
Single-Molecule Probing the Energy Landscape of Enzymatic Reaction and Non-Covalent Interactions
NASA Astrophysics Data System (ADS)
Lu, H. Peter; Hu, Dehong; Chen, Yu; Vorpagel, Erich R.
2002-03-01
We have applied single-molecule spectroscopy under physiological conditions to study the mechanisms and dynamics of T4 lysozyme enzymatic reactions, characterizing mode-specific protein conformational dynamics. Enzymatic reaction turnovers and the associated structure changes of individual protein molecules were observed simultaneously in real-time. The overall reaction rates were found to vary widely from molecule-to-molecule, and the initial non-specific binding of the enzyme to the substrate was seen to dominate this inhomogeneity. The reaction steps subsequent to the initial binding were found to have homogeneous rates. Molecular dynamics simulation has been applied to elucidate the mechanism and intermediate states of the single-molecule enzymatic reaction. Combining the analysis of single-molecule experimental trajectories, MD simulation trajectories, and statistical modeling, we have revealed the nature of multiple intermediate states involved in the active enzyme-substrate complex formation and the associated conformational change mechanism and dynamics.
Modeling Selection and Extinction Mechanisms of Biological Systems
NASA Astrophysics Data System (ADS)
Amirjanov, Adil
In this paper, the behavior of a genetic algorithm is modeled to enhance its applicability as a modeling tool for biological systems. A new description model for the selection mechanism is introduced which operates on a portion of the individuals of the population. The extinction and recolonization mechanism is modeled, and solving the dynamics analytically shows that the genetic drift in a population with extinction/recolonization is doubled. A mathematical analysis of the interaction between the selection and extinction/recolonization processes is carried out to assess the dynamics of the macroscopic statistical properties of the population. Computer simulations confirm that the theoretical predictions of the described models are good approximations. A mathematical model of GA dynamics describing anti-predator vigilance in an animal group was also examined with respect to a known analytical solution of the problem, and showed good agreement in finding the evolutionarily stable strategies.
Mendes da Costa, P; Klastersky, J; Gérard, A
1977-01-01
Between November 30, 1971 and March 15, 1976, 46 patients underwent surgery on the colon or rectum. They were randomized into 2 groups, one receiving mechanical preparation together with lincomycin, neomycin, polymyxin, kanamycin, bacitracin and nystatin, the other mechanical preparation alone. Analysis of the results reveals no statistically significant difference in the frequency of infections, either local (11/24 with antibiotics vs 13/22 without; chi2 = 0.25) or general (16/24 and 9/22; chi2 = 0.92). Nor did the postoperative use of antibiotics for local or general infection differ between the 2 groups. No influence of age or preoperative radiotherapy could be shown. This randomized trial suggests that there is little advantage in associating antibiotics with mechanical preparation before colorectal surgery. The authors contemplate a new randomized trial in high-risk patients suffering from cancer.
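For illustration, a chi-square test on a 2×2 table built from the local-infection counts quoted above. Note that the abstract's quoted chi2 = 0.25 may reflect a corrected statistic or a slightly different tabulation, so the value computed here need not match it exactly.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Local infections: 11 of 24 patients with antibiotics vs 13 of 22 with mechanical
# preparation alone, as reported in the abstract.
table = np.array([[11, 24 - 11],
                  [13, 22 - 13]])
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```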
NASA Astrophysics Data System (ADS)
Wang, Dong
2016-03-01
Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to interpret gear fault signatures, which is usually not easy for ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction of the redundant statistical features is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical features, some other popular statistical models, including linear discriminant analysis, quadratic discriminant analysis, classification and regression trees and the naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, the selection of the number of new significant features and the parameter selection of K-nearest neighbors are thoroughly investigated.
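A minimal sketch of the feature pipeline described above, using simulated vibration signals: wavelet packet decomposition, statistical features per terminal node, dimensionality reduction, then K-nearest neighbors. For brevity a db4 wavelet, 3 decomposition levels and a small feature set are used here; the paper's scheme (db44, multiple levels, 620 features) and its data are not reproduced.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def features(signal, wavelet="db4", level=3):
    # Statistical features (mean, std, kurtosis, RMS) of each terminal wavelet packet node.
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
    feats = []
    for node in wp.get_level(level, order="natural"):
        c = node.data
        feats += [c.mean(), c.std(), kurtosis(c), np.sqrt(np.mean(c**2))]
    return feats

rng = np.random.default_rng(6)
n_per_class, n_classes, length = 40, 5, 1024          # 5 hypothetical crack levels
X, y = [], []
for label in range(n_classes):
    for _ in range(n_per_class):
        sig = np.sin(2 * np.pi * (5 + label) * np.linspace(0, 1, length))
        X.append(features(sig + 0.5 * rng.standard_normal(length)))
        y.append(label)

X = PCA(n_components=10).fit_transform(np.array(X))   # keep the significant features
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print("training accuracy:", knn.score(X, y))
```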
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018. The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean
Double-slit experiment with single wave-driven particles and its relation to quantum mechanics.
Andersen, Anders; Madsen, Jacob; Reichelt, Christian; Rosenlund Ahl, Sonja; Lautrup, Benny; Ellegaard, Clive; Levinsen, Mogens T; Bohr, Tomas
2015-07-01
In a thought-provoking paper, Couder and Fort [Phys. Rev. Lett. 97, 154101 (2006)] describe a version of the famous double-slit experiment performed with droplets bouncing on a vertically vibrated fluid surface. In the experiment, an interference pattern in the single-particle statistics is found even though it is possible to determine unambiguously which slit the walking droplet passes. Here we argue, however, that the single-particle statistics in such an experiment will be fundamentally different from the single-particle statistics of quantum mechanics. Quantum mechanical interference takes place between different classical paths with precise amplitude and phase relations. In the double-slit experiment with walking droplets, these relations are lost since one of the paths is singled out by the droplet. To support our conclusions, we have carried out our own double-slit experiment, and our results, in particular the long and variable slit passage times of the droplets, cast strong doubt on the feasibility of the interference claimed by Couder and Fort. To understand theoretically the limitations of wave-driven particle systems as analogs to quantum mechanics, we introduce a Schrödinger equation with a source term originating from a localized particle that generates a wave while being simultaneously guided by it. We show that the ensuing particle-wave dynamics can capture some characteristics of quantum mechanics such as orbital quantization. However, the particle-wave dynamics can not reproduce quantum mechanics in general, and we show that the single-particle statistics for our model in a double-slit experiment with an additional splitter plate differs qualitatively from that of quantum mechanics.
[Effect of somatostatin-14 in simple mechanical obstruction of the small intestine].
Jimenez-Garcia, A; Ahmad Araji, O; Balongo Garcia, R; Nogales Munoz, A; Salguero Villadiego, M; Cantillana Martinez, J
1994-02-01
In order to investigate the properties of somatostatin-14, we studied an experimental model of simple mechanical and closed-loop occlusion. Forty-eight New Zealand rabbits were assigned randomly to three groups of 16: group C (controls) was operated and treated with saline solution (4 cc/kg/h); group A was operated and treated immediately with saline solution plus an equal dose of somatostatin-14 (3.5 micrograms/kg/h); and group B was operated and treated in the same manner as group A, but starting 8 hours after the laparotomy. The animals were sacrificed 24 hours later; intestinal secretion was quantified, blood and intestinal fluid chemistries were performed, and specimens of the intestine were prepared for histological examination. Descriptive statistical analysis of the results was performed with ANOVA, a semi-quantitative test and the covariance test. Somatostatin-14 produced an improvement (reduction) in the volume of intestinal secretion in the treated groups compared with the control group. The results were statistically significant in group B, treated after an 8-hour delay: closed loop (ml): 6.40 +/- 1.12, 2.50 +/- 0.94, 1.85 +/- 0.83, and simple mechanical occlusion (ml): 175 +/- 33.05, 89.50 +/- 9.27, 57.18 +/- 21.23, p < 0.01, for groups C, A and B, respectively. Net secretion of Cl and Na ions was also improved, p < 0.01. (ABSTRACT TRUNCATED AT 250 WORDS)
Differentiation of benign and malignant breast lesions by mechanical imaging
Kearney, Thomas; Pollak, Stanley B.; Rohatgi, Chand; Sarvazyan, Noune; Airapetian, Suren; Browning, Stephanie; Sarvazyan, Armen
2009-01-01
Mechanical imaging yields a tissue elasticity map and provides quantitative characterization of a detected pathology. The changes in the surface stress patterns as a function of applied load provide information about the elastic composition and geometry of the underlying tissue structures. The objective of this study is the clinical evaluation of the breast mechanical imager for breast lesion characterization and differentiation between benign and malignant lesions. The breast mechanical imager includes a probe with a pressure sensor array, an electronic unit providing data acquisition from the pressure sensors, and communication with a touch-screen laptop computer. We have developed an examination procedure and algorithms to provide assessment of breast lesion features such as hardness-related parameters, mobility, and shape. A statistical Bayesian classifier was constructed to distinguish between benign and malignant lesions by utilizing all the listed features as input. Clinical results for 179 cases, collected at four different clinical sites, have demonstrated that the breast mechanical imager provides reliable image formation of breast tissue abnormalities and calculation of lesion features. Malignant breast lesions (histologically confirmed) demonstrated increased hardness and strain hardening as well as decreased mobility and longer boundary length in comparison with benign lesions. Statistical analysis of differentiation capability for 147 benign and 32 malignant lesions revealed an average sensitivity of 91.4% and specificity of 86.8% with a standard deviation of ±6.1%. The area under the receiver operating characteristic curve characterizing benign and malignant lesion discrimination is 86.1%, with a confidence interval ranging from 80.3 to 90.9% at a significance level of P = 0.0001 (area = 50%). This multisite clinical study demonstrated the capability of mechanical imaging for characterization and differentiation of benign and malignant breast lesions. We hypothesize that the breast mechanical imager has the potential to be used as a cost-effective device for cancer diagnostics that could reduce the benign biopsy rate, serve as an adjunct to mammography, and be utilized as a screening device for breast cancer detection. PMID:19306059
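A small sketch of a Bayesian classifier over lesion features of the kind listed above (hardness, strain hardening, mobility, boundary length), reporting sensitivity, specificity and ROC area. The feature values are simulated and a Gaussian naive Bayes model stands in for the study's classifier; metrics are computed on the training data for brevity.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(7)
n_benign, n_malignant = 147, 32          # group sizes as in the abstract; features simulated
benign = rng.normal([1.0, 0.2, 1.0, 10.0], [0.3, 0.1, 0.3, 2.0], (n_benign, 4))
malignant = rng.normal([1.8, 0.5, 0.5, 14.0], [0.3, 0.1, 0.3, 2.0], (n_malignant, 4))
X = np.vstack([benign, malignant])
y = np.array([0] * n_benign + [1] * n_malignant)

clf = GaussianNB().fit(X, y)
probs = clf.predict_proba(X)[:, 1]
tn, fp, fn, tp = confusion_matrix(y, probs > 0.5).ravel()
print(f"sensitivity {tp / (tp + fn):.2f}, specificity {tn / (tn + fp):.2f}, "
      f"AUC {roc_auc_score(y, probs):.2f}")
```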
Griessenauer, Christoph J; Medin, Caroline; Maingard, Julian; Chandra, Ronil V; Ng, Wyatt; Brooks, Duncan Mark; Asadi, Hamed; Killer-Oberpfalzer, Monika; Schirmer, Clemens M; Moore, Justin M; Ogilvy, Christopher S; Thomas, Ajith J; Phan, Kevin
2018-02-01
Mechanical thrombectomy has become the standard of care for management of most large vessel occlusion (LVO) strokes. When patients with LVO present with minor stroke symptomatology, no consensus on the role of mechanical thrombectomy exists. A systematic review and meta-analysis were performed to identify studies that focused on mechanical thrombectomy, either as a standalone treatment or with intravenous tissue plasminogen activator (IV tPA), in patients with mild strokes with LVO, defined as a baseline National Institutes of Health Stroke Scale score ≤5 at presentation. Data on methodology, quality criteria, and outcome measures were extracted, and outcomes were compared using odds ratio as a summary statistic. Five studies met the selection criteria and were included. When compared with medical therapy without IV tPA, mechanical thrombectomy and medical therapy with IV tPA were associated with improved 90-day modified Rankin Scale (mRS) score. Among medical patients who were not eligible for IV tPA, those who underwent mechanical thrombectomy were more likely to experience good 90-day mRS than those who were not. There was no significant difference in functional outcome between mechanical thrombectomy and medical therapy with IV tPA, and no treatment subgroup was associated with intracranial hemorrhage or death. In patients with mild strokes due to LVO, mechanical thrombectomy and medical therapy with IV tPA led to better 90-day functional outcome. Mechanical thrombectomy plays an important role in the management of these patients, particularly in those not eligible for IV tPA. Copyright © 2017 Elsevier Inc. All rights reserved.