Factor Analysis and Counseling Research
ERIC Educational Resources Information Center
Weiss, David J.
1970-01-01
Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…
Analytic uncertainty and sensitivity analysis of models with input correlations
NASA Astrophysics Data System (ADS)
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often arise in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With this method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response, which allows one to decide whether the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of the analytic method for general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
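For the simplest (linear) case, the effect the abstract describes can be sketched directly: the output variance of a linear model with correlated inputs follows from the input covariance matrix. This is a generic illustration, not the paper's method; the coefficients and correlation value are invented for the example.

```python
import numpy as np

# For a linear model y = a @ x, the output variance is
# Var(y) = a @ C @ a, where C is the input covariance matrix.
# Comparing C with and without off-diagonal terms isolates the
# contribution of input correlations to the response uncertainty.
a = np.array([1.0, 2.0])        # model coefficients (illustrative)
sd = np.array([0.5, 0.3])       # input standard deviations (illustrative)
rho = 0.8                       # input correlation (illustrative)

C_indep = np.diag(sd**2)        # independent inputs
C_corr = C_indep.copy()
C_corr[0, 1] = C_corr[1, 0] = rho * sd[0] * sd[1]

var_indep = a @ C_indep @ a     # 0.25 + 4 * 0.09 = 0.61
var_corr = a @ C_corr @ a       # 0.61 + 2 * 1 * 2 * 0.12 = 1.09
print(var_indep, var_corr)
```

Here the positive input correlation nearly doubles the response variance, which is the kind of effect that motivates including correlations in the analysis.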
Fuzzy correlation analysis with realization
NASA Astrophysics Data System (ADS)
Tang, Yue Y.; Fan, Xinrui; Zheng, Ying N.
1998-10-01
The fundamental concept of fuzzy correlation is briefly discussed. Starting from the correlation coefficients of classic correlation, polarity correlation, and fuzzy correlation, the relationships among these correlations are analyzed. Fuzzy correlation analysis offers both rapidity and accuracy, since some amplitude information of the random signals is utilized, and it has broad prospects for application. A fuzzy correlation analyzer built around an NLX 112 fuzzy data correlator and a single-chip microcomputer is introduced.
An Evaluation of Jet Impingement Heat Transfer Correlations for Piccolo Tube Application
NASA Technical Reports Server (NTRS)
Bond, Thomas (Technical Monitor); Wright, William B.
2004-01-01
Impinging jets have been used for a wide variety of applications where high rates of heat transfer are desired. This report will present a review of heat transfer correlations that have been published. The correlations were then added to the LEWICE software to evaluate the applicability of these correlations to a piccolo tube anti-icing system. The results of this analysis were then compared quantitatively to test results on a representative piccolo tube system.
Frojo, Gianfranco; Tadisina, Kashyap Komarraju; Pressman, Zachary; Chibnall, John T; Lin, Alexander Y; Kraemer, Bruce A
2016-12-01
The integrated plastic surgery match is a competitive process not only for applicants but also for programs vying for highly qualified candidates. Interactions between applicants and program constituents are limited to a single interview visit. The authors aimed to identify components of the interview visit that influence applicant decision making when determining a final program rank list. Thirty-six applicants who were interviewed (100% response) completed the survey. Applicants rated the importance of 20 elements of the interview visit regarding future ranking of the program on a 1 to 5 Likert scale. Data were analyzed using descriptive statistics, hierarchical cluster analysis, analysis of variance, and Pearson correlations. A literature review was performed regarding the plastic surgery integrated residency interview process. Survey questions were categorized into four groups based on mean survey responses: (1) interactions with faculty and residents (mean response > 4); (2) information about the program (3.5-4); (3) ancillaries (food, amenities, stipends) (3-3.5); (4) hospital tour and hotel (< 3). Hierarchical item cluster analysis and analysis of variance testing validated these groupings. Average summary scores were calculated for the items representing Interactions, Information, and Ancillaries. Correlation analysis between clusters yielded no significant correlations. A review of the literature yielded a paucity of data on analysis of the interview visit. The interview visit consists of a discrete hierarchy of perceived importance by applicants. The strongest independent factor in determining future program ranking is the quality of interactions between applicants and program constituents on the interview visit. This calls for further investigation and optimization of the interview visit experience.
Intracellular applications of fluorescence correlation spectroscopy: prospects for neuroscience.
Kim, Sally A; Schwille, Petra
2003-10-01
Based on time-averaging fluctuation analysis of small fluorescent molecular ensembles in equilibrium, fluorescence correlation spectroscopy has recently been applied to investigate processes in the intracellular milieu. The exquisite sensitivity of fluorescence correlation spectroscopy provides access to a multitude of measurement parameters (rates of diffusion, local concentration, states of aggregation and molecular interactions) in real time with fast temporal and high spatial resolution. The introduction of dual-color cross-correlation, imaging, two-photon excitation, and coincidence analysis coupled with fluorescence correlation spectroscopy has expanded the utility of the technique to encompass a wide range of promising applications in living cells that may provide unprecedented insight into understanding the molecular mechanisms of intracellular neurobiological processes.
Two-dimensional correlation spectroscopy — Biannual survey 2007-2009
NASA Astrophysics Data System (ADS)
Noda, Isao
2010-06-01
The publication activities in the field of 2D correlation spectroscopy are surveyed with the emphasis on papers published during the last two years. Pertinent review articles and conference proceedings are discussed first, followed by the examination of noteworthy developments in the theory and applications of 2D correlation spectroscopy. Specific topics of interest include Pareto scaling, analysis of randomly sampled spectra, 2D analysis of data obtained under multiple perturbations, evolution of 2D spectra along additional variables, comparison and quantitative analysis of multiple 2D spectra, orthogonal sample design to eliminate interfering cross peaks, quadrature orthogonal signal correction and other data transformation techniques, data pretreatment methods, moving window analysis, extension of kernel and global phase angle analysis, covariance and correlation coefficient mapping, variant forms of sample-sample correlation, and different display methods. Various static and dynamic perturbation methods used in 2D correlation spectroscopy, e.g., temperature, composition, chemical reactions, H/D exchange, physical phenomena like sorption, diffusion and phase transitions, optical and biological processes, are reviewed. Analytical probes used in 2D correlation spectroscopy include IR, Raman, NIR, NMR, X-ray, mass spectrometry, chromatography, and others. Application areas of 2D correlation spectroscopy are diverse, encompassing synthetic and natural polymers, liquid crystals, proteins and peptides, biomaterials, pharmaceuticals, food and agricultural products, solutions, colloids, surfaces, and the like.
A time-series approach to dynamical systems from classical and quantum worlds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fossion, Ruben
2014-01-08
This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.
Spatio-Chromatic Adaptation via Higher-Order Canonical Correlation Analysis of Natural Images
Gutmann, Michael U.; Laparra, Valero; Hyvärinen, Aapo; Malo, Jesús
2014-01-01
Independent component and canonical correlation analysis are two general-purpose statistical methods with wide applicability. In neuroscience, independent component analysis of chromatic natural images explains the spatio-chromatic structure of primary cortical receptive fields in terms of properties of the visual environment. Canonical correlation analysis similarly explains chromatic adaptation to different illuminations. But, as we show in this paper, neither of the two methods generalizes well to explain both spatio-chromatic processing and adaptation at the same time. We propose a statistical method which combines the desirable properties of independent component and canonical correlation analysis: It finds independent components in each data set which, across the two data sets, are related to each other via linear or higher-order correlations. The new method is as widely applicable as canonical correlation analysis, and also to more than two data sets. We call it higher-order canonical correlation analysis. When applied to chromatic natural images, we found that it provides a single (unified) statistical framework which accounts for both spatio-chromatic processing and adaptation. Filters with spatio-chromatic tuning properties as in the primary visual cortex emerged and corresponding-colors psychophysics was reproduced reasonably well. We used the new method to make a theory-driven testable prediction on how the neural response to colored patterns should change when the illumination changes. We predict shifts in the responses which are comparable to the shifts reported for chromatic contrast habituation. PMID:24533049
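The higher-order method builds on classic linear canonical correlation analysis, which can itself be sketched in a few lines via orthogonalization (QR) and an SVD. This is the standard linear CCA only, not the authors' higher-order extension; the two synthetic data sets sharing one latent source are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Two data sets sharing a single latent source z (illustrative)
z = rng.standard_normal(n)
X = np.column_stack([z + 0.1 * rng.standard_normal(n),
                     rng.standard_normal(n)])
Y = np.column_stack([rng.standard_normal(n),
                     z + 0.1 * rng.standard_normal(n)])

def cca(X, Y):
    """Classic linear CCA: canonical correlations are the singular
    values of Qx^T Qy, where Qx, Qy are orthonormal bases of the
    centered data matrices (Bjorck-Golub formulation)."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    qx, _ = np.linalg.qr(Xc)
    qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(qx.T @ qy, compute_uv=False)
    return s  # sorted in descending order

print(cca(X, Y))  # first canonical correlation near 1, second near 0
```

The shared latent source yields one strong canonical pair; the remaining dimensions, being independent noise, give a near-zero second canonical correlation.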
Accuracy of the Parallel Analysis Procedure with Polychoric Correlations
ERIC Educational Resources Information Center
Cho, Sun-Joo; Li, Feiming; Bandalos, Deborah
2009-01-01
The purpose of this study was to investigate the application of the parallel analysis (PA) method for choosing the number of factors in component analysis for situations in which data are dichotomous or ordinal. Although polychoric correlations are sometimes used as input for component analyses, the random data matrices generated for use in PA…
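Horn's parallel analysis, the procedure the study evaluates, compares the observed eigenvalues against eigenvalues obtained from random data of the same shape. A minimal sketch using Pearson correlations on continuous data follows; the study's polychoric case would substitute a polychoric correlation matrix, and the two-factor data here are synthetic.

```python
import numpy as np

def parallel_analysis(data, n_iter=200, quantile=95, seed=0):
    """Retain components whose observed correlation-matrix eigenvalue
    exceeds the chosen quantile of eigenvalues from random normal
    data of the same dimensions (Horn's parallel analysis)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.standard_normal((n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresh = np.percentile(rand, quantile, axis=0)
    return int(np.sum(obs > thresh))

# Synthetic data: six variables driven by two independent factors
rng = np.random.default_rng(1)
f1, f2 = rng.standard_normal((2, 500))
noise = 0.5 * rng.standard_normal((500, 6))
data = np.column_stack([f1, f1, f1, f2, f2, f2]) + noise
print(parallel_analysis(data))  # retains 2 components
```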
Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results
NASA Technical Reports Server (NTRS)
Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul
1992-01-01
The primary objective of structural analysis for aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluation of structural modifications, or control system design. Verification of the FEM is generally obtained by correlating the FEM with test results. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.
Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics
1988-12-01
II. CLASSIFICATION OF PREDICTION METHODS: Prediction methods can be classified into two main approaches: 1) correlation methodologies … V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES, 1. COMPONENT BUILD-UP IN DRAG: The new correlation can be used for an engineering…
NASA Astrophysics Data System (ADS)
Zimnyakov, Dmitry A.; Tuchin, Valery V.; Yodh, Arjun G.; Mishin, Alexey A.; Peretochkin, Igor S.
1998-04-01
Relationships between decorrelation and depolarization of coherent light scattered by disordered media are examined using the concept of photon path distribution functions. Analysis of the behavior of the autocorrelation functions of the scattered field fluctuations and of their polarization properties allows us to introduce a generalized parameter of scattering media, the specific correlation time. The specific correlation time has been determined for phantom scattering media (aqueous suspensions of polystyrene spheres). Results of statistical, correlation, and polarization analyses of static and dynamic speckle patterns, carried out in experiments with human sclera with artificially controlled optical transmittance, are presented. Possible applications of this polarization-correlation technique for monitoring and visualizing non-single-scattering tissue structures are discussed.
Tremor Frequency Assessment by iPhone® Applications: Correlation with EMG Analysis.
Araújo, Rui; Tábuas-Pereira, Miguel; Almendra, Luciano; Ribeiro, Joana; Arenga, Marta; Negrão, Luis; Matos, Anabela; Morgadinho, Ana; Januário, Cristina
2016-10-19
Tremor frequency analysis is usually performed by EMG studies, but accelerometers are progressively coming into wider use. The iPhone® contains an accelerometer, and many applications claim to be capable of measuring tremor frequency. We tested three applications in twenty-two patients with a diagnosis of PD, ET, or Holmes' tremor. Needle EMG assessment and accelerometry were performed at the same time. There were very strong correlations (Pearson > 0.8, p < 0.001) among the three applications, the needle EMG, and the accelerometry. Our data suggest that the apps LiftPulse®, iSeismometer® and Studymytremor® are a reliable alternative to EMG for tremor frequency assessment.
DGCA: A comprehensive R package for Differential Gene Correlation Analysis.
McKenzie, Andrew T; Katsyv, Igor; Song, Won-Min; Wang, Minghui; Zhang, Bin
2016-11-15
Dissecting the regulatory relationships between genes is a critical step towards building accurate predictive models of biological systems. A powerful approach towards this end is to systematically study the differences in correlation between gene pairs in more than one distinct condition. In this study we develop an R package, DGCA (for Differential Gene Correlation Analysis), which offers a suite of tools for computing and analyzing differential correlations between gene pairs across multiple conditions. To minimize parametric assumptions, DGCA computes empirical p-values via permutation testing. To understand differential correlations at a systems level, DGCA performs higher-order analyses such as measuring the average difference in correlation and multiscale clustering analysis of differential correlation networks. Through a simulation study, we show that the straightforward z-score based method that DGCA employs significantly outperforms the existing alternative methods for calculating differential correlation. Application of DGCA to the TCGA RNA-seq data in breast cancer not only identifies key changes in the regulatory relationships between TP53 and PTEN and their target genes in the presence of inactivating mutations, but also reveals an immune-related differential correlation module that is specific to triple negative breast cancer (TNBC). DGCA is an R package for systematically assessing the difference in gene-gene regulatory relationships under different conditions. This user-friendly, effective, and comprehensive software tool will greatly facilitate the application of differential correlation analysis in many biological studies and thus will help identification of novel signaling pathways, biomarkers, and targets in complex biological systems and diseases.
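The z-score approach the package builds on can be sketched with Fisher's r-to-z transform: the difference of transformed correlations in two conditions, scaled by its standard error. This is a generic sketch of that test, not DGCA's exact implementation (which adds permutation-based empirical p-values); the data are synthetic.

```python
import numpy as np

def diff_corr_z(x1, y1, x2, y2):
    """z-score for a difference in Pearson correlation between two
    conditions, via Fisher's r-to-z transform (sketch of the kind of
    statistic differential correlation analysis is built on)."""
    r1 = np.corrcoef(x1, y1)[0, 1]
    r2 = np.corrcoef(x2, y2)[0, 1]
    z1, z2 = np.arctanh(r1), np.arctanh(r2)          # Fisher transform
    se = np.sqrt(1.0 / (len(x1) - 3) + 1.0 / (len(x2) - 3))
    return (z1 - z2) / se

rng = np.random.default_rng(0)
n = 300
x1 = rng.standard_normal(n)
y1 = x1 + 0.5 * rng.standard_normal(n)   # condition 1: correlated pair
x2 = rng.standard_normal(n)
y2 = rng.standard_normal(n)              # condition 2: uncorrelated pair
print(diff_corr_z(x1, y1, x2, y2))       # large positive z-score
```

A gene pair strongly correlated in one condition and not in the other, as in the TP53/PTEN examples above, would produce exactly this kind of extreme z-score.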
Co-occurrence correlations of heavy metals in sediments revealed using network analysis.
Liu, Lili; Wang, Zhiping; Ju, Feng; Zhang, Tong
2015-01-01
In this study, a correlation-based approach was used to identify co-occurrence correlations among metals in the marine sediment of Hong Kong, based on long-term (1991-2011) temporal and spatial monitoring data. Fourteen of the 45 marine sediment monitoring stations were selected from three representative areas: Deep Bay, Victoria Harbour, and Mirs Bay. Spearman's rank correlation-based network analysis was first conducted to identify co-occurrence correlations of metals from the raw metadata, and then repeated on the normalized metadata. The correlation patterns obtained by network analysis were consistent with those obtained by other statistical normalization methods, including annual ratios, the R-squared coefficient, and the Pearson correlation coefficient. Both Deep Bay and Victoria Harbour have been polluted by heavy metals, especially Pb and Cu, which showed strong co-occurrence with other heavy metals (e.g., Cr, Ni, and Zn) and little correlation with the reference parameters (Fe or Al). For Mirs Bay, which has better marine sediment quality than Deep Bay and Victoria Harbour, the co-occurrence patterns revealed by network analysis indicated that the metals in sediment predominantly followed natural geographic processes. Beyond its wide applications in biology, sociology, and informatics, this is the first application of network analysis to research on environmental pollution. This study demonstrated its power for revealing co-occurrence correlations among heavy metals in marine sediments, an approach that could be further applied to other pollutants in various environmental systems.
Influence in Canonical Correlation Analysis.
ERIC Educational Resources Information Center
Romanazzi, Mario
1992-01-01
The perturbation theory of the generalized eigenproblem is used to derive influence functions of each squared canonical correlation coefficient and the corresponding canonical vector pair. Three sample versions of these functions are described, and some properties are noted. Two obvious applications, multiple correlation and correspondence…
Detrended fluctuation analysis made flexible to detect range of cross-correlated fluctuations
NASA Astrophysics Data System (ADS)
Kwapień, Jarosław; Oświecimka, Paweł; Drożdż, Stanisław
2015-11-01
The detrended cross-correlation coefficient ρDCCA has recently been proposed to quantify the strength of cross-correlations on different temporal scales in bivariate, nonstationary time series. It is based on the detrended cross-correlation and detrended fluctuation analyses (DCCA and DFA, respectively) and can be viewed as an analog of the Pearson coefficient in the case of the fluctuation analysis. The coefficient ρDCCA works well in many practical situations but by construction its applicability is limited to detection of whether two signals are generally cross-correlated, without the possibility to obtain information on the amplitude of fluctuations that are responsible for those cross-correlations. In order to introduce some related flexibility, here we propose an extension of ρDCCA that exploits the multifractal versions of DFA and DCCA: multifractal detrended fluctuation analysis and multifractal detrended cross-correlation analysis, respectively. The resulting new coefficient ρq not only is able to quantify the strength of correlations but also allows one to identify the range of detrended fluctuation amplitudes that are correlated in two signals under study. We show how the coefficient ρq works in practical situations by applying it to stochastic time series representing processes with long memory: autoregressive and multiplicative ones. Such processes are often used to model signals recorded from complex systems and complex physical phenomena like turbulence, so we are convinced that this new measure can successfully be applied in time-series analysis. In particular, we present an example of such application to highly complex empirical data from financial markets. The present formulation can straightforwardly be extended to multivariate data in terms of the q -dependent counterpart of the correlation matrices and then to the network representation.
Autocorrelation and cross-correlation in time series of homicide and attempted homicide
NASA Astrophysics Data System (ADS)
Machado Filho, A.; da Silva, M. F.; Zebende, G. F.
2014-04-01
We propose in this paper to establish the relationship between homicides and attempted homicides by a non-stationary time-series analysis. This analysis is carried out by detrended fluctuation analysis (DFA), detrended cross-correlation analysis (DCCA), and the DCCA cross-correlation coefficient, ρ(n). Through this analysis we can identify a positive cross-correlation between homicides and attempted homicides. At the same time, from the point of view of autocorrelation (DFA), the analysis is more or less informative depending on the time scale: for short scales (days) we cannot identify autocorrelations; on the scale of weeks DFA presents anti-persistent behavior; and for long time scales (n > 90 days) DFA presents persistent behavior. Finally, the application of this new type of statistical analysis proved to be efficient and, in this sense, this paper can contribute to more accurate descriptive statistics of crime.
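The quantities used here can be sketched in a few lines: DFA and DCCA share the same box-wise detrending of integrated profiles, and ρ(n) is the detrended covariance normalized by the two detrended variances. A minimal linear-detrending sketch with non-overlapping boxes follows; the synthetic series sharing a common component are invented for illustration.

```python
import numpy as np

def detrended_covariance(x, y, n):
    """F^2(n): mean detrended covariance of the integrated profiles
    of x and y over non-overlapping boxes of size n, with linear
    detrending in each box (minimal DFA/DCCA building block)."""
    X = np.cumsum(x - x.mean())
    Y = np.cumsum(y - y.mean())
    N, t = len(X), np.arange(n)
    f2 = []
    for start in range(0, N - n + 1, n):
        xs, ys = X[start:start + n], Y[start:start + n]
        px = np.polyval(np.polyfit(t, xs, 1), t)   # local linear trend
        py = np.polyval(np.polyfit(t, ys, 1), t)
        f2.append(np.mean((xs - px) * (ys - py)))
    return np.mean(f2)

def rho_dcca(x, y, n):
    """DCCA coefficient: rho(n) = F2_xy(n) / (F_xx(n) * F_yy(n))."""
    return detrended_covariance(x, y, n) / np.sqrt(
        detrended_covariance(x, x, n) * detrended_covariance(y, y, n))

rng = np.random.default_rng(0)
common = rng.standard_normal(5000)           # shared component
x = common + rng.standard_normal(5000)
y = common + rng.standard_normal(5000)
print(rho_dcca(x, y, 32))                    # positive cross-correlation
```

By construction ρ(n) equals 1 for a series against itself, and for these two series (which share half their variance) it comes out near 0.5, mirroring the Pearson coefficient as the abstract notes.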
NASA Technical Reports Server (NTRS)
Toossi, Mostafa; Weisenburger, Richard; Hashemi-Kia, Mostafa
1993-01-01
This paper presents a summary of some of the work performed by McDonnell Douglas Helicopter Company under the NASA Langley-sponsored rotorcraft structural dynamics program known as DAMVIBS (Design Analysis Methods for VIBrationS). A set of guidelines applicable to the dynamic modeling, analysis, testing, and correlation of both helicopter airframes and a large variety of structural finite element models is presented. Utilization of these guidelines and the key features of their application to vibration modeling of helicopter airframes are discussed. Correlation studies with test data, together with the development and application of a set of efficient finite element model checkout procedures, are demonstrated on a large helicopter airframe finite element model. Finally, the lessons learned and the benefits resulting from this program are summarized.
Sensitivity analysis of a sound absorption model with correlated inputs
NASA Astrophysics Data System (ADS)
Chai, W.; Christen, J.-L.; Zine, A.-M.; Ichchou, M.
2017-04-01
Sound absorption in porous media is a complex phenomenon, usually addressed with homogenized models that depend on macroscopic parameters. Since these parameters emerge from the structure at the microscopic scale, they may be correlated. This paper deals with sensitivity analysis methods for a sound absorption model with correlated inputs. Specifically, the Johnson-Champoux-Allard (JCA) model is chosen as the objective model, with correlation effects generated by a secondary micro-macro semi-empirical model. To deal with this case, a relatively new sensitivity analysis method, the Fourier Amplitude Sensitivity Test with Correlation design (FASTC) based on Iman's transform, is applied. This method requires a priori information such as the variables' marginal distribution functions and their correlation matrix. The results are compared with the Correlation Ratio Method (CRM) for reference and validation. The distribution of the macroscopic variables arising from the microstructure, as well as their correlation matrix, is studied. Finally, the test results show that correlation has a very important impact on the outcome of sensitivity analysis, and the effect of correlation strength among input variables on the sensitivity analysis is also assessed.
Application of abstract harmonic analysis to the high-speed recognition of images
NASA Technical Reports Server (NTRS)
Usikov, D. A.
1979-01-01
Methods are constructed for rapidly computing correlation functions using the theory of abstract harmonic analysis. The theory developed includes, as a particular case, the familiar Fourier-transform method for computing a correlation function, which makes it possible to find images independently of their translation in the plane. Two examples of the application of the general theory are the search for images independent of their rotation and scale, and the search for images independent of their translations and rotations in the plane.
Integrated data analysis for genome-wide research.
Steinfath, Matthias; Repsilber, Dirk; Scholz, Matthias; Walther, Dirk; Selbig, Joachim
2007-01-01
Integrated data analysis is introduced as the intermediate level of a systems biology approach to analyse different 'omics' datasets, i.e., genome-wide measurements of transcripts, protein levels or protein-protein interactions, and metabolite levels aiming at generating a coherent understanding of biological function. In this chapter we focus on different methods of correlation analyses ranging from simple pairwise correlation to kernel canonical correlation which were recently applied in molecular biology. Several examples are presented to illustrate their application. The input data for this analysis frequently originate from different experimental platforms. Therefore, preprocessing steps such as data normalisation and missing value estimation are inherent to this approach. The corresponding procedures, potential pitfalls and biases, and available software solutions are reviewed. The multiplicity of observations obtained in omics-profiling experiments necessitates the application of multiple testing correction techniques.
Kim, Sang Hyun
2013-09-01
The purpose of this study was to investigate applicants' behavioral characteristics based on the evaluation of the cognitive, affective, and social domains reflected in self-introduction letters and professors' recommendation letters. Self-introduction letters and professors' recommendation letters of 109 applicants to medical school were collected. Frequency analysis and simple correlation analysis were performed on both types of letters. Frequency analysis showed that affective characteristics were mentioned most often in self-introduction letters, whereas cognitive characteristics were described most frequently in professors' recommendation letters. There was a strong correlation between the cognitive domain of the self-introduction letter and the cognitive domain of the professor's recommendation letter, and between the affective domain of the self-introduction letter and the cognitive domain of the professor's recommendation letter. It is very important to make full use of both letters when selecting medical students. Based on the frequency and correlation analyses, more specific guidelines need to be developed to secure fairness and objectivity in evaluating self-introduction letters and professors' recommendation letters.
Sequential Dictionary Learning From Correlated Data: Application to fMRI Data Analysis.
Seghouane, Abd-Krim; Iqbal, Asif
2017-03-22
Sequential dictionary learning via the K-SVD algorithm has emerged as a successful alternative to conventional data-driven methods, such as independent component analysis (ICA), for functional magnetic resonance imaging (fMRI) data analysis. However, fMRI datasets are structured data matrices with notions of spatio-temporal correlation and temporal smoothness. This prior information has not been included in the K-SVD algorithm when applied to fMRI data analysis. In this paper we propose three variants of the K-SVD algorithm dedicated to fMRI data analysis that account for this prior information. The proposed algorithms differ from the K-SVD in their sparse coding and dictionary update stages. The first two algorithms account for the known correlation structure in the fMRI data by using the squared Q, R-norm instead of the Frobenius norm for matrix approximation. The third algorithm accounts for both the known correlation structure and the temporal smoothness, which is incorporated in the dictionary update stage via penalized regularization of the dictionary atoms. The performance of the proposed dictionary learning algorithms is illustrated through simulations and applications on real fMRI data.
Topics on Test Methods for Space Systems and Operations Safety: Applicability of Experimental Data
NASA Technical Reports Server (NTRS)
Hirsch, David B.
2009-01-01
This viewgraph presentation reviews topics on test methods for space systems and operations safety through experimentation and analysis. The contents include: 1) Perception of reality through experimentation and analysis; 2) Measurements, methods, and correlations with real life; and 3) Correlating laboratory aerospace materials flammability data with data in spacecraft environments.
Classical Item Analysis Using Latent Variable Modeling: A Note on a Direct Evaluation Procedure
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2011-01-01
A directly applicable latent variable modeling procedure for classical item analysis is outlined. The method allows one to point and interval estimate item difficulty, item correlations, and item-total correlations for composites consisting of categorical items. The approach is readily employed in empirical research and as a by-product permits…
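Although the note above uses latent variable modeling, the classical quantities themselves (item difficulty and corrected item-total correlation) are simple to compute. A sketch for dichotomously scored items, with invented response data:

```python
import numpy as np

def item_analysis(responses):
    """Classical item statistics for 0/1 scored items: difficulty
    (proportion correct) and corrected item-total correlation
    (each item vs. the total score excluding that item)."""
    responses = np.asarray(responses, dtype=float)
    difficulty = responses.mean(axis=0)
    total = responses.sum(axis=1)
    itc = np.array([np.corrcoef(responses[:, j],
                                total - responses[:, j])[0, 1]
                    for j in range(responses.shape[1])])
    return difficulty, itc

# Illustrative data: 5 examinees (rows) by 3 items (columns)
R = [[1, 1, 0],
     [1, 0, 0],
     [1, 1, 1],
     [0, 0, 0],
     [1, 1, 1]]
diff, itc = item_analysis(R)
print(diff)  # [0.8 0.6 0.4]
```

The latent variable approach in the note adds what this sketch lacks: interval estimates and proper treatment of the categorical item metric.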
[Weighted gene co-expression network analysis in biomedicine research].
Liu, Wei; Li, Li; Ye, Hua; Tu, Wei
2017-11-25
High-throughput biological technologies are now widely applied in biology and medicine, allowing scientists to monitor thousands of parameters simultaneously in a specific sample. However, it is still an enormous challenge to mine useful information from high-throughput data. The emergence of network biology provides deeper insights into complex biosystems and reveals the modularity in tissue/cellular networks. Correlation networks are increasingly used in bioinformatics applications. The weighted gene co-expression network analysis (WGCNA) tool can detect clusters of highly correlated genes. Therefore, we systematically reviewed the application of WGCNA in the study of disease diagnosis, pathogenesis and other related fields. First, we introduce the principle, workflow, advantages and disadvantages of WGCNA. Second, we present the application of WGCNA in disease, physiology, drug, evolution and genome annotation studies. Then, we describe the application of WGCNA in newly developed high-throughput methods. We hope this review will help to promote the application of WGCNA in biomedicine research.
Wavelet transform: fundamentals, applications, and implementation using acousto-optic correlators
NASA Astrophysics Data System (ADS)
DeCusatis, Casimer M.; Koay, J.; Litynski, Daniel M.; Das, Pankaj K.
1995-10-01
In recent years there has been a great deal of interest in the use of wavelets to supplement or replace conventional Fourier transform signal processing. This paper provides a review of wavelet transforms for signal processing applications, and discusses several emerging applications which benefit from the advantages of wavelets. The wavelet transform can be implemented as an acousto-optic correlator; perfect reconstruction of digital signals may also be achieved using acousto-optic finite impulse response filter banks. Acousto-optic image correlators are discussed as a potential implementation of the wavelet transform, since a 1D wavelet filter bank may be encoded as a 2D image. We discuss applications of the wavelet transform including nondestructive testing of materials, biomedical applications in the analysis of EEG signals, and interference excision in spread spectrum communication systems. Computer simulations and experimental results for these applications are also provided.
Generalization of Clustering Coefficients to Signed Correlation Networks
Costantini, Giulio; Perugini, Marco
2014-01-01
The recent interest in network analysis applications in personality psychology and psychopathology has put forward new methodological challenges. Personality and psychopathology networks are typically based on correlation matrices and therefore include both positive and negative edge signs. However, some applications of network analysis disregard negative edges, such as computing clustering coefficients. In this contribution, we illustrate the importance of the distinction between positive and negative edges in networks based on correlation matrices. The clustering coefficient is generalized to signed correlation networks: three new indices are introduced that take edge signs into account, each derived from an existing and widely used formula. The performances of the new indices are illustrated and compared with the performances of the unsigned indices, both on a signed simulated network and on a signed network based on actual personality psychology data. The results show that the new indices are more resistant to sample variations in correlation networks and therefore have higher convergence compared with the unsigned indices both in simulated networks and with real data. PMID:24586367
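To illustrate the idea of keeping edge signs, a signed weighted clustering coefficient can retain signs in the triangle term while normalizing by absolute weights, so triangles whose sign product is negative pull the coefficient down. This is a minimal sketch of one plausible signed generalization, not the exact indices introduced in the paper:

```python
import numpy as np

def signed_clustering(W):
    """Clustering coefficients for a signed, weighted adjacency matrix W.

    A sketch of one possible signed generalization (not the paper's exact
    indices): the numerator keeps edge signs, so triangles with a negative
    sign product reduce the coefficient, while the denominator normalizes
    by absolute weights over all neighbor pairs.
    """
    W = np.asarray(W, dtype=float)
    np.fill_diagonal(W, 0.0)                          # ignore self-loops
    A = np.abs(W)
    num = np.diag(W @ W @ W)                          # signed triangle weight per node
    den = A.sum(axis=1) ** 2 - (A ** 2).sum(axis=1)   # ordered neighbor pairs
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(den > 0, num / den, 0.0)
```

On an all-positive triangle every node gets coefficient 1; flipping one edge to -1 flips each triangle's sign product and drives the coefficients negative, which an unsigned index would miss.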
Semi-quantitative spectrographic analysis and rank correlation in geochemistry
Flanagan, F.J.
1957-01-01
The rank correlation coefficient, rs, which involves less computation than the product-moment correlation coefficient, r, can be used to indicate the degree of relationship between two elements. The method is applicable in situations where the assumptions underlying normal distribution correlation theory may not be satisfied. Semi-quantitative spectrographic analyses which are reported as grouped or partly ranked data can be used to calculate rank correlations between elements.
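The rank correlation rs is simply the product-moment correlation computed on (mid)ranks, which is why it tolerates the grouped or partly ranked data described above. A minimal sketch:

```python
import numpy as np

def midranks(x):
    """Ranks with ties assigned their group average (midranks), as needed
    for grouped semi-quantitative data."""
    x = np.asarray(x, dtype=float)
    order = np.argsort(x, kind="stable")
    ranks = np.empty(len(x))
    ranks[order] = np.arange(1, len(x) + 1)
    for v in np.unique(x):                 # average ranks within tied groups
        mask = x == v
        ranks[mask] = ranks[mask].mean()
    return ranks

def spearman_rs(x, y):
    """Rank correlation r_s: the product-moment correlation of the ranks."""
    rx, ry = midranks(x), midranks(y)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))
```

Any strictly monotone relationship, linear or not, yields rs = 1 (or -1), which is what makes the statistic robust to the distributional assumptions behind r.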
Detrended fluctuation analysis of short datasets: An application to fetal cardiac data
NASA Astrophysics Data System (ADS)
Govindan, R. B.; Wilson, J. D.; Preißl, H.; Eswaran, H.; Campbell, J. Q.; Lowery, C. L.
2007-02-01
Using detrended fluctuation analysis (DFA) we perform scaling analysis of short datasets of length 500-1500 data points. We quantify the long range correlation (exponent α) by computing the mean value of the local exponents αL (in the asymptotic regime). The local exponents are obtained as the (numerical) derivative of the logarithm of the fluctuation function F(s) with respect to the logarithm of the scale factor s: αL = d log10 F(s) / d log10 s. These local exponents display huge variations and complicate the correct quantification of the underlying correlations. We propose the use of the phase randomized surrogate (PRS), which preserves the long range correlations of the original data, to minimize the variations in the local exponents. Using the numerically generated uncorrelated and long range correlated data, we show that performing DFA on several realizations of PRS and estimating αL from the averaged fluctuation functions (of all realizations) can minimize the variations in αL. The application of this approach to the fetal cardiac data (RR intervals) is discussed and we show that there is a statistically significant correlation between α and the gestation age.
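The procedure above can be sketched compactly: DFA-1 computes the fluctuation function F(s) from linearly detrended windows of the integrated series, and the local exponents follow by numerical differentiation of log10 F(s) against log10 s. The window sizes below are illustrative choices, not the paper's:

```python
import numpy as np

def dfa_fluctuation(x, scales):
    """DFA-1 fluctuation function F(s) of a 1-D series x."""
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))   # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segments = y[:n * s].reshape(n, s)
        t = np.arange(s)
        rms = []
        for seg in segments:
            a, b = np.polyfit(t, seg, 1)                     # linear detrend per window
            rms.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

def local_exponents(F, scales):
    """alpha_L = d log10 F(s) / d log10 s by numerical differentiation."""
    return np.gradient(np.log10(F), np.log10(scales))
```

For uncorrelated data the local exponents should scatter around 0.5; averaging them (or, as the paper proposes, averaging F(s) over surrogate realizations first) tames their variability on short records.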
Mosaic organization of DNA nucleotides
NASA Technical Reports Server (NTRS)
Peng, C. K.; Buldyrev, S. V.; Havlin, S.; Simons, M.; Stanley, H. E.; Goldberger, A. L.
1994-01-01
Long-range power-law correlations have been reported recently for DNA sequences containing noncoding regions. We address the question of whether such correlations may be a trivial consequence of the known mosaic structure ("patchiness") of DNA. We analyze two classes of controls consisting of patchy nucleotide sequences generated by different algorithms--one without and one with long-range power-law correlations. Although both types of sequences are highly heterogenous, they are quantitatively distinguishable by an alternative fluctuation analysis method that differentiates local patchiness from long-range correlations. Application of this analysis to selected DNA sequences demonstrates that patchiness is not sufficient to account for long-range correlation properties.
ERIC Educational Resources Information Center
Duan, Lian
2012-01-01
Finding the most interesting correlations among items is essential for problems in many commercial, medical, and scientific domains. For example, what kinds of items should be recommended with regard to what has been purchased by a customer? How to arrange the store shelf in order to increase sales? How to partition the whole social network into…
Recent advancement in the field of two-dimensional correlation spectroscopy
NASA Astrophysics Data System (ADS)
Noda, Isao
2008-07-01
The recent advancement in the field of 2D correlation spectroscopy is reviewed with the emphasis on a number of papers published during the last two years. Topics covered by this comprehensive review include books, review articles, and noteworthy developments in the theory and applications of 2D correlation spectroscopy. New 2D correlation techniques are discussed, such as kernel analysis and augmented 2D correlation, model-based correlation, moving window analysis, global phase angle, covariance and correlation coefficient mapping, sample-sample correlation, hybrid and hetero correlation, pretreatment and transformation of data, and 2D correlation combined with other chemometrics techniques. Perturbation methods of both static (e.g., temperature, composition, pressure and stress, spatial distribution and orientation) and dynamic types (e.g., rheo-optical and acoustic, chemical reactions and kinetics, H/D exchange, sorption and diffusion) currently in use are examined. Analytical techniques most commonly employed in 2D correlation spectroscopy are IR, Raman, and NIR, but the growing use of other probes is also noted, including fluorescence, emission, Raman optical activity and vibrational circular dichroism, X-ray absorption and scattering, NMR, mass spectrometry, and even chromatography. The field of applications for 2D correlation spectroscopy is very diverse, encompassing synthetic polymers, liquid crystals, Langmuir-Blodgett films, proteins and peptides, natural polymers and biomaterials, pharmaceuticals, food and agricultural products, water, solutions, inorganic, organic, hybrid or composite materials, and many more.
NASA Astrophysics Data System (ADS)
Hoshor, Cory; Young, Stephan; Rogers, Brent; Currie, James; Oakes, Thomas; Scott, Paul; Miller, William; Caruso, Anthony
2014-03-01
A novel application of the Pearson cross-correlation to neutron spectral discernment in a moderating-type neutron spectrometer is introduced. This cross-correlation analysis is applied to spectral response data collected through both MCNP simulation and empirical measurement by the volumetrically sensitive spectrometer for comparison in 1, 2, and 3 spatial dimensions. The spectroscopic analysis methods discussed are demonstrated to discern various common spectral and monoenergetic neutron sources.
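A toy sketch of the underlying matching step: a measured detector response vector is compared against a library of candidate spectral responses by Pearson correlation, and the best-scoring template is reported. The template names and response shapes here are hypothetical, not data from the paper:

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation between two equal-length vectors."""
    a = np.asarray(a, dtype=float) - np.mean(a)
    b = np.asarray(b, dtype=float) - np.mean(b)
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def best_match(response, templates):
    """Return the name of the template with the highest Pearson correlation
    to the measured response, plus all scores (illustrative sketch only)."""
    scores = {name: pearson(response, t) for name, t in templates.items()}
    return max(scores, key=scores.get), scores
```

In practice the templates would be the simulated (MCNP) or measured response distributions per source, extended to 2-D and 3-D detector readouts by flattening the spatial grid into a vector before correlating.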
Quintana, Daniel S.
2015-01-01
Meta-analysis synthesizes a body of research investigating a common research question. Outcomes from meta-analyses provide a more objective and transparent summary of a research area than traditional narrative reviews. Moreover, they are often used to support research grant applications, guide clinical practice, and direct health policy. The aim of this article is to provide a practical and non-technical guide for psychological scientists that outlines the steps involved in planning and performing a meta-analysis of correlational datasets. I provide a supplementary R script to demonstrate each analytical step described in the paper, which is readily adaptable for researchers to use for their analyses. While the worked example is the analysis of a correlational dataset, the general meta-analytic process described in this paper is applicable for all types of effect sizes. I also emphasize the importance of meta-analysis protocols and pre-registration to improve transparency and help avoid unintended duplication. An improved understanding of this tool will not only help scientists to conduct their own meta-analyses but also improve their evaluation of published meta-analyses. PMID:26500598
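The core pooling step of a correlational meta-analysis can be written in a few lines. The paper's own workflow is an R script; this is a minimal cross-language sketch assuming a simple fixed-effect model with Fisher's z transformation:

```python
import math

def fixed_effect_meta_r(rs, ns):
    """Fixed-effect meta-analysis of Pearson correlations via Fisher's z.

    Each r is transformed to z = atanh(r) with variance 1/(n - 3), the z
    values are combined by inverse-variance weighting, and the pooled
    estimate and its 95% CI are back-transformed to the r scale. A sketch
    of the standard procedure, not the paper's full workflow (which also
    covers heterogeneity, random effects, and publication bias).
    """
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]                                  # inverse variances
    z_pooled = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    se = math.sqrt(1.0 / sum(ws))
    ci = (math.tanh(z_pooled - 1.96 * se), math.tanh(z_pooled + 1.96 * se))
    return math.tanh(z_pooled), ci
```

The z transformation is used because the sampling distribution of r is skewed near |r| = 1, whereas z is approximately normal with a variance that depends only on n.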
Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, the mathematical tool must be tuned for high sensitivity and reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.
Speckle-correlation analysis of the microcapillary blood circulation in nail bed
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilenskii, M A; Agafonov, D N; Zimnyakov, D A
2011-04-30
We present the results of the experimental studies of the possibility of monitoring the blood microcirculation in human finger nail bed with application of speckle-correlation analysis, based on estimating the contrast of time-averaged dynamic speckles. The hemodynamics at normal blood circulation and under conditions of partially suppressed blood circulation is analysed. A microscopic analysis is performed to visualise the structural changes in capillaries that are caused by suppressing blood circulation. The problems and prospects of speckle-correlation monitoring of the nail bed microhemodynamics under laboratory and clinical conditions are discussed. (optical technologies in biophysics and medicine)
Application of optical correlation techniques to particle imaging velocimetry
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Edwards, Robert V.
1988-01-01
Pulsed laser sheet velocimetry yields nonintrusive measurements of velocity vectors across an extended 2-dimensional region of the flow field. The application of optical correlation techniques to the analysis of multiple-exposure laser light sheet photographs can reduce and/or simplify the data reduction time and hardware. Here, Matched Spatial Filters (MSFs) are used in a pattern recognition system. MSFs are typically used to identify assembly-line parts; in this application, they are used to identify the iso-velocity vector contours in the flow. The patterns to be recognized are the recorded particle images in a pulsed laser light sheet photograph. Measurement of the direction of the particle image displacements between exposures yields the velocity vector. The particle image exposure sequence is designed such that the velocity vector direction is determined unambiguously. A global analysis technique is used, in contrast to the more common particle tracking algorithms and Young's fringe analysis technique.
Rakhmatullina, Ekaterina; Bossen, Anke; Höschele, Christoph; Wang, Xiaojie; Beyeler, Barbara; Meier, Christoph; Lussi, Adrian
2011-01-01
We present assembly and application of an optical reflectometer for the analysis of dental erosion. The erosive procedure involved acid-induced softening and initial substance loss phases, which are considered to be difficult for visual diagnosis in a clinic. Change of the specular reflection signal showed the highest sensitivity for the detection of the early softening phase of erosion among tested methods. The exponential decrease of the specular reflection intensity with erosive duration was compared to the increase of enamel roughness. Surface roughness was measured by optical analysis, and the observed tendency was correlated with scanning electron microscopy images of eroded enamel. A high correlation between specular reflection intensity and measurement of enamel softening (r2 ≥ −0.86) as well as calcium release (r2 ≥ −0.86) was found during erosion progression. Measurement of diffuse reflection revealed higher tooth-to-tooth deviation in contrast to the analysis of specular reflection intensity and lower correlation with other applied methods (r2 = 0.42–0.48). The proposed optical method allows simple and fast surface analysis and could be used for further optimization and construction of the first noncontact and cost-effective diagnostic tool for early erosion assessment in vivo. PMID:22029364
Analyses of S-Box in Image Encryption Applications Based on Fuzzy Decision Making Criterion
NASA Astrophysics Data System (ADS)
Rehman, Inayatur; Shah, Tariq; Hussain, Iqtadar
2014-06-01
In this manuscript, we put forward a standard based on a fuzzy decision-making criterion to examine current substitution boxes and study their strengths and weaknesses in order to decide their appropriateness for image encryption applications. The proposed standard utilizes the results of correlation analysis, entropy analysis, contrast analysis, homogeneity analysis, energy analysis, and mean of absolute deviation analysis. These analyses are applied to well-known substitution boxes. The outcomes of these analyses are then examined further, and a fuzzy soft set decision-making criterion is used to decide the suitability of an S-box for image encryption applications.
A hybrid correlation analysis with application to imaging genetics
NASA Astrophysics Data System (ADS)
Hu, Wenxing; Fang, Jian; Calhoun, Vince D.; Wang, Yu-Ping
2018-03-01
Investigating the association between brain regions and genes continues to be a challenging topic in imaging genetics. Current brain region of interest (ROI)-gene association studies normally reduce data dimension by averaging the value of voxels in each ROI. This averaging may lead to a loss of information due to the existence of functional sub-regions. Pearson correlation is widely used for association analysis. However, it only detects linear correlation whereas nonlinear correlation may exist among ROIs. In this work, we introduced distance correlation to ROI-gene association analysis, which can detect both linear and nonlinear correlations and overcome the limitation of averaging operations by taking advantage of the information at each voxel. Nevertheless, distance correlation usually has a much lower value than Pearson correlation. To address this problem, we proposed a hybrid correlation analysis approach, by applying canonical correlation analysis (CCA) to the distance covariance matrix instead of directly computing distance correlation. Incorporating CCA into distance correlation approach may be more suitable for complex disease study because it can detect highly associated pairs of ROI and gene groups, and may improve the distance correlation level and statistical power. In addition, we developed a novel nonlinear CCA, called distance kernel CCA, which seeks the optimal combination of features with the most significant dependence. This approach was applied to imaging genetic data from the Philadelphia Neurodevelopmental Cohort (PNC). Experiments showed that our hybrid approach produced more consistent results than conventional CCA across resampling and both the correlation and statistical significance were increased compared to distance correlation analysis. Further gene enrichment analysis and region of interest (ROI) analysis confirmed the associations of the identified genes with brain ROIs. 
Therefore, our approach provides a powerful tool for finding the correlation between brain imaging and genomic data.
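A direct O(n^2) sketch of the sample distance correlation discussed above, which detects nonlinear as well as linear dependence via double-centered pairwise-distance matrices:

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation (Szekely et al.) between two variables.

    Unlike Pearson correlation, dCor is zero (in the population) only
    under independence, so it also picks up nonlinear dependence. Direct
    O(n^2) implementation; inputs may be 1-D or multivariate per sample.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)

    def centered(z):
        # pairwise Euclidean distances, then double-centering
        d = np.sqrt(((z[:, None, :] - z[None, :, :]) ** 2).sum(-1))
        return d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()

    A, B = centered(x), centered(y)
    dcov2 = (A * B).mean()
    dvar2 = np.sqrt((A * A).mean() * (B * B).mean())
    return float(np.sqrt(max(dcov2, 0.0) / dvar2)) if dvar2 > 0 else 0.0
```

For a symmetric quadratic relationship y = x^2, Pearson correlation is essentially zero while the distance correlation is clearly positive, which is the motivation the abstract gives for preferring it in ROI-gene association analysis.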
Forensic analysis of social networking application on iOS devices
NASA Astrophysics Data System (ADS)
Zhang, Shuhui; Wang, Lianhai
2013-12-01
The increased use of social networking applications on the iPhone and iPad makes these devices a goldmine for forensic investigators. The QQ, WeChat, Sina Weibo, and Skype applications are very popular in China but have received little attention from researchers. These social networking applications are used not only on computers, but also on mobile phones and tablets. This paper focuses on conducting forensic analysis of these four social networking applications on iPhone and iPad devices. The tests consisted of installing the social networking applications on each device, conducting common user activities through each application, and performing correlation analysis with other activities. Advice for forensic investigators is also given. It can help investigators describe criminal behavior and reconstruct the crime scene.
Spectral and correlation analysis with applications to middle-atmosphere radars
NASA Technical Reports Server (NTRS)
Rastogi, Prabhat K.
1989-01-01
The correlation and spectral analysis methods for uniformly sampled stationary random signals, estimation of their spectral moments, and problems arising due to nonstationarity are reviewed. Some of these methods are already in routine use in atmospheric radar experiments. Other methods based on the maximum entropy principle and time series models have been used in analyzing data, but are just beginning to receive attention in the analysis of radar signals. These methods are also briefly discussed.
NASA Technical Reports Server (NTRS)
Kenigsberg, I. J.; Dean, M. W.; Malatino, R.
1974-01-01
The correlation achieved with each program provides the material for a discussion of modeling techniques developed for general application to finite-element dynamic analyses of helicopter airframes. Included are the selection of static and dynamic degrees of freedom, cockpit structural modeling, and the extent of flexible-frame modeling in the transmission support region and in the vicinity of large cut-outs. The sensitivity of predicted results to these modeling assumptions is discussed. Both the Sikorsky Finite-Element Airframe Vibration Analysis Program (FRAN/Vibration Analysis) and the NASA Structural Analysis Program (NASTRAN) have been correlated with data taken in full-scale vibration tests of a modified CH-53A helicopter.
Applications of statistics to medical science, III. Correlation and regression.
Watanabe, Hiroshi
2012-01-01
In this third part of a series surveying medical statistics, the concepts of correlation and regression are reviewed. In particular, methods of linear regression and logistic regression are discussed. Arguments related to survival analysis will be made in a subsequent paper.
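The link between correlation and regression that such reviews draw can be stated in one identity: the least-squares slope is b = r * sd(y)/sd(x), and r squared is the fraction of variance explained. A minimal sketch:

```python
import numpy as np

def linreg(x, y):
    """Ordinary least-squares line y = a + b*x, returned with Pearson r.

    Illustrates the correlation/regression identity b = r * sd(y)/sd(x);
    r**2 is the proportion of variance in y explained by the fit.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    r = np.corrcoef(x, y)[0, 1]
    b = r * y.std() / x.std()          # slope from the identity
    a = y.mean() - b * x.mean()        # line passes through the means
    return a, b, r
```

Logistic regression, the other method the review covers, replaces the linear mean with a logit link and is fit by maximum likelihood rather than least squares.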
NASA Astrophysics Data System (ADS)
Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing
2015-10-01
Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations. However, several important issues about these methods are not well understood and most methods consider only one moment order. We study the joint multifractal analysis based on partition function with two moment orders, which was initially invented to investigate fluid fields, and derive analytically several important properties. We apply the method numerically to binomial measures with multifractal cross correlations and bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
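The two-moment-order partition function referred to above can be sketched directly: for each box size s, sum mu1(box)^q1 * mu2(box)^q2 over non-overlapping boxes, and read the joint scaling exponents off log Z versus log s. A minimal illustration on a binomial (p-model) measure, the same test case the paper uses analytically:

```python
import numpy as np

def binomial_measure(p, levels):
    """Binomial (p-model) multifractal measure of length 2**levels."""
    mu = np.array([1.0])
    for _ in range(levels):
        mu = np.concatenate([mu * p, mu * (1.0 - p)])
    return mu

def joint_partition(mu1, mu2, q1, q2, box_sizes):
    """Joint partition function Z(q1, q2; s): for each box size s, sum
    mu1(box)**q1 * mu2(box)**q2 over non-overlapping boxes. A sketch of
    the two-moment-order formalism; box sizes are illustrative."""
    Z = []
    for s in box_sizes:
        n = len(mu1) // s
        m1 = mu1[:n * s].reshape(n, s).sum(axis=1)   # box masses of measure 1
        m2 = mu2[:n * s].reshape(n, s).sum(axis=1)   # box masses of measure 2
        Z.append(np.sum(m1 ** q1 * m2 ** q2))
    return np.array(Z)
```

Setting q2 = 0 recovers the ordinary single-measure partition function, so Z(1, 0; s) = 1 at every scale for a normalized measure, a quick sanity check on any implementation.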
NASA Astrophysics Data System (ADS)
Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.
2018-03-01
We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
Development problem analysis of correlation leak detector’s software
NASA Astrophysics Data System (ADS)
Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.
2018-05-01
In this article, the practical application and structure of correlation leak detector software are studied and the task of its design is analyzed. In the first part of the paper, the expediency of developing correlation leak detectors to improve the operating efficiency of public utilities is shown. The functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. In the second part of the paper, several development steps of the software package (requirements gathering, program structure definition, and software concept creation) are examined in the context of experience with a hardware-software prototype of a correlation leak detector.
NASA Astrophysics Data System (ADS)
Ceffa, Nicolo G.; Cesana, Ilaria; Collini, Maddalena; D'Alfonso, Laura; Carra, Silvia; Cotelli, Franco; Sironi, Laura; Chirico, Giuseppe
2017-10-01
Ramification of blood circulation is relevant in a number of physiological and pathological conditions. The oxygen exchange occurs largely in the capillary bed, and the cancer progression is closely linked to the angiogenesis around the tumor mass. Optical microscopy has made impressive improvements in in vivo imaging and dynamic studies based on correlation analysis of time stacks of images. Here, we develop and test advanced methods that allow mapping the flow fields in branched vessel networks at the resolution of 10 to 20 μm. The methods, based on the application of spatiotemporal image correlation spectroscopy and its extension to cross-correlation analysis, are applied here to the case of early stage embryos of zebrafish.
NASA Astrophysics Data System (ADS)
Noda, Isao
2014-07-01
Noteworthy experimental practices, which are advancing the frontiers of the field of two-dimensional (2D) correlation spectroscopy, are reviewed with a focus on the various perturbation methods currently practiced to induce spectral changes, pertinent examples of applications in various fields, and the types of analytical probes employed. The types of perturbation methods found in the published literature are very diverse, encompassing both dynamic and static effects. Although a sizable portion of publications report the use of dynamic perturbations, a much greater number of studies employ static effects, especially that of temperature. The fields of application covered by the literature are also very broad, ranging from fundamental research to practical applications in a number of physical, chemical and biological systems, such as synthetic polymers, composites and biomolecules. Aside from IR spectroscopy, which is the most commonly used tool, many other analytical probes are used in 2D correlation analysis. The ever-expanding trend in depth, breadth and versatility of 2D correlation spectroscopy techniques and their broad applications all point to the robust and healthy state of the field.
Vocal Parameters and Self-Perception in Individuals With Adductor Spasmodic Dysphonia.
Rojas, Gleidy Vannesa E; Ricz, Hilton; Tumas, Vitor; Rodrigues, Guilherme R; Toscano, Patrícia; Aguiar-Ricz, Lílian
2017-05-01
The study aimed to compare and correlate perceptual-auditory analysis of vocal parameters and self-perception in individuals with adductor spasmodic dysphonia before and after the application of botulinum toxin. This is a prospective cohort study. Sixteen individuals with a diagnosis of adductor spasmodic dysphonia were submitted to the application of botulinum toxin in the thyroarytenoid muscle, to the recording of a voice signal, and to the Voice Handicap Index (VHI) questionnaire before the application and at two time points after application. Two judges performed a perceptual-auditory analysis of eight vocal parameters with the aid of the Praat software for the visualization of narrow-band spectrography, pitch, and intensity contour. Comparison of the vocal parameters before toxin application and on the first return revealed a reduction of oscillation intensity (P = 0.002), voice breaks (P = 0.002), and vocal tremor (P = 0.002). The same parameters increased on the second return. The degree of severity, strained-strangled voice, roughness, breathiness, and asthenia was unchanged. The total score and the emotional domain score of the VHI were reduced on the first return. There was a moderate correlation between the degree of voice severity and the total VHI score before application and on the second return, and a weak correlation on the first return. Perceptual-auditory analysis and self-perception proved to be efficient in the recognition of vocal changes and of the vocal impact on individuals with adductor spasmodic dysphonia under treatment with botulinum toxin, permitting the quantitation of changes over time. Copyright © 2017. Published by Elsevier Inc.
Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.
Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben
2017-06-06
Modeling complex time-course patterns is a challenging issue in microarray studies owing to the complexity of gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing, and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring complex relationships in omics data when studying their association with disease and health.
NASA Astrophysics Data System (ADS)
Vorobev, A. V.; Vorobeva, G. R.
2018-03-01
The results of the analysis of geomagnetic data synchronously recorded by the INTERMAGNET magnetic stations are presented. The goal of this research was to identify internal correlations within the data and to determine the optimal spatial interval of geographical coordinates within which the operation of a single magnetic observatory would be sufficient in most cases. The results of the observation of correlations between the geomagnetic data on a global scale are summarized and presented. Possible regions of application of these results are identified.
Wang, Jun-Sheng; Olszewski, Emily; Devine, Erin E; Hoffman, Matthew R; Zhang, Yu; Shao, Jun; Jiang, Jack J
2016-08-01
To evaluate the spatiotemporal correlation of vocal fold vibration using eigenmode analysis before and after polyp removal, and to explore the potential clinical relevance of spatiotemporal analysis of correlation length and entropy as quantitative voice parameters. We hypothesized that increased order in the vibrating signal after surgical intervention would decrease the eigenmode-based entropy and increase the correlation length. Prospective case series. Forty subjects (23 males, 17 females) with unilateral (n = 24) or bilateral (n = 16) polyps underwent polyp removal. High-speed videoendoscopy was performed preoperatively and 2 weeks postoperatively. Spatiotemporal analysis was performed to determine entropy, a quantification of signal disorder, and correlation length, a measure of the size of spatially ordered structure of vocal fold vibration relative to full spatial consistency. The signal analyzed consists of the vibratory pattern in space and time derived from the high-speed video glottal area contour. Entropy decreased (Z = -3.871, P < .001) and correlation length increased (t = -8.913, P < .001) following polyp excision. The intraclass correlation coefficients (ICC) for correlation length and entropy were 0.84 and 0.93. Correlation length and entropy are sensitive to mass lesions. These parameters could potentially be used to augment subjective visualization after polyp excision when evaluating procedural efficacy. © The Author(s) 2016.
Data analytics using canonical correlation analysis and Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Rickman, Jeffrey M.; Wang, Yan; Rollett, Anthony D.; Harmer, Martin P.; Compson, Charles
2017-07-01
A canonical correlation analysis is a generic parametric model used in the statistical analysis of data involving interrelated or interdependent input and output variables. It is especially useful in data analytics as a dimensional reduction strategy that simplifies a complex, multidimensional parameter space by identifying a relatively small number of combinations of variables that are maximally correlated. One shortcoming of the canonical correlation analysis, however, is that it provides only a linear combination of variables that maximizes these correlations. With this in mind, we describe here a versatile, Monte Carlo-based methodology that is useful in identifying non-linear functions of the variables that lead to strong input/output correlations. We demonstrate that our approach leads to a substantial enhancement of correlations, as illustrated by two experimental applications of substantial interest to the materials science community, namely: (1) determining the interdependence of processing and microstructural variables associated with doped polycrystalline aluminas, and (2) relating microstructural descriptors to the electrical and optoelectronic properties of thin-film solar cells based on CuInSe2 absorbers. Finally, we describe how this approach facilitates experimental planning and process control.
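The core idea — a Monte Carlo search for non-linear transformations of an input that strengthen an otherwise weak linear correlation — can be illustrated in the scalar case. The sketch below is a minimal stand-in, not the authors' full canonical-correlation machinery; the data, noise level, and the family of power transforms searched are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: the output depends quadratically on the input.
x = rng.uniform(-1, 1, 500)
y = x**2 + 0.05 * rng.normal(size=500)

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

# Plain linear correlation almost entirely misses the quadratic relationship.
r_linear = abs(pearson(x, y))

# Monte Carlo search over candidate power transforms of the input,
# keeping the one that maximizes the linear correlation with y.
best_r, best_p = r_linear, 1.0
for _ in range(2000):
    p = rng.uniform(0.5, 4.0)
    r = abs(pearson(np.abs(x) ** p, y))
    if r > best_r:
        best_r, best_p = r, p

print(f"linear |r| = {r_linear:.2f}, best transformed |r| = {best_r:.2f}")
```

In the full multidimensional setting the same random search would be run over transformations of the canonical variates rather than a single variable.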
Reducing Uncertainties in Hydrocarbon Prediction through Application of Elastic Domain
NASA Astrophysics Data System (ADS)
Shamsuddin, S. Z.; Hermana, M.; Ghosh, D. P.; Salim, A. M. A.
2017-10-01
The application of lithology and fluid indicators has helped geophysicists discriminate reservoirs from non-reservoirs in a field. This analysis was conducted to select the lithology and fluid indicators most suitable for the Malaysian basins, which could better eliminate amplitude pitfalls. This paper uses different rock physics analyses such as elastic impedance, Lambda-Mu-Rho (LMR), and the SQp-SQs attribute. A litho-elastic impedance log is generated by correlating the gamma ray log with the extended elastic impedance (EEI) log. The same approach is used for fluid-elastic impedance by correlating the EEI log with water saturation or resistivity. The work was done on well logging data collected from several fields in the Malay basin and a neighbouring basin. There is excellent separation between hydrocarbon sand and background shale for Well-1 in different cross-plot analyses, while Well-2 shows good separation in the LMR plot. The same method applied to Well-3 shows fair separation of silty sand and gas sand using the SQp-SQs attribute, which can be correlated with the well log. Based on the point-distribution histogram plot, different lithologies and fluids can be separated clearly. Simultaneous seismic inversion results in acoustic impedance, Vp/Vs, SQp, and SQs volumes. Many attributes are used in the industry to separate lithology and fluid; however, some of these methods are not suitable for application to the basins in Malaysia.
NASA Astrophysics Data System (ADS)
Ming, A. B.; Qin, Z. Y.; Zhang, W.; Chu, F. L.
2013-12-01
Bearing failure is one of the most common causes of machine breakdowns and accidents. The fault diagnosis of rolling element bearings is therefore of great significance to the safe and efficient operation of machines, owing to its fault indication and accident prevention capability in engineering applications. Based on orthogonal projection theory, a novel method is proposed in this paper to extract the fault characteristic frequency for the incipient fault diagnosis of rolling element bearings. With the capability of exposing the oscillation frequency of the signal energy, the proposed method is a generalized form of squared envelope analysis and is named spectral auto-correlation analysis (SACA). At the same time, SACA is a simplified form of cyclostationary analysis and can be carried out iteratively in applications. Simulations and experiments are used to evaluate the efficiency of the proposed method. Comparing SACA with traditional envelope analysis and squared envelope analysis, it is found that the SACA result is more legible owing to the more prominent harmonic amplitudes at the fault characteristic frequency, and that SACA with a proper number of iterations further enhances the fault features.
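The squared envelope analysis that SACA generalizes can be sketched on synthetic data. This is a hedged illustration, not the authors' SACA implementation; the fault and resonance frequencies are invented. The point it shows is that a fault frequency invisible in the raw spectrum appears as a line in the spectrum of the squared Hilbert envelope:

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000          # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
f_fault = 37.0       # hypothetical fault characteristic frequency, Hz
f_res = 2_000.0      # structural resonance excited by each impact, Hz

# Simulated bearing signal: resonance bursts amplitude-modulated at the
# fault frequency, buried in broadband noise.
x = (1 + np.cos(2 * np.pi * f_fault * t)) * np.sin(2 * np.pi * f_res * t)
x += 0.5 * np.random.default_rng(1).normal(size=t.size)

# Squared envelope spectrum: the modulation frequency shows up as a
# low-frequency line even though it is absent from the raw spectrum.
env2 = np.abs(hilbert(x)) ** 2
spec = np.abs(np.fft.rfft(env2 - env2.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak = freqs[np.argmax(spec[freqs < 200])]
print(f"dominant envelope-spectrum line: {peak:.1f} Hz")
```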
NASA Astrophysics Data System (ADS)
Tsenkova, Roumiana; Murayama, Koichi; Kawano, Sumio; Wu, Yuqing; Toyoda, Kiyohiko; Ozaki, Yukihiro
2000-03-01
We describe the application of the two-dimensional correlation spectroscopy (2DCOS) technique to mastitis diagnosis. Seven average spectra in the short-wavelength region (700-1100 nm), representing mastitis levels ranging from healthy to diseased, were subjected to 2DCOS analysis. The synchronous correlation map clearly showed water and fat bands. The asynchronous correlation map indicated the dynamic variations of milk constituents that occur when a cow develops mastitis.
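Synchronous and asynchronous 2D correlation maps of this kind are commonly computed with Noda's formalism. The sketch below is a generic implementation on toy Gaussian bands, not the authors' code; the band positions and perturbation profile are invented:

```python
import numpy as np

def noda_2d(spectra):
    """Synchronous and asynchronous 2D correlation maps (Noda's rules).

    spectra: (m, n) array -- m perturbation steps, n spectral channels.
    """
    m, _ = spectra.shape
    dyn = spectra - spectra.mean(axis=0)          # dynamic spectra
    sync = dyn.T @ dyn / (m - 1)
    # Hilbert-Noda transformation matrix: N_jk = 1 / (pi * (k - j)), 0 on diagonal
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        N = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
    asyn = dyn.T @ (N @ dyn) / (m - 1)
    return sync, asyn

# Toy example: two bands, the second lagging the first along the perturbation.
steps = np.linspace(0, 1, 9)
band1 = np.outer(steps, np.exp(-0.5 * ((np.arange(50) - 15) / 3) ** 2))
band2 = np.outer(steps**2, np.exp(-0.5 * ((np.arange(50) - 35) / 3) ** 2))
sync, asyn = noda_2d(band1 + band2)
print(sync.shape, asyn.shape)
```

By construction the synchronous map is symmetric and the asynchronous map antisymmetric, which is how cross-peaks are read for the ordering of spectral events.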
Kim, Dahan; Curthoys, Nikki M.; Parent, Matthew T.; Hess, Samuel T.
2015-01-01
Multi-colour localization microscopy has enabled sub-diffraction studies of colocalization between multiple biological species and quantification of their correlation at length scales previously inaccessible with conventional fluorescence microscopy. However, bleed-through, or misidentification of probe species, creates false colocalization and artificially increases certain types of correlation between two imaged species, affecting the reliability of information provided by colocalization and quantified correlation. Despite the potential risk of these artefacts of bleed-through, neither the effect of bleed-through on correlation nor methods of its correction in correlation analyses has been systematically studied at typical rates of bleed-through reported to affect multi-colour imaging. Here, we present a reliable method of bleed-through correction applicable to image rendering and correlation analysis of multi-colour localization microscopy. Application of our bleed-through correction shows our method accurately corrects the artificial increase in both types of correlations studied (Pearson coefficient and pair correlation), at all rates of bleed-through tested, in all types of correlations examined. In particular, anti-correlation could not be quantified without our bleed-through correction, even at rates of bleed-through as low as 2%. Demonstrated with dichroic-based multi-colour FPALM here, our presented method of bleed-through correction can be applied to all types of localization microscopy (PALM, STORM, dSTORM, GSDIM, etc.), including both simultaneous and sequential multi-colour modalities, provided the rate of bleed-through can be reliably determined. PMID:26185614
Atmospheric pollution measurement by optical cross correlation methods - A concept
NASA Technical Reports Server (NTRS)
Fisher, M. J.; Krause, F. R.
1971-01-01
Method combines standard spectroscopy with statistical cross correlation analysis of two narrow light beams for remote sensing to detect foreign matter of given particulate size and consistency. Method is applicable in studies of generation and motion of clouds, nuclear debris, ozone, and radiation belts.
Linear unmixing of multidate hyperspectral imagery for crop yield estimation
USDA-ARS?s Scientific Manuscript database
In this paper, we have evaluated an unsupervised unmixing approach, vertex component analysis (VCA), for the application of crop yield estimation. The results show that abundance maps of the vegetation extracted by the approach are strongly correlated to the yield data (the correlation coefficients ...
Cavitation in liquid cryogens. 2: Hydrofoil
NASA Technical Reports Server (NTRS)
Hord, J.
1973-01-01
Boundary layer principles, along with two-phase concepts, are used to improve existing correlative theory for developed cavity data. Details concerning cavity instrumentation, data analysis, correlative techniques, and experimental and theoretical aspects of a cavitating hydrofoil are given. Both desinent and thermodynamic data, using liquid hydrogen and liquid nitrogen, are reported. The thermodynamic data indicated that stable thermodynamic equilibrium exists throughout the vaporous cryogen cavities. The improved correlative formulas were used to evaluate these data. A new correlating parameter, based on consideration of the mass-limiting two-phase flow flux across the cavity interface, is proposed. This correlating parameter appears attractive for future correlative and predictive applications. Agreement between theory and experiment is discussed, and directions for future analysis are suggested. The front half of the cavities developed on the hydrofoil may be considered parabolically shaped.
Kumar, Keshav
2018-03-01
Excitation-emission matrix fluorescence (EEMF) and total synchronous fluorescence spectroscopy (TSFS) are two fluorescence techniques commonly used for the analysis of multifluorophoric mixtures. These two techniques are conceptually different and offer certain advantages over each other. Manual analysis of such highly correlated, large-volume EEMF and TSFS data sets towards developing a calibration model is difficult. Partial least squares (PLS) analysis can handle large EEMF and TSFS data sets by finding the important factors that maximize the correlation between the spectral and concentration information for each fluorophore. However, applying PLS analysis to the entire data set often does not provide a robust calibration model and requires a suitable pre-processing step. The present work evaluates the application of genetic algorithm (GA) analysis prior to PLS analysis on EEMF and TSFS data sets towards improving the precision and accuracy of the calibration model. The GA essentially combines the advantages of stochastic methods with those of deterministic approaches and can find the set of EEMF and TSFS variables that correlate well with the concentration of each fluorophore present in the multifluorophoric mixture. The utility of GA-assisted PLS analysis is successfully validated using (i) EEMF data sets acquired for dilute aqueous mixtures of four biomolecules and (ii) TSFS data sets acquired for dilute aqueous mixtures of four carcinogenic polycyclic aromatic hydrocarbons (PAHs). It is shown that using the GA significantly improves the accuracy and precision of the PLS calibration models developed for both the EEMF and TSFS data sets. Hence, the GA should be considered a useful pre-processing technique when developing EEMF and TSFS calibration models.
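A rough sketch of variable selection feeding a PLS1 calibration is shown below. It is heavily simplified: a mutation-only acceptance loop stands in for a full genetic algorithm, the data are synthetic, and the minimal NIPALS PLS1 is a generic textbook implementation rather than the authors' procedure:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal NIPALS PLS1: returns the regression vector plus centering terms."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)
        t = Xr @ w
        p = Xr.T @ t / (t @ t)
        q = yr @ t / (t @ t)
        Xr = Xr - np.outer(t, p)      # deflate X and y
        yr = yr - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)
    return B, x_mean, y_mean

def rmse(X, y, cols, X_val, y_val):
    B, xm, ym = pls1_fit(X[:, cols], y, n_comp=2)
    pred = (X_val[:, cols] - xm) @ B + ym
    return np.sqrt(np.mean((pred - y_val) ** 2))

rng = np.random.default_rng(4)
n, p = 120, 30
X = rng.normal(size=(n, p))
# Only the first five variables carry concentration information.
y = X[:, :5] @ np.array([1.0, 0.8, 0.6, 0.4, 0.2]) + 0.3 * rng.normal(size=n)
X_tr, X_val, y_tr, y_val = X[:80], X[80:], y[:80], y[80:]

# Mutation-only selection loop: a greatly simplified stand-in for a full GA.
mask = np.ones(p, dtype=bool)
best = rmse(X_tr, y_tr, mask, X_val, y_val)
for _ in range(400):
    trial = mask.copy()
    trial[rng.integers(p)] ^= True    # flip one variable in or out
    if trial.sum() < 3:
        continue
    e = rmse(X_tr, y_tr, trial, X_val, y_val)
    if e < best:
        mask, best = trial, e
print(f"selected {mask.sum()} variables, validation RMSE {best:.3f}")
```

A real GA would maintain a population with crossover and use cross-validated fitness, but the accept-if-better loop conveys the variable-selection idea.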
Modal Test/Analysis Correlation of Space Station Structures Using Nonlinear Sensitivity
NASA Technical Reports Server (NTRS)
Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan
1992-01-01
The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlation. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.
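A standard measure for this kind of test/analysis mode-shape correlation is the Modal Assurance Criterion (MAC). The abstract does not name the specific metric used, so the following is a generic sketch on random mode shapes, not the paper's procedure:

```python
import numpy as np

def mac(phi_test, phi_fem):
    """Modal Assurance Criterion matrix between two mode-shape sets.

    Columns are mode shapes; a MAC entry near 1 indicates well-correlated modes.
    """
    num = np.abs(phi_test.T @ phi_fem) ** 2
    den = np.outer(np.sum(phi_test**2, axis=0), np.sum(phi_fem**2, axis=0))
    return num / den

# Sanity check: comparing a mode-shape set with itself gives unit diagonal.
rng = np.random.default_rng(3)
phi = rng.normal(size=(20, 3))     # 20 measurement DOFs, 3 modes
M = mac(phi, phi)
print(np.round(M, 3))
```

In practice one pairs each test mode with the FEM mode giving the highest MAC value and uses the paired frequencies and shapes as the correlation targets for model updating.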
Messaraa, C; Metois, A; Walsh, M; Hurley, S; Doyle, L; Mansfield, A; O'Connor, C; Mavon, A
2018-01-24
Skin topographic measurements are of paramount importance in the field of dermo-cosmetic evaluation. The aim of this study was to investigate how the Antera 3D, a multi-purpose handheld camera, correlates with other topographic techniques and with changes in skin topography following the use of a cosmetic product. Skin topographic measurements were collected on 26 female volunteers aged 45-70 years with the Antera 3D, the DermaTOP, and image analysis of parallel-polarized pictures. Different Antera 3D analysis filters were investigated for repeatability, correlation with the other imaging techniques, and ability to detect improvements in skin topography following application of a serum. Most Antera 3D parameters were found to be strongly correlated with the DermaTOP parameters. No association was found between the Antera 3D parameters and measurements on parallel-polarized photographs. The measurement repeatability was comparable among the different analysis filters, with the exception of wrinkle maximum depth and roughness Rt. Following a single application of a tightening serum, both the Antera 3D wrinkle and texture parameters recorded significant improvements, with the best improvements observed with the large filter. The Antera 3D demonstrated its relevance for cosmetic product evaluation. We also provide recommendations for the analysis based on our findings. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Rosenkranz, Tabea; Müller, Kai W; Dreier, Michael; Beutel, Manfred E; Wölfling, Klaus
2017-01-01
This paper examines the addictive potential of 8 different Internet applications, distinguishing male and female users. Moreover, differential correlates of problematic use are investigated in Internet gamers (IG) and generalized Internet users (GIU). In a representative sample of 5,667 adolescents aged 12-19 years, use of Internet applications, problematic Internet use, psychopathologic symptoms (emotional problems, hyperactivity/inattention, and psychosomatic complaints), personality (conscientiousness and extraversion), psychosocial correlates (perceived stress and self-efficacy), and coping strategies were assessed. The addictive potential of Internet applications was examined in boys and girls using regression analysis. MANOVAs were conducted to examine differential correlates of problematic Internet use between IG and GIU. Chatting and social networking most strongly predicted problematic Internet use in girls, while gaming was the strongest predictor in boys. Problematic IG exhibited multiple psychosocial problems compared to non-problematic IG. In problematic Internet users, GIU reported even higher psychosocial burden and displayed dysfunctional coping strategies more frequently than gamers. The results extend previous findings on the addictive potential of Internet applications and validate the proposed distinction between specific and generalized problematic Internet use. In addition to Internet gaming disorder, future studies should also focus on other highly addictive Internet applications, that is, chatting or social networking, regarding differential correlates of problematic use. © 2017 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Pirozzi, K. L.; Long, C. J.; McAleer, C. W.; Smith, A. S. T.; Hickman, J. J.
2013-08-01
Rigorous analysis of muscle function in in vitro systems is needed for both acute and chronic biomedical applications. Forces generated by skeletal myotubes on bio-microelectromechanical cantilevers were calculated using a modified version of Stoney's thin-film equation and finite element analysis (FEA), and then analyzed for regression against physical parameters. The Stoney equation results closely matched the more computationally intensive FEA, and the force correlated with cross-sectional area (CSA). Normalizing force to measured CSA significantly improved the statistical sensitivity and now allows close comparison of in vitro data to in vivo measurements for applications in exercise physiology, robotics, and the modeling of neuromuscular diseases.
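For orientation, the classical (unmodified) Stoney relation — the starting point the authors modify, not their final expression — relates the stress in a thin film to the curvature it induces in the substrate:

```latex
\sigma_f \;=\; \frac{E_s\, t_s^{2}}{6\,(1-\nu_s)\, t_f}\cdot\frac{1}{R}
```

Here $E_s$, $\nu_s$, and $t_s$ are the substrate's Young's modulus, Poisson ratio, and thickness, $t_f$ is the film (here, cell-layer) thickness, and $R$ is the measured radius of curvature of the cantilever. The cell-generated force follows from the stress and the geometry of the deflected beam.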
NASA Astrophysics Data System (ADS)
Cristescu, Constantin P.; Stan, Cristina; Scarlat, Eugen I.; Minea, Teofil; Cristescu, Cristina M.
2012-04-01
We present a novel method for the parameter-oriented analysis of mutual correlation between independent time series, or between equivalent structures such as ordered data sets. The proposed method is based on the sliding window technique, defines a new type of correlation measure, and can be applied to time series from all domains of science and technology, experimental or simulated. A specific parameter that characterizes the time series is computed for each window, and a cross-correlation analysis is carried out on the sets of values obtained for the time series under investigation. We apply this method to the study of daily exchange rates of several currencies from the point of view of the Hurst exponent and the intermittency parameter. Interesting correlation relationships are revealed, and a tentative crisis prediction is presented.
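The sliding-window scheme — compute a characteristic parameter per window, then correlate the parameter sequences — can be sketched generically. In the sketch below the windowed standard deviation stands in for the Hurst exponent or intermittency parameter used by the authors, and the two series are synthetic:

```python
import numpy as np

def windowed_param(x, win, step, param=np.std):
    """Evaluate a characteristic parameter on sliding windows of x."""
    return np.array([param(x[i:i + win])
                     for i in range(0, len(x) - win + 1, step)])

rng = np.random.default_rng(7)
n = 4000
# Two series sharing a common, slowly varying volatility, plus independent
# noise: their raw values are uncorrelated, but the windowed parameter is not.
vol = 1.0 + 0.5 * np.sin(np.linspace(0, 6 * np.pi, n))
a = vol * rng.normal(size=n)
b = vol * rng.normal(size=n)

r_raw = np.corrcoef(a, b)[0, 1]
pa = windowed_param(a, win=200, step=100)
pb = windowed_param(b, win=200, step=100)
r = np.corrcoef(pa, pb)[0, 1]
print(f"raw correlation {r_raw:.2f}, window-parameter correlation {r:.2f}")
```

Substituting a Hurst-exponent estimator for `np.std` recovers the flavor of the authors' exchange-rate analysis.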
Web-based visual analysis for high-throughput genomics
2013-01-01
Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy.
Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618
Inferential Procedures for Correlation Coefficients Corrected for Attenuation.
ERIC Educational Resources Information Center
Hakstian, A. Ralph; And Others
1988-01-01
A model and computation procedure based on classical test score theory are presented for determination of a correlation coefficient corrected for attenuation due to unreliability. Delta and Monte Carlo method applications are discussed. A power analysis revealed no serious loss in efficiency resulting from correction for attenuation. (TJH)
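The classical correction for attenuation itself is a one-line formula from classical test theory (the inferential machinery is the paper's contribution, not this formula). A minimal sketch with made-up numbers:

```python
def disattenuate(r_xy, r_xx, r_yy):
    """Correlation corrected for attenuation due to measurement unreliability.

    r_xy : observed correlation between the two measures.
    r_xx, r_yy : reliability coefficients of each measure.
    Classical test theory: r_true = r_xy / sqrt(r_xx * r_yy).
    """
    return r_xy / (r_xx * r_yy) ** 0.5

# Hypothetical example: an observed r of .42 with reliabilities .80 and .70.
print(round(disattenuate(0.42, 0.80, 0.70), 3))
```

The corrected value estimates the correlation between the error-free true scores; the inferential procedures in the paper supply standard errors and tests for this quantity.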
Dalmolin, Graziele de Lima; Lunardi, Valéria Lerch; Lunardi, Guilherme Lerch; Barlem, Edison Luiz Devos; Silveira, Rosemary Silva da
2014-01-01
To identify relationships between moral distress and Burnout in professional performance from the perceptions of the experiences of nursing workers. This is a survey-type study of 375 nursing workers in three different hospitals in southern Rio Grande do Sul, with the application of adapted versions of the Moral Distress Scale and the Maslach Burnout Inventory, validated and standardized for use in Brazil. Data validation occurred through factor analysis and Cronbach's alpha. For the data analysis, bivariate analysis using Pearson's correlation and multivariate analysis using multiple regression were performed. The existence of a weak correlation between moral distress and Burnout was verified. A possible positive correlation between Burnout and therapeutic obstinacy, and a negative correlation between professional fulfillment and moral distress, were identified. The need was identified for further studies including mediating and moderating variables that may explain more clearly the models studied.
The use of copula functions for predictive analysis of correlations between extreme storm tides
NASA Astrophysics Data System (ADS)
Domino, Krzysztof; Błachowicz, Tomasz; Ciupak, Maurycy
2014-11-01
In this paper we present a method for the quantitative description of weakly predictable extreme hydrological events in an inland sea. Correlations between variations at individual measuring points were investigated using combined statistical methods. As the main tool for this analysis we used a two-dimensional copula function sensitive to correlated extreme effects. Additionally, a newly proposed methodology, based on Detrended Fluctuation Analysis (DFA) and Anomalous Diffusion (AD), was used for the prediction of negative and positive auto-correlations and the associated optimum choice of copula functions. As a practical example we analysed maximum storm tide data recorded at five spatially separated locations on the Baltic Sea. For the analysis we used the Gumbel, Clayton, and Frank copula functions and introduced the reversed Clayton copula. The application of our research model is associated with modelling the risk of high storm tides and possible storm flooding.
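A Clayton copula of the kind used in the paper can be evaluated and sampled in a few lines. The sketch below is generic textbook material (the parameter value and the conditional-inversion sampler are not taken from the paper; the reversed Clayton variant would be obtained by transforming u, v to 1-u, 1-v):

```python
import numpy as np

def clayton_cdf(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return np.maximum(u ** -theta + v ** -theta - 1.0, 0.0) ** (-1.0 / theta)

def clayton_sample(n, theta, rng):
    """Draw dependent uniform pairs via the conditional-inversion method."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = (u ** -theta * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

rng = np.random.default_rng(11)
theta = 2.0                       # Kendall's tau = theta / (theta + 2) = 0.5
u, v = clayton_sample(50_000, theta, rng)
r = np.corrcoef(u, v)[0, 1]
print(f"sample correlation of (u, v): {r:.2f}")
```

The Clayton family concentrates dependence in the lower tail, which is why tail-sensitive copulas of this type are attractive for correlated extreme events such as coincident storm tides.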
Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng
2018-04-20
Functional connectivity is among the most important tools for studying the brain. The correlation coefficient between time series of different brain areas is the most popular method for quantifying functional connectivity. In practical use, the correlation coefficient assumes the data to be temporally independent; however, brain time series data can exhibit significant temporal auto-correlation. A widely applicable method is proposed for correcting for temporal auto-correlation. We considered two types of time series models: (1) the auto-regressive-moving-average model, and (2) a nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are the most commonly used in neuroscience studies. We show that the respective asymptotic distributions share a unified expression. We have verified the validity of our method and shown that it exhibits sufficient statistical power for detecting true correlation in numerical experiments. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlation in numerical experiments, where existing methods measuring association (linear and nonlinear) fail. In this work, we proposed a widely applicable approach for correcting the effect of temporal auto-correlation on functional connectivity. Empirical results favor the use of our method in functional network analysis. Copyright © 2018. Published by Elsevier B.V.
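The variance inflation that the paper corrects for can be demonstrated with the classical AR(1) result (a textbook special case; the paper's asymptotic distributions are more general). For two independent AR(1) series with coefficient phi, the null variance of the sample correlation is roughly (1 + phi^2) / (1 - phi^2) times the naive 1/N:

```python
import numpy as np

def ar1(n, phi, rng):
    """Generate an AR(1) series x_t = phi * x_{t-1} + white noise."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.normal()
    return x

rng = np.random.default_rng(5)
n, phi, trials = 400, 0.8, 1000

# Null correlations between pairs of *independent* AR(1) series.
r = np.array([np.corrcoef(ar1(n, phi, rng), ar1(n, phi, rng))[0, 1]
              for _ in range(trials)])

var_iid = 1.0 / n                                  # naive variance assuming i.i.d. data
var_corrected = (1 + phi**2) / (1 - phi**2) / n    # classical AR(1) correction
print(f"empirical {np.var(r):.4f}, naive {var_iid:.4f}, "
      f"corrected {var_corrected:.4f}")
```

Using the naive 1/N variance here would grossly understate the spread of null correlations and inflate the type I error, which is exactly the failure mode the paper addresses.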
Continued investigation of potential application of Omega navigation to civil aviation
NASA Technical Reports Server (NTRS)
Baxa, E. G., Jr.
1978-01-01
Major attention is given to an analysis of receiver repeatability in measuring OMEGA phase data. Repeatability is defined as the ability of two like receivers that are co-located to achieve the same LOP phase readings. Specific data analysis is presented. A propagation model that has been used in the analysis of propagation anomalies is described. Composite OMEGA analysis is presented in terms of carrier phase correlation analysis and the determination of carrier phase weighting coefficients for minimizing composite phase variation. Differential OMEGA error analysis is presented for several receiver separations. Three-frequency analysis includes LOP error and position error based on three and four OMEGA transmissions. Results of phase-amplitude correlation studies are presented.
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.
1976-01-01
A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models, together with application of the models to the correlation of well-documented hot-fire engine data bases. These programs are the distributed energy release (DER) model for conventional liquid propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies, while the computer analyses provide quantitative data on predictive accuracy. The program comprises three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.
The Identification and Tracking of Uterine Contractions Using Template Based Cross-Correlation.
McDonald, Sarah C; Brooker, Graham; Phipps, Hala; Hyett, Jon
2017-09-01
The purpose of this paper is to outline a novel method of using template-based cross-correlation to identify and track uterine contractions during labour. A purpose-built six-channel electromyography (EMG) device was used to collect data from consenting women during labour and birth. A range of templates was constructed for the purpose of identifying and tracking uterine activity when cross-correlated with the EMG signal. Peak-finding techniques were applied to the cross-correlated result to simplify and automate the identification and tracking of contractions. The EMG data showed a unique pattern when a woman was contracting, with key features of the contraction signal remaining consistent and identifiable across subjects. Contraction profiles across subjects were automatically identified using template-based cross-correlation. Synthetic templates built from a rectangular function with a duration of between 5 and 10 s performed best at identifying and tracking uterine activity across subjects. The successful application of this technique provides an opportunity for both simple and accurate real-time analysis of contraction data, while enabling investigations into techniques such as machine learning, which could enable automated learning from contraction data as part of real-time monitoring and post-analysis.
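The template-matching-plus-peak-finding pipeline described above can be sketched on synthetic data. Everything here is assumed for illustration: the sampling rate, the smooth burst shape standing in for a contraction envelope, and the thresholds; only the rectangular template of 5-10 s duration comes from the abstract.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 10                         # Hz, assumed sampling rate for this synthetic example
t = np.arange(0, 600, 1 / fs)   # ten minutes of fake "EMG envelope"

# synthetic contraction bursts every 120 s plus background noise
rng = np.random.default_rng(1)
signal = 0.05 * rng.standard_normal(t.size)
for onset in (60, 180, 300, 420, 540):
    signal += np.exp(-((t - onset) / 20.0) ** 2)  # smooth ~40 s burst

# rectangular template of ~8 s (within the 5-10 s range the paper found best)
template = np.ones(int(8 * fs))

# cross-correlation, normalized by template length; "same" keeps the time axis aligned
xcorr = np.correlate(signal, template, mode="same") / template.size

# peaks in the cross-correlation mark contraction centres
peaks, _ = find_peaks(xcorr, height=0.5, distance=60 * fs)
print(t[peaks])
```

The `distance` argument enforces a minimum spacing between detected contractions, which is the kind of constraint that makes the automated tracking robust to noisy local maxima.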
Hierarchical multivariate covariance analysis of metabolic connectivity.
Carbonell, Felix; Charil, Arnaud; Zijdenbos, Alex P; Evans, Alan C; Bedell, Barry J
2014-12-01
Conventional brain connectivity analysis is typically based on the assessment of interregional correlations. Given that correlation coefficients are derived from both covariance and variance, group differences in covariance may be obscured by differences in the variance terms. To facilitate a comprehensive assessment of connectivity, we propose a unified statistical framework that interrogates the individual terms of the correlation coefficient. We have evaluated the utility of this method for metabolic connectivity analysis using [18F]2-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET) data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) study. As an illustrative example of the utility of this approach, we examined metabolic connectivity in angular gyrus and precuneus seed regions of mild cognitive impairment (MCI) subjects with low and high β-amyloid burdens. This new multivariate method allowed us to identify alterations in the metabolic connectome, which would not have been detected using classic seed-based correlation analysis. Ultimately, this novel approach should be extensible to brain network analysis and broadly applicable to other imaging modalities, such as functional magnetic resonance imaging (fMRI).
Spectral density mapping at multiple magnetic fields suitable for 13C NMR relaxation studies
NASA Astrophysics Data System (ADS)
Kadeřávek, Pavel; Zapletal, Vojtěch; Fiala, Radovan; Srb, Pavel; Padrta, Petr; Přecechtělová, Jana Pavlíková; Šoltésová, Mária; Kowalewski, Jozef; Widmalm, Göran; Chmelík, Josef; Sklenář, Vladimír; Žídek, Lukáš
2016-05-01
Standard spectral density mapping protocols, well suited for the analysis of 15N relaxation rates, introduce significant systematic errors when applied to 13C relaxation data, especially if the dynamics is dominated by motions with short correlation times (small molecules, dynamic residues of macromolecules). The possibility of improving the accuracy by employing cross-correlated relaxation rates and measurements taken at several magnetic fields has been examined. A suite of protocols for analyzing such data has been developed and their performance tested. Applicability of the proposed protocols is documented in two case studies, spectral density mapping of a uniformly labeled RNA hairpin and of a selectively labeled disaccharide exhibiting highly anisotropic tumbling. Combination of auto- and cross-correlated relaxation data acquired at three magnetic fields was applied in the former case in order to separate effects of fast motions and conformational or chemical exchange. An approach using auto-correlated relaxation rates acquired at five magnetic fields, applicable to anisotropically moving molecules, was used in the latter case. The results were compared with a more advanced analysis of data obtained by interpolation of auto-correlated relaxation rates measured at seven magnetic fields, and with the spectral density mapping of cross-correlated relaxation rates. The results showed that sufficiently accurate values of auto- and cross-correlated spectral density functions at zero and 13C frequencies can be obtained from data acquired at three magnetic fields for uniformly 13C-labeled molecules with a moderate anisotropy of the rotational diffusion tensor. Analysis of auto-correlated relaxation rates at five magnetic fields represents an alternative for molecules undergoing highly anisotropic motions.
The Supernovae Analysis Application (SNAP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bayless, Amanda J.; Fryer, Christopher Lee; Wollaeger, Ryan Thomas
The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.
The Supernovae Analysis Application (SNAP)
Bayless, Amanda J.; Fryer, Christopher Lee; Wollaeger, Ryan Thomas; ...
2017-09-06
The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.
The Supernovae Analysis Application (SNAP)
NASA Astrophysics Data System (ADS)
Bayless, Amanda J.; Fryer, Chris L.; Wollaeger, Ryan; Wiggins, Brandon; Even, Wesley; de la Rosa, Janie; Roming, Peter W. A.; Frey, Lucy; Young, Patrick A.; Thorpe, Rob; Powell, Luke; Landers, Rachel; Persson, Heather D.; Hay, Rebecca
2017-09-01
The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.
ADC histogram analysis of muscle lymphoma - Correlation with histopathology in a rare entity.
Meyer, Hans-Jonas; Pazaitis, Nikolaos; Surov, Alexey
2018-06-21
Diffusion-weighted imaging (DWI) is able to reflect histopathological architecture. A novel imaging approach, namely histogram analysis, is used to further characterize lesions on MRI. The purpose of this study is to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with histopathology parameters in muscle lymphoma. Eight patients (mean age 64.8 years, range 45-72 years) with histopathologically confirmed muscle lymphoma were retrospectively identified. Cell count, total nucleic area, and average nucleic area were estimated using ImageJ. Additionally, the Ki67 index was calculated. DWI was obtained on a 1.5T scanner using b values of 0 and 1000 s/mm². Histogram analysis was performed as a whole-lesion measurement using a custom-made Matlab-based application. The correlation analysis revealed statistically significant correlations between cell count and ADCmean (ρ=-0.76, P=0.03) as well as with ADCp75 (ρ=-0.79, P=0.02). Kurtosis and entropy correlated with average nucleic area (ρ=-0.81, P=0.02 and ρ=0.88, P=0.007, respectively). None of the analyzed ADC parameters correlated with total nucleic area or with the Ki67 index. This study identified significant correlations between cellularity and histogram parameters derived from ADC maps in muscle lymphoma. Thus, histogram analysis parameters reflect histopathology in muscle tumors. Advances in knowledge: Whole-lesion ADC histogram analysis is able to reflect histopathology parameters in muscle lymphomas.
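Whole-lesion histogram analysis of the kind used above reduces an ADC map to a handful of first-order statistics. A minimal sketch follows; the parameter selection, bin count, and the fake Gaussian "lesion" are assumptions for illustration, not the study's Matlab pipeline.

```python
import numpy as np
from scipy import stats

def histogram_parameters(adc_values):
    """First-order whole-lesion statistics of the kind reported in ADC
    histogram analysis (names and selection are illustrative)."""
    v = np.asarray(adc_values, dtype=float).ravel()
    hist, _ = np.histogram(v, bins=64)
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins before the log
    return {
        "mean": v.mean(),
        "p25": np.percentile(v, 25),
        "p75": np.percentile(v, 75),   # ADCp75 in the abstract's notation
        "skewness": stats.skew(v),
        "kurtosis": stats.kurtosis(v), # excess kurtosis
        "entropy": -(p * np.log2(p)).sum(),
    }

rng = np.random.default_rng(2)
lesion = rng.normal(1.1e-3, 0.2e-3, size=(40, 40))  # fake ADC map, mm^2/s
params = histogram_parameters(lesion)
print(params)
```

Each parameter can then be correlated against a histopathology measure (cell count, nucleic area) exactly as the abstract describes, e.g. with a Spearman rank correlation.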
Measuring the Cobb angle with the iPhone in kyphoses: a reliability study.
Jacquot, Frederic; Charpentier, Axelle; Khelifi, Sofiane; Gastambide, Daniel; Rigal, Regis; Sautet, Alain
2012-08-01
Smartphones have gained widespread use in the healthcare field to fulfill a variety of tasks. We developed a small iPhone application that takes advantage of the built-in position sensor to measure angles in a variety of spinal deformities. We present a reliability study of this tool in measuring kyphotic angles. Radiographs taken from 20 different patients' charts were presented to a panel of six operators at two different times. Radiographs were measured with the protractor and with the iPhone application, and statistical analysis was applied to measure intraclass correlation coefficients (ICCs) between the two measurement methods and to measure intra- and interobserver reliability. The intraclass correlation coefficient calculated between methods (i.e. the CobbMeter application on the iPhone versus the standard method with the protractor) was 0.963 for all measures, indicating excellent correlation between the CobbMeter application and the standard method. The interobserver correlation coefficient was 0.965. The intraobserver ICC was 0.977, indicating excellent reproducibility of measurements at different times for all operators. The interobserver ICC between fellowship-trained senior surgeons and general orthopaedic residents was 0.989. Consistently, the ICCs for intraobserver and interobserver correlations were higher with the CobbMeter application than with the regular protractor method, although this difference was not statistically significant. Measuring kyphotic angles with the iPhone application appears to be a valid procedure and is in no way inferior to the standard way of measuring the Cobb angle in kyphotic deformities.
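The intraclass correlation coefficient at the heart of this study can be computed from a subjects-by-raters matrix of angle measurements. Below is a minimal sketch of the consistency form ICC(3,1) from the Shrout and Fleiss taxonomy; the abstract does not state which ICC variant was used, and the simulated angles are hypothetical.

```python
import numpy as np

def icc_consistency(ratings):
    """Two-way, single-measure consistency ICC (ICC(3,1) in the Shrout &
    Fleiss taxonomy).  'ratings' is a subjects x raters array.  A minimal
    sketch, not necessarily the variant used in the study."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    ss_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols # residual
    msr = ss_rows / (n - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)

# hypothetical example: 20 kyphotic angles measured by two methods that
# agree up to small noise, mimicking the protractor-vs-app comparison
rng = np.random.default_rng(3)
angles = rng.uniform(20, 70, 20)
ratings = np.column_stack([angles + rng.normal(0, 1, 20),
                           angles + rng.normal(0, 1, 20)])
icc = icc_consistency(ratings)
print(icc)
```

With 1° measurement noise on angles spanning 20-70°, the ICC lands near the high values the study reports, illustrating how large between-subject variance drives the coefficient toward 1.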
Fundamentals of digital filtering with applications in geophysical prospecting for oil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mesko, A.
This book is a comprehensive work bringing together the important mathematical foundations and computing techniques for numerical filtering methods. The first two parts of the book introduce the techniques, fundamental theory and applications, while the third part treats specific applications in geophysical prospecting. Discussion is limited to linear filters, but takes in related fields such as correlational and spectral analysis.
The QAP weighted network analysis method and its application in international services trade
NASA Astrophysics Data System (ADS)
Xu, Helian; Cheng, Long
2016-04-01
Based on QAP (Quadratic Assignment Procedure) correlation and complex network theory, this paper puts forward a new method named the QAP Weighted Network Analysis Method. The core idea of the method is to analyze influences among relations in a social or economic group by building a QAP weighted network of networks of relations. In the QAP weighted network, a node depicts a relation, and an undirected edge exists between any pair of nodes if there is significant correlation between the relations. As an application, we study international services trade using a QAP weighted network in which nodes depict 10 kinds of services trade relations. After analyzing international services trade with the QAP weighted network, using distance indicators, a hierarchy tree, and a minimum spanning tree, we conclude the following. Firstly, significant correlation exists among all services trades, and the development of any one services trade will stimulate the other nine. Secondly, as economic globalization deepens, correlations among all services trades have been strengthened continually, and clustering effects exist among them. Thirdly, transportation services trade, computer and information services trade, and communication services trade have the most influence and are at the core of all services trade.
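The QAP correlation this method builds on correlates the off-diagonal entries of two relation matrices and assesses significance by permuting the rows and columns of one matrix identically, preserving its structure. A minimal sketch, with synthetic matrices standing in for trade relations:

```python
import numpy as np

def qap_correlation(A, B, n_perm=2000, rng=None):
    """QAP test: correlate off-diagonal entries of two relation matrices,
    then build a null by permuting rows *and* columns of B with the same
    permutation.  A minimal sketch of the underlying procedure."""
    rng = rng or np.random.default_rng()
    n = A.shape[0]
    mask = ~np.eye(n, dtype=bool)          # ignore the diagonal
    obs = np.corrcoef(A[mask], B[mask])[0, 1]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        Bp = B[np.ix_(p, p)]               # relabel actors, keep structure
        if abs(np.corrcoef(A[mask], Bp[mask])[0, 1]) >= abs(obs):
            count += 1
    return obs, (count + 1) / (n_perm + 1) # permutation p-value

rng = np.random.default_rng(4)
n = 15
A = rng.random((n, n))
B = A + 0.1 * rng.random((n, n))           # strongly related relation matrix
obs, pval = qap_correlation(A, B, rng=rng)
print(obs, pval)
```

In the paper's construction, each significant QAP correlation between a pair of trade relations would become a weighted edge between the corresponding nodes of the QAP weighted network.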
Computed tomography for non-destructive evaluation of composites: Applications and correlations
NASA Technical Reports Server (NTRS)
Goldberg, B.; Hediger, L.; Noel, E.
1985-01-01
The state-of-the-art fabrication techniques for composite materials are such that stringent species-specific acceptance criteria must be generated to ensure product reliability. Non-destructive evaluation techniques including computed tomography (CT), X-ray radiography (RT), and ultrasonic scanning (UT) are investigated and compared to determine their applicability and limitations for graphite-epoxy, carbon-carbon, and carbon-phenolic materials. While the techniques appear complementary, CT is shown to provide significant, heretofore unattainable data. Finally, a correlation of NDE techniques to destructive analysis is presented.
Importance and use of correlational research.
Curtis, Elizabeth A; Comiskey, Catherine; Dempsey, Orla
2016-07-01
The importance of correlational research has been reported in the literature, yet few research texts discuss the design in any detail. This article discusses important issues and considerations in correlational research, and suggests ways to avert potential problems during the preparation and application of the design. It targets the gap identified in the literature regarding correlational research design. Specifically, it discusses the importance and purpose of correlational research, and its application, analysis, and interpretation, with contextualisations to nursing and health research. Findings from correlational research can be used to determine prevalence and relationships among variables, and to forecast events from current data and knowledge. In spite of its many uses, prudence is required when using the methodology and analysing data. To assist researchers in reducing mistakes, important issues are singled out for discussion and several options are put forward for analysing data. Correlational research is widely used, and this paper should be particularly useful for novice nurse researchers. Furthermore, findings generated from correlational research can be used, for example, to inform decision-making and to improve or initiate health-related activities or change.
Gallo, Stephen A; Carpenter, Afton S; Irwin, David; McPartland, Caitlin D; Travis, Joseph; Reynders, Sofie; Thompson, Lisa A; Glisson, Scott R
2014-01-01
There is a paucity of data in the literature concerning the validation of the grant application peer review process, which is used to help direct billions of dollars in research funds. Ultimately, this validation will hinge upon empirical data relating the output of funded projects to the predictions implicit in the overall scientific merit scores from the peer review of submitted applications. In an effort to address this need, the American Institute of Biological Sciences (AIBS) conducted a retrospective analysis of peer review data of 2,063 applications submitted to a particular research program and the bibliometric output of the resultant 227 funded projects over an 8-year period. Peer review scores associated with applications were found to be moderately correlated with the total time-adjusted citation output of funded projects, although a high degree of variability existed in the data. Analysis over time revealed that as average annual scores of all applications (both funded and unfunded) submitted to this program improved with time, the average annual citation output per application increased. Citation impact did not correlate with the amount of funds awarded per application or with the total annual programmatic budget. However, the number of funded applications per year was found to correlate well with total annual citation impact, suggesting that improving funding success rates by reducing the size of awards may be an efficient strategy to optimize the scientific impact of research program portfolios. This strategy must be weighed against the need for a balanced research portfolio and the inherent high costs of some areas of research. The relationship observed between peer review scores and bibliometric output lays the groundwork for establishing a model system for future prospective testing of the validity of peer review formats and procedures.
Gallo, Stephen A.; Carpenter, Afton S.; Irwin, David; McPartland, Caitlin D.; Travis, Joseph; Reynders, Sofie; Thompson, Lisa A.; Glisson, Scott R.
2014-01-01
There is a paucity of data in the literature concerning the validation of the grant application peer review process, which is used to help direct billions of dollars in research funds. Ultimately, this validation will hinge upon empirical data relating the output of funded projects to the predictions implicit in the overall scientific merit scores from the peer review of submitted applications. In an effort to address this need, the American Institute of Biological Sciences (AIBS) conducted a retrospective analysis of peer review data of 2,063 applications submitted to a particular research program and the bibliometric output of the resultant 227 funded projects over an 8-year period. Peer review scores associated with applications were found to be moderately correlated with the total time-adjusted citation output of funded projects, although a high degree of variability existed in the data. Analysis over time revealed that as average annual scores of all applications (both funded and unfunded) submitted to this program improved with time, the average annual citation output per application increased. Citation impact did not correlate with the amount of funds awarded per application or with the total annual programmatic budget. However, the number of funded applications per year was found to correlate well with total annual citation impact, suggesting that improving funding success rates by reducing the size of awards may be an efficient strategy to optimize the scientific impact of research program portfolios. This strategy must be weighed against the need for a balanced research portfolio and the inherent high costs of some areas of research. The relationship observed between peer review scores and bibliometric output lays the groundwork for establishing a model system for future prospective testing of the validity of peer review formats and procedures. PMID:25184367
NASA Astrophysics Data System (ADS)
Dong, Keqiang; Zhang, Hong; Gao, You
2017-01-01
Identifying the mutual interactions in an aero-engine gas path system is a crucial problem that facilitates the understanding of emerging structures in complex systems. By applying the multiscale multifractal detrended cross-correlation analysis method to the aero-engine gas path system, the cross-correlation characteristics between gas path parameters are established. Further, we apply the multiscale multifractal detrended cross-correlation distance matrix and the minimum spanning tree to investigate the mutual interactions of gas path variables. The results indicate that the low-spool rotor speed (N1) and engine pressure ratio (EPR) are the main gas path parameters. The application of the proposed method helps promote our understanding of the internal mechanisms and structures of aero-engine dynamics.
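The multiscale multifractal method above generalizes the basic detrended cross-correlation analysis (DCCA). A single-scale, monofractal sketch of that building block is shown below: integrate the demeaned series, detrend them box-by-box, and form the detrended covariance, which normalizes to a cross-correlation coefficient ρ_DCCA. The synthetic coupled series are illustrative, not engine data.

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient rho_DCCA at one box size s.
    A single-scale, monofractal sketch of the family of methods the paper
    extends to multiscale multifractal form."""
    X = np.cumsum(x - np.mean(x))          # integrated "profile" of each series
    Y = np.cumsum(y - np.mean(y))
    n_boxes = len(X) // s
    k = np.arange(s)
    f_xx = f_yy = f_xy = 0.0
    for b in range(n_boxes):
        xb = X[b * s:(b + 1) * s]
        yb = Y[b * s:(b + 1) * s]
        # remove a linear trend from each box
        rx = xb - np.polyval(np.polyfit(k, xb, 1), k)
        ry = yb - np.polyval(np.polyfit(k, yb, 1), k)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
        f_xy += np.mean(rx * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(5)
common = rng.standard_normal(2000)
x = common + 0.2 * rng.standard_normal(2000)
y = common + 0.2 * rng.standard_normal(2000)
rho = dcca_coefficient(x, y, s=32)
print(rho)   # close to +1 for strongly coupled series
```

Sweeping the box size s, and replacing the second-order average with q-th order moments, yields the multiscale multifractal spectra from which the paper builds its distance matrix.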
Mingus Discontinuous Multiphysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pat Notz, Dan Turner
Mingus provides hybrid coupled local/non-local mechanics analysis capabilities that extend several traditional methods to applications with inherent discontinuities. Its primary features include adaptations of solid mechanics, fluid dynamics, and digital image correlation that naturally accommodate disjointed data or irregular solution fields by assimilating a variety of discretizations (such as control volume finite elements, peridynamics, and meshless control point clouds). The goal of this software is to provide an analysis framework for multiphysics engineering problems with an integrated image correlation capability that can be used for experimental validation and model
On the equivalence of the RTI and SVM approaches to time correlated analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croft, S.; Favalli, A.; Henzlova, D.
2014-11-21
Recently two papers on how to perform passive neutron auto-correlation analysis on time gated histograms formed from pulse train data, generically called time correlation analysis (TCA), have appeared in this journal [1,2]. For those of us working in international nuclear safeguards these treatments are of particular interest because passive neutron multiplicity counting is a widely deployed technique for the quantification of plutonium. The purpose of this letter is to show that the skewness-variance-mean (SVM) approach developed in [1] is equivalent in terms of assay capability to the random trigger interval (RTI) analysis laid out in [2]. Mathematically we could also use other numerical ways to extract the time correlated information from the histogram data including for example what we might call the mean, mean square, and mean cube approach. The important feature however, from the perspective of real world applications, is that the correlated information extracted is the same, and subsequently gets interpreted in the same way based on the same underlying physics model.
The association between body mass index and severe biliary infections: a multivariate analysis.
Stewart, Lygia; Griffiss, J McLeod; Jarvis, Gary A; Way, Lawrence W
2012-11-01
Obesity has been associated with worse infectious disease outcomes. It is a risk factor for cholesterol gallstones, but little is known about associations between body mass index (BMI) and biliary infections. We studied this using factors associated with biliary infections. A total of 427 patients with gallstones were studied. Gallstones, bile, and blood (as applicable) were cultured. Illness severity was classified as follows: none (no infection or inflammation), systemic inflammatory response syndrome (fever, leukocytosis), severe (abscess, cholangitis, empyema), or multi-organ dysfunction syndrome (bacteremia, hypotension, organ failure). Associations between BMI and biliary bacteria, bacteremia, gallstone type, and illness severity were examined using bivariate and multivariate analysis. BMI inversely correlated with pigment stones, biliary bacteria, bacteremia, and increased illness severity on bivariate and multivariate analysis. Obesity correlated with less severe biliary infections. BMI inversely correlated with pigment stones and biliary bacteria; multivariate analysis showed an independent correlation between lower BMI and illness severity. Most patients with severe biliary infections had a normal BMI, suggesting that obesity may be protective in biliary infections. This study examined the correlation between BMI and biliary infection severity. Published by Elsevier Inc.
ERIC Educational Resources Information Center
Konold, Timothy R.; Glutting, Joseph J.
2008-01-01
This study employed a correlated trait-correlated method application of confirmatory factor analysis to disentangle trait and method variance from measures of attention-deficit/hyperactivity disorder obtained at the college level. The two trait factors were "Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition" ("DSM-IV")…
A Generalized Method of Image Analysis from an Intercorrelation Matrix which May Be Singular.
ERIC Educational Resources Information Center
Yanai, Haruo; Mukherjee, Bishwa Nath
1987-01-01
This generalized image analysis method is applicable to singular and non-singular correlation matrices (CMs). Using the orthogonal projector and a weaker generalized inverse matrix, image and anti-image covariance matrices can be derived from a singular CM. (SLD)
ERIC Educational Resources Information Center
Jordan, Lawrence A.
1975-01-01
Calls attention to several errors in a recent application of canonical correlation analysis. The reanalysis contradicts Cropley's conclusion that "creativity tests can be said to possess reasonable and encouraging long-range predictive validity." (Author/SDH)
NASA Astrophysics Data System (ADS)
Qianxiang, Zhou
2012-07-01
It is very important to clarify the geometric characteristics of human body segments and to construct analysis models for ergonomic design and the application of ergonomic virtual humans. Typical anthropometric data of 1122 Chinese men aged 20-35 years were collected using a three-dimensional laser scanner for the human body. According to the correlations between different parameters, curve fitting was performed between seven trunk parameters and ten body parameters with the SPSS 16.0 software. It can be concluded that hip circumference and shoulder breadth are the most important parameters in the models, and that these two parameters have high correlations with the other parameters of the human body. By comparison with conventional regression curves, the present regression equations based on the seven trunk parameters forecast the geometric dimensions of the head, neck, height, and the four limbs with higher precision. Therefore, this is of great value for ergonomic design and the analysis of man-machine systems. The result will also be very useful for astronaut body model analysis and application.
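Fitting one body dimension from trunk parameters, as described above, is ordinary regression. A minimal sketch follows; the coefficients, units, and simulated measurements are invented for illustration (the study fitted measured anthropometric data in SPSS), and only hip circumference and shoulder breadth, the two parameters the abstract highlights, are used as predictors.

```python
import numpy as np

# hypothetical illustration: predict a limb dimension from hip circumference
# and shoulder breadth by ordinary least squares
rng = np.random.default_rng(6)
n = 200
hip = rng.normal(92, 6, n)        # cm, assumed distribution
shoulder = rng.normal(43, 2, n)   # cm, assumed distribution
arm = 0.3 * hip + 0.8 * shoulder + rng.normal(0, 1.0, n)  # invented relation

# design matrix with an intercept column
X = np.column_stack([np.ones(n), hip, shoulder])
coef, *_ = np.linalg.lstsq(X, arm, rcond=None)
pred = X @ coef

# coefficient of determination as a goodness-of-fit check
ss_res = np.sum((arm - pred) ** 2)
ss_tot = np.sum((arm - arm.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(coef, r2)
```

The same pattern extends directly to all seven trunk predictors and each of the ten target dimensions, one least-squares fit per target.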
WGCNA: an R package for weighted correlation network analysis.
Langfelder, Peter; Horvath, Steve
2008-12-29
Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. 
The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
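WGCNA itself is an R package, but its core idea, soft-thresholding a correlation matrix into a weighted adjacency and clustering the resulting dissimilarity into modules, can be sketched in a few lines of Python. The function name, the soft-threshold power, and the toy expression data below are all illustrative; the real package adds topological overlap, eigengenes, and much more.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def wgcna_like_modules(expr, beta=6, n_modules=2):
    """Minimal sketch of the WGCNA idea: soft-threshold the gene-gene
    correlation matrix into a weighted adjacency, convert to a
    dissimilarity, and cut a hierarchical tree into modules."""
    corr = np.corrcoef(expr)                     # genes x genes correlations
    adjacency = np.abs(corr) ** beta             # soft thresholding
    dissim = np.clip(1.0 - adjacency, 0.0, None)
    # condensed upper-triangle distances for scipy's linkage
    iu = np.triu_indices_from(dissim, k=1)
    Z = linkage(dissim[iu], method="average")
    return fcluster(Z, t=n_modules, criterion="maxclust")

# toy data: two modules of co-expressed genes across 50 samples
rng = np.random.default_rng(7)
base1, base2 = rng.standard_normal((2, 50))
expr = np.vstack([base1 + 0.3 * rng.standard_normal((10, 50)),
                  base2 + 0.3 * rng.standard_normal((10, 50))])
modules = wgcna_like_modules(expr)
print(modules)
```

The soft-threshold power β is the key design choice: raising |cor| to a power suppresses weak correlations smoothly instead of hard-thresholding them, which is what makes the network "weighted".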
WGCNA: an R package for weighted correlation network analysis
Langfelder, Peter; Horvath, Steve
2008-01-01
Background Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. Results The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. Conclusion The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at . PMID:19114008
Correlation analysis of respiratory signals by using parallel coordinate plots.
Saatci, Esra
2018-01-01
The understanding of the bonds and relationships between respiratory signals, i.e. the airflow, the mouth pressure, the relative temperature and the relative humidity during breathing, may improve the measurement methods of respiratory mechanics and sensor designs, or open several possible applications in the analysis of respiratory disorders. Therefore, the main objective of this study was to propose a new combination of methods in order to determine the relationship between respiratory signals as multidimensional data. In order to reveal the coupling between the processes, two very different methods were used: well-known statistical correlation analysis (i.e. Pearson's correlation and the cross-correlation coefficient) and parallel coordinate plots (PCPs). Curve bundling with the number of intersections for the correlation analysis, a Least Mean Square Time Delay Estimator (LMS-TDE) for point delay detection, and visual metrics for the recognition of visual structures were proposed and utilized in PCPs. The number of intersections increased when the correlation coefficient changed from high positive to high negative correlation between the respiratory signals, especially if the whole breath was processed. LMS-TDE coefficients plotted in PCPs indicated point delay results well matched to the findings of the correlation analysis. Visual inspection of PCPs by visual metrics showed ranges, dispersions, entropy comparisons, and linear and sinusoidal-like relationships between the respiratory signals. It is demonstrated that basic correlation analysis together with parallel coordinate plots perceptually motivates the visual metrics in the display and thus can be considered an aid to user analysis by providing meaningful views of the data. Copyright © 2017 Elsevier B.V. All rights reserved.
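The two statistical tools named in this abstract, Pearson's correlation and cross-correlation-based delay estimation, can be sketched on synthetic signals (the 0.25 Hz waveform, the sampling rate, and the 0.2 s delay are assumptions for illustration, not the study's data):

```python
import numpy as np

fs = 100.0                                        # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
flow = np.sin(2 * np.pi * 0.25 * t)               # synthetic "airflow"
pressure = np.sin(2 * np.pi * 0.25 * (t - 0.2))   # same waveform, delayed 0.2 s

r = np.corrcoef(flow, pressure)[0, 1]             # Pearson correlation coefficient

# Cross-correlation: the lag of the peak estimates the delay between the signals
xc = np.correlate(flow - flow.mean(), pressure - pressure.mean(), mode="full")
lags = np.arange(-len(t) + 1, len(t))
delay_s = abs(lags[np.argmax(xc)]) / fs           # magnitude of the estimated delay
```

Here `r` stays high because the signals share a waveform, while the lag of the cross-correlation peak recovers the imposed delay.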
Purkayastha, Sagar N; Byrne, Michael D; O'Malley, Marcia K
2012-01-01
Gaming controllers are attractive devices for research due to their onboard sensing capabilities and low cost. However, a proper quantitative analysis regarding their suitability for use in motion capture, rehabilitation, and as input devices for teleoperation and gesture recognition has yet to be conducted. In this paper, a detailed analysis of the sensors of two of these controllers, the Nintendo Wiimote and the Sony Playstation 3 Sixaxis, is presented. The acceleration and angular velocity data from the sensors of these controllers were compared and correlated with computed acceleration and angular velocity data derived from a high-resolution encoder. The results show high correlation between the sensor data from the controllers and the computed data derived from the position data of the encoder. From these results, it can be inferred that the Wiimote is more consistent and better suited for motion capture applications and as an input device than the Sixaxis. The applications of the findings are discussed with respect to potential research ventures.
NASA Astrophysics Data System (ADS)
Sanjaya, Kadek Heri; Sya'bana, Yukhi Mustaqim Kusuma
2017-01-01
Research on eco-friendly vehicle development in Indonesia has largely neglected ergonomic study, despite the fact that traffic accidents have resulted in greater economic cost than the fuel subsidy. We previously performed a biomechanical experiment on human locomotion. In this article, we describe the importance of implementing biomechanical measurement methods in transportation ergonomic study. Instruments such as the electromyogram (EMG), load cells, and pressure sensors, along with motion analysis methods and cross-correlation function analysis, are explained, and the possibility of their application to driving behavior studies is described. We describe the potentials and challenges of the biomechanical methods with respect to future vehicle development. The methods provide advantages in objective and accurate measurement, not only of human task performance but also of its correlation with vehicle performance.
Determination of magnetic helicity in the solar wind and implications for cosmic ray propagation
NASA Technical Reports Server (NTRS)
Matthaeus, W. M.; Goldstein, M. L.
1981-01-01
Magnetic helicity (Hm) is the mean value of the correlation between a turbulent magnetic field and the magnetic vector potential. A technique is described for determining Hm and its 'reduced' spectrum from the two-point magnetic correlation matrix. The application of the derived formalism to solar wind magnetic fluctuations is discussed, taking into account cases for which only single-point measurements are available. The application procedure employs the usual 'frozen-in' approximation. The method is applied to an analysis of several periods of Voyager 2 interplanetary magnetometer data near 2.8 AU. During these periods the correlation length, or energy-containing length, was found to be approximately 3 × 10^11 cm.
Guerreiro, Rita; Escott-Price, Valentina; Darwent, Lee; Parkkinen, Laura; Ansorge, Olaf; Hernandez, Dena G.; Nalls, Michael A.; Clark, Lorraine; Honig, Lawrence; Marder, Karen; van der Flier, Wiesje; Holstege, Henne; Louwersheimer, Eva; Lemstra, Afina; Scheltens, Philip; Rogaeva, Ekaterina; St George-Hyslop, Peter; Londos, Elisabet; Zetterberg, Henrik; Ortega-Cubero, Sara; Pastor, Pau; Ferman, Tanis J.; Graff-Radford, Neill R.; Ross, Owen A.; Barber, Imelda; Braae, Anne; Brown, Kristelle; Morgan, Kevin; Maetzler, Walter; Berg, Daniela; Troakes, Claire; Al-Sarraj, Safa; Lashley, Tammaryn; Compta, Yaroslau; Revesz, Tamas; Lees, Andrew; Cairns, Nigel J.; Halliday, Glenda M.; Mann, David; Pickering-Brown, Stuart; Powell, John; Lunnon, Katie; Lupton, Michelle K.; Dickson, Dennis; Hardy, John; Singleton, Andrew; Bras, Jose
2016-01-01
The similarities between dementia with Lewy bodies (DLB) and both Parkinson's disease (PD) and Alzheimer's disease (AD) are many and range from clinical presentation, to neuropathological characteristics, to more recently identified, genetic determinants of risk. Because of these overlapping features, diagnosing DLB is challenging and has clinical implications since some therapeutic agents that are applicable in other diseases have adverse effects in DLB. Having shown that DLB shares some genetic risk with PD and AD, we have now quantified the amount of sharing through the application of genetic correlation estimates, and show that, from a purely genetic perspective, and excluding the strong association at the APOE locus, DLB is equally correlated to AD and PD. PMID:26643944
A study of correlations in the stock market
NASA Astrophysics Data System (ADS)
Sharma, Chandradew; Banerjee, Kinjal
2015-08-01
We study the various sectors of the Bombay Stock Exchange (BSE) for a period of 8 years, from April 2006 to March 2014. Using the daily returns over this period, we make a direct, model-free analysis of the movement patterns of the sectoral indices and the correlations among them. Our analysis shows significant autocorrelation within individual sectors as well as strong cross-correlation among sectors. We also find that autocorrelations in some sectors persist in time. This is a very significant result that has not been reported so far in the Indian context. These findings will be useful in model building for predicting price movements of equities and derivatives, and in portfolio management. We show that the Random Walk Hypothesis is not applicable in modeling the Indian market and that mean-variance-skewness-kurtosis-based portfolio optimization might be required. We also find that almost all sectors are highly correlated during periods of large fluctuation and only moderately correlated during normal periods.
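The autocorrelation test underlying this kind of analysis can be sketched as follows; the AR(1) persistence level and the simulated series are hypothetical stand-ins, not BSE data:

```python
import numpy as np

rng = np.random.default_rng(42)

def autocorr(x, lag):
    """Sample autocorrelation of a return series at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Under the Random Walk Hypothesis returns are i.i.d., so autocorrelation ~ 0
iid_returns = rng.normal(size=2000)

# An AR(1) series mimics the persistent sectoral autocorrelation reported above
ar_returns = np.empty(2000)
ar_returns[0] = rng.normal()
for t in range(1, 2000):
    ar_returns[t] = 0.4 * ar_returns[t - 1] + rng.normal()
```

A lag-1 autocorrelation near zero is consistent with the Random Walk Hypothesis; a clearly positive value, as in the AR(1) series, is the kind of persistence the study reports.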
Hierarchical multivariate covariance analysis of metabolic connectivity
Carbonell, Felix; Charil, Arnaud; Zijdenbos, Alex P; Evans, Alan C; Bedell, Barry J
2014-01-01
Conventional brain connectivity analysis is typically based on the assessment of interregional correlations. Given that correlation coefficients are derived from both covariance and variance, group differences in covariance may be obscured by differences in the variance terms. To facilitate a comprehensive assessment of connectivity, we propose a unified statistical framework that interrogates the individual terms of the correlation coefficient. We have evaluated the utility of this method for metabolic connectivity analysis using [18F]2-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET) data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) study. As an illustrative example of the utility of this approach, we examined metabolic connectivity in angular gyrus and precuneus seed regions of mild cognitive impairment (MCI) subjects with low and high β-amyloid burdens. This new multivariate method allowed us to identify alterations in the metabolic connectome, which would not have been detected using classic seed-based correlation analysis. Ultimately, this novel approach should be extensible to brain network analysis and broadly applicable to other imaging modalities, such as functional magnetic resonance imaging (fMRI). PMID:25294129
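The decomposition the authors exploit, that a correlation coefficient is the covariance divided by the product of standard deviations, can be checked directly (simulated signals and variable names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=500)                       # e.g. a seed-region signal
y = 0.5 * x + rng.normal(scale=0.8, size=500)  # e.g. a target-region signal

cov_xy = np.cov(x, y)[0, 1]
r = cov_xy / (x.std(ddof=1) * y.std(ddof=1))   # correlation = cov / (sd_x * sd_y)

# Scaling y inflates the covariance but leaves the correlation unchanged
r_scaled = np.cov(x, 2 * y)[0, 1] / (x.std(ddof=1) * (2 * y).std(ddof=1))
```

Because scaling a signal changes its covariance and variance by the same factor, a correlation-only analysis cannot detect such a change, which is the motivation for interrogating the covariance and variance terms separately.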
NASA Astrophysics Data System (ADS)
Karageorgiou, Elissaios; Lewis, Scott M.; Riley McCarten, J.; Leuthold, Arthur C.; Hemmy, Laura S.; McPherson, Susan E.; Rottunda, Susan J.; Rubins, David M.; Georgopoulos, Apostolos P.
2012-10-01
In previous work (Georgopoulos et al 2007 J. Neural Eng. 4 349-55) we reported on the use of magnetoencephalographic (MEG) synchronous neural interactions (SNI) as a functional biomarker in Alzheimer's dementia (AD) diagnosis. Here we report on the application of canonical correlation analysis to investigate the relations between SNI and cognitive neuropsychological (NP) domains in AD patients. First, we performed individual correlations between each SNI and each NP, which provided an initial link between SNI and specific cognitive tests. Next, we performed factor analysis on each set, followed by a canonical correlation analysis between the derived SNI and NP factors. This last analysis optimally associated the entire MEG signal with cognitive function. The results revealed that SNI as a whole were mostly associated with memory and language, and, slightly less, executive function, processing speed and visuospatial abilities, thus differentiating functions subserved by the frontoparietal and the temporal cortices. These findings provide a direct interpretation of the information carried by the SNI and set the basis for identifying specific neural disease phenotypes according to cognitive deficits.
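Canonical correlation analysis itself can be sketched with plain linear algebra: the canonical correlations are the singular values of the whitened cross-covariance matrix. A minimal simulation with one shared latent factor (all data, dimensions, and names below are assumptions for the sketch, not the MEG/NP data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
latent = rng.normal(size=n)                 # one shared latent factor
# Two "views" of the same subjects, e.g. an SNI-like and an NP-like variable set
X = np.column_stack([latent + 0.5 * rng.normal(size=n) for _ in range(4)])
Y = np.column_stack([latent + 0.5 * rng.normal(size=n) for _ in range(3)])

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
Sxx = Xc.T @ Xc / (n - 1)
Syy = Yc.T @ Yc / (n - 1)
Sxy = Xc.T @ Yc / (n - 1)

def inv_sqrt(S):
    """Inverse matrix square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(w ** -0.5) @ V.T

# Canonical correlations = singular values of the whitened cross-covariance
rho = np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy), compute_uv=False)
```

The leading canonical correlation `rho[0]` recovers the strength of the shared factor; in the study, factor analysis on each set preceded this step so that the canonical pair relates SNI factors to NP domains.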
Accurate Structural Correlations from Maximum Likelihood Superpositions
Theobald, Douglas L; Wuttke, Deborah S
2008-01-01
The cores of globular proteins are densely packed, resulting in complicated networks of structural interactions. These interactions in turn give rise to dynamic structural correlations over a wide range of time scales. Accurate analysis of these complex correlations is crucial for understanding biomolecular mechanisms and for relating structure to function. Here we report a highly accurate technique for inferring the major modes of structural correlation in macromolecules using likelihood-based statistical analysis of sets of structures. This method is generally applicable to any ensemble of related molecules, including families of nuclear magnetic resonance (NMR) models, different crystal forms of a protein, and structural alignments of homologous proteins, as well as molecular dynamics trajectories. Dominant modes of structural correlation are determined using principal components analysis (PCA) of the maximum likelihood estimate of the correlation matrix. The correlations we identify are inherently independent of the statistical uncertainty and dynamic heterogeneity associated with the structural coordinates. We additionally present an easily interpretable method (“PCA plots”) for displaying these positional correlations by color-coding them onto a macromolecular structure. Maximum likelihood PCA of structural superpositions, and the structural PCA plots that illustrate the results, will facilitate the accurate determination of dynamic structural correlations analyzed in diverse fields of structural biology. PMID:18282091
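The core computation, PCA of a correlation matrix estimated from an ensemble, can be sketched on toy data (the "ensemble" below is simulated, not an NMR or crystallographic one, and the maximum likelihood superposition step is omitted):

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy ensemble: 100 "structures" described by 6 coordinates, where the first
# two coordinates move together (one dominant correlated mode)
mode = rng.normal(size=100)
coords = np.column_stack(
    [mode + 0.2 * rng.normal(size=100), mode + 0.2 * rng.normal(size=100)]
    + [rng.normal(size=100) for _ in range(4)])

R = np.corrcoef(coords, rowvar=False)     # 6x6 positional correlation matrix
evals, evecs = np.linalg.eigh(R)          # PCA = eigendecomposition of R
order = np.argsort(evals)[::-1]           # sort modes by decreasing eigenvalue
evals, evecs = evals[order], evecs[:, order]
```

The leading eigenvector loads almost entirely on the two coupled coordinates, which is the kind of dominant structural mode the method extracts and the "PCA plots" color-code onto the structure.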
[Medical school admission test at the University of Goettingen - which applicants will benefit?].
Simmenroth-Nayda, Anne; Meskauskas, Erik; Burckhardt, Gerhard; Görlich, Yvonne
2014-01-01
Medical schools in Germany may select 60% of the student applicants through their own admission tests. The influence of the school-leaving examination grades (EGs) in each of the procedural steps is controversial. At Goettingen Medical School, we combine a structured interview and a communicative skills assessment. We analysed how many applicants succeeded in our admission test, compared to a model which only takes EGs into account. Admission scores were transferred into SPSS-21. Sociodemographic data were submitted by the Stiftung Hochschulstart. Besides descriptive statistics, we used Pearson correlation and means comparisons (t-test, analysis of variance). 221 applicants (EGs 1.0-1.9) were invited in the winter semester 2013/14 and 222 applicants (EGs 1.1-1.8) in the summer semester 2014. The proportion of women was 68% (winter) and 74% (summer). Sixteen (winter) and 37 (summer) applicants had completed medical vocational training and performed slightly better. The analysis showed that our test was gender neutral. EGs did not correlate with interviews or skills assessment. Despite a two-fold impact of EGs, 26 (winter) and 44 (summer) of the overall 181 applicants had EGs of 1.4-1.9, which would have been too low for admission otherwise. If EGs were only considered once, 40 (winter) and 59 (summer) applicants would have succeeded. Copyright © 2014. Published by Elsevier GmbH.
Baker, Jannah; White, Nicole; Mengersen, Kerrie
2014-11-20
Spatial analysis is increasingly important for identifying modifiable geographic risk factors for disease. However, spatial health data from surveys are often incomplete, ranging from missing data for only a few variables, to missing data for many variables. For spatial analyses of health outcomes, selection of an appropriate imputation method is critical in order to produce the most accurate inferences. We present a cross-validation approach to select between three imputation methods for health survey data with correlated lifestyle covariates, using as a case study, type II diabetes mellitus (DM II) risk across 71 Queensland Local Government Areas (LGAs). We compare the accuracy of mean imputation to imputation using multivariate normal and conditional autoregressive prior distributions. Choice of imputation method depends upon the application and is not necessarily the most complex method. Mean imputation was selected as the most accurate method in this application. Selecting an appropriate imputation method for health survey data, after accounting for spatial correlation and correlation between covariates, allows more complete analysis of geographic risk factors for disease with more confidence in the results to inform public policy decision-making.
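The cross-validation idea used to choose among imputation methods can be sketched for the simplest candidate, mean imputation: hide a portion of the observed values, impute them, and score the error against the truth. All data below are simulated and hypothetical, and the spatial (conditional autoregressive) candidates are omitted:

```python
import numpy as np

rng = np.random.default_rng(5)
true_vals = rng.normal(loc=10.0, scale=2.0, size=200)  # a hypothetical covariate
data = true_vals.copy()
missing = rng.random(200) < 0.2                        # ~20% genuinely missing
data[missing] = np.nan

# Cross-validation step: hide a fifth of the *observed* values, impute, score
observed = np.flatnonzero(~missing)
held_out = observed[: len(observed) // 5]
work = data.copy()
work[held_out] = np.nan

imputed = np.where(np.isnan(work), np.nanmean(work), work)  # mean imputation
rmse = np.sqrt(np.mean((imputed[held_out] - true_vals[held_out]) ** 2))
```

Repeating this scoring for each candidate imputation method and comparing the errors is the selection step the abstract describes; here mean imputation is merely the simplest entry in that comparison.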
Rasmuson, James O; Roggli, Victor L; Boelter, Fred W; Rasmuson, Eric J; Redinger, Charles F
2014-01-01
A detailed evaluation of the correlation and linearity of industrial hygiene retrospective exposure assessment (REA) for cumulative asbestos exposure with asbestos lung burden analysis (LBA) has not been previously performed, but both methods are utilized for case-control and cohort studies and other applications such as setting occupational exposure limits. (a) To correlate REA with asbestos LBA for a large number of cases from varied industries and exposure scenarios; (b) to evaluate the linearity, precision, and applicability of both industrial hygiene exposure reconstruction and LBA; and (c) to demonstrate validation methods for REA. A panel of four experienced industrial hygiene raters independently estimated the cumulative asbestos exposure for 363 cases with limited exposure details in which asbestos LBA had been independently determined. LBA for asbestos bodies was performed by a pathologist by both light microscopy and scanning electron microscopy (SEM) and free asbestos fibers by SEM. Precision, reliability, correlation and linearity were evaluated via intraclass correlation, regression analysis and analysis of covariance. Plaintiff's answers to interrogatories, work history sheets, work summaries or plaintiff's discovery depositions that were obtained in court cases involving asbestos were utilized by the pathologist to provide a summarized brief asbestos exposure and work history for each of the 363 cases. Linear relationships between REA and LBA were found when adjustment was made for asbestos fiber-type exposure differences. Significant correlation between REA and LBA was found with amphibole asbestos lung burden and mixed fiber-types, but not with chrysotile. The intraclass correlation coefficients (ICC) for the precision of the industrial hygiene rater cumulative asbestos exposure estimates and the precision of repeated laboratory analysis were found to be in the excellent range. 
The ICC estimates were performed independent of specific asbestos fiber-type. Both REA and pathology assessment are reliable and complementary predictive methods to characterize asbestos exposures. Correlation analysis between the two methods effectively validates both REA methodology and LBA procedures within the determined precision, particularly for cumulative amphibole asbestos exposures since chrysotile fibers, for the most part, are not retained in the lung for an extended period of time.
Nonlinear analysis of structures. [within framework of finite element method
NASA Technical Reports Server (NTRS)
Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.
1974-01-01
The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is on nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity, is included, and applications are presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) general three-dimensional analysis, and (5) analysis of laminated composites. The methods are applied to a number of sample structures. Correlation with available analytic or experimental data ranges from good to excellent.
Khokhlova, V N
1999-01-01
The multiunit activity of neurons in the motor cortex was recorded in 6 rabbits during glutamate (or physiological saline) iontophoretic application. Interaction between the neighboring neurons was evaluated by means of statistical cross-correlation analysis of spike trains. It was found that glutamate did not produce significant changes in cross-correlations.
Separated Representations and Fast Algorithms for Materials Science
2007-10-29
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrices for understanding correlation structure, inverse covariance matrices for network modeling, large-scale simultaneous tests for selecting significantly differentially expressed genes and proteins and genetic markers for complex diseases, and high-dimensional variable selection for identifying important molecules for understanding molecular mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
Model-driven meta-analyses for informing health care: a diabetes meta-analysis as an exemplar.
Brown, Sharon A; Becker, Betsy Jane; García, Alexandra A; Brown, Adama; Ramírez, Gilbert
2015-04-01
A relatively novel type of meta-analysis, a model-driven meta-analysis, involves the quantitative synthesis of descriptive, correlational data and is useful for identifying key predictors of health outcomes and informing clinical guidelines. Few such meta-analyses have been conducted and thus, large bodies of research remain unsynthesized and uninterpreted for application in health care. We describe the unique challenges of conducting a model-driven meta-analysis, focusing primarily on issues related to locating a sample of published and unpublished primary studies, extracting and verifying descriptive and correlational data, and conducting analyses. A current meta-analysis of the research on predictors of key health outcomes in diabetes is used to illustrate our main points. © The Author(s) 2014.
A generalization of random matrix theory and its application to statistical physics.
Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H
2017-02-01
To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples: inflation rates and air pressure data for 95 US cities.
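The first step described in this abstract, showing that autocorrelation alone distorts the eigenvalue distribution of a cross-correlation matrix, can be reproduced numerically (the dimensions and the AR(1) coefficient below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 50, 500                          # 50 series, 500 observations each
mp_edge = (1 + np.sqrt(N / T)) ** 2     # Marchenko-Pastur upper edge, i.i.d. case

def max_eig(data):
    """Largest eigenvalue of the sample cross-correlation matrix."""
    return np.linalg.eigvalsh(np.corrcoef(data)).max()

iid = rng.normal(size=(N, T))           # i.i.d. benchmark

# Independent AR(1) series: auto-correlated in time, no true cross-correlation
ar = np.empty((N, T))
ar[:, 0] = rng.normal(size=N)
for t in range(1, T):
    ar[:, t] = 0.7 * ar[:, t - 1] + rng.normal(size=N)
```

Even with no true cross-correlation, the AR(1) panel's top eigenvalue exceeds the Marchenko-Pastur edge that bounds i.i.d. data, which is precisely the effect ARRMT is designed to account for.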
The use of dwell time cross-correlation functions to study single-ion channel gating kinetics.
Ball, F G; Kerry, C J; Ramsey, R L; Sansom, M S; Usherwood, P N
1988-01-01
The derivation of cross-correlation functions from single-channel dwell (open and closed) times is described. Simulation of single-channel data for simple gating models, alongside theoretical treatment, is used to demonstrate the relationship of cross-correlation functions to underlying gating mechanisms. It is shown that time irreversibility of gating kinetics may be revealed in cross-correlation functions. Application of cross-correlation function analysis to data derived from the locust muscle glutamate receptor-channel provides evidence for multiple gateway states and time reversibility of gating. A model for the gating of this channel is used to show the effect of omission of brief channel events on cross-correlation functions. PMID:2462924
The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason L. Wright; Milos Manic
2010-05-01
This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
ERIC Educational Resources Information Center
Ware, William B.; Galassi, John P.
2006-01-01
Correlational data and regression analysis provide the school counselor with a method to describe growth in achievement test scores from elementary to high school. Using Microsoft Excel, this article shows the reader in a step-by-step manner how to describe this growth pattern and how to evaluate interventions that attempt to enhance achievement…
Kumar, Vineet; Rana, Vikas; Soni, P L
2013-01-01
Mucilaginous polysaccharide extracted from Dalbergia sissoo Roxb. leaves has a number of medicinal applications. Molecular weight studies and correlation analysis of the structure of polysaccharide with oligosaccharides can be helpful for further utilisation, modification and structure-activity relationship for biological applications. To determine molecular weight of medicinally important polysaccharide. To establish an unequivocal correlation of the polysaccharide monosugars with constituting oligosaccharides and glucuronic acid content based on gas-liquid chromatography (GLC) with the spectrophotometric method. Complete and partial hydrolytic studies of pure polysaccharide yielded constituting monosugars and oligosaccharides. The ratio of sugars in polysaccharide and oligosaccharides was studied by preparation of alditol acetates and analysed using GLC. The uronic acid content was studied by GLC analysis and spectrophotometry. Molecular weight of the polysaccharide was determined using the viscometric method. Dalbergia sissoo leaves yielded 14.0% pure polysaccharide, containing 15.7% of glucuronic acid. Complete hydrolysis and GLC analysis of alditol acetate derivatives of reduced and unreduced monosugars indicated the presence of L-rhamnose, D-glucuronic acid, D-galactose and D-glucose in 1.00:1.00:2.00:2.33 molar ratios. Partial hydrolysis followed by monosugar analysis of oligosaccharides established the monosugar ratio in complete agreement with polysaccharide, thereby corroborating the sugar ratio. Similar uronic acid content was obtained by GLC and spectrophotometry. The polysaccharide had an average molecular weight of 1.5 × 10⁵ Da. The study has established an obvious correlation of the structure of polysaccharide with oligosaccharides, leading to unambiguous identification of monosaccharides, which normally is not studied conclusively while reporting the polysaccharide structure. The molecular weight of the polysaccharide was determined. 
Copyright © 2012 John Wiley & Sons, Ltd.
Design and analysis of coherent OCDM en/decoder based on photonic crystal
NASA Astrophysics Data System (ADS)
Zhang, Chongfu; Qiu, Kun
2008-08-01
The design and performance analysis of a new coherent optical en/decoder based on photonic crystal (PhC) for optical code-division multiplexing (OCDM) are presented in this paper. In this scheme, the optical pulse phase and time delay can be flexibly controlled by a photonic crystal phase shifter and time delayer through appropriate design of the fabrication. According to the PhC transmission matrix theorem, a combined calculation of the impurity and normal periodic layers is applied, and the performance of the PhC-based optical en/decoder is analyzed. The reflection, transmission, time delay characteristics and optical spectrum of the en/decoded pulse are studied for waves tuned in the photonic band-gap by numerical calculation. Theoretical analysis and numerical results indicate that proper phase modulation and time delay of the optical pulse are achieved, and an auto- to cross-correlation ratio of about 8 dB is obtained, which demonstrates the applicability of true pulse phase modulation in a number of applications.
LC-MS/MS signal suppression effects in the analysis of pesticides in complex environmental matrices.
Choi, B K; Hercules, D M; Gusev, A I
2001-02-01
The application of LC separation and mobile phase additives in addressing LC-MS/MS matrix signal suppression effects for the analysis of pesticides in a complex environmental matrix was investigated. It was shown that signal suppression is most significant for analytes eluting early in the LC-MS analysis. Introduction of different buffers (e.g. ammonium formate, ammonium hydroxide, formic acid) into the LC mobile phase was effective in improving signal correlation between the matrix and standard samples. The signal improvement is dependent on buffer concentration as well as LC separation of the matrix components. The application of LC separation alone was not effective in addressing suppression effects when characterizing complex matrix samples. Overloading of the LC column by matrix components was found to significantly contribute to analyte-matrix co-elution and suppression of signal. This signal suppression effect can be efficiently compensated by 2D LC (LC-LC) separation techniques. The effectiveness of buffers and LC separation in improving signal correlation between standard and matrix samples is discussed.
Liu, Jing; Drane, Wanzer; Liu, Xuefeng; Wu, Tiejian
2009-01-01
This study explored the relationships between personal exposure to ten volatile organic compounds (VOCs) and biochemical liver tests with the application of canonical correlation analysis. Data from a subsample of the 1999–2000 National Health and Nutrition Examination Survey were used. Serum albumin, total bilirubin (TB), alanine aminotransferase (ALT), aspartate aminotransferase (AST), lactate dehydrogenase (LDH), alkaline phosphatase (ALP), and γ-glutamyl transferase (GGT) served as the outcome variables. Personal exposures to benzene, chloroform, ethylbenzene, tetrachloroethene, toluene, trichloroethene, o-xylene, m-, p-xylene, 1,4-dichlorobenzene, and methyl tert-butyl ether (MTBE) were assessed through the use of passive exposure monitors worn by study participants. The first two canonical correlations were 0.3218 and 0.2575, suggesting a positive correlation mainly between the six VOCs (benzene, ethylbenzene, toluene, o-xylene, m-, p-xylene, and MTBE) and the three biochemical liver tests (albumin, ALP, and GGT) and a positive correlation mainly between the two VOCs (1,4-dichlorobenzene and tetrachloroethene) and the two biochemical liver tests (LDH and TB). Subsequent multiple linear regressions show that exposure to benzene, toluene, or MTBE was associated with serum albumin, while exposure to tetrachloroethene was associated with LDH and total bilirubin. In conclusion, exposure to certain VOCs as a group or individually may influence certain biochemical liver test results in the general population. PMID:19117555
Alexander, John C; Minhajuddin, Abu; Joshi, Girish P
2017-08-01
Use of healthcare-related smartphone applications is common. However, there is concern that inaccurate information from these applications may lead patients to make erroneous healthcare decisions. The objective of this study was to evaluate smartphone applications purporting to measure vital sign data using only onboard technology, compared with monitors used routinely in clinical practice. This is a prospective trial comparing correlation between a clinically utilized vital sign monitor (Propaq CS, WelchAllyn, Skaneateles Falls, NY, USA) and four smartphone application-based monitors: Instant Blood Pressure, Instant Blood Pressure Pro, Pulse Oximeter, and Pulse Oximeter Pro. We performed measurements of heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and oxygen saturation (SpO2) using the standard monitor and the four smartphone applications. Analysis of variance was used to compare measurements from the applications to the routine monitor. The study was completed on 100 healthy volunteers. Comparison of the routine monitor with the smartphone applications shows significant differences in HR, SpO2, and DBP. The SBP values from the applications were not significantly different from those from the routine monitor, but had wide limits of agreement, signifying a large degree of variation in the compared values. The degree of correlation between monitors routinely used in clinical practice and the smartphone-based applications studied is insufficient to recommend clinical utilization. This lack of correlation suggests that the applications evaluated do not provide clinically meaningful data. The inaccurate data provided by these applications can potentially contribute to patient harm.
A new method for correlation analysis of compositional (environmental) data - a worked example.
Reimann, C; Filzmoser, P; Hron, K; Kynčlová, P; Garrett, R G
2017-12-31
Most data in environmental sciences and geochemistry are compositional. Already the unit used to report the data (e.g., μg/l, mg/kg, wt%) implies that the analytical results for each element are not free to vary independently of the other measured variables. This is often neglected in statistical analysis, where a simple log-transformation of the individual variables is insufficient to put the data into an acceptable geometry. This is also important for bivariate data analysis and for correlation analysis, for which the data need to be appropriately log-ratio transformed. A new approach based on the isometric log-ratio (ilr) transformation, leading to so-called symmetric coordinates, is presented here. Summarizing the correlations in a heat-map gives a powerful tool for bivariate data analysis. Here an application of the new method using a data set from a regional geochemical mapping project based on soil O and C horizon samples is demonstrated. Differences from 'classical' correlation analysis based on log-transformed data are highlighted. The fact that some expected strong positive correlations appear and remain unchanged even following a log-ratio transformation has probably led to the misconception that the special nature of compositional data can be ignored when working with trace elements. The example dataset is employed to demonstrate that using 'classical' correlation analysis and plotting XY diagrams, scatterplots, based on the original or simply log-transformed data can easily lead to severe misinterpretations of the relationships between elements. Copyright © 2017 Elsevier B.V. All rights reserved.
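The closure problem described above can be sketched with the centered log-ratio (clr) transform, a simpler relative of the ilr-based symmetric coordinates used in the paper; the simulated three-part compositions below are illustrative, not the soil dataset:

```python
import numpy as np

def clr(X):
    """Centered log-ratio transform of a compositional data matrix.

    X : (n_samples, n_parts) array of strictly positive compositions.
    Each row is referenced (in log space) to its geometric mean,
    acknowledging that the parts cannot vary independently."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

rng = np.random.default_rng(0)
# Simulate 500 three-part compositions closed to a constant sum,
# standing in for e.g. element concentrations in soil samples.
raw = rng.lognormal(mean=[0.0, 1.0, 2.0], sigma=0.3, size=(500, 3))
comp = raw / raw.sum(axis=1, keepdims=True)

corr_log = np.corrcoef(np.log(comp), rowvar=False)   # 'classical' approach
corr_clr = np.corrcoef(clr(comp), rowvar=False)      # log-ratio approach

# Closure induces spurious correlation in the simply log-transformed
# data; the clr view of the same samples can differ substantially.
print(corr_log.round(2))
print(corr_clr.round(2))
```

The full symmetric-coordinate construction of Kynčlová et al. replaces the clr step with pivot ilr coordinates for each variable pair, but the contrast with naive log-correlation is the same.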
Fractal mechanisms and heart rate dynamics. Long-range correlations and their breakdown with disease
NASA Technical Reports Server (NTRS)
Peng, C. K.; Havlin, S.; Hausdorff, J. M.; Mietus, J. E.; Stanley, H. E.; Goldberger, A. L.
1995-01-01
Under healthy conditions, the normal cardiac (sinus) interbeat interval fluctuates in a complex manner. Quantitative analysis using techniques adapted from statistical physics reveals the presence of long-range power-law correlations extending over thousands of heartbeats. This scale-invariant (fractal) behavior suggests that the regulatory system generating these fluctuations is operating far from equilibrium. In contrast, it is found that for subjects at high risk of sudden death (e.g., congestive heart failure patients), these long-range correlations break down. Application of fractal scaling analysis and related techniques provides new approaches to assessing cardiac risk and forecasting sudden cardiac death, as well as motivating development of novel physiologic models of systems that appear to be homeodynamic rather than homeostatic.
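The fractal scaling analysis referred to here is commonly implemented as detrended fluctuation analysis (DFA). A minimal numpy sketch follows; the window sizes and series length are arbitrary choices for illustration, not the authors' protocol:

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """Estimate the detrended fluctuation analysis (DFA) scaling
    exponent alpha of a time series. alpha ~ 0.5 for uncorrelated
    noise; long-range correlated (1/f-like) signals give alpha ~ 1."""
    y = np.cumsum(x - np.mean(x))           # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        t = np.arange(n)
        f2 = 0.0
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            coef = np.polyfit(t, seg, 1)    # linear detrend per window
            f2 += np.mean((seg - np.polyval(coef, t)) ** 2)
        F.append(np.sqrt(f2 / n_win))
    # slope of log F(n) versus log n is the DFA exponent
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(1)
white = rng.standard_normal(4096)
print(round(dfa_exponent(white), 2))   # close to 0.5 for white noise
```

Applied to interbeat-interval series, a breakdown of long-range correlation shows up as the exponent drifting away from the healthy ~1.0 regime.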
From micro-correlations to macro-correlations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eliazar, Iddo, E-mail: iddo.eliazar@intel.com
2016-11-15
Random vectors with a symmetric correlation structure share a common value of pair-wise correlation between their different components. The symmetric correlation structure appears in a multitude of settings, e.g. mixture models. In a mixture model the components of the random vector are drawn independently from a general probability distribution that is determined by an underlying parameter, and the parameter itself is randomized. In this paper we study the overall correlation of high-dimensional random vectors with a symmetric correlation structure. Considering such a random vector, and terming its pair-wise correlation “micro-correlation”, we use an asymptotic analysis to derive the random vector’s “macro-correlation”: a score that takes values in the unit interval, and that quantifies the random vector’s overall correlation. The method of obtaining macro-correlations from micro-correlations is then applied to a diverse collection of frameworks that demonstrate the method’s wide applicability.
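The mixture-model setting can be illustrated numerically. The sketch below simulates an exchangeable random vector (every component sharing one randomized latent parameter) and checks its pair-wise "micro" correlation against the analytic value; the macro-correlation score itself requires the paper's asymptotic analysis and is not reproduced here:

```python
import numpy as np

# Mixture-model random vector with symmetric correlation structure:
# X_i = Theta + eps_i, where the shared latent parameter Theta is
# randomized. The pair-wise (micro) correlation, identical for all
# pairs, is rho = var(Theta) / (var(Theta) + var(eps)).
rng = np.random.default_rng(2)
var_theta, var_eps = 1.0, 3.0
rho_analytic = var_theta / (var_theta + var_eps)   # 0.25 here

n_draws, dim = 20000, 8
theta = rng.normal(0.0, np.sqrt(var_theta), size=(n_draws, 1))
X = theta + rng.normal(0.0, np.sqrt(var_eps), size=(n_draws, dim))

C = np.corrcoef(X, rowvar=False)
off_diag = C[~np.eye(dim, dtype=bool)]
print(rho_analytic, off_diag.mean().round(3))   # empirical matches analytic
```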
Dual-color two-photon fluorescence correlation spectroscopy
NASA Astrophysics Data System (ADS)
Berland, Keith M.
2001-04-01
Fluorescence correlation spectroscopy (FCS) is rapidly growing in popularity as a research tool in biological and biophysical research. Under favorable conditions, FCS measurements can produce an accurate characterization of the chemical, physical, and kinetic properties of a biological system. However, interpretation of FCS data quickly becomes complicated as the heterogeneity of a molecular system increases, as well as when there is significant non-stationary fluorescence background (e.g. intracellular autofluorescence). Use of multi-parameter correlation measurements is one promising approach that can improve the fidelity of FCS measurements in complex systems. In particular, the use of dual-color fluorescence assays, in which different interacting molecular species are labeled with unique fluorescent indicators, can "tune" the sensitivity of FCS measurements in favor of particular molecular species of interest, while simultaneously minimizing the contribution of other molecular species to the overall fluorescence correlation signal. Here we introduce the combined application of two-photon fluorescence excitation and dual-color cross-correlation analysis for detecting molecular interactions in solution. The use of two-photon excitation is particularly advantageous for dual-color FCS applications due to the uncomplicated optical alignment and the superior capabilities for intracellular applications. The theory of two-photon dual-color FCS is introduced, and initial results quantifying hybridization reactions between three independent single stranded DNA molecules are presented.
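The dual-color cross-correlation signal can be sketched as follows. The simulated photon counts are a toy stand-in (independent Poisson samples with one shared species, no diffusion dynamics), so only the zero-lag amplitude behaves as in a real FCS trace; the point is that only the co-labeled species contributes to the cross-correlation:

```python
import numpy as np

def cross_correlation(f1, f2, max_lag):
    """Normalized fluorescence cross-correlation
    G(tau) = <dF1(t) dF2(t+tau)> / (<F1><F2>)."""
    d1 = f1 - f1.mean()
    d2 = f2 - f2.mean()
    norm = f1.mean() * f2.mean()
    return np.array([np.mean(d1[:len(d1) - tau] * d2[tau:]) / norm
                     for tau in range(max_lag)])

rng = np.random.default_rng(3)
n = 50000
common = rng.poisson(5.0, n)                  # dual-labeled ("bound") species
f_green = common + rng.poisson(10.0, n)       # plus channel-specific signal
f_red = common + rng.poisson(10.0, n)

G = cross_correlation(f_green.astype(float), f_red.astype(float), 10)
print(G[:3].round(4))   # nonzero zero-lag amplitude signals interaction
```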
Beeckman, Dimitri; Defloor, Tom; Schoonhoven, Lisette; Vanderwee, Katrien
2011-09-01
Evidence-based guidelines for pressure ulcer prevention have been developed and promoted by authoritative organizations. However, nonadherence to these guidelines is frequently reported. Negative attitudes and lack of knowledge may act as barriers to using guidelines in clinical practice. To study the knowledge and attitudes of nurses about pressure ulcer prevention in Belgian hospitals and to explore the correlation between knowledge, attitudes, and the application of adequate prevention. A cross-sectional multicenter study was performed in a random sample of 14 Belgian hospitals, representing 207 wards. Out of that group, 94 wards were randomly selected (2105 patients). Clinical observations were performed to assess the adequacy of pressure ulcer prevention and pressure ulcer prevalence. From each participating ward, a random selection of at least five nurses completed an extensively validated knowledge and attitude instrument. In total, 553 nurses participated. A logistic regression analysis was performed to evaluate the correlation between knowledge, attitudes, and the application of adequate prevention. Pressure ulcer prevalence (Category I-IV) was 13.5% (284/2105). Approximately 30% (625/2105) of the patients were at risk (Braden score <17 and/or presence of pressure ulcer). Only 13.9% (87/625) of these patients received fully adequate prevention whilst in bed and when seated. The mean knowledge and attitude scores were 49.7% and 71.3%, respectively. The application of adequate prevention on a nursing ward was significantly correlated with the attitudes of the nurses (OR = 3.07, p = .05). No independent correlation was found between knowledge and the application of adequate prevention (OR = 0.75, p = .71). Knowledge of nurses in Belgian hospitals about the prevention of pressure ulcers is inadequate. The attitudes of nurses toward pressure ulcers are significantly correlated with the application of adequate prevention. 
No correlation was found between knowledge and the application of adequate prevention. Copyright ©2011 Sigma Theta Tau International.
Working covariance model selection for generalized estimating equations.
Carey, Vincent J; Wang, You-Gan
2011-11-20
We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on the discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice. Copyright © 2011 John Wiley & Sons, Ltd.
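A stripped-down version of the Gaussian pseudolikelihood criterion can be sketched as follows, assuming equal cluster sizes, residuals already in hand, and unit scale (the criterion in the paper is evaluated at the GEE-estimated regression parameters; this sketch only compares working correlation structures):

```python
import numpy as np

def gaussian_pseudologlik(resids, R):
    """Gaussian pseudo-log-likelihood of clustered residuals under a
    working correlation matrix R (clusters of equal size assumed)."""
    _, logdet = np.linalg.slogdet(R)
    Rinv = np.linalg.inv(R)
    ll = 0.0
    for r in resids:                     # one residual vector per cluster
        ll += -0.5 * (logdet + r @ Rinv @ r)
    return ll

def exchangeable(m, rho):
    return (1 - rho) * np.eye(m) + rho * np.ones((m, m))

rng = np.random.default_rng(4)
m, n_clusters, rho_true = 4, 300, 0.5
L = np.linalg.cholesky(exchangeable(m, rho_true))
resids = rng.standard_normal((n_clusters, m)) @ L.T   # correlated clusters

ll_indep = gaussian_pseudologlik(resids, np.eye(m))
ll_exch = gaussian_pseudologlik(resids, exchangeable(m, rho_true))
print(ll_exch > ll_indep)   # the exchangeable working model is preferred
```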
NASA Technical Reports Server (NTRS)
Bernacca, P. L.
1971-01-01
The correlation between the equatorial velocities of the components of double stars is studied from a statistical standpoint. A theory of rotational correlation is developed and discussed with regard to its applicability to existing observations. The theory is then applied to a sample of visual binaries which are the least studied for rotational coupling. Consideration of eclipsing systems and spectroscopic binaries is limited to show how the degrees of freedom in the spin parallelism problem can be reduced. The analysis lends support to the existence of synchronism in closely spaced binaries.
Digital Image Correlation Engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, Dan; Crozier, Paul; Reu, Phil
DICe is an open source digital image correlation (DIC) tool intended for use as a module in an external application or as a standalone analysis code. Its primary capability is computing full-field displacements and strains from sequences of digital images. These images are typically of a material sample undergoing a materials characterization experiment, but DICe is also useful for other applications (for example, trajectory tracking). DICe is machine portable (Windows, Linux and Mac) and can be effectively deployed on a high performance computing platform. Capabilities from DICe can be invoked through a library interface, via source code integration of DICe classes, or through a graphical user interface.
NASA Astrophysics Data System (ADS)
Lan, Tian; Cheng, Kai; Ren, Tina; Arce, Stephen Hugo; Tseng, Yiider
2016-09-01
Cell migration is an essential process in organism development and physiological maintenance. Although current methods permit accurate comparisons of the effects of molecular manipulations and drug applications on cell motility, effects of alterations in subcellular activities on motility cannot be fully elucidated from those methods. Here, we develop a strategy termed cell-nuclear (CN) correlation to parameterize represented dynamic subcellular activities and to quantify their contributions in mesenchymal-like migration. Based on the biophysical meaning of the CN correlation, we propose a cell migration potential index (CMPI) to measure cell motility. When the effectiveness of CMPI was evaluated with respect to one of the most popular cell migration analysis methods, Persistent Random Walk, we found that the cell motility estimates among six cell lines used in this study were highly consistent between these two approaches. Further evaluations indicated that CMPI can be determined using a shorter time period and smaller cell sample size, and it possesses excellent reliability and applicability, even in the presence of a wide range of noise, as might be generated from individual imaging acquisition systems. The novel approach outlined here introduces a robust strategy through an analysis of subcellular locomotion activities for single cell migration assessment.
Suppression of pulmonary vasculature in lung perfusion MRI using correlation analysis.
Risse, Frank; Kuder, Tristan A; Kauczor, Hans-Ulrich; Semmler, Wolfhard; Fink, Christian
2009-11-01
The purpose of the study was to evaluate the feasibility of suppressing the pulmonary vasculature in lung perfusion MRI using cross-correlation analysis (CCA). Perfusion magnetic resonance imaging (MRI) (3D FLASH, TR/TE/flip angle: 0.8 ms/2.1 ms/40 degrees) of the lungs was performed in seven healthy volunteers at 1.5 Tesla after injection of Gd-DTPA. CCA was performed pixel-wise in lung segmentations using the signal time-course of the main pulmonary artery and left atrium as references. Pixels with high correlation coefficients were considered as arterial or venous and excluded from further analysis. Quantitative perfusion parameters [pulmonary blood flow (PBF) and volume (PBV)] were calculated for manual lung segmentations separately, with the entire left and right lung with all intrapulmonary vessels (IPV) included, excluded manually, or excluded using CCA. The application of CCA allowed reliable suppression of hilar and large IPVs. Using vascular suppression by CCA, perfusion parameters were significantly reduced (p = 0.001). The reduction was 8% for PBF and 13% for PBV compared with manual exclusion, and 15% for PBF and 25% for PBV when all vessel structures were included. The application of CCA improves the visualisation and quantification of lung perfusion in MRI. Overestimation of perfusion parameters caused by pulmonary vessels is significantly reduced.
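The pixel-wise correlation step can be sketched as below. The bolus curves, the noise level, and the 0.9 threshold are illustrative assumptions, not the study's acquisition parameters:

```python
import numpy as np

def vessel_mask(series, reference, threshold=0.9):
    """Correlate each pixel's signal time-course with a reference curve
    (e.g. the main pulmonary artery) and flag highly correlated pixels
    as vascular.

    series : (n_time, n_pixels) array; reference : (n_time,) array."""
    s = series - series.mean(axis=0)
    r = reference - reference.mean()
    num = s.T @ r
    denom = np.sqrt((s ** 2).sum(axis=0) * (r ** 2).sum())
    cc = num / denom
    return cc >= threshold          # True -> exclude from perfusion stats

rng = np.random.default_rng(5)
t = np.arange(40, dtype=float)
artery = np.exp(-0.5 * ((t - 10) / 3.0) ** 2)       # sharp first-pass bolus
parenchyma = np.exp(-0.5 * ((t - 16) / 6.0) ** 2)   # delayed, dispersed curve

series = np.column_stack([artery + 0.01 * rng.standard_normal(40)
                          for _ in range(5)] +
                         [parenchyma + 0.01 * rng.standard_normal(40)
                          for _ in range(5)])
mask = vessel_mask(series, artery)
print(mask)   # first five pixels flagged as vessel, last five kept
```

Perfusion parameters would then be computed only over pixels where the mask is False.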
Two-dimensional correlation spectroscopy in polymer study
Park, Yeonju; Noda, Isao; Jung, Young Mee
2015-01-01
This review outlines the recent works of two-dimensional correlation spectroscopy (2DCOS) in polymer study. 2DCOS is a powerful technique applicable to the in-depth analysis of various spectral data of polymers obtained under some type of perturbation. The powerful utility of 2DCOS combined with various analytical techniques in polymer studies and noteworthy developments of 2DCOS used in this field are also highlighted. PMID:25815286
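The synchronous and asynchronous spectra of Noda's 2DCOS formalism can be computed directly from a perturbation-ordered set of spectra; the toy band shapes below are illustrative, not data from the review:

```python
import numpy as np

def twod_cos(Y):
    """Synchronous and asynchronous 2D correlation spectra (Noda).

    Y : (m, n) array of spectra, m perturbation steps by n spectral
    variables. Rows are referenced to the perturbation-mean spectrum."""
    m = Y.shape[0]
    Yd = Y - Y.mean(axis=0)                     # dynamic spectra
    sync = Yd.T @ Yd / (m - 1)
    # Hilbert-Noda transformation matrix: 0 on the diagonal,
    # 1 / (pi * (k - j)) elsewhere
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        N = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
    async_ = Yd.T @ N @ Yd / (m - 1)
    return sync, async_

# Toy example: two bands responding in phase, one with a different
# response profile under the perturbation
m = 31
x = np.linspace(0, 10, 200)
t = np.linspace(0, 1, m)[:, None]
band = lambda c: np.exp(-0.5 * ((x - c) / 0.4) ** 2)
Y = t * band(3) + t * band(5) + np.sqrt(t) * band(8)
sync, async_ = twod_cos(Y)
print(sync.shape, async_.shape)
```

A positive synchronous cross-peak between the bands at 3 and 5 reflects their in-phase response; the asynchronous map (antisymmetric by construction) picks out the differently responding band.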
Meyer, Hans Jonas; Leifels, Leonard; Schob, Stefan; Garnov, Nikita; Surov, Alexey
2018-01-01
Nowadays, multiparametric investigations of head and neck squamous cell carcinoma (HNSCC) are established. These approaches can better characterize tumor biology and behavior. Diffusion weighted imaging (DWI) can, by means of the apparent diffusion coefficient (ADC), quantitatively characterize different tissue compartments. Dynamic contrast-enhanced magnetic resonance imaging (DCE MRI) reflects perfusion and vascularization of tissues. Recently, histogram analysis of different images has emerged as a novel diagnostic approach that can provide more information about tissue heterogeneity. The purpose of this study was to analyze possible associations between DWI and DCE parameters derived from histogram analysis in patients with HNSCC. Overall, 34 patients, 9 women and 25 men, mean age 56.7±10.2 years, with different HNSCC were involved in the study. DWI was obtained using an axial echo planar imaging sequence with b-values of 0 and 800 s/mm2. A dynamic T1w DCE sequence after intravenous application of contrast medium was performed for estimation of the following perfusion parameters: volume transfer constant (Ktrans), volume of the extravascular extracellular leakage space (Ve), and diffusion of contrast medium from the extravascular extracellular leakage space back to the plasma (Kep). Both ADC and perfusion parameter maps were processed offline in DICOM format with a custom-made Matlab-based application. Thereafter, polygonal ROIs were manually drawn on the transferred maps on each slice. For every parameter, mean, maximal, minimal, and median values, as well as percentiles 10th, 25th, 75th, 90th, kurtosis, skewness, and entropy were estimated. Correlation analysis identified multiple statistically significant correlations between the investigated parameters. Ve-related parameters correlated well with different ADC values. 
Especially, percentiles 10 and 75, mode, and median values showed stronger correlations in comparison to other parameters. Thereby, the calculated correlation coefficients ranged from 0.62 to 0.69. Furthermore, Ktrans-related parameters showed multiple slight to moderate significant correlations with different ADC values. Strongest correlations were identified between ADC P75 and minimal Ktrans (ρ=0.58, P=0.0007), and ADC P75 and Ktrans P10 (ρ=0.56, P=0.001). Only four Kep-related parameters correlated statistically significantly with ADC fractions. Strongest correlation was found between maximal Kep and ADC mode (ρ=-0.47, P=0.008). Multiple statistically significant correlations between DWI and DCE MRI parameters derived from histogram analysis were identified in HNSCC. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wong, Kenneth H.; Choi, Jae; Wilson, William; Berry, Joel; Henderson, Fraser C., Sr.
2009-02-01
Abnormal stretch and strain is a major cause of injury to the spinal cord and brainstem. Such forces can develop from age-related degeneration, congenital malformations, occupational exposure, or trauma such as sporting accidents, whiplash and blast injury. While current imaging technologies provide excellent morphology and anatomy of the spinal cord, there is no validated diagnostic tool to assess mechanical stresses exerted upon the spinal cord and brainstem. Furthermore, there is no current means to correlate these stress patterns with known spinal cord injuries and other clinical metrics such as neurological impairment. We have therefore developed the spinal cord stress injury assessment (SCOSIA) system, which uses imaging and finite element analysis to predict stretch injury. This system was tested on a small cohort of neurosurgery patients. Initial results show that the calculated stress values decreased following surgery, and that this decrease was accompanied by a significant decrease in neurological symptoms. Regression analysis identified modest correlations between stress values and clinical metrics. The strongest correlations were seen with the Brainstem Disability Index (BDI) and the Karnofsky Performance Score (KPS), whereas the weakest correlations were seen with the American Spinal Injury Association (ASIA) scale. SCOSIA therefore shows encouraging initial results and may have wide applicability to trauma and degenerative disease involving the spinal cord and brainstem.
Application of selected methods of remote sensing for detecting carbonaceous water pollution
NASA Technical Reports Server (NTRS)
Davis, E. M.; Fosbury, W. J.
1973-01-01
A reach of the Houston Ship Channel was investigated during three separate overflights correlated with ground truth sampling on the Channel. Samples were analyzed for such conventional parameters as biochemical oxygen demand, chemical oxygen demand, total organic carbon, total inorganic carbon, turbidity, chlorophyll, pH, temperature, dissolved oxygen, and light penetration. Infrared analyses conducted on each sample included reflectance ATR analysis, carbon tetrachloride extraction of organics and subsequent scanning, and KBr evaporate analysis of CCl4 extract concentrate. Imagery which was correlated with field and laboratory data developed from ground truth sampling included that obtained from aerial KA62 hardware, RC-8 metric camera systems, and the RS-14 infrared scanner. The images were subjected to analysis by three film density gradient interpretation units. Data were then analyzed for correlations between imagery interpretation as derived from the three instruments and laboratory infrared signatures and other pertinent field and laboratory analyses.
Analysis and testing of axial compression in imperfect slender truss struts
NASA Technical Reports Server (NTRS)
Lake, Mark S.; Georgiadis, Nicholas
1990-01-01
The axial compression of imperfect slender struts for large space structures is addressed. The load-shortening behavior of struts with initially imperfect shapes and eccentric compressive end loading is analyzed using linear beam-column theory and results are compared with geometrically nonlinear solutions to determine the applicability of linear analysis. A set of developmental aluminum clad graphite/epoxy struts sized for application to the Space Station Freedom truss are measured to determine their initial imperfection magnitude, load eccentricity, and cross sectional area and moment of inertia. Load-shortening curves are determined from axial compression tests of these specimens and are correlated with theoretical curves generated using linear analysis.
The Japanese utilities' expectations for subchannel analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toba, Akio; Omoto, Akira
1995-12-01
Boiling water reactor (BWR) utilities in Japan began to consider the development of a mechanistic model to describe the critical heat transfer conditions in the BWR fuel subchannel. Such a mechanistic model will not only decrease the necessity of tests, but will also help by removing some overly conservative safety margins in thermal hydraulics. With the use of a postdryout heat transfer correlation, new acceptance criteria may be applicable to evaluate the fuel integrity. Mechanistic subchannel analysis models will certainly back up this approach. This model will also be applicable to the analysis of large-size fuel bundles and examination of corrosion behavior.
Zhe Fan; Zhong Wang; Guanglin Li; Ruomei Wang
2016-08-01
Motion classification based on surface electromyography (sEMG) pattern recognition has achieved good results under experimental conditions, but it remains a challenge for clinical implementation and practical application. Many factors contribute to the difficulty of clinical use of EMG-based dexterous control. The most obvious and important is noise in the EMG signal, caused by electrode shift, muscle fatigue, motion artifact, inherent signal instability, and interference from biological signals such as the electrocardiogram. In this paper, a novel method based on canonical correlation analysis (CCA) was developed to eliminate the reduction of classification accuracy caused by electrode shift. The average classification accuracy of our method was above 95% for the healthy subjects. In the process, we validated the influence of electrode shift on motion classification accuracy and found a strong correlation (correlation coefficient >0.9) between shifted-position and normal-position data.
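A generic CCA, of the kind such compensation schemes build on, can be sketched in numpy. The feature matrices below are synthetic stand-ins for normal- and shifted-position sEMG features, with the shift modeled as a hypothetical linear remapping plus noise:

```python
import numpy as np

def cca(X, Y, reg=1e-6):
    """Canonical correlation analysis via SVD of the whitened
    cross-covariance. Returns projections Wx, Wy and the canonical
    correlations (singular values)."""
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    n = len(X)
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n

    def inv_sqrt(C):                      # C^(-1/2) via eigendecomposition
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    K = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    U, s, Vt = np.linalg.svd(K)
    return inv_sqrt(Cxx) @ U, inv_sqrt(Cyy) @ Vt.T, s

rng = np.random.default_rng(6)
normal = rng.standard_normal((500, 6))            # normal-position features
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))  # rotation mimicking a shift
shifted = normal @ Q + 0.05 * rng.standard_normal((500, 6))

Wx, Wy, corrs = cca(shifted, normal)
print(corrs.round(3))   # near 1: the shift is almost perfectly recoverable
```

Projecting shifted-position features through `Wx` (and classifying in the canonical space) is one way such a transform can restore accuracy lost to electrode displacement.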
Madrigal, Pedro
2017-03-01
Computational evaluation of variability across DNA or RNA sequencing datasets is a crucial step in genomic science, as it allows both to evaluate reproducibility of biological or technical replicates, and to compare different datasets to identify their potential correlations. Here we present fCCAC, an application of functional canonical correlation analysis to assess covariance of nucleic acid sequencing datasets such as chromatin immunoprecipitation followed by deep sequencing (ChIP-seq). We show how this method differs from other measures of correlation, and exemplify how it can reveal shared covariance between histone modifications and DNA binding proteins, such as the relationship between the H3K4me3 chromatin mark and its epigenetic writers and readers. An R/Bioconductor package is available at http://bioconductor.org/packages/fCCAC/ . pmb59@cam.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Characteristic analysis on UAV-MIMO channel based on normalized correlation matrix.
Gao, Xi jun; Chen, Zi li; Hu, Yong Jiang
2014-01-01
Based on the three-dimensional GBSBCM (geometrically based double bounce cylinder model) channel model of MIMO for unmanned aerial vehicle (UAV), the simple form of UAV space-time-frequency channel correlation function which includes the LOS, SPE, and DIF components is presented. By the methods of channel matrix decomposition and coefficient normalization, the analytic formula of UAV-MIMO normalized correlation matrix is deduced. This formula can be used directly to analyze the condition number of UAV-MIMO channel matrix, the channel capacity, and other characteristic parameters. The simulation results show that this channel correlation matrix can be applied to describe the changes of UAV-MIMO channel characteristics under different parameter settings comprehensively. This analysis method provides a theoretical basis for improving the transmission performance of UAV-MIMO channel. The development of MIMO technology shows practical application value in the field of UAV communication.
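How a channel correlation matrix feeds into condition number and capacity can be sketched under a Kronecker-model assumption. The exchangeable correlation matrix below is a hypothetical stand-in for the GBSBCM-derived UAV-MIMO correlation matrix of the paper:

```python
import numpy as np

def capacity_correlated(R_rx, snr_db, n_trials=2000, seed=7):
    """Average capacity (bits/s/Hz) of an n x n MIMO link whose
    receive-side correlation is R_rx, under the Kronecker model
    H = R_rx^(1/2) G with i.i.d. complex Gaussian G and equal power
    allocation across transmit antennas."""
    rng = np.random.default_rng(seed)
    n = R_rx.shape[0]
    snr = 10.0 ** (snr_db / 10.0)
    w, V = np.linalg.eigh(R_rx)
    R_sqrt = V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T
    cap = 0.0
    for _ in range(n_trials):
        G = (rng.standard_normal((n, n))
             + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
        H = R_sqrt @ G
        M = np.eye(n) + (snr / n) * H @ H.conj().T
        cap += np.log2(np.linalg.det(M).real)
    return cap / n_trials

def exch(n, rho):
    """Toy normalized correlation matrix with uniform correlation rho."""
    return (1 - rho) * np.eye(n) + rho * np.ones((n, n))

print(np.linalg.cond(exch(4, 0.0)), np.linalg.cond(exch(4, 0.8)))
print(capacity_correlated(exch(4, 0.0), 10),
      capacity_correlated(exch(4, 0.8), 10))
# stronger correlation -> larger condition number, lower average capacity
```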
Random-Phase Approximation Methods
NASA Astrophysics Data System (ADS)
Chen, Guo P.; Voora, Vamsee K.; Agee, Matthew M.; Balasubramani, Sree Ganesh; Furche, Filipp
2017-05-01
Random-phase approximation (RPA) methods are rapidly emerging as cost-effective validation tools for semilocal density functional computations. We present the theoretical background of RPA in an intuitive rather than formal fashion, focusing on the physical picture of screening and simple diagrammatic analysis. A new decomposition of the RPA correlation energy into plasmonic modes leads to an appealing visualization of electron correlation in terms of charge density fluctuations. Recent developments in the areas of beyond-RPA methods, RPA correlation potentials, and efficient algorithms for RPA energy and property calculations are reviewed. The ability of RPA to approximately capture static correlation in molecules is quantified by an analysis of RPA natural occupation numbers. We illustrate the use of RPA methods in applications to small-gap systems such as open-shell d- and f-element compounds, radicals, and weakly bound complexes, where semilocal density functional results exhibit strong functional dependence.
Effect of Correlated Precision Errors on Uncertainty of a Subsonic Venturi Calibration
NASA Technical Reports Server (NTRS)
Hudson, S. T.; Bordelon, W. J., Jr.; Coleman, H. W.
1996-01-01
An uncertainty analysis performed in conjunction with the calibration of a subsonic venturi for use in a turbine test facility produced some unanticipated results that may have a significant impact in a variety of test situations. Precision uncertainty estimates using the preferred propagation techniques in the applicable American National Standards Institute/American Society of Mechanical Engineers standards were an order of magnitude larger than precision uncertainty estimates calculated directly from a sample of results (discharge coefficient) obtained at the same experimental set point. The differences were attributable to the effect of correlated precision errors, which previously have been considered negligible. An analysis explaining this phenomenon is presented. The article is not meant to document the venturi calibration, but rather to give a real example of results where correlated precision terms are important. The significance of the correlated precision terms could apply to many test situations.
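The phenomenon, propagated precision estimates disagreeing with the scatter of results when precision errors are correlated, can be reproduced in a toy ratio measurement (not the venturi data; the numbers below are invented for illustration):

```python
import numpy as np

# A result r = x / y computed from two measurements whose precision
# errors share a common disturbance (hence are correlated). Propagating
# per-variable precision estimates while ignoring the covariance term
# overstates the scatter of r relative to the standard deviation
# computed directly from the sample of results.
rng = np.random.default_rng(8)
n = 20000
common = rng.normal(0.0, 1.0, n)                 # shared (correlated) error
x = 100.0 + common + rng.normal(0.0, 0.2, n)
y = 50.0 + 0.5 * common + rng.normal(0.0, 0.2, n)
r = x / y

# First-order propagation with and without the covariance term
dx, dy = 1.0 / 50.0, -100.0 / 50.0 ** 2          # partials at the means
var_no_corr = dx ** 2 * x.var() + dy ** 2 * y.var()
var_with_corr = var_no_corr + 2 * dx * dy * np.cov(x, y)[0, 1]

print(np.sqrt(var_no_corr), np.sqrt(var_with_corr), r.std())
# ignoring the (negative) covariance term inflates the estimate ~3x here
```

Because the correlated errors partially cancel in the ratio, only the propagation that carries the covariance term reproduces the directly computed scatter, which is the effect the article reports.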
Analysis of thrips distribution: application of spatial statistics and Kriging
John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard
1991-01-01
Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
Antenna systems for base station diversity in urban small and micro cells
NASA Astrophysics Data System (ADS)
Eggers, Patrick C. F.; Toftgard, Jorn; Oprea, Alex M.
1993-09-01
This paper describes cross-correlation properties of compact urban base station antenna configurations, nearly all of which result in very low envelope cross-correlation coefficients of about 0.1 to 0.3. Focus is placed on polarization diversity systems for their potential to improve link quality when hand-held terminals are involved. An expression is given for the correlation function of compound space and polarization diversity systems. Dispersion and envelope dynamic statistics are presented for the measured environments. For microcell applications, it is found that systems with a bandwidth of 200 kHz or less, such as GSM, can use narrowband cross-correlation analysis directly.
Haranas, Ioannis; Gkigkitzis, Ioannis; Kotsireas, Ilias; Austerlitz, Carlos
2017-01-01
Understanding how the brain encodes information and performs computation requires statistical and functional analysis. Given the complexity of the human brain, simple methods that facilitate the interpretation of statistical correlations among different brain regions can be very useful. In this report we introduce a numerical correlation measure that may serve the interpretation of correlational neuronal data and may assist in the evaluation of different brain states. The description of the dynamical brain system through a global numerical measure may indicate the presence of an action principle, which would facilitate the application of physics principles to the study of the human brain and cognition.
Gas detection by correlation spectroscopy employing a multimode diode laser.
Lou, Xiutao; Somesfalean, Gabriel; Zhang, Zhiguo
2008-05-01
A gas sensor based on the gas-correlation technique has been developed using a multimode diode laser (MDL) in a dual-beam detection scheme. Measurement of CO2 mixed with CO as an interfering gas is successfully demonstrated using a 1570 nm tunable MDL. Despite overlapping absorption spectra and occasional mode hops, the interfering signals can be effectively excluded by a statistical procedure comprising correlation analysis and outlier identification. The gas concentration is retrieved from several pair-correlated signals by a linear-regression scheme, yielding reliable and accurate measurements. This demonstrates the utility of unsophisticated MDLs as novel light sources for gas detection applications.
Analysis of the two-point velocity correlations in turbulent boundary layer flows
NASA Technical Reports Server (NTRS)
Oberlack, M.
1995-01-01
The general objective of the present work is to explore the use of Rapid Distortion Theory (RDT) in analysis of the two-point statistics of the log-layer. RDT is applicable only to unsteady flows where the non-linear turbulence-turbulence interaction can be neglected in comparison to linear turbulence-mean interactions. Here we propose to use RDT to examine the structure of the large energy-containing scales and their interaction with the mean flow in the log-region. The contents of the work are twofold: First, two-point analysis methods will be used to derive the law-of-the-wall for the special case of zero mean pressure gradient. The basic assumptions needed are one-dimensionality in the mean flow and homogeneity of the fluctuations. It will be shown that a formal solution of the two-point correlation equation can be obtained as a power series in the von Karman constant, known to be on the order of 0.4. In the second part, a detailed analysis of the two-point correlation function in the log-layer will be given. The fundamental set of equations and a functional relation for the two-point correlation function will be derived. An asymptotic expansion procedure will be used in the log-layer to match Kolmogorov's universal range and the one-point correlations to the inviscid outer region valid for large correlation distances.
NASA Astrophysics Data System (ADS)
Schroer, M. A.; Gutt, C.; Grübel, G.
2014-07-01
Recently, the analysis of scattering patterns by angular cross-correlation analysis (CCA) was introduced to reveal orientational order in disordered samples, with special focus on future applications at x-ray free-electron laser facilities. We apply this CCA approach to ultra-small-angle light-scattering data obtained from two-dimensional monolayers of microspheres. The films were additionally studied by optical microscopy. This combined approach allows us to calculate the cross-correlations of the scattering patterns, characterized by the orientational correlation function Ψl(q), as well as to obtain the real-space structure of the monolayers. We show that CCA is sensitive to orientational order of the microsphere monolayers that is not directly visible from the scattering patterns. Mixing microspheres of different radii reduces the size of the ordered monolayer domains. For these samples it is shown that Ψl(q) quantitatively describes the degree of hexagonal order of the two-dimensional films. The experimental CCA results are compared with calculations based on the microscopy images. Both techniques show qualitatively similar features; differences can be attributed to wave-front distortion of the laser beam in the experiment. This effect is discussed by investigating the influence of different wave fronts on the cross-correlation analysis results. The characteristics of the cross-correlation analysis determined here will also be relevant for future x-ray-based studies.
Cao, Hongyou; Liu, Quanmin; Wahab, Magd Abdel
2017-01-01
Output-based structural damage detection is becoming increasingly appealing due to its potential in real engineering applications without any restriction regarding excitation measurements. A new transmissibility-based damage detection approach is presented in this study by combining transmissibility with correlation analysis in order to strengthen its performance in discriminating damaged from undamaged scenarios. From this perspective, damage detection strategies are established by constructing damage-sensitive indicators from a derived transmissibility. A cantilever beam is numerically analyzed to verify the feasibility of the proposed damage detection procedure, and an ASCE (American Society of Civil Engineers) benchmark is then used to validate its application to engineering structures. The results of both studies reveal a good performance of the proposed methodology in identifying damaged states from intact states. The comparison between the proposed indicator and an existing indicator also affirms its applicability in damage detection, which might be adopted in future structural health monitoring systems as a discrimination criterion. This study thus contributes an alternative criterion for transmissibility-based damage detection in addition to the conventional ones. PMID:28773218
NASA Astrophysics Data System (ADS)
Lamperti, Marco; Nardo, Luca; Bondani, Maria
2015-05-01
Site-specific fluorescence-resonance-energy-transfer donor-acceptor dual-labelled oligonucleotide probes are widely used in state-of-the-art biotechnological applications, including their use as primers in the polymerase chain reaction. However, the steady-state fluorescence intensity signal emitted by these molecular tools depends strongly on the specifics of the probe conformation. For this reason, the information that can be reliably inferred by steady-state fluorimetry on such samples is necessarily confined to a semi-qualitative level. Namely, fluorescent emission is frequently used as an ON/OFF indicator of the probe hybridization state, i.e., detection of the fluorescence signal indicates either hybridization to or detachment from the template DNA. Nonetheless, a fully quantitative analysis of their fluorescence emission properties would disclose other exciting applications of dual-labelled probes in biosensing. Here we show how time-correlated single-photon counting can be applied to overcome the technical limitations and interpretational ambiguities plaguing intensity analysis, and to derive information on the template DNA at the single-base level.
Nemr, Kátia; Amar, Ali; Abrahão, Marcio; Leite, Grazielle Capatto de Almeida; Köhle, Juliana; Santos, Alexandra de O; Correa, Luiz Artur Costa
2005-01-01
As a result of technological evolution and development, methods of voice evaluation have changed in both medical and speech-language pathology practice. The objective was to relate the results of perceptual evaluation, acoustic analysis and medical evaluation in the diagnosis of vocal and/or laryngeal affections in a population with vocal complaints. Clinical prospective study. 29 people who attended a vocal health protection campaign were evaluated. They underwent perceptual evaluation (AFPA), acoustic analysis (AA), indirect laryngoscopy (LI) and telelaryngoscopy (TL). Correlations between medical and speech-language pathology evaluation methods were established, with statistical significance assessed by Fisher's exact test. Statistically significant results were found for the correlations between AFPA and LI, AFPA and TL, and LI and TL. This study, conducted during a vocal health protection campaign, thus found correlations between the speech-language pathology perceptual evaluation and the clinical evaluation, as well as between the medical exams for vocal and/or laryngeal affections.
Segmentation of the Speaker's Face Region with Audiovisual Correlation
NASA Astrophysics Data System (ADS)
Liu, Yuyu; Sato, Yoichi
The ability to find the speaker's face region in a video is useful for various applications. In this work, we develop a novel technique to find this region within different time windows, which is robust against changes of view, scale, and background. The main thrust of our technique is to integrate audiovisual correlation analysis into a video segmentation framework. We analyze the audiovisual correlation locally by computing quadratic mutual information between our audiovisual features. The computation of quadratic mutual information is based on probability density functions estimated by kernel density estimation with adaptive kernel bandwidth. The results of this audiovisual correlation analysis are incorporated into graph cut-based video segmentation to obtain a globally optimal extraction of the speaker's face region. The setting of any heuristic threshold in this segmentation is avoided by learning the correlation distributions of speaker and background by expectation maximization. Experimental results demonstrate that our method can detect the speaker's face region accurately and robustly for different views, scales, and backgrounds.
A combined method for correlative 3D imaging of biological samples from macro to nano scale
NASA Astrophysics Data System (ADS)
Kellner, Manuela; Heidrich, Marko; Lorbeer, Raoul-Amadeus; Antonopoulos, Georgios C.; Knudsen, Lars; Wrede, Christoph; Izykowski, Nicole; Grothausmann, Roman; Jonigk, Danny; Ochs, Matthias; Ripken, Tammo; Kühnel, Mark P.; Meyer, Heiko
2016-10-01
Correlative analysis requires examination of a specimen from macro to nano scale as well as applicability of analytical methods ranging from morphological to molecular. Accomplishing this with one and the same sample is laborious at best, due to deformation and biodegradation during measurements or intermediary preparation steps. Furthermore, data alignment using differing imaging techniques turns out to be a complex task, which considerably complicates the interconnection of results. We present correlative imaging of the accessory rat lung lobe by combining a modified Scanning Laser Optical Tomography (SLOT) setup with a specially developed sample preparation method (CRISTAL). CRISTAL is a resin-based embedding method that optically clears the specimen while allowing sectioning and preventing degradation. We applied and correlated SLOT with Multi Photon Microscopy, histological and immunofluorescence analysis as well as Transmission Electron Microscopy, all in the same sample. Thus, combining CRISTAL with SLOT enables the correlative utilization of a vast variety of imaging techniques.
Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva
2018-01-15
Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation-based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs at par with or better than the state-of-the-art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data.
Ying, Gui-Shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard
2017-04-01
To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field in the elderly. When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI -0.03 to 0.32D, p = 0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with narrower 95% CI (0.01 to 0.28D, p = 0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller p-values, while analysis of the worse eye provided larger p-values than mixed effects models and marginal models. In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision.
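The consequence of ignoring inter-eye correlation can be sketched with simulated paired data: treating the two eyes of each subject as independent observations understates the standard error of a mean relative to a person-level analysis (a minimal illustration with assumed variance components, not the study's data or SAS code):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000                                             # subjects, two eyes each
person = rng.normal(0.0, 1.0, n)                     # shared between-person effect
eyes = person[:, None] + rng.normal(0.0, 0.5, (n, 2))  # two correlated eyes per subject

# Naive analysis: pretend the 2n eyes are independent observations.
flat = eyes.ravel()
se_naive = flat.std(ddof=1) / np.sqrt(flat.size)

# Person-level analysis: average the two eyes, use the subject as the unit.
person_means = eyes.mean(axis=1)
se_cluster = person_means.std(ddof=1) / np.sqrt(n)

# Inter-eye correlation inflates the true sampling error of the mean,
# so the naive standard error comes out too small.
print(se_naive, se_cluster)
```

Mixed effects and marginal models achieve the same correction while keeping the eye as the unit of analysis, which preserves power relative to simply averaging.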
Multiscale Detrended Cross-Correlation Analysis of STOCK Markets
NASA Astrophysics Data System (ADS)
Yin, Yi; Shang, Pengjian
2014-06-01
In this paper, we employ detrended cross-correlation analysis (DCCA) to investigate the cross-correlations between different stock markets. We report cross-correlated behaviors of US, Chinese and European stock markets over the period 1997-2012 using the DCCA method. DCCA reveals the short- and long-term cross-correlated behavior of intra-regional and inter-regional stock markets, displaying the similarities and differences of this behavior simply and roughly, together with the persistence of cross-correlated fluctuations. Then, because of the limitations of the DCCA method, we propose multiscale detrended cross-correlation analysis (MSDCCA) to avoid "a priori" selection of the ranges of scales over which the two coefficients of the classical DCCA method are identified, and employ MSDCCA to reanalyze these cross-correlations. This exposes important details, such as the existence and position of minima, maxima and bimodal distributions, which are lost if the scale structure is described by only two coefficients, as well as essential differences and similarities in the scale structures of the cross-correlations of intra-regional and inter-regional markets. The additional statistical characteristics obtained by MSDCCA help us to understand how two different stock markets influence each other and to analyze in detail the influence of two inter-regional markets on the cross-correlation, giving a richer and more detailed picture of the complex dynamics of cross-correlations between stock markets. The application of MSDCCA promotes our understanding of the internal mechanisms and structures of financial markets and helps in forecasting stock indices based on the cross-correlations demonstrated here. We also discuss MSDCCA with secant rolling windows of different sizes and, lastly, provide some relevant implications and issues.
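A minimal sketch of the DCCA fluctuation function referenced above (non-overlapping boxes, linear detrending; the series are simulated, and the implementation is a simplified illustration rather than the authors' code):

```python
import numpy as np

def dcca(x, y, s):
    """Detrended cross-correlation fluctuation F_DCCA(s) for box size s.

    A minimal sketch: integrate both series, split into non-overlapping
    boxes of length s, remove a linear trend per box, and average the
    covariance of the residuals across boxes.
    """
    X = np.cumsum(x - np.mean(x))       # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    n_boxes = len(X) // s
    t = np.arange(s)
    cov = []
    for b in range(n_boxes):
        xs, ys = X[b*s:(b+1)*s], Y[b*s:(b+1)*s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # detrend each box
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        cov.append(np.mean(rx * ry))
    return np.sqrt(abs(np.mean(cov)))

rng = np.random.default_rng(1)
common = rng.normal(size=4000)                  # shared driver of both series
a = common + 0.3 * rng.normal(size=4000)
b = common + 0.3 * rng.normal(size=4000)
print([round(dcca(a, b, s), 3) for s in (16, 64, 256)])
```

Plotting log F_DCCA(s) against log s and fitting a slope gives the cross-correlation scaling exponent; the multiscale variant discussed in the abstract examines how that local slope varies with s instead of summarizing it by one or two coefficients.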
Mammals and Magnetostratigraphy.
ERIC Educational Resources Information Center
Prothero, Donald R.
1988-01-01
Discusses magnetic polarity stratigraphy as a tool for correlation of fossiliferous terrestrial deposits. Explains the strengths, weaknesses, limitations, preferred conditions, sampling, laboratory analysis, and applications of this technique. A table of paleomagnetic studies on vertebrate-bearing terrestrial sections arranged by age and locality…
NASA Technical Reports Server (NTRS)
Gaston, S.; Wertheim, M.; Orourke, J. A.
1973-01-01
Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. The applicability of regression analysis computer techniques for relating process controls to performance is also assessed.
Macro elemental analysis of food samples by nuclear analytical technique
NASA Astrophysics Data System (ADS)
Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.
2017-06-01
Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis technique compared with other detection methods, and is therefore applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological function, so the Ca and K content of various foods needs to be determined. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with two other analytical techniques, neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The methods were compared to cross-check the analysis results and to overcome the limitations of each individual technique. The results showed that Ca content found in food using EDXRF and AAS was not significantly different (p = 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. The methods were also validated using SRM NIST 1548a Typical Diet. The results showed good agreement between methods; EDXRF can therefore be used as an alternative method for the determination of Ca and K in food samples.
Meta-analysis in Stata using gllamm.
Bagos, Pantelis G
2015-12-01
There are several user-written programs for performing meta-analysis in Stata (Stata Statistical Software: College Station, TX: StataCorp LP). These include metan, metareg, mvmeta, and glst. However, there are several cases for which these programs do not suffice. For instance, there is no software for performing univariate meta-analysis with correlated estimates, for multilevel or hierarchical meta-analysis, or for meta-analysis of longitudinal data. In this work, we show with practical applications that many disparate models, including but not limited to the ones mentioned earlier, can be fitted using gllamm. The software is very versatile and can handle a wide variety of models with applications in a wide range of disciplines. The method presented here takes advantage of these modeling capabilities and makes use of appropriate transformations, based on the Cholesky decomposition of the inverse of the covariance matrix (generalized least squares), in order to handle correlated data. The models described earlier can be thought of as special instances of a general linear mixed-model formulation, but to the author's knowledge, a general exposition incorporating all the available models for meta-analysis as special cases, with instructions to fit them in Stata, has not been presented so far. Source code is available at http://www.compgen.org/tools/gllamm.
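The Cholesky/GLS device described above can be sketched in a few lines: premultiplying by the transpose Cholesky factor of the inverse covariance whitens correlated estimates so that ordinary least squares applies (an illustrative sketch with assumed numbers, not gllamm syntax):

```python
import numpy as np

def gls_pooled(y, S):
    """Pooled effect from correlated estimates via the Cholesky/GLS device:
    with S^{-1} = L @ L.T, premultiply the data by L.T so the transformed
    errors are i.i.d., then run ordinary least squares."""
    y = np.asarray(y, float)
    S = np.asarray(S, float)
    L = np.linalg.cholesky(np.linalg.inv(S))
    X = np.ones((len(y), 1))            # intercept-only model -> overall mean effect
    Xs, ys = L.T @ X, L.T @ y           # whitened design and outcomes
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return float(beta[0])

# Two estimates of the same effect whose errors are correlated:
S = np.array([[0.04, 0.01],
              [0.01, 0.04]])
print(gls_pooled([1.0, 1.4], S))        # GLS mean, accounting for the correlation
```

The same transformation extends to meta-regression by replacing the intercept-only design matrix with one carrying covariates, which is essentially how the correlated-data models in the abstract reduce to a fittable linear model.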
Peña, Adrián F; Doronin, Alexander; Tuchin, Valery V; Meglinski, Igor
2014-08-01
The influence of a low-frequency electric field applied to soft biological tissues ex vivo, at normal conditions and upon topical application of optical clearing agents, has been studied by optical coherence tomography (OCT). The electro-kinetic response of tissues has been observed and quantitatively evaluated by the double correlation OCT (dcOCT) approach, utilizing the consistent application of adaptive Wiener filtering and a Fourier-domain correlation algorithm. The results show that fluctuations induced by the electric field within the biological tissues increase exponentially in time. We demonstrate that, in comparison to impedance measurements and mapping of the temperature profile at the surface of the tissue samples, the dcOCT approach is much more sensitive to changes associated with the tissues' electro-kinetic response. We also found that topical application of the optical clearing agent reduces the tissues' electro-kinetic response and cools the tissue, reducing the temperature rise induced by the electric current by a few degrees. We anticipate that the dcOCT approach can find new applications in bioelectrical impedance analysis and in monitoring the electric properties of biological tissues, including the resistivity of high-water-content tissues and its variations.
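The Fourier-domain correlation step mentioned above rests on the standard cross-correlation theorem; a minimal sketch (illustrative only, not the authors' dcOCT pipeline):

```python
import numpy as np

def xcorr_fft(a, b):
    """Circular cross-correlation via the Fourier domain:
    corr[k] = sum_n a[n] * b[(n + k) mod N], computed as IFFT(conj(FFT(a)) * FFT(b))."""
    A, B = np.fft.fft(a), np.fft.fft(b)
    return np.fft.ifft(np.conj(A) * B).real

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.roll(a, 2)                  # b is a delayed copy of a
corr = xcorr_fft(a, b)
print(int(np.argmax(corr)))        # peak position recovers the lag of 2
```

For long records this costs O(N log N) instead of the O(N^2) of direct correlation, which is why Fourier-domain correlation is the usual choice in signal-tracking pipelines of this kind.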
Mägi, Reedik; Horikoshi, Momoko; Sofer, Tamar; Mahajan, Anubha; Kitajima, Hidetoshi; Franceschini, Nora; McCarthy, Mark I.; Morris, Andrew P.
2017-01-01
Trans-ethnic meta-analysis of genome-wide association studies (GWAS) across diverse populations can increase power to detect complex trait loci when the underlying causal variants are shared between ancestry groups. However, heterogeneity in allelic effects between GWAS at these loci can occur that is correlated with ancestry. Here, a novel approach is presented to detect SNP association and quantify the extent of heterogeneity in allelic effects that is correlated with ancestry. We employ trans-ethnic meta-regression to model allelic effects as a function of axes of genetic variation, derived from a matrix of mean pairwise allele frequency differences between GWAS, and implemented in the MR-MEGA software. Through detailed simulations, we demonstrate increased power to detect association for MR-MEGA over fixed- and random-effects meta-analysis across a range of scenarios of heterogeneity in allelic effects between ethnic groups. We also demonstrate improved fine-mapping resolution, in loci containing a single causal variant, compared to these meta-analysis approaches and PAINTOR, and equivalent performance to MANTRA at reduced computational cost. Application of MR-MEGA to trans-ethnic GWAS of kidney function in 71,461 individuals indicates stronger signals of association than fixed-effects meta-analysis when heterogeneity in allelic effects is correlated with ancestry. Application of MR-MEGA to fine-mapping four type 2 diabetes susceptibility loci in 22,086 cases and 42,539 controls highlights: (i) strong evidence for heterogeneity in allelic effects that is correlated with ancestry only at the index SNP for the association signal at the CDKAL1 locus; and (ii) 99% credible sets with six or fewer variants for five distinct association signals. PMID:28911207
Oscillation Baselining and Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
PNNL developed a new tool for oscillation analysis and baselining. This tool was developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, "Suite of open-source applications and models for advanced synchrophasor analysis") and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
NASA Astrophysics Data System (ADS)
Lototzis, M.; Papadopoulos, G. K.; Droulia, F.; Tseliou, A.; Tsiros, I. X.
2018-04-01
There are several cases where a circular variable is associated with a linear one. A typical example is wind direction, which is often associated with linear quantities such as air temperature and air humidity. A statistical relationship of this kind can be tested by parametric and non-parametric methods, each of which has its own advantages and drawbacks. This work deals with correlation analysis using both the parametric and the non-parametric procedure on a small set of meteorological data of air temperature and wind direction during a summer period in a Mediterranean climate. Correlations were examined between hourly, daily and maximum-prevailing values, under typical and non-typical meteorological conditions. Both tests indicated a strong correlation between mean hourly wind direction and mean hourly air temperature, whereas mean daily wind direction and mean daily air temperature do not appear to be correlated. In some cases, however, the two procedures were found to give quite dissimilar significance levels for rejecting, or not, the null hypothesis of no correlation. The simple statistical analysis presented in this study, appropriately extended to large sets of meteorological data, may be a useful tool for estimating the effects of wind in local climate studies.
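One common parametric measure for such circular-linear association is Mardia's correlation coefficient, built from the Pearson correlations of the linear variable with the sine and cosine of the angle. A minimal sketch with simulated wind-direction and temperature data (the formula is standard; the data and noise levels are assumed):

```python
import numpy as np

def circ_lin_corr(theta, x):
    """Mardia's circular-linear correlation between angles theta (radians)
    and a linear variable x; returns R in [0, 1]."""
    c, s = np.cos(theta), np.sin(theta)
    rxc = np.corrcoef(x, c)[0, 1]      # Pearson r of x with cos(theta)
    rxs = np.corrcoef(x, s)[0, 1]      # Pearson r of x with sin(theta)
    rcs = np.corrcoef(c, s)[0, 1]      # Pearson r of cos with sin
    r2 = (rxc**2 + rxs**2 - 2 * rxc * rxs * rcs) / (1 - rcs**2)
    return float(np.sqrt(r2))

rng = np.random.default_rng(2)
wind_dir = rng.uniform(0, 2 * np.pi, 500)
temp_dep = 20 + 3 * np.cos(wind_dir) + rng.normal(0, 0.5, 500)   # direction-driven
temp_ind = 20 + rng.normal(0, 0.5, 500)                          # independent of direction
print(circ_lin_corr(wind_dir, temp_dep), circ_lin_corr(wind_dir, temp_ind))
```

R is the multiple correlation of x on (cos θ, sin θ), so it is non-negative by construction; the non-parametric alternatives mentioned in the abstract rank-transform these quantities instead.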
NASA Astrophysics Data System (ADS)
Delignières, Didier; Marmelat, Vivien
2014-01-01
In this paper, we analyze empirical data, accounting for coordination processes between complex systems (bimanual coordination, interpersonal coordination, and synchronization with a fractal metronome), by using a recently proposed method: detrended cross-correlation analysis (DCCA). This work is motivated by the strong anticipation hypothesis, which supposes that coordination between complex systems is not achieved on the basis of local adaptations (i.e., correction, predictions), but results from a more global matching of complexity properties. Indeed, recent experiments have evidenced a very close correlation between the scaling properties of the series produced by two coordinated systems, despite a quite weak local synchronization. We hypothesized that strong anticipation should result in the presence of long-range cross-correlations between the series produced by the two systems. Results allow a detailed analysis of the effects of coordination on the fluctuations of the series produced by the two systems. In the long term, series tend to present similar scaling properties, with clear evidence of long-range cross-correlation. Short-term results strongly depend on the nature of the task. Simulation studies allow disentangling the respective effects of noise and short-term coupling processes on DCCA results, and suggest that the matching of long-term fluctuations could be the result of short-term coupling processes.
2010-01-01
Introduction Accurate assessment of estrogen receptor (ER), progesterone receptor (PR), and Ki-67 is essential in the histopathologic diagnostics of breast cancer. Commercially available image analysis systems are usually bundled with dedicated analysis hardware and, to our knowledge, no easily installable, free software for immunostained slide scoring has been described. In this study, we describe a free, Internet-based web application for quantitative image analysis of ER, PR, and Ki-67 immunohistochemistry in breast cancer tissue sections. Methods The application, named ImmunoRatio, calculates the percentage of positively stained nuclear area (labeling index) by using a color deconvolution algorithm for separating the staining components (diaminobenzidine and hematoxylin) and adaptive thresholding for nuclear area segmentation. ImmunoRatio was calibrated using cell counts defined visually as the gold standard (training set, n = 50). Validation was done using a separate set of 50 ER, PR, and Ki-67 stained slides (test set, n = 50). In addition, Ki-67 labeling indexes determined by ImmunoRatio were studied for their prognostic value in a retrospective cohort of 123 breast cancer patients. Results The labeling indexes by calibrated ImmunoRatio analyses correlated well with those defined visually in the test set (correlation coefficient r = 0.98). Using the median Ki-67 labeling index (20%) as a cutoff, a hazard ratio of 2.2 was obtained in the survival analysis (n = 123, P = 0.01). ImmunoRatio was shown to adapt to various staining protocols, microscope setups, digital camera models, and image acquisition settings. The application can be used directly with web browsers running on modern operating systems (e.g., Microsoft Windows, Linux distributions, and Mac OS). No software downloads or installations are required. ImmunoRatio is open source software, and the web application is publicly accessible on our website. 
Conclusions We anticipate that free web applications, such as ImmunoRatio, will make the quantitative image analysis of ER, PR, and Ki-67 easy and straightforward in the diagnostic assessment of breast cancer specimens. PMID:20663194
Tuominen, Vilppu J; Ruotoistenmäki, Sanna; Viitanen, Arttu; Jumppanen, Mervi; Isola, Jorma
2010-01-01
Accurate assessment of estrogen receptor (ER), progesterone receptor (PR), and Ki-67 is essential in the histopathologic diagnostics of breast cancer. Commercially available image analysis systems are usually bundled with dedicated analysis hardware and, to our knowledge, no easily installable, free software for immunostained slide scoring has been described. In this study, we describe a free, Internet-based web application for quantitative image analysis of ER, PR, and Ki-67 immunohistochemistry in breast cancer tissue sections. The application, named ImmunoRatio, calculates the percentage of positively stained nuclear area (labeling index) by using a color deconvolution algorithm for separating the staining components (diaminobenzidine and hematoxylin) and adaptive thresholding for nuclear area segmentation. ImmunoRatio was calibrated using cell counts defined visually as the gold standard (training set, n = 50). Validation was done using a separate set of 50 ER, PR, and Ki-67 stained slides (test set, n = 50). In addition, Ki-67 labeling indexes determined by ImmunoRatio were studied for their prognostic value in a retrospective cohort of 123 breast cancer patients. The labeling indexes by calibrated ImmunoRatio analyses correlated well with those defined visually in the test set (correlation coefficient r = 0.98). Using the median Ki-67 labeling index (20%) as a cutoff, a hazard ratio of 2.2 was obtained in the survival analysis (n = 123, P = 0.01). ImmunoRatio was shown to adapt to various staining protocols, microscope setups, digital camera models, and image acquisition settings. The application can be used directly with web browsers running on modern operating systems (e.g., Microsoft Windows, Linux distributions, and Mac OS). No software downloads or installations are required. ImmunoRatio is open source software, and the web application is publicly accessible on our website. 
We anticipate that free web applications, such as ImmunoRatio, will make the quantitative image analysis of ER, PR, and Ki-67 easy and straightforward in the diagnostic assessment of breast cancer specimens.
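The labeling-index computation described in this abstract can be sketched in a few lines (a minimal numpy sketch, not the ImmunoRatio implementation: the stain vectors, fixed thresholds, and the toy image below are illustrative assumptions, whereas the real application uses adaptive thresholding for nuclear-area segmentation):

```python
import numpy as np

def labeling_index(rgb, dab_thresh=0.15, hem_thresh=0.15):
    # Ruifrok-Johnston style stain vectors (rows: hematoxylin, DAB,
    # residual), unit-normalized; values here are illustrative.
    m = np.array([[0.650, 0.704, 0.286],
                  [0.268, 0.570, 0.776],
                  [0.711, 0.423, 0.561]])
    m = m / np.linalg.norm(m, axis=1, keepdims=True)
    od = -np.log10(np.clip(rgb, 1e-6, 1.0))      # optical density
    conc = od.reshape(-1, 3) @ np.linalg.inv(m)  # per-pixel stain amounts
    hem, dab = conc[:, 0], conc[:, 1]
    nuclear = (hem > hem_thresh) | (dab > dab_thresh)  # crude "nuclear area"
    if not nuclear.any():
        return 0.0
    return 100.0 * (dab > dab_thresh)[nuclear].mean()

# Toy 2x2 image: one brown (DAB-positive) pixel, one blue (hematoxylin)
# pixel, and two near-white background pixels.
img = np.array([[[0.45, 0.30, 0.15], [0.30, 0.30, 0.70]],
                [[0.95, 0.95, 0.95], [0.95, 0.95, 0.95]]])
li = labeling_index(img)
print(li)
```

With this toy image one of the two "nuclear" pixels is DAB-positive, so the sketch reports a 50% labeling index.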
A correlated meta-analysis strategy for data mining "OMIC" scans.
Province, Michael A; Borecki, Ingrid B
2013-01-01
Meta-analysis is becoming an increasingly popular and powerful tool to integrate findings across studies and OMIC dimensions. But there is a danger that hidden dependencies between putatively "independent" studies can inflate the type I error, owing to reinforcement of the evidence from false-positive findings. We present here a simple method for conducting meta-analyses that automatically estimates the degree of any such non-independence between OMIC scans and corrects the inference for it, retaining the proper type I error structure. The method does not require the original data from the source studies, but operates only on the summary results of their OMIC scans. The method is applicable in a wide variety of situations, including combining GWAS and/or sequencing scan results across studies with dependencies due to overlapping subjects, as well as scans of correlated traits in a meta-analysis scan for pleiotropic genetic effects. The method correctly detects when scans are actually independent, in which case it yields the traditional meta-analysis, so it may safely be used whenever there is even a suspicion of correlation amongst scans.
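The core of such a correlated meta-analysis can be sketched as follows (a hedged numpy sketch, not the authors' implementation: here the scan correlation matrix `r` is assumed given, whereas the paper estimates it from the summary statistics themselves):

```python
import numpy as np

def correlated_meta_z(z, r):
    """Combine per-scan Z-scores while allowing for correlation.

    z : Z-scores from k scans; r : k x k correlation matrix among the
    scans (in the paper, estimated from the scans' own summary
    statistics). With r = I this reduces to the classical equal-weight
    (Stouffer) meta-analysis Z.
    """
    z = np.asarray(z, dtype=float)
    r = np.asarray(r, dtype=float)
    w = np.ones_like(z)              # equal weights, for simplicity
    return float(w @ z / np.sqrt(w @ r @ w))

z = [2.0, 2.0]
print(correlated_meta_z(z, np.eye(2)))            # independent scans
print(correlated_meta_z(z, [[1.0, 0.5],
                            [0.5, 1.0]]))         # correlated scans: smaller Z
```

Positively correlated scans carry less independent evidence, so the combined Z is deflated relative to the naive meta-analysis, which protects the type I error.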
Comparison of Penalty Functions for Sparse Canonical Correlation Analysis
Chalise, Prabhakar; Fridley, Brooke L.
2011-01-01
Canonical correlation analysis (CCA) is a widely used multivariate method for assessing the association between two sets of variables. However, when the number of variables far exceeds the number of subjects, such as in the case of large-scale genomic studies, the traditional CCA method is not appropriate. In addition, when the variables are highly correlated, the sample covariance matrices become unstable or undefined. To overcome these two issues, sparse canonical correlation analysis (SCCA) for multiple data sets has been proposed using a Lasso type of penalty. However, these methods do not have direct control over the sparsity of the solution. An additional step that uses the Bayesian Information Criterion (BIC) has also been suggested to further filter out unimportant features. In this paper, a comparison of four penalty functions (Lasso, Elastic-net, SCAD, and Hard-threshold) for SCCA, with and without the BIC filtering step, has been carried out using both real and simulated genotypic and mRNA expression data. This study indicates that the SCAD penalty with the BIC filter would be a preferable penalty function for application of SCCA to genomic data. PMID:21984855
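The four penalties differ mainly in their thresholding behavior, which can be sketched as the operators below (an illustrative numpy sketch; the SCCA iterations that apply these operators to canonical weight vectors are omitted, and the elastic-net form shown is one common parameterization, not necessarily the paper's):

```python
import numpy as np

def soft(z, lam):              # Lasso: shrink everything toward zero
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def hard(z, lam):              # Hard-threshold: keep-or-kill, no shrinkage
    return np.where(np.abs(z) > lam, z, 0.0)

def enet(z, lam, alpha=0.5):   # Elastic-net: soft threshold + extra shrinkage
    return soft(z, alpha * lam) / (1.0 + (1 - alpha) * lam)

def scad(z, lam, a=3.7):       # SCAD (Fan & Li): less bias on large weights
    return np.where(np.abs(z) <= 2 * lam, soft(z, lam),
           np.where(np.abs(z) <= a * lam,
                    ((a - 1) * z - np.sign(z) * a * lam) / (a - 2), z))

w = np.array([0.05, 0.4, 1.2, -2.5])
print(soft(w, 0.3))   # small weights removed, large ones shrunk
print(scad(w, 0.3))   # large weights left unpenalized
```

The contrast visible here is the usual argument for SCAD: like the Lasso it zeroes out small weights, but it leaves large canonical weights unbiased.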
Work probability distribution for a ferromagnet with long-ranged and short-ranged correlations
NASA Astrophysics Data System (ADS)
Bhattacharjee, J. K.; Kirkpatrick, T. R.; Sengers, J. V.
2018-04-01
Work fluctuations and work probability distributions are fundamentally different in systems with short-ranged versus long-ranged correlations. Specifically, in systems with long-ranged correlations the work distribution is extraordinarily broad compared to systems with short-ranged correlations. This difference profoundly affects the possible applicability of fluctuation theorems like the Jarzynski fluctuation theorem. The Heisenberg ferromagnet, well below its Curie temperature, is a system with long-ranged correlations in very low magnetic fields due to the presence of Goldstone modes. As the magnetic field is increased the correlations gradually become short ranged. Hence, such a ferromagnet is an ideal system for elucidating the changes of the work probability distribution as one goes from a domain with long-ranged correlations to a domain with short-ranged correlations by tuning the magnetic field. A quantitative analysis of this crossover behavior of the work probability distribution and the associated fluctuations is presented.
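A toy numerical illustration (an assumption-laden sketch, not the paper's analysis) of why a broad work distribution obstructs the Jarzynski estimator exp(-beta*dF) = <exp(-beta*W)>: for a Gaussian work distribution W ~ N(mu, sigma^2) with beta = 1, the exact result is dF = mu - sigma^2/2, and the sample estimate degrades rapidly as the distribution broadens because rare small-W trajectories dominate the exponential average:

```python
import numpy as np

rng = np.random.default_rng(7)
beta = 1.0

def jarzynski_dF(works):
    # free-energy estimate from the exponential work average
    return -np.log(np.mean(np.exp(-beta * works))) / beta

w_narrow = rng.normal(2.0, 0.5, 5000)   # exact dF = 2 - 0.125 = 1.875
w_broad = rng.normal(2.0, 3.0, 5000)    # exact dF = 2 - 4.5 = -2.5
dF_narrow = jarzynski_dF(w_narrow)
dF_broad = jarzynski_dF(w_broad)
print(round(dF_narrow, 3))   # close to the exact 1.875
print(round(dF_broad, 3))    # typically far above -2.5: the sample misses
                             # the rare small-W tail that dominates
```

This is the convergence problem the abstract alludes to: in the long-range-correlated (broad-distribution) regime, exponentially many realizations would be needed for the fluctuation theorem to be usable in practice.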
NASA Astrophysics Data System (ADS)
Yuan, Naiming; Xoplaki, Elena; Zhu, Congwen; Luterbacher, Juerg
2016-06-01
In this paper, two new methods, Temporal evolution of Detrended Cross-Correlation Analysis (TDCCA) and Temporal evolution of Detrended Partial-Cross-Correlation Analysis (TDPCCA), are proposed by generalizing DCCA and DPCCA. Applying TDCCA/TDPCCA, it is possible to study correlations on multiple time scales and over different periods. To illustrate their properties, we used two climatological examples: i) Global Sea Level (GSL) versus the North Atlantic Oscillation (NAO); and ii) Summer Rainfall over the Yangtze River (SRYR) versus the previous winter's Pacific Decadal Oscillation (PDO). We find significant correlations between GSL and NAO on time scales of 60 to 140 years, but the correlations are non-significant during 1865-1875. As for SRYR and PDO, significant correlations are found on time scales of 30 to 35 years, and the correlations are more pronounced during the recent 30 years. By combining TDCCA/TDPCCA and DCCA/DPCCA, we propose a new correlation-detection system that, compared with traditional methods, can objectively show how two time series are related (on which time scales and during which time periods). These results are important not only for the diagnosis of complex systems, but also for better design of prediction models. Therefore, the new methods offer new opportunities for applications in natural sciences, such as ecology, economics, sociology, and other research fields.
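The scale-dependent correlation at the heart of DCCA can be sketched as the rho_DCCA coefficient below (a minimal numpy sketch with non-overlapping windows and synthetic data; TDCCA additionally slides the estimate over time periods, and significance testing is omitted):

```python
import numpy as np

def rho_dcca(x, y, n):
    """Detrended cross-correlation coefficient at time scale n (sketch)."""
    def profile(a):
        return np.cumsum(a - np.mean(a))     # integrated (profile) series
    X, Y = profile(np.asarray(x, float)), profile(np.asarray(y, float))
    t = np.arange(n)
    f_xx, f_yy, f_xy = [], [], []
    for s in range(0, len(X) - n + 1, n):    # non-overlapping boxes
        xs, ys = X[s:s + n], Y[s:s + n]
        # remove the local linear trend in each box
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f_xx.append(np.mean(rx * rx))
        f_yy.append(np.mean(ry * ry))
        f_xy.append(np.mean(rx * ry))
    return np.mean(f_xy) / np.sqrt(np.mean(f_xx) * np.mean(f_yy))

rng = np.random.default_rng(0)
common = rng.standard_normal(400)            # shared driver
a = common + 0.3 * rng.standard_normal(400)
b = common + 0.3 * rng.standard_normal(400)
r = rho_dcca(a, b, 20)
print(round(r, 2))   # close to 1 for strongly coupled series
```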
Analysis of variances of quasirapidities in collisions of gold nuclei with track-emulsion nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulamov, K. G.; Zhokhova, S. I.; Lugovoi, V. V., E-mail: lugovoi@uzsci.net
2012-08-15
A new method of analysis of variances was developed for studying n-particle correlations of quasirapidities in nucleus-nucleus collisions for a large constant number n of particles. Formulas that generalize the results of the respective analysis to various values of n were derived. Calculations on the basis of simple models indicate that the method is applicable at least for n ≥ 100. Quasirapidity correlations statistically significant at a level of 36 standard deviations were discovered in collisions between gold nuclei and track-emulsion nuclei at an energy of 10.6 GeV per nucleon. The experimental data obtained in the present study are contrasted against the theory of nucleus-nucleus collisions.
Ling, Hangjian; Katz, Joseph
2014-09-20
This paper deals with two issues affecting the application of digital holographic microscopy (DHM) for measuring the spatial distribution of particles in a dense suspension, namely discriminating between real and virtual images and accurate detection of the particle center. Previous methods to separate real and virtual fields have involved applications of multiple phase-shifted holograms, combining reconstructed fields of multiple axially displaced holograms, and analysis of intensity distributions of weakly scattering objects. Here, we introduce a simple approach based on simultaneously recording two in-line holograms, whose planes are separated by a short distance from each other. This distance is chosen to be longer than the elongated trace of the particle. During reconstruction, the real images overlap, whereas the virtual images are displaced by twice the distance between hologram planes. Data analysis is based on correlating the spatial intensity distributions of the two reconstructed fields to measure displacement between traces. This method has been implemented for both synthetic particles and a dense suspension of 2 μm particles. The correlation analysis readily discriminates between real and virtual images of a sample containing more than 1300 particles. Consequently, we can now implement DHM for three-dimensional tracking of particles when the hologram plane is located inside the sample volume. Spatial correlations within the same reconstructed field are also used to improve the detection of the axial location of the particle center, extending previously introduced procedures to suspensions of microscopic particles. For each cross section within a particle trace, we sum the correlations among intensity distributions in all planes located symmetrically on both sides of the section. This cumulative correlation has a sharp peak at the particle center. 
Using both synthetic and recorded particle fields, we show that the uncertainty in localizing the axial location of the center is reduced to about one particle's diameter.
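The real/virtual discrimination logic can be illustrated with a 1-D toy model (an assumed simplification of the paper's 3-D intensity-correlation analysis; the Gaussian "particle trace" and the plane separation are made up): a real particle's traces in the two reconstructed fields overlap, so their cross-correlation peaks at zero shift, while a virtual image's traces are displaced by twice the hologram-plane separation.

```python
import numpy as np

def axial_shift(trace1, trace2):
    # lag of the cross-correlation peak = displacement of trace2 vs trace1
    lags = np.arange(-len(trace1) + 1, len(trace1))
    xc = np.correlate(trace2, trace1, mode="full")
    return lags[np.argmax(xc)]

z = np.arange(200)
particle = np.exp(-0.5 * ((z - 80) / 4.0) ** 2)   # elongated particle trace
dz = 15                                           # hologram-plane separation
real_1, real_2 = particle, particle               # real images overlap
virtual_1 = particle
virtual_2 = np.roll(particle, 2 * dz)             # virtual images displaced

print(axial_shift(real_1, real_2))        # 0      -> classified as real
print(axial_shift(virtual_1, virtual_2))  # 2 * dz -> classified as virtual
```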
Wang, Yikai; Kang, Jian; Kemmer, Phebe B.; Guo, Ying
2016-01-01
Currently, network-oriented analysis of fMRI data has become an important tool for understanding brain organization and brain networks. Among the range of network modeling methods, partial correlation has shown great promises in accurately detecting true brain network connections. However, the application of partial correlation in investigating brain connectivity, especially in large-scale brain networks, has been limited so far due to the technical challenges in its estimation. In this paper, we propose an efficient and reliable statistical method for estimating partial correlation in large-scale brain network modeling. Our method derives partial correlation based on the precision matrix estimated via Constrained L1-minimization Approach (CLIME), which is a recently developed statistical method that is more efficient and demonstrates better performance than the existing methods. To help select an appropriate tuning parameter for sparsity control in the network estimation, we propose a new Dens-based selection method that provides a more informative and flexible tool to allow the users to select the tuning parameter based on the desired sparsity level. Another appealing feature of the Dens-based method is that it is much faster than the existing methods, which provides an important advantage in neuroimaging applications. Simulation studies show that the Dens-based method demonstrates comparable or better performance with respect to the existing methods in network estimation. We applied the proposed partial correlation method to investigate resting state functional connectivity using rs-fMRI data from the Philadelphia Neurodevelopmental Cohort (PNC) study. Our results show that partial correlation analysis removed considerable between-module marginal connections identified by full correlation analysis, suggesting these connections were likely caused by global effects or common connection to other nodes. 
Based on partial correlation, we find that the most significant direct connections are between homologous brain locations in the left and right hemisphere. When comparing partial correlation derived under different sparse tuning parameters, an important finding is that the sparse regularization has more shrinkage effects on negative functional connections than on positive connections, which supports previous findings that many of the negative brain connections are due to non-neurophysiological effects. An R package “DensParcorr” can be downloaded from CRAN for implementing the proposed statistical methods. PMID:27242395
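The partial-correlation step itself is standard and can be sketched directly (a numpy sketch that inverts the sample covariance; the paper's contribution is replacing this inversion with the CLIME estimator and Dens-based tuning, which this sketch does not implement):

```python
import numpy as np

def partial_corr(theta):
    # Given a precision matrix Theta = inv(Sigma), the partial correlation
    # between nodes i and j is -Theta_ij / sqrt(Theta_ii * Theta_jj).
    d = np.sqrt(np.diag(theta))
    p = -theta / np.outer(d, d)
    np.fill_diagonal(p, 1.0)
    return p

# Toy 3-node chain 0 -- 1 -- 2: nodes 0 and 2 are marginally correlated
# only through node 1, so their partial correlation should vanish.
rng = np.random.default_rng(1)
n = 5000
x1 = rng.standard_normal(n)
x0 = 0.8 * x1 + 0.6 * rng.standard_normal(n)
x2 = 0.8 * x1 + 0.6 * rng.standard_normal(n)
X = np.column_stack([x0, x1, x2])
theta = np.linalg.inv(np.cov(X, rowvar=False))
pc = partial_corr(theta)
print(round(np.corrcoef(X, rowvar=False)[0, 2], 2))  # sizable marginal corr
print(round(pc[0, 2], 2))                            # near-zero partial corr
```

This is the effect described in the abstract: full (marginal) correlation reports an indirect 0-2 connection, while partial correlation removes it.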
NASA Astrophysics Data System (ADS)
Beyer, W. K. G.
The estimation accuracy of the group delay measured in a single video frequency band was analyzed as a function of the system bandwidth and the signal-to-noise ratio. Very long baseline interferometry (VLBI) measurements from geodetic experiments were used to check the geodetic applicability of the Mark 2 (MK2) evaluation system. The geodetic observation quantities and the correlation geometry are introduced. The data flow in the VLBI experiment, the correlation analysis, the analyses and evaluation in the MK2 system, and the delay estimation procedure following the least squares method are presented. It is shown that the MK2 system is no longer up to date for geodetic applications. The superiority of the developed estimation method with respect to the interpolation algorithm is demonstrated. The numerical investigations show the deleterious influence of distorting bit-shift effects.
Correlation Filtering of Modal Dynamics using the Laplace Wavelet
NASA Technical Reports Server (NTRS)
Freudinger, Lawrence C.; Lind, Rick; Brenner, Martin J.
1997-01-01
Wavelet analysis allows processing of transient response data commonly encountered in vibration health monitoring tasks such as aircraft flutter testing. The Laplace wavelet is formulated as the impulse response of a single-mode system, so as to be similar to data features commonly encountered in these health monitoring tasks. A correlation filtering approach is introduced that uses the Laplace wavelet to decompose a signal into impulse responses of single-mode subsystems. Applications using responses from flutter testing of aeroelastic systems demonstrate that modal parameters and stability estimates can be obtained by correlation filtering free-decay data with a set of Laplace wavelets.
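Correlation filtering can be sketched as matching the signal against a dictionary of damped-sinusoid (Laplace-wavelet-like) atoms over a grid of frequency and damping values (an illustrative numpy sketch; the grid, the noise-free signal, and the parameter values are assumptions):

```python
import numpy as np

def laplace_atom(t, freq, zeta):
    # impulse response of a single mode: damped sinusoid
    wn = 2 * np.pi * freq
    wd = wn * np.sqrt(1 - zeta ** 2)
    return np.exp(-zeta * wn * t) * np.sin(wd * t)

def corr_coeff(a, b):
    # normalized correlation between signal and dictionary atom
    return abs(np.dot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))

t = np.linspace(0, 2, 2000)
signal = laplace_atom(t, freq=5.0, zeta=0.03)     # "measured" free decay

# exhaustive correlation filtering over a (frequency, damping) grid
best = max((corr_coeff(signal, laplace_atom(t, f, z)), f, z)
           for f in np.arange(3.0, 8.0, 0.25)
           for z in (0.01, 0.03, 0.05, 0.10))
print(best[1], best[2])   # recovered modal frequency and damping ratio
```

The best-matching atom recovers the modal frequency (5.0 Hz) and damping ratio (0.03) of the toy signal; real free-decay data would contain several modes plus noise, hence the iterative decomposition described in the abstract.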
Advanced Statistics for Exotic Animal Practitioners.
Hodsoll, John; Hellier, Jennifer M; Ryan, Elizabeth G
2017-09-01
Correlation and regression assess the association between 2 or more variables. This article reviews the core knowledge needed to understand these analyses, moving from visual analysis in scatter plots through correlation, simple and multiple linear regression, and logistic regression. Correlation estimates the strength and direction of a relationship between 2 variables. Regression can be considered more general and quantifies the numerical relationships between an outcome and 1 or multiple variables in terms of a best-fit line, allowing predictions to be made. Each technique is discussed with examples and the statistical assumptions underlying their correct application. Copyright © 2017 Elsevier Inc. All rights reserved.
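A toy worked example of the two techniques reviewed above, with made-up weight/dose data (numpy only): correlation quantifies the strength and direction of the relationship, and the regression line allows prediction.

```python
import numpy as np

weight = np.array([1.0, 1.5, 2.0, 2.5, 3.0])       # e.g. body weight (kg)
dose = np.array([10.0, 16.0, 19.0, 26.0, 30.0])    # e.g. drug dose (mg)

r = np.corrcoef(weight, dose)[0, 1]                # correlation coefficient
slope, intercept = np.polyfit(weight, dose, 1)     # best-fit line
predicted = slope * 2.2 + intercept                # prediction at 2.2 kg

print(round(r, 3))                  # strong positive correlation
print(round(slope, 2), round(intercept, 2))
print(round(predicted, 2))
```

For these numbers the least-squares slope is 10 mg/kg with intercept 0.2 mg, and r is close to 1, i.e. a strong positive linear relationship.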
Optical implementation of neocognitron and its applications to radar signature discrimination
NASA Technical Reports Server (NTRS)
Chao, Tien-Hsin; Stoner, William W.
1991-01-01
A feature-extraction-based optoelectronic neural network is introduced. The system implementation approach applies the principle of the neocognitron paradigm first introduced by Fukushima et al. (1983). A multichannel correlator is used as a building block of a generic single layer of the neocognitron for shift-invariant feature correlation. Multilayer processing is achieved by iteratively feeding back the output of the feature correlator to the input spatial light modulator. Successful pattern recognition with intraclass fault tolerance and interclass discrimination is achieved using this optoelectronic neocognitron. Detailed system analysis is described. Experimental demonstration of radar signature processing is also provided.
Variable Selection through Correlation Sifting
NASA Astrophysics Data System (ADS)
Huang, Jim C.; Jojic, Nebojsa
Many applications of computational biology require a variable selection procedure to sift through a large number of input variables and select some smaller number that influence a target variable of interest. For example, in virology, only some small number of viral protein fragments influence the nature of the immune response during viral infection. Due to the large number of variables to be considered, a brute-force search for the subset of variables is in general intractable. To approximate this, methods based on ℓ1-regularized linear regression have been proposed and have been found to be particularly successful. It is well understood however that such methods fail to choose the correct subset of variables if these are highly correlated with other "decoy" variables. We present a method for sifting through sets of highly correlated variables which leads to higher accuracy in selecting the correct variables. The main innovation is a filtering step that reduces correlations among variables to be selected, making the ℓ1-regularization effective for datasets on which many methods for variable selection fail. The filtering step changes both the values of the predictor variables and output values by projections onto components obtained through a computationally-inexpensive principal components analysis. In this paper we demonstrate the usefulness of our method on synthetic datasets and on novel applications in virology. These include HIV viral load analysis based on patients' HIV sequences and immune types, as well as the analysis of seasonal variation in influenza death rates based on the regions of the influenza genome that undergo diversifying selection in the previous season.
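The filtering step can be illustrated in isolation (a numpy sketch under assumed simplifications: a single shared component is removed via a rank-1 SVD truncation, and the subsequent ℓ1-regularized regression is omitted):

```python
import numpy as np

def pca_filter(X, k):
    # remove the top-k principal components, which carry the variance
    # shared among correlated variables
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    top = (U[:, :k] * s[:k]) @ Vt[:k]    # rank-k common component
    return Xc - top

rng = np.random.default_rng(2)
shared = rng.standard_normal((500, 1))
X = shared + 0.2 * rng.standard_normal((500, 6))  # 6 highly correlated vars

off = ~np.eye(6, dtype=bool)
before = np.abs(np.corrcoef(X, rowvar=False)[off]).mean()
after = np.abs(np.corrcoef(pca_filter(X, 1), rowvar=False)[off]).mean()
print(round(before, 2))   # high average pairwise correlation
print(round(after, 2))    # much lower after filtering
```

After the shared component is projected out, the remaining variables are far less correlated, which is the condition under which ℓ1-regularization can distinguish true variables from "decoys".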
1996-10-01
aligned using an octree search algorithm combined with cross-correlation analysis. Successive 4x downsampling with optional and specifiable neighborhood...desired and the search engine embedded in the OODBMS will find the requested imagery and queue it to the user for further analysis. This application was...obtained during Hoffmann-LaRoche production pathology imaging performed at UMICH. Versant works well and is easy to use; 3) Pathology Image Analysis
Bajwa, Nadia M; Yudkowsky, Rachel; Belli, Dominique; Vu, Nu Viet; Park, Yoon Soo
2017-03-01
The purpose of this study was to provide validity and feasibility evidence in measuring professionalism using the Professionalism Mini-Evaluation Exercise (P-MEX) scores as part of a residency admissions process. In 2012 and 2013, three standardized-patient-based P-MEX encounters were administered to applicants invited for an interview at the University of Geneva Pediatrics Residency Program. Validity evidence was gathered for P-MEX content (item analysis); response process (qualitative feedback); internal structure (inter-rater reliability with intraclass correlation and Generalizability); relations to other variables (correlations); and consequences (logistic regression to predict admission). To improve reliability, Kane's formula was used to create an applicant composite score using P-MEX, structured letter of recommendation (SLR), and structured interview (SI) scores. Applicant rank lists using composite scores versus faculty global ratings were compared using the Wilcoxon signed-rank test. Seventy applicants were assessed. Moderate associations were found between pairwise correlations of P-MEX scores and SLR (r = 0.25, P = .036), SI (r = 0.34, P = .004), and global ratings (r = 0.48, P < .001). Generalizability of the P-MEX using three cases was moderate (G-coefficient = 0.45). P-MEX scores had the greatest correlation with acceptance (r = 0.56, P < .001), were the strongest predictor of acceptance (OR 4.37, P < .001), and increased pseudo R-squared by 0.20 points. Including P-MEX scores increased composite score reliability from 0.51 to 0.74. Rank lists of applicants using composite score versus global rating differed significantly (z = 5.41, P < .001). Validity evidence supports the use of P-MEX scores to improve the reliability of the residency admissions process by improving applicant composite score reliability.
1976-03-01
pseudo-range and range rate correlations, and GDM software efficiency. Other simplifications include the elimination of all or part of the multipath...signal is available. Then the pdf parameters are trivially available by simple mean, variance, and correlation measurements on the quadrature signal...This report investigates the application of CSEL to the LES 8/9 and GPS satellite programs. In addition, a new analysis of the effects of soft and
Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data
Ying, Gui-shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard
2017-01-01
Purpose: To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. Methods: We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field data in the elderly. Results: When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI -0.03 to 0.32 D, P = 0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with a narrower 95% CI (0.01 to 0.28 D, P = 0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller P-values, while analysis of the worse eye provided larger P-values than mixed effects models and marginal models. Conclusion: In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision. PMID:28102741
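The consequence of ignoring inter-eye correlation can be checked numerically (a toy simulation, not the article's SAS analysis: for n patients contributing two eyes with inter-eye correlation rho, the variance of the overall mean is (1 + rho)/(2n), not the naive 1/(2n) that standard regression assumes):

```python
import numpy as np

rng = np.random.default_rng(3)
n, rho, sims = 200, 0.6, 2000
means = []
for _ in range(sims):
    patient = rng.standard_normal((n, 1))                 # patient effect
    eyes = (np.sqrt(rho) * patient
            + np.sqrt(1 - rho) * rng.standard_normal((n, 2)))
    means.append(eyes.mean())                             # mean over 2n eyes

empirical_var = np.var(means)
naive_var = 1.0 / (2 * n)            # what standard regression assumes
correct_var = (1 + rho) / (2 * n)    # accounting for inter-eye correlation
print(round(empirical_var / naive_var, 2))   # close to 1 + rho = 1.6
```

The empirical variance of the mean is inflated by roughly (1 + rho) relative to the naive formula, which is exactly why standard errors are underestimated and P-values too small when correlation is ignored.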
Price-volume multifractal analysis and its application in Chinese stock markets
NASA Astrophysics Data System (ADS)
Yuan, Ying; Zhuang, Xin-tian; Liu, Zhi-ying
2012-06-01
Empirical research on Chinese stock markets is conducted using statistical tools. First, the multifractality of the stock price return series, ri (ri = ln(Pt+1) - ln(Pt)), and of the trading volume variation series, vi (vi = ln(Vt+1) - ln(Vt)), is confirmed using multifractal detrended fluctuation analysis. Furthermore, a multifractal detrended cross-correlation analysis between stock price return and trading volume variation in Chinese stock markets is also conducted; the cross relationship between them is found to be multifractal as well. Second, the cross-correlation between stock price Pi and trading volume Vi is empirically studied using the cross-correlation function and detrended cross-correlation analysis. Both the Shanghai and Shenzhen stock markets show pronounced long-range cross-correlations between stock price and trading volume. Third, a composite index R based on price and trading volume is introduced. Compared with the stock price return series ri and the trading volume variation series vi, the R variation series not only retains the characteristics of the original series but also demonstrates the relative correlation between stock price and trading volume. Finally, we analyze the multifractal characteristics of the R variation series before and after three financial events in China (namely, Price Limits, the Reform of Non-tradable Shares, and the financial crisis in 2008) over the whole sample period to study the changes in stock market fluctuation and financial risk. The empirical results verify the validity of R.
Mobile Technology Application for Improved Urine Concentration Measurement Pilot Study.
Walawender, Laura; Patterson, Jeremy; Strouse, Robert; Ketz, John; Saxena, Vijay; Alexy, Emily; Schwaderer, Andrew
2018-01-01
Objectives: Low hydration has a deleterious effect on many conditions. In the absence of a urine concentrating defect, urine concentration is a marker of hydration status. However, markers to evaluate hydration status have not been well studied in children. The objectives of this paper are to compare measures of thirst and urine concentration in children and to develop a novel mobile technology application to measure urine concentration. Study Design: Children age 12-17 years were selected (n = 21) for this pilot study. Thirst perception, specific gravity (automated dipstick analysis and refractometer), and urine color scale results were correlated to urine osmolality. The technology department developed a mobile camera application to measure light penetrance into urine, which was tested on 25 random anonymized urine samples. Results: The patients' thirst perception and color scale, as well as two researchers' color scale results, did not significantly correlate with osmolality. Correlation between osmolality and hydration markers resulted in the following Pearson coefficients: SG automated dipstick, 0.61 (P = 0.003); SG refractometer, 0.98 (P < 0.0001); urine color scale (patient), 0.37 (P = 0.10); and light penetrance, -0.77 (P < 0.0001). The correlation of light penetrance with osmolality was stronger than that of all measures except SG by refractometer. Conclusion: The mobile technology application may be a more accurate tool for urine concentration measurement than specific gravity by automated dipstick, subjective thirst, and urine color scale, but lags behind specific gravity measured by refractometer. The mobile technology application is a step toward patient-oriented hydration strategies.
Wang, Xueju; Pan, Zhipeng; Fan, Feifei; ...
2015-09-10
We present an application of the digital image correlation (DIC) method to high-resolution transmission electron microscopy (HRTEM) images for nanoscale deformation analysis. The combination of DIC and HRTEM offers both the ultrahigh spatial resolution and the high displacement detection sensitivity that are not possible with other microscope-based DIC techniques. We demonstrate the accuracy and utility of the HRTEM-DIC technique through displacement and strain analysis on amorphous silicon. Two types of error sources, resulting from transmission electron microscopy (TEM) image noise and electromagnetic-lens distortions, are quantitatively investigated via rigid-body translation experiments. The local and global DIC approaches are applied for the analysis of diffusion- and reaction-induced deformation fields in electrochemically lithiated amorphous silicon. As a result, the DIC technique coupled with HRTEM provides a new avenue for the deformation analysis of materials at the nanometer length scales.
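The core DIC matching step can be sketched as follows (an integer-shift, normalized-cross-correlation toy on synthetic data; real HRTEM-DIC adds sub-pixel interpolation, lens-distortion correction, and full strain-field computation):

```python
import numpy as np

def match_subset(ref, deformed, top, left, size, search):
    # find the shift of a reference subset in the deformed image by
    # maximizing the normalized cross-correlation over integer shifts
    sub = ref[top:top + size, left:left + size].astype(float)
    sub = (sub - sub.mean()) / sub.std()
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = deformed[top + dy:top + dy + size,
                           left + dx:left + dx + size].astype(float)
            win = (win - win.mean()) / win.std()
            score = (sub * win).mean()
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift

rng = np.random.default_rng(4)
ref = rng.random((60, 60))
deformed = np.roll(ref, (2, 3), axis=(0, 1))   # rigid translation by (2, 3)
shift = match_subset(ref, deformed, 20, 20, 16, 5)
print(shift)   # recovered displacement of the subset
```

Repeating this for a grid of subsets yields the displacement field from which strains are differentiated; this is also the form of the rigid-body translation test the paper uses to quantify error sources.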
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, M; Fan, T; Duan, J
2015-06-15
Purpose: To prospectively assess the potential utility of texture analysis for differentiation of central lung cancer from atelectasis. Methods: Consecutive central lung cancer patients who were referred for CT imaging and PET-CT were enrolled. Radiotherapy physicians delineated the tumor and atelectasis on fused imaging based on the CT and PET-CT images. Texture parameters (such as energy, correlation, sum average, difference average, and difference entropy) were obtained to quantitatively discriminate tumor from atelectasis based on the gray level co-occurrence matrix (GLCM). Results: The texture analysis showed that the parameters of correlation and sum average had obvious statistical significance (P < 0.05). Conclusion: The results of this study indicate that texture analysis may be useful for the differentiation of central lung cancer from atelectasis.
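The GLCM features named above can be computed with a few lines of numpy (a minimal sketch for a single horizontal neighbor offset; the 4x4 image and gray-level count are illustrative, and only energy and correlation are shown):

```python
import numpy as np

def glcm(img, levels):
    # co-occurrence counts for the horizontal (0 deg, distance 1) offset
    m = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        m[a, b] += 1
    return m / m.sum()                    # normalize to probabilities

def glcm_features(p):
    i, j = np.indices(p.shape)
    energy = np.sum(p ** 2)               # uniformity of the matrix
    mu_i, mu_j = np.sum(i * p), np.sum(j * p)
    sd_i = np.sqrt(np.sum((i - mu_i) ** 2 * p))
    sd_j = np.sqrt(np.sum((j - mu_j) ** 2 * p))
    correlation = np.sum((i - mu_i) * (j - mu_j) * p) / (sd_i * sd_j)
    return energy, correlation

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
e, c = glcm_features(glcm(img, 4))
print(round(e, 3), round(c, 3))
```

In the study, such features computed inside the delineated tumor and atelectasis regions are then compared statistically between the two tissue classes.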
Generalized Majority Logic Criterion to Analyze the Statistical Strength of S-Boxes
NASA Astrophysics Data System (ADS)
Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan
2012-05-01
The majority logic criterion is applicable in the evaluation process of substitution boxes used in the advanced encryption standard (AES). The performance of modified or advanced substitution boxes is predicted by processing the results of statistical analysis by the majority logic criteria. In this paper, we use the majority logic criteria to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, the majority logic criterion is applied to AES, affine power affine (APA), Gray, Lui J, residue prime, S8 AES, Skipjack, and Xyi substitution boxes. The majority logic criterion is further extended into a generalized majority logic criterion which has a broader spectrum of analyzing the effectiveness of substitution boxes in image encryption applications. The integral components of the statistical analyses used for the generalized majority logic criterion are derived from results of entropy analysis, contrast analysis, correlation analysis, homogeneity analysis, energy analysis, and mean of absolute deviation (MAD) analysis.
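The majority-logic aggregation can be sketched as a vote over per-analysis verdicts (an assumed simplification: the baseline values and pass criteria below are hypothetical, and only entropy, mean absolute deviation, and standard deviation stand in for the full set of analyses in the paper):

```python
import numpy as np

def entropy(img):
    # Shannon entropy of the gray-level histogram, in bits
    p = np.bincount(img.ravel(), minlength=256) / img.size
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mad(img):
    # mean absolute deviation from the mean gray level
    return float(np.mean(np.abs(img - img.mean())))

def majority(votes):
    return sum(votes) > len(votes) / 2

rng = np.random.default_rng(5)
cipher = rng.integers(0, 256, (64, 64))    # stand-in for an encrypted image

# hypothetical reference values an acceptable S-box should reach
baseline = {"entropy": 7.9, "mad": 60.0, "std": 70.0}
stats = {"entropy": entropy(cipher), "mad": mad(cipher),
         "std": float(np.std(cipher))}
votes = [stats[k] >= baseline[k] for k in baseline]
print(majority(votes))   # S-box accepted only if most analyses pass
```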
NASA Astrophysics Data System (ADS)
Winkel, B. V.
1995-03-01
The purpose of this report is to document the Multi-Function Waste Tank Facility (MWTF) Project position on the concrete mechanical properties needed to perform design/analysis calculations for the MWTF secondary concrete structure. This report provides a position on MWTF concrete properties for the Title 1 and Title 2 calculations. The scope of the report is limited to mechanical properties and does not include the thermophysical properties of concrete needed to perform heat transfer calculations. In the 1970s, a comprehensive series of tests was performed at Construction Technology Laboratories (CTL) on two different Hanford concrete mix designs. Statistical correlations of the CTL data were later generated by Pacific Northwest Laboratories (PNL). These test results and property correlations have been utilized in various design/analysis efforts for Hanford waste tanks. However, due to changes in the concrete design mix and the lower range of MWTF operating temperatures, plus uncertainties in the CTL data and PNL correlations, it was prudent to evaluate the CTL database and PNL correlations relative to the MWTF application and develop a defensible position. The CTL test program for Hanford concrete involved two different mix designs: a 3 kip/sq in mix and a 4.5 kip/sq in mix. The proposed 28-day design strength for the MWTF tanks is 5 kip/sq in. In addition to this design strength difference, there are also differences between the CTL and MWTF mix design details. Also of interest is the appropriate application of the MWTF concrete properties in performing calculations demonstrating ACI Code compliance. Mix design details and ACI Code issues are addressed in Sections 3.0 and 5.0, respectively. The CTL test program and PNL data correlations focused on a temperature range of 250 to 450 F. The temperature range of interest for the MWTF tank concrete application is 70 to 200 F.
Enterprise systems in financial sector - an application in precious metal trading forecasting
NASA Astrophysics Data System (ADS)
Chen, Xiaozhu; Fang, Yiwei
2013-11-01
The use of enterprise systems has become increasingly popular in the financial service industry. This paper discusses the applications of enterprise systems in the financial sector and presents an application in gold price forecasting. We carefully examine the impacts of a few of the most widely assumed factors that have significant impact on the long-term gold price using statistical regression techniques. The analysis of our proposed linear regression model indicates that the ultra-large scale of the United States M2 money supply has been the most important catalyst for the rising price of gold, and that the upward trend of the CRB index has also been a weighty factor pushing up the gold price. In addition, the gold price has a low negative correlation with the Dow Jones Industrial Average, and low positive correlations with the US dollar index and the gold ETFs holdings.
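The regression setup described above can be illustrated with ordinary least squares on synthetic data. All series and coefficients below are simulated placeholders, not the paper's data or results; the point is only the mechanics of fitting a gold-price series against the four candidate drivers named in the abstract.

```python
import numpy as np

# Simulated monthly series standing in for the drivers in the abstract.
rng = np.random.default_rng(1)
n = 120                                       # e.g. ten years of monthly data
m2 = np.cumsum(rng.normal(0.5, 1, n))         # M2 money supply (trending walk)
crb = np.cumsum(rng.normal(0.3, 1, n))        # CRB commodity index
djia = rng.normal(0, 1, n)                    # Dow Jones Industrial Average
usdx = rng.normal(0, 1, n)                    # US dollar index

# A fabricated "true" relation so the fit has something to recover.
gold = 3.0 * m2 + 1.5 * crb - 0.2 * djia + 0.1 * usdx + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), m2, crb, djia, usdx])
beta, *_ = np.linalg.lstsq(X, gold, rcond=None)   # OLS coefficients
resid = gold - X @ beta
r2 = 1 - resid.var() / gold.var()                 # in-sample fit quality
```

With real data one would of course worry about nonstationarity of the trending series (spurious regression); this sketch ignores that, as the abstract does not describe the authors' treatment of it.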
The prediction of postpartum depression: The role of the PRECEDE model and health locus of control
Moshki, Mahdi; Kharazmi, Akram; Cheravi, Khadijeh; Beydokhti, Tahereh Baloochi
2015-01-01
Background: The main purpose of this study was to investigate the effect of the PRECEDE model and health locus of control (HLC) on postpartum depression. This study used path analysis to test the pattern of causal relations through the correlation coefficients. Materials and Methods: The participants included 230 pregnant women in the north-east of Iran who were selected by convenience sampling. To analyze the data, Pearson correlation and path analysis were applied to examine the relationships between variables using SPSS 20 and LISREL 8.50 software. Results: The result of path analysis showed that a positive correlation exists between predisposing (knowledge, internal HLC, powerful others HLC, chance HLC), enabling, and reinforcing factors and postpartum depression as measured by GHQ score (GFI = 1, RMSEA = 0.00). Conclusion: The current study supported the application of the PRECEDE model and HLC in understanding health-promoting behaviors in mental health and demonstrated their relationships with postpartum depression. PMID:26288792
2010-01-01
Background Glucocorticoids (GC) represent the core treatment modality for many inflammatory diseases. Their mode of action is difficult to grasp, not least because it includes direct modulation of many components of the extracellular matrix as well as complex anti-inflammatory effects. The protein expression profile of the skin changes with topical application of GC; however, knowledge about individual markers in this regard is patchy, and their collaboration is ill defined. Material/Methods Scar formation was observed under different doses of GC, which were applied locally to the back skin of mice (1 to 3 weeks). After euthanasia, we analyzed the protein expression of collagen I and III (picrosirius) in scar tissue, together with 16 additional protein markers involved in wound healing, by immunohistochemistry. To assess the effect of GC on co-expression, we compared our results with a model of random figures to estimate how many significant correlations should be expected by chance. Results GC altered collagen and protein expression, with distinct results in different areas of investigation. Most often we observed reduced expression after application of low-dose GC. In the scar infiltrate, a multivariate analysis confirmed the significant impact of both GC concentrations. Calculation of Spearman's correlation coefficient similarly showed a significant impact of GC and, furthermore, offered the possibility to grasp the entire interactive profile among all variables studied. The biological markers connected by significant correlations could be arranged in a highly cross-linked network that involved most of the markers measured. A marker highly cross-linked by more than 3 significant correlations was indicated by a higher variation of its correlations with the other variables, resulting in a standard deviation of > 0.2.
Conclusion In addition to immunohistochemical analysis of single protein markers multivariate analysis of co-expressions by use of correlation coefficients reveals the complexity of biological relationships and identifies complex biological effects of GC on skin scarring. Depiction of collaborative clusters will help to understand functional pathways. The functional importance of highly cross-linked proteins will have to be proven in subsequent studies. PMID:20509951
Wang, Hong-Qiang; Tsai, Chung-Jui
2013-01-01
With the rapid increase of omics data, correlation analysis has become an indispensable tool for inferring meaningful associations from a large number of observations. Pearson correlation coefficient (PCC) and its variants are widely used for such purposes. However, it remains challenging to test whether an observed association is reliable both statistically and biologically. We present here a new method, CorSig, for statistical inference of correlation significance. CorSig is based on a biology-informed null hypothesis, i.e., testing whether the true PCC (ρ) between two variables is statistically larger than a user-specified PCC cutoff (τ), as opposed to the simple null hypothesis of ρ = 0 in existing methods, i.e., testing whether an association can be declared without a threshold. CorSig incorporates Fisher's Z transformation of the observed PCC (r), which facilitates use of standard techniques for p-value computation and multiple testing corrections. We compared CorSig against two methods: one uses a minimum PCC cutoff while the other (Zhu's procedure) controls correlation strength and statistical significance in two discrete steps. CorSig consistently outperformed these methods in various simulation data scenarios by balancing between false positives and false negatives. When tested on real-world Populus microarray data, CorSig effectively identified co-expressed genes in the flavonoid pathway, and discriminated between closely related gene family members for their differential association with flavonoid and lignin pathways. The p-values obtained by CorSig can be used as a stand-alone parameter for stratification of co-expressed genes according to their correlation strength in lieu of an arbitrary cutoff. CorSig requires one single tunable parameter, and can be readily extended to other correlation measures. Thus, CorSig should be useful for a wide range of applications, particularly for network analysis of high-dimensional genomic data. 
A web server for CorSig is provided at http://202.127.200.1:8080/probeWeb. R code for CorSig is freely available for non-commercial use at http://aspendb.uga.edu/downloads.
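The biology-informed null hypothesis at the heart of the method lends itself to a compact sketch. The function below is an assumption-laden reconstruction of the idea (Fisher's Z transform of the observed PCC, one-sided normal test of H0: ρ ≤ τ), not the published CorSig code; the variance approximation 1/(n − 3) is the standard large-sample result for Fisher's Z.

```python
import math

def corsig_pvalue(r, n, tau=0.0):
    """One-sided p-value for H0: rho <= tau versus H1: rho > tau,
    via Fisher's Z transform of the observed PCC r from n samples.
    With tau = 0 this reduces to the simple null of no correlation."""
    z = 0.5 * math.log((1 + r) / (1 - r))        # atanh(r)
    z0 = 0.5 * math.log((1 + tau) / (1 - tau))   # atanh(tau)
    stat = (z - z0) * math.sqrt(n - 3)           # ~ N(0,1) at the H0 boundary
    return 0.5 * math.erfc(stat / math.sqrt(2))  # upper-tail normal p-value
```

For example, an observed r = 0.9 over n = 50 samples is significantly larger than a cutoff τ = 0.5, while r = 0.3 is not; the resulting p-values could then be fed to any standard multiple-testing correction.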
The Correlation Between Dislocations and Vacancy Defects Using Positron Annihilation Spectroscopy
NASA Astrophysics Data System (ADS)
Pang, Jinbiao; Li, Hui; Zhou, Kai; Wang, Zhu
2012-07-01
An analysis program for positron annihilation lifetime spectra is only applicable to isolated defects, but is of no use in the presence of defect correlations. Such limitations have long caused problems for positron researchers in their studies of complicated defect systems. To address this problem, we take a semiconductor material as an example, obtaining a credible average lifetime of single-crystal silicon under plastic deformation at different temperatures using positron lifetime spectroscopy. By establishing reasonable positron trapping models with defect correlations and sorting out four lifetime components with multiple parameters, as well as their respective intensities, information is obtained on the positron trapping centers, such as the positron trapping rates of defects, the density of the dislocation lines, and the correlation between the dislocation lines and the vacancy defects, by fitting the average lifetime with the aid of Matlab software. These results give strong grounds for the existence of a dislocation-vacancy correlation in plastically deformed silicon, and lay a theoretical foundation for the analysis of positron lifetime spectra when the positron trapping model involves dislocation-related defects.
Charlton, Paula C; Mentiplay, Benjamin F; Pua, Yong-Hao; Clark, Ross A
2015-05-01
Traditional methods of assessing joint range of motion (ROM) involve specialized tools that may not be widely available to clinicians. This study assesses the reliability and validity of a custom Smartphone application for assessing hip joint range of motion. Intra-tester reliability with concurrent validity. Passive hip joint range of motion was recorded for seven different movements in 20 males on two separate occasions. Data from a Smartphone, bubble inclinometer and a three dimensional motion analysis (3DMA) system were collected simultaneously. Intraclass correlation coefficients (ICCs), coefficients of variation (CV) and standard error of measurement (SEM) were used to assess reliability. To assess validity of the Smartphone application and the bubble inclinometer against the three dimensional motion analysis system, intraclass correlation coefficients and fixed and proportional biases were used. The Smartphone demonstrated good to excellent reliability (ICCs>0.75) for four out of the seven movements, and moderate to good reliability for the remaining three movements (ICC=0.63-0.68). Additionally, the Smartphone application displayed comparable reliability to the bubble inclinometer. The Smartphone application displayed excellent validity when compared to the three dimensional motion analysis system for all movements (ICCs>0.88) except one, which displayed moderate to good validity (ICC=0.71). Smartphones are portable and widely available tools that are mostly reliable and valid for assessing passive hip range of motion, with potential for large-scale use when a bubble inclinometer is not available. However, caution must be taken in its implementation as some movement axes demonstrated only moderate reliability. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
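The intraclass correlation coefficients reported above can be computed from a subjects-by-raters (or subjects-by-sessions) score matrix. The sketch below uses the generic Shrout-Fleiss two-way random, absolute-agreement, single-measure formula, ICC(2,1); the study's exact ICC model and software are not stated in the abstract, so treat this as an illustrative assumption.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    `scores` is an (n subjects x k raters/sessions) array."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)              # per-subject means
    col_means = scores.mean(axis=0)              # per-rater means
    ssr = k * np.sum((row_means - grand) ** 2)   # between-subjects
    ssc = n * np.sum((col_means - grand) ** 2)   # between-raters
    sse = np.sum((scores - row_means[:, None]
                  - col_means[None, :] + grand) ** 2)  # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Identical measurements across sessions give an ICC of 1; independent measurements give a value near 0, matching the abstract's reading of ICC > 0.75 as good-to-excellent reliability.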
Teodoro, George; Kurc, Tahsin; Andrade, Guilherme; Kong, Jun; Ferreira, Renato; Saltz, Joel
2015-01-01
We carry out a comparative performance study of multi-core CPUs, GPUs and Intel Xeon Phi (Many Integrated Core-MIC) with a microscopy image analysis application. We experimentally evaluate the performance of computing devices on core operations of the application. We correlate the observed performance with the characteristics of computing devices and data access patterns, computation complexities, and parallelization forms of the operations. The results show a significant variability in the performance of operations with respect to the device used. The performances of operations with regular data access are comparable or sometimes better on a MIC than that on a GPU. GPUs are more efficient than MICs for operations that access data irregularly, because of the lower bandwidth of the MIC for random data accesses. We propose new performance-aware scheduling strategies that consider variabilities in operation speedups. Our scheduling strategies significantly improve application performance compared to classic strategies in hybrid configurations. PMID:28239253
Principles of operation and data reduction techniques for the loft drag disc turbine transducer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silverman, S.
An analysis of the single- and two-phase flow data applicable to the loss-of-fluid test (LOFT) is presented for the LOFT drag turbine transducer. Analytical models which were employed to correlate the experimental data are presented.
Application of Monte Carlo algorithms to the Bayesian analysis of the Cosmic Microwave Background
NASA Technical Reports Server (NTRS)
Jewell, J.; Levin, S.; Anderson, C. H.
2004-01-01
Power spectrum estimation and evaluation of associated errors in the presence of incomplete sky coverage; nonhomogeneous, correlated instrumental noise; and foreground emission are problems of central importance for the extraction of cosmological information from the cosmic microwave background (CMB).
An alternative approach to measure similarity between two deterministic transient signals
NASA Astrophysics Data System (ADS)
Shin, Kihong
2016-06-01
In many practical engineering applications, it is often required to measure the similarity of two signals to gain insight into the conditions of a system. For example, an application that monitors machinery can regularly measure the vibration signal and compare it to a healthy reference signal in order to monitor whether or not any fault symptom is developing. Also, in modal analysis, a frequency response function (FRF) from a finite element model (FEM) is often compared with an FRF from experimental modal analysis. Many different similarity measures are applicable in such cases; among these, correlation-based measures, such as the correlation coefficient in the time domain and the frequency response assurance criterion (FRAC) in the frequency domain, may be the most frequently used. Although correlation-based similarity measures may be particularly useful for random signals because they are based on probability and statistics, we frequently deal with signals that are largely deterministic and transient. Thus, it may be useful to develop another similarity measure that takes the characteristics of the deterministic transient signal properly into account. In this paper, an alternative approach to measure the similarity between two deterministic transient signals is proposed. This newly proposed similarity measure is based on the fictitious system frequency response function, and it consists of the magnitude similarity and the shape similarity. Finally, a few examples are presented to demonstrate the use of the proposed similarity measure.
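The FRAC mentioned above has a standard closed form: the squared magnitude of the complex inner product of two FRFs, normalized by their energies, so that identically shaped FRFs score 1 regardless of scale. A minimal sketch (the frequency grid is assumed common to both FRFs):

```python
import numpy as np

def frac(h1, h2):
    """Frequency Response Assurance Criterion between two FRFs sampled on
    the same frequency grid (complex arrays).  Returns a value in [0, 1];
    1 means identical shape up to a complex scale factor."""
    num = np.abs(np.vdot(h1, h2)) ** 2           # vdot conjugates h1
    return num / (np.vdot(h1, h1).real * np.vdot(h2, h2).real)
```

Scale invariance is exactly the property the abstract's proposed measure revisits: FRAC captures shape similarity but discards magnitude, which is why the authors pair a magnitude similarity with a shape similarity.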
Jürgens, Julian H W; Schulz, Nadine; Wybranski, Christian; Seidensticker, Max; Streit, Sebastian; Brauner, Jan; Wohlgemuth, Walter A; Deuerling-Zheng, Yu; Ricke, Jens; Dudeck, Oliver
2015-02-01
The objective of this study was to compare the parameter maps of a new flat-panel detector application for time-resolved perfusion imaging in the angiography room (FD-CTP) with computed tomography perfusion (CTP) in an experimental tumor model. Twenty-four VX2 tumors were implanted into the hind legs of 12 rabbits. Three weeks later, FD-CTP (Artis zeego; Siemens) and CTP (SOMATOM Definition AS +; Siemens) were performed. The parameter maps for the FD-CTP were calculated using a prototype software, and those for the CTP were calculated with VPCT-body software on a dedicated syngo MultiModality Workplace. The parameters were compared using Pearson product-moment correlation coefficient and linear regression analysis. The Pearson product-moment correlation coefficient showed good correlation values for both the intratumoral blood volume of 0.848 (P < 0.01) and the blood flow of 0.698 (P < 0.01). The linear regression analysis of the perfusion between FD-CTP and CTP showed for the blood volume a regression equation y = 4.44x + 36.72 (P < 0.01) and for the blood flow y = 0.75x + 14.61 (P < 0.01). This preclinical study provides evidence that FD-CTP allows a time-resolved (dynamic) perfusion imaging of tumors similar to CTP, which provides the basis for clinical applications such as the assessment of tumor response to locoregional therapies directly in the angiography suite.
An optical/digital processor - Hardware and applications
NASA Technical Reports Server (NTRS)
Casasent, D.; Sterling, W. M.
1975-01-01
A real-time two-dimensional hybrid processor consisting of a coherent optical system, an optical/digital interface, and a PDP-11/15 control minicomputer is described. The input electrical-to-optical transducer is an electron-beam addressed potassium dideuterium phosphate (KD2PO4) light valve. The requirements and hardware for the output optical-to-digital interface, which is constructed from modular computer building blocks, are presented. Initial experimental results demonstrating the operation of this hybrid processor in phased-array radar data processing, synthetic-aperture image correlation, and text correlation are included. The applications chosen emphasize the role of the interface in the analysis of data from an optical processor and possible extensions to the digital feedback control of an optical processor.
Boschetti, Lucio; Ottavian, Matteo; Facco, Pierantonio; Barolo, Massimiliano; Serva, Lorenzo; Balzan, Stefania; Novelli, Enrico
2013-11-01
The use of near-infrared spectroscopy (NIRS) is proposed in this study for the characterization of the quality parameters of a smoked and dry-cured meat product known as Bauernspeck (originally from Northern Italy), as well as of some technological traits of the pork carcass used for its manufacturing. In particular, NIRS is shown to successfully estimate several key quality parameters (including water activity, moisture, dry matter, ash and protein content), suggesting its suitability for real time application in replacement of expensive and time consuming chemical analysis. Furthermore, a correlative approach based on canonical correlation analysis was used to investigate the spectral regions that are mostly correlated to the characteristics of interest. The identification of these regions, which can be linked to the absorbance of the main functional chemical groups, is intended to provide a better understanding of the chemical structure of the substrate under investigation. Copyright © 2013 Elsevier Ltd. All rights reserved.
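Canonical correlation analysis, as used above to link spectral regions with quality traits, finds maximally correlated linear combinations of two variable blocks. The sketch below computes the canonical correlations via SVD of the whitened cross-covariance; it is a generic textbook construction, not the authors' software, and the ridge term `reg` is an added numerical safeguard for near-singular covariance blocks (as with collinear NIR wavelengths).

```python
import numpy as np

def canonical_correlations(X, Y, reg=1e-8):
    """Canonical correlations between data blocks X (n x p) and Y (n x q),
    returned in descending order."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = Xc.T @ Xc / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / (n - 1)

    def inv_sqrt(S):
        # inverse matrix square root via eigendecomposition (S is PSD + ridge)
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)      # whitened cross-covariance
    return np.linalg.svd(K, compute_uv=False)    # singular values = canonical corrs
```

In the spectroscopic setting, X would hold absorbances at selected wavelengths and Y the measured quality parameters; large leading canonical correlations flag spectral regions informative about the traits.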
A new class of random processes with application to helicopter noise
NASA Technical Reports Server (NTRS)
Hardin, Jay C.; Miamee, A. G.
1989-01-01
The concept of dividing random processes into classes (e.g., stationary, locally stationary, periodically correlated, and harmonizable) has long been employed. A new class of random processes is introduced which includes many of these processes as well as other interesting processes which fall into none of the above classes. Such random processes are denoted as linearly correlated. This class is shown to include the familiar stationary and periodically correlated processes as well as many other, both harmonizable and non-harmonizable, nonstationary processes. When a process is linearly correlated for all t and harmonizable, its two-dimensional power spectral density S_x(omega_1, omega_2) is shown to take a particularly simple form, being non-zero only on lines such that omega_1 - omega_2 = +/- r_k, where the r_k are (not necessarily equally spaced) roots of a characteristic function. The relationship of such processes to the class of stationary processes is examined. In addition, the application of such processes in the analysis of typical helicopter noise signals is described.
High Speed Research (HSR) Multi-Year Summary Report for Calendar Years 1995-1999
NASA Technical Reports Server (NTRS)
Baker, Myles; Boyd, William
1999-01-01
The Aeroelasticity Task is intended to provide demonstrated technology readiness to predict and improve flutter characteristics of an HSCT configuration. This requires aerodynamic codes that are applicable to the wide range of flight regimes in which the HSCT will operate, and are suitable to provide the higher fidelity required for evaluation of aeroservoelastic coupling effects. Prediction of these characteristics will result in reduced airplane weight and risk associated with a highly flexible, low-aspect-ratio supersonic airplane with a narrow fuselage, relatively thin wings, and heavy engines. This Task is subdivided into three subtasks. The first subtask includes the design, fabrication, and testing of wind-tunnel models suitable to provide an experimental database relevant to HSCT configurations. The second subtask includes validation of candidate unsteady aerodynamic codes, applicable in the Mach and frequency ranges of interest for the HSCT, through analysis/test correlation with the test data. The third subtask includes efforts to develop and enhance these codes for application to HSCT configurations. The wind tunnel models designed and constructed during this program furnished data which were useful for the analysis/test correlation work, but there were shortcomings. There was initial uncertainty in the proper tunnel configuration for testing, there was a need for higher quality measured model geometry, and there was a need for better measured model displacements in the test data. One of the models exhibited changes in its dynamic characteristics during testing. Model design efforts were hampered by a need for more and earlier analysis support and better knowledge of material properties. Success of the analysis/test correlation work was somewhat muted by the uncertainties in the wind tunnel model data.
The planned extent of the test data was not achieved, partly due to the delays in the model design and fabrication which could not be extended due to termination of the HSR program.
NASA Technical Reports Server (NTRS)
Gabel, R.; Lang, P. F.; Smith, L. A.; Reed, D. A.
1989-01-01
Boeing Helicopter, together with other United States helicopter manufacturers, participated in a finite element applications program to emplace in the United States a superior capability to utilize finite element analysis models in support of helicopter airframe design. The activities relating to planning and creating a finite element vibrations model of the Boeing Model 36-0 composite airframe are summarized, along with the subsequent analytical correlation with ground shake test data.
Dirac points, spinons and spin liquid in twisted bilayer graphene
NASA Astrophysics Data System (ADS)
Irkhin, V. Yu.; Skryabin, Yu. N.
2018-05-01
Twisted bilayer graphene is an excellent example of a highly correlated system, demonstrating a nearly flat electron band, the Mott transition, and probably a spin liquid state. Besides the one-electron picture, an analysis of the Dirac points is performed in terms of the spinon Fermi surface in the limit of strong correlations. The application of gauge field theory to describe the deconfined spin liquid phase is treated. Topological quantum transitions, including those from a small to a large Fermi surface in the presence of van Hove singularities, are discussed.
A Flight Prediction for Performance of the SWAS Solar Array Deployment Mechanism
NASA Technical Reports Server (NTRS)
Seniderman, Gary; Daniel, Walter K.
1999-01-01
The focus of this paper is a comparison of ground-based solar array deployment tests with the on-orbit deployment. The discussion includes a summary of the mechanisms involved and the correlation of a dynamics model with ground based test results. Some of the unique characteristics of the mechanisms are explained through the analysis of force and angle data acquired from the test deployments. The correlated dynamics model is then used to predict the performance of the system in its flight application.
Polarization-direction correlation measurement --- Experimental test of the PDCO methods
NASA Astrophysics Data System (ADS)
Starosta, K.; Morek, T.; Droste, Ch.; Rohoziński, S. G.; Srebrny, J.; Bergstrem, M.; Herskind, B.
1998-04-01
Information about spins and parities of excited states is crucial for nuclear structure studies. In "in-beam" gamma ray spectroscopy, directional correlation (DCO) or angular distribution measurements are widely used tools for multipolarity assignment, although it is known that neither of these methods is sensitive to the electric or magnetic character of gamma radiation. The multipolarity of gamma rays may be determined when the results of the DCO analysis are combined with the results of linear polarization measurements. The large total efficiency of modern multidetector arrays allows one to carry out coincidence measurements between the polarimeter and the remaining detectors. The aim of the present study was to test experimentally the possibility of polarization-direction correlation measurements using the EUROGAM II array. The studied nucleus was ^164Yb produced in the ^138Ba(^30Si,4n) reaction at beam energies of 150 and 155 MeV. The angular correlation, linear polarization and direction-polarization correlation were measured for the strong transitions in yrast and non-yrast cascades. Application of the PDCO analysis to a transition connecting a side band with the yrast band allowed one to rule out most of the ambiguities in multipolarity assignment occurring when only angular correlations are used.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Madhavi Z; Labbe, Nicole; Wagner, Rebekah J.
2013-01-01
This chapter details the application of LIBS in a number of environmental areas of research, such as carbon sequestration and climate change. LIBS has also been shown to be useful in other high-resolution environmental applications, for example elemental mapping and the detection of metals in plant materials, as well as in phytoremediation. Other biological research involves a detailed understanding of wood chemistry response to precipitation variations and to forest fires. A cross-section of mountain pine (Pinaceae: Pinus pungens Lamb.) was scanned using a translational stage to determine the differences in chemical features both before and after a fire event. Consequently, by monitoring the elemental composition pattern of a tree and looking for abrupt changes, one can reconstruct the disturbance history of a tree and a forest. Lastly, we have shown that multivariate analysis of the LIBS data is necessary to standardize the analysis and correlate it to other standard laboratory techniques. LIBS combined with multivariate statistical analysis is a very powerful technology that can be transferred from laboratory to field applications with ease.
Interplay between past market correlation structure changes and future volatility outbursts.
Musmeci, Nicoló; Aste, Tomaso; Di Matteo, T
2016-11-18
We report significant relations between past changes in the market correlation structure and future changes in the market volatility. This relation is made evident by using a measure of "correlation structure persistence" on correlation-based information filtering networks that quantifies the rate of change of the market dependence structure. We also measured changes in the correlation structure by means of a "metacorrelation" that measures a lagged correlation between correlation matrices computed over different time windows. Both methods show a deep interplay between past changes in correlation structure and future changes in volatility and we demonstrate they can anticipate market risk variations and this can be used to better forecast portfolio risk. Notably, these methods overcome the curse of dimensionality that limits the applicability of traditional econometric tools to portfolios made of a large number of assets. We report on forecasting performances and statistical significance of both methods for two different equity datasets. We also identify an optimal region of parameters in terms of True Positive and False Positive trade-off, through a ROC curve analysis. We find that this forecasting method is robust and it outperforms logistic regression predictors based on past volatility only. Moreover the temporal analysis indicates that methods based on correlation structural persistence are able to adapt to abrupt changes in the market, such as financial crises, more rapidly than methods based on past volatility.
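The "metacorrelation" described above can be sketched directly: estimate a correlation matrix on each time window, then compute the Pearson correlation between the vectorized off-diagonal entries of matrices a fixed lag apart. The window length and lag below are arbitrary illustrative choices, not the paper's calibration, and non-overlapping windows are an added simplification.

```python
import numpy as np

def metacorrelation(returns, window=100, lag=1):
    """Lagged correlation between correlation matrices estimated on
    successive non-overlapping windows.  `returns` is (n_obs x n_assets);
    returns one metacorrelation per consecutive (C_t, C_{t+lag}) pair."""
    n_obs, n_assets = returns.shape
    iu = np.triu_indices(n_assets, k=1)          # off-diagonal entries only
    mats = [np.corrcoef(returns[s:s + window].T)[iu]
            for s in range(0, n_obs - window + 1, window)]
    return [float(np.corrcoef(a, b)[0, 1]) for a, b in zip(mats, mats[lag:])]
```

A market with a persistent factor structure yields metacorrelations near 1 (the dependence structure barely changes between windows), whereas independent returns yield values near 0; drops in this quantity are what the abstract links to future volatility outbursts.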
NASA Technical Reports Server (NTRS)
Sopher, R.; Twomey, W. J.
1990-01-01
NASA-Langley is sponsoring a rotorcraft structural dynamics program with the objective of establishing in the U.S. a superior capability to utilize finite element analysis models for calculations to support industrial design of helicopter airframe structures. In the initial phase of the program, teams from the major U.S. manufacturers of helicopter airframes will apply extant finite element analysis methods to calculate loads and vibrations of helicopter airframes, and perform correlations between analysis and measurements. This rotorcraft structural dynamics program was given the acronym DAMVIBS (Design Analysis Method for Vibrations). Sikorsky's RDYNE Rotorcraft Dynamics Analysis used for the correlation study, the specifics of the application of RDYNE to the AH-1G, and comparisons of the predictions of the method with flight data for loads and vibrations on the AH-1G are described. RDYNE was able to predict trends of variations of loads and vibrations with airspeed, but in some instances magnitudes differed from measured results by factors of two or three to one. Sensitivities of predictions to rotor inflow modeling, effects of torsional modes, number of blade bending modes, fuselage structural damping, and hub modal content were studied.
Das, Atanu; Mukhopadhyay, Chaitali
2007-10-28
We have performed molecular dynamics (MD) simulation of the thermal denaturation of one protein and one peptide, ubiquitin and melittin. To identify the correlation in dynamics among various secondary structural fragments, and the individual contributions of different residues towards thermal unfolding, the principal component analysis method was applied, giving new insight into protein dynamics by analyzing the contributions of the coefficients of the principal components. The cross-correlation matrix obtained from the MD simulation trajectory provided important information regarding the anisotropy of the backbone dynamics that leads to unfolding. Unfolding of ubiquitin was found to be a three-state process, while that of melittin, though smaller and mostly helical, is more complicated.
NASA Astrophysics Data System (ADS)
Das, Atanu; Mukhopadhyay, Chaitali
2007-10-01
We have performed molecular dynamics (MD) simulation of the thermal denaturation of one protein and one peptide—ubiquitin and melittin. To identify the correlation in dynamics among various secondary structural fragments, and the individual contributions of different residues towards thermal unfolding, the principal component analysis method was applied, giving new insight into protein dynamics by analyzing the contributions of the coefficients of the principal components. The cross-correlation matrix obtained from the MD simulation trajectory provided important information regarding the anisotropy of the backbone dynamics that leads to unfolding. Unfolding of ubiquitin was found to be a three-state process, while that of melittin, though smaller and mostly helical, is more complicated.
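The PCA step described above, eigendecomposition of the cross-correlation matrix of residue fluctuations, can be sketched as follows. The synthetic "trajectory" with one collective mode is a stand-in assumption, not actual MD output.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical stand-in for MD output: fluctuations of 20 residues over
# 1000 frames, sharing one collective mode of decaying amplitude.
mode = rng.normal(size=(1000, 1)) * np.linspace(1.0, 0.1, 20)
traj = mode + 0.3 * rng.normal(size=(1000, 20))

c = np.corrcoef(traj, rowvar=False)             # cross-correlation matrix

evals, evecs = np.linalg.eigh(c)                # PCA of the correlation matrix
order = np.argsort(evals)[::-1]                 # sort eigenpairs descending
evals, evecs = evals[order], evecs[:, order]

pc1_share = evals[0] / evals.sum()              # fluctuation captured by PC1
```

The coefficients of `evecs[:, 0]` play the role the abstract assigns to principal-component coefficients: they rank how strongly each residue participates in the dominant collective motion.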
Scanning fluorescence correlation spectroscopy comes full circle.
Gunther, German; Jameson, David M; Aguilar, Joao; Sánchez, Susana A
2018-02-07
In this article, we review the application of fluorescence correlation spectroscopy (FCS) methods to studies on live cells. We begin with a brief overview of the theory underlying FCS, highlighting the type of information obtainable. We then focus on circular scanning FCS. Specifically, we discuss instrumentation and data analysis and offer some considerations regarding sample preparation. Two examples from the literature are discussed in detail. First, we show how this method, coupled with the photon counting histogram analysis, can provide information on yeast ribosomal structures in live cells. The combination of scanning FCS with dual channel detection in the study of lipid domains in live cells is also illustrated.
Astrophysical data analysis with information field theory
NASA Astrophysics Data System (ADS)
Enßlin, Torsten
2014-12-01
Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
NASA Astrophysics Data System (ADS)
Gnyp, Andriy
2009-06-01
Based on the results of applying correlation analysis to records of the 2005 Mukacheve group of recurrent events and their subsequent relocation relative to the reference event of 7 July 2005, a conclusion has been drawn that all the events had most likely occurred on the same rupture plane. Station terms have been estimated for seismic stations of the Transcarpathians, accounting for variation of seismic velocities beneath their locations as compared to the travel time tables used in the study. From a methodological standpoint, the potential and usefulness of correlation analysis of seismic records for a more detailed study of seismic processes, tectonics, and geodynamics of the Carpathian region have been demonstrated.
The Use of Citation Counting to Identify Research Trends
ERIC Educational Resources Information Center
Rothman, Harry; Woodhead, Michael
1971-01-01
The analysis and application of manpower statistics to identify some long-term international research trends in economic entomology and pest control are described. Movements in research interests, particularly towards biological methods of control, correlations between these sectors, and the difficulties encountered in the construction of a…
Applicability of Berry's index in bite mark analysis
Antony, Palathottungal Joseph; Pillai, Karthigakannan Subramanian; George, Giju Baby; Varghese, Thomas; Puthalath, Mohammed Shibin; Arakkal, Leena Johnson
2015-01-01
Objectives: This study attempts to highlight the usefulness of applying Berry's Index as an adjuvant to support and aid in bite analysis. Materials and Methods: This study was conducted among 100 students aged 18–30 from Mar Baselios Dental College, Kothamangalam. Of the 100 subjects, 50 were male and 50 female. The data obtained were tabulated and analyzed using the Statistical Package for the Social Sciences, Version 16 (SPSS). Results: The mean width of the upper central incisor was 0.7602 cm for males and 0.7765 cm for females. The mean bizygomatic width was 12.54 cm for males and 12.42 cm for females. The upper central incisor width and the bizygomatic width showed a good positive correlation, with a value of 0.613. The Pearson correlation coefficient indicated a stronger correlation between the upper central incisor width and the bizygomatic width in female subjects (r = 0.678) than in male subjects (r = 0.525). Conclusion: Berry's Index can be a useful adjuvant to bite analysis by providing a means of determining the facial proportions of an individual from the width of the central incisors. PMID:25709316
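The correlation reported in the abstract above is a plain Pearson product-moment coefficient; a self-contained sketch, with hypothetical measurements rather than the study's data:

```python
import math

def pearson_r(x, y):
    """Plain Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical measurements (cm): upper central incisor width vs bizygomatic width
incisor = [0.74, 0.76, 0.78, 0.75, 0.80, 0.77]
bizygomatic = [12.3, 12.5, 12.7, 12.4, 12.9, 12.5]
r = pearson_r(incisor, bizygomatic)
```

A positive r of this kind is what lets the index infer facial proportions from incisor width alone.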
Correa, Nicolle M; Li, Yi-Ou; Adalı, Tülay; Calhoun, Vince D
2008-12-01
Typically data acquired through imaging techniques such as functional magnetic resonance imaging (fMRI), structural MRI (sMRI), and electroencephalography (EEG) are analyzed separately. However, fusing information from such complementary modalities promises to provide additional insight into connectivity across brain networks and changes due to disease. We propose a data fusion scheme at the feature level using canonical correlation analysis (CCA) to determine inter-subject covariations across modalities. As we show both with simulation results and application to real data, multimodal CCA (mCCA) proves to be a flexible and powerful method for discovering associations among various data types. We demonstrate the versatility of the method with application to two datasets, an fMRI and EEG, and an fMRI and sMRI dataset, both collected from patients diagnosed with schizophrenia and healthy controls. CCA results for fMRI and EEG data collected for an auditory oddball task reveal associations of the temporal and motor areas with the N2 and P3 peaks. For the application to fMRI and sMRI data collected for an auditory sensorimotor task, CCA results show an interesting joint relationship between fMRI and gray matter, with patients with schizophrenia showing more functional activity in motor areas and less activity in temporal areas associated with less gray matter as compared to healthy controls. Additionally, we compare our scheme with an independent component analysis based fusion method, joint-ICA that has proven useful for such a study and note that the two methods provide complementary perspectives on data fusion.
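A compact sketch of classical CCA as used for feature-level fusion: whiten each feature set and take the SVD of the whitened cross-covariance; the singular values are the canonical correlations. The two feature sets sharing a latent source are an illustrative assumption, not the fMRI/EEG data.

```python
import numpy as np

def cca_first_correlation(x, y, reg=1e-6):
    """First canonical correlation between two feature sets
    (rows = subjects, columns = features), via whitening + SVD."""
    x = x - x.mean(axis=0)
    y = y - y.mean(axis=0)
    n = x.shape[0]
    cxx = x.T @ x / (n - 1) + reg * np.eye(x.shape[1])
    cyy = y.T @ y / (n - 1) + reg * np.eye(y.shape[1])
    cxy = x.T @ y / (n - 1)
    wx = np.linalg.inv(np.linalg.cholesky(cxx))   # whitener for x
    wy = np.linalg.inv(np.linalg.cholesky(cyy))   # whitener for y
    s = np.linalg.svd(wx @ cxy @ wy.T, compute_uv=False)
    return s[0]                                   # largest canonical correlation

rng = np.random.default_rng(2)
z = rng.normal(size=(300, 1))                     # shared latent source
x = z @ rng.normal(size=(1, 5)) + 0.5 * rng.normal(size=(300, 5))
y = z @ rng.normal(size=(1, 4)) + 0.5 * rng.normal(size=(300, 4))
rho = cca_first_correlation(x, y)
```

In the fusion setting of the abstract, the rows would be subjects and the columns per-modality features, so a large first canonical correlation flags an inter-subject covariation shared across modalities.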
NASA Astrophysics Data System (ADS)
Nakahara, Hisashi
2015-02-01
For monitoring temporal changes in subsurface structures I propose to use auto correlation functions of coda waves from local earthquakes recorded at surface receivers, which probably contain more body waves than surface waves. Use of coda waves requires earthquakes, resulting in decreased time resolution for monitoring. Nonetheless, it may be possible to monitor subsurface structures with sufficient time resolution in regions with high seismicity. In studying the 2011 Tohoku-Oki, Japan earthquake (Mw 9.0), for which velocity changes have been previously reported, I try to validate the method. KiK-net stations in northern Honshu are used in this analysis. For each moderate earthquake, normalized auto correlation functions of surface records are stacked with respect to time windows in the S-wave coda. Aligning the stacked, normalized auto correlation functions with time, I search for changes in phase arrival times. The phases at lag times of <1 s are studied because changes at shallow depths are focused on. Temporal variations in the arrival times are measured at the stations based on the stretching method. Clear phase delays are found to be associated with the mainshock and to gradually recover with time. The phase delays are 10% on average, with a maximum of about 50% at some stations. The deconvolution analysis using surface and subsurface records at the same stations is conducted for validation. The results show the phase delays from the deconvolution analysis are slightly smaller than those from the auto correlation analysis, which implies that the phases on the auto correlations are caused by larger velocity changes at shallower depths. The auto correlation analysis seems to have an accuracy of about several percent, much coarser than that of methods using earthquake doublets and borehole array data, so it might be applicable only to detecting larger changes. In spite of this disadvantage, the analysis is still attractive because it can be applied to many records on the surface in regions where no boreholes are available.
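The stretching method mentioned in the abstract can be sketched as a grid search over a relative stretch factor applied to a reference trace; the synthetic decaying-coda waveform and the 2% stretch below are illustrative assumptions.

```python
import numpy as np

def stretching_factor(reference, perturbed, t, eps_grid):
    """Grid-search the stretch eps that best maps reference(t*(1+eps))
    onto the perturbed trace; eps estimates the relative phase delay."""
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        candidate = np.interp(t * (1.0 + eps), t, reference)
        cc = np.corrcoef(perturbed, candidate)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc

t = np.linspace(0.0, 10.0, 2000)
reference = np.exp(-0.2 * t) * np.sin(2 * np.pi * 1.3 * t)   # coda-like trace
perturbed = np.interp(t * 1.02, t, reference)                # 2% phase delay
best_eps, best_cc = stretching_factor(reference, perturbed, t,
                                      np.linspace(-0.05, 0.05, 201))
```

A uniform stretch of the lag axis corresponds to a homogeneous relative velocity change, which is why the recovered `best_eps` is read as dv/v in monitoring studies.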
NASA Astrophysics Data System (ADS)
Florindo, João Batista
2018-04-01
This work proposes the use of Singular Spectrum Analysis (SSA) for the classification of texture images, more specifically, to enhance the performance of the Bouligand-Minkowski fractal descriptors in this task. Fractal descriptors are known to be a powerful approach to model and particularly identify complex patterns in natural images. Nevertheless, the multiscale analysis involved in those descriptors makes them highly correlated. Although other attempts to address this point were proposed in the literature, none of them investigated the relation between the fractal correlation and the well-established analyses employed for time series, of which SSA is one of the most powerful techniques. The proposed method was employed for the classification of benchmark texture images and the results were compared with those of other state-of-the-art classifiers, confirming the potential of this analysis in image classification.
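A bare-bones SSA sketch of the kind the abstract builds on, for a 1-D series: embed into a Hankel trajectory matrix, truncate its SVD, and diagonal-average back. Window length, rank, and the noisy sine are illustrative choices, not the paper's descriptor pipeline.

```python
import numpy as np

def ssa_reconstruct(series, window, rank):
    """Basic SSA: embed the series into a Hankel trajectory matrix, keep
    the leading `rank` singular components, and diagonal-average back."""
    n = len(series)
    k = n - window + 1
    traj = np.column_stack([series[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank]
    # Anti-diagonal averaging back to a 1-D series
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return rec / counts

t = np.linspace(0, 6 * np.pi, 300)
clean = np.sin(t)
noisy = clean + 0.3 * np.random.default_rng(3).normal(size=t.size)
denoised = ssa_reconstruct(noisy, window=40, rank=2)
```

A pure sinusoid occupies two singular components, so `rank=2` suffices here; applied to a sequence of correlated fractal descriptors, the same truncation is what separates shared structure from redundancy.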
Experimental investigation on secondary combustion characteristics of airbreathing rockets
NASA Astrophysics Data System (ADS)
Mano, Takeshi; Eguchi, Akihiro; Shinohara, Suetsugu; Etou, Takao; Kaneko, Yutaka; Yamamoto, Youichi; Nakagawa, Ichirou
Empirical correlations of the secondary combustion efficiency of the airbreathing rocket were derived. From the results of a series of experiments employing a connected pipe facility, the combustion efficiency was related to dominant parameters. The feasibility of the performance prediction by one-dimensional analysis was also discussed. The analysis was found to be applicable to the flow processes in the secondary combustor, which include two-stream mixing and combustion.
Sandmann, Michael; Schafberg, Michaela; Lippold, Martin; Rohn, Sascha
2018-04-19
Microalgae bear a great potential to produce lipids for biodiesel, feed, or even food applications. To understand the still not well-known single-cell dynamics during lipid production in microalgae, a novel single-cell analytical technology was applied to study a well-established model experiment. Multidimensional single-cell dynamics were investigated with a non-supervised image analysis technique that utilizes data from epi-fluorescence microscopy. Reliability of this technique was successfully proven via reference analysis. The technique developed was used to determine cell size, chlorophyll amount, neutral lipid amount, and derived properties on a single-cell level in cultures of the biotechnologically promising alga Acutodesmus obliquus. The results illustrated a high correlation between cell size and chlorophyll amount, but a very low and dynamic correlation between cell size, lipid amount, and lipid density. During growth under nitrogen starvation, cells with low chlorophyll content tended to start lipid production first, and the cell suspension differentiated into two subpopulations with significantly different lipid contents. Such quantitative characterization of the single-cell dynamics of lipid-synthesizing algae was done for the first time, and the potential of such a simple technology is highly relevant to other biotechnological applications and to deeper investigation of the process of microalgal lipid accumulation.
Gonzalez Viejo, Claudia; Fuentes, Sigfredo; Torrico, Damir D; Dunshea, Frank R
2018-06-03
Traditional methods to assess heart rate (HR) and blood pressure (BP) are intrusive and can affect results in sensory analysis of food as participants are aware of the sensors. This paper aims to validate a non-contact method to measure HR using the photoplethysmography (PPG) technique and to develop models to predict the real HR and BP based on raw video analysis (RVA) with an example application in chocolate consumption using machine learning (ML). The RVA used a computer vision algorithm based on luminosity changes on the different RGB color channels using three face-regions (forehead and both cheeks). To validate the proposed method and ML models, a home oscillometric monitor and a finger sensor were used. Results showed high correlations with the G color channel (R² = 0.83). Two ML models were developed using three face-regions: (i) Model 1 to predict HR and BP using the RVA outputs with R = 0.85 and (ii) Model 2 based on time-series prediction with HR, magnitude and luminosity from RVA inputs to HR values every second with R = 0.97. An application for the sensory analysis of chocolate showed significant correlations between changes in HR and BP with chocolate hardness and purchase intention.
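The PPG principle described above, heart rate from luminosity changes in the green channel, reduces to locating the dominant spectral peak in the physiological band. A sketch with a synthetic green-channel trace; the 72 bpm pulse, drift, and noise levels are assumptions, not the study's recordings.

```python
import numpy as np

def heart_rate_bpm(green_trace, fps, lo=0.7, hi=3.0):
    """Estimate heart rate from a mean green-channel trace by locating
    the dominant spectral peak inside the 0.7-3.0 Hz band (42-180 bpm)."""
    x = green_trace - np.mean(green_trace)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return 60.0 * freqs[band][np.argmax(power[band])]

fps = 30.0
t = np.arange(0, 20, 1.0 / fps)                 # 20 s of video at 30 fps
rng = np.random.default_rng(4)
trace = (0.05 * np.sin(2 * np.pi * 1.2 * t)     # 1.2 Hz pulse -> 72 bpm
         + 0.01 * t                             # slow illumination drift
         + 0.02 * rng.normal(size=t.size))      # sensor noise
hr = heart_rate_bpm(trace, fps)
```

Restricting the search to the physiological band is what keeps the slow illumination drift from being mistaken for the pulse.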
Sun, Min; Gao, ZhiQiang; Zhao, WeiFeng; Deng, LianFeng; Deng, Yan; Zhao, HongMei; Ren, AiXia; Li, Gang; Yang, ZhenPing
2013-01-01
To provide a new way to increase water storage and retention of dryland wheat, a field study was conducted at Wenxi experimental site of Shanxi Agricultural University. The effect of subsoiling in fallow period on soil water storage, accumulation of proline, and formation of grain protein after anthesis were determined. Our results showed that subsoiling in fallow period could increase water storage in the 0-300 cm soil at pre-sowing stage and at anthesis stage with low or medium N application, especially for the 60-160 cm soil. However, the proline content, glutamine synthetase (GS) activity, glutamate dehydrogenase (GDH) activity in flag leaves and grains were all decreased by subsoiling in fallow period. In addition, the content of albumin, gliadin, and total protein in grains were also decreased while globulin content, Glu/Gli, protein yield, and glutelin content were increased. With N application increasing, water storage of soil layers from 20 to 200 cm was decreased at anthesis stage. High N application resulted in the increment of proline content and GS activity in grains. Besides, correlation analysis showed that soil storage in 40-160 cm soil was negatively correlated with proline content in grains; proline content in grains was positively correlated with GS and GDH activity in flag leaves. Contents of albumin, globulin and total protein in grains were positively correlated with proline content in grains and GDH activity in flag leaves. In conclusion, subsoiling in fallow period, together with N application at 150 kg·hm(-2), was beneficial to increase the protein yield and Glu/Gli in grains which improve the quality of wheat.
Sun, Min; Gao, ZhiQiang; Zhao, WeiFeng; Deng, LianFeng; Deng, Yan; Zhao, HongMei; Ren, AiXia; Li, Gang; Yang, ZhenPing
2013-01-01
To provide a new way to increase water storage and retention of dryland wheat, a field study was conducted at Wenxi experimental site of Shanxi Agricultural University. The effect of subsoiling in fallow period on soil water storage, accumulation of proline, and formation of grain protein after anthesis were determined. Our results showed that subsoiling in fallow period could increase water storage in the 0–300 cm soil at pre-sowing stage and at anthesis stage with low or medium N application, especially for the 60–160 cm soil. However, the proline content, glutamine synthetase (GS) activity, glutamate dehydrogenase (GDH) activity in flag leaves and grains were all decreased by subsoiling in fallow period. In addition, the content of albumin, gliadin, and total protein in grains were also decreased while globulin content, Glu/Gli, protein yield, and glutelin content were increased. With N application increasing, water storage of soil layers from 20 to 200 cm was decreased at anthesis stage. High N application resulted in the increment of proline content and GS activity in grains. Besides, correlation analysis showed that soil storage in 40–160 cm soil was negatively correlated with proline content in grains; proline content in grains was positively correlated with GS and GDH activity in flag leaves. Contents of albumin, globulin and total protein in grains were positively correlated with proline content in grains and GDH activity in flag leaves. In conclusion, subsoiling in fallow period, together with N application at 150 kg·hm−2, was beneficial to increase the protein yield and Glu/Gli in grains which improve the quality of wheat. PMID:24098371
NASA Astrophysics Data System (ADS)
Maidaniuc, Andreea; Miculescu, Florin; Voicu, Stefan Ioan; Andronescu, Corina; Miculescu, Marian; Matei, Ecaterina; Mocanu, Aura Catalina; Pencea, Ion; Csaki, Ioana; Machedon-Pisu, Teodor; Ciocan, Lucian Toma
2018-04-01
Hydroxyapatite powders characteristics need to be determined both for quality control purposes and for a proper control of microstructural features of bone reconstruction products. This study combines bulk morphological and compositional analysis methods (XRF, SEM-EDS, FT-IR) with surface-related methods (XPS, contact angle measurements) in order to correlate the characteristics of hydroxyapatite powders derived from bovine bone for its use in medical applications. An experimental approach for correlating the surface and volume composition was designed based on the analysis depth of each spectral method involved in the study. Next, the influences of powder particle size and forming method on the contact angle between water drops and ceramic surface were evaluated for identifying suitable strategies of tuning hydroxyapatite's wettability. The results revealed a preferential arrangement of chemical elements at the surface of hydroxyapatite particles which could induce a favourable material behaviour in terms of sinterability and biological performance.
Correlates of Protective Motivation Theory (PMT) to Adolescents’ Drug Use Intention
Wu, Cynthia Sau Ting; Wong, Ho Ting; Chou, Lai Yan; To, Bobby Pak Wai; Lee, Wai Lok; Loke, Alice Yuen
2014-01-01
Early onset and increasing proliferation of illicit adolescent drug-use poses a global health concern. This study aimed to examine the correlation between Protective Motivation Theory (PMT) measures and the intention to use drugs among adolescents. An exploratory quantitative correlation design and convenience sampling were adopted. A total of 318 students completed a self-reported questionnaire that solicited information related to their demographics and activities, measures of threat appraisal and coping appraisal, and the intention to use drugs. Logistic regression analysis showed that intrinsic and extrinsic rewards were significant predictors of intention. The odds ratios were equal to 2.90 (p < 0.05) and 8.04 (p < 0.001), respectively. The logistic regression model analysis resulted in a high Nagelkerke R2 of 0.49, which suggests that PMT related measures could be used in predicting drug use intention among adolescents. Further research should be conducted with non-school adolescents to confirm the application. PMID:24394215
Correlates of Protective Motivation Theory (PMT) to adolescents' drug use intention.
Wu, Cynthia Sau Ting; Wong, Ho Ting; Chou, Lai Yan; To, Bobby Pak Wai; Lee, Wai Lok; Loke, Alice Yuen
2014-01-03
Early onset and increasing proliferation of illicit adolescent drug-use poses a global health concern. This study aimed to examine the correlation between Protective Motivation Theory (PMT) measures and the intention to use drugs among adolescents. An exploratory quantitative correlation design and convenience sampling were adopted. A total of 318 students completed a self-reported questionnaire that solicited information related to their demographics and activities, measures of threat appraisal and coping appraisal, and the intention to use drugs. Logistic regression analysis showed that intrinsic and extrinsic rewards were significant predictors of intention. The odds ratios were equal to 2.90 (p < 0.05) and 8.04 (p < 0.001), respectively. The logistic regression model analysis resulted in a high Nagelkerke R2 of 0.49, which suggests that PMT related measures could be used in predicting drug use intention among adolescents. Further research should be conducted with non-school adolescents to confirm the application.
Development and test of advanced composite components. Center Directors discretionary fund program
NASA Technical Reports Server (NTRS)
Faile, G.; Hollis, R.; Ledbetter, F.; Maldonado, J.; Sledd, J.; Stuckey, J.; Waggoner, G.; Engler, E.
1985-01-01
This report describes the design, analysis, fabrication, and test of a complex bathtub fitting. Graphite fibers in an epoxy matrix were utilized in the manufacturing of 11 components representing four different design and layup concepts. Design allowables were developed for use in the final stress analysis. Strain gage measurements were taken throughout the static load test, and correlation of test and analysis data was performed, yielding a good understanding of the material behavior and instrumentation requirements for future applications.
Application of Artificial Boundary Conditions in Sensitivity-Based Updating of Finite Element Models
2007-06-01
is known as the impedance matrix [Z(Ω)]:

[Z(Ω)] = [H(Ω)]⁻¹  (12)

where

[Z(Ω)] = [K − Ω²M + jΩC]  (13)

A. REDUCED ORDER... D.L. A correlation coefficient for modal vector analysis. Proceedings of 1st International Modal Analysis Conference, 1982, 110-116. Anton, H., Rorres, C. (2005). Elementary Linear Algebra. New York: John Wiley and Sons. Avitable, Peter (2001, January) Experimental Modal Analysis, A Simple
Small Crack Growth and Its Influence in Near Alpha-Titanium Alloys
1989-06-01
geometries via finite element and boundary-collocation analysis [8, 9]. Elastic-plastic fracture mechanics (EPFM) [10, 11] and local crack tip field... correlation was found between experimental and predicted data, general application of the model is not possible as both 0 and rp are sensitive to changes in... cracks at low ΔK the load reduction schemes should be altered to remove the residual deformations, perhaps via machining or the application of large
Digital communications: Microwave applications
NASA Astrophysics Data System (ADS)
Feher, K.
Transmission concepts and techniques of digital systems are presented; and practical state-of-the-art implementation of digital communications systems by line-of-sight microwaves is described. Particular consideration is given to statistical methods in digital transmission systems analysis, digital modulation methods, microwave amplifiers, system gain, m-ary and QAM microwave systems, correlative techniques and applications to digital radio systems, hybrid systems, digital microwave systems design, diversity and protection switching techniques, measurement techniques, and research and development trends and unsolved problems.
Brünner, Yvonne F; Rodriguez-Raecke, Rea; Mutic, Smiljana; Benedict, Christian; Freiherr, Jessica
2016-10-01
This fMRI study intended to establish 3D-simulated mazes with olfactory and visual cues and examine the effect of intranasally applied insulin on memory performance in healthy subjects. The effect of insulin on hippocampus-dependent brain activation was explored using a double-blind and placebo-controlled design. Following intranasal administration of either insulin (40 IU) or placebo, 16 male subjects participated in two experimental MRI sessions with olfactory and visual mazes. Each maze included two separate runs. The first was an encoding maze during which subjects learned eight olfactory or eight visual cues at different target locations. The second was a recall maze during which subjects were asked to remember the target cues at spatial locations. For the eleven subjects included in the fMRI analysis, we were able to validate brain activation for odor perception and visuospatial tasks. However, we did not observe an enhancement of declarative memory performance in our behavioral data or hippocampal activity in response to insulin application in the fMRI analysis. It is therefore possible that intranasal insulin application is sensitive to methodological variations, e.g. timing of task execution and dose of application. Findings from this study suggest that our method of 3D-simulated mazes is feasible for studying neural correlates of olfactory and visual memory performance.
Impact of Forecast and Model Error Correlations In 4dvar Data Assimilation
NASA Astrophysics Data System (ADS)
Zupanski, M.; Zupanski, D.; Vukicevic, T.; Greenwald, T.; Eis, K.; Vonder Haar, T.
A weak-constraint 4DVAR data assimilation system has been developed at the Cooperative Institute for Research in the Atmosphere (CIRA), Colorado State University. It is based on NCEP's ETA 4DVAR system, and it is fully parallel (MPI coding). CIRA's 4DVAR system is aimed at satellite data assimilation research, with current focus on assimilation of cloudy radiances and microwave satellite measurements. The most important improvement over the previous 4DVAR system is the degree of generality introduced into the new algorithm, namely for applications with different NWP models (e.g., RAMS, WRF, ETA, etc.) and for the choice of control variable. In current applications, the non-hydrostatic RAMS model and its adjoint are used, including all microphysical processes. The control variable includes potential temperature, velocity potential and stream function, vertical velocity, and seven mixing ratios with respect to all water phases. Since the statistics of the microphysical components of the control variable are not well known, special attention will be paid to the impact of the forecast and model (prior) error correlations on the 4DVAR analysis. In particular, the sensitivity of the analysis with respect to decorrelation length will be examined. The prior error covariances are modelled using the compactly-supported, space-limited correlations developed at NASA DAO.
NASA Astrophysics Data System (ADS)
Poroseva, Svetlana V.
2013-11-01
Simulations of turbulent boundary-layer flows are usually conducted using a set of the simplified Reynolds-Averaged Navier-Stokes (RANS) equations obtained by order-of-magnitude analysis (OMA) of the original RANS equations. The resultant equations for the mean-velocity components are closed using the Boussinesq approximation for the Reynolds stresses. In this study OMA is applied to the fourth-order RANS (FORANS) set of equations. The FORANS equations are chosen as they can be closed on the level of the 5th-order correlations without using unknown model coefficients, i.e. no turbulent diffusion modeling is required. New models for the 2nd-, 3rd- and 4th-order velocity-pressure gradient correlations are derived for the current FORANS equations. This set of FORANS equations and models are analyzed for the case of two-dimensional mean flow. The equations include familiar transport terms for the mean-velocity components along with algebraic expressions for velocity correlations of different orders specific to the FORANS approach. Flat plate DNS data (Spalart, 1988) are used to verify these expressions and the areas of the OMA applicability within the boundary layer. The material is based upon work supported by NASA under award NNX12AJ61A.
Wenger, Jérôme; Gérard, Davy; Lenne, Pierre-François; Rigneault, Hervé; Dintinger, José; Ebbesen, Thomas W; Boned, Annie; Conchonaud, Fabien; Marguet, Didier
2006-12-11
Single nanometric apertures in a metallic film are used to develop a simple and robust setup for dual-color fluorescence cross-correlation spectroscopy (FCCS) at high concentrations. While the nanoaperture concept has already proven useful for single-species analysis, its extension to the dual-color case brings new interesting specificities. The alignment and overlap of the two excitation beams are greatly simplified. No confocal pinhole is used, relaxing the requirement for accurate correction of chromatic aberrations. Compared to two-photon excitation, nanoapertures have the advantage of working with standard fluorophore constructions having high absorption cross-sections and well-known absorption/emission spectra. Thanks to the ultra-low volume analysed within a single aperture, fluorescence correlation analysis can be performed with single-molecule resolution at micromolar concentrations, resulting in a gain of 3 orders of magnitude compared to conventional setups. As applications of this technique, we follow the kinetics of an enzymatic cleavage reaction at 2 μM DNA oligonucleotide concentration. We also demonstrate that FCCS in nanoapertures can be applied to the fast screening of a sample for dual-labeled species within a 1 s acquisition time. This offers new possibilities for rapid screening applications in biotechnology at high concentrations.
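Underlying any FCS/FCCS analysis is the normalized intensity autocorrelation G(tau) = <dI(t) dI(t+tau)> / <I>^2. A single-channel sketch on a synthetic trace; the correlation time and intensity scale are assumptions, not the nanoaperture data.

```python
import numpy as np

def intensity_autocorrelation(trace, max_lag):
    """Normalized autocorrelation G(tau) = <dI(t) dI(t+tau)> / <I>^2,
    evaluated at integer lags 1..max_lag."""
    mean_i = trace.mean()
    d = trace - mean_i
    n = d.size
    g = [np.mean(d[:n - lag] * d[lag:]) for lag in range(1, max_lag + 1)]
    return np.asarray(g) / mean_i ** 2

rng = np.random.default_rng(5)
# Synthetic intensity trace: mean level plus fluctuations with a
# correlation time of ~50 samples (boxcar-smoothed white noise)
slow = np.convolve(rng.normal(size=5000), np.ones(50) / 50, mode="same")
trace = 100.0 + 10.0 * slow
g = intensity_autocorrelation(trace, max_lag=100)
```

The decay time of G(tau) encodes the diffusion time through the observation volume; in dual-color FCCS the same estimator is applied to the product of the two channels' fluctuations instead.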
Meyer, Karin; Kirkpatrick, Mark
2005-01-01
Principal component analysis is a widely used 'dimension reduction' technique, albeit generally at a phenotypic level. It is shown that we can estimate genetic principal components directly through a simple reparameterisation of the usual linear, mixed model. This is applicable to any analysis fitting multiple, correlated genetic effects, whether effects for individual traits or sets of random regression coefficients to model trajectories. Depending on the magnitude of genetic correlation, a subset of the principal components generally suffices to capture the bulk of genetic variation. Corresponding estimates of genetic covariance matrices are more parsimonious, have reduced rank and are smoothed, with the number of parameters required to model the dispersion structure reduced from k(k + 1)/2 to m(2k - m + 1)/2 for k effects and m principal components. Estimation of these parameters, the largest eigenvalues and pertaining eigenvectors of the genetic covariance matrix, via restricted maximum likelihood using derivatives of the likelihood, is described. It is shown that reduced rank estimation can reduce computational requirements of multivariate analyses substantially. An application to the analysis of eight traits recorded via live ultrasound scanning of beef cattle is given. PMID:15588566
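The k(k + 1)/2 versus m(2k - m + 1)/2 parameter reduction is easy to sketch. The "eight traits, three components" choice below is illustrative (the abstract fits eight traits but does not state m):

```python
def full_rank_params(k):
    """Parameters in an unstructured k x k genetic covariance matrix."""
    return k * (k + 1) // 2

def reduced_rank_params(k, m):
    """Parameters when only the m leading genetic principal components
    (eigenvalues plus eigenvectors) are fitted: m(2k - m + 1)/2."""
    return m * (2 * k - m + 1) // 2

# Eight traits; three components capture most genetic variation here.
full = full_rank_params(8)           # 36 parameters
reduced = reduced_rank_params(8, 3)  # 21 parameters
```

Setting m = k recovers the full-rank count, so the reduced-rank model nests the usual multivariate analysis.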
Zhang, Ji; Li, Bing; Wang, Qi; Wei, Xin; Feng, Weibo; Chen, Yijiu; Huang, Ping; Wang, Zhenyuan
2017-12-21
Postmortem interval (PMI) evaluation remains a challenge in the forensic community due to the lack of efficient methods. Studies have focused on chemical analysis of biofluids for PMI estimation; however, no reports using spectroscopic methods in pericardial fluid (PF) are available. In this study, Fourier transform infrared (FTIR) spectroscopy with attenuated total reflectance (ATR) accessory was applied to collect comprehensive biochemical information from rabbit PF at different PMIs. The PMI-dependent spectral signature was determined by two-dimensional (2D) correlation analysis. The partial least square (PLS) and nu-support vector machine (nu-SVM) models were then established based on the acquired spectral dataset. Spectral variables associated with amide I, amide II, COO-, C-H bending, and C-O or C-OH vibrations arising from proteins, polypeptides, amino acids and carbohydrates, respectively, were susceptible to PMI in 2D correlation analysis. Moreover, the nu-SVM model appeared to achieve a more satisfactory prediction than the PLS model in calibration; the reliability of both models was determined in an external validation set. The study shows the possibility of application of ATR-FTIR methods in postmortem interval estimation using PF samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC
Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict if including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
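For a linear model the effect of ignoring input correlation on the output variance has a closed form, which is the essence of such analytical propagation. A minimal sketch with hypothetical symbols (two inputs, coefficients a and b; not the authors' notation):

```python
def output_variance(a, b, s1, s2, rho):
    """Variance of Y = a*X1 + b*X2 for inputs with standard deviations
    s1 and s2 and correlation rho; rho = 0 recovers the independence case."""
    return a**2 * s1**2 + b**2 * s2**2 + 2 * a * b * rho * s1 * s2

# Ignoring a positive correlation underestimates the output variance here:
v_corr = output_variance(1.0, 1.0, 1.0, 1.0, 0.8)   # 3.6
v_indep = output_variance(1.0, 1.0, 1.0, 1.0, 0.0)  # 2.0
```

Comparing the two values quantifies the "risk of ignoring correlations" for this toy model: an 80% understatement of variance in this example.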
Locating the source of projectile fluid droplets
NASA Astrophysics Data System (ADS)
Varney, Christopher R.; Gittes, Fred
2011-08-01
The ill-posed projectile problem of finding the source height from spattered droplets of viscous fluid is a longstanding obstacle to accident reconstruction and crime-scene analysis. It is widely known how to infer the impact angle of droplets on a surface from the elongation of their impact profiles. However, the lack of velocity information makes it impossible to find the height of the origin from the impact position and angle of individual drops. From aggregate statistics of the spatter and basic equations of projectile motion, we introduce a reciprocal correlation plot that is effective when the polar launch angle is concentrated in a narrow range. The vertical coordinate depends on the orientation of the spattered surface and equals the tangent of the impact angle for a level surface. When the horizontal plot coordinate is twice the reciprocal of the impact distance, we can infer the source height as the slope of the data points in the reciprocal correlation plot. If the distribution of launch angles is not narrow, failure of the method is evident in the lack of linear correlation. We perform a number of experimental trials, as well as numerical calculations, and show that the height estimate is relatively insensitive to aerodynamic drag. Besides its possible relevance for crime investigation, reciprocal-plot analysis of spatter may find application to volcanism and other topics, and is most immediately applicable for undergraduate science and engineering students in the context of crime-scene analysis.
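The reciprocal plot can be verified with drag-free kinematics: eliminating time from the projectile equations gives tan(alpha) = tan(theta) + h * (2/d), so when the launch angle theta is narrowly distributed, the slope of tan(alpha) against 2/d is the source height h. A sketch with hypothetical variable names:

```python
import math

G = 9.81  # m/s^2

def simulate_drop(v, theta, h):
    """Impact distance d and tangent of the impact angle for a drag-free
    droplet launched at speed v and polar angle theta from height h."""
    t = (v * math.sin(theta) + math.sqrt((v * math.sin(theta)) ** 2 + 2 * G * h)) / G
    d = v * math.cos(theta) * t
    tan_alpha = (G * t - v * math.sin(theta)) / (v * math.cos(theta))
    return d, tan_alpha

def fit_height(drops):
    """Least-squares slope of tan(alpha) versus 2/d: the height estimate."""
    xs = [2.0 / d for d, _ in drops]
    ys = [ta for _, ta in drops]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Narrow launch-angle range (here a single angle, varying speed): the points
# fall exactly on a line whose slope is the true source height of 1.5 m.
drops = [simulate_drop(v, 0.2, 1.5) for v in (2.0, 3.0, 4.0, 5.0, 6.0)]
```

A broad spread of theta would scatter the points off the line, which is exactly the failure mode the abstract describes.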
USDA-ARS?s Scientific Manuscript database
Inelastic neutron scattering (INS) was applied to determine soil carbon content. Due to non-uniform soil carbon depth distribution, the correlation between INS signals with some soil carbon content parameter is not obvious; however, a proportionality between INS signals and average carbon weight per...
Information-Pooling Bias in Collaborative Security Incident Correlation Analysis.
Rajivan, Prashanth; Cooke, Nancy J
2018-03-01
Incident correlation is a vital step in the cybersecurity threat detection process. This article presents research on the effect of group-level information-pooling bias on collaborative incident correlation analysis in a synthetic task environment. Past research has shown that uneven information distribution biases people to share information that is known to most team members and prevents them from sharing any unique information available with them. The effect of such biases on security team collaborations is largely unknown. Thirty 3-person teams performed two threat detection missions involving information sharing and correlating security incidents. Incidents were predistributed to each person in the team based on the hidden profile paradigm. Participant teams, randomly assigned to three experimental groups, used different collaboration aids during Mission 2. Communication analysis revealed that participant teams were 3 times more likely to discuss security incidents commonly known to the majority. Unaided team collaboration was inefficient in finding associations between security incidents uniquely available to each member of the team. Visualizations that augment perceptual processing and recognition memory were found to mitigate the bias. The data suggest that (a) security analyst teams, when conducting collaborative correlation analysis, could be inefficient in pooling unique information from their peers; (b) employing off-the-shelf collaboration tools in cybersecurity defense environments is inadequate; and (c) collaborative security visualization tools designed with the cognitive limitations of security analysts in mind are necessary. Potential applications of this research include development of team training procedures and collaboration tool development for security analysts.
Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.
Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse
2017-01-01
Quantitative texture analysis has been proposed to extract robust features from the ultrasound image to detect subtle changes in the textures of the images. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the estimated gestational age by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images which correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant. © 2016 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Wapenaar, Kees; van der Neut, Joost; Ruigrok, Elmer; Draganov, Deyan; Hunziker, Jürg; Slob, Evert; Thorbecke, Jan; Snieder, Roel
2011-06-01
Seismic interferometry, also known as Green's function retrieval by crosscorrelation, has a wide range of applications, ranging from surface-wave tomography using ambient noise, to creating virtual sources for improved reflection seismology. Despite its successful applications, the crosscorrelation approach also has its limitations. The main underlying assumptions are that the medium is lossless and that the wavefield is equipartitioned. These assumptions are in practice often violated: the medium of interest is often illuminated from one side only, the sources may be irregularly distributed, and losses may be significant. These limitations may partly be overcome by reformulating seismic interferometry as a multidimensional deconvolution (MDD) process. We present a systematic analysis of seismic interferometry by crosscorrelation and by MDD. We show that for the non-ideal situations mentioned above, the correlation function is proportional to a Green's function with a blurred source. The source blurring is quantified by a so-called interferometric point-spread function which, like the correlation function, can be derived from the observed data (i.e. without the need to know the sources and the medium). The source of the Green's function obtained by the correlation method can be deblurred by deconvolving the correlation function for the point-spread function. This is the essence of seismic interferometry by MDD. We illustrate the crosscorrelation and MDD methods for controlled-source and passive-data applications with numerical examples and discuss the advantages and limitations of both methods.
NASA Astrophysics Data System (ADS)
Syed Mazlan, S. M. S.; Abdullah, S. R.; Shahidan, S.; Noor, S. R. Mohd
2017-11-01
Concrete durability may be affected by many factors, such as chemical attack and weathering action, that reduce the performance and service life of concrete structures. The durability of reinforced concrete (RC) can be greatly improved by using Fiber Reinforced Polymer (FRP), a composite material commonly used for repairing and strengthening RC structures. A review of the application of Acoustic Emission (AE) techniques for real-time monitoring of various mechanical tests on RC strengthened with FRP, involving four-point bending, three-point bending and cyclic loading, was carried out and is discussed in this paper. Correlations between the AE analyses, namely b-value, sentry and intensity analysis, in damage characterization have also been critically reviewed. From the review, AE monitoring of RC strengthened with FRP using b-value, sentry and intensity analysis has proven to be a successful and efficient method for determining damage characterization. However, the application of sentry analysis is still limited compared with b-value and intensity analysis in characterizing damage, especially for RC strengthened with FRP specimens.
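Of the three AE analyses named, the b-value is the most formulaic: it is the negative slope of a Gutenberg-Richter-type relation, log10 N(>=A) = a - b * (A/20), fitted to cumulative hit counts against peak amplitude in dB. A minimal sketch under that common AE convention (the A/20 scaling converts amplitude in dB to a magnitude-like quantity; other conventions exist):

```python
import math

def ae_b_value(amplitudes_db, counts):
    """Least-squares fit of log10(cumulative count) = a - b * (A/20);
    returns the AE b-value (negative of the fitted slope)."""
    xs = [a / 20.0 for a in amplitudes_db]
    ys = [math.log10(n) for n in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic hit counts following the law exactly with b = 1:
amps = [40, 50, 60, 70, 80, 90, 100]
counts = [10 ** (6 - a / 20.0) for a in amps]
```

In damage monitoring, a falling b-value signals a shift from many small events to fewer large ones, i.e. macro-crack growth.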
StereoGene: rapid estimation of genome-wide correlation of continuous or interval feature data.
Stavrovskaya, Elena D; Niranjan, Tejasvi; Fertig, Elana J; Wheelan, Sarah J; Favorov, Alexander V; Mironov, Andrey A
2017-10-15
Genomics features with similar genome-wide distributions are generally hypothesized to be functionally related; for example, colocalization of histones and transcription start sites indicates chromatin regulation of transcription factor activity. Therefore, statistical algorithms to perform spatial, genome-wide correlation among genomic features are required. Here, we propose a method, StereoGene, that rapidly estimates genome-wide correlation among pairs of genomic features. These features may represent high-throughput data mapped to a reference genome or sets of genomic annotations in that reference genome. StereoGene enables correlation of continuous data directly, avoiding data binarization and the subsequent data loss. Correlations are computed among neighboring genomic positions using kernel correlation. Representing the correlation as a function of the genome position, StereoGene outputs the local correlation track as part of the analysis. StereoGene also accounts for confounders such as input DNA by partial correlation. We apply our method to numerous comparisons of ChIP-Seq datasets from the Human Epigenome Atlas and FANTOM CAGE to demonstrate its wide applicability. We observe changes in the correlation between epigenomic features across developmental trajectories of several tissue types consistent with known biology and find a novel spatial correlation of CAGE clusters with donor splice sites and with poly(A) sites. These analyses provide examples of the broad applicability of StereoGene for regulatory genomics. The StereoGene C++ source code, program documentation, Galaxy integration scripts and examples are available from the project homepage http://stereogene.bioinf.fbb.msu.ru/. favorov@sensi.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
NASA Technical Reports Server (NTRS)
Kim, Sang-Wook
1987-01-01
Various experimental, analytical, and numerical analysis methods for flow-solid interaction of a nest of cylinders subjected to cross flows are reviewed. A nest of cylinders subjected to cross flows can be found in numerous engineering applications, including the Space Shuttle Main Engine-Main Injector Assembly (SSME-MIA) and nuclear reactor heat exchangers. Despite its extreme importance in engineering applications, understanding of the flow-solid interaction process is quite limited, and design of the tube banks is mostly dependent on experiments and/or experimental correlation equations. For future development of major numerical analysis methods for the flow-solid interaction of a nest of cylinders subjected to cross flow, various turbulence models, nonlinear structural dynamics, and existing laminar flow-solid interaction analysis methods are included.
Ma, Chuang; Wang, Xiangfeng
2012-09-01
One of the computational challenges in plant systems biology is to accurately infer transcriptional regulation relationships based on correlation analyses of gene expression patterns. Despite several correlation methods that are applied in biology to analyze microarray data, concerns regarding the compatibility of these methods with the gene expression data profiled by high-throughput RNA transcriptome sequencing (RNA-Seq) technology have been raised. These concerns are mainly due to the fact that the distribution of read counts in RNA-Seq experiments is different from that of fluorescence intensities in microarray experiments. Therefore, a comprehensive evaluation of the existing correlation methods and, if necessary, introduction of novel methods into biology is appropriate. In this study, we compared four existing correlation methods used in microarray analysis and one novel method called the Gini correlation coefficient on previously published microarray-based and sequencing-based gene expression data in Arabidopsis (Arabidopsis thaliana) and maize (Zea mays). The comparisons were performed on more than 11,000 regulatory relationships in Arabidopsis, including 8,929 pairs of transcription factors and target genes. Our analyses pinpointed the strengths and weaknesses of each method and indicated that the Gini correlation can compensate for the shortcomings of the Pearson correlation, the Spearman correlation, the Kendall correlation, and the Tukey's biweight correlation. The Gini correlation method, with the other four evaluated methods in this study, was implemented as an R package named rsgcc that can be utilized as an alternative option for biologists to perform clustering analyses of gene expression patterns or transcriptional network analyses.
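The Gini correlation itself is compact. One standard form (due to Schechtman and Yitzhaki) is the ratio cov(x, rank(y)) / cov(x, rank(x)), computable with order-statistic weights; the sketch below follows that definition, and the rsgcc package's exact implementation (e.g. its tie handling) may differ:

```python
def gini_correlation(x, y):
    """Gini correlation coefficient: cov(x, rank(y)) / cov(x, rank(x)),
    via the weights (2i - n - 1) applied to x sorted by y (and by x)."""
    n = len(x)

    def weighted_sum(values, order):
        # sum_i (2i - n - 1) * value ranked i-th by `order`, i = 1..n
        ranked = [v for _, v in sorted(zip(order, values))]
        return sum((2 * (i + 1) - n - 1) * v for i, v in enumerate(ranked))

    return weighted_sum(x, y) / weighted_sum(x, x)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
up = gini_correlation(x, [2.0, 4.0, 6.0, 8.0, 10.0])    # 1.0: same ordering
down = gini_correlation(x, [10.0, 8.0, 6.0, 4.0, 2.0])  # -1.0: reversed ordering
```

Because only the ranks of y enter the numerator, the measure blends Pearson-like use of the values of x with Spearman-like robustness in y, which is the compromise the abstract credits it with.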
Guan, Wei; Cai, Wei-Xiong; Huang, Fu-Yin; Wu, Jia-Sheng
2009-10-01
To explore the application of the Diminished Criminal Responsibility Rating Scale (DCRRS) to mental retardation offenders, the DCRRS was applied to 121 cases of mental retardation offenders, who were divided into three groups according to the degree of their diminished criminal responsibility. There were significant differences in rating score among the three groups (mild group 22.12+/-4.69, moderate group 25.50+/-5.48, major group 27.59+/-5.69), and 17 items had good correlation with the total score of the scale, with correlation coefficients ranging from 0.289 to 0.665. Six factors were extracted by factor analysis, explaining 69.392% of the variance. The DCRRS has reasonable items, and its total score can distinguish among the three degrees of diminished criminal responsibility in mental retardation offenders.
[A SAS macro program for batch processing of univariate Cox regression analysis for large databases].
Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin
2015-02-01
To realize batch processing of univariate Cox regression analysis for large databases with a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter and integrate results and export P values to Excel. The program was used for screening survival-correlated RNA molecules in ovarian cancer. The SAS macro program accomplished the batch processing of univariate Cox regression analysis and the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
NASA Astrophysics Data System (ADS)
Fan, Qingju; Wu, Yonghong
2015-08-01
In this paper, we develop a new method for the multifractal characterization of two-dimensional nonstationary signals, based on detrended fluctuation analysis (DFA). By applying it to two artificially generated signals, from a two-component ARFIMA process and a binomial multifractal model, we show that the new method can reliably determine the multifractal scaling behavior of two-dimensional signals. We also illustrate applications of this method in finance and physiology. The results show that the two-dimensional signals under investigation exhibit power-law correlations: the electricity market data, consisting of electricity price and trading volume, are multifractal, while the two-dimensional EEG signal recorded during sleep for a single patient is weakly multifractal. The new method based on detrended fluctuation analysis may add diagnostic power to existing statistical methods.
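The one-dimensional DFA step on which the two-dimensional extension builds can be sketched directly: integrate the mean-subtracted series, detrend it linearly in windows of each scale, and read the scaling exponent off the log-log slope of the fluctuation function. A minimal DFA-1 sketch (scale choices are illustrative):

```python
import math
import random

def dfa_exponent(series, scales):
    """DFA-1: slope of log F(s) versus log s for the integrated series."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for v in series:          # integrated (cumulative-sum) profile
        acc += v - mean
        profile.append(acc)
    logs, logF = [], []
    for s in scales:
        n_seg = len(profile) // s
        sq = 0.0
        for k in range(n_seg):
            seg = profile[k * s:(k + 1) * s]
            xs = range(s)
            mx, my = (s - 1) / 2.0, sum(seg) / s
            beta = (sum((x - mx) * (y - my) for x, y in zip(xs, seg))
                    / sum((x - mx) ** 2 for x in xs))
            # squared residuals around the per-window linear trend
            sq += sum((y - (my + beta * (x - mx))) ** 2 for x, y in zip(xs, seg))
        logs.append(math.log(s))
        logF.append(0.5 * math.log(sq / (n_seg * s)))
    n = len(logs)
    mx, my = sum(logs) / n, sum(logF) / n
    return (sum((a - mx) * (b - my) for a, b in zip(logs, logF))
            / sum((a - mx) ** 2 for a in logs))

random.seed(7)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_exponent(white, [16, 32, 64, 128, 256])  # ~0.5 for white noise
```

Uncorrelated noise gives alpha near 0.5; long-range power-law correlations, as reported for the signals above, push alpha away from that value.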
Supersonic/Hypersonic Correlations for In-Cavity Transition and Heating Augmentation
NASA Technical Reports Server (NTRS)
Everhart, Joel L.
2011-01-01
Laminar-entry cavity heating data with a non-laminar boundary layer exit flow have been retrieved from the database developed at Mach 6 and 10 in air on large flat plate models for the Space Shuttle Return-To-Flight Program. Building on previously published fully laminar and fully turbulent analysis methods, new descriptive correlations of the in-cavity floor-averaged heating and endwall maximum heating have been developed for transitional-to-turbulent exit flow. These new local-cavity correlations provide the expected flow and geometry conditions for transition onset; they provide the incremental heating augmentation induced by transitional flow; and they provide the transitional-to-turbulent exit cavity length. Furthermore, they provide an upper application limit for the previously developed fully-laminar heating correlations. An example is provided that demonstrates simplicity of application. Heating augmentation factors of 12 and 3 above the fully laminar values are shown to exist on the cavity floor and endwall, respectively, if the flow exits in a fully tripped-to-turbulent boundary-layer state. Cavity floor heating data in geometries installed on the windward surface of 0.075-scale Shuttle wind tunnel models have also been retrieved from the boundary layer transition database developed for the Return-To-Flight Program. These data were independently acquired at Mach 6 and Mach 10 in air, and at Mach 6 in CF4. The correlation parameters for the floor-averaged heating have been developed and compare exceptionally well with the previously developed laminar-cavity heating correlations. Non-laminar increments have been extracted from the Shuttle data; they fall on the newly developed transitional in-cavity correlations and are bounded by the 95% correlation prediction limits.
Because the ratio of specific heats changes along the re-entry trajectory, turning angle into a cavity and boundary layer flow properties may be affected, raising concerns regarding the application validity of the heating augmentation predictions.
Design of pressure-driven microfluidic networks using electric circuit analogy.
Oh, Kwang W; Lee, Kangsun; Ahn, Byungwook; Furlani, Edward P
2012-02-07
This article reviews the application of electric circuit methods for the analysis of pressure-driven microfluidic networks with an emphasis on concentration- and flow-dependent systems. The application of circuit methods to microfluidics is based on the analogous behaviour of hydraulic and electric circuits with correlations of pressure to voltage, volumetric flow rate to current, and hydraulic to electric resistance. Circuit analysis enables rapid predictions of pressure-driven laminar flow in microchannels and is very useful for designing complex microfluidic networks in advance of fabrication. This article provides a comprehensive overview of the physics of pressure-driven laminar flow, the formal analogy between electric and hydraulic circuits, applications of circuit theory to microfluidic network-based devices, recent development and applications of concentration- and flow-dependent microfluidic networks, and promising future applications. The lab-on-a-chip (LOC) and microfluidics community will gain insightful ideas and practical design strategies for developing unique microfluidic network-based devices to address a broad range of biological, chemical, pharmaceutical, and other scientific and technical challenges.
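The analogy is concrete: for a shallow rectangular channel (width much greater than height) the hydraulic resistance is approximately R = 12*mu*L/(w*h^3), pressure drop plays the role of voltage (dP = Q*R), and resistances combine in series and parallel exactly as in electric circuits. A sketch with assumed dimensions (the leading-order resistance formula neglects sidewall corrections):

```python
def channel_resistance(mu, length, width, height):
    """Hydraulic resistance of a shallow rectangular channel (width >> height):
    R = 12 * mu * L / (w * h^3), in Pa*s/m^3."""
    return 12.0 * mu * length / (width * height ** 3)

def series(*rs):
    return sum(rs)

def parallel(*rs):
    return 1.0 / sum(1.0 / r for r in rs)

mu = 1e-3  # water viscosity, Pa*s
r1 = channel_resistance(mu, 1e-2, 100e-6, 20e-6)  # 1 cm channel
r2 = channel_resistance(mu, 2e-2, 100e-6, 20e-6)  # 2 cm channel
# Hydraulic Ohm's law: flow rate driven by 1 kPa across the two in series.
q = 1e3 / series(r1, r2)
```

With these resistances in hand, network flows can be predicted before any fabrication, which is the design workflow the article reviews.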
Aeroelastic Flight Data Analysis with the Hilbert-Huang Algorithm
NASA Technical Reports Server (NTRS)
Brenner, Martin J.; Prazenica, Chad
2006-01-01
This report investigates the utility of the Hilbert Huang transform for the analysis of aeroelastic flight data. It is well known that the classical Hilbert transform can be used for time-frequency analysis of functions or signals. Unfortunately, the Hilbert transform can only be effectively applied to an extremely small class of signals, namely those that are characterized by a single frequency component at any instant in time. The recently-developed Hilbert Huang algorithm addresses the limitations of the classical Hilbert transform through a process known as empirical mode decomposition. Using this approach, the data is filtered into a series of intrinsic mode functions, each of which admits a well-behaved Hilbert transform. In this manner, the Hilbert Huang algorithm affords time-frequency analysis of a large class of signals. This powerful tool has been applied in the analysis of scientific data, structural system identification, mechanical system fault detection, and even image processing. The purpose of this report is to demonstrate the potential applications of the Hilbert Huang algorithm for the analysis of aeroelastic systems, with improvements such as localized online processing. Applications for correlations between system input and output, and amongst output sensors, are discussed to characterize the time-varying amplitude and frequency correlations present in the various components of multiple data channels. Online stability analyses and modal identification are also presented. Examples are given using aeroelastic test data from the F-18 Active Aeroelastic Wing airplane, an Aerostructures Test Wing, and pitch plunge simulation.
Aeroelastic Flight Data Analysis with the Hilbert-Huang Algorithm
NASA Technical Reports Server (NTRS)
Brenner, Marty; Prazenica, Chad
2005-01-01
This paper investigates the utility of the Hilbert-Huang transform for the analysis of aeroelastic flight data. It is well known that the classical Hilbert transform can be used for time-frequency analysis of functions or signals. Unfortunately, the Hilbert transform can only be effectively applied to an extremely small class of signals, namely those that are characterized by a single frequency component at any instant in time. The recently-developed Hilbert-Huang algorithm addresses the limitations of the classical Hilbert transform through a process known as empirical mode decomposition. Using this approach, the data is filtered into a series of intrinsic mode functions, each of which admits a well-behaved Hilbert transform. In this manner, the Hilbert-Huang algorithm affords time-frequency analysis of a large class of signals. This powerful tool has been applied in the analysis of scientific data, structural system identification, mechanical system fault detection, and even image processing. The purpose of this paper is to demonstrate the potential applications of the Hilbert-Huang algorithm for the analysis of aeroelastic systems, with improvements such as localized/online processing. Applications for correlations between system input and output, and amongst output sensors, are discussed to characterize the time-varying amplitude and frequency correlations present in the various components of multiple data channels. Online stability analyses and modal identification are also presented. Examples are given using aeroelastic test data from the F/A-18 Active Aeroelastic Wing aircraft, an Aerostructures Test Wing, and pitch-plunge simulation.
Analysis of high vacuum systems using SINDA'85
NASA Technical Reports Server (NTRS)
Spivey, R. A.; Clanton, S. E.; Moore, J. D.
1993-01-01
The theory, algorithms, and test data correlation analysis of a math model developed to predict the performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. Development of user subroutines which predict the flow characteristics in conjunction with the SINDA'85/FLUINT analysis software is discussed. The resistance-capacitance network approach with application to vacuum system analysis is demonstrated, and results from the model are correlated with test data. Although the model was developed for the Space Station Freedom Vacuum Exhaust System, the user subroutines written into the SINDA'85/FLUINT thermal analysis model provide a powerful tool that can be used to predict the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
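In its simplest form the R-C idea reduces to one volume (the capacitance) venting through one conductance (the inverse resistance). A minimal explicit-Euler sketch of that single-node case, not the SINDA'85/FLUINT subroutines themselves; the conductance is held constant here, whereas the actual model varies it across the viscous, transition, and molecular regimes:

```python
def vent_transient(p0, volume, conductance, p_exhaust, dt, steps):
    """Explicit-Euler transient of one volume venting through a conductance:
    V * dp/dt = -C * (p - p_exhaust), the R-C analogue with R = 1/C.
    Pressures in Pa, volume in m^3, conductance in m^3/s."""
    p = p0
    history = [p]
    for _ in range(steps):
        p += dt * (-conductance / volume) * (p - p_exhaust)
        history.append(p)
    return history

# A 0.1 m^3 volume at 1 bar venting through 1 L/s toward ~vacuum (1 Pa):
# the time constant is V/C = 100 s, so after 1000 s the pressure has
# decayed by roughly e^-10.
history = vent_transient(1e5, 0.1, 1e-3, 1.0, 1.0, 1000)
```

Chaining several such nodes through shared conductances gives the network form used for the full exhaust system, solved exactly like a thermal R-C network.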
SOMBI: Bayesian identification of parameter relations in unstructured cosmological data
NASA Astrophysics Data System (ADS)
Frank, Philipp; Jasche, Jens; Enßlin, Torsten A.
2016-11-01
This work describes the implementation and application of a correlation determination method based on self-organizing maps and Bayesian inference (SOMBI). SOMBI aims to automatically identify relations between different observed parameters in unstructured cosmological or astrophysical surveys by identifying data clusters in high-dimensional datasets via the self-organizing map neural network algorithm. Parameter relations are then revealed by means of a Bayesian inference within the respective identified data clusters. Specifically, such relations are assumed to be parametrized as a polynomial of unknown order. The Bayesian approach results in a posterior probability distribution function for the respective polynomial coefficients. To decide which polynomial order suffices to describe the correlation structures in the data, we include a model selection method, the Bayesian information criterion, in the analysis. The performance of the SOMBI algorithm is tested with mock data. As an illustration, we also provide applications of our method to cosmological data. In particular, we present results of a correlation analysis between galaxy and active galactic nucleus (AGN) properties provided by the SDSS catalog and the cosmic large-scale structure (LSS). The results indicate that the combined galaxy and LSS dataset is indeed clustered into several sub-samples of data with different average properties (for example, different stellar masses or web-type classifications). The majority of data clusters appear to have a similar correlation structure between galaxy properties and the LSS. In particular, we revealed a positive and linear dependency between the stellar mass, the absolute magnitude and the color of a galaxy and the corresponding cosmic density field. A remaining subset of the data shows inverted correlations, which might be an artifact of non-linear redshift distortions.
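The order-selection step can be sketched independently of the SOM clustering: fit each candidate polynomial order by least squares and keep the order minimising BIC = n*ln(RSS/n) + k*ln(n). A self-contained sketch (the data and the deterministic perturbation standing in for noise are illustrative, not from the paper):

```python
import math

def fit_poly(xs, ys, order):
    """Least-squares polynomial fit via the normal equations, solved by
    Gaussian elimination with partial pivoting; returns coefficients c[i]."""
    m = order + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * m
    for i in reversed(range(m)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, m))) / A[i][i]
    return coef

def bic_order(xs, ys, max_order):
    """Choose the polynomial order minimising BIC = n*ln(RSS/n) + k*ln(n)."""
    n = len(xs)
    best, best_bic = 0, float("inf")
    for order in range(max_order + 1):
        coef = fit_poly(xs, ys, order)
        rss = sum((y - sum(c * x ** i for i, c in enumerate(coef))) ** 2
                  for x, y in zip(xs, ys))
        bic = n * math.log(rss / n) + (order + 1) * math.log(n)
        if bic < best_bic:
            best, best_bic = order, bic
    return best

# Quadratic signal plus a small fast oscillation acting as deterministic
# "noise" that no low-order polynomial can absorb:
xs = [i / 10.0 for i in range(50)]
ys = [1.0 - 2.0 * x + 0.5 * x * x + 0.01 * math.sin(17.0 * x) for x in xs]
```

BIC's k*ln(n) penalty stops the fit at order 2 here; higher orders barely reduce the residual but pay the full penalty, which is the behaviour the criterion is included to enforce.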
Calibration of Passive Microwave Polarimeters that Use Hybrid Coupler-Based Correlators
NASA Technical Reports Server (NTRS)
Piepmeier, J. R.
2003-01-01
Four calibration algorithms are studied for microwave polarimeters that use hybrid coupler-based correlators: 1) conventional two-look of hot and cold sources, 2) three looks of hot and cold source combinations, 3) two-look with correlated source, and 4) four-look combining methods 2 and 3. The systematic errors are found to depend on the polarimeter component parameters and accuracy of calibration noise temperatures. A case study radiometer in four different remote sensing scenarios was considered in light of these results. Applications for ocean surface salinity, ocean surface winds, and soil moisture were found to be sensitive to different systematic errors. Finally, a standard uncertainty analysis was performed on the four-look calibration algorithm, which was found to be most sensitive to the correlated calibration source.
Multivariate meta-analysis for non-linear and other multi-parameter associations
Gasparrini, A; Armstrong, B; Kenward, M G
2012-01-01
In this paper, we formalize the application of multivariate meta-analysis and meta-regression to synthesize estimates of multi-parameter associations obtained from different studies. This modelling approach extends the standard two-stage analysis used to combine results across different sub-groups or populations. The most straightforward application is for the meta-analysis of non-linear relationships, described for example by regression coefficients of splines or other functions, but the methodology easily generalizes to any setting where complex associations are described by multiple correlated parameters. The modelling framework of multivariate meta-analysis is implemented in the package mvmeta within the statistical environment R. As an illustrative example, we propose a two-stage analysis for investigating the non-linear exposure–response relationship between temperature and non-accidental mortality using time-series data from multiple cities. Multivariate meta-analysis represents a useful analytical tool for studying complex associations through a two-stage procedure. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22807043
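The second stage of such a two-stage analysis, pooling multi-parameter estimates across studies, can be illustrated with a fixed-effect, inverse-variance-weighted sketch. The mvmeta package additionally models between-study heterogeneity (random effects); the function below and its toy numbers are our own simplified stand-in.

```python
import numpy as np

def mv_meta_fixed(thetas, covs):
    """Fixed-effect multivariate meta-analysis by inverse-variance weighting.

    thetas: list of (p,) coefficient vectors, one per study
    covs:   list of (p,p) within-study covariance matrices
    Returns the pooled estimate and its covariance.
    """
    W = [np.linalg.inv(S) for S in covs]      # study precision matrices
    V = np.linalg.inv(sum(W))                 # pooled covariance
    pooled = V @ sum(Wi @ t for Wi, t in zip(W, thetas))
    return pooled, V

# two hypothetical studies estimating the same 2-parameter spline
t1, S1 = np.array([1.0, 0.5]), np.diag([0.04, 0.04])
t2, S2 = np.array([1.2, 0.3]), np.diag([0.04, 0.04])
pooled, V = mv_meta_fixed([t1, t2], [S1, S2])
```

With equal within-study covariances the pooled estimate is simply the average of the study coefficients, and the pooled covariance is halved.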
Correlation mass method for analysis of neutrinos from supernova 1987A
NASA Technical Reports Server (NTRS)
Chiu, Hong-Yee; Chan, Kwing L.; Kondo, Yoji
1988-01-01
Application of a time-energy correlation method to the Kamiokande II (KII) observations of neutrinos apparently emitted from supernova 1987A has yielded a neutrino rest mass of 3.6 eV. A Monte Carlo analysis yields a probability distribution for the neutrino rest mass peaked at 2.8 eV and dropping to 50 percent of the peak at 1.4 and 4.8 eV. Although the KII data indicate emission extending over a period on the order of 10 sec, both the data from the Irvine-Michigan-Brookhaven experiment and the KII data show a tendency for the more energetic neutrinos to be emitted earlier at the source, suggesting the possibility of cooling.
Unitary subsector of generalized minimal models
NASA Astrophysics Data System (ADS)
Behan, Connor
2018-05-01
We revisit the line of nonunitary theories that interpolate between the Virasoro minimal models. Numerical bootstrap applications have brought about interest in the four-point function involving the scalar primary of lowest dimension. Using recent progress in harmonic analysis on the conformal group, we prove the conjecture that global conformal blocks in this correlator appear with positive coefficients. We also compute many such coefficients in the simplest mixed correlator system. Finally, we comment on the status of using global conformal blocks to isolate the truly unitary points on this line.
Digital speckle correlation for nondestructive testing of corrosion
NASA Astrophysics Data System (ADS)
Paiva, Raul D., Jr.; Soga, Diogo; Muramatsu, Mikiya; Hogert, Elsa N.; Landau, Monica R.; Ruiz Gale, Maria F.; Gaggioli, Nestor G.
1999-07-01
This paper describes the use of optical correlation of speckle patterns to detect and analyze metallic corrosion phenomena, and shows the experimental set-up used. We present new results on the characterization of the corrosion process using a model based on electroerosion phenomena. We also provide valuable information about surface microrelief changes, which is useful in numerous engineering applications. The results show that the technique opens new possibilities for the analysis of corrosion and oxidation processes, particularly in real time.
Nichols, D.J.
2005-01-01
Palynology can be effectively used in coal systems analysis to understand the nature of ancient coal-forming peat mires. Pollen and spores preserved in coal effectively reveal the floristic composition of mires, which differed substantially through geologic time, and contribute to determination of depositional environment and paleoclimate. Such applications are most effective when integrated with paleobotanical and coal-petrographic data. Examples of previous studies of Miocene, Carboniferous, and Paleogene coal beds illustrate the methods and results. Palynological age determinations and correlations of deposits are also important in coal systems analysis to establish stratigraphic setting. Application to studies of coalbed methane generation shows potential because certain kinds of pollen are associated with gas-prone lithotypes. ©2005 Geological Society of America.
Cervical Vertebral Body's Volume as a New Parameter for Predicting the Skeletal Maturation Stages.
Choi, Youn-Kyung; Kim, Jinmi; Yamaguchi, Tetsutaro; Maki, Koutaro; Ko, Ching-Chang; Kim, Yong-Il
2016-01-01
This study aimed to determine the correlation between the volumetric parameters derived from the images of the second, third, and fourth cervical vertebrae by using cone beam computed tomography with skeletal maturation stages and to propose a new formula for predicting skeletal maturation by using regression analysis. We obtained the estimation of skeletal maturation levels from hand-wrist radiographs and volume parameters derived from the second, third, and fourth cervical vertebrae bodies from 102 Japanese patients (54 women and 48 men, 5-18 years of age). We performed Pearson's correlation coefficient analysis and simple regression analysis. All volume parameters derived from the second, third, and fourth cervical vertebrae exhibited statistically significant correlations (P < 0.05). The simple regression model with the greatest R-square indicated the fourth-cervical-vertebra volume as an independent variable with a variance inflation factor less than ten. The explanatory power was 81.76%. Volumetric parameters of cervical vertebrae using cone beam computed tomography are useful in regression models. The derived regression model has the potential for clinical application as it enables a simple and quantitative analysis to evaluate skeletal maturation level.
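The two analyses used in the study, Pearson correlation and simple linear regression, are standard and can be sketched in a few lines of NumPy. The vertebral volumes and maturation stages below are hypothetical illustrative values, not the study's data.

```python
import numpy as np

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    x = np.asarray(x, float); y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

def simple_regression(x, y):
    """Least-squares fit y = a + b*x; returns (intercept, slope, R^2)."""
    b, a = np.polyfit(x, y, 1)
    return float(a), float(b), pearson_r(x, y) ** 2

# hypothetical C4 vertebral volumes (mm^3) vs. hand-wrist maturation stage
volume = np.array([310.0, 420.0, 505.0, 640.0, 730.0])
stage = np.array([2.0, 4.1, 5.9, 8.2, 10.1])
a, b, r2 = simple_regression(volume, stage)
```

The squared Pearson correlation equals the R-squared of the simple regression, which is why the two analyses are reported together.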
Rosa, Maria J; Mehta, Mitul A; Pich, Emilio M; Risterucci, Celine; Zelaya, Fernando; Reinders, Antje A T S; Williams, Steve C R; Dazzan, Paola; Doyle, Orla M; Marquand, Andre F
2015-01-01
An increasing number of neuroimaging studies are based on either combining more than one data modality (inter-modal) or combining more than one measurement from the same modality (intra-modal). To date, most intra-modal studies using multivariate statistics have focused on differences between datasets, for instance relying on classifiers to differentiate between effects in the data. However, to fully characterize these effects, multivariate methods able to measure similarities between datasets are needed. One classical technique for estimating the relationship between two datasets is canonical correlation analysis (CCA). However, in the context of high-dimensional data the application of CCA is extremely challenging. A recent extension of CCA, sparse CCA (SCCA), overcomes this limitation, by regularizing the model parameters while yielding a sparse solution. In this work, we modify SCCA with the aim of facilitating its application to high-dimensional neuroimaging data and finding meaningful multivariate image-to-image correspondences in intra-modal studies. In particular, we show how the optimal subset of variables can be estimated independently and we look at the information encoded in more than one set of SCCA transformations. We illustrate our framework using Arterial Spin Labeling data to investigate multivariate similarities between the effects of two antipsychotic drugs on cerebral blood flow.
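Classical CCA, which SCCA regularizes, can be sketched directly from covariance matrices: whiten each data block and take the SVD of the cross-covariance. This is a plain CCA illustration, not the authors' sparse variant; the small ridge term `reg` is only for numerical stability and is not the sparsity penalty of SCCA.

```python
import numpy as np

def cca(X, Y, reg=1e-8):
    """First canonical correlation between two column-centred data blocks.

    Whiten each block via its covariance, then SVD the cross-covariance.
    Returns (canonical correlation, weights for X, weights for Y).
    """
    X = X - X.mean(axis=0); Y = Y - Y.mean(axis=0)
    n = len(X)
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n
    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    U, s, Vt = np.linalg.svd(K)
    a = inv_sqrt(Sxx) @ U[:, 0]   # canonical weights for X
    b = inv_sqrt(Syy) @ Vt[0]     # canonical weights for Y
    return float(s[0]), a, b

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
Y = np.column_stack([X[:, 0] + 0.05 * rng.normal(size=500),
                     rng.normal(size=500)])
rho, a, b = cca(X, Y)
```

Because the first column of Y is nearly a copy of the first column of X, the leading canonical correlation approaches one and the X weights concentrate on that shared variable.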
Application and Evaluation of an Expert Judgment Elicitation Procedure for Correlations.
Zondervan-Zwijnenburg, Mariëlle; van de Schoot-Hubeek, Wenneke; Lek, Kimberley; Hoijtink, Herbert; van de Schoot, Rens
2017-01-01
The purpose of the current study was to apply and evaluate a procedure to elicit expert judgments about correlations, and to update this information with empirical data. The result is a face-to-face group elicitation procedure with as its central element a trial roulette question that elicits experts' judgments expressed as distributions. During the elicitation procedure, a concordance probability question was used to provide feedback to the experts on their judgments. We evaluated the elicitation procedure in terms of validity and reliability by means of an application with a small sample of experts. Validity means that the elicited distributions accurately represent the experts' judgments. Reliability concerns the consistency of the elicited judgments over time. Four behavioral scientists provided their judgments with respect to the correlation between cognitive potential and academic performance for two separate populations enrolled at a specific school in the Netherlands that provides special education to youth with severe behavioral problems: youth with autism spectrum disorder (ASD), and youth with diagnoses other than ASD. Measures of face-validity, feasibility, convergent validity, coherence, and intra-rater reliability showed promising results. Furthermore, the current study illustrates the use of the elicitation procedure and elicited distributions in a social science application. The elicited distributions were used as a prior for the correlation, and updated with data for both populations collected at the school of interest. The current study shows that the newly developed elicitation procedure combining the trial roulette method with the elicitation of correlations is a promising tool, and that the results of the procedure are useful as prior information in a Bayesian analysis.
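Using an elicited distribution as a prior for a correlation and updating it with data can be sketched on a grid, with the Fisher-z approximation supplying the likelihood. This is our own illustrative scheme, not the authors' procedure; the prior shape and the data values are hypothetical.

```python
import numpy as np

def update_correlation_prior(prior_pdf, r_obs, n, grid=1981):
    """Grid-based Bayesian update of an elicited prior over a correlation.

    prior_pdf: callable rho -> unnormalized prior density (standing in
               for the elicited 'trial roulette' distribution)
    r_obs, n:  sample correlation and sample size of the new data
    Likelihood uses the Fisher-z approximation:
    atanh(r_obs) ~ Normal(atanh(rho), 1/(n-3)).
    """
    rho = np.linspace(-0.99, 0.99, grid)
    z_obs, sd = np.arctanh(r_obs), 1.0 / np.sqrt(n - 3)
    like = np.exp(-0.5 * ((z_obs - np.arctanh(rho)) / sd) ** 2)
    post = prior_pdf(rho) * like
    post /= post.sum()                 # normalize over the grid
    return rho, post

# hypothetical elicited prior: roughly Normal(0.3, 0.15) on the correlation
prior = lambda rho: np.exp(-0.5 * ((rho - 0.3) / 0.15) ** 2)
rho, post = update_correlation_prior(prior, r_obs=0.6, n=50)
post_mean = float((rho * post).sum())
```

The posterior mean lands between the expert's prior center and the observed sample correlation, weighted by their respective precisions.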
Sato, Koichi; Fukushi, Kiyoshi; Shinotoh, Hitoshi; Shimada, Hitoshi; Hirano, Shigeki; Tanaka, Noriko; Suhara, Tetsuya; Irie, Toshiaki; Ito, Hiroshi
2013-11-16
Recently, we reported an information density theory and an analysis using three parameters plus a shorter scan than the conventional method (3P+) for the amyloid-binding ligand [11C]Pittsburgh compound B (PIB) as an example of a non-highly reversible positron emission tomography (PET) ligand. This article describes an extension of 3P+ analysis to noninvasive '3P++' analysis (3P+ plus use of a reference tissue for the input function). In 3P++ analysis for [11C]PIB, the cerebellum was used as a reference tissue (negligible specific binding). Fifteen healthy subjects (NC) and fifteen Alzheimer's disease (AD) patients participated. The k3 (index of receptor density) values were estimated with 40-min PET data and a three-parameter reference tissue model and were compared with those from 40-min 3P+ analysis as well as standard 90-min four-parameter (4P) analysis with an arterial input function. Simulation studies were performed to explain the k3 biases observed in 3P++ analysis. Good model fits of 40-min PET data were observed in both reference and target regions-of-interest (ROIs). High linear intra-subject (inter-15 ROI) correlations of k3 between 3P++ (Y-axis) and 3P+ (X-axis) analyses were shown in one NC (r2 = 0.972, slope = 0.845) and in one AD (r2 = 0.982, slope = 0.655), whereas inter-subject k3 correlations in a target region (left lateral temporal cortex) from 30 subjects (15 NC + 15 AD) were somewhat lower (r2 = 0.739, slope = 0.461). Similar results were shown between 3P++ and 4P analyses: r2 = 0.953 for intra-subject k3 in NC, r2 = 0.907 for that in AD, and r2 = 0.711 for inter-30-subject k3. Simulation studies showed that the lower inter-subject k3 correlations and significant negative k3 biases were due not to instability of the 3P++ analysis but rather to inter-subject variation of both k2 (index of brain-to-blood transport) and k3 (not completely negligible) in the reference region.
In [11C]PIB, the applicability of 3P++ analysis may be restricted to intra-subject comparison such as follow-up studies. The 3P++ method itself is thought to be robust and may be more applicable to other non-highly reversible PET ligands with ideal reference tissue.
Time-resolved metabolomics reveals metabolic modulation in rice foliage
Sato, Shigeru; Arita, Masanori; Soga, Tomoyoshi; Nishioka, Takaaki; Tomita, Masaru
2008-01-01
Background: To elucidate the interaction of dynamics among modules that constitute biological systems, comprehensive datasets obtained from "omics" technologies have been used. In recent plant metabolomics approaches, the reconstruction of metabolic correlation networks has been attempted using statistical techniques. However, the results were unsatisfactory and effective data-mining techniques that apply appropriate comprehensive datasets are needed. Results: Using capillary electrophoresis mass spectrometry (CE-MS) and capillary electrophoresis diode-array detection (CE-DAD), we analyzed the dynamic changes in the level of 56 basic metabolites in plant foliage (Oryza sativa L. ssp. japonica) at hourly intervals over a 24-hr period. Unsupervised clustering of comprehensive metabolic profiles using Kohonen's self-organizing map (SOM) allowed classification of the biochemical pathways activated by the light and dark cycle. The carbon and nitrogen (C/N) metabolism in both periods was also visualized as a phenotypic linkage map that connects network modules on the basis of traditional metabolic pathways rather than pairwise correlations among metabolites. The regulatory networks of C/N assimilation/dissimilation at each time point were consistent with previous works on plant metabolism. In response to environmental stress, glutathione and spermidine fluctuated synchronously with their regulatory targets. Adenine nucleosides and nicotinamide coenzymes were regulated by phosphorylation and dephosphorylation. We also demonstrated that SOM analysis was applicable to the estimation of unidentifiable metabolites in metabolome analysis. Hierarchical clustering of a correlation coefficient matrix could help identify the bottleneck enzymes that regulate metabolic networks.
Conclusion: Our results showed that our SOM analysis with appropriate metabolic time-courses effectively revealed the synchronous dynamics among metabolic modules and elucidated the underlying biochemical functions. The application of discrimination of unidentified metabolites and the identification of bottleneck enzymatic steps even to non-targeted comprehensive analysis promise to facilitate an understanding of large-scale interactions among components in biological systems. PMID:18564421
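A minimal 1-D Kohonen SOM illustrates the unsupervised clustering step used above. This is a generic sketch, not the authors' implementation; the "day" and "night" profiles are synthetic stand-ins for light- and dark-period metabolite time-courses.

```python
import numpy as np

def train_som(data, n_units=4, iters=3000, lr0=0.5, sigma0=1.0):
    """Minimal 1-D Kohonen self-organizing map.

    Units are initialized from evenly spaced samples; the learning rate
    and neighborhood width decay linearly over training."""
    rng = np.random.default_rng(0)
    W = data[np.linspace(0, len(data) - 1, n_units).astype(int)].astype(float)
    units = np.arange(n_units)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = int(np.argmin(((W - x) ** 2).sum(axis=1)))  # best-matching unit
        frac = 1.0 - t / iters
        h = np.exp(-((units - bmu) ** 2) / (2 * (sigma0 * frac + 0.1) ** 2))
        W += (lr0 * frac) * h[:, None] * (x - W)
    return W

def assign(data, W):
    """Index of the best-matching unit for each profile."""
    return np.argmin(((data[:, None, :] - W[None, :, :]) ** 2).sum(-1), axis=1)

rng = np.random.default_rng(1)
day = rng.normal(0.0, 0.5, size=(30, 6))     # light-period-like profiles
night = rng.normal(8.0, 0.5, size=(30, 6))   # dark-period-like profiles
W = train_som(np.vstack([day, night]))
```

After training, the two families of profiles map onto disjoint sets of units, which is the clustering behavior the abstract relies on.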
NASA Astrophysics Data System (ADS)
Peng, Yu-Hao; Heintz, Ryan; Wang, Zhuo; Guo, Yumei; Myers, Kalisa; Scremin, Oscar; Maarek, Jean-Michel; Holschneider, Daniel
2014-12-01
Current rodent connectome projects are revealing brain structural connectivity with unprecedented resolution and completeness. How subregional structural connectivity relates to subregional functional interactions is an emerging research topic. We describe a method for standardized, mesoscopic-level data sampling from autoradiographic coronal sections of the rat brain, and for correlation-based analysis and intuitive display of cortico-cortical functional connectivity (FC) on a flattened cortical map. A graphical user interface, “Cx-2D,” allows for the display of significant correlations of individual regions-of-interest, as well as graph theoretical metrics across the cortex. Cx-2D was tested on an autoradiographic data set of cerebral blood flow (CBF) of rats that had undergone bilateral striatal lesions, followed by 4 weeks of aerobic exercise training or no exercise. Effects of lesioning and exercise on cortico-cortical FC were examined during a locomotor challenge in this rat model of Parkinsonism. Subregional FC analysis revealed a rich functional reorganization of the brain in response to lesioning and exercise that was not apparent in a standard analysis focused on CBF of isolated brain regions. Lesioned rats showed diminished degree centrality of the lateral primary motor cortex, as well as neighboring somatosensory cortex--changes that were substantially reversed in lesioned rats following exercise training. Seed analysis revealed that exercise increased positive correlations in motor and somatosensory cortex, with little effect in non-sensorimotor regions such as visual, auditory, and piriform cortex. The current analysis revealed that exercise partially reinstated sensorimotor FC lost following dopaminergic deafferentation. Cx-2D allows for standardized data sampling from images of brain slices, as well as analysis and display of cortico-cortical FC in the rat cerebral cortex, with potential applications in a variety of autoradiographic and histologic studies.
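Degree centrality over a correlation-based functional connectivity graph, as used above, reduces to thresholding the region-by-region correlation matrix and counting suprathreshold links. A minimal sketch with synthetic signals (not the Cx-2D code; the threshold value is our choice):

```python
import numpy as np

def degree_centrality(ts, r_thresh=0.7):
    """Build a binary functional connectivity graph from region signals and
    return each region's degree (number of suprathreshold correlations).

    ts: (n_timepoints, n_regions) array of per-region signals.
    """
    R = np.corrcoef(ts.T)          # regions x regions correlation matrix
    np.fill_diagonal(R, 0.0)       # ignore self-correlation
    return (R > r_thresh).sum(axis=1)

rng = np.random.default_rng(0)
shared = rng.normal(size=300)                  # common driving signal
ts = np.column_stack([
    shared + 0.1 * rng.normal(size=300),       # region A
    shared + 0.1 * rng.normal(size=300),       # region B, coupled to A
    rng.normal(size=300),                      # region C, independent
])
deg = degree_centrality(ts)
```

The two coupled regions each acquire one edge while the independent region stays disconnected, mirroring how lesion-induced loss of coupling shows up as reduced degree centrality.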
System Biology Approach: Gene Network Analysis for Muscular Dystrophy.
Censi, Federica; Calcagnini, Giovanni; Mattei, Eugenio; Giuliani, Alessandro
2018-01-01
Phenotypic changes at different organization levels, from cell to entire organism, are associated with changes in the pattern of gene expression. These changes involve the entire genome expression pattern and rely heavily upon correlation patterns among genes. The classical approach used to analyze gene expression data builds upon the application of supervised statistical techniques to detect genes differentially expressed between two or more phenotypes (e.g., normal vs. disease). The use of an a posteriori, unsupervised approach based on principal component analysis (PCA) and the subsequent construction of gene correlation networks can shed light on unexpected behavior of the gene regulation system while maintaining a more naturalistic view of the studied system. In this chapter we applied an unsupervised method to discriminate DMD patients and controls. The genes having the highest absolute scores in the discrimination between the groups were then analyzed in terms of gene expression networks, on the basis of their mutual correlation in the two groups. The correlation network structures suggest two different modes of gene regulation in the two groups, reminiscent of important aspects of DMD pathogenesis.
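The unsupervised first step, PCA of the expression matrix, can be sketched via the SVD. The gene signature and patient/control labels below are synthetic stand-ins, not DMD data.

```python
import numpy as np

def pca(X, k=2):
    """Unsupervised PCA via SVD of the centred expression matrix.

    X: (samples, genes). Returns sample scores on the top k components
    and the fraction of total variance each component explains."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    evr = s**2 / (s**2).sum()
    return U[:, :k] * s[:k], evr[:k]

rng = np.random.default_rng(0)
pattern = rng.normal(size=50)                # hypothetical expression signature
labels = np.array([1.0] * 5 + [-1.0] * 5)    # 5 "patients", 5 "controls"
X = np.outer(3.0 * labels, pattern) + rng.normal(0.0, 0.5, size=(10, 50))
scores, evr = pca(X)
s0 = scores[:, 0]
```

When a single expression signature separates the groups, the first component absorbs most of the variance and its sample scores split the two groups by sign, without any label ever being supplied to the analysis.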
Benali, Anouar; Shulenburger, Luke; Krogel, Jaron T.; ...
2016-06-07
The Magneli phase Ti4O7 is an important transition metal oxide with a wide range of applications because of its interplay between charge, spin, and lattice degrees of freedom. At low temperatures, it has non-trivial magnetic states very close in energy, driven by electronic exchange and correlation interactions. We have examined three low-lying states, one ferromagnetic and two antiferromagnetic, and calculated their energies as well as Ti spin moment distributions using highly accurate Quantum Monte Carlo methods. We compare our results to those obtained from density functional theory-based methods that include approximate corrections for exchange and correlation. Our results confirm the nature of the states and their ordering in energy, as compared with density functional theory methods. However, the energy differences and spin distributions differ. Here, a detailed analysis suggests that non-local exchange-correlation functionals, in addition to other approximations such as LDA+U to account for correlations, are needed to simultaneously obtain better estimates for spin moments, distributions, energy differences, and energy gaps.
Finite-Difference Time-Domain Analysis of Tapered Photonic Crystal Fiber
NASA Astrophysics Data System (ADS)
Ali, M. I. Md; Sanusidin, S. N.; Yusof, M. H. M.
2018-03-01
This paper describes the simulation of a tapered photonic crystal fiber (PCF) of the LMA-8 single-mode type, based on the correlation of the scattering pattern at a wavelength of 1.55 μm, the analysis of the transmission spectrum over the wavelength range of 1.0 to 2.5 μm, and the correlation of the transmission spectrum with the refractive-index change in the photonic crystal holes for taper sizes of 0.1 to 1.0, using Optiwave simulation software. The main objective is to simulate the tapered LMA-8 PCF with the Finite-Difference Time-Domain (FDTD) technique for sensing applications, improving the capabilities of the PCF without collapsing the crystal holes. The FDTD techniques used are scattering-pattern and transverse-transmission simulations, with principal component analysis (PCA) used as a mathematical tool to model the data obtained in MathCad software. The simulation results showed no obvious correlation of the scattering pattern at a wavelength of 1.55 μm, a correlation between taper size and transverse transmission, and a parabolic relationship involving the refractive-index changes inside the crystal structure.
The time-frequency method of signal analysis in internal combustion engine diagnostics
NASA Astrophysics Data System (ADS)
Avramchuk, V. S.; Kazmin, V. P.; Faerman, V. A.; Le, V. T.
2017-01-01
The paper presents the results of the study of applicability of time-frequency correlation functions to solving the problems of internal combustion engine fault diagnostics. The proposed methods are theoretically justified and experimentally tested. In particular, the method’s applicability is illustrated by the example of specially generated signals that simulate the vibration of an engine both during the normal operation and in the case of a malfunction in the system supplying fuel to the cylinders. This method was confirmed during an experiment with an automobile internal combustion engine. The study offers the main findings of the simulation and the experiment and highlights certain characteristic features of time-frequency autocorrelation functions that allow one to identify malfunctions in an engine’s cylinder. The possibility in principle of using time-frequency correlation functions in function testing of the internal combustion engine is demonstrated. The paper’s conclusion proposes further research directions including the application of the method to diagnosing automobile gearboxes.
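A simple stand-in for a time-frequency correlation function is a windowed, normalized cross-correlation: one correlation-versus-lag row per analysis window. The windowing scheme below is our own generic choice, not the paper's exact method.

```python
import numpy as np

def short_time_xcorr(x, y, win=256, hop=128):
    """Windowed normalized cross-correlation of two signals.

    Returns an (n_windows, 2*win-1) map: each row is the normalized
    cross-correlation over one analysis window, so transient faults show
    up as localized changes along the window (time) axis."""
    rows = []
    for start in range(0, len(x) - win + 1, hop):
        xs = x[start:start + win] - x[start:start + win].mean()
        ys = y[start:start + win] - y[start:start + win].mean()
        cc = np.correlate(xs, ys, mode="full")
        denom = np.sqrt((xs**2).sum() * (ys**2).sum())
        rows.append(cc / denom)
    return np.array(rows)

rng = np.random.default_rng(0)
t = np.arange(4096)
x = np.sin(2 * np.pi * t / 50) + 0.2 * rng.normal(size=t.size)  # vibration-like
M = short_time_xcorr(x, x, win=256, hop=128)
```

For a signal correlated with itself, every window peaks at the zero-lag index (win-1) with value one; a cylinder malfunction would instead perturb individual rows of the map.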
Definition of Beam Diameter for Electron Beam Welding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burgardt, Paul; Pierce, Stanley W.; Dvornak, Matthew John
It is useful to characterize the dimensions of the electron beam during process development for electron beam welding applications. Analysis of the behavior of electron beam welds is simplest when a single number can be assigned to the beam properties that describes the size of the beam spot; this value we generically call the “beam diameter”. This approach has worked well for most applications and electron beam welding machines, with the weld dimensions (width and depth) correlating well with the beam diameter. However, in recent weld development for a refractory alloy, Ta-10W, welded with a low voltage electron beam machine (LVEB), it was found that the weld dimensions (weld penetration and weld width) did not correlate well with the beam diameter and especially with the experimentally determined sharp focus point. These data suggest that the presently used definition of beam diameter may not be optimal for all applications. The possible reasons for this discrepancy and a suggested alternative diameter definition are the subject of this paper.
Investigation of short cavity CRDS noise terms by optical correlation
NASA Astrophysics Data System (ADS)
Griffin, Steven T.; Fathi, Jason
2013-05-01
Cavity Ring Down Spectroscopy (CRDS) has been identified as having significant potential for Department of Defense security and sensing applications. Significant factors in the development of new sensor architectures are portability, robustness, and economy; a key factor in new CRDS sensor architectures is cavity length. Prior publications have examined the role of cavity length in the sensing modality, both from the standpoint of system design and the identification of potential difficulties presented by novel approaches. Of interest here are two new noise terms designated turbulence-like and speckle-like in prior publications, where theoretical and some empirical data were presented. This presentation addresses the automation of the experimental apparatus, new data analysis, and implications regarding the significance of the two noise terms. This is accomplished through Analog-to-Digital Conversion (ADC) of the output of a custom-designed optical correlator. Details of the unique application of the developed instrument and implications for short-cavity (portable) CRDS applications are presented.
Enhanced correlation of received power-signal fluctuations in bidirectional optical links
NASA Astrophysics Data System (ADS)
Minet, Jean; Vorontsov, Mikhail A.; Polnau, Ernst; Dolfi, Daniel
2013-02-01
A study of the correlation between the power signals received at both ends of bidirectional free-space optical links is presented. By use of the quasi-optical approximation, we show that an ideal (theoretically 100%) power-signal correlation can be achieved in optical links with specially designed monostatic transceivers based on single-mode fiber collimators. The theoretical prediction of enhanced correlation is supported both by experiments conducted over a 7 km atmospheric path and wave optics numerical analysis of the corresponding bidirectional optical link. In the numerical simulations, we also compare correlation properties of received power signals for different atmospheric conditions and for optical links with monostatic and bistatic geometries based on single-mode fiber collimator and on power-in-the-bucket transceiver types. Applications of the observed phenomena for signal fading mitigation and turbulence-enhanced communication link security in free-space laser communication links are discussed.
Dynamical Analysis of an SEIT Epidemic Model with Application to Ebola Virus Transmission in Guinea.
Li, Zhiming; Teng, Zhidong; Feng, Xiaomei; Li, Yingke; Zhang, Huiguo
2015-01-01
In order to investigate the transmission mechanism of the infectious individual with Ebola virus, we establish an SEIT (susceptible, exposed in the latent period, infectious, and treated/recovery) epidemic model. The basic reproduction number is defined. The mathematical analysis on the existence and stability of the disease-free equilibrium and endemic equilibrium is given. As the applications of the model, we use the recognized infectious and death cases in Guinea to estimate parameters of the model by the least square method. With suitable parameter values, we obtain the estimated value of the basic reproduction number and analyze the sensitivity and uncertainty property by partial rank correlation coefficients.
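Partial rank correlation coefficients, the sensitivity measure named above, can be computed by rank-transforming all variables and correlating the residuals after regressing out the other parameters. The sketch below uses a generic three-parameter toy model, not the SEIT model.

```python
import numpy as np

def _ranks(a):
    # simple 0..n-1 ranks (continuous samples, so ties are not handled)
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(len(a), dtype=float)
    return r

def prcc(params, output):
    """Partial rank correlation of each parameter with the model output,
    controlling for the remaining parameters.

    params: (n_samples, n_params); output: (n_samples,)."""
    n, k = params.shape
    R = np.column_stack([_ranks(params[:, j]) for j in range(k)])
    y = _ranks(output)
    coeffs = []
    for j in range(k):
        Z = np.column_stack([np.ones(n), np.delete(R, j, axis=1)])
        rx = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
        ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
        coeffs.append(float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry))))
    return coeffs

rng = np.random.default_rng(0)
P = rng.uniform(0.0, 1.0, size=(500, 3))     # three sampled parameters
out = 2.0 * P[:, 0] - 3.0 * P[:, 1] + 0.1 * rng.normal(size=500)
g = prcc(P, out)
```

The sign and magnitude of each coefficient rank the parameters by influence: the two active parameters score near ±1 while the inert one stays near zero.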
Percolation analysis for cosmic web with discrete points
NASA Astrophysics Data System (ADS)
Zhang, Jiajun; Cheng, Dalong; Chu, Ming-Chung
2018-01-01
Percolation analysis has long been used to quantify the connectivity of the cosmic web. Most of the previous work is based on density fields on grids. By smoothing into fields, we lose information about galaxy properties like shape or luminosity. The lack of mathematical modeling also limits our understanding of the percolation analysis. To overcome these difficulties, we have studied percolation analysis based on discrete points. Using a friends-of-friends (FoF) algorithm, we generate the S-bb relation between the fractional mass of the largest connected group (S) and the FoF linking length (bb). We propose a new model, the probability cloud cluster expansion theory, to relate the S-bb relation with correlation functions. We show that the S-bb relation reflects a combination of all orders of correlation functions. Using N-body simulation, we find that the S-bb relation is robust against redshift distortion and incompleteness in observation. From the Bolshoi simulation, with halo abundance matching (HAM), we have generated a mock galaxy catalog. Good matching of the projected two-point correlation function with observation is confirmed. However, comparing the mock catalog with the latest galaxy catalog from Sloan Digital Sky Survey (SDSS) Data Release (DR) 12, we have found significant differences in their S-bb relations. This indicates that the mock galaxy catalog cannot accurately retain correlation functions of higher order than the two-point correlation function, which reveals the limit of the HAM method. As a new measurement, the S-bb relation is applicable to a wide range of data types, fast to compute, robust against redshift distortion and incompleteness, and contains information on all orders of correlation functions.
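The measurement above reduces to a friends-of-friends grouping at each linking length. A minimal O(n^2) union-find sketch follows (fine for illustration; real catalogs need spatial indexing, and all points are given equal mass here):

```python
import numpy as np

def largest_group_fraction(points, b):
    """Friends-of-friends at linking length b via union-find over all pairs.

    Returns S, the fraction of points in the largest connected group
    (equal masses assumed)."""
    n = len(points)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    for i in range(n):
        d = np.sqrt(((points - points[i]) ** 2).sum(axis=1))
        for j in np.nonzero(d < b)[0]:
            ri, rj = find(i), find(int(j))
            if ri != rj:
                parent[ri] = rj             # link the two groups
    roots = np.array([find(i) for i in range(n)])
    return np.bincount(roots).max() / n

# a regular 1-D chain with unit spacing: fully linked once b exceeds 1
chain = np.arange(10.0).reshape(-1, 1)
```

Sweeping b from below to above the point spacing traces the S(b) curve: S jumps from 1/n (all singletons) to 1 (one percolating group) at the characteristic scale of the point set.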
Hur'iev, S O; Novykov, F M; Shuryhin, O Iu; Ivanov, V I
2011-04-01
A total of 131 injured persons with penetrating abdominal wounds and hepatic injury were examined. Correlation analysis was performed on the results of assessing the injured persons' state with prognostic scales, with the aim of predicting the course of the traumatic process.
Fluorescence correlation spectroscopy: principles and applications.
Bacia, Kirsten; Haustein, Elke; Schwille, Petra
2014-07-01
Fluorescence correlation spectroscopy (FCS) is used to study the movements and the interactions of biomolecules at extremely dilute concentrations, yielding results with good spatial and temporal resolutions. Using a number of technical developments, FCS has become a versatile technique that can be used to study a variety of sample types and can be advantageously combined with other methods. Unlike other fluorescence-based techniques, the analysis of FCS data is not based on the average intensity of the fluorescence emission but examines the minute intensity fluctuations caused by spontaneous deviations from the mean at thermal equilibrium. These fluctuations can result from variations in local concentrations owing to molecular mobility or from characteristic intermolecular or intramolecular reactions of fluorescently labeled biomolecules present at low concentrations. Here, we provide a basic introduction to FCS, including its technical development and theoretical basis, experimental setup of an FCS system, adjustment of a setup, data acquisition, and analysis of FCS measurements. Finally, the application of FCS to the study of lipid bilayer membranes and to living cells is discussed. © 2014 Cold Spring Harbor Laboratory Press.
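The central quantity in FCS data analysis is the normalized autocorrelation of the intensity fluctuations, G(tau) = <dI(t) dI(t+tau)> / <I>^2, which is then fitted to diffusion or reaction models. A direct-estimation sketch on a synthetic trace (the trace parameters are illustrative, not a physical simulation):

```python
import numpy as np

def fcs_autocorrelation(intensity, max_lag):
    """Normalized intensity-fluctuation autocorrelation
    G(tau) = <dI(t) dI(t+tau)> / <I>^2, estimated directly from a trace."""
    I = np.asarray(intensity, float)
    dI = I - I.mean()
    G = np.array([(dI[:len(I) - k] * dI[k:]).mean()
                  for k in range(max_lag + 1)])
    return G / I.mean() ** 2

# synthetic trace: bright baseline plus slowly varying fluctuations
rng = np.random.default_rng(0)
raw = rng.normal(size=5000)
slow = np.convolve(raw, np.ones(25) / 25, mode="same")  # 25-sample smoothing
I = 100.0 + slow
G = fcs_autocorrelation(I, max_lag=100)
```

G(0) equals the relative variance of the trace, and G(tau) decays on the correlation time of the fluctuations, which is the decay an FCS fit converts into diffusion coefficients or reaction rates.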
Applications of temporal kernel canonical correlation analysis in adherence studies.
John, Majnu; Lencz, Todd; Ferbinteanu, Janina; Gallego, Juan A; Robinson, Delbert G
2017-10-01
Adherence to medication is often measured as a continuous outcome but analyzed as a dichotomous outcome due to lack of appropriate tools. In this paper, we illustrate the use of the temporal kernel canonical correlation analysis (tkCCA) as a method to analyze adherence measurements and symptom levels on a continuous scale. The tkCCA is a novel method developed for studying the relationship between neural signals and hemodynamic response detected by functional MRI during spontaneous activity. Although the tkCCA is a powerful tool, it has not been utilized outside the application that it was originally developed for. In this paper, we simulate time series of symptoms and adherence levels for patients with a hypothetical brain disorder and show how the tkCCA can be used to understand the relationship between them. We also examine, via simulations, the behavior of the tkCCA under various missing value mechanisms and imputation methods. Finally, we apply the tkCCA to a real data example of psychotic symptoms and adherence levels obtained from a study based on subjects with a first episode of schizophrenia, schizophreniform or schizoaffective disorder.
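tkCCA couples kernelization with temporal embedding; the embedding idea can be illustrated with plain lagged Pearson correlations between two series. The adherence/symptom series below are simulated, and the 3-step symptom response is an invented assumption, not a result from the study:

```python
import random

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sa = sum((v - ma) ** 2 for v in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / (sa * sb)

def lagged_correlations(x, y, max_lag):
    """Correlation between x(t) and y(t + lag) for lag = 0..max_lag --
    the temporal-embedding idea that tkCCA generalizes to many channels."""
    return [(lag, pearson(x[: len(x) - lag], y[lag:]))
            for lag in range(max_lag + 1)]

# Hypothetical series: symptom levels respond to adherence three steps later
random.seed(2)
adherence = [random.random() for _ in range(500)]
symptoms = [0.0] * 3 + [1.0 - adherence[t - 3] + 0.1 * random.gauss(0, 1)
                        for t in range(3, 500)]
lags = lagged_correlations(adherence, symptoms, 6)
best_lag = min(lags, key=lambda kv: kv[1])[0]  # most negative correlation
```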
NASA Technical Reports Server (NTRS)
Jacobson, Allan S.; Berkin, Andrew L.
1995-01-01
The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration system resulting from a NASA Jet Propulsion Laboratory (JPL) program of research into the application of graphical methods for rapidly accessing, displaying, and analyzing large multivariate multidisciplinary data sets. Running under UNIX, it is an integrated multi-application executing environment using a data-linking paradigm to dynamically interconnect and control multiple windows containing a variety of displays and manipulators. This paradigm, resulting in a system similar to a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis but also leads to a highly intuitive, easy-to-learn user interface. It provides great flexibility in rapidly interacting with large masses of complex data to detect trends, correlations, and anomalies. The system, containing an expanding suite of non-domain-specific applications, provides for the ingestion of a variety of database formats and hard-copy output of all displays. Remote networked workstations running LinkWinds may be interconnected, providing a multiuser science environment (MUSE) for collaborative data exploration by a distributed science team. The system is being developed in close collaboration with investigators in a variety of science disciplines using both archived and real-time data. It is currently being used to support the Microwave Limb Sounder (MLS) in orbit aboard the Upper Atmosphere Research Satellite (UARS). This paper describes the application of LinkWinds to these data to rapidly detect features, such as the ozone hole configuration, and to analyze correlations between chemical constituents of the atmosphere.
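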
An improved method for bivariate meta-analysis when within-study correlations are unknown.
Hong, Chuan; D Riley, Richard; Chen, Yong
2018-03-01
Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, has become increasingly popular in recent years. An attractive feature of multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require knowledge of the within-study correlations, which are usually unavailable; this limits standard inference approaches in practice. Riley et al. proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (i.e., when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show that a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of the individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (e.g., m ≥ 50).
When the sample size is relatively small, we recommend the use of the robust method under the working independence assumption. We illustrate the proposed method through two meta-analyses. Copyright © 2017 John Wiley & Sons, Ltd.
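The univariate random-effects synthesis that the Riley working model builds on can be sketched as follows; the DerSimonian-Laird moment estimator is used here for the between-study variance, and the input estimates and variances are invented illustrative numbers:

```python
def dersimonian_laird(estimates, variances):
    """Univariate random-effects pooling with the DerSimonian-Laird
    moment estimator for the between-study variance tau^2."""
    k = len(estimates)
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)
    wstar = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(wstar, estimates)) / sum(wstar)
    return pooled, 1.0 / sum(wstar), tau2

# Four hypothetical study estimates with their within-study variances
pooled, var_pooled, tau2 = dersimonian_laird(
    [0.30, 0.10, 0.45, 0.20], [0.01, 0.02, 0.015, 0.01]
)
```

The robust (sandwich) variance estimator proposed in the paper replaces `var_pooled` with a model-misspecification-tolerant estimate; the pooling step itself is unchanged.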
Erdal, Barbaros Selnur; Yildiz, Vedat; King, Mark A.; Patterson, Andrew T.; Knopp, Michael V.; Clymer, Bradley D.
2012-01-01
Background: Chest CT scans are commonly used to clinically assess disease severity in patients presenting with pulmonary sarcoidosis. Despite their ability to reliably detect subtle changes in lung disease, the utility of chest CT scans for guiding therapy is limited by the fact that image interpretation by radiologists is qualitative and highly variable. We sought to create a computerized CT image analysis tool that would provide quantitative and clinically relevant information. Methods: We established that a two-point correlation analysis approach reduced the background signal attendant to normal lung structures, such as blood vessels, airways, and lymphatics while highlighting diseased tissue. This approach was applied to multiple lung fields to generate an overall lung texture score (LTS) representing the quantity of diseased lung parenchyma. Using deidentified lung CT scan and pulmonary function test (PFT) data from The Ohio State University Medical Center’s Information Warehouse, we analyzed 71 consecutive CT scans from patients with sarcoidosis for whom simultaneous matching PFTs were available to determine whether the LTS correlated with standard PFT results. Results: We found a high correlation between LTS and FVC, total lung capacity, and diffusing capacity of the lung for carbon monoxide (P < .0001 for all comparisons). Moreover, LTS was equivalent to PFTs for the detection of active lung disease. The image analysis protocol was conducted quickly (< 1 min per study) on a standard laptop computer connected to a publicly available National Institutes of Health ImageJ toolkit. Conclusions: The two-point image analysis tool is highly practical and appears to reliably assess lung disease severity. We predict that this tool will be useful for clinical and research applications. PMID:22628487
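A naive version of the two-point statistic can be written directly: for a chosen pixel separation, count how often both pixels are flagged as abnormal. This is a sketch of the general idea only (binary toy image, single separation), not the authors' LTS pipeline:

```python
def two_point_correlation(img, shift):
    """Probability that two pixels separated by shift = (dy, dx) are both
    flagged (value 1) in a binary image -- a naive two-point statistic."""
    dy, dx = shift
    rows, cols = len(img), len(img[0])
    hits = total = 0
    for y in range(rows - dy):
        for x in range(cols - dx):
            total += 1
            if img[y][x] == 1 and img[y + dy][x + dx] == 1:
                hits += 1
    return hits / total

# Toy 8x8 'lung field' with a solid 4x4 patch of abnormal texture
img = [[1 if 2 <= y < 6 and 2 <= x < 6 else 0 for x in range(8)]
       for y in range(8)]
c_near = two_point_correlation(img, (0, 1))  # adjacent pixels
c_far = two_point_correlation(img, (0, 4))   # pixels 4 columns apart
```

The statistic falls off with separation faster for compact features (vessels, airways) than for diffuse parenchymal disease, which is the property the LTS exploits.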
Sicras-Mainar, Antoni; Velasco-Velasco, Soledad; Navarro-Artieda, Ruth; Aguado Jodar, Alba; Plana-Ripoll, Oleguer; Hermosilla-Pérez, Eduardo; Bolibar-Ribas, Bonaventura; Prados-Torres, Alejandra; Violan-Fors, Concepción
2013-04-01
The study aims to obtain the mean relative weights (MRWs) of the cost of care through the retrospective application of adjusted clinical groups (ACGs) in several primary health care (PHC) centres in Catalonia (Spain) in routine clinical practice. This is a retrospective study based on computerized medical records. All patients attended by 13 PHC teams in 2008 were included. The principal measurements were: demographic variables (age and sex), dependent variables (number of diagnoses and total costs), and case-mix or co-morbidity variables (International Classification of Primary Care). The cost model for each patient was established by differentiating fixed costs from variable costs. In the bivariate analysis, Student's t, analysis of variance, chi-squared, Pearson's linear correlation and Mann-Whitney-Wilcoxon tests were used. In order to compare the MRWs of the present study with those of the United States (US), the concordance [intraclass correlation coefficient (ICC) and concordance correlation coefficient (CCC)] and the correlation (coefficient of determination, R²) were measured. The total number of patients studied was 227,235; attendance was 5.9 visits/inhabitant/year, with a mean of 4.5 (SD 3.2) diagnoses per patient. Total costs amounted to €148.7 million, of which 29.1% were fixed costs. The mean total cost per patient/year was €654.2 (851.7), which was considered to be the reference MRW. For the relationship between the study MRW and the US MRW, the ICC was 0.40 [confidence interval (CI) 95%: 0.21-0.60] and the CCC was 0.42 (CI 95%: 0.35-0.49). The correlation between the US MRW and the MRW of the present study was apparent, with an adjusted R² value of 0.691. The explanatory power of the ACG classification was 36.9% for total costs; without outliers, the R² for total cost was 56.9%. The methodology has been shown to be appropriate for calculating the MRW for each category of the classification.
The results provide a possible practical application in PHC clinical management. © 2012 Blackwell Publishing Ltd.
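The CCC reported above (Lin's concordance correlation coefficient) differs from Pearson's r in that it penalizes systematic offsets between the two sets of weights. A minimal sketch with invented numbers:

```python
def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two measurement sets."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((v - mx) ** 2 for v in x) / n
    vy = sum((v - my) ** 2 for v in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Perfect agreement gives 1; a constant offset lowers the CCC even though
# the Pearson correlation of the same pairs stays at 1.
same = concordance_ccc([1, 2, 3, 4], [1, 2, 3, 4])
offset = concordance_ccc([1, 2, 3, 4], [2, 3, 4, 5])
```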
Wang, Luman; Mo, Qiaochu; Wang, Jianxin
2015-01-01
Most current gene coexpression databases support analysis of the linear correlation of gene pairs but not their nonlinear correlation, which hinders precise evaluation of gene-gene coexpression strengths. Here, we report a new database, MIrExpress, which takes advantage of information theory, as well as the Pearson linear correlation method, to measure the linear correlation, nonlinear correlation, and their hybrid for cell-specific gene coexpressions in immune cells. For a given gene pair or probe set pair input by web users, both the mutual information (MI) and the Pearson correlation coefficient (r) are calculated, and several corresponding values are reported to reflect the nature of the coexpression correlation, including the MI and r values, their respective rank orderings, their rank comparison, and their hybrid correlation value. Furthermore, for a given gene, the top 10 most relevant genes are displayed from the MI, r, or hybrid perspective, respectively. Currently, the database includes 16 human cell groups in total, involving 20,283 human genes. The expression data and the calculated correlation results are interactively accessible on the web page and can be used in other related applications and research. PMID:26881263
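The MI-versus-r contrast that motivates MIrExpress can be reproduced with a small histogram-based MI estimator: a quadratic gene-gene dependence gives near-zero Pearson r but clearly positive MI. The expression values below are simulated, and the equal-width binning scheme is an assumption, not the database's method:

```python
import math, random

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sa = sum((v - ma) ** 2 for v in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / (sa * sb)

def mutual_information(x, y, bins=8):
    """Histogram-based MI estimate in bits (equal-width bins)."""
    def idx(v, lo, hi):
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)
    n = len(x)
    lx, hx, ly, hy = min(x), max(x), min(y), max(y)
    joint, px, py = {}, [0] * bins, [0] * bins
    for a, b in zip(x, y):
        i, j = idx(a, lx, hx), idx(b, ly, hy)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] += 1
        py[j] += 1
    return sum(c / n * math.log2((c / n) / ((px[i] / n) * (py[j] / n)))
               for (i, j), c in joint.items())

# A nonlinear (quadratic) dependence: Pearson r misses it, MI does not
random.seed(3)
x = [random.uniform(-1, 1) for _ in range(5000)]
y = [v * v + 0.05 * random.gauss(0, 1) for v in x]
r, mi = pearson(x, y), mutual_information(x, y)
```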
First results from a combined analysis of CERN computing infrastructure metrics
NASA Astrophysics Data System (ADS)
Duellmann, Dirk; Nieke, Christian
2017-10-01
The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium- to long-term data (1 month to 1 year) correlating box-level metrics, job-level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS), and networking and application-level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and prediction of job duration, the latency sensitivity of different job types, and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions, also in the more constrained environment of public cloud deployments.
Correlational approach to study interactions between dust Brownian particles in a plasma
NASA Astrophysics Data System (ADS)
Lisin, E. A.; Vaulina, O. S.; Petrov, O. F.
2018-01-01
A general approach to the correlational analysis of Brownian motion of strongly coupled particles in open dissipative systems is described. This approach can be applied to the theoretical description of various non-ideal statistically equilibrium systems (including non-Hamiltonian systems), as well as to the analysis of experimental data. In this paper, we consider an application of the correlational approach to the problem of experimentally exploring the wake-mediated nonreciprocal interactions in complex plasmas. We derive simple analytic equations, which allow one to calculate the gradients of the forces acting on a microparticle due to each of the other particles, as well as the gradients of the external field, knowing only the time-averaged correlations of particle displacements and velocities. We show the importance of taking dissipative and random processes into account; without them, treating a system with nonreciprocal interparticle interactions as linearly coupled oscillators leads to significant errors in determining the characteristic frequencies of the system. In numerical simulation examples, we demonstrate that the proposed approach could be an effective instrument in exploring the longitudinal wake structure of a microparticle in a plasma. Unlike previous attempts to study the wake-mediated interactions in complex plasmas, our method does not require any external perturbations and is based on Brownian motion analysis only.
Control of separation and quantitative analysis by GC-FTIR
NASA Astrophysics Data System (ADS)
Semmoud, A.; Huvenne, Jean P.; Legrand, P.
1992-03-01
Software for 3-D representations of 'absorbance-wavenumber-retention time' data is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. Under these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and caffeine concentration are discussed at two steps of the data treatment.
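The calibration step described above is ordinary least squares of integrated absorbance against concentration, with the correlation coefficient as the quality metric. A sketch with invented caffeine calibration points (not the paper's data):

```python
def linear_calibration(conc, signal):
    """Least-squares line signal = a*conc + b, plus Pearson r for the fit."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    syy = sum((y - my) ** 2 for y in signal)
    a = sxy / sxx
    b = my - a * mx
    r = sxy / (sxx * syy) ** 0.5
    return a, b, r

# Hypothetical calibration: integrated absorbance vs concentration (mg/mL)
conc = [0.1, 0.2, 0.4, 0.8]
absorb = [0.052, 0.101, 0.198, 0.405]
a, b, r = linear_calibration(conc, absorb)
```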
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Mann, Michael J.
1992-01-01
A survey of research on drag-due-to-lift minimization at supersonic speeds was conducted, including a study of the effectiveness of current design and analysis methods. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation between experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.
Fluorescence fluctuation spectroscopy for clinical applications
NASA Astrophysics Data System (ADS)
Olson, Eben
Fluorescence correlation spectroscopy (FCS) and the related techniques of brightness analysis have become standard tools in biological and biophysical research. By analyzing the statistics of fluorescence emitted from a restricted volume, a number of parameters including concentrations, diffusion coefficients and chemical reaction rates can be determined. The single-molecule sensitivity, spectral selectivity, small sample volume and non-perturbative measurement mechanism of FCS make it an excellent technique for the study of molecular interactions. However, its adoption outside of the research laboratory has been limited. Potential reasons for this include the cost and complexity of the required apparatus. In this work, the application of fluorescence fluctuation analysis to several clinical problems is considered. Optical designs for FCS instruments which reduce the cost and increase alignment tolerance are presented. Brightness analysis of heterogeneous systems, with application to the characterization of protein aggregates and multimer distributions, is considered. Methods for FCS-based assays of two clinically relevant proteins, von Willebrand factor and haptoglobin, are presented as well.
2013-01-01
Background Metabolomics has become increasingly popular in the study of disease phenotypes and molecular pathophysiology. One branch of metabolomics that encompasses the high-throughput screening of cellular metabolism is metabolic profiling. In the present study, the metabolic profiles of different tumour cells from colorectal carcinoma and breast adenocarcinoma were exposed to hypoxic and normoxic conditions and these have been compared to reveal the potential metabolic effects of hypoxia on the biochemistry of the tumour cells; this may contribute to their survival in oxygen compromised environments. In an attempt to analyse the complex interactions between metabolites beyond routine univariate and multivariate data analysis methods, correlation analysis has been integrated with a human metabolic reconstruction to reveal connections between pathways that are associated with normoxic or hypoxic oxygen environments. Results Correlation analysis has revealed statistically significant connections between metabolites, where differences in correlations between cells exposed to different oxygen levels have been highlighted as markers of hypoxic metabolism in cancer. Network mapping onto reconstructed human metabolic models is a novel addition to correlation analysis. Correlated metabolites have been mapped onto the Edinburgh human metabolic network (EHMN) with the aim of interlinking metabolites found to be regulated in a similar fashion in response to oxygen. This revealed novel pathways within the metabolic network that may be key to tumour cell survival at low oxygen. Results show that the metabolic responses to lowering oxygen availability can be conserved or specific to a particular cell line. Network-based correlation analysis identified conserved metabolites including malate, pyruvate, 2-oxoglutarate, glutamate and fructose-6-phosphate. 
In this way, the method has revealed metabolites not previously linked, or less well recognised, with respect to hypoxia. Lactate fermentation is one of the key themes discussed in the field of hypoxia; however, malate, pyruvate, 2-oxoglutarate, glutamate and fructose-6-phosphate, which are connected by a single pathway, may provide a more significant marker of hypoxia in cancer. Conclusions Metabolic networks generated for each cell line were compared to identify conserved metabolite pathway responses to low oxygen environments. Furthermore, we believe this methodology will have general application within metabolomics. PMID:24153255
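The correlation-network construction can be sketched as thresholding pairwise Pearson correlations into edges before mapping onto a metabolic reconstruction. The metabolite profiles below are invented toy values, not data from the study:

```python
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sa = sum((v - ma) ** 2 for v in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / (sa * sb)

def correlation_network(profiles, threshold=0.9):
    """Connect metabolites whose profiles across samples correlate above a
    threshold -- the first step before mapping onto a metabolic network."""
    names = list(profiles)
    edges = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = pearson(profiles[a], profiles[b])
            if abs(r) >= threshold:
                edges.append((a, b, round(r, 3)))
    return edges

# Toy profiles over 6 samples (hypothetical values)
profiles = {
    "malate":   [1.0, 1.2, 0.8, 2.1, 2.3, 2.0],
    "pyruvate": [1.1, 1.3, 0.9, 2.2, 2.4, 2.1],  # tracks malate exactly
    "glucose":  [2.0, 0.5, 1.7, 0.9, 2.2, 0.4],  # unrelated pattern
}
edges = correlation_network(profiles, threshold=0.9)
```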
Reachability Analysis Applied to Space Situational Awareness
NASA Astrophysics Data System (ADS)
Holzinger, M.; Scheeres, D.
Several existing and emerging applications of Space Situational Awareness (SSA) relate directly to spacecraft Rendezvous, Proximity Operations, and Docking (RPOD) and Formation / Cluster Flight (FCF). When multiple Resident Space Objects (RSOs) are in the vicinity of one another with appreciable periods between observations, correlating new RSO tracks to previously known objects becomes a non-trivial problem. A particularly difficult sub-problem arises when long breaks in observations are coupled with continuous, low-thrust maneuvers. Reachability theory, directly related to optimal control theory, can compute contiguous reachability sets for known or estimated control authority and can support such RSO search and correlation efforts in both ground and on-board settings. Reachability analysis can also directly estimate the minimum control authority of a given RSO. For RPOD and FCF applications, emerging mission concepts such as fractionation drastically increase the system complexity of on-board autonomous fault management systems. Reachability theory, as applied to SSA in RPOD and FCF applications, can involve correlation of nearby RSO observations, control authority estimation, and sensor track re-acquisition. Additional uses of reachability analysis are formation reconfiguration, worst-case passive safety, and propulsion failure modes such as a "stuck" thruster. Existing reachability theory is applied to RPOD and FCF regimes. An optimal control policy is developed to maximize the reachability set, and optimal control law discontinuities (switching) are examined. The Clohessy-Wiltshire linearized equations of motion are normalized to accentuate relative control authority for spacecraft propulsion systems at both Low Earth Orbit (LEO) and Geostationary Earth Orbit (GEO). Several examples with traditional and low-thrust propulsion systems in LEO and GEO are explored to illustrate the effects of relative control authority on the time-varying reachability set surface.
Both monopropellant spacecraft at LEO and Hall-thruster spacecraft at GEO are shown to be strongly actuated, while Hall-thruster spacecraft at LEO are found to be weakly actuated. Weaknesses of the current implementation are discussed, along with future numerical improvements and analytical efforts.
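The relative-motion model underlying these reachability sets is the Clohessy-Wiltshire linearization. A sketch of the closed-form in-plane propagation, with an illustrative mean motion and initial state (not values from the paper):

```python
import math

def cw_propagate(state, n, t):
    """Closed-form in-plane Clohessy-Wiltshire propagation.
    state = (x, y, vx, vy): x radial, y along-track; n = chief mean motion."""
    x0, y0, vx0, vy0 = state
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = (6 * (s - n * t) * x0 + y0
         - (2 / n) * (1 - c) * vx0 + ((4 * s - 3 * n * t) / n) * vy0)
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = -6 * n * (1 - c) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    return (x, y, vx, vy)

n_leo = 2 * math.pi / 5554.0  # ~92.6-min LEO period, rad/s
# A 100 m radial offset after one full orbit: periodic terms return,
# but the along-track secular drift (-6*pi per orbit per unit x0) remains
one_orbit = cw_propagate((100.0, 0.0, 0.0, 0.0), n_leo, 5554.0)
```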
Wide-Band Monolithic Acoustoelectric Memory Correlators.
1982-11-01
[Abstract garbled in the source scan; the surviving fragments discuss tapped piezoelectric structures, an earlier analysis of thin-oxide varactors, tapped LiNbO3/metal-oxide-silicon structures, and GaAs SAW diode storage correlators (1980 Ultrasonics Symp. Proc.).]
Aznar, Margarita; Arroyo, Teresa
2007-09-21
The purge-and-trap extraction method, coupled to a gas chromatograph with mass spectrometric detection, has been applied to the determination of 26 aromatic volatiles in wine. The method was optimized, validated and applied to the analyses of 40 red and white wines from 7 different Spanish regions. Principal component analysis of the data showed the correlation between wines of similar origin.
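The PCA step can be sketched with a small SVD-based projection; the wine/compound table below is invented to show how region structure appears in the first component (it is not the study's data):

```python
import numpy as np

def pca_scores(X, k=2):
    """Project samples (rows) onto the first k principal components."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Hypothetical volatile-compound table: rows = wines, columns = compounds;
# two 'regions' with different aroma profiles (values invented)
region_a = np.array([[5.1, 0.9, 2.0], [5.3, 1.0, 2.1], [4.9, 0.8, 1.9]])
region_b = np.array([[1.0, 3.1, 0.4], [1.2, 3.0, 0.5], [0.9, 3.2, 0.3]])
scores = pca_scores(np.vstack([region_a, region_b]), k=1)
# wines from the same region cluster together on the first component
```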
Audit of admission to medical school: II--Shortlisting and interviews.
McManus, I C; Richards, P
1984-01-01
Analysis of shortlisting of applicants for interview at St Mary's Hospital Medical School showed that factor analysis could reduce the selection criteria to three independent scales--"academic ability," "interests," and "community service"--all of which contributed to the interview decision. Early applicants scored more highly on all three factors but were still at a greater advantage in selection for interview than would have been predicted. The dean's judgment of priority for interview from the UCCA form was found to predict a candidate's chance of acceptance at other medical schools besides St Mary's. Analysis of interviewing showed high correlations among interviewers in their assessments, although there was evidence of influence by the chairmen. Factor analysis showed three major factors--academic suitability, non-academic suitability, and health--of which non-academic suitability was the major determinant of interview success. Non-academic suitability was related to personality (high extraversion and low psychoticism) and to the choices made on the UCCA form. The system of admission interviews enabled greater emphasis to be put on broader interests and achievements than if selection had been on the basis of the UCCA application form alone. PMID:6437522
NASA Astrophysics Data System (ADS)
Bratchenko, Ivan A.; Artemyev, Dmitry N.; Myakinin, Oleg O.; Khristoforova, Yulia A.; Moryatov, Alexander A.; Kozlov, Sergey V.; Zakharov, Valery P.
2017-02-01
The differentiation of skin melanomas and basal cell carcinomas (BCCs) was demonstrated based on combined analysis of Raman and autofluorescence spectra stimulated by visible and NIR lasers. It was tested ex vivo on 39 melanomas and 40 BCCs. Six spectroscopic criteria utilizing information about the alteration of melanin, porphyrin, flavin, lipid, and collagen content in tumors, in comparison with healthy skin, were proposed. The measured correlation between the proposed criteria makes it possible to define weakly correlated criteria groups for the application of discriminant analysis and principal component analysis. It was shown that the accuracy of cancerous tissue classification reaches 97.3% for a combined 6-criteria multimodal algorithm, while the accuracy determined separately for each modality does not exceed 79%. The combined 6-D method is a rapid and reliable tool for malignant skin lesion detection and classification.
Bilenko, Natalia Y; Gallant, Jack L
2016-01-01
In this article we introduce Pyrcca, an open-source Python package for performing canonical correlation analysis (CCA). CCA is a multivariate analysis method for identifying relationships between sets of variables. Pyrcca supports CCA with or without regularization, and with or without linear, polynomial, or Gaussian kernelization. We first use an abstract example to describe Pyrcca functionality. We then demonstrate how Pyrcca can be used to analyze neuroimaging data. Specifically, we use Pyrcca to implement cross-subject comparison in a natural movie functional magnetic resonance imaging (fMRI) experiment by finding a data-driven set of functional response patterns that are similar across individuals. We validate this cross-subject comparison method in Pyrcca by predicting responses to novel natural movies across subjects. Finally, we show how Pyrcca can reveal retinotopic organization in brain responses to natural movies without the need for an explicit model. PMID:27920675
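The core unregularized CCA computation that Pyrcca wraps (Pyrcca adds regularization and kernelization on top) can be sketched via block whitening and an SVD; the two synthetic datasets below share one invented latent signal:

```python
import numpy as np

def canonical_correlations(X, Y, k=1):
    """Plain (unregularized) CCA via whitening + SVD: the singular values of
    Cxx^{-1/2} Cxy Cyy^{-1/2} are the canonical correlations."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)

    def inv_sqrt(C):
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    n = X.shape[0]
    K = inv_sqrt(X.T @ X / n) @ (X.T @ Y / n) @ inv_sqrt(Y.T @ Y / n)
    return np.linalg.svd(K, compute_uv=False)[:k]

# Two synthetic 'datasets' sharing one latent signal z (values invented)
rng = np.random.default_rng(0)
z = rng.standard_normal(1000)
X = np.column_stack([z + 0.1 * rng.standard_normal(1000) for _ in range(3)])
Y = np.column_stack([2.0 * z + 0.1 * rng.standard_normal(1000),
                     rng.standard_normal(1000)])
rho = canonical_correlations(X, Y, k=1)[0]  # near 1: the shared signal
```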
Sakakibara, Eisuke; Homae, Fumitaka; Kawasaki, Shingo; Nishimura, Yukika; Takizawa, Ryu; Koike, Shinsuke; Kinoshita, Akihide; Sakurada, Hanako; Yamagishi, Mika; Nishimura, Fumichika; Yoshikawa, Akane; Inai, Aya; Nishioka, Masaki; Eriguchi, Yosuke; Matsuoka, Jun; Satomura, Yoshihiro; Okada, Naohiro; Kakiuchi, Chihiro; Araki, Tsuyoshi; Kan, Chiemi; Umeda, Maki; Shimazu, Akihito; Uga, Minako; Dan, Ippeita; Hashimoto, Hideki; Kawakami, Norito; Kasai, Kiyoto
2016-11-15
Multichannel near-infrared spectroscopy (NIRS) is a functional neuroimaging modality that enables easy-to-use and noninvasive measurement of changes in blood oxygenation levels. We developed a clinically applicable method for estimating resting state functional connectivity (RSFC) with NIRS, using a partial correlation analysis to reduce the influence of extraneural components. Using NIRS with a multi-distance probe arrangement, we measured resting state brain activity for 8 min in 17 healthy participants. Independent component analysis was used to extract shallow and deep signals from the original NIRS data. Pearson's correlation calculated from the original signals was significantly higher than that calculated from the deep signals, while the partial correlation calculated from the original signals was comparable to that calculated from the deep (cerebral-tissue) signals alone. To further test the validity of our method, we also measured 8 min of resting state brain activity using a whole-head NIRS arrangement covering 17 cortical regions in 80 healthy participants. Significant RSFC between neighboring, interhemispheric homologous, and some distant ipsilateral brain region pairs was revealed. Additionally, females exhibited higher RSFC between interhemispheric occipital region pairs, in addition to higher connectivity between some ipsilateral pairs in the left hemisphere, when compared to males. The combined results of the two component experiments indicate that partial correlation analysis is effective in reducing the influence of extracerebral signals, and that NIRS is able to detect well-described resting state networks and sex-related differences in RSFC. Copyright © 2016 Elsevier Inc. All rights reserved.
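The partial-correlation step can be sketched as correlating the residuals of two channels after regressing out a shared shallow-signal regressor; the 'scalp' and channel series below are simulated stand-ins, not NIRS recordings:

```python
import random

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sa = sum((v - ma) ** 2 for v in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / (sa * sb)

def partial_correlation(x, y, z):
    """Correlation between x and y after regressing z out of both --
    the operation used to suppress the shared extracerebral signal."""
    def residual(v, z):
        n = len(v)
        mz, mv = sum(z) / n, sum(v) / n
        beta = (sum((a - mz) * (b - mv) for a, b in zip(z, v))
                / sum((a - mz) ** 2 for a in z))
        return [b - mv - beta * (a - mz) for a, b in zip(z, v)]
    return pearson(residual(x, z), residual(y, z))

# Two 'channels' driven mostly by a shared systemic (scalp) signal
random.seed(4)
scalp = [random.gauss(0, 1) for _ in range(2000)]
ch1 = [s + 0.3 * random.gauss(0, 1) for s in scalp]
ch2 = [s + 0.3 * random.gauss(0, 1) for s in scalp]
raw = pearson(ch1, ch2)                       # inflated by the shared signal
partial = partial_correlation(ch1, ch2, scalp)  # near zero once z is removed
```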
Uncertainty Quantification Techniques of SCALE/TSUNAMI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Mueller, Don
2011-01-01
The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel.
In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for gaps in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
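The propagation described above, from cross-section covariances to a response uncertainty via sensitivity coefficients, reduces at first order to the "sandwich rule" s^T C s. A minimal sketch follows; the two-group numbers are illustrative placeholders, not evaluated nuclear data:

```python
# First-order ("sandwich rule") uncertainty propagation, the core idea behind
# TSUNAMI-style analysis: the relative variance of a response R is s^T C s,
# where s holds sensitivity coefficients (dR/R per dsigma/sigma) and C is the
# relative covariance matrix of the cross-section data.

def response_variance(s, C):
    """Return s^T C s for sensitivity vector s and covariance matrix C."""
    n = len(s)
    return sum(s[i] * C[i][j] * s[j] for i in range(n) for j in range(n))

# Two energy-group example: 5% and 3% relative standard deviations,
# correlation 0.5 between the groups (all values illustrative).
sd = [0.05, 0.03]
rho = 0.5
C = [[sd[0] ** 2, rho * sd[0] * sd[1]],
     [rho * sd[0] * sd[1], sd[1] ** 2]]
s = [0.8, 0.4]            # sensitivity of k-eff to each group's cross section

var = response_variance(s, C)
print(var ** 0.5)          # relative uncertainty in k-eff, ~0.047
```

In a real TSUNAMI analysis the vectors run over thousands of energy-group and nuclide-reaction pairs, but the algebra is identical.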
NASA Astrophysics Data System (ADS)
Nakahara, H.
2013-12-01
For monitoring temporal changes in subsurface structures, I propose to use auto correlation functions of coda waves from local earthquakes recorded at surface receivers, which probably contain more body waves than surface waves. Because the use of coda waves requires earthquakes, the time resolution for monitoring decreases. But in regions with high seismicity, it may be possible to monitor subsurface structures with sufficient time resolution. Studying the 2011 Tohoku-Oki (Mw 9.0), Japan, earthquake, for which velocity changes have already been reported by previous studies, I try to validate the method. KiK-net stations in northern Honshu are used in the analysis. For each moderate earthquake, normalized auto correlation functions of surface records are stacked with respect to time windows in the S-wave coda. Aligning the stacked normalized auto correlation functions with time, I search for changes in the arrival times of phases. Phases at lag times of less than 1 s are studied because changes at shallow depths are the focus. Based on the stretching method, temporal variations in the arrival times are measured at the stations. Clear phase delays are found to be associated with the mainshock and to gradually recover with time. The phase delays are on the order of 10% on average, with a maximum of about 50% at some stations. For validation, a deconvolution analysis using surface and subsurface records at the same stations is conducted. The results show that the phase delays from the deconvolution analysis are slightly smaller than those from the auto correlation analysis, which implies that the phases on the auto correlations are caused by larger velocity changes at shallower depths. The auto correlation analysis seems to have an accuracy of about several percent, which is much coarser than that of methods using earthquake doublets and borehole array data. So this analysis might only be applicable to detecting larger changes.
In spite of these disadvantages, this analysis is still attractive because it can be applied to many records on the surface in regions where no boreholes are available. Acknowledgements: Seismograms recorded by KiK-net managed by National Research Institute for Earth Science and Disaster Prevention (NIED) were used in this study. This study was partially supported by JST J-RAPID program and JSPS KAKENHI Grant Numbers 24540449 and 23540449.
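The stretching method used above can be sketched as a grid search: a uniform velocity change stretches the lag axis of the autocorrelation function, and the stretch factor maximizing the correlation with a reference is taken as the measurement. The synthetic waveform and parameters below are illustrative, not KiK-net data:

```python
import numpy as np

# Stretching-method sketch: a velocity decrease dv/v stretches lag times by a
# factor (1 + eps) with eps = -dv/v. We grid-search the eps that maximizes
# correlation between the stretched reference ACF and the current ACF.

def stretch_measure(ref, cur, t, eps_grid):
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        stretched = np.interp(t, t * (1 + eps), ref)   # stretch the reference
        cc = np.corrcoef(stretched, cur)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps

t = np.linspace(0.0, 1.0, 2001)                  # lag axis, s
ref = np.sin(2 * np.pi * 10 * t) * np.exp(-3 * t)  # reference ACF (synthetic)
true_eps = 0.05                                   # imposed 5% phase delay
cur = np.interp(t, t * (1 + true_eps), ref)       # "current" delayed ACF

eps_grid = np.linspace(-0.1, 0.1, 201)
print(stretch_measure(ref, cur, t, eps_grid))     # recovers ~0.05
```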
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis ofmore » individual trails of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.« less
NASA Astrophysics Data System (ADS)
Pujiwati, Arie; Nakamura, K.; Watanabe, N.; Komai, T.
2018-02-01
Multivariate analysis is applied to investigate the geochemistry of several trace elements in top soils and their relation with the contamination source under the influence of coal mines in Jorong, South Kalimantan. Total concentrations of Cd, V, Co, Ni, Cr, Zn, As, Pb, Sb, Cu and Ba were determined in 20 soil samples by bulk analysis. Pearson correlation was applied to specify the linear correlations among the elements. Principal Component Analysis (PCA) and Cluster Analysis (CA) were applied to observe the classification of trace elements and contamination sources. The results suggest that contamination loading is contributed by Cr, Cu, Ni, Zn, As, and Pb. The elemental loading mostly affects the non-coal-mining area, for instance areas near settlements and agricultural land. Moreover, the contamination sources are classified into areas influenced by coal mining activity, agricultural types, and the river mixing zone. Multivariate analysis could elucidate the elemental loading and the contamination sources of trace elements in the vicinity of a coal mine area.
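The PCA step can be sketched as an eigen-decomposition of the correlation matrix of standardized concentrations; when a single contamination source drives several elements, the first principal component captures most of the variance. The data below are synthetic stand-ins for the 20 soil samples:

```python
import numpy as np

# PCA sketch: four "element" columns all driven by one latent contamination
# source, so the first eigenvalue of the correlation matrix dominates.
rng = np.random.default_rng(0)
n = 20                                        # number of soil samples
source = rng.normal(size=n)                   # latent contamination loading
data = np.column_stack([source + 0.1 * rng.normal(size=n) for _ in range(4)])

Z = (data - data.mean(axis=0)) / data.std(axis=0)   # standardize
corr = Z.T @ Z / n                                  # correlation matrix
evals, evecs = np.linalg.eigh(corr)                 # ascending eigenvalues
explained = evals[::-1] / evals.sum()               # variance explained, descending
print(explained)                                    # first PC dominates
```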
Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS
Brown, C. S.; Zhang, Hongbin
2016-05-24
Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
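A minimal sketch of the three sensitivity measures named above (Pearson, Spearman, and partial correlation), applied to a toy figure of merit driven mainly by coolant inlet temperature. All distributions and coefficients are illustrative assumptions, not VERA-CS inputs:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
t_in = rng.normal(565.0, 5.0, n)       # coolant inlet temperature (illustrative)
power = rng.normal(1.0, 0.02, n)       # relative core power (illustrative)
y = -0.8 * t_in - 20.0 * power + rng.normal(0.0, 1.0, n)   # toy MDNBR-like response

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

def spearman(a, b):
    # Spearman = Pearson correlation of the ranks
    rank = lambda v: np.argsort(np.argsort(v))
    return pearson(rank(a), rank(b))

def partial(a, b, c):
    # correlation of a and b after removing the linear effect of c from both
    ra = a - np.polyval(np.polyfit(c, a, 1), c)
    rb = b - np.polyval(np.polyfit(c, b, 1), c)
    return pearson(ra, rb)

print(pearson(t_in, y), spearman(t_in, y), partial(t_in, y, power))
```

All three measures flag inlet temperature as strongly (negatively) influential, mirroring the finding reported above.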
NASA Astrophysics Data System (ADS)
Waugh, Rachael C.; Dulieu-Barton, Janice M.; Quinn, S.
2015-03-01
Thermoelastic stress analysis (TSA) is an established active thermographic approach which uses the thermoelastic effect to correlate the temperature change that occurs as a material is subjected to elastic cyclic loading with the sum of the principal stresses on the surface of the component. Digital image correlation (DIC) tracks features on the surface of a material to establish a displacement field of a component subjected to load, which can then be used to calculate the strain field. The application of both DIC and TSA to a composite plate representative of aircraft secondary structure, subjected to resonant frequency loading using a portable loading device, i.e. `remote loading', is described. Laboratory-based loading for TSA and DIC is typically imparted using a test machine; however, in the current work a vibration loading system is used which is able to excite the component of interest at its resonant frequency, enabling TSA and DIC to be carried out. The accuracy of the measurements made under remote loading with both of the optical techniques is discussed. The data are compared to extract complementary information from the two techniques. This work forms a step towards a combined strain-based non-destructive evaluation procedure able to identify and quantify the effect of defects more fully, particularly when examining component performance in service applications.
Mägi, Reedik; Suleimanov, Yury V; Clarke, Geraldine M; Kaakinen, Marika; Fischer, Krista; Prokopenko, Inga; Morris, Andrew P
2017-01-11
Genome-wide association studies (GWAS) of single nucleotide polymorphisms (SNPs) have been successful in identifying loci contributing genetic effects to a wide range of complex human diseases and quantitative traits. The traditional approach to GWAS analysis is to consider each phenotype separately, despite the fact that many diseases and quantitative traits are correlated with each other, and often measured in the same sample of individuals. Multivariate analyses of correlated phenotypes have been demonstrated, by simulation, to increase power to detect association with SNPs, and thus may enable improved detection of novel loci contributing to diseases and quantitative traits. We have developed the SCOPA software to enable GWAS analysis of multiple correlated phenotypes. The software implements "reverse regression" methodology, which treats the genotype of an individual at a SNP as the outcome and the phenotypes as predictors in a general linear model. SCOPA can be applied to quantitative traits and categorical phenotypes, and can accommodate imputed genotypes under a dosage model. The accompanying META-SCOPA software enables meta-analysis of association summary statistics from SCOPA across GWAS. Application of SCOPA to two GWAS of high- and low-density lipoprotein cholesterol, triglycerides and body mass index, and subsequent meta-analysis with META-SCOPA, highlighted stronger association signals than univariate phenotype analysis at established lipid and obesity loci. The META-SCOPA meta-analysis also revealed a novel signal of association at genome-wide significance for triglycerides mapping to GPC5 (lead SNP rs71427535, p = 1.1x10^-8), which has not been reported in previous large-scale GWAS of lipid traits. The SCOPA and META-SCOPA software enable discovery and dissection of multiple phenotype association signals through implementation of a powerful reverse regression approach.
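The "reverse regression" idea behind SCOPA, genotype as outcome and correlated phenotypes as joint predictors, can be sketched with ordinary least squares. The synthetic effect sizes and sample size below are assumptions for illustration, not SCOPA's actual model fitting:

```python
import numpy as np

# Reverse-regression sketch: genotype dosage at a SNP is the outcome, and the
# correlated phenotypes enter jointly as predictors of that dosage.
rng = np.random.default_rng(2)
n = 2000
g = rng.binomial(2, 0.3, n).astype(float)     # genotype dosage 0/1/2

# two phenotypes, correlated through a shared factor and both affected by g
shared = rng.normal(size=n)
p1 = 0.5 * g + shared + rng.normal(size=n)
p2 = 0.4 * g + shared + rng.normal(size=n)

X = np.column_stack([np.ones(n), p1, p2])     # design: intercept + phenotypes
beta, *_ = np.linalg.lstsq(X, g, rcond=None)
print(beta)                                    # joint phenotype effects on dosage
```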
Information flow and causality as rigorous notions ab initio
NASA Astrophysics Data System (ADS)
Liang, X. San
2016-11-01
Information flow, or information transfer, the widely applicable general physics notion, can be rigorously derived from first principles, rather than axiomatically proposed as an ansatz. Its logical association with causality is firmly rooted in the dynamical system that lies beneath. The principle of nil causality, which reads: an event is not causal to another if the evolution of the latter is independent of the former, and which transfer entropy analysis and Granger causality tests fail to verify in many situations, turns out to be a proven theorem here. Established in this study are the information flows among the components of time-discrete mappings and time-continuous dynamical systems, both deterministic and stochastic. They have been obtained explicitly in closed form, and put to application with benchmark systems such as the Kaplan-Yorke map, Rössler system, baker transformation, Hénon map, and stochastic potential flow. Besides unraveling the causal relations as expected from the respective systems, some of the applications show that the information flow structure underlying a complex trajectory pattern could be tractable. For linear systems, the resulting remarkably concise formula asserts analytically that causation implies correlation, while correlation does not imply causation, providing a mathematical basis for the long-standing philosophical debate over causation versus correlation.
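For the bivariate linear case, the information flow rate has a closed-form estimator in terms of sample covariances and finite differences. The sketch below follows the published bivariate formula as best understood here (treat the exact expression as an assumption, not something given in this abstract); it recovers the expected asymmetry for a toy system in which x2 drives x1:

```python
import numpy as np

# Hedged sketch of a bivariate information-flow estimator for linear systems.
# info_flow(a, b) estimates the flow from series b into series a, using
# sample covariances and the Euler forward difference of a.
def info_flow(a, b, dt=1.0):
    da = (a[1:] - a[:-1]) / dt
    a, b = a[:-1], b[:-1]
    C = np.cov(a, b)
    c_a_da = np.cov(a, da)[0, 1]
    c_b_da = np.cov(b, da)[0, 1]
    det = C[0, 0] ** 2 * C[1, 1] - C[0, 0] * C[0, 1] ** 2
    return (C[0, 0] * C[0, 1] * c_b_da - C[0, 1] ** 2 * c_a_da) / det

# toy linear system: x2 drives x1, but not the other way round
rng = np.random.default_rng(7)
n = 20000
x1, x2 = np.zeros(n), np.zeros(n)
for k in range(n - 1):
    x1[k + 1] = 0.6 * x1[k] + 0.4 * x2[k] + 0.3 * rng.normal()
    x2[k + 1] = 0.6 * x2[k] + 0.3 * rng.normal()

print(info_flow(x1, x2), info_flow(x2, x1))   # first magnitude clearly larger
```

The asymmetry is the point: x1 and x2 are strongly correlated, yet only the flow from x2 into x1 is appreciably nonzero, illustrating "causation implies correlation, while correlation does not imply causation".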
Characterization of microcracks by application of digital image correlation to SPM images
NASA Astrophysics Data System (ADS)
Keller, Juergen; Gollhardt, Astrid; Vogel, Dietmar; Michel, Bernd
2004-07-01
With the development of micro- and nanotechnological products such as sensors and MEMS/NEMS, and their broad application in a variety of market segments, new reliability issues will arise. The increasing interface-to-volume ratio in highly integrated systems and nanoparticle-filled materials, and unsolved questions about size effects in nanomaterials, are challenges for experimental reliability evaluation. To meet these needs the authors developed the nanoDAC method (nano Deformation Analysis by Correlation), which allows the determination and evaluation of 2D displacement fields based on scanning probe microscopy (SPM) data. In-situ SPM scans of the analyzed object are carried out at different thermo-mechanical load states. The obtained topography, phase or error images are compared utilizing grayscale cross correlation algorithms. This allows the tracking of local image patterns of the analyzed surface structure. The measurement results of the nanoDAC method are full-field displacement and strain fields. Due to the application of SPM equipment, deformations in the micro- and nanometer range can be easily detected. The method can be performed on bulk materials, thin films and on devices, i.e. microelectronic components, sensors or MEMS/NEMS. Furthermore, the characterization and evaluation of micro- and nanocracks or defects in bulk materials, thin layers and at material interfaces can be carried out.
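The grayscale cross-correlation tracking at the heart of such DIC methods can be sketched with zero-normalized cross-correlation (ZNCC) over integer shifts; real implementations such as nanoDAC add subpixel interpolation and strain computation. The images below are synthetic speckle, not SPM scans:

```python
import numpy as np

# DIC sketch: track a patch between a reference and a "deformed" image by
# maximizing zero-normalized cross-correlation (ZNCC) over integer shifts.
rng = np.random.default_rng(3)
img0 = rng.random((64, 64))                 # reference "scan"
shift = (3, 5)                              # known displacement (rows, cols)
img1 = np.roll(img0, shift, axis=(0, 1))    # translated "scan"

def zncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

patch = img0[20:36, 20:36]                  # 16x16 subset in the reference
best = max(((dy, dx) for dy in range(-8, 9) for dx in range(-8, 9)),
           key=lambda s: zncc(patch, img1[20 + s[0]:36 + s[0],
                                          20 + s[1]:36 + s[1]]))
print(best)                                 # recovers (3, 5)
```

Repeating this for a grid of patches yields the full-field displacement map from which strains are derived.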
Lee, Jae Eun; Sung, Jung Hye; Malouhi, Mohamad
2015-12-22
There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study statistically validates a web-based GIS application designed to support cardiovascular-related research, developed by the NIH-funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC), and discusses its applicability to cardiovascular studies. Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a U.S. southeastern metropolitan area. The correlation coefficient, factor analysis and Cronbach's alpha (α) were estimated to quantify measures of the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score, number of hospitals, fast food restaurants, parks and sidewalks). Cronbach's α for the CVD geospatial variables was 95.5%, implying successful internal consistency. Walk scores were significantly correlated with the number of hospitals (r = 0.715; p < 0.0001), fast food restaurants (r = 0.729; p < 0.0001), parks (r = 0.773; p < 0.0001) and sidewalks (r = 0.648; p < 0.0001) within a mile from homes. They were also significantly associated with diversity index (r = 0.138, p = 0.0023), median household income (r = -0.181; p < 0.0001), and owner-occupied rates (r = -0.440; p < 0.0001). However, no significant correlation was found with median age, vulnerability, unemployment rate, labor force, or population growth rate. Our data demonstrate that geospatial data generated by the web-based application were internally consistent and demonstrated satisfactory validity. Therefore, the GIS application may be useful in cardiovascular-related studies aimed at investigating the potential impact of geospatial factors on diseases and/or the long-term effect of clinical trials.
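Cronbach's alpha, used above to quantify internal consistency, is simple to compute from item and total-score variances. The toy scores below are illustrative, not the study's geospatial data:

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total)),
# where k is the number of items and "total" is the per-subject sum of items.
def cronbach_alpha(items):
    """items: list of equal-length score lists, one list per item."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# toy: five nearly consistent geospatial indicators for four locations
items = [[1, 2, 3, 4], [1.1, 2.0, 2.9, 4.2], [0.9, 2.1, 3.1, 3.8],
         [1, 2, 3, 4], [1.2, 1.9, 3.0, 4.1]]
print(cronbach_alpha(items))   # close to 1 for internally consistent items
```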
Siegle, Cristhina Bonilha Huster; Dos Santos Cardoso de Sá, Cristina
2018-02-01
Exposure to HIV during pregnancy is a risk to development. Exposed children should have their development assessed from birth. The Alberta Infant Motor Scale (AIMS) is a tool that assesses gross motor skills, with easy application and low cost. Until now, this scale had not been validated for the population exposed to HIV. It is necessary to compare it with a gold-standard tool, the Bayley scale (BSITD III), which assesses gross and fine motor skills, has a high cost and requires a longer application time. Studies compare results of the AIMS with the Bayley's total motor score (gross + fine). However, it is also necessary to compare the AIMS result with only the Bayley's gross motor result, because that is what both evaluate in common. The aims were: to verify the concurrent validity of the AIMS in infants exposed to HIV; to verify the correlation of the AIMS and the BSITD III for this population; and to compare whether these coefficients differ in the central and extreme age groups of the AIMS. 82 infants exposed to HIV were evaluated in the 1st, 2nd, 3rd, 4th, 8th, 12th, 15th, 16th, 17th and 18th months with the Alberta Infant Motor Scale and the Bayley Scale (motor subscale). For analysis of concurrent validity, raw scores of the scales were compared with correlation analysis. First analysis: AIMS score against Bayley's total (gross + fine) motor score. Second analysis: AIMS score against Bayley's gross motor score. In the first correlation analysis, the results were: r = 0.62 in the 1st month, r = 0.64 in the 2nd month, r = 0.08 in the 3rd month, r = 0.45 in the 4th month, r = 0.62 in the 8th month, r = 0.60 in the 12th month. In the second correlation analysis, the results were: r = 0.69 in the 1st month, r = 0.58 in the 2nd month, r = 0.25 in the 3rd month, r = 0.45 in the 4th month, r = 0.77 in the 8th month, r = 0.73 in the 12th month. Analyses for the 15th, 16th, 17th and 18th months could not be performed because at these ages all the children had already reached the maximum score on the AIMS. The results were significant and indicate correlation between the scales.
These results agree with other studies that found high correlations between the scales in premature and at-risk groups. However, those studies compare results of gross motor skills assessments with combined gross and fine motor skills assessments. Our results show that the correlation restricted to gross motor skills has higher coefficient values, and we believe this is the best way to compare the scales: with what both assess in common. The Alberta scale correlates with the Bayley scale in the assessment of children exposed to HIV, and can be a substitute for the Bayley in assessing these children. The results are stronger when comparing only what both scales assess in common. Copyright © 2018 Elsevier Inc. All rights reserved.
A symmetric multivariate leakage correction for MEG connectomes
Colclough, G.L.; Brookes, M.J.; Smith, S.M.; Woolrich, M.W.
2015-01-01
Ambiguities in the source reconstruction of magnetoencephalographic (MEG) measurements can cause spurious correlations between estimated source time-courses. In this paper, we propose a symmetric orthogonalisation method to correct for these artificial correlations between a set of multiple regions of interest (ROIs). This process enables the straightforward application of network modelling methods, including partial correlation or multivariate autoregressive modelling, to infer connectomes, or functional networks, from the corrected ROIs. Here, we apply the correction to simulated MEG recordings of simple networks and to a resting-state dataset collected from eight subjects, before computing the partial correlations between power envelopes of the corrected ROI time-courses. We show accurate reconstruction of our simulated networks, and in the analysis of real MEG resting-state connectivity, we find dense bilateral connections within the motor and visual networks, together with longer-range direct fronto-parietal connections. PMID:25862259
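The symmetric orthogonalisation proposed above has a compact closed form: the closest set of mutually orthogonal time-courses to a matrix A of ROI time-courses is L = A(AᵀA)^(-1/2), which is obtainable from the SVD. A minimal sketch on synthetic data standing in for MEG source estimates:

```python
import numpy as np

# Symmetric (Loewdin) orthogonalisation sketch: replace ROI time-courses A
# (time x ROI) with the closest mutually orthogonal set L = A (A^T A)^{-1/2}.
# If A = U S V^T, then L = U V^T.
rng = np.random.default_rng(4)
A = rng.normal(size=(1000, 3))      # three ROI time-courses (synthetic)
A[:, 1] += 0.5 * A[:, 0]            # "leakage": ROI 1 contaminated by ROI 0

U, s, Vt = np.linalg.svd(A, full_matrices=False)
L = U @ Vt                          # orthogonalised time-courses

print(np.round(L.T @ L, 6))         # identity matrix: zero static correlation
```

Unlike sequential Gram-Schmidt, this correction treats all ROIs symmetrically, so no region is privileged by ordering.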
Ivanusic, Daniel; Denner, Joachim; Bannert, Norbert
2016-08-01
This unit provides a guide and detailed protocol for studying membrane protein-protein interactions (PPI) using the acceptor-sensitized Förster resonance energy transfer (FRET) method in combination with the proximity ligation assay (PLA). The protocol in this unit is focused on the preparation of FRET-PLA samples and the detection of correlative FRET/PLA signals as well as on the analysis of FRET-PLA data and interpretation of correlative results when using cyan fluorescent protein (CFP) as a FRET donor and yellow fluorescent protein (YFP) as a FRET acceptor. The correlative application of FRET and PLA combines two powerful tools for monitoring PPI, yielding results that are more reliable than with either technique alone. © 2016 by John Wiley & Sons, Inc.
A novel method for real-time edge-enhancement and its application to pattern recognition
NASA Astrophysics Data System (ADS)
Ge, Huayong; Bai, Enjian; Fan, Hong
2010-11-01
The coupling gain coefficient g is redefined and derived based on coupling theory, and its variation for different ΓL and r is analyzed. A new optical system is proposed for image edge-enhancement. It recycles the back signal to amplify the edge signal, which has the advantages of high throughput efficiency and brightness. The optical system is designed and built, and an edge-enhanced image of a hand bone is captured electronically by a CCD camera. The principle of optical correlation is demonstrated; the 3-D correlation distribution of the letter H with and without edge-enhancement is simulated, and the discrimination capability Iac and the full-width at half-maximum intensity (FWHM) are compared for the two kinds of correlators. The analysis shows that edge-enhancement preprocessing can effectively improve the performance of a correlator.
Moore, Eric J; Price, Daniel L; Van Abel, Kathryn M; Carlson, Matthew L
2015-02-01
Application to otolaryngology-head and neck surgery residency is highly competitive, and the interview process strives to select qualified applicants with a high aptitude for the specialty. Commonly employed criteria for applicant selection have failed to show correlation with proficiency during residency training. We evaluate the correlation between the results of a surgical aptitude test administered to otolaryngology resident applicants and their performance during residency. Retrospective study at an academic otolaryngology-head and neck surgery residency program. Between 2007 and 2013, 224 resident applicants participated in a previously described surgical aptitude test administered at a microvascular surgical station. The composite score and attitudinal scores for 24 consecutive residents who matched at our institution were recorded, and their residency performance was analyzed by faculty survey on a five-point scale. The composite and attitudinal scores were analyzed for correlation with residency performance score by regression analysis. Twenty-four residents were evaluated for overall quality as a clinician by eight faculty members who were blinded to the results of surgical aptitude testing. The results of these surveys showed good inter-rater reliability. Both the overall aptitude test scores and the subset attitudinal score showed reliability in predicting performance during residency training. The goal of the residency selection process is to evaluate the candidate's potential for success in residency and beyond. The results of this study suggest that a simple-to-administer clinical skills test may have predictive value for success in residency and clinician quality. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
A grid for a precise analysis of daily activities.
Wojtasik, V; Olivier, C; Lekeu, F; Quittre, A; Adam, S; Salmon, E
2010-01-01
Assessment of daily living activities is essential in patients with Alzheimer's disease. Most current tools quantitatively assess overall ability but provide little qualitative information on individual difficulties. Only a few tools allow therapists to evaluate stereotyped activities and record different types of errors. We capitalised on the Kitchen Activity Assessment to design a widely applicable analysis grid that provides both qualitative and quantitative data on activity performance. A cooking activity was videotaped in 15 patients with dementia and assessed according to the different steps in the execution of the task. The evaluations obtained with our grid showed good correlations between raters, between versions of the grid and between sessions. Moreover, the degree of independence obtained with our analysis of the task correlated with the Kitchen Activity Assessment score and with a global score of cognitive functioning. We conclude that assessment of a daily living activity with this analysis grid is reproducible and relatively independent of the therapist, and thus provides quantitative and qualitative information useful for both evaluating and caring for demented patients.
Dispersion analysis of passive surface-wave noise generated during hydraulic-fracturing operations
Forghani-Arani, Farnoush; Willis, Mark; Snieder, Roel; Haines, Seth S.; Behura, Jyoti; Batzle, Mike; Davidson, Michael
2014-01-01
Surface-wave dispersion analysis is useful for estimating near-surface shear-wave velocity models, designing receiver arrays, and suppressing surface waves. Here, we analyze whether passive seismic noise generated during hydraulic-fracturing operations can be used to extract surface-wave dispersion characteristics. Applying seismic interferometry to noise measurements, we extract surface waves by cross-correlating several minutes of passive records; this approach is distinct from previous studies that used hours or days of passive records for cross-correlation. For comparison, we also perform dispersion analysis for an active-source array that has some receivers in common with the passive array. The active and passive data show good agreement in the dispersive character of the fundamental-mode surface waves. For the higher-mode surface waves, however, active and passive data resolve the dispersive properties at different frequency ranges. To demonstrate an application of dispersion analysis, we invert the observed surface-wave dispersion characteristics to determine a near-surface, one-dimensional shear-wave velocity model.
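The interferometric core of the approach, cross-correlating passive records to recover inter-receiver travel times, can be sketched with white noise and a known delay; real processing stacks many correlation windows and then measures dispersion across frequency. All parameters below are illustrative:

```python
import numpy as np

# Seismic-interferometry sketch: the cross-correlation of noise recorded at
# two receivers peaks at the inter-receiver travel time.
rng = np.random.default_rng(5)
fs = 100.0                      # sampling rate in Hz (illustrative)
n = 5000
src = rng.normal(size=n)        # continuous noise wavefield
delay = 30                      # 0.3 s propagation time between receivers
rec1 = src
rec2 = np.roll(src, delay)      # same noise, arriving 30 samples later

xc = np.correlate(rec2, rec1, mode="full")   # cross-correlate the records
lag = np.argmax(xc) - (n - 1)                # sample lag of the peak
print(lag / fs)                 # recovered travel time, 0.3 s
```

With many receiver pairs, repeating this across narrow frequency bands yields the phase-velocity dispersion curve that is then inverted for structure.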
Cui, Yang; Wang, Silong; Yan, Shaokui
2016-01-01
Phi coefficient directly depends on the frequencies of occurrence of organisms and has been widely used in vegetation ecology to analyse the associations of organisms with site groups, providing a characterization of ecological preference, but its application in soil ecology remains rare. Based on a single field experiment, this study assessed the applicability of the phi coefficient in indicating the habitat preferences of soil fauna, by comparing phi coefficient-induced results with those of ordination methods in characterizing soil fauna-habitat (factor) relationships. Eight different habitats of soil fauna were implemented by reciprocal transfer of defaunated soil cores between two types of subtropical forests. Canonical correlation analysis (CCorA) showed that ecological patterns of fauna-habitat relationships and inter-fauna taxa relationships expressed, respectively, by phi coefficients and predicted abundances calculated from partial redundancy analysis (RDA), were extremely similar, and a highly significant relationship between the two datasets was observed (Pillai's trace statistic = 1.998, P = 0.007). In addition, highly positive correlations between phi coefficients and predicted abundances for Acari, Collembola, Nematoda and Hemiptera were observed using linear regression analysis. Quantitative relationships between habitat preferences and soil chemical variables were also obtained by linear regression, which were analogous to the results displayed in a partial RDA biplot. Our results suggest that the phi coefficient could be applicable on a local scale in evaluating habitat preferences of soil fauna at coarse taxonomic levels, and that phi coefficient-induced information, such as ecological preferences and the associated quantitative relationships with habitat factors, will be largely complementary to the results of ordination methods.
The application of the phi coefficient in soil ecology may extend our knowledge about habitat preferences and distribution-abundance relationships, which will benefit the understanding of biodistributions and variations in community compositions in the soil. Similar studies at other places and scales apart from our local site will be needed for further evaluation of the phi coefficient.
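The phi coefficient itself comes directly from a 2x2 presence/absence table. A minimal sketch with toy occurrence counts (not the study's data):

```python
import math

# Phi coefficient from a 2x2 occurrence table for a taxon inside vs outside a
# habitat group: phi = (ad - bc) / sqrt((a+b)(c+d)(a+c)(b+d)).
def phi(a, b, c, d):
    """a: present in group, b: present outside, c: absent in group, d: absent outside."""
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom

# toy counts: a taxon found in 8 of 10 plots of one habitat, 2 of 10 elsewhere
print(phi(8, 2, 2, 8))   # 0.6: positive habitat preference
```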
PMID:26930593
A model of return intervals between earthquake events
NASA Astrophysics Data System (ADS)
Zhou, Yu; Chechkin, Aleksei; Sokolov, Igor M.; Kantz, Holger
2016-06-01
Application of the diffusion entropy analysis and the standard deviation analysis to the time sequence of southern California earthquake events from 1976 to 2002 uncovered scaling behavior typical for anomalous diffusion. However, the origin of such behavior is still under debate. Some studies attribute the scaling behavior to the correlations in the return intervals, or waiting times, between aftershocks or mainshocks. To elucidate the nature of the scaling, we applied specific reshuffling techniques to eliminate correlations between different types of events and then examined how this affects the scaling behavior. We demonstrate that the origin of the observed scaling behavior is the interplay between the mainshock waiting time distribution and the structure of clusters of aftershocks, not correlations in waiting times between the mainshocks and aftershocks themselves. Our findings are corroborated by numerical simulations of a simple model showing very similar behavior. The mainshocks are modeled by a renewal process with a power-law waiting time distribution between events, and aftershocks follow a nonhomogeneous Poisson process with the rate governed by Omori's law.
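The model described in the last two sentences can be sketched directly: a renewal process with power-law waiting times for mainshocks, and aftershock times drawn from an Omori-law rate (here p = 1) by inverse-CDF sampling. All parameter values are illustrative assumptions:

```python
import random

random.seed(6)

def powerlaw_wait(alpha=1.5, t_min=1.0):
    # inverse-CDF sampling of P(tau > t) = (t_min / t)^alpha for t >= t_min
    u = 1.0 - random.random()          # u in (0, 1], avoids division by zero
    return t_min * u ** (-1.0 / alpha)

def omori_aftershocks(t_main, n=20, c=0.1, horizon=50.0):
    # aftershock times with density proportional to 1/(c + t) on [0, horizon]
    # (Omori's law with p = 1), sampled via the inverse CDF
    times = []
    for _ in range(n):
        u = random.random()
        t = c * ((1 + horizon / c) ** u - 1)
        times.append(t_main + t)
    return times

events, t = [], 0.0
for _ in range(100):
    t += powerlaw_wait()               # renewal process for mainshocks
    events.append(t)
    events.extend(omori_aftershocks(t))  # cluster of aftershocks
events.sort()
print(len(events))                     # 2100: 100 mainshocks + 2000 aftershocks
```

Reshuffling the synthetic waiting times, as done in the study, lets one test which ingredient produces the anomalous scaling.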
Spurgeon, David J; Rowland, Philip; Ainsworth, Gillian; Rothery, Peter; Long, Sara; Black, Helaina I J
2008-05-01
Concentrations of seven metals were measured in over 1000 samples as part of an integrated survey. Sixteen metal pairs were significantly positively correlated. Cluster analysis identified two clusters. Metals from the largest (Cr, Cu, Ni, V, Zn), but not the smallest (Cd, Pb) cluster were significantly negatively correlated with spatial location and soil pH and organic matter content. Cd and Pb were not correlated with these parameters, possibly due to the masking effect of recent extensive release. Analysis of trends with soil properties in different habitats indicated that general trends may not necessarily be applicable to all areas. A risk assessment indicated that Zn poses the most widespread direct risk to soil fauna and Cd the least. Any risks associated with high metal concentrations are, however, likely to be greatest in habitats such as arable and horticultural, improved grassland and built-up areas where soil metal concentrations are more frequently elevated.
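The cluster structure reported here can be illustrated with a toy version of the analysis: hierarchical clustering of metals on a correlation distance. The synthetic data below simply encode two latent sources and are not the survey measurements:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)

# Hypothetical soil survey: Cr, Cu, Ni, V, Zn co-vary with one latent
# source; Cd and Pb with another (all values are synthetic).
n = 500
geogenic = rng.normal(size=n)
anthropogenic = rng.normal(size=n)
metals = {
    "Cr": geogenic + 0.3 * rng.normal(size=n),
    "Cu": geogenic + 0.3 * rng.normal(size=n),
    "Ni": geogenic + 0.3 * rng.normal(size=n),
    "V":  geogenic + 0.3 * rng.normal(size=n),
    "Zn": geogenic + 0.3 * rng.normal(size=n),
    "Cd": anthropogenic + 0.3 * rng.normal(size=n),
    "Pb": anthropogenic + 0.3 * rng.normal(size=n),
}
names = list(metals)
X = np.column_stack([metals[m] for m in names])
corr = np.corrcoef(X, rowvar=False)

# Cluster on correlation distance (1 - r), condensed upper triangle;
# cutting the tree at two clusters recovers the two groups.
dist = 1.0 - corr[np.triu_indices(len(names), k=1)]
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(dict(zip(names, labels)))
```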
Creep-rupture reliability analysis
NASA Technical Reports Server (NTRS)
Peralta-Duran, A.; Wirsching, P. H.
1984-01-01
A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time-temperature parameters (TTPs) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTPs. A method is developed for evaluating material-dependent constants for TTPs. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long term behavior. Uncertainty in predicting long term behavior from short term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state of the art reliability methods to the design of components under creep.
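For concreteness, the Larson-Miller form of a time-temperature parameter, and the time-temperature trade-off it encodes, can be sketched as follows. The temperatures, times, and the constant C = 20 are illustrative defaults, not values from the report:

```python
import numpy as np

def larson_miller(T_kelvin, rupture_hours, C=20.0):
    """Larson-Miller parameter: LMP = T * (C + log10(t_r)).
    C ~ 20 is the classical default; the report's point is that an
    optimized, material-specific C tightens the correlation."""
    return T_kelvin * (C + np.log10(rupture_hours))

# Two test conditions mapping to the same LMP represent the same
# creep "state": a hot, short test stands in for a cooler, long one.
lmp_short = larson_miller(900.0, 100.0)   # 900 * (20 + 2) = 19800
print(lmp_short)

# Time-temperature trade-off: rupture life at 800 K with the same LMP.
t_equiv = 10 ** (lmp_short / 800.0 - 20.0)
print(round(t_equiv, 1))  # ~56234 h
```

This is the extrapolation step the abstract refers to: short-term, high-temperature data are projected to long-term service conditions along lines of constant parameter value.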
Gao, Jianbo; Hu, Jing; Mao, Xiang; Perc, Matjaž
2012-01-01
Culturomics was recently introduced as the application of high-throughput data collection and analysis to the study of human culture. Here, we make use of these data by investigating fluctuations in yearly usage frequencies of specific words that describe social and natural phenomena, as derived from books that were published over the course of the past two centuries. We show that the determination of the Hurst parameter by means of fractal analysis provides fundamental insights into the nature of long-range correlations contained in the culturomic trajectories, and by doing so offers new interpretations as to what might be the main driving forces behind the examined phenomena. Quite remarkably, we find that social and natural phenomena are governed by fundamentally different processes. While natural phenomena have properties that are typical for processes with persistent long-range correlations, social phenomena are better described as non-stationary, on–off intermittent or Lévy walk processes. PMID:22337632
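The Hurst parameter mentioned above can be estimated in several ways; below is a minimal sketch using the aggregated-variance method, one of the simpler estimators and not necessarily the one used in the paper:

```python
import numpy as np

def hurst_aggvar(x, scales=(4, 8, 16, 32, 64)):
    """Aggregated-variance estimate of the Hurst exponent: for a
    stationary process, the variance of block means scales as
    m^(2H - 2) with block size m, so H is read off a log-log slope."""
    x = np.asarray(x, dtype=float)
    log_m, log_v = [], []
    for m in scales:
        n_blocks = len(x) // m
        means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(means.var()))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0

rng = np.random.default_rng(2)
h_white = hurst_aggvar(rng.normal(size=20000))
print(round(h_white, 2))  # close to 0.5 for uncorrelated noise
```

H near 0.5 indicates no long-range correlation; H above 0.5 indicates the persistent correlations the paper associates with natural phenomena. The non-stationary, intermittent processes invoked for social phenomena require the more careful fractal analysis described in the paper.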
[Mathematical exploration of essence of herbal properties based on "Three-Elements" theory].
Jin, Rui; Zhao, Qian; Zhang, Bing
2014-10-01
Herbal property theory of traditional Chinese medicines is the theoretical guidance on authentication of medicinal plants, herborization, preparation of herbal medicines for decoction and clinical application, with important theoretical value and practical significance. Our research team proposed the "three-element" theory for herbal properties for the first time, conducted a study using combined methods of philology, chemistry, pharmacology and mathematics, and then drew the research conclusion that herbal properties are defined as the chemical compositions-based comprehensive expression with complex and multi-level (positive/negative) biological effects in a specific organism state. In this paper, the researchers made a systematic mathematical analysis in four aspects: the correlation between herbal properties and chemical component factors, the correlation between herbal properties and the organism state factor, the correlation between herbal properties and the biological effect factor, and the integration study of the three elements. They also proposed a future outlook and provided a reference for mathematical studies and mathematical analysis of herbal properties.
Kevill, Dennis Neil; Kim, Chang-Bae; D'Souza, Malcolm John
2018-03-01
A Grunwald-Winstein treatment of the specific rates of solvolysis of α-bromoisobutyrophenone in 100% methanol and in several aqueous ethanol, methanol, acetone, 2,2,2-trifluoroethanol (TFE), and 1,1,1,3,3,3-hexafluoro-2-propanol (HFIP) mixtures gives a good logarithmic correlation against a linear combination of N_T (solvent nucleophilicity) and Y_Br (solvent ionizing power) values. The l and m sensitivity values are compared to those previously reported for α-bromoacetophenone and to those obtained from parallel treatments of literature specific rate values for the solvolyses of several tertiary mesylates containing a C(=O)R group attached at the α-carbon. Kinetic data obtained earlier by Pasto and Sevenair for the solvolyses of the same substrate in 75% aqueous ethanol (by weight) in the presence of silver perchlorate and perchloric acid are analyzed using multiple regression analysis.
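The Grunwald-Winstein treatment amounts to a multiple regression of log(k/k0) on N_T and Y_Br. A sketch with synthetic solvent parameters and rate constants; the actual N_T, Y_Br, and kinetic data are in the cited literature:

```python
import numpy as np

# Synthetic illustration of the extended Grunwald-Winstein equation
#   log(k/k0) = l * N_T + m * Y_Br + c
# The solvent parameters and rates below are made up for demonstration.
rng = np.random.default_rng(3)
N_T = rng.uniform(-4, 1, size=12)    # solvent nucleophilicity
Y_Br = rng.uniform(-2, 5, size=12)   # solvent ionizing power
l_true, m_true, c_true = 0.35, 0.80, 0.05
log_k = l_true * N_T + m_true * Y_Br + c_true + rng.normal(0, 0.02, 12)

# Multiple regression recovers the sensitivities l and m.
A = np.column_stack([N_T, Y_Br, np.ones_like(N_T)])
(l_fit, m_fit, c_fit), *_ = np.linalg.lstsq(A, log_k, rcond=None)
print(round(l_fit, 2), round(m_fit, 2))
```

The fitted l and m are the sensitivity values that the paper compares across substrates to infer mechanism.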
A simple program to measure and analyse tree rings using Excel, R and SigmaScan
Hietz, Peter
2011-01-01
I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications in tree-ring analysis. The first macro measures ring width marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood-earlywood transition in conifers, and a third shows the potential for automatic detection of boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects or to expand by making use of already available code. PMID:26109835
Arsenyev, P A; Trezvov, V V; Saratovskaya, N V
1997-01-01
This work presents a method for determining the phase composition of calcium hydroxylapatite from its infrared spectrum. The method uses factor analysis of the spectral data of a calibration set of samples to determine the minimal number of factors required to reproduce the spectra within experimental error. Multiple linear regression is applied to establish the correlation between the factor scores of the calibration standards and their properties. The regression equations can be used to predict the property value of an unknown sample. The regression model was built for determination of beta-tricalcium phosphate content in hydroxylapatite, and the quality of the model was estimated statistically. Application of factor analysis to the spectral data increases the accuracy of beta-tricalcium phosphate determination and extends the determination range toward lower concentrations. Reproducibility of the results is retained.
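The spectra-to-composition pipeline (factor analysis of calibration spectra, then regression of factor scores on composition) can be sketched with synthetic two-component spectra standing in for the hydroxylapatite/beta-TCP system. Peak positions, widths, and noise levels are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic IR-like spectra: mixtures of two pure-component profiles.
wavenumbers = np.linspace(0, 1, 200)
pure_hap = np.exp(-((wavenumbers - 0.35) / 0.05) ** 2)
pure_tcp = np.exp(-((wavenumbers - 0.60) / 0.05) ** 2)
beta_tcp_frac = rng.uniform(0, 0.3, size=30)       # calibration set
spectra = (np.outer(1 - beta_tcp_frac, pure_hap)
           + np.outer(beta_tcp_frac, pure_tcp)
           + rng.normal(0, 0.005, (30, 200)))

# Factor analysis step: SVD of mean-centred spectra; a couple of
# factors reproduce the data to within the noise level.
centred = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U[:, :2] * s[:2]

# Regression of factor scores against known beta-TCP content.
A = np.column_stack([scores, np.ones(30)])
coef, *_ = np.linalg.lstsq(A, beta_tcp_frac, rcond=None)
predicted = A @ coef
rmse = np.sqrt(np.mean((predicted - beta_tcp_frac) ** 2))
print(rmse < 0.01)
```

The regression equation (`coef`) is what would then be applied to the factor scores of an unknown sample's spectrum.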
NASA Astrophysics Data System (ADS)
Chowdhry, Bhawani Shankar; White, Neil M.; Jeswani, Jai Kumar; Dayo, Khalil; Rathi, Manorma
2009-07-01
Disasters affecting infrastructure, such as the 2001 earthquakes in India, 2005 in Pakistan, 2008 in China and the 2004 tsunami in Asia, point to a common need for intelligent buildings and smart civil structures. Now, imagine massive reductions in the time to get infrastructure working again, real-time information on damage to buildings, massive reductions in the cost and time to certify that structures are undamaged and can still be operated, and reductions in the number of structures to be rebuilt (if they are known not to be damaged). Achieving these ideas would lead to huge, quantifiable, long-term savings for government and industry. Wireless sensor networks (WSNs) can be deployed in buildings to make any civil structure both smart and intelligent. WSNs have recently gained much attention in both the public and research communities because they are expected to bring a new paradigm to the interaction between humans, the environment, and machines. This paper presents the deployment of WSN nodes in the Top Quality Centralized Instrumentation Centre (TQCIC). We created an ad hoc networking application to collect real-time data sensed from nodes that were randomly distributed throughout the building. If the sensors are relocated, the application automatically reconfigures itself in light of the new routing topology. WSNs are event-based systems that rely on the collective effort of several micro-sensor nodes continuously observing a physical phenomenon. WSN applications require spatially dense sensor deployment in order to achieve satisfactory coverage. The degree of spatial correlation increases with decreasing inter-node separation. Energy consumption is reduced dramatically by having only those sensor nodes with unique readings transmit their data. We report on an algorithm based on a spatial correlation technique that assures high QoS (in terms of SNR) of the network as well as proper utilization of energy, by suppressing redundant data transmission.
The visualization and analysis of WSN data are presented in a Windows-based user interface.
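The redundancy-suppression idea can be sketched as a greedy selection rule: a node stays silent if a nearby node has already reported a similar reading. This is a simplification; the paper's algorithm additionally ties the decision to the network's SNR target. All node positions and readings below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

def select_reporters(positions, readings, corr_radius, tolerance):
    """Greedy sketch of spatial-correlation-based suppression: a node
    transmits only if no already-selected node within corr_radius has
    reported a similar reading (difference below tolerance)."""
    selected = []
    for i in np.argsort(readings):          # deterministic visiting order
        redundant = any(
            np.linalg.norm(positions[i] - positions[j]) < corr_radius
            and abs(readings[i] - readings[j]) < tolerance
            for j in selected
        )
        if not redundant:
            selected.append(int(i))
    return selected

# 50 nodes on a 10 m x 10 m floor; readings follow a smooth field,
# so nearby nodes measure nearly the same value.
pos = rng.uniform(0, 10, size=(50, 2))
field = 20.0 + 0.5 * pos[:, 0] + 0.3 * pos[:, 1]
reporters = select_reporters(pos, field, corr_radius=3.0, tolerance=2.0)
print(len(reporters), "of", len(pos), "nodes transmit")
```

Only a fraction of the nodes transmit, which is the energy saving the abstract describes; the suppressed readings are recoverable from their correlated neighbours.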
Naoe, Shoji; Tayasu, Ichiro; Masaki, Takashi; Koike, Shinsuke
2016-10-01
Vertical seed dispersal, which plays a key role in plant escape and/or expansion under climate change, was recently evaluated for the first time using the negative correlation between altitudes and the oxygen isotope ratio of seeds. Although this method is innovative, its applicability to other plants is unknown. To explore the applicability of the method, we regressed altitudes on δ18O of seeds of five woody species constituting three families in temperate forests in central Japan. Because climatic factors, including temperature and precipitation that influence δ18O of plant materials, demonstrate intensive seasonal fluctuation in the temperate zone, we also evaluated the effect of the fruiting season of each species on δ18O of seeds using generalized linear mixed models (GLMM). Negative correlation between altitudes and δ18O of seeds was found in four of the five species tested. The slope of the regression lines tended to be lower in late-fruiting species. The GLMM analysis revealed that altitudes and date of fruiting peak negatively affected δ18O of seeds. These results indicate that the estimation of vertical seed dispersal using δ18O of seeds can be applicable for various species, not just confined to specific taxa, by identifying the altitudes of the plants that produced the seeds. The results also suggest that the regression line between altitudes and δ18O of seeds is rather species specific and that vertical seed dispersal in late-fruiting species is estimated at a low resolution due to their small regression slopes. A future study on the identification of environmental factors and plant traits that cause a difference in δ18O of seeds, combined with an improvement of the analysis, will lead to effective evaluation of vertical seed dispersal in various species and thereby promote our understanding of the mechanism and ecological functions of vertical seed dispersal.
Pan, Wei; Hu, Yuan-Jia; Wang, Yi-Tao
2011-08-01
The structure of international flow of acupuncture knowledge was explored in this article so as to promote the globalization of acupuncture technology innovation. Statistical methods were adopted to reveal the geographical distribution of acupuncture patents in the U.S.A. and the influencing factors of the cumulative advantage of acupuncture techniques as well as the innovation value of the application of acupuncture patents. Social network analysis was also utilized to establish a global innovation network of acupuncture technology. The results show that cumulative strength in acupuncture technology correlates with the patent retention period, the innovative value of an acupuncture invention correlates with the frequency of patent citation, and the U.S.A. and Canada occupy central positions in the global acupuncture information and technology delivery system.
Physicochemical properties of quinoa starch.
Li, Guantian; Wang, Sunan; Zhu, Fan
2016-02-10
Physicochemical properties of quinoa starches isolated from 26 commercial samples from a wide range of collection were studied. Swelling power (SP), water solubility index (WSI), amylose leaching (AML), enzyme susceptibility, pasting, thermal and textural properties were analyzed. Apparent amylose contents (AAM) ranged from 7.7 to 25.7%. Great variations in the diverse physicochemical properties were observed. Correlation analysis showed that AAM was the most significant factor related to AML, WSI, and pasting parameters. Correlations among diverse physicochemical parameters were analyzed. Principal component analysis using twenty-three variables was used to visualize the differences among samples. Six principal components were extracted, which together explained 88.8% of the total variation. The wide variations in physicochemical properties could contribute to innovative utilization of quinoa starch for food and non-food applications. Copyright © 2015 Elsevier Ltd. All rights reserved.
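The dimensionality-reduction step (how many principal components summarize the physicochemical traits) can be sketched on synthetic data of the same shape, 26 samples by 23 standardized variables; the latent-factor structure below is assumed, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic stand-in for the starch dataset: 26 samples x 23 traits
# driven by a few latent factors plus noise.
n_samples, n_traits, n_factors = 26, 23, 4
loadings = rng.normal(size=(n_factors, n_traits))
factor_scores = rng.normal(size=(n_samples, n_factors))
data = factor_scores @ loadings + 0.5 * rng.normal(size=(n_samples, n_traits))

# Standardize, then PCA via SVD; the cumulative explained variance
# tells how many components summarize the trait space.
z = (data - data.mean(axis=0)) / data.std(axis=0)
s = np.linalg.svd(z, compute_uv=False)
explained = s ** 2 / np.sum(s ** 2)
cumulative = np.cumsum(explained)
print(int(np.argmax(cumulative >= 0.80)) + 1, "components reach 80%")
```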
Introduction and application of the multiscale coefficient of variation analysis.
Abney, Drew H; Kello, Christopher T; Balasubramaniam, Ramesh
2017-10-01
Quantifying how patterns of behavior relate across multiple levels of measurement typically requires long time series for reliable parameter estimation. We describe a novel analysis that estimates patterns of variability across multiple scales of analysis suitable for time series of short duration. The multiscale coefficient of variation (MSCV) measures the distance between local coefficient of variation estimates within particular time windows and the overall coefficient of variation across all time samples. We first describe the MSCV analysis and provide an example analytical protocol with corresponding MATLAB implementation and code. Next, we present a simulation study testing the new analysis using time series generated by ARFIMA models that span white noise, short-term and long-term correlations. The MSCV analysis was observed to be sensitive to specific parameters of ARFIMA models varying in the type of temporal structure and time series length. We then apply the MSCV analysis to short time series of speech phrases and musical themes to show commonalities in multiscale structure. The simulation and application studies provide evidence that the MSCV analysis can discriminate between time series varying in multiscale structure and length.
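A simplified reading of the MSCV idea, the distance between windowed local CV estimates and the global CV, can be sketched as follows. This is an illustration of the concept, not Abney et al.'s reference MATLAB implementation:

```python
import numpy as np

def mscv(x, window_sizes=(5, 10, 20)):
    """Sketch of a multiscale coefficient of variation: at each window
    size, average the local CV over non-overlapping windows, then take
    its distance from the global CV; finally average over scales."""
    x = np.asarray(x, dtype=float)
    global_cv = x.std() / x.mean()
    distances = []
    for w in window_sizes:
        n_win = len(x) // w
        blocks = x[:n_win * w].reshape(n_win, w)
        local_cv = blocks.std(axis=1) / blocks.mean(axis=1)
        distances.append(abs(local_cv.mean() - global_cv))
    return float(np.mean(distances))

rng = np.random.default_rng(7)
iid = np.abs(rng.normal(10, 1, size=400))           # homogeneous series
bursty = np.concatenate([rng.normal(5, 0.5, 200),
                         rng.normal(15, 0.5, 200)])  # two regimes
print(mscv(iid) < mscv(bursty))
```

For a homogeneous series, local and global CV nearly coincide; for a series with regime structure, local windows are much less variable than the whole, so the distance grows, which is the kind of multiscale signature the analysis is designed to pick up.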
Calibration of groundwater vulnerability mapping using the generalized reduced gradient method.
Elçi, Alper
2017-12-01
Groundwater vulnerability assessment studies are essential in water resources management. Overlay-and-index methods such as DRASTIC are widely used for mapping of groundwater vulnerability; however, these methods mainly suffer from a subjective selection of model parameters. The objective of this study is to introduce a calibration procedure that results in a more accurate assessment of groundwater vulnerability. The improvement of the assessment is formulated as a parameter optimization problem using an objective function that is based on the correlation between actual groundwater contamination and vulnerability index values. The non-linear optimization problem is solved with the generalized-reduced-gradient (GRG) method, a numerical optimization algorithm. To demonstrate the applicability of the procedure, a vulnerability map for the Tahtali stream basin is calibrated using nitrate concentration data. The calibration procedure is easy to implement and aims to maximize the correlation between observed pollutant concentrations and groundwater vulnerability index values. The influence of each vulnerability parameter in the calculation of the vulnerability index is assessed by performing a single-parameter sensitivity analysis. Results of the sensitivity analysis show that all factors are effective on the final vulnerability index. Calibration of the vulnerability map improves the correlation between index values and measured nitrate concentrations by 19%. The regression coefficient increases from 0.280 to 0.485. It is evident that the spatial distribution and the proportions of vulnerability class areas are significantly altered with the calibration process. Although the applicability of the calibration method is demonstrated on the DRASTIC model, the approach is not specific to a certain model and can also be easily applied to other overlay-and-index methods. Copyright © 2017 Elsevier B.V. All rights reserved.
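The calibration idea, tuning index weights to maximize the correlation with observed contamination, can be sketched with a generic gradient-based optimizer standing in for GRG (SciPy's SLSQP here). The factor ratings and nitrate values are synthetic, not the Tahtali basin data:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import pearsonr

rng = np.random.default_rng(8)

# Synthetic stand-in for DRASTIC: 7 rated factors per grid cell, with
# observed nitrate driven mostly by two of them.
n_cells = 300
ratings = rng.uniform(1, 10, size=(n_cells, 7))
nitrate = 2.0 * ratings[:, 0] + 1.0 * ratings[:, 3] + rng.normal(0, 2, n_cells)

default_w = np.array([5.0, 4, 3, 2, 1, 5, 3])   # subjective default weights

def neg_corr(w):
    # Objective: maximize Pearson r between the weighted vulnerability
    # index and observed nitrate, i.e. minimize its negative.
    index = ratings @ w
    return -pearsonr(index, nitrate)[0]

res = minimize(neg_corr, default_w, method="SLSQP",
               bounds=[(0.0, 5.0)] * 7)
r_before = -neg_corr(default_w)
r_after = -res.fun
print(round(r_before, 3), "->", round(r_after, 3))
```

Calibrated weights shift toward the factors that actually drive contamination, which is exactly the subjectivity the paper's procedure is meant to remove.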
Zilkens, Christoph; Miese, Falk; Kim, Young-Jo; Jäger, Marcus; Mamisch, Tallal C; Hosalkar, Harish; Antoch, Gerald; Krauspe, Rüdiger; Bittersohl, Bernd
2014-01-01
To investigate the potential of delayed gadolinium-enhanced magnetic resonance imaging in cartilage (dGEMRIC) after intra-articular (ia) contrast agent administration at 3 Tesla (T), a paired study comparing intravenous (iv) dGEMRIC (standard) with ia-dGEMRIC was performed. Thirty-five symptomatic patients with suspected cartilage damage underwent ia- and iv-dGEMRIC. MRI was performed with a 3T system, with an interval of 2 weeks between the two measurements. For iv-dGEMRIC, FDA approved Gd-DOTA(-) was injected intravenously 45 min before the MRI scan. For ia-dGEMRIC, 10-20 mL of a 2 mM solution of Gd-DOTA(-) was injected under fluoroscopic guidance 30 min before the MRI scan. Both ia- and iv-dGEMRIC demonstrated the typical T1Gd pattern in hip joint cartilage with increasing values toward the superior regions in acetabular cartilage, reflecting the higher glycosaminoglycan (GAG) content in the main weight-bearing area. Correlation analysis revealed a moderate correlation between both techniques (r = 0.439, P-value < 0.001), whereas the T1Gd values for iv-dGEMRIC were significantly higher than those for ia-dGEMRIC. This corresponds with the Bland-Altman plot analysis, which revealed a systematic bias (higher T1Gd values after iv gadolinium application) of ∼70 ms. Ia-dGEMRIC was able to reveal the characteristic T1Gd pattern in hip joint cartilage, confirming the sensitivity of ia-dGEMRIC for GAG. In addition, there was a significant correlation between iv-dGEMRIC and ia-dGEMRIC. However, the T1Gd values after ia contrast media application were significantly lower than those after iv application, which must be considered in future studies. Copyright © 2013 Wiley Periodicals, Inc.
Wienke, B R; O'Leary, T R
2008-05-01
Linking model and data, we detail the LANL diving reduced gradient bubble model (RGBM), dynamical principles, and correlation with data in the LANL Data Bank. Table, profile, and meter risks are obtained from likelihood analysis and quoted for air, nitrox, and helitrox no-decompression time limits, repetitive dive tables, and selected mixed gas and repetitive profiles. Application analyses include the EXPLORER decompression meter algorithm, NAUI tables, University of Wisconsin Seafood Diver tables, comparative NAUI, PADI, and Oceanic NDLs and repetitive dives, comparative nitrogen and helium mixed gas risks, the USS Perry deep rebreather (RB) exploration dive, a world record open circuit (OC) dive, and Woodville Karst Plain Project (WKPP) extreme cave exploration profiles. The algorithm has seen extensive and utilitarian application in mixed gas diving, both in recreational and technical sectors, and forms the basis for released tables and decompression meters used by scientific, commercial, and research divers. The LANL Data Bank is described, and the methods used to deduce risk are detailed. Risk functions for dissolved gas and bubbles are summarized. Parameters that can be used to estimate profile risk are tallied. To fit data, a modified Levenberg-Marquardt routine is employed with an L2 error norm. Appendices sketch the numerical methods and list reports from field testing for (real) mixed gas diving. A Monte Carlo-like sampling scheme for fast numerical analysis of the data is also detailed, as a coupled variance reduction technique and additional check on the canonical approach to estimating diving risk. The method suggests alternatives to the canonical approach. This work represents a first-time correlation effort linking a dynamical bubble model with deep stop data. Supercomputing resources are requisite to connect model and data in application.
NASA Astrophysics Data System (ADS)
Li, Zhixiong; Yan, Xinping; Wang, Xuping; Peng, Zhongxiao
2016-06-01
In the complex gear transmission systems of wind turbines, a crack is one of the most common failure modes and can be fatal to the wind turbine power systems. A single sensor may suffer from issues relating to its installation position and direction, resulting in the collection of weak dynamic responses of the cracked gear. A multi-channel sensor system is hence applied for signal acquisition, and blind source separation (BSS) technologies are employed to optimally process the information collected from multiple sensors. However, a literature review finds that most BSS-based fault detectors do not address the dependence/correlation between different moving components in the gear systems; in particular, the popular independent component analysis (ICA) assumes mutual independence of the vibration sources. The fault detection performance may be significantly influenced by the dependence/correlation between vibration sources. In order to address this issue, this paper presents a new method based on supervised order tracking bounded component analysis (SOTBCA) for gear crack detection in wind turbines. Bounded component analysis (BCA) is a state-of-the-art technique for dependent source separation that has so far been applied mainly to communication signals. To make it applicable to vibration analysis, in this work, order tracking has been appropriately incorporated into the BCA framework to eliminate noise and disturbance signal components. Then an autoregressive (AR) model built with prior knowledge about the crack fault is employed to supervise the reconstruction of the crack vibration source signature. The SOTBCA outputs only the one source signal that has the closest distance to the AR model.
Owing to the dependence tolerance ability of the BCA framework, interfering vibration sources that are dependent/correlated with the crack vibration source could be recognized by the SOTBCA, and hence, only useful fault information could be preserved in the reconstructed signal. The crack failure thus could be precisely identified by the cyclic spectral correlation analysis. A series of numerical simulations and experimental tests have been conducted to illustrate the advantages of the proposed SOTBCA method for fatigue crack detection. Comparisons to three representative techniques, i.e. Erdogan's BCA (E-BCA), joint approximate diagonalization of eigen-matrices (JADE), and FastICA, have demonstrated the effectiveness of the SOTBCA. Hence the proposed approach is suitable for accurate gear crack detection in practical applications.
SMS/GOES cell and battery data analysis report
NASA Technical Reports Server (NTRS)
Armantrout, J. D.
1977-01-01
The nickel-cadmium battery design developed for the Synchronous Meteorological Satellite (SMS) and Geostationary Operational Environmental Satellite (GOES) provided background and guidelines for future development, manufacture, and application of spacecraft batteries. SMS/GOES battery design, development, qualification testing, acceptance testing, and life testing/mission performance characteristics were evaluated for correlation with battery cell manufacturing process variables.
Medical University admission test: a confirmatory factor analysis of the results.
Luschin-Ebengreuth, Marion; Dimai, Hans P; Ithaler, Daniel; Neges, Heide M; Reibnegger, Gilbert
2016-05-01
The Graz Admission Test has been applied since the academic year 2006/2007. The validity of the Test was demonstrated by a significant improvement of study success and a significant reduction of dropout rate. The purpose of this study was a detailed analysis of the internal correlation structure of the various components of the Graz Admission Test. In particular, the question investigated was whether or not the various test parts constitute a suitable construct which might be designated as "Basic Knowledge in Natural Science." This study is an observational investigation, analyzing the results of the Graz Admission Test for the study of human medicine and dentistry. A total of 4741 applicants were included in the analysis. Principal component factor analysis (PCFA) as well as techniques from structural equation modeling, specifically confirmatory factor analysis (CFA), were employed to detect potential underlying latent variables governing the behavior of the measured variables. PCFA showed good clustering of the science test parts, including also text comprehension. A putative latent variable "Basic Knowledge in Natural Science," investigated by CFA, was indeed shown to govern the response behavior of the applicants in biology, chemistry, physics, and mathematics as well as text comprehension. The analysis of the correlation structure of the various test parts confirmed that the science test parts together with text comprehension constitute a satisfactory instrument for measuring a latent construct variable "Basic Knowledge in Natural Science." The present results suggest the fundamental importance of basic science knowledge for results obtained in the framework of the admission process for medical universities.
Kyeong, Sunghyon; Park, Seonjeong; Cheon, Keun-Ah; Kim, Jae-Jin; Song, Dong-Ho; Kim, Eunjoo
2015-01-01
Attention-deficit/hyperactivity disorder (ADHD) is currently diagnosed by a diagnostic interview, mainly based on subjective reports from parents or teachers. It is necessary to develop methods that rely on objectively measurable neurobiological data to assess brain-behavior relationships in patients with ADHD. We investigated the application of a topological data analysis tool, Mapper, to analyze the brain functional connectivity data from ADHD patients. To quantify the disease severity using the neuroimaging data, the decomposition of individual functional networks into normal and disease components by the healthy state model (HSM) was performed, and the magnitude of the disease component (MDC) was computed. Topological data analysis using Mapper was performed to distinguish children with ADHD (n = 196) from typically developing controls (TDC) (n = 214). In the topological data analysis, the partial clustering results of patients with ADHD and normal subjects were shown in a chain-like graph. In the correlation analysis, the MDC showed a significant increase with lower intelligence scores in TDC. We also found that the rates of comorbidity in ADHD significantly increased when the deviation of the functional connectivity from HSM was large. In addition, a significant correlation between ADHD symptom severity and MDC was found in part of the dataset. The application of HSM and topological data analysis methods in assessing brain functional connectivity seems to be a promising way to quantify ADHD symptom severity and to reveal hidden relationships between clinical phenotypic variables and brain connectivity.
Tracking moving targets behind a scattering medium via speckle correlation.
Guo, Chengfei; Liu, Jietao; Wu, Tengfei; Zhu, Lei; Shao, Xiaopeng
2018-02-01
Tracking moving targets behind a scattering medium is a challenge with many important applications in various fields. Owing to multiple scattering, instead of the object image, only a random speckle pattern can be received on the camera when light passes through highly scattering layers. Significantly, an important feature of speckle patterns has been found: the target information can be derived from the speckle correlation. In this work, inspired by notions used in computer vision and deformation detection, we demonstrate through simulations and experiments a simple object tracking method in which, by using the speckle correlation, the movement of a hidden object can be tracked in the lateral and axial directions. In addition, the rotation state of the moving target can be recognized by utilizing the autocorrelation of the speckle. This work will be beneficial for biomedical applications in the quantitative analysis of the working mechanisms of micro-objects and the acquisition of dynamical information about micro-object motion.
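Lateral tracking from speckle frames reduces, in the simplest memory-effect regime, to locating the cross-correlation peak between two frames. A pure-translation sketch on synthetic speckle (real speckle decorrelates with large shifts, which this toy example ignores):

```python
import numpy as np

rng = np.random.default_rng(9)

def track_shift(speckle_ref, speckle_moved):
    """Recover a lateral shift from two speckle frames via FFT
    cross-correlation: within the memory effect, a shifted object
    produces a nearly translated copy of the speckle pattern."""
    f1 = np.fft.fft2(speckle_ref)
    f2 = np.fft.fft2(speckle_moved)
    xcorr = np.fft.ifft2(np.conj(f1) * f2).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Interpret peak indices as signed (row, column) shifts.
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, xcorr.shape))

# Synthetic speckle frame and a copy displaced by (7, -3) pixels.
frame = rng.normal(size=(128, 128))
moved = np.roll(frame, shift=(7, -3), axis=(0, 1))
print(track_shift(frame, moved))  # -> (7, -3)
```

The same correlation-peak logic, applied to the speckle autocorrelation, underlies the rotation recognition mentioned in the abstract.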
Determination of the key parameters affecting historic communications satellite trends
NASA Technical Reports Server (NTRS)
Namkoong, D.
1984-01-01
Data representing 13 series of commercial communications satellites procured between 1968 and 1982 were analyzed to determine the factors that have contributed to the general reduction over time in the per-circuit cost of communications satellites. The model by which the data were analyzed was derived from a general telecommunications application and modified to be more directly applicable to communications satellites. In this model, satellite mass, bandwidth-years, and technological change were the variable parameters. A linear least-squares multiple regression routine was used to obtain the measure of significance of the model. Correlation was measured by the coefficient of determination (R²) and the t-statistic. The results showed that no correlation could be established with satellite mass. Bandwidth-years, however, did show a significant correlation, and technological change was a significant factor in the bandwidth-year model. This analysis and the conclusions derived from it are based on mature technologies, i.e., satellite designs that are evolutions of earlier designs rather than the first of a new generation. The findings, therefore, are appropriate to future satellites only if they are a continuation of design evolution.
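The regression-with-significance workflow described above is standard and can be sketched directly. This is a generic illustration, not the study's actual model or data: fit by least squares, then report R² and per-coefficient t-statistics.

```python
import numpy as np

def ols_stats(X, y):
    """Plain least-squares fit with intercept; returns coefficients,
    R^2, and per-coefficient t-statistics."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    n, p = X1.shape
    s2 = resid @ resid / (n - p)              # residual variance
    cov = s2 * np.linalg.inv(X1.T @ X1)       # coefficient covariance
    t = beta / np.sqrt(np.diag(cov))          # t-statistic per coefficient
    r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return beta, r2, t
```

A predictor with a large |t| (like bandwidth-years in the study) is retained; one whose t-statistic is near zero (like satellite mass) shows no established correlation.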
Application of High Speed Digital Image Correlation in Rocket Engine Hot Fire Testing
NASA Technical Reports Server (NTRS)
Gradl, Paul R.; Schmidt, Tim
2016-01-01
Hot fire testing of rocket engine components and rocket engine systems is a critical aspect of the development process for understanding performance, reliability, and system interactions. Ground testing provides the opportunity for highly instrumented development testing to validate analytical model predictions and determine necessary design changes and process improvements. To properly obtain discrete measurements for model validation, instrumentation must survive the highly dynamic and extreme-temperature environment of hot fire testing. Digital Image Correlation has been investigated and is being evaluated as a technique to augment traditional instrumentation during component and engine testing, providing further data for additional performance improvements and cost savings. The feasibility of digital image correlation techniques was demonstrated in subscale and full-scale hot fire testing, using a pair of high-speed cameras, installed and operated under the extreme environments present on the test stand, to measure three-dimensional, real-time displacements and strains. The development process, setup and calibration, hot fire test data collection, and post-test analysis and results are presented in this paper.
Lerner, Thomas R.; Burden, Jemima J.; Nkwe, David O.; Pelchen-Matthews, Annegret; Domart, Marie-Charlotte; Durgan, Joanne; Weston, Anne; Jones, Martin L.; Peddie, Christopher J.; Carzaniga, Raffaella; Florey, Oliver; Marsh, Mark; Gutierrez, Maximiliano G.
2017-01-01
ABSTRACT The processes of life take place in multiple dimensions, but imaging these processes in even three dimensions is challenging. Here, we describe a workflow for 3D correlative light and electron microscopy (CLEM) of cell monolayers using fluorescence microscopy to identify and follow biological events, combined with serial blockface scanning electron microscopy to analyse the underlying ultrastructure. The workflow encompasses all steps from cell culture to sample processing, imaging strategy, and 3D image processing and analysis. We demonstrate successful application of the workflow to three studies, each aiming to better understand complex and dynamic biological processes, including bacterial and viral infections of cultured cells and formation of entotic cell-in-cell structures commonly observed in tumours. Our workflow revealed new insight into the replicative niche of Mycobacterium tuberculosis in primary human lymphatic endothelial cells, HIV-1 in human monocyte-derived macrophages, and the composition of the entotic vacuole. The broad application of this 3D CLEM technique will make it a useful addition to the correlative imaging toolbox for biomedical research. PMID:27445312
Lee, Ji Hyun; Lim, Hye Kyung; Park, Eunyoung; Song, Junyoung; Lee, Hee Song; Ko, Jooyeon; Kim, Minyoung
2013-04-01
To assess the reliability and applicability of the Korean version of the Bayley Scale of Infant Development-II (BSID-II) for evaluating the developmental status of children with cerebral palsy (CP), the inter-rater reliability of BSID-II scores from 68 children with CP (46 boys and 22 girls; mean age, 32.54±16.76 months; age range, 4 to 78 months) was evaluated by 10 pediatric occupational therapists. Patients were classified in several ways: by age group, typology, and the severity of motor impairment according to the level of the Gross Motor Function Classification System (GMFCS). The measures were performed by video analysis, and intraclass correlation coefficients (ICCs) were obtained for each of the above classifications. To evaluate the clinical applicability of the BSID-II for CP, its correlation with the Gross Motor Function Measure (GMFM), which is regarded as the standard motor assessment for CP, was investigated. The ICC was 0.99 for the Mental scale and 0.98 for the Motor scale in all subjects. ICC values ranged from 0.92 to 0.99 for each age group, 0.93 to 0.99 for each typology, and 0.99 to 1.00 for each GMFCS level. A strong positive correlation was found between the BSID-II Motor raw score and the GMFM total score (r=0.84, p<0.001), and a moderate correlation was observed between the BSID-II Mental raw score and the GMFM total score (r=0.65, p<0.001). The Korean version of the BSID-II is a reliable tool for measuring the functional status of children with CP, and its raw scores correlate strongly with the GMFM, indicating the validity of this measure for children with CP on a clinical basis.
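The ICC reported above can be computed from a two-way ANOVA decomposition of a subjects-by-raters score table. The abstract does not state which ICC form was used; the sketch below implements ICC(2,1) (two-way random effects, absolute agreement), one common choice.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    scores: (n_subjects, k_raters) array."""
    n, k = scores.shape
    grand = scores.mean()
    row_m = scores.mean(axis=1)                      # per-subject means
    col_m = scores.mean(axis=0)                      # per-rater means
    ss_rows = k * ((row_m - grand) ** 2).sum()
    ss_cols = n * ((col_m - grand) ** 2).sum()
    ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                          # between-subject MS
    msc = ss_cols / (k - 1)                          # between-rater MS
    mse = ss_err / ((n - 1) * (k - 1))               # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Perfect agreement gives ICC = 1; a constant rater offset lowers ICC under absolute agreement even though the scores are perfectly consistent.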
Propagation Characteristics of International Space Station Wireless Local Area Network
NASA Technical Reports Server (NTRS)
Sham, Catherine C.; Hwn, Shian U.; Loh, Yin-Chung
2005-01-01
This paper describes the application of the Uniform Geometrical Theory of Diffraction (UTD) to indoor propagation characteristics analysis for Space Station Wireless Local Area Networks (WLANs). The verification results indicate good correlation between UTD-computed and measured signal strength. It is observed that the propagation characteristics in the Space Station modules are quite different from those in a typical indoor WLAN environment, such as an office building, so existing indoor propagation models are not readily applicable to the Space Station module environment. The Space Station modules can be regarded as oversized imperfect waveguides: two distinct propagation regions separated by a breakpoint exist, and the propagation exhibits guided-wave characteristics. The propagation loss in the Space Station is thus much smaller than that in a typical office building. The path loss model developed in this paper is applicable to Space Station WLAN RF coverage and link performance analysis.
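A breakpoint path loss model of the kind described above is commonly written as a dual-slope law: one exponent before the breakpoint and a second after it. The sketch below is a generic illustration with hypothetical parameter values (`L0`, `n1`, `n2`, `d_break` are placeholders, not the paper's fitted numbers; a guided-wave region would show an exponent below the free-space value of 2).

```python
import math

def dual_slope_loss(d, d_break, L0=40.0, n1=1.6, n2=3.0):
    """Dual-slope path loss (dB) at distance d (m): exponent n1 up to the
    breakpoint d_break, exponent n2 beyond it, continuous at d_break."""
    if d <= d_break:
        return L0 + 10 * n1 * math.log10(d)
    return (L0 + 10 * n1 * math.log10(d_break)
            + 10 * n2 * math.log10(d / d_break))
```

Continuity at the breakpoint is enforced by anchoring the second slope at the loss accumulated over the first region.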
Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng
2013-05-01
Multivariate meta-analysis is useful for combining evidence from independent studies that involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models, which assume that risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis in which the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including a simple likelihood function, no need to specify a link function, and closed-form expressions for the distribution functions of study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials.
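The Sarmanov construction referenced above admits a compact general form. As a sketch (the article's exact multivariate parameterization may differ), a bivariate member with beta marginals can be written as

```latex
f(p_1, p_2) \;=\; f_1(p_1)\, f_2(p_2)\,
\bigl[\, 1 + \omega\, \phi_1(p_1)\, \phi_2(p_2) \,\bigr],
\qquad \mathbb{E}\!\left[\phi_i(p_i)\right] = 0 ,
```

where each marginal is $f_i = \mathrm{Beta}(a_i, b_i)$, a standard centered mixing choice is $\phi_i(p) = p - a_i/(a_i + b_i)$, and the association parameter $\omega$ is constrained so the joint density remains nonnegative. Setting $\omega = 0$ recovers independent beta risks, which is what makes the likelihood simple and link-free.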
Diagrammatic analysis of correlations in polymer fluids: Cluster diagrams via Edwards' field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morse, David C.
2006-10-15
Edwards' functional integral approach to the statistical mechanics of polymer liquids is amenable to a diagrammatic analysis in which free energies and correlation functions are expanded as infinite sums of Feynman diagrams. This analysis is shown to lead naturally to a perturbative cluster expansion that is closely related to the Mayer cluster expansion developed for molecular liquids by Chandler and co-workers. Expansion of the functional integral representation of the grand-canonical partition function yields a perturbation theory in which all quantities of interest are expressed as functionals of a monomer-monomer pair potential, as functionals of intramolecular correlation functions of non-interacting molecules, and as functions of molecular activities. In different variants of the theory, the pair potential may be either a bare or a screened potential. A series of topological reductions yields a renormalized diagrammatic expansion in which collective correlation functions are instead expressed diagrammatically as functionals of the true single-molecule correlation functions in the interacting fluid, and as functions of molecular number density. Similar renormalized expansions are also obtained for a collective Ornstein-Zernike direct correlation function, and for intramolecular correlation functions. A concise discussion is given of the corresponding Mayer cluster expansion, and of the relationship between the Mayer and perturbative cluster expansions for liquids of flexible molecules. The application of the perturbative cluster expansion to coarse-grained models of dense multi-component polymer liquids is discussed, and a justification is given for the use of a loop expansion. As an example, the formalism is used to derive a new expression for the wave-number dependent direct correlation function and recover known expressions for the intramolecular two-point correlation function to first order in a renormalized loop expansion for coarse-grained models of binary homopolymer blends and diblock copolymer melts.
NASA Technical Reports Server (NTRS)
Casas, J. C.; Campbell, S. A.
1981-01-01
The applicability of the gas filter correlation radiometer (GFCR) to the measurement of tropospheric carbon monoxide was investigated. An assessment of the GFCR measurement system for a regional measurement program was conducted through extensive aircraft flight testing of several versions of the GFCR. Investigative work in the following areas is described: flight test planning and coordination, acquisition of verifying CO measurements, determination and acquisition of supporting meteorological data requirements, and development of supporting computational software.
The ABC (in any D) of logarithmic CFT
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; Paulos, Miguel; Vichi, Alessandro
2017-10-01
Logarithmic conformal field theories have a vast range of applications, from critical percolation to systems with quenched disorder. In this paper we thoroughly examine the structure of these theories based on their symmetry properties. Our analysis is model-independent and holds for any spacetime dimension. Our results include a determination of the general form of correlation functions and conformal block decompositions, clearing the path for future bootstrap applications. Several examples are discussed in detail, including logarithmic generalized free fields, holographic models, self-avoiding random walks and critical percolation.
Fine structure of transient waves in a random medium: The correlation and spectral density functions
NASA Technical Reports Server (NTRS)
Wenzel, Alan R.
1994-01-01
This is essentially a progress report on a theoretical investigation of the propagation of transient waves in a random medium. The emphasis of this study is on applications to sonic-boom propagation, particularly the effect of atmospheric turbulence on the sonic-boom waveform. The analysis is general, however, and is applicable to other types of waves besides sonic-boom waves. The phenomenon of primary concern in this investigation is the fine structure of the wave. A figure is used to illustrate what is meant by fine structure.
Application of remote sensing for fishery resources assessment and monitoring. [Gulf of Mexico
NASA Technical Reports Server (NTRS)
Savastano, K. J. (Principal Investigator)
1975-01-01
The author has identified the following significant results. The distribution and abundance of white marlin correlated with the chlorophyll, water temperature, and Secchi depth sea-truth measurements. Results of correlation analyses for dolphin were inconclusive. Prediction models for white marlin were developed using stepwise multiple regression and discriminant function analysis techniques, which demonstrated a potential for increasing the probability of game fishing success. The S190A and B imagery was density-sliced/color-enhanced with white marlin locations superimposed on the image, but no density/white marlin relationship could be established.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramovici, E.; Northwood, D.O.; Shehata, M.T.
1999-01-01
The contents include Analysis of In-Service Failures (tutorials, transportation industry, corrosion and materials degradation, electronic and advanced materials); 1998 Sorby Award Lecture by Kay Geels, Struers A/S (Metallographic Preparation from Sorby to the Present); Advances in Microstructural Characterization (characterization techniques using high resolution and focused ion beam, characterization of microstructural clustering and correlation with performance); Advanced Applications (advanced alloys and intermetallic compounds, plasma spray coatings and other surface coatings, corrosion, and materials degradation).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Kook In; Lee, In Gyu; Hwang, Wan Sik, E-mail: mhshin@kau.ac.kr, E-mail: whwang@kau.ac.kr
The oxidation properties of graphene oxide (GO) are systematically correlated with their chemical sensing properties. Based on an impedance analysis, the equivalent circuit models of the capacitive sensors are established, and it is demonstrated that capacitive operations are related to the degree of oxidation. This is also confirmed by X-ray diffraction and Raman analysis. Finally, highly sensitive stacked GO sensors are shown to detect humidity in capacitive mode, which can be useful in various applications requiring low power consumption.
Applications of the Cambridge Structural Database in organic chemistry and crystal chemistry.
Allen, Frank H; Motherwell, W D Samuel
2002-06-01
The Cambridge Structural Database (CSD) and its associated software systems have formed the basis for more than 800 research applications in structural chemistry, crystallography and the life sciences. Relevant references, dating from the mid-1970s, and brief synopses of these papers are collected in a database, DBUse, which is freely available via the CCDC website. This database has been used to review research applications of the CSD in organic chemistry, including supramolecular applications, and in organic crystal chemistry. The review concentrates on applications that have been published since 1990 and covers a wide range of topics, including structure correlation, conformational analysis, hydrogen bonding and other intermolecular interactions, studies of crystal packing, extended structural motifs, crystal engineering and polymorphism, and crystal structure prediction. Applications of CSD information in studies of crystal structure precision, the determination of crystal structures from powder diffraction data, together with applications in chemical informatics, are also discussed.
Study of ripple formation in unidirectionally-tensioned membranes
NASA Technical Reports Server (NTRS)
Lopez, Bernardo C.; Lih, Shyh-Shiuh; Leifer, Jack; Guzman, Gladys
2004-01-01
The study of membrane behavior is one of the areas of interest in the development of ultralightweight and lightweight structures for space applications. Utilization of membranes as load-carrying components or support structures for antenna patch arrays, collectors, sun shades, and solar-sail reflective surfaces brings about a variety of challenges that require understanding of the ripple-formation phenomenology, development of reliable test and analysis techniques, and solution methods for challenges related to the intended applications. This paper presents interim results from a study of the behavior of unidirectionally tensioned flat and singly curved membranes. It focuses on preliminary experimental work to explore the formation of ripples and on finite element analysis (FEA) to correlate and predict their formation in thin polyimide membrane models.
Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy
NASA Astrophysics Data System (ADS)
Sugiyama, Naruhisa; Shirakawa, Tomohiro
2017-07-01
The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.
Correlation and agreement of a digital and conventional method to measure arch parameters.
Nawi, Nes; Mohamed, Alizae Marny; Marizan Nor, Murshida; Ashar, Nor Atika
2018-01-01
The aim of the present study was to determine the overall reliability and validity of arch parameters measured digitally compared to conventional measurement. A sample of 111 plaster study models of Down syndrome (DS) patients was digitized using a blue-light three-dimensional (3D) scanner. Digital and manual measurements of the defined parameters were performed using Geomagic analysis software (Geomagic Studio 2014, 3D Systems, Rock Hill, SC, USA) on the digital models and a digital calliper (Tuten, Germany) on the plaster study models. Both measurements were repeated twice to validate intraexaminer reliability based on intraclass correlation coefficients (ICCs), using the independent t test and Pearson's correlation, respectively. The Bland-Altman method of analysis was used to evaluate the agreement of the measurements between the digital and plaster models. No statistically significant differences (p > 0.05) were found between the manual and digital methods when measuring arch width, arch length, and space analysis. In addition, all parameters showed a significant correlation coefficient (r ≥ 0.972; p < 0.01) between the digital and manual measurements. Furthermore, positive agreement between digital and manual measurements of arch width (90-96%) and of arch length and space analysis (95-99%) was also established using the Bland-Altman method. These results demonstrate that 3D blue-light scanning and measurement software can precisely produce a 3D digital model and measure arch width, arch length, and space analysis. The 3D digital model is thus valid for use in various clinical applications.
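The Bland-Altman analysis used above summarizes method agreement with the mean difference (bias) and 95% limits of agreement. A minimal sketch (generic, not tied to the study's data):

```python
import numpy as np

def bland_altman(x, y):
    """Bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(x, float) - np.asarray(y, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)                     # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Agreement percentages like those reported in the abstract correspond to the fraction of paired differences falling inside the limits.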
Feasibility study of parallel optical correlation-decoding analysis of lightning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Descour, M.R.; Sweatt, W.C.; Elliott, G.R.
The optical correlator described in this report is intended to serve as an attention-focusing processor. The objective is to narrowly bracket the range of a parameter value that characterizes the correlator input. The input is a waveform collected by a satellite-borne receiver. In the correlator, this waveform is simultaneously correlated with an ensemble of ionosphere impulse-response functions, each corresponding to a different total-electron-count (TEC) value. We have found that correlation is an effective method of bracketing the range of TEC values likely to be represented by the input waveform. High accuracy in a computational sense is not required of the correlator. Binarization of the impulse-response functions and the input waveforms prior to correlation results in a lower correlation-peak-to-background-fluctuation (signal-to-noise) ratio than is obtained when all waveforms retain their grayscale values. The results presented in this report were obtained by means of an acousto-optic correlator previously developed at SNL as well as by simulation. An optical-processor architecture optimized for 1D correlation of the long waveforms characteristic of this application is described. Discussions of correlator components, such as optics, acousto-optic cells, digital micromirror devices, laser diodes, and VCSELs, are included.
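The bracketing scheme above, correlating one received waveform against an ensemble of candidate impulse responses and keeping the best match, can be sketched digitally. This is a toy numerical analogue of the optical processor, with made-up chirp-like candidates standing in for ionosphere impulse responses:

```python
import numpy as np

def bracket_tec(waveform, candidates):
    """Pick the candidate whose normalized correlation with the received
    waveform peaks highest; candidates maps TEC value -> impulse response."""
    best_tec, best_peak = None, -np.inf
    for tec, h in candidates.items():
        c = np.correlate(waveform, h, mode="full")
        peak = c.max() / (np.linalg.norm(waveform) * np.linalg.norm(h))
        if peak > best_peak:
            best_tec, best_peak = tec, peak
    return best_tec, best_peak
```

The winning peak height need only exceed the background fluctuations, which is why the report notes that high computational accuracy (and even grayscale fidelity) is not strictly required.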
Point-point and point-line moving-window correlation spectroscopy and its applications
NASA Astrophysics Data System (ADS)
Zhou, Qun; Sun, Suqin; Zhan, Daqi; Yu, Zhiwu
2008-07-01
In this paper, we present a new extension of generalized two-dimensional (2D) correlation spectroscopy. Two new algorithms, namely point-point (P-P) correlation and point-line (P-L) correlation, have been introduced for moving-window 2D correlation (MW2D) analysis. The new method has been applied to a spectral model consisting of two different processes. The results indicate that P-P correlation spectroscopy can unveil the details and reconstitute the entire process, whilst P-L correlation provides the general features of the processes concerned. The phase transition behavior of dimyristoylphosphatidylethanolamine (DMPE) has been studied using MW2D correlation spectroscopy. The newly proposed method verifies that the phase transition temperature is 56 °C, the same as the result obtained from a differential scanning calorimeter. To illustrate the new method further, a lysine and lactose mixture has been studied under thermal perturbation. Using P-P MW2D, the Maillard reaction of the mixture was clearly monitored, which is very difficult using a conventional display of FTIR spectra.
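The moving-window part of MW2D analysis computes, at each window position along the perturbation axis, the synchronous correlation of the mean-centered spectra in that window. A minimal numerical sketch of that core step (the P-P and P-L displays in the paper are then read out from slices of this array):

```python
import numpy as np

def mw2d_sync(spectra, window=5):
    """Moving-window synchronous 2D correlation.
    spectra: (n_perturbation, n_variables) array.
    Returns (n_windows, n_variables, n_variables): for each window,
    mean-center the sub-matrix Y and form Phi = Y^T Y / (w - 1)."""
    n, m = spectra.shape
    out = np.empty((n - window + 1, m, m))
    for i in range(n - window + 1):
        Y = spectra[i:i + window]
        Y = Y - Y.mean(axis=0)
        out[i] = Y.T @ Y / (window - 1)
    return out
```

A point-point trace is then `out[:, v1, v2]` for a chosen pair of spectral variables: positive where the two bands move in phase, negative where they move in opposite directions.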
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Heeg, Jennifer; Perry, Boyd, III
1990-01-01
Time-correlated gust loads are time histories of two or more load quantities due to the same disturbance time history. Time correlation provides knowledge of the value (magnitude and sign) of one load when another is maximum. At least two analysis methods have been identified that are capable of computing maximized time-correlated gust loads for linear aircraft. Both methods solve for the unit-energy gust profile (gust velocity as a function of time) that produces the maximum load at a given location on a linear airplane. Time-correlated gust loads are obtained by re-applying this gust profile to the airplane and computing multiple simultaneous load responses. Such time histories are physically realizable and may be applied to aircraft structures. Within the past several years there has been much interest in obtaining a practical analysis method which is capable of solving the analogous problem for nonlinear aircraft. Such an analysis method has been the focus of an international committee of gust loads specialists formed by the U.S. Federal Aviation Administration and was the topic of a panel discussion at the Gust and Buffet Loads session at the 1989 SDM Conference in Mobile, Alabama. The kinds of nonlinearities common on modern transport aircraft are indicated. The Statistical Discrete Gust method is capable of being, but so far has not been, applied to nonlinear aircraft. To make the method practical for nonlinear applications, a search procedure is essential. Another method is based on Matched Filter Theory and, in its current form, is applicable to linear systems only. The purpose here is to present the status of an attempt to extend the matched filter approach to nonlinear systems. The extension uses Matched Filter Theory as a starting point and then employs a constrained optimization algorithm to attack the nonlinear problem.
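For the linear case, the matched-filter result referenced above has a one-line form: given the impulse response of a load quantity to a unit gust input, the unit-energy gust profile that maximizes the peak load is the time-reversed, unit-normalized impulse response, and the resulting peak equals the impulse response's 2-norm (by the Cauchy-Schwarz inequality). A discrete-time sketch, with a made-up damped-oscillator impulse response:

```python
import numpy as np

def matched_gust(h):
    """Matched-filter solution for a linear system with impulse response h:
    returns the unit-energy worst-case input (time-reversed, normalized h)
    and the peak load it produces (the 2-norm of h)."""
    w = h[::-1] / np.linalg.norm(h)
    return w, np.linalg.norm(h)
```

Re-applying the same profile to the impulse responses of the *other* load quantities is what yields the time-correlated loads; the nonlinear extension discussed in the abstract replaces this closed form with a constrained search.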
Poetzsch, Michael; Steuer, Andrea E; Roemmelt, Andreas T; Baumgartner, Markus R; Kraemer, Thomas
2014-12-02
Single hair analysis normally requires extensive microscale sample preparation protocols, including time-consuming steps like segmentation and extraction. Matrix-assisted laser desorption/ionization mass spectrometric imaging (MALDI-MSI) has been shown to be an alternative tool for single hair analysis, but questions remain. Therefore, the extraction process, the use of an internal standard (IS), and influences on the ionization process were systematically investigated to enable the reliable application of MALDI-MSI to hair analysis. Furthermore, single-dose detection and the quantitative correlation of single-hair and hair-strand results with LC-MS/MS were examined, and the performance was compared to LC-MS/MS single hair monitoring. The MALDI process was shown to be independent of natural hair color and not influenced by the presence of melanin. Ionization was shown to be reproducible along and between different hair samples. MALDI image intensities in single hairs and hair snippets showed good semiquantitative correlation with zolpidem hair concentrations obtained from validated routine LC-MS/MS methods. MALDI-MSI is superior to LC-MS/MS analysis when fast, easy, and cheap sample preparation is necessary, whereas LC-MS/MS showed higher sensitivity, with the ability to detect a single dose of zolpidem. MALDI-MSI and LC-MS/MS segmental single hair analysis showed good correlation, and both are suitable for consumption monitoring of drugs of abuse with high time resolution.
Koerner, Tess K; Zhang, Yang
2017-02-27
Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for assessing the strength of the relationships between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures, as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach for incorporating both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate both the advantages and the necessity of applying mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers.
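The core pitfall described above can be shown with simulated repeated-measures data: when between-subject baseline shifts enter both measures, a pooled Pearson correlation can even have the opposite sign from the within-subject effect. The sketch below uses per-subject centering as a simple stand-in for the random-intercept adjustment an LME model provides (a full LME fit, e.g. with statsmodels, would additionally estimate variance components).

```python
import numpy as np

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    x = x - x.mean()
    y = y - y.mean()
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

# Hypothetical data: subject baselines push the pooled correlation positive
# while the within-subject (condition-level) effect is negative.
rng = np.random.default_rng(1)
n_subj, n_cond = 20, 4
base = rng.normal(0, 5, n_subj)                    # between-subject baselines
e = rng.normal(0, 1, (n_subj, n_cond))             # condition-level variation
neural = base[:, None] + e
behav = base[:, None] - 0.8 * e + rng.normal(0, 0.2, (n_subj, n_cond))

pooled = pearson(neural.ravel(), behav.ravel())    # ignores the grouping
centered = pearson((neural - neural.mean(1, keepdims=True)).ravel(),
                   (behav - behav.mean(1, keepdims=True)).ravel())
```

Here `pooled` is strongly positive while `centered` recovers the negative within-subject relationship, which is the kind of discrepancy that motivates mixed-effects modeling.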
NASA Astrophysics Data System (ADS)
Wang, Duan; Podobnik, Boris; Horvatić, Davor; Stanley, H. Eugene
2011-04-01
We propose a modified time lag random matrix theory in order to study time-lag cross correlations in multiple time series. We apply the method to 48 world indices, one for each of 48 different countries. We find long-range power-law cross correlations in the absolute values of returns that quantify risk, and find that they decay much more slowly than cross correlations between the returns. The magnitude of the cross correlations constitutes “bad news” for international investment managers who may believe that risk is reduced by diversifying across countries. We find that when a market shock is transmitted around the world, the risk decays very slowly. We explain these time-lag cross correlations by introducing a global factor model (GFM) in which all index returns fluctuate in response to a single global factor. For each pair of individual time series of returns, the cross correlations between returns (or magnitudes) can be modeled with the autocorrelations of the global factor returns (or magnitudes). We estimate the global factor using principal component analysis, which minimizes the variance of the residuals after removing the global trend. Using random matrix theory, a significant fraction of the world index cross correlations can be explained by the global factor, which supports the utility of the GFM. We demonstrate applications of the GFM in forecasting risks at the world level, and in finding uncorrelated individual indices. We find ten indices that are practically uncorrelated with the global factor and with the remainder of the world indices, which is relevant information for world managers in reducing their portfolio risk. Finally, we argue that this general method can be applied to a wide range of phenomena in which time series are measured, ranging from seismology and physiology to atmospheric geophysics.
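The global-factor estimation step described above, extracting a single common driver as the leading principal component of the standardized index returns, can be sketched directly. This is a generic illustration on simulated data, not the paper's 48-index dataset:

```python
import numpy as np

def global_factor(returns):
    """Estimate a single global factor as the leading principal component
    of standardized returns. Returns (factor time series, loadings,
    fraction of variance explained by the factor)."""
    Z = (returns - returns.mean(axis=0)) / returns.std(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    v = vecs[:, -1]                     # leading eigenvector
    if v.sum() < 0:
        v = -v                          # fix the arbitrary eigenvector sign
    return Z @ v, v, vals[-1] / vals.sum()
```

Indices whose loadings are near zero are the "practically uncorrelated" ones the abstract highlights, and the residuals after removing the factor carry the country-specific risk.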
Zhang, Xinyu; Tang, Yuqian; Shi, Yao; He, Nianpeng; Wen, Xuefa; Yu, Qiang; Zheng, Chunyu; Sun, Xiaomin; Qiu, Weiwen
2016-09-06
We used a seven-year field experiment with a urea application gradient to investigate the effects of nitrogen (N) applications on soil N hydrolytic enzyme activity and ammonia-oxidizing microbial abundance in a typical steppe ecosystem in Inner Mongolia. The results showed that N additions inhibited soil N-related hydrolytic enzyme activities, especially in the 392 kg N ha(-1) yr(-1) treatment. As N additions increased, the amoA gene copy ratio of ammonia-oxidizing archaea (AOA) to ammonia-oxidizing bacteria (AOB) decreased from 1.13 to 0.65. Pearson correlation analysis showed that AOA gene copies were negatively correlated with NH4(+)-N content, whereas AOB gene copies were positively correlated with NO3(-)-N content. Moderate N application rates (56-224 kg N ha(-1) yr(-1)) accompanied by P additions are beneficial for maintaining the abundance of AOB, in contrast to the inhibition of AOB abundance at the highest N application rate (392 kg N ha(-1) yr(-1)). This study suggests that the abundance of AOB and AOA would not decrease unless N applications exceed 224 kg N ha(-1) yr(-1) in temperate grasslands in Inner Mongolia.
A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal
Nayak, Suraj K; Bit, Arindam; Dey, Anilesh; Mohapatra, Biswajit; Pal, Kunal
2018-01-01
Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis. PMID:29854361
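Of the methods listed, sample entropy is compact enough to sketch directly. The implementation below follows the standard SampEn(m, r) definition; the test signals (a sine wave versus uniform noise) are illustrative stand-ins for regular versus irregular heart-rate series, and the absolute tolerance r = 0.2 is an assumption (practical work usually scales r by the signal's standard deviation).

```python
import math
import random

def _count_matches(x, m, r):
    # Number of template pairs (i, j), i < j, whose length-m segments
    # stay within tolerance r in the Chebyshev (max-difference) sense.
    n = len(x)
    count = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r:
                count += 1
    return count

def sample_entropy(x, m=2, r=0.2):
    # SampEn(m, r) = -ln(A / B); lower values indicate more regularity.
    b = _count_matches(x, m, r)
    a = _count_matches(x, m + 1, r)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

random.seed(0)
regular = [math.sin(0.4 * i) for i in range(300)]            # highly regular
irregular = [random.uniform(-1.0, 1.0) for _ in range(300)]  # white noise

se_reg = sample_entropy(regular, m=2, r=0.2)
se_irr = sample_entropy(irregular, m=2, r=0.2)
```

A regular signal yields a lower sample entropy than noise, which is why the measure is used to grade the irregularity of heart-rate series.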
Hanson, Jeffery A; Yang, Haw
2008-11-06
The statistical properties of the cross correlation between two time series have been studied. An analytical expression for the cross correlation function's variance has been derived. On the basis of these results, a statistically robust method has been proposed to detect the existence and determine the direction of cross correlation between two time series. The proposed method has been characterized by computer simulations. Applications to single-molecule fluorescence spectroscopy are discussed. The results may also find immediate applications in fluorescence correlation spectroscopy (FCS) and its variants.
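A sketch of detecting the existence and direction of a lagged cross correlation, in the spirit of the abstract (though far simpler than its analytical variance treatment): scan Pearson correlations over candidate lags and take the peak. The series, delay, and noise level are synthetic.

```python
import math
import random

def pearson(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((p - ma) * (q - mb) for p, q in zip(a, b))
    den = math.sqrt(sum((p - ma) ** 2 for p in a) * sum((q - mb) ** 2 for q in b))
    return num / den

def lagged_xcorr(x, y, lag):
    # Correlation between x[t] and y[t + lag]; a peak at a positive lag
    # suggests that fluctuations in x lead those in y.
    if lag >= 0:
        return pearson(x[:len(x) - lag], y[lag:])
    return pearson(x[-lag:], y[:len(y) + lag])

random.seed(1)
n, delay = 2000, 3
x = [random.gauss(0, 1) for _ in range(n)]
# y is a noisy copy of x shifted by `delay` steps (synthetic data).
y = [(x[t - delay] if t >= delay else 0.0) + random.gauss(0, 0.5)
     for t in range(n)]

best_lag = max(range(-10, 11), key=lambda lag: abs(lagged_xcorr(x, y, lag)))
```

The peak at a positive lag identifies x as the leading series; a real application would compare the peak against the variance of the estimator, as the paper derives.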
Coastal dune systems and disturbance factors: monitoring and analysis in central Italy.
De Luca, Elena; Novelli, Claudia; Barbato, Fabio; Menegoni, Patrizia; Iannetta, Massimo; Nascetti, Giuseppe
2011-12-01
This study describes the conservation status of dune systems in relation to disturbance factors in the coastal stretch of the Viterbo province, Latium Region, Italy. Particular emphasis was given to the bioindication value of plant communities and their sequence. Each plant community was considered as a "habitat" in accordance with Annex I of Directive 92/43/EU. Stress factors, such as sand dynamics and erosion, and anthropogenic pressures, such as trampling and bathing settlements, influence the sequence of habitats and weaken the system of relations that enables these coenoses to occur in extreme conditions. The choice to carry out surveys along wide transects, recording different data, allowed us to explore the use of habitats as bioindicators. By comparing sites of the same extension in a homogeneous area, it was possible to expand the use of canonical correspondence analysis (CCA) as a tool to correlate habitat composition with disturbance factors. The application of CCA showed a high correlation of degradation and habitat loss with coastal erosion, trampling, and the presence of waste. Furthermore, floristic surveys allowed the application of different biodiversity indices to quantify the species richness of the sampled areas. The conservation status of the sites investigated was found to be diverse, ranging from the total disappearance of the mobile dune habitats to their complete sequence. The proposed methodology has been useful in fulfilling the objective of the work and is applicable to other case studies in the Mediterranean.
Application of 3D-QSAR in the rational design of receptor ligands and enzyme inhibitors.
Mor, Marco; Rivara, Silvia; Lodola, Alessio; Lorenzi, Simone; Bordi, Fabrizio; Plazzi, Pier Vincenzo; Spadoni, Gilberto; Bedini, Annalida; Duranti, Andrea; Tontini, Andrea; Tarzia, Giorgio
2005-11-01
Quantitative structure-activity relationships (QSARs) are frequently employed in medicinal chemistry projects, both to rationalize structure-activity relationships (SARs) for known series of compounds and to help in the design of innovative structures endowed with desired pharmacological actions. Unlike the so-called structure-based drug design tools, they do not require knowledge of the biological target structure but are based on the comparison of drug structural features, and are thus defined as ligand-based drug design tools. In the 3D-QSAR approach, structural descriptors are calculated from molecular models of the ligands, as interaction fields within a three-dimensional (3D) lattice of points surrounding the ligand structure. These descriptors are collected in a large X matrix, which is submitted to multivariate analysis to look for correlations with biological activity. As for other QSARs, the reliability and usefulness of the correlation models depend on the validity of the assumptions and on the quality of the data. A careful selection of compounds and pharmacological data can improve the application of 3D-QSAR analysis in drug design. Some examples of the application of the CoMFA and CoMSIA approaches to SAR studies and to the design of receptor or enzyme ligands are described, with particular attention to the fields of melatonin receptor ligands and FAAH inhibitors.
Applicant Characteristics Associated with Successful Matching into Otolaryngology
Hauser, Leah J.; Gebhard, Grant M.; Blumhagen, Rachel; Carlson, Nichole E.; Cabrera-Muffly, Cristina
2016-01-01
Objective To identify resident applicant characteristics that increase the odds of matching to Otolaryngology residency. Study Design Cross-sectional analysis. Methods Residency applications to our institution from 2009 through 2013 were reviewed. The available data represented 81.1% of applicants to Otolaryngology programs nationwide. Online public records were searched to determine whether an applicant matched to an Otolaryngology residency position. Factors that were significantly associated with the odds of matching were determined using logistic regression. Results A total of 1,479 unique applications were analyzed. On univariate analysis, 27 demographic, academic, personal, medical school, prior training, and application-specific factors were associated with the odds of matching into Otolaryngology. On multivariate analysis, indicators of academic achievement, including AOA status, whether applicant received awards, and publications were significantly associated with the odds of matching (OR 2.03, 1.39, 1.66, respectively). The odds of matching increased with increasing Step 1 scores (p<0.001). Attending a medical school ranked by the US News & World Report and being a US citizen born in the US significantly increased odds of matching (OR 1.55 and 2.04, respectively), while being a non-US Senior significantly decreased the odds of matching (OR 0.33). Conclusion Multiple factors are associated with successfully matching into an Otolaryngology residency. While this information allows medical students to determine the strength of their application, these criteria have not been correlated with resident success. We urge selection committees to begin identifying applicant selection methods that reflect the values we want to cultivate in our future colleagues. PMID:27767217
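The odds ratios above come from logistic regression; a toy version with one invented predictor shows how a fitted coefficient converts to an odds ratio. The data, the true coefficient of 0.7, and the "standardized exam score" interpretation are all hypothetical, not the study's data.

```python
import math
import random

random.seed(2)

# Hypothetical illustration: a single standardized predictor (think of a
# standardized exam score) shifts the log-odds of matching.
n, true_beta = 1000, 0.7
score = [random.gauss(0, 1) for _ in range(n)]
matched = [1 if random.random() < 1.0 / (1.0 + math.exp(-true_beta * s)) else 0
           for s in score]

# Fit logistic regression by gradient ascent on the log-likelihood.
b0 = b1 = 0.0
lr = 0.5
for _ in range(800):
    g0 = g1 = 0.0
    for s, y in zip(score, matched):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * s)))
        g0 += y - p
        g1 += (y - p) * s
    b0 += lr * g0 / n
    b1 += lr * g1 / n

odds_ratio = math.exp(b1)  # multiplicative change in odds per 1-SD increase
```

An odds ratio of about 2 here would read the same way as the study's OR 2.03 for AOA status: roughly doubled odds of matching per unit of the predictor.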
Application of artificial neural network to fMRI regression analysis.
Misaki, Masaya; Miyauchi, Satoru
2006-01-15
We used an artificial neural network (ANN) to detect correlations between event sequences and fMRI (functional magnetic resonance imaging) signals. The layered feed-forward neural network, given a series of events as inputs and the fMRI signal as a supervised signal, performed a non-linear regression analysis. This type of ANN is capable of approximating any continuous function, and thus this analysis method can detect any fMRI signals that correlated with corresponding events. Because of the flexible nature of ANNs, fitting to autocorrelation noise is a problem in fMRI analyses. We avoided this problem by using cross-validation and an early stopping procedure. The results showed that the ANN could detect various responses with different time courses. The simulation analysis also indicated an additional advantage of ANN over non-parametric methods in detecting parametrically modulated responses, i.e., it can detect various types of parametric modulations without a priori assumptions. The ANN regression analysis is therefore beneficial for exploratory fMRI analyses in detecting continuous changes in responses modulated by changes in input values.
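The cross-validation/early-stopping safeguard the authors describe can be sketched outside the ANN setting: gradient descent on an overparameterized linear model, keeping the weights from the epoch with the lowest held-out error. The problem sizes, learning rate, and patience below are arbitrary choices for illustration, not the paper's settings.

```python
import random

random.seed(3)

# Overparameterized regression fitted by gradient descent with
# validation-based early stopping: keep the weights from the epoch
# with the lowest held-out error.
d = 30
def make_xy(n):
    xs, ys = [], []
    for _ in range(n):
        x = [random.gauss(0, 1) for _ in range(d)]
        y = 1.5 * x[0] - 1.0 * x[1] + random.gauss(0, 1.0)  # only 2 true features
        xs.append(x)
        ys.append(y)
    return xs, ys

Xtr, ytr = make_xy(60)
Xva, yva = make_xy(40)

def mse(w, X, y):
    return sum((sum(wi * xi for wi, xi in zip(w, x)) - t) ** 2
               for x, t in zip(X, y)) / len(y)

w = [0.0] * d
lr, patience = 0.01, 20
best_w, best_val, since_best = list(w), mse(w, Xva, yva), 0
for epoch in range(1000):
    grad = [0.0] * d  # full-batch gradient of the training MSE
    for x, t in zip(Xtr, ytr):
        err = sum(wi * xi for wi, xi in zip(w, x)) - t
        for j in range(d):
            grad[j] += 2.0 * err * x[j] / len(Xtr)
    w = [wi - lr * g for wi, g in zip(w, grad)]
    val = mse(w, Xva, yva)
    if val < best_val:
        best_val, best_w, since_best = val, list(w), 0
    else:
        since_best += 1
        if since_best >= patience:  # validation stopped improving: stop early
            break
```

Stopping at the validation minimum is what keeps a flexible fitter from chasing autocorrelated noise, which is exactly the failure mode the authors guard against in the fMRI setting.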
Measuring User Similarity Using Electric Circuit Analysis: Application to Collaborative Filtering
Yang, Joonhyuk; Kim, Jinwook; Kim, Wonjoon; Kim, Young Hwan
2012-01-01
We propose a new technique of measuring user similarity in collaborative filtering using electric circuit analysis. Electric circuit analysis is used to measure the potential differences between nodes on an electric circuit. In this paper, by applying this method to transaction networks comprising users and items, i.e., the user-item matrix, and by using the full information about the relationship structure of users from the perspective of item adoption, we overcome the limitations of one-to-one similarity calculation approaches, such as the Pearson correlation, Tanimoto coefficient, and Hamming distance, in collaborative filtering. We found that electric circuit analysis can be successfully incorporated into recommender systems and has the potential to significantly enhance predictability, especially when combined with user-based collaborative filtering. We also propose four types of hybrid algorithms that combine the Pearson correlation method and electric circuit analysis. One of the algorithms exceeds the performance of traditional collaborative filtering by up to 37.5%. This work opens new opportunities for interdisciplinary research between physics and computer science and for the development of new recommendation systems. PMID:23145095
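A sketch of the circuit idea on a toy user-item graph, assuming (as the electric-circuit reading suggests) unit-conductance edges: the effective resistance between user nodes, computed from the graph Laplacian, falls as users share more items, because shared items add parallel current paths. The graph and the grounding-based solver are illustrative, not the paper's algorithm.

```python
# Toy user-item graph: users A and B share two items; A and C share one.
# Treating each adoption link as a 1-ohm resistor, effective resistance
# between user nodes acts as a (dis)similarity: more shared (parallel)
# paths -> lower resistance -> more similar users.
nodes = ["A", "B", "C", "i1", "i2", "i3"]
edges = [("A", "i1"), ("B", "i1"), ("A", "i2"), ("B", "i2"),
         ("A", "i3"), ("C", "i3")]
idx = {name: k for k, name in enumerate(nodes)}
n = len(nodes)

# Graph Laplacian L = D - A with unit conductances.
L = [[0.0] * n for _ in range(n)]
for u, v in edges:
    a, b = idx[u], idx[v]
    L[a][a] += 1.0
    L[b][b] += 1.0
    L[a][b] -= 1.0
    L[b][a] -= 1.0

def solve(A, rhs):
    # Gaussian elimination with partial pivoting (A is small and dense).
    m = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, m):
            f = M[r][col] / M[col][col]
            for c in range(col, m + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * m
    for r in range(m - 1, -1, -1):
        x[r] = (M[r][m] - sum(M[r][c] * x[c] for c in range(r + 1, m))) / M[r][r]
    return x

def effective_resistance(u, v):
    # Ground node v (delete its row/column) and inject 1 A at u; the
    # resulting potential at u equals the effective resistance u <-> v.
    keep = [k for k in range(n) if k != idx[v]]
    Lred = [[L[i][j] for j in keep] for i in keep]
    rhs = [1.0 if k == idx[u] else 0.0 for k in keep]
    x = solve(Lred, rhs)
    return x[keep.index(idx[u])]

r_ab = effective_resistance("A", "B")  # two parallel 2-ohm paths -> 1 ohm
r_ac = effective_resistance("A", "C")  # one 2-ohm path -> 2 ohms
```

Because every path through the network contributes, this captures the "full relationship structure" that one-to-one measures like the Pearson correlation ignore.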
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, C. S.; Zhang, Hongbin
Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS), a coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. Furthermore, the minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
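A toy version of the correlation-based sensitivity ranking used above: sample the uncertain inputs, evaluate a stand-in model, and rank inputs by the magnitude of their Pearson correlation with the figure of merit. The linear model and its coefficients are invented; the real study used VERA-CS outputs in place of this formula.

```python
import math
import random

random.seed(4)

# Stand-in model: a scalar "figure of merit" responds linearly to three
# uncertain inputs with different strengths (coefficients are invented).
n = 3000
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
x3 = [random.gauss(0, 1) for _ in range(n)]
y = [3.0 * a + 1.0 * b + 0.5 * c + random.gauss(0, 1)
     for a, b, c in zip(x1, x2, x3)]

def pearson(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((p - ma) * (q - mb) for p, q in zip(a, b))
    den = math.sqrt(sum((p - ma) ** 2 for p in a) * sum((q - mb) ** 2 for q in b))
    return num / den

# Rank inputs by |correlation| with the figure of merit, largest first.
sens = sorted(((abs(pearson(x, y)), name)
               for x, name in [(x1, "x1"), (x2, "x2"), (x3, "x3")]),
              reverse=True)
```

The dominant input ranks first, mirroring how coolant inlet temperature stood out across the figures of merit in the study; Spearman and partial correlations follow the same recipe with rank-transformed or residualized variables.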
Liu, Fang; Shen, Changqing; He, Qingbo; Zhang, Ao; Liu, Yongbin; Kong, Fanrang
2014-01-01
A fault diagnosis strategy based on the wayside acoustic monitoring technique is investigated for locomotive bearing fault diagnosis. Inspired by the transient modeling analysis method based on correlation filtering analysis, a so-called Parametric-Mother-Doppler-Wavelet (PMDW) is constructed with six parameters, including a center characteristic frequency and five kinematic model parameters. A Doppler effect eliminator containing a PMDW generator, a correlation filtering analysis module, and a signal resampler is invented to eliminate the Doppler effect embedded in the acoustic signal of the recorded bearing. Through the Doppler effect eliminator, the five kinematic model parameters can be identified based on the signal itself. Then, the signal resampler is applied to eliminate the Doppler effect using the identified parameters. With the ability to detect early bearing faults, the transient model analysis method is employed to detect localized bearing faults after the embedded Doppler effect is eliminated. The effectiveness of the proposed fault diagnosis strategy is verified via simulation studies and applications to diagnose locomotive roller bearing defects. PMID:24803197
Information theory applications for biological sequence analysis.
Vinga, Susana
2014-05-01
Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
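Block entropy, one of the IT quantities mentioned, can be sketched in a few lines: the Shannon entropy of overlapping k-mer frequencies separates a repetitive sequence from a random one. Both sequences below are synthetic.

```python
import math
import random

def block_entropy(seq, k):
    # Shannon entropy (bits) of the distribution of overlapping k-mers.
    counts = {}
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        counts[kmer] = counts.get(kmer, 0) + 1
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

random.seed(5)
periodic = "ACGT" * 500                                    # maximally regular
rand_seq = "".join(random.choice("ACGT") for _ in range(2000))

h_per = block_entropy(periodic, 2)  # only 4 distinct 2-mers -> 2 bits
h_rnd = block_entropy(rand_seq, 2)  # ~16 equiprobable 2-mers -> ~4 bits
```

Low block entropy flags repetitive, compressible sequence; alignment-free comparison methods build directly on such statistics.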
Megalithic Monument of Abuli, Georgia, and Possible Astronomical Significance
NASA Astrophysics Data System (ADS)
Jijelava, Badri; Simonia, Irakli
2016-08-01
Background/Objectives: In recent years, investigations of artefacts and of ancient culture and religion grounded in astronomical knowledge have played a significant role. The aim of this work is to identify the orientations of religious megalithic complexes and their correlation with celestial luminaries. Methods/Statistical Analysis: We combined archaeological, ethnographical, and historical information with a restoration of the ancient celestial sphere (using a dedicated astronomy application), which made it possible to identify correlations between the acronychal or heliacal rising/setting of luminaries and the orientations of megalithic objects. Very often such connections are also preserved in current folklore. Findings: This technique of investigation gives us a clearer understanding of the ancient universe. Using this method, we can recover latent information about ancient gods and luminaries, clarify current mythology, and date the megalithic complex. Application/Improvements: This method of investigation is an additional instrument for archaeological investigations.
Atomically precise edge chlorination of nanographenes and its application in graphene nanoribbons
Tan, Yuan-Zhi; Yang, Bo; Parvez, Khaled; Narita, Akimitsu; Osella, Silvio; Beljonne, David; Feng, Xinliang; Müllen, Klaus
2013-01-01
Chemical functionalization is one of the most powerful and widely used strategies to control the properties of nanomaterials, particularly in the field of graphene. However, the ill-defined structure of the present functionalized graphene inhibits atomically precise structural characterization and structure-correlated property modulation. Here we present a general edge chlorination protocol for atomically precise functionalization of nanographenes at different scales from 1.2 to 3.4 nm and its application in graphene nanoribbons. The well-defined edge chlorination is unambiguously confirmed by X-ray single-crystal analysis, which also discloses the characteristic non-planar molecular shape and detailed bond lengths of chlorinated nanographenes. Chlorinated nanographenes and graphene nanoribbons manifest enhanced solution processability associated with decreases in the optical band gap and frontier molecular orbital energy levels, exemplifying the structure-correlated property modulation by precise edge chlorination. PMID:24212200
Levitte, D.; Eckstein, Y.
1978-01-01
Analysis of twenty-one thermal springs emerging along the Jordan-Dead Sea Rift Valley in Israel indicates a very good correlation between the concentration of dissolved silica and the temperature of the spring orifice. Dissolution of quartz was identified as the apparent source of the silica in the water. Application of the silica geothermometer for mixed systems suggests that the springs in the Tiberias Lake Basin are supplied with hot water from a deep reservoir (or reservoirs) at a temperature of 115°C (239°F). The same temperature was postulated earlier by the application of the Na-K-Ca hydro-geothermometer to a group of thermal springs in the same basin. The temperature of the reservoir supplying hot brines to the springs emerging along the western shore of the Dead Sea is estimated at 90°C (194°F).
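A hedged sketch of a silica geothermometer of the kind applied here, using the widely cited Fournier-type quartz (no steam loss) calibration. The abstract does not state which calibration the authors used, and the 66 mg/kg input below is back-calculated purely for illustration.

```python
import math

def quartz_geothermometer(sio2_mg_per_kg):
    # Fournier-type quartz (no steam loss) silica geothermometer:
    #   T(degC) = 1309 / (5.19 - log10(C)) - 273.15,
    # with C the dissolved silica concentration in mg/kg. This is a
    # standard literature calibration, not necessarily the exact one
    # used in the study above.
    return 1309.0 / (5.19 - math.log10(sio2_mg_per_kg)) - 273.15

# A spring carrying about 66 mg/kg dissolved silica implies a reservoir
# temperature of roughly 115 degC, the value cited for the Tiberias
# Lake Basin (the 66 mg/kg figure is invented for illustration).
t_reservoir = quartz_geothermometer(66.0)
```

Higher dissolved silica implies a hotter reservoir, which is the monotone relationship the spring-orifice correlation exploits.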
Yamamoto, Shinya; Bamba, Takeshi; Sano, Atsushi; Kodama, Yukako; Imamura, Miho; Obata, Akio; Fukusaki, Eiichiro
2012-08-01
Soy sauces, produced from different ingredients and brewing processes, vary in components and quality. Therefore, it is extremely important to comprehend the relationship between the components and the sensory attributes of soy sauces. The current study sought to perform metabolite profiling in order to devise a method of assessing the attributes of soy sauces. Quantitative descriptive analysis (QDA) data for 24 soy sauce samples were obtained from well-selected sensory panelists. Metabolite profiles, primarily concerning low-molecular-weight hydrophilic components, were obtained by gas chromatography with time-of-flight mass spectrometry (GC/TOFMS). QDA data for soy sauces were accurately predicted by projection to latent structures (PLS), with metabolite profiles serving as explanatory variables and the QDA data set serving as the response variable. Moreover, analysis of the correlation between the matrices of metabolite profiles and QDA data indicated contributing compounds that were highly correlated with the QDA data. In particular, sugars were indicated to be important components of the taste of soy sauces. This new approach, which combines metabolite profiling with QDA, is applicable to the analysis of sensory attributes of food that arise from complex interactions between its components, and is effective for identifying important compounds that contribute to those attributes. Copyright © 2012 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
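A one-component PLS1 sketch of the profiling-to-QDA regression: project the metabolite matrix onto the direction of maximal covariance with a sensory score, then regress on that score. The data are synthetic; in this toy set the first two "metabolite" columns play the role of the contributing compounds.

```python
import math
import random

random.seed(6)

# Synthetic "metabolite" matrix X and one sensory attribute y that
# depends on columns 0 and 1 (the invented contributing compounds).
n, p = 40, 12
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [row[0] + 0.8 * row[1] + random.gauss(0, 0.3) for row in X]

# Center X and y.
xm = [sum(row[j] for row in X) / n for j in range(p)]
ym = sum(y) / n
Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
yc = [v - ym for v in y]

# PLS1 weight vector w ~ X^T y: direction of maximal covariance with y.
w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
norm = math.sqrt(sum(v * v for v in w))
w = [v / norm for v in w]

# Score t = Xc w, then regress y on the score.
t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
q = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
pred = [ym + q * ti for ti in t]

# In-sample correlation between prediction and the sensory attribute.
pm = sum(pred) / n
r_num = sum((a - pm) * (b - ym) for a, b in zip(pred, y))
r_den = math.sqrt(sum((a - pm) ** 2 for a in pred) *
                  sum((b - ym) ** 2 for b in y))
r_fit = r_num / r_den
```

The weight vector w doubles as a screening tool: its largest-magnitude entries point at the compounds most associated with the attribute, which is how correlation analysis surfaced sugars in the study.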
Robertson, David S; Prevost, A Toby; Bowden, Jack
2016-10-01
The problem of selection bias has long been recognized in the analysis of two-stage trials, where promising candidates are selected in stage 1 for confirmatory analysis in stage 2. To efficiently correct for bias, uniformly minimum variance conditionally unbiased estimators (UMVCUEs) have been proposed for a wide variety of trial settings, but where the population parameter estimates are assumed to be independent. We relax this assumption and derive the UMVCUE in the multivariate normal setting with an arbitrary known covariance structure. One area of application is the estimation of odds ratios (ORs) when combining a genome-wide scan with a replication study. Our framework explicitly accounts for correlated single nucleotide polymorphisms, as might occur due to linkage disequilibrium. We illustrate our approach on the measurement of the association between 11 genetic variants and the risk of Crohn's disease, as reported in Parkes and others (2007. Sequence variants in the autophagy gene IRGM and multiple other replicating loci contribute to Crohn's disease susceptibility. Nat. Gen. 39: (7), 830-832.), and show that the estimated ORs can vary substantially if both selection and correlation are taken into account. © The Author 2016. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Lee, Minsuk; Won, Youngjae; Park, Byungjun; Lee, Seungrag
2017-02-01
Not only the static but also the dynamic characteristics of red blood cells (RBCs) contain useful information for blood diagnosis. Quantitative phase imaging (QPI) can capture sample images with subnanometer-scale depth resolution and millisecond-scale temporal resolution. Various studies have used QPI for RBC diagnosis, and recently many approaches have been developed to reduce the processing time of RBC information extraction from QPI data using parallel computing algorithms; however, previous studies focused on static parameters, such as cell morphology, or simple dynamic parameters, such as the root mean square (RMS) of the membrane fluctuations. Previously, we presented a practical blood test method using time series correlation analysis of RBC membrane flickering with QPI. However, that method proved limited for clinical application because of its long computation time. In this study, we present an accelerated time series correlation analysis of RBC membrane flickering using a parallel computing algorithm. This method yielded fractal scaling exponent results for the surrounding medium and normal RBCs that were consistent with our previous research.
Paule, M A; Memon, S A; Lee, B-Y; Umer, S R; Lee, C-H
2014-01-01
Stormwater runoff quality is sensitive to land use and land cover (LULC) change, but the relationship between the two is difficult to characterize when predicting pollution potential and developing watershed management practices to eliminate or reduce pollution risk. In this study, the relationship between LULC change and stormwater runoff quality at two separate monitoring sites, comprising a construction area (Site 1) and mixed land use (Site 2), was analyzed using a geographic information system (GIS), event mean concentrations (EMCs), and correlation analysis. It was found that the bare land area increased, while other land use areas, such as agriculture, commercial, forest, grassland, parking lot, residential, and road, decreased. Based on the analyses performed, high maximum-range and average EMCs were found at Site 2 for most of the water pollutants. Also, urban areas and the increased conversion of LULC into bare land corresponded to degradation of stormwater quality. Correlation analysis between LULC and stormwater quality showed the influence of factors such as farming practices, geographical location, amount of precipitation, vegetation loss, and anthropogenic activities at the monitoring sites. This research found that GIS application was an efficient tool for the monthly monitoring, validation, and statistical analysis of LULC change in the study area.
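The EMC the study relies on is the flow-weighted mean concentration over a runoff event, EMC = sum(C_i * Q_i * dt_i) / sum(Q_i * dt_i). A minimal calculation with invented sample numbers:

```python
# Event mean concentration from grab samples taken during one runoff event.
# All numbers are invented; units: C in mg/L, Q in m3/s, dt in s.
conc = [10.0, 20.0, 30.0]   # sampled concentrations during the event
flow = [1.0, 2.0, 3.0]      # discharge measured at the same times
dt = [600.0, 600.0, 600.0]  # sampling interval

mass = sum(c * q * t for c, q, t in zip(conc, flow, dt))    # pollutant mass
volume = sum(q * t for q, t in zip(flow, dt))               # runoff volume
emc = mass / volume  # 23.33 mg/L, above the simple mean of 20 mg/L
                     # because concentration here peaks with flow
```

Flow weighting is what makes EMCs comparable across events of different sizes, and hence usable in the LULC correlation analysis.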
Surface and finite size effect on fluctuations dynamics in nanoparticles with long-range order
NASA Astrophysics Data System (ADS)
Morozovska, A. N.; Eliseev, E. A.
2010-02-01
The influence of surface and finite size on the dynamics of the order parameter fluctuations and critical phenomena in the three-dimensional (3D)-confined systems with long-range order was not considered theoretically. In this paper, we study the influence of surface and finite size on the dynamics of the order parameter fluctuations in the particles of arbitrary shape. We consider concrete examples of the spherical and cylindrical ferroic nanoparticles within Landau-Ginzburg-Devonshire phenomenological approach. Allowing for the strong surface energy contribution in micro and nanoparticles, the analytical expressions derived for the Ornstein-Zernike correlator of the long-range order parameter spatial-temporal fluctuations, dynamic generalized susceptibility, relaxation times, and correlation radii discrete spectra are different from those known for bulk systems. Obtained analytical expressions for the correlation function of the order parameter spatial-temporal fluctuations in micro and nanosized systems can be useful for the quantitative analysis of the dynamical structural factors determined from magnetic resonance diffraction and scattering spectra. Besides the practical importance of the correlation function for the analysis of the experimental data, derived expressions for the fluctuations strength determine the fundamental limits of phenomenological theories applicability for 3D-confined systems.
2013-01-01
Background Recently, we reported an information density theory and an analysis of three-parameter plus shorter scan than conventional method (3P+) for the amyloid-binding ligand [11C]Pittsburgh compound B (PIB) as an example of a non-highly reversible positron emission tomography (PET) ligand. This article describes an extension of 3P + analysis to noninvasive ‘3P++’ analysis (3P + plus use of a reference tissue for input function). Methods In 3P++ analysis for [11C]PIB, the cerebellum was used as a reference tissue (negligible specific binding). Fifteen healthy subjects (NC) and fifteen Alzheimer's disease (AD) patients participated. The k3 (index of receptor density) values were estimated with 40-min PET data and three-parameter reference tissue model and were compared with that in 40-min 3P + analysis as well as standard 90-min four-parameter (4P) analysis with arterial input function. Simulation studies were performed to explain k3 biases observed in 3P++ analysis. Results Good model fits of 40-min PET data were observed in both reference and target regions-of-interest (ROIs). High linear intra-subject (inter-15 ROI) correlations of k3 between 3P++ (Y-axis) and 3P + (X-axis) analyses were shown in one NC (r2 = 0.972 and slope = 0.845) and in one AD (r2 = 0.982, slope = 0.655), whereas inter-subject k3 correlations in a target region (left lateral temporal cortex) from 30 subjects (15 NC + 15 AD) were somewhat lower (r2 = 0.739 and slope = 0.461). Similar results were shown between 3P++ and 4P analyses: r2 = 0.953 for intra-subject k3 in NC, r2 = 0.907 for that in AD and r2 = 0.711 for inter-30 subject k3. Simulation studies showed that such lower inter-subject k3 correlations and significant negative k3 biases were not due to unstableness of 3P++ analysis but rather to inter-subject variation of both k2 (index of brain-to-blood transport) and k3 (not completely negligible) in the reference region. 
Conclusions In [11C]PIB, the applicability of 3P++ analysis may be restricted to intra-subject comparison such as follow-up studies. The 3P++ method itself is thought to be robust and may be more applicable to other non-highly reversible PET ligands with ideal reference tissue. PMID:24238306
Soneson, Charlotte; Lilljebjörn, Henrik; Fioretos, Thoas; Fontes, Magnus
2010-04-15
With the rapid development of new genetic measurement methods, several types of genetic alterations can be quantified in a high-throughput manner. While the initial focus has been on investigating each data set separately, there is an increasing interest in studying the correlation structure between two or more data sets. Multivariate methods based on Canonical Correlation Analysis (CCA) have been proposed for integrating paired genetic data sets. The high dimensionality of microarray data imposes computational difficulties, which have been addressed for instance by studying the covariance structure of the data, or by reducing the number of variables prior to applying the CCA. In this work, we propose a new method for analyzing high-dimensional paired genetic data sets, which mainly emphasizes the correlation structure and still permits efficient application to very large data sets. The method is implemented by translating a regularized CCA to its dual form, where the computational complexity depends mainly on the number of samples instead of the number of variables. The optimal regularization parameters are chosen by cross-validation. We apply the regularized dual CCA, as well as a classical CCA preceded by a dimension-reducing Principal Components Analysis (PCA), to a paired data set of gene expression changes and copy number alterations in leukemia. Using the correlation-maximizing methods, regularized dual CCA and PCA+CCA, we show that without pre-selection of known disease-relevant genes, and without using information about clinical class membership, an exploratory analysis singles out two patient groups, corresponding to well-known leukemia subtypes. Furthermore, the variables showing the highest relevance to the extracted features agree with previous biological knowledge concerning copy number alterations and gene expression changes in these subtypes. 
Finally, the correlation-maximizing methods are shown to yield results which are more biologically interpretable than those resulting from a covariance-maximizing method, and provide different insight compared to when each variable set is studied separately using PCA. We conclude that regularized dual CCA as well as PCA+CCA are useful methods for exploratory analysis of paired genetic data sets, and can be efficiently implemented also when the number of variables is very large.
Shyn, Paul B; Bird, Jeffery R; Koch, R Marie; Tatli, Servet; Levesque, Vincent M; Catalano, Paul J; Silverman, Stuart G
2016-09-01
To determine whether total energy (TE) reaching the microwave (MW) applicator or net energy (NE) exiting the applicator (after correcting for reflectivity) correlates better with hepatic MW ablation zone dimensions than manufacturer-provided chart predictions. Single-applicator, nonoverlapping ablations of 93 liver tumors (0.7-5.9 cm) were performed in 52 adult patients. TE and NE were recorded for each ablation. Long axis diameter (LAD), short axis diameter (SAD), and volume (V) of each ablation zone were measured on magnetic resonance imaging or computed tomography after the procedure and retrospectively compared with TE; NE; and manufacturer-provided chart predictions of LAD, SAD, and V using correlation and regression analyses. For treated tumors, mean (± SD) TE and NE were 49.8 kJ (± 22.7) and 36.4 kJ (± 19.4). Mean LAD, SAD, and V were 5.8 cm (± 1.3), 3.7 cm (± 0.8), and 44.1 cm³ (± 25.4). Correlation coefficients (95% confidence interval) with LAD, SAD, and V were 0.46 (0.28, 0.61), 0.52 (0.36, 0.66), and 0.52 (0.36, 0.66) for TE; 0.42 (0.24, 0.58), 0.55 (0.39, 0.68), and 0.53 (0.36, 0.66) for NE; and 0.51 (0.34, 0.65), 0.63 (0.49, 0.74), and 0.60 (0.45, 0.73) for chart predictions. Using regression analysis and controlling for TE, SAD was 0.34 cm larger in patients with cirrhosis than in patients without cirrhosis. Correcting for reflectivity did not substantially improve correlation of energy values with MW ablation zone size parameters and did not outperform manufacturer-provided chart predictions. Correlations were moderate and variable using all methods. The results suggest a disproportionate influence of tissue factors on MW ablation results. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.
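Correlation coefficients with 95% confidence intervals of the kind reported above are commonly obtained with a Fisher z-transform; the sketch below assumes that method (the paper does not state how its intervals were computed) and uses synthetic energy/diameter data rather than the study's measurements:

```python
import numpy as np

def pearson_ci(x, y):
    """Pearson r with a Fisher z-transform 95% confidence interval."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    r = np.corrcoef(x, y)[0, 1]
    z = np.arctanh(r)                  # Fisher transform; z is ~normal
    se = 1.0 / np.sqrt(n - 3)
    zcrit = 1.959963984540054          # 97.5th percentile of the standard normal
    lo, hi = np.tanh(z - zcrit * se), np.tanh(z + zcrit * se)
    return r, lo, hi

rng = np.random.default_rng(1)
energy = rng.uniform(20, 90, 93)                      # kJ, n = 93 ablations (synthetic)
sad = 2.0 + 0.02 * energy + rng.normal(0, 0.5, 93)    # short axis diameter, cm
r, lo, hi = pearson_ci(energy, sad)
```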
Widjaja, Effendi; Tan, Boon Hong; Garland, Marc
2006-03-01
Two-dimensional (2D) correlation spectroscopy has been extensively applied to analyze various vibrational spectroscopic data, especially infrared and Raman. However, when it is applied to real-world experimental data, which often contains various imperfections (such as noise interference, baseline fluctuations, and band-shifting) and highly overlapping bands, many artifacts and misleading features in synchronous and asynchronous maps will emerge, and this will lead to difficulties with interpretation. Therefore, an approach that counters many artifacts and thereby leads to simplified interpretation of 2D correlation analysis is certainly useful. In the present contribution, band-target entropy minimization (BTEM) is employed as a spectral pretreatment to handle many of the artifact problems before the application of 2D correlation analysis. BTEM is employed to elucidate the pure component spectra of mixtures and their corresponding concentration profiles. Two alternate forms of analysis result. In the first, the normal v × v problem is converted to an equivalent nv × nv problem, where n represents the number of species present. In the second, the pure component spectra are transformed into simple distributions, and an equivalent and less computationally intensive nv' × nv' problem results (v'
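The synchronous and asynchronous maps discussed above follow Noda's generalized 2D correlation formulation, which can be sketched as follows. This is a generic illustration with two invented Gaussian bands responding on different schedules to a perturbation; it does not reproduce the paper's BTEM pretreatment:

```python
import numpy as np

rng = np.random.default_rng(9)
m, v = 15, 64                       # m perturbation steps, v spectral channels
x = np.linspace(0, 1, v)
band1 = np.exp(-0.5 * ((x - 0.3) / 0.05) ** 2)
band2 = np.exp(-0.5 * ((x - 0.7) / 0.05) ** 2)
t = np.linspace(0, 1, m)
# band1 grows linearly, band2 quadratically: band1 responds earlier
Y = np.outer(t, band1) + np.outer(t ** 2, band2)
Y = Y - Y.mean(axis=0)              # dynamic spectra (mean spectrum removed)

# synchronous map: covariance of intensity variations at two wavenumbers
sync = Y.T @ Y / (m - 1)

# asynchronous map: covariance against the Hilbert-Noda transform
j = np.arange(m)
N = np.where(j[:, None] == j[None, :], 0.0,
             1.0 / (np.pi * (j[None, :] - j[:, None])))
asyn = Y.T @ N @ Y / (m - 1)
```

A nonzero asynchronous intensity at a cross-peak indicates the two bands vary out of phase, which is the property the sequential-order rules are read from.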
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamışlıoğlu, Miraç, E-mail: m.kamislioglu@gmail.com; Külahcı, Fatih, E-mail: fatihkulahci@firat.edu.tr
Nonlinear time series analysis techniques have broad application in the geoscience and geophysics fields, and modern nonlinear methods have provided considerable evidence for explaining seismicity phenomena. In this study, nonlinear time series analysis, fractal analysis, and spectral analysis were carried out to investigate the chaotic behavior of radon gas (²²²Rn) concentrations released during seismic events. Nonlinear time series analysis methods (Lyapunov exponent, Hurst phenomenon, correlation dimension, and false nearest neighbors) were applied to the East Anatolian Fault Zone (EAFZ), Turkey, and its surroundings, with about 35,136 radon measurements for each region. In this paper the behavior of ²²²Rn, which is used in earthquake prediction studies, is investigated.
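One of the diagnostics listed above, the Hurst exponent, is classically estimated by rescaled-range (R/S) analysis. The sketch below is a generic illustration on synthetic white noise (for which H ≈ 0.5), not the paper's radon data or its exact estimator:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Hurst exponent via rescaled-range (R/S) analysis over dyadic window sizes."""
    x = np.asarray(x, float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for start in range(0, n - size + 1, size):
            seg = x[start:start + size]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation from the mean
            r = dev.max() - dev.min()           # range of the cumulative deviation
            s = seg.std()
            if s > 0:
                vals.append(r / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    # H is the slope of log(R/S) against log(window size)
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(2)
white = rng.normal(size=4096)   # uncorrelated noise: no persistence
h = hurst_rs(white)
```

H > 0.5 would indicate persistence (long-range correlation) in the radon series; the plain R/S estimator is known to be slightly biased upward for short records.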
Heat transfer and flow friction correlations for perforated plate matrix heat exchangers
NASA Astrophysics Data System (ADS)
Ratna Raju, L.; Kumar, S. Sunil; Chowdhury, K.; Nandi, T. K.
2017-02-01
Perforated plate matrix heat exchangers (MHE) are constructed of high conductivity perforated plates stacked alternately with low conductivity spacers. They are being increasingly used in many cryogenic applications including Claude cycle or Reversed Brayton cycle cryo-refrigerators and liquefiers. Design of high-NTU (number of transfer units) cryogenic MHEs requires accurate heat transfer coefficients and flow friction factors. The thermo-hydraulic behaviour of perforated plates strongly depends on the geometrical parameters. Existing correlations, however, are mostly expressed as functions of Reynolds number only. This causes, for a given configuration, significant variations in coefficients from one correlation to the other. In this paper we present heat transfer and flow friction correlations as functions of all geometrical and other controlling variables. A Fluent™-based numerical model has been developed for heat transfer and pressure drop studies over a stack of alternately arranged perforated plates and spacers. The model is validated with data from the literature. Generalized correlations are obtained through regression analysis over a large number of computed data points.
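Generalized correlations of this kind are typically obtained by log-linear multiple regression over the computed data. The sketch below is a generic illustration with invented exponents and an invented porosity variable, not the paper's actual correlation or geometry set:

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic "computed data": Nu = 0.2 * Re^0.6 * porosity^0.4, with CFD-like scatter
Re = rng.uniform(50, 2000, 200)
por = rng.uniform(0.05, 0.4, 200)
Nu = 0.2 * Re**0.6 * por**0.4 * np.exp(rng.normal(0, 0.03, 200))

# log-linear multiple regression: ln Nu = ln C + a ln Re + b ln porosity
A = np.column_stack([np.ones_like(Re), np.log(Re), np.log(por)])
coef, *_ = np.linalg.lstsq(A, np.log(Nu), rcond=None)
C, a, b = np.exp(coef[0]), coef[1], coef[2]
```

Additional geometrical groups (plate thickness ratio, spacer height ratio, etc.) enter the same way, as further columns of the log-design matrix.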
NASA Astrophysics Data System (ADS)
Ouerhani, Y.; Alfalou, A.; Desthieux, M.; Brosseau, C.
2017-02-01
We present a three-step approach based on the commercial VIAPIX® module for road traffic sign recognition and identification. First, detection in a scene of all objects having characteristics of traffic signs is performed. This is followed by a first-level recognition based on correlation, which consists of comparing each detected object with a set of reference images from a database. Finally, a second level of identification allows us to confirm or correct the previous identification. In this study, we perform a correlation-based analysis by combining and adapting the Vander Lugt correlator with the nonlinear joint transform correlator (JTC). Of particular significance, this approach permits a reliable decision on road traffic sign identification. We further discuss a robust scheme allowing us to track a detected road traffic sign in a video sequence for the purpose of increasing the decision performance of our system. This approach can have broad practical applications in the maintenance and rehabilitation of transportation infrastructure, or for driver assistance.
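The first-level recognition step, comparing a detected object against reference images by correlation, can be sketched digitally as follows. This is a plain FFT cross-correlation illustration with random synthetic "images"; it does not reproduce the optical Vander Lugt/JTC implementation or the VIAPIX® module:

```python
import numpy as np

def corr_peak(img, ref):
    """Peak of the circular cross-correlation of two standardized images (via FFT)."""
    a = (img - img.mean()) / (img.std() + 1e-12)
    b = (ref - ref.mean()) / (ref.std() + 1e-12)
    c = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    return c.max() / a.size        # ~1 for a match, near 0 for a mismatch

rng = np.random.default_rng(4)
# hypothetical reference database of sign templates
refs = {name: rng.normal(size=(32, 32)) for name in ["stop", "yield", "speed"]}
scene = refs["yield"] + 0.3 * rng.normal(size=(32, 32))   # noisy detected object
best = max(refs, key=lambda k: corr_peak(scene, refs[k]))
```

The sharp correlation peak for the matching reference is the digital analogue of the bright correlation spot read off the optical correlator's output plane.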
Deerinck, Thomas J.
2009-01-01
Fluorescent quantum dots are emerging as an important tool for imaging cells and tissues, and their unique optical and physical properties have captured the attention of the research community. The most common types of commercially available quantum dots consist of a nanocrystalline semiconductor core composed of cadmium selenide with a zinc sulfide capping layer and an outer polymer layer to facilitate conjugation to targeting biomolecules such as immunoglobulins. They exhibit high fluorescent quantum yields and have large absorption cross-sections, possess excellent photostability, and can be synthesized so that their narrow-band fluorescence emission can occur in a wide spectrum of colors. These properties make them excellent candidates for serving as multiplexing molecular beacons using a variety of imaging modalities including highly correlated microscopies. Whereas much attention has been focused on quantum-dot applications for live-cell imaging, we have sought to characterize and exploit their utility for enabling simultaneous multiprotein immunolabeling in fixed cells and tissues. Considerations for their application to immunolabeling for correlated light and electron microscopic analysis are discussed. PMID:18337229
NASA Astrophysics Data System (ADS)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-10-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.
Zeng, Guohui; Teng, Yaoshu; Zhu, Jin; Zhu, Darong; Yang, Bin; Hu, Linping; Chen, Manman; Fu, Xiao
2018-01-01
The objective of the present study was to investigate the clinical application of magnetic resonance imaging (MRI)-respiratory gating technology for assessing illness severity in children with obstructive sleep apnea hypopnea syndrome (OSAHS). MRI-respiratory gating technology was used to scan the nasopharyngeal cavities of 51 children diagnosed with OSAHS during 6 respiratory phases. Correlations between the ratio of the area of the adenoid to the area of the nasopalatine pharyngeal cavity (Sa/Snp) and the main indexes of polysomnography (PSG) were analyzed. Receiver operator characteristic (ROC) curve and Kappa analysis were used to determine the diagnostic accuracy of Sa/Snp in pediatric OSAHS. The Sa/Snp was positively correlated with the apnea hypopnea index (AHI) (P < .001) and negatively correlated with the lowest oxygen saturation of blood during sleep (LaSO2) (P < .001). ROC analysis in the 6 respiratory phases showed that the area under the curve (AUC) of the Sa/Snp in the end-expiratory phase was the largest (0.992, P < .001), providing a threshold of 69.5% for the diagnosis of severe versus slight-moderate OSAHS in children. Consistency analysis with the AHI showed a diagnosis accordance rate of 96.0% in severe pediatric OSAHS and 96.2% in slight-moderate pediatric OSAHS (Kappa = 0.922, P < .001). Stenosis of the nasopalatine pharyngeal cavity in children with adenoidal hypertrophy was greatest at the end-expiration phase during sleep. The end-expiratory Sa/Snp obtained by a combination of MRI and respiratory gating technology has potential as an important imaging index for diagnosing and evaluating severity in pediatric OSAHS.
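The ROC analysis above can be sketched with the Mann-Whitney formulation of the AUC, which equals the probability that a randomly chosen severe case scores higher than a randomly chosen mild one. The numbers below are invented ratios for illustration, not the study's Sa/Snp measurements:

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """ROC AUC via the Mann-Whitney U statistic, with tie correction."""
    sp = np.asarray(scores_pos, float)
    sn = np.asarray(scores_neg, float)
    gt = (sp[:, None] > sn[None, :]).sum()   # positive outranks negative
    eq = (sp[:, None] == sn[None, :]).sum()  # ties count half
    return (gt + 0.5 * eq) / (len(sp) * len(sn))

rng = np.random.default_rng(5)
severe = rng.normal(0.75, 0.06, 25)   # hypothetical Sa/Snp ratios, severe OSAHS
mild = rng.normal(0.60, 0.06, 26)     # hypothetical ratios, slight-to-moderate
a = auc(severe, mild)
```

Swapping the two groups gives exactly 1 - AUC, a handy sanity check on any implementation.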
Fuel system technology overview
NASA Technical Reports Server (NTRS)
Friedman, R.
1980-01-01
Fuel system research and technology studies are being conducted to investigate the correlations and interactions of aircraft fuel system design and environment with applicable characteristics of the fuel. Topics include: (1) analysis of in-flight fuel temperatures; (2) fuel systems for high freezing point fuels; (3) experimental study of low temperature pumpability; (4) full scale fuel tank simulation; and (5) rapid freezing point measurement.
An atlas of November 1978 synthetic aperture radar digitized imagery for oil spill studies
NASA Technical Reports Server (NTRS)
Maurer, H. E.; Oderman, W.; Crosswell, W. F.
1982-01-01
A data set is described which consists of digitized synthetic aperture radar (SAR) imagery plus correlative data and some preliminary analysis results. This data set should be of value to experimenters who are interested in the SAR instrument and its application to the detection and monitoring of oil on water and other distributed targets.
Thermo-electrochemical evaluation of lithium-ion batteries for space applications
NASA Astrophysics Data System (ADS)
Walker, W.; Yayathi, S.; Shaw, J.; Ardebili, H.
2015-12-01
Advanced energy storage and power management systems designed through rigorous materials selection, testing and analysis processes are essential to ensuring mission longevity and success for space exploration applications. Comprehensive testing of Boston Power Swing 5300 lithium-ion (Li-ion) cells utilized by the National Aeronautics and Space Administration (NASA) to power humanoid robot Robonaut 2 (R2) is conducted to support the development of a test-correlated Thermal Desktop (TD) Systems Improved Numerical Differencing Analyzer (SINDA) (TD-S) model for evaluation of power system thermal performance. Temperature, current, working voltage and open circuit voltage measurements are taken during nominal charge-discharge operations to provide necessary characterization of the Swing 5300 cells for TD-S model correlation. Building from test data, embedded FORTRAN statements directly simulate Ohmic heat generation of the cells during charge-discharge as a function of surrounding temperature, local cell temperature and state of charge. The unique capability gained by using TD-S is demonstrated by simulating R2 battery thermal performance in example orbital environments for hypothetical extra-vehicular activities (EVA) exterior to a small satellite. Results provide necessary demonstration of this TD-S technique for thermo-electrochemical analysis of Li-ion cells operating in space environments.
NASA Technical Reports Server (NTRS)
Demerdash, N. A.; Nehl, T. W.
1979-01-01
A comprehensive digital model for the analysis of the dynamic-instantaneous performance of a power-conditioner-fed samarium-cobalt permanent-magnet brushless DC motor is presented. The particular power conditioner-machine system at hand, for which this model was developed, is a component of an actual prototype electromechanical actuator built for NASA-JSC as a possible alternative to hydraulic actuators, as part of feasibility studies for the shuttle orbiter applications. Excellent correlation between digitally simulated and experimentally obtained performance data was achieved for this specific prototype, as reported in this paper. Details of one component of the model, its applications, and the corresponding results are given.
Using the CanMEDS roles when interviewing for an ophthalmology residency program.
Hamel, Patrick; Boisjoly, Hélène; Corriveau, Christine; Fallaha, Nicole; Lahoud, Salim; Luneau, Katie; Olivier, Sébastien; Rouleau, Jacinthe; Toffoli, Daniela
2007-04-01
To improve the admissions process for the Université de Montréal (UdeM) ophthalmology residency program, the interview structure was modified to encompass the seven CanMEDS roles introduced by the Royal College of Physicians and Surgeons of Canada (RCPSC). These roles include an applicant's abilities as a communicator, collaborator, manager, health advocate, professional, scholar, and medical expert. In this retrospective pilot study, the records of all applicants were reviewed by 8 members of the admissions committee, with a high intraclass correlation coefficient of 0.814. Four 2-person interview teams were then formed. The first 3 groups asked the applicants specific questions based on 2-3 of the CanMEDS roles, marking their impressions of each candidate on a visual analogue scale. The last group answered candidates' questions about the program but assigned no mark. The intraclass correlations for the teams were 0.900, 0.739, and 0.585, demonstrating acceptable interrater reliability for 2 of the teams. Pearson correlation coefficients between groups of interviewers were considered adequate at 0.562, 0.432, and 0.417 (p < 0.05). For each interviewer, the Pearson correlation coefficient between record marking and interview scoring was either not statistically significant or very low. By basing the 2006 interview process on the CanMEDS roles defined by the RCPSC, information was obtained about the candidates that could not have been retrieved by a review of the medical students' records alone. Reliability analysis confirmed that this new method of conducting interviews provided sound and reliable judging and rating consistency between all members of the admissions committee.
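The intraclass correlations reported above summarize rater agreement; a one-way random-effects ICC can be computed directly from the rating matrix. The sketch below uses invented ratings (30 applicants, 8 committee members) rather than the study's data, and implements ICC(1), which may differ from the variant the authors used:

```python
import numpy as np

def icc1(ratings):
    """One-way random-effects ICC(1); ratings has shape (targets, raters)."""
    ratings = np.asarray(ratings, float)
    n, k = ratings.shape
    grand = ratings.mean()
    # between-target and within-target mean squares from one-way ANOVA
    msb = k * np.sum((ratings.mean(1) - grand) ** 2) / (n - 1)
    msw = np.sum((ratings - ratings.mean(1, keepdims=True)) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(8)
truth = rng.normal(size=30)                                  # applicants' "true" quality
ratings = truth[:, None] + 0.3 * rng.normal(size=(30, 8))    # 8 raters, modest noise
icc = icc1(ratings)
```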
Ruan, Yunze; Xue, Chao; Zhang, Jian; Li, Rong; Shen, Qirong
2014-01-01
Our previous work demonstrated that application of a bio-organic fertilizer (BIO) to a banana mono-culture orchard with serious Fusarium wilt disease effectively decreased the number of soil Fusarium sp. and controlled the soil-borne disease. Because bacteria are an abundant and diverse group of soil organisms that respond to soil health, deep 16S rRNA pyrosequencing was employed to characterize the composition of the bacterial community to investigate how it responded to BIO or the application of other common composts and to explore the potential correlation between bacterial community, BIO application and Fusarium wilt disease suppression. After basal quality control, 137,646 sequences and 9,388 operational taxonomic units (OTUs) were obtained from the 15 soil samples. Proteobacteria, Acidobacteria, Bacteroidetes, Gemmatimonadetes and Actinobacteria were the most frequent phyla and comprised up to 75.3% of the total sequences. Compared to the other soil samples, BIO-treated soil revealed higher abundances of Gemmatimonadetes and Acidobacteria, while Bacteroidetes were found in lower abundance. Meanwhile, at the genus level, higher abundances compared to other treatments were observed for Gemmatimonas and Gp4. Correlation and redundancy analysis showed that the abundance of Gemmatimonas and Sphingomonas and the soil total nitrogen and ammonium nitrogen content were higher after BIO application, and they were all positively correlated with disease suppression. Cumulatively, the reduced Fusarium wilt disease incidence that was seen after BIO was applied for 1 year might be attributed to general suppression based on a shift within the soil bacterial community, including specific enrichment of Gemmatimonas and Sphingomonas. PMID:24871319
Science with High Spatial Resolution Far-Infrared Data
NASA Technical Reports Server (NTRS)
Terebey, Susan (Editor); Mazzarella, Joseph M. (Editor)
1994-01-01
The goal of this workshop was to discuss new science and techniques relevant to high spatial resolution processing of far-infrared data, with particular focus on high resolution processing of IRAS data. Users of the maximum correlation method, maximum entropy, and other resolution enhancement algorithms applicable to far-infrared data gathered at the Infrared Processing and Analysis Center (IPAC) for two days in June 1993 to compare techniques and discuss new results. During a special session on the third day, interested astronomers were introduced to IRAS HIRES processing, which is IPAC's implementation of the maximum correlation method to the IRAS data. Topics discussed during the workshop included: (1) image reconstruction; (2) random noise; (3) imagery; (4) interacting galaxies; (5) spiral galaxies; (6) galactic dust and elliptical galaxies; (7) star formation in Seyfert galaxies; (8) wavelet analysis; and (9) supernova remnants.
Interactive effects of team cohesion on perceived efficacy in semi-professional sport.
Marcos, Francisco Miguel Leo; Miguel, Pedro Antonio Sánchez; Oliva, David Sánchez; Calvo, Tomás García
2010-01-01
The present study examined the relationships among cohesion, self-efficacy, coaches' perceptions of their players' efficacy at the individual level, and athletes' perceptions of their teammates' efficacy. Participants (n = 76) recruited from four semi-professional soccer and basketball teams completed cohesiveness and efficacy questionnaires. Data were analyzed through a correlational methodology. Results indicated significant correlations between self-efficacy and both task cohesion and social cohesion. Regression analysis results suggest task cohesion is positively related to coaches' and teammates' perceptions of efficacy. These results have implications for practitioners in terms of the importance of team building to enhance team cohesion and feelings of efficacy. Key points: This paper increases the knowledge about soccer and basketball match analysis; gives normative values to establish practice and match objectives; and gives application ideas to connect research with coaches' practice.
Dort, Jonathan M; Trickey, Amber W; Kallies, Kara J; Joshi, Amit R T; Sidwell, Richard A; Jarman, Benjamin T
2015-01-01
This study evaluated characteristics of applicants selected for interview and ranked by independent general surgery residency programs and assessed independent program application volumes, interview selection, rank list formation, and match success. Demographic and academic information was analyzed for 2014-2015 applicants. Applicant characteristics were compared by ranking status using univariate and multivariable statistical techniques. Characteristics independently associated with whether or not an applicant was ranked were identified using multivariable logistic regression modeling with backward stepwise variable selection and cluster-correlated robust variance estimates to account for correlations among individuals who applied to multiple programs. The Electronic Residency Application Service was used to obtain applicant data and program match outcomes at 33 independent surgery programs. All applicants selected to interview at 33 participating independent general surgery residency programs were included in the study. Applicants were 60% male with median age of 26 years. Birthplace was well distributed. Most applicants (73%) had ≥1 academic publication. Median United States Medical Licensing Exams (USMLE) Step 1 score was 228 (interquartile range: 218-240), and median USMLE Step 2 clinical knowledge score was 241 (interquartile range: 231-250). Residency programs in some regions more often ranked applicants who attended medical school within the same region. On multivariable analysis, significant predictors of ranking by an independent residency program were: USMLE scores, medical school region, and birth region. Independent programs received an average of 764 applications (range: 307-1704). On average, programs offered interviews to 12% of applicants, and 81% of interviewed applicants were ranked. Most programs (84%) matched at least 1 applicant ranked in their top 10. Participating independent programs attract a large volume of applicants and have high standards in the selection process.
This information can be used by surgery residency applicants to gauge their candidacy at independent programs. Independent programs offer a select number of interviews, rank most applicants that they interview, and successfully match competitive applicants. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Magazù, Salvatore; Mezei, Ferenc; Migliardo, Federica
2018-05-01
In a variety of applications of inelastic neutron scattering spectroscopy the goal is to single out the elastic scattering contribution from the total scattered spectrum as a function of momentum transfer and sample environment parameters. The elastic part of the spectrum is defined in such a case by the energy resolution of the spectrometer. Variable elastic energy resolution offers a way to distinguish between elastic and quasi-elastic intensities. Correlation spectroscopy lends itself as an efficient, high intensity approach for accomplishing this both at continuous and pulsed neutron sources. On the one hand, in beam modulation methods the Liouville theorem coupling between intensity and resolution is relaxed, and time-of-flight analysis of the neutron velocity distribution can be performed with 50% duty factor exposure for all available resolutions. On the other hand, the (quasi)elastic part of the spectrum generally contains the major part of the integrated intensity at a given detector, and thus correlation spectroscopy can be applied with a most favorable signal-to-statistical-noise ratio. The novel spectrometer CORELLI at SNS is an example of this type of application of the correlation technique at a pulsed source. On a continuous neutron source a statistical chopper can be used for quasi-random, time-dependent beam modulation, and the total time-of-flight of the neutron from the statistical chopper to detection is determined by the analysis of the correlation between the temporal fluctuation of the neutron detection rate and the statistical chopper beam modulation pattern. The correlation analysis can either be used for the determination of the incoming neutron velocity or for the scattered neutron velocity, depending on the position of the statistical chopper along the neutron trajectory.
These two options are considered together with an evaluation of spectrometer performance compared to conventional spectroscopy, in particular for variable resolution elastic neutron scattering (RENS) studies of relaxation processes and the evolution of mean square displacements. A particular focus of our analysis is the unique feature of correlation spectroscopy of delivering high, resolution-independent beam intensity; thus the same statistical chopper scan contains both high intensity and high resolution information at the same time, and can be evaluated both ways. This flexibility for variable resolution data handling represents an additional asset for correlation spectroscopy in variable resolution work. Changing the beam width for the same statistical chopper allows us to additionally trade resolution for intensity in two different experimental runs, as in conventional single-slit chopper spectroscopy. The combination of these two approaches is a capability of particular value in neutron spectroscopy studies requiring variable energy resolution, such as the systematic study of quasi-elastic scattering and mean square displacement. Furthermore, the statistical chopper approach is particularly advantageous for studying samples with low scattering intensity in the presence of a high, sample-independent background.
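The statistical-chopper recovery described above rests on the near-delta autocorrelation of a pseudo-random open/closed sequence. The sketch below illustrates the principle with a 127-slot maximal-length sequence and an invented time-of-flight spectrum; it is a toy model of the correlation analysis, not CORELLI's actual data processing:

```python
import numpy as np

def mls(nbits=7):
    """Maximal-length 0/1 sequence from a 7-bit LFSR (period 2**7 - 1 = 127)."""
    state = [1] * nbits
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])
        fb = state[-1] ^ state[-2]      # feedback taps chosen to give full period
        state = [fb] + state[:-1]
    return np.array(seq, float)

chop = mls()                            # chopper open/closed pattern, 127 slots
n = len(chop)
tof = np.zeros(n)
tof[30:36] = [1, 4, 9, 4, 1, 0.5]       # "true" time-of-flight spectrum (invented)

# Detector count rate: circular convolution of the spectrum with the chopper pattern
counts = np.real(np.fft.ifft(np.fft.fft(chop) * np.fft.fft(tof)))

# Recovery: circular cross-correlation of the counts with the mean-removed pattern;
# the m-sequence autocorrelation is ~delta, so this inverts the modulation
chop0 = chop - chop.mean()
rec = np.real(np.fft.ifft(np.fft.fft(counts) * np.conj(np.fft.fft(chop0))))
rec /= chop0 @ chop0
```

Despite every detector time bin mixing contributions from many chopper openings, the correlation step returns the spectrum essentially intact, which is why the method keeps a ~50% duty factor at full resolution.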
Kobayashi, Yoshikazu; Habara, Masaaki; Ikezazki, Hidekazu; Chen, Ronggang; Naito, Yoshinobu; Toko, Kiyoshi
2010-01-01
Effective R&D and strict quality control of a broad range of foods, beverages, and pharmaceutical products require objective taste evaluation. Advanced taste sensors using artificial-lipid membranes have been developed based on concepts of global selectivity and high correlation with human sensory score. These sensors respond similarly to similar basic tastes, which they quantify with high correlations to sensory score. Using these unique properties, these sensors can quantify the basic tastes of saltiness, sourness, bitterness, umami, astringency and richness without multivariate analysis or artificial neural networks. This review describes all aspects of these taste sensors based on artificial lipid, ranging from the response principle and optimal design methods to applications in the food, beverage, and pharmaceutical markets. PMID:22319306
Yoon, Hyung-In; Bae, Ji-Won; Park, Ji-Man; Chun, Youn-Sic; Kim, Mi-Ae; Kim, Minji
2016-11-07
To assess whether color measurement with an intraoral scanner correlates with a digital colorimeter, and to evaluate the possibility of applying a digital scanner for shade selection. The L*a*b* values of the five shade tabs (A1, A2, A3, A3.5, and A4) were obtained with an intraoral scanner (TRIOS Pod) and a colorimeter (ShadeEye). Both devices were calibrated according to the manufacturer's instructions before measurements. Color measurement values were compared with a paired t-test, and a Pearson's correlation analysis was performed to evaluate the relationship between the two methods. The L*a*b* values of the colorimeter were significantly different from those of the digital scanner (p < 0.001). The L* and b* values of both methods were strongly correlated with each other (both p < 0.05). The device repeatability of both methods was reported to be excellent (p < 0.05). Within the limitations of this study, color measurements with digital intraoral scanners and computer-assisted image analysis were in accordance with those of the colorimeter with respect to L* and b* values; however, all the coordinates of the shade tabs were significantly different between the two methods. The digital intraoral scanner may not be used as the primary method of color selection in clinical practice, considering significant differences in color parameters compared with the colorimeter. The scanner's capability in shade selection should be further evaluated. © 2016 by the American College of Prosthodontists.
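Agreement between two L*a*b* readings is usually summarized by a color difference ΔE. The sketch below uses the simple CIE76 formula and invented readings, since the paper's raw values are not given; the 3.7 acceptability cutoff is one commonly cited clinical threshold, and other studies use different values:

```python
import numpy as np

def delta_e(lab1, lab2):
    """CIE76 color difference Delta E*ab between two L*a*b* triplets."""
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

# hypothetical readings of one shade tab by the two devices
scanner = (71.2, 1.5, 18.9)
colorimeter = (73.0, 0.9, 16.4)
de = delta_e(scanner, colorimeter)
clinically_acceptable = de <= 3.7   # an assumed threshold, for illustration only
```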
Relationship between student selection criteria and learner success for medical dosimetry students.
Baker, Jamie; Tucker, Debra; Raynes, Edilberto; Aitken, Florence; Allen, Pamela
2016-01-01
Medical dosimetry education occupies a specialized branch of allied health higher education. Noted international shortages of health care workers, reduced university funding, limitations on faculty staffing, trends in learner attrition, and increased enrollment of nontraditional students force medical dosimetry educational leadership to reevaluate current admission practices. Program officials wish to select medical dosimetry students with the best chances of successful graduation. The purpose of the quantitative ex post facto correlation study was to investigate the relationship between applicant characteristics (cumulative undergraduate grade point average (GPA), science grade point average (SGPA), prior experience as a radiation therapist, and previous academic degrees) and the successful completion of a medical dosimetry program, as measured by graduation. A key finding from the quantitative study was the statistically significant positive correlation between a student's previous degree and his or her successful graduation from the medical dosimetry program. Future research investigations could include a larger research sample, representative of more medical dosimetry student populations, and additional studies concerning the relationship of previous work as a radiation therapist and the effect on success as a medical dosimetry student. Based on the quantitative correlation analysis, medical dosimetry leadership on admissions committees could revise student selection rubrics to place less emphasis on an applicant's undergraduate cumulative GPA and increase the weight assigned to previous degrees. Copyright © 2016 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
Westgate, Philip M
2013-07-20
Generalized estimating equations (GEEs) are routinely used for the marginal analysis of correlated data. The efficiency of GEE depends on how closely the working covariance structure resembles the true structure, and therefore accurate modeling of the working correlation of the data is important. A popular approach is the use of an unstructured working correlation matrix, as it is not as restrictive as simpler structures such as exchangeable and AR-1 and thus can theoretically improve efficiency. However, because of the potential for having to estimate a large number of correlation parameters, variances of regression parameter estimates can be larger than theoretically expected when utilizing the unstructured working correlation matrix. Therefore, standard error estimates can be negatively biased. To account for this additional finite-sample variability, we derive a bias correction that can be applied to typical estimators of the covariance matrix of parameter estimates. Via simulation and in application to a longitudinal study, we show that our proposed correction improves standard error estimation and statistical inference. Copyright © 2012 John Wiley & Sons, Ltd.
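A numpy-only toy version of GEE for a Gaussian marginal model with a single covariate is sketched below. It illustrates why the unstructured working correlation requires many parameters (every off-diagonal element of R is estimated from residuals) and how the robust sandwich variance is formed; it does not implement the paper's finite-sample bias correction, and all data are simulated:

```python
import numpy as np

rng = np.random.default_rng(6)
nclust, m = 200, 4                  # 200 subjects, 4 repeated measures each
x = rng.normal(size=(nclust, m))
b_true = 0.5
# true within-subject correlation: exchangeable with rho = 0.4
L = np.linalg.cholesky(0.6 * np.eye(m) + 0.4)
y = b_true * x + rng.normal(size=(nclust, m)) @ L.T

beta = 0.0
R = np.eye(m)                       # working correlation, updated each sweep
for _ in range(10):
    Ri = np.linalg.inv(R)
    num = den = 0.0
    for i in range(nclust):         # estimating-equation update for beta
        num += x[i] @ Ri @ y[i]
        den += x[i] @ Ri @ x[i]
    beta = num / den
    resid = (y - beta * x)
    resid = resid / resid.std()
    R = resid.T @ resid / nclust    # unstructured: every element estimated
    np.fill_diagonal(R, 1.0)

# robust "sandwich" covariance of beta (model-based bread, empirical meat)
Ri = np.linalg.inv(R)
bread = sum(x[i] @ Ri @ x[i] for i in range(nclust))
meat = sum((x[i] @ Ri @ (y[i] - beta * x[i])) ** 2 for i in range(nclust))
var_beta = meat / bread ** 2
```

With m = 4 the unstructured R already has 6 free correlations; as m grows, the extra estimation noise is what inflates the sandwich variance beyond its asymptotic value, motivating corrections like the one proposed in the paper.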
Cross-Correlation Asymmetries and Causal Relationships between Stock and Market Risk
Borysov, Stanislav S.; Balatsky, Alexander V.
2014-01-01
We study historical correlations and lead-lag relationships between individual stock risk (volatility of daily stock returns) and market risk (volatility of daily returns of a market-representative portfolio) in the US stock market. We consider the cross-correlation functions averaged over all stocks, using 71 stock prices from the Standard & Poor's 500 index for 1994–2013. We focus on the behavior of the cross-correlations at the times of financial crises with significant jumps of market volatility. The observed historical dynamics showed that the dependence between the risks was almost linear during the US stock market downturn of 2002 and after the US housing bubble in 2007, remaining at that level until 2013. Moreover, the averaged cross-correlation function often had an asymmetric shape with respect to zero lag in the periods of high correlation. We develop the analysis by the application of the linear response formalism to study underlying causal relations. The calculated response functions suggest the presence of characteristic regimes near financial crashes, when the volatility of an individual stock follows the market volatility and vice versa. PMID:25162697
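The lead-lag asymmetry described above can be probed with a plain lagged Pearson cross-correlation. A minimal sketch on synthetic series (not the paper's data or its linear-response formalism):

```python
import numpy as np

def lagged_xcorr(x, y, max_lag):
    """Pearson cross-correlation of x with y at each integer lag.

    Positive lag correlates x[t] with y[t + lag], i.e. x leading y.
    Returns a dict {lag: correlation}.
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:len(y) + lag]
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out

# Demo: y is a noisy copy of x delayed by 2 steps, so x leads y.
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
y = np.roll(x, 2) + 0.1 * rng.standard_normal(2000)
cc = lagged_xcorr(x, y, max_lag=5)
```

An asymmetric cross-correlation function — here a peak at lag +2 rather than a shape symmetric about zero — is the kind of signature the paper interprets as one series following the other.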
Coupled attenuation and multiscale damage model for composite structures
NASA Astrophysics Data System (ADS)
Moncada, Albert M.; Chattopadhyay, Aditi; Bednarcyk, Brett; Arnold, Steven M.
2011-04-01
Composite materials are widely used in many applications for their high strength, low weight, and tailorability for specific applications. However, the development of robust and reliable methodologies to detect micro level damage in composite structures has been challenging. For composite materials, attenuation of ultrasonic waves propagating through the media can be used to determine damage within the material. Currently available numerical solutions for attenuation induce arbitrary damage, such as fiber-matrix debonding or inclusions, to show variations between healthy and damaged states. This paper addresses this issue by integrating a micromechanics analysis to simulate damage in the form of a fiber-matrix crack and an analytical model for calculating the attenuation of the waves when they pass through the damaged region. The hybrid analysis is validated by comparison with experimental stress-strain curves and piezoelectric sensing results for attenuation measurement. The results showed good agreement between the experimental stress-strain curves and the results from the micromechanics analysis. Wave propagation analysis also showed good correlation between simulation and experiment for the tested frequency range.
Pande, Tripti; Saravu, Kavitha; Temesgen, Zelalem; Seyoum, Al; Rai, Shipra; Rao, Raghavendra; Mahadev, Deekshith; Pai, Madhukar; Gagnon, Marie-Pierre
2017-01-01
Tuberculosis (TB) is the leading infectious killer, and India accounts for 2.8 of the 10.4 million TB cases that occur each year, making it the highest TB burden country worldwide. Poor quality of TB care is a major driver of the epidemic in India. India's large private, unregulated sector manages over 50% of the TB patients, with studies showing suboptimal diagnosis and treatment in the private sector. Better education of doctors using mobile applications (apps) is a possible solution. While India has seen an explosion of mobile phone services, and while the use of mobile health interventions has been gaining interest, little is known about mHealth around tuberculosis in India. Our study aimed to understand the user experience and acceptability of a smartphone application, LearnTB, among private sector academic clinicians in India. This study was conducted among 101 clinicians at Kasturba Hospital, Manipal, India. The user experience of participants (part 1) and acceptability (part 2) were evaluated with two validated, English, paper-based questionnaires. The first questionnaire was based on the System Usability Scale (SUS); the second was based on the Technology Acceptance Model (TAM). Data were collected during February and March 2017 and analyzed using descriptive statistics, multiple linear regression, and logistic regression analysis. All participants responded to the first questionnaire, and 100 (a 99% response rate) responded to the second. User experience was very high [mean SUS score = 94.4 (92.07-96.76)]. Perceived usefulness (PU) was significantly correlated with intention to use (IU) (r=0.707, P<0.0001), and perceived ease of use (PEU) was significantly correlated with PU (r=0.466, P<0.0001). Path analysis confirmed the direct relationship between PU and IU (0.936, P<0.0001), and the indirect relationship between PEU and IU (0.5102, P<0.0001).
Logistic regression analysis helped target items strongly influencing IU, such as "The use of the LearnTB application is compatible with my work habits" [OR =3.20 (1.04-9.84), P=0.004] and "The use of the LearnTB application could promote good clinical practice" [OR =5.23 (1.35-20.29); P=0.016]. The first part of the study indicated high user experience of the LearnTB application. The TAM questionnaire (second part) explained a significant portion of the variance in clinicians' intention to use the LearnTB application, with PU having the greatest impact on that intention. This study provides a preliminary analysis of mobile health interventions for tuberculosis in India, and emphasizes the need for future research in this domain.
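In the simplest standardized case, the path-analytic quantities reported here (a direct PU→IU effect and an indirect PEU→IU effect) reduce to products of correlations. An illustrative sketch with simulated data (the coefficients below are made up, not the study's):

```python
import numpy as np

def standardized_coef(x, y):
    """Standardized simple-regression slope, equal to Pearson's r."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

# Illustrative data following the assumed TAM paths PEU -> PU -> IU.
rng = np.random.default_rng(2)
n = 10_000
peu = rng.standard_normal(n)
pu = 0.5 * peu + rng.standard_normal(n)       # PEU influences PU
iu = 0.8 * pu + 0.3 * rng.standard_normal(n)  # PU influences IU

direct_pu_iu = standardized_coef(pu, iu)
indirect_peu_iu = standardized_coef(peu, pu) * direct_pu_iu
```

The indirect effect is the product of the two path coefficients along PEU→PU→IU, which is why it is necessarily smaller than the direct PU→IU effect when each path coefficient is below one in magnitude.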
2004 Photon Correlation and Scattering Conference
NASA Technical Reports Server (NTRS)
Meyer, William (Editor); Smart, Anthony (Editor); Wegdam, Gerard (Editor); Dogariu, Aristide (Editor); Carpenter, Bradley (Editor)
2004-01-01
The Photon Correlation and Scattering (PCS) meeting welcomes all who are interested in the art and science of photon correlation and its application to optical scattering. The meeting is intended to enhance interactions between theory, applications, instrument design, and participants.
Han, Sheng-Nan
2014-07-01
Chemometrics is a new branch of chemistry that is widely applied across analytical chemistry. Using theories and methods from mathematics, statistics, computer science, and related disciplines, chemometrics optimizes the chemical measurement process and maximizes the chemical and other information that can be extracted from measurement data on material systems. In recent years, traditional Chinese medicine has attracted widespread attention. In traditional Chinese medicine research, how to interpret the relationship between the many chemical components and their efficacy has been a key problem, one that seriously restricts the modernization of Chinese medicine. Because chemometrics brings multivariate analysis methods into chemical research, it has been applied as an effective research tool in composition-activity relationship studies of Chinese medicine. This article reviews the applications of chemometrics methods to composition-activity relationship research in recent years. The applications of multivariate statistical analysis methods (such as regression analysis, correlation analysis, and principal component analysis) and artificial neural networks (such as back-propagation neural networks, radial basis function neural networks, and support vector machines) are summarized, including their fundamental principles, research contents, and advantages and disadvantages. Finally, the main open problems and prospects for future research are discussed.
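Of the multivariate methods listed, principal component analysis is the workhorse for reducing a composition matrix (samples × constituents) to a few latent factors. A minimal SVD-based sketch on illustrative data (the matrix below is synthetic, not from any study):

```python
import numpy as np

def pca(X, n_components):
    """Principal component analysis via SVD of the mean-centered data.

    Returns (scores, explained_variance_ratio).
    """
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    var = s**2 / (len(X) - 1)
    return scores, var[:n_components] / var.sum()

# Illustrative "composition" matrix: 100 samples x 6 constituents,
# driven mostly by a single latent factor plus small noise.
rng = np.random.default_rng(3)
latent = rng.standard_normal((100, 1))
X = latent @ rng.standard_normal((1, 6)) + 0.1 * rng.standard_normal((100, 6))
scores, evr = pca(X, n_components=2)
```

In a composition-activity study, the scores would then feed a regression against measured activity, which is the basic structure of the chemometric pipelines the review surveys.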
Mnatsakanyan, Mariam; Stevenson, Paul G; Shock, David; Conlan, Xavier A; Goodie, Tiffany A; Spencer, Kylie N; Barnett, Neil W; Francis, Paul S; Shalliker, R Andrew
2010-09-15
Differences between alkyl, dipole-dipole, hydrogen bonding, and pi-pi selective surfaces represented by non-resonance and resonance pi-stationary phases have been assessed for the separation of 'Ristretto' café espresso by employing 2DHPLC techniques with C18 phase selectivity detection. Geometric approach to factor analysis (GAFA) was used to measure the number of detected peaks (N), spreading angle (beta), correlation, practical peak capacity (n(p)), and percentage usage of the separation space, as an assessment of selectivity differences between regional quadrants of the two-dimensional separation plane. Although all tested systems were correlated to some degree with the C18 dimension, regional measurement of separation divergence revealed that specific systems performed better for certain sample components. The results illustrate that, because of the complexity of the 'real' sample, obtaining a truly orthogonal two-dimensional system for complex samples of natural origin may be practically impossible. Copyright (c) 2010 Elsevier B.V. All rights reserved.
Model-based reconstruction of synthetic promoter library in Corynebacterium glutamicum.
Zhang, Shuanghong; Liu, Dingyu; Mao, Zhitao; Mao, Yufeng; Ma, Hongwu; Chen, Tao; Zhao, Xueming; Wang, Zhiwen
2018-05-01
To develop an efficient synthetic promoter library for fine-tuned expression of target genes in Corynebacterium glutamicum. A synthetic promoter library for C. glutamicum was developed based on conserved sequences of the -10 and -35 regions. The synthetic promoter library covered a wide range of strengths, from 1% to 193% of the tac promoter. 68 promoters were selected and sequenced for correlation analysis between promoter sequence and strength with a statistical model. A new promoter library was then reconstructed with improved promoter strength and coverage based on the results of the correlation analysis. Tandem promoter P70 was finally constructed, with strength increased by 121% over the tac promoter. The promoter library developed in this study shows great potential for applications in metabolic engineering and synthetic biology for the optimization of metabolic networks. To the best of our knowledge, this is the first reconstruction of a synthetic promoter library based on statistical analysis in C. glutamicum.
Wang, Shutao; Wang, Yan; You, Hong; Liang, Zhihua
2004-09-01
A novel activated carbon coating fiber for solid phase micro-extraction (SPME) was prepared using activated carbon powder and silica resin adhesive. The extraction properties of the novel activated carbon coating fiber were investigated. The results indicate that this coating fiber has high concentration ability, with enrichment factors for chloroform, carbon tetrachloride, trichloroethylene, and tetrachloroethylene in the range of 13.8 to 18.7. The fiber is stable at temperatures as high as 290 degrees C and can be reused more than 140 times at 250 degrees C. The activated carbon coating fiber was then applied to the analysis of the four halocarbon compounds mentioned above. Linear correlation was observed, with correlation coefficients between 0.9952 and 0.9994 and detection limits between 0.008 and 0.05 microg/L. The method was also applied to a real water sample, for which the recoveries of these halocarbon compounds ranged from 95.5% to 104.6%.
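The linearity figures reported here amount to fitting a calibration line and computing its correlation coefficient. A sketch with invented calibration points (concentrations and responses below are hypothetical, not the study's data):

```python
import numpy as np

# Illustrative calibration data: concentration (ug/L) vs detector response.
conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])
rng = np.random.default_rng(9)
resp = 120.0 * conc + 3.0 + rng.normal(0, 2.0, size=conc.size)

# Least-squares line and linearity check.
slope, intercept = np.polyfit(conc, resp, 1)
r = np.corrcoef(conc, resp)[0, 1]
```

A correlation coefficient this close to 1 over the working range is what justifies quantifying unknowns from the fitted line, as done for the real water sample.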
Image analysis of pubic bone for age estimation in a computed tomography sample.
López-Alcaraz, Manuel; González, Pedro Manuel Garamendi; Aguilera, Inmaculada Alemán; López, Miguel Botella
2015-03-01
Radiology has demonstrated great utility for age estimation, but most studies are based on metrical and morphological methods intended to build an identification profile. A simple image analysis-based method is presented, aimed at correlating the bony tissue ultrastructure with several variables obtained from the grey-level histogram (GLH) of computed tomography (CT) sagittal sections of the pubic symphysis surface and the pubic body, and relating them to age. The CT sample consisted of 169 hospital Digital Imaging and Communications in Medicine (DICOM) archives of known sex and age. The calculated multiple regression models showed a maximum R² of 0.533 for females and 0.726 for males, with high intra- and inter-observer agreement. The suggested method is considered useful not only for building an identification profile during virtopsy, but also for further studies seeking a quantitative correlation with tissue ultrastructure characteristics, without complex and expensive methods beyond image analysis.
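Grey-level histogram variables of the kind such regression models use can be computed directly from the voxel values. A sketch with a synthetic stand-in array (the actual study used DICOM CT sections; the feature set here is a generic choice, not necessarily the authors'):

```python
import numpy as np

def glh_features(region):
    """Simple grey-level histogram features of an image region."""
    g = np.asarray(region, float).ravel()
    mu, sd = g.mean(), g.std()
    z = (g - mu) / sd
    return {
        "mean": mu,
        "std": sd,
        "skewness": float(np.mean(z**3)),
        "kurtosis": float(np.mean(z**4) - 3.0),  # excess kurtosis
    }

# Synthetic stand-in for a CT section: dark background with a
# brighter "bone" patch.
img = np.zeros((64, 64), dtype=np.uint8)
img[20:40, 20:40] = 200
feats = glh_features(img)
```

Features like these would then enter a multiple regression against known age, which is the structure behind the reported R² values.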
Towards a nondestructive chemical characterization of biofilm matrix by Raman microscopy.
Ivleva, Natalia P; Wagner, Michael; Horn, Harald; Niessner, Reinhard; Haisch, Christoph
2009-01-01
In this study, the applicability of Raman microscopy (RM) for nondestructive chemical analysis of the biofilm matrix, including microbial constituents and extracellular polymeric substances (EPS), has been assessed. The examination of a wide range of reference samples such as biofilm-specific polysaccharides, proteins, microorganisms, and encapsulated bacteria revealed characteristic frequency regions and specific marker bands for different biofilm constituents. Based on these data, the assignment of Raman bands in spectra of multispecies biofilms was performed. The study of different multispecies biofilms showed that RM can correlate various structural appearances within the biofilm with variations in their chemical composition and provide chemical information about a complex biofilm matrix. The results of RM analysis of biofilms are in good agreement with data obtained by confocal laser scanning microscopy (CLSM). Thus, RM is a promising tool for label-free chemical characterization of different biofilm constituents. Moreover, the combination of RM with CLSM analysis for the study of biofilms grown under different environmental conditions can provide new insights into the complex structure/function correlations in biofilms.
Fractals in biology and medicine
NASA Technical Reports Server (NTRS)
Havlin, S.; Buldyrev, S. V.; Goldberger, A. L.; Mantegna, R. N.; Ossadnik, S. M.; Peng, C. K.; Simons, M.; Stanley, H. E.
1995-01-01
Our purpose is to describe some recent progress in applying fractal concepts to systems of relevance to biology and medicine. We review several biological systems characterized by fractal geometry, with a particular focus on the long-range power-law correlations found recently in DNA sequences containing noncoding material. Furthermore, we discuss the finding that the exponent alpha quantifying these long-range correlations ("fractal complexity") is smaller for coding than for noncoding sequences. We also discuss the application of fractal scaling analysis to the dynamics of heartbeat regulation, and report the recent finding that the normal heart is characterized by long-range "anticorrelations" which are absent in the diseased heart.
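The exponent alpha mentioned here is typically estimated by detrended fluctuation analysis (DFA). A minimal numpy sketch (uncorrelated noise gives alpha near 0.5; long-range correlated signals give larger values; this is a generic DFA1 implementation, not the authors' exact procedure):

```python
import numpy as np

def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis: slope of log F(n) vs log n.

    x: 1-D signal. Returns the estimated scaling exponent alpha.
    """
    y = np.cumsum(x - np.mean(x))           # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        f2 = []
        for seg in segs:                    # linearly detrend each window
            a, b = np.polyfit(t, seg, 1)
            f2.append(np.mean((seg - (a * t + b))**2))
        F.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return float(slope)

rng = np.random.default_rng(4)
alpha_white = dfa_alpha(rng.standard_normal(20000))  # expect about 0.5
```

In the heartbeat application described above, alpha below 0.5 over the relevant scales would indicate the "anticorrelations" of the healthy heart, while alpha near 0.5 indicates the uncorrelated fluctuations reported for disease.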
Crossover transition in the fluctuation of Internet
NASA Astrophysics Data System (ADS)
Qian, Jiang-Hai
2018-06-01
The inconsistent fluctuation behavior of the Internet predicted by preferential attachment (PA) and Gibrat's law requires empirical investigation of the actual system. Using interval-tunable Gibrat's law statistics, we find the actual fluctuation, characterized by the conditional standard deviation of the degree growth rate, changes with the interval length and displays a crossover transition from PA type to Gibrat's law type, which has not yet been captured by any previous model. We characterize the transition dynamics quantitatively and determine the applicable range of PA and Gibrat's law. The correlation analysis indicates the crossover transition may be attributed to the cumulative correlation between the internal links.
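The conditional-standard-deviation statistic can be sketched directly: bin nodes by initial degree, then compute the dispersion of logarithmic growth rates within each bin. Synthetic Gibrat-type data below (purely illustrative; under Gibrat's law the curve is flat, while PA-type fluctuations would scale with degree):

```python
import numpy as np

def conditional_growth_std(k0, k1, n_bins=5):
    """Standard deviation of the growth rate r = log(k1/k0),
    conditioned on the initial degree k0 (logarithmic bins).

    Returns (bin_centers, stds)."""
    k0, k1 = np.asarray(k0, float), np.asarray(k1, float)
    r = np.log(k1 / k0)
    edges = np.logspace(np.log10(k0.min()), np.log10(k0.max()), n_bins + 1)
    centers, stds = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (k0 >= lo) & (k0 < hi)
        if m.sum() > 1:
            centers.append(np.sqrt(lo * hi))
            stds.append(r[m].std())
    return np.array(centers), np.array(stds)

# Gibrat-type growth: fluctuation independent of degree, so the
# conditional std should be flat (about 0.2 here).
rng = np.random.default_rng(5)
k0 = rng.integers(1, 1000, size=20000)
k1 = k0 * np.exp(0.2 * rng.standard_normal(20000))
centers, stds = conditional_growth_std(k0, k1)
```

Repeating this for different interval lengths between the two degree snapshots is what reveals the crossover the paper reports.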
NASA Technical Reports Server (NTRS)
Cook, W. J.
1973-01-01
A theoretical study of heat transfer in zero pressure gradient hypersonic laminar boundary layers for various gases, with particular application to the flows produced in an expansion tube facility, was conducted. A correlation based on results obtained from solutions to the governing equations for five gases was formulated. Particular attention was directed toward the laminar boundary layer on shock tube splitter plates in carbon dioxide flows generated by high speed shock waves. Computer analysis of the splitter plate boundary layer flow provided information that is useful in interpreting experimental data obtained in shock tube gas radiation studies.
What does the multiple mini interview have to offer over the panel interview?
Pau, Allan; Chen, Yu Sui; Lee, Verna Kar Mun; Sow, Chew Fei; De Alwis, Ranjit
2016-01-01
This paper compares the panel interview (PI) performance with the multiple mini interview (MMI) performance and indication of behavioural concerns of a sample of medical school applicants. The acceptability of the MMI was also assessed. All applicants shortlisted for a PI were invited to an MMI. Applicants attended a 30-min PI with two faculty interviewers followed by an MMI consisting of ten 8-min stations. Applicants were assessed on their performance at each MMI station by one faculty member. The interviewer also indicated if they perceived the applicant to be a concern. Finally, applicants completed an acceptability questionnaire. From the analysis of 133 (75.1%) completed MMI scoresheets, the MMI scores correlated statistically significantly with the PI scores (r=0.438, p=0.001). Neither was statistically significantly associated with sex, age, race, or pre-university academic ability. Applicants assessed as a concern at two or more stations performed statistically significantly less well at the MMI when compared with those who were assessed as a concern at one station or none at all. However, there was no association with PI performance. Acceptability scores were generally high, and comparison of mean scores for each of the acceptability questionnaire items did not show statistically significant differences between sex and race categories. Although PI and MMI performances are correlated, the MMI may have the added advantage of more objectively generating multiple impressions of the applicant's interpersonal skill, thoughtfulness, and general demeanour. Results of the present study indicated that the MMI is acceptable in a multicultural context.
Meyer, Hans Jonas; Höhn, Annekathrin; Surov, Alexey
2018-04-06
Functional imaging modalities like diffusion-weighted imaging are increasingly used to predict tumor behavior such as cellularity and vascularity in different tumors. Histogram analysis is an emerging image analysis technique in which every voxel is used to obtain a histogram, so that statistical information about tumors can be provided. The purpose of this study was to elucidate possible associations between ADC histogram parameters and several immunohistochemical features in rectal cancer. Overall, 11 patients with histologically proven rectal cancer were included in the study. There were 2 (18.18%) females and 9 males with a mean age of 67.1 years. The Ki-67 index and expression of p53, EGFR, VEGF, and Hif1-alpha were semiautomatically estimated. The tumors were divided into PD1-positive and PD1-negative lesions. ADC histogram analysis was performed as a whole-lesion measurement using an in-house Matlab application. Spearman's correlation analysis revealed a strong correlation between EGFR expression and ADCmax (ρ=0.72, P=0.02). None of the vascular parameters (VEGF, Hif1-alpha) correlated with ADC parameters. Kurtosis and skewness correlated inversely with p53 expression (ρ=-0.64, P=0.03 and ρ=-0.81, P=0.002, respectively). ADCmedian and ADCmode correlated with Ki-67 (ρ=-0.62, P=0.04 and ρ=-0.65, P=0.03, respectively). PD1-positive tumors showed statistically significantly lower ADCmax values in comparison to PD1-negative tumors, 1.93 ± 0.36 vs 2.32 ± 0.47 × 10⁻³ mm²/s, p=0.04. Several associations were identified between histogram parameters derived from ADC maps and EGFR, Ki-67, and p53 expression in rectal cancer. Furthermore, ADCmax differed between PD1-positive and PD1-negative tumors, indicating a potentially important role of ADC parameters for future treatment prediction.
PERIODIC AUTOREGRESSIVE-MOVING AVERAGE (PARMA) MODELING WITH APPLICATIONS TO WATER RESOURCES.
Vecchia, A.V.
1985-01-01
Results involving correlation properties and parameter estimation for autoregressive-moving average models with periodic parameters are presented. A multivariate representation of the PARMA model is used to derive parameter space restrictions and difference equations for the periodic autocorrelations. Close approximation of the likelihood function for Gaussian PARMA processes yields efficient maximum-likelihood estimation procedures. Terms in the Fourier expansion of the parameters are sequentially included, and a selection criterion is given for determining the optimal number of harmonics to include. Application of the techniques is demonstrated through analysis of a monthly streamflow time series.
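The periodic autocorrelations central to PARMA modeling can be estimated season by season: pool values across years and correlate each season with the preceding time step. A numpy sketch on a simulated periodic AR(1) (illustrative, not the streamflow data):

```python
import numpy as np

def periodic_lag1_autocorr(x, period=12):
    """Per-season lag-1 autocorrelation of a seasonal series.

    rho[s] correlates season s with the preceding time step, pooling
    across years. Length of x must be a multiple of `period`.
    """
    x = np.asarray(x, float)
    cur = x[period:].reshape(-1, period)        # drop the first year
    prev = x[period - 1:-1].reshape(-1, period) # values one step earlier
    return np.array([np.corrcoef(cur[:, s], prev[:, s])[0, 1]
                     for s in range(period)])

# Simulated periodic AR(1): coefficient alternates 0.3 / 0.7 by season.
rng = np.random.default_rng(6)
period, years = 12, 400
phi = 0.3 + 0.4 * (np.arange(period) % 2)
x = np.zeros(period * years)
for t in range(1, len(x)):
    x[t] = phi[t % period] * x[t - 1] + rng.standard_normal()
rho = periodic_lag1_autocorr(x)
```

A stationary ARMA model would force a single lag-1 autocorrelation; the seasonally varying values recovered here are exactly what PARMA parameterizes, e.g. via the Fourier expansion mentioned above.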
Ryu, Ju Seok; Park, Donghwi; Oh, Yoongul; Lee, Seok Tae; Kang, Jin Young
2016-01-01
Background/Aims The purpose of this study was to develop new parameters of high-resolution manometry (HRM) and to apply them to quantify the effect of bolus volume and texture on pharyngeal swallowing. Methods To compare the effects of bolus volume, ten healthy subjects prospectively performed dry swallowing, thin fluid 2 mL, thin fluid 5 mL, thin fluid 10 mL, and drinking, each twice. To compare the effect of texture, subjects swallowed thin fluid 5 mL, yogurt 5 mL, and bread, each twice. A 32-sensor HRM catheter and BioVIEW ANALYSIS software were used for data collection and analysis. HRM data were synchronized with kinematic analysis of the videofluoroscopic swallowing study (VFSS) using epiglottis tilting. Results Linear correlation analysis for volume showed significant correlations for area of the velopharynx, duration of the velopharynx, pre-upper esophageal sphincter (UES) maximal pressure, minimal UES pressure, UES activity time, and nadir UES duration. In the correlation with texture, no parameters differed significantly. The contraction of the velopharynx was faster than laryngeal elevation. The duration of UES relaxation was shorter in the kinematic analysis than in HRM. Conclusions Bolus volume was shown to have a significant effect on pharyngeal pressure and timing, but texture did not show any effect on pharyngeal swallowing. The parameters of HRM were more sensitive than those of kinematic analysis. As the parameters of HRM are based on precise anatomic structure and the kinematic analysis reflects the actions of multiple anatomic structures, HRM and VFSS should be used according to their purposes. PMID:26598598
Hospital electronic medical record enterprise application strategies: do they matter?
Fareed, Naleef; Ozcan, Yasar A; DeShazo, Jonathan P
2012-01-01
Successful implementations and the ability to reap the benefits of electronic medical record (EMR) systems may be correlated with the type of enterprise application strategy that an administrator chooses when acquiring an EMR system. Moreover, identifying the most optimal enterprise application strategy is a task that may have important linkages with hospital performance. This study explored whether hospitals that have adopted differential EMR enterprise application strategies concomitantly differ in their overall efficiency. Specifically, the study examined whether hospitals with a single-vendor strategy had a higher likelihood of being efficient than those with a best-of-breed strategy and whether hospitals with a best-of-suite strategy had a higher probability of being efficient than those with best-of-breed or single-vendor strategies. A conceptual framework was used to formulate testable hypotheses. A retrospective cross-sectional approach using data envelopment analysis was used to obtain efficiency scores of hospitals by EMR enterprise application strategy. A Tobit regression analysis was then used to determine the probability of a hospital being inefficient as related to its EMR enterprise application strategy, while moderating for the hospital's EMR "implementation status" and controlling for hospital and market characteristics. The data envelopment analysis of hospitals suggested that only 32 hospitals were efficient in the study's sample of 2,171 hospitals. The results from the post hoc analysis showed partial support for the hypothesis that hospitals with a best-of-suite strategy were more likely to be efficient than those with a single-vendor strategy. This study underscores the importance of understanding the differences between the three strategies discussed in this article. 
On the basis of the findings, hospital administrators should consider the efficiency associations that a specific strategy may have compared with another prior to moving toward an enterprise application strategy.
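Data envelopment analysis scores each hospital against a best-practice frontier built from peer input-output data. With a single aggregate input and output this collapses to a simple productivity ratio; the sketch below is a drastic simplification of the multi-factor linear program DEA actually solves, with hypothetical numbers:

```python
import numpy as np

# Hypothetical hospitals: one aggregate input (e.g. operating cost)
# and one aggregate output (e.g. adjusted discharges).
inputs = np.array([100.0, 80.0, 120.0, 90.0])
outputs = np.array([50.0, 48.0, 54.0, 27.0])

# Efficiency relative to the best observed output-per-input ratio;
# a score of 1.0 means the unit lies on the frontier.
productivity = outputs / inputs
efficiency = productivity / productivity.max()
```

In the study's setting, these efficiency scores (from the full multi-input, multi-output DEA) are the dependent side of the subsequent Tobit regression on EMR enterprise application strategy.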
Bakbergenuly, Ilyas; Morgenthaler, Stephan
2016-01-01
We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p^, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062
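The ρ-dependence of the transformation bias can be reproduced with a small Monte Carlo sketch (an illustration under assumed parameter values, not the authors' simulation design): cluster-level probabilities are drawn from a beta-binomial model whose intracluster correlation is ρ, and the bias of the arcsine-square-root transform of p-hat is estimated directly.

```python
import numpy as np

def arcsine_bias(p, rho, n=50, n_studies=5000, seed=0):
    """Monte Carlo estimate of the bias of arcsin(sqrt(p_hat)) under a
    beta-binomial model with intracluster correlation rho (rho -> 0
    recovers the plain binomial)."""
    rng = np.random.default_rng(seed)
    if rho > 0:
        s = 1.0 / rho - 1.0  # beta concentration giving ICC = rho
        p_i = rng.beta(p * s, (1 - p) * s, size=n_studies)
    else:
        p_i = np.full(n_studies, p)
    p_hat = rng.binomial(n, p_i) / n
    return np.arcsin(np.sqrt(p_hat)).mean() - np.arcsin(np.sqrt(p))

# the bias magnitude grows roughly linearly with the intracluster correlation
b_small = arcsine_bias(p=0.2, rho=0.02)
b_large = arcsine_bias(p=0.2, rho=0.30)
```

With these (assumed) settings the bias at ρ = 0.30 is an order of magnitude larger than at ρ = 0.02, which is the qualitative behavior the abstract describes.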
NASA Astrophysics Data System (ADS)
Felipe-Sesé, Luis; López-Alba, Elías; Siegmann, Philip; Díaz, Francisco A.
2016-12-01
A low-cost approach for three-dimensional (3-D) full-field displacement measurement is applied for the analysis of large displacements involved in two different mechanical events. The method is based on a combination of fringe projection and two-dimensional digital image correlation (DIC) techniques. The two techniques have been employed simultaneously using an RGB camera and a color encoding method; therefore, it is possible to measure in-plane and out-of-plane displacements at the same time with only one camera, even at high frame rates. The potential of the proposed methodology has been employed for the analysis of large displacements during contact experiments in a soft material block. Displacement results have been successfully compared with those obtained using a 3D-DIC commercial system. Moreover, the analysis of displacements during an impact test on a metal plate was performed to emphasize the application of the methodology for dynamic events. Results show a good level of agreement, highlighting the potential of FP + 2D DIC as a low-cost alternative for the analysis of large-deformation problems.
Interpretation of correlations in clinical research.
Hung, Man; Bounsanga, Jerry; Voss, Maren Wright
2017-11-01
Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
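The sample-size caution is easy to make concrete. Under the usual test of H0: ρ = 0, the t statistic t = r·sqrt((n−2)/(1−r²)) grows with sqrt(n), so a clinically negligible correlation becomes "statistically significant" once n is large enough (a generic illustration, not drawn from the reviewed studies):

```python
import math

def t_statistic(r, n):
    """t statistic for testing H0: rho = 0 given a sample Pearson r."""
    return r * math.sqrt((n - 2) / (1 - r * r))

r = 0.05                         # a clinically negligible correlation
t_small = t_statistic(r, 100)    # well below the 1.96 cutoff
t_large = t_statistic(r, 10000)  # exceeds 1.96: "significant" at alpha = .05
```

The correlation (and hence the effect size, r² = 0.0025) is identical in both cases; only the sample size changed, which is exactly why significance alone should not drive clinical interpretation.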
A new procedure of modal parameter estimation for high-speed digital image correlation
NASA Astrophysics Data System (ADS)
Huňady, Róbert; Hagara, Martin
2017-09-01
The paper deals with the use of 3D digital image correlation in determining modal parameters of mechanical systems. It is a non-contact optical method, which for the measurement of full-field spatial displacements and strains of bodies uses precise digital cameras with high image resolution. Most often this method is utilized for testing of components or determination of material properties of various specimens. In the case of using high-speed cameras for measurement, the correlation system is capable of capturing various dynamic behaviors, including vibration. This enables the potential use of the mentioned method in experimental modal analysis. For that purpose, the authors proposed a measuring chain for the correlation system Q-450 and developed a software application called DICMAN 3D, which allows the direct use of this system in the area of modal testing. The created application provides the post-processing of measured data and the estimation of modal parameters. It has its own graphical user interface, in which several algorithms for the determination of natural frequencies, mode shapes and damping of particular modes of vibration are implemented. The paper describes the basic principle of the new estimation procedure which is crucial in the light of post-processing. Since the FRF matrix resulting from the measurement is usually relatively large, the estimation of modal parameters directly from the FRF matrix may be time-consuming and may occupy a large part of computer memory. The procedure implemented in DICMAN 3D provides a significant reduction in memory requirements and computational time while achieving a high accuracy of modal parameters. Its computational efficiency is particularly evident when the FRF matrix consists of thousands of measurement DOFs. The functionality of the created software application is presented on a practical example in which the modal parameters of a composite plate excited by an impact hammer were determined. 
To verify the obtained results, an additional experiment was conducted in which the vibration responses were measured using conventional acceleration sensors. In both cases, MIMO analysis was performed.
Analysis of Skylab fluid mechanics science demonstrations
NASA Technical Reports Server (NTRS)
Tegart, J. R.; Butz, J. R.
1975-01-01
The results of the data reduction and analysis of the Skylab fluid mechanics demonstrations are presented. All the fluid mechanics data available from the Skylab missions were identified and surveyed. The significant fluid mechanics phenomena were identified and reduced to measurable quantities wherever possible. Data correlations were performed using existing theories. Among the phenomena analyzed were: static low-g interface shapes, oscillation frequency and damping of a liquid drop, coalescence, rotating drop, liquid films and low-g ice melting. A survey of the possible applications of the results was made and future experiments are recommended.
Correlation signatures of wet soils and snows. [algorithm development and computer programming]
NASA Technical Reports Server (NTRS)
Phillips, M. R.
1972-01-01
Algorithm interpretation, analysis, and development have provided the computational tools necessary for soil data processing, handling, and analysis. The algorithms developed thus far are adequate and have been proven successful for several preliminary and fundamental applications such as software interfacing capabilities, probability distributions, grey level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration and ground scene classification. An Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is also described.
DOSY Analysis of Micromolar Analytes: Resolving Dilute Mixtures by SABRE Hyperpolarization.
Reile, Indrek; Aspers, Ruud L E G; Tyburn, Jean-Max; Kempf, James G; Feiters, Martin C; Rutjes, Floris P J T; Tessari, Marco
2017-07-24
DOSY is an NMR spectroscopy technique that resolves resonances according to the analytes' diffusion coefficients. It has found use in correlating NMR signals and estimating the number of components in mixtures. Applications of DOSY in dilute mixtures are, however, held back by excessively long measurement times. We demonstrate herein how the enhanced NMR sensitivity provided by SABRE hyperpolarization allows DOSY analysis of low-micromolar mixtures, thus reducing the concentration requirements by at least 100-fold. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Application of TIMS data in stratigraphic analysis
NASA Technical Reports Server (NTRS)
Lang, H. R.
1986-01-01
An in-progress study demonstrates the utility of Thermal Infrared Multispectral Scanner (TIMS) data for unraveling the stratigraphic sequence of a western interior, North American foreland basin. The TIMS data can be used to determine the stratigraphic distribution of minerals that are diagnostic of specific depositional environments. The thematic mapper (TM) and TIMS data were acquired in the Wind River/Bighorn area of central Wyoming in November 1982 and July 1983, respectively. Combined image processing, photogeologic, and spectral analysis methods were used to: map strata; construct stratigraphic columns; correlate data; and identify mineralogical facies.
Development of Heterostructure Materials for Thermoelectric Device Applications
2005-08-01
morphology changes as thick QDSLs are grown. Therefore, a correlation of strain and film morphology by x-ray and TEM analysis will be important for... triple-axis x-ray analysis and atomic force microscopy (AFM) will be carried out at MIT while thermoelectric measurements will be carried out at... [Figure residue: thermoelectric figure of merit ZT = S^2*sigma*T/kappa (Eq. 1); PbSeTe/PbTe quantum-dot superlattices; materials with ZT >> 1 are of great interest.]
NASA Technical Reports Server (NTRS)
Gradl, Paul
2016-01-01
NASA Marshall Space Flight Center (MSFC) has been advancing dynamic optical measurement systems, primarily Digital Image Correlation, for extreme environment rocket engine test applications. The Digital Image Correlation (DIC) technology is used to track local and full field deformations, displacement vectors, and local and global strain measurements. This technology has been evaluated at MSFC from lab testing to full-scale hotfire engine testing of the J-2X Upper Stage engine at Stennis Space Center. It has been shown to provide reliable measurement data and has replaced many traditional measurement techniques for NASA applications. NASA and AMRDEC have recently signed agreements for NASA to train and transition the technology to applications for missile and helicopter testing. This presentation will provide an overview of the technology and its progression, various testing applications at NASA MSFC, an overview of Army-NASA test collaborations, and application lessons learned about Digital Image Correlation.
NASA Astrophysics Data System (ADS)
Betterle, A.; Radny, D.; Schirmer, M.; Botter, G.
2017-12-01
The spatial correlation of daily streamflows represents a statistical index encapsulating the similarity between hydrographs at two arbitrary catchment outlets. In this work, a process-based analytical framework is utilized to investigate the hydrological drivers of streamflow spatial correlation through an extensive application to 78 pairs of stream gauges belonging to 13 unregulated catchments in the eastern United States. The analysis provides insight on how the observed heterogeneity of the physical processes that control flow dynamics ultimately affects streamflow correlation and spatial patterns of flow regimes. Despite the variability of recession properties across the study catchments, the impact of heterogeneous drainage rates on the streamflow spatial correlation is overwhelmed by the spatial variability of frequency and intensity of effective rainfall events. Overall, model performances are satisfactory, with root mean square errors between modeled and observed streamflow spatial correlation below 10% in most cases. We also propose a method for estimating streamflow correlation in the absence of discharge data, which proves useful to predict streamflow regimes in ungauged areas. The method consists of setting a minimum threshold on the modeled flow correlation to identify hydrologically similar sites. Catchment outlets that are most correlated (ρ>0.9) are found to be characterized by analogous streamflow distributions across a broad range of flow regimes.
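The thresholding rule described in the abstract can be sketched in a few lines (a toy illustration with synthetic hydrographs, not the paper's process-based model; the exponential "recession" kernels are assumptions):

```python
import numpy as np

def flow_correlation(q1, q2):
    """Pearson correlation between two daily streamflow series."""
    return np.corrcoef(np.asarray(q1, float), np.asarray(q2, float))[0, 1]

def hydrologically_similar(q1, q2, threshold=0.9):
    """Flag two outlets as hydrologically similar when their flow
    correlation exceeds the threshold (the screening rule above)."""
    return flow_correlation(q1, q2) > threshold

# synthetic example: two outlets driven by the same rainfall sequence,
# a third by independent rainfall (kernels mimic fast vs slow recession)
rng = np.random.default_rng(1)
rain = rng.exponential(1.0, 365)
kernel = lambda tau: np.exp(-np.arange(10) / tau)
qa = np.convolve(rain, kernel(3.0))[:365]
qb = np.convolve(rain, kernel(4.0))[:365]
qc = np.convolve(rng.exponential(1.0, 365), kernel(3.0))[:365]
```

Despite different recession rates, `qa` and `qb` share the rainfall forcing and correlate strongly, while `qc` does not, mirroring the abstract's finding that rainfall variability dominates the spatial correlation.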
Novel composites for wing and fuselage applications
NASA Technical Reports Server (NTRS)
Sobel, L. H.; Buttitta, C.; Suarez, J. A.
1995-01-01
Probabilistic predictions based on the IPACS code are presented for the material and structural response of unnotched and notched, IM6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is judged poor because IPACS did not have a progressive failure capability at the time this work was performed. The report also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.
Uncertainty Analysis on Heat Transfer Correlations for RP-1 Fuel in Copper Tubing
NASA Technical Reports Server (NTRS)
Driscoll, E. A.; Landrum, D. B.
2004-01-01
NASA is studying kerosene (RP-1) for application in Next Generation Launch Technology (NGLT). Accurate heat transfer correlations in narrow passages at high temperatures and pressures are needed. Hydrocarbon fuels, such as RP-1, produce carbon deposition (coke) along the inside of tube walls when heated to high temperatures. A series of tests to measure the heat transfer using RP-1 fuel and examine the coking were performed in NASA Glenn Research Center's Heated Tube Facility. The facility models regenerative cooling by flowing room temperature RP-1 through resistively heated copper tubing. A regression analysis is performed on the data to determine the heat transfer correlation for Nusselt number as a function of Reynolds and Prandtl numbers. Each measurement and calculation is analyzed to identify sources of uncertainty, including RP-1 property variations. Monte Carlo simulation is used to determine how each uncertainty source propagates through the regression into an overall uncertainty in the predicted heat transfer coefficient. The implications of these uncertainties on engine design and ways to minimize existing uncertainties are discussed.
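The Monte Carlo propagation step can be sketched as follows. The Dittus-Boelter-type form Nu = a·Re^b·Pr^c and the input uncertainty levels are assumptions for illustration, not the correlation or uncertainties obtained in the study:

```python
import numpy as np

def nusselt(Re, Pr, a=0.023, b=0.8, c=0.4):
    """Dittus-Boelter-type correlation Nu = a * Re^b * Pr^c (assumed
    form, not the correlation fitted from the Heated Tube Facility data)."""
    return a * Re**b * Pr**c

def mc_uncertainty(Re, Pr, re_unc=0.03, pr_unc=0.05, n=100_000, seed=0):
    """Propagate relative (1-sigma) input uncertainties in Re and Pr
    through the correlation by Monte Carlo; returns mean and std of Nu."""
    rng = np.random.default_rng(seed)
    Re_s = Re * (1 + re_unc * rng.standard_normal(n))
    Pr_s = Pr * (1 + pr_unc * rng.standard_normal(n))
    nu = nusselt(Re_s, Pr_s)
    return nu.mean(), nu.std()

mean_nu, std_nu = mc_uncertainty(Re=5.0e4, Pr=10.0)
```

The resulting relative spread agrees with the first-order estimate sqrt((b·u_Re)² + (c·u_Pr)²), which is a useful cross-check on any Monte Carlo propagation of a power-law correlation.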
Decoding the auditory brain with canonical component analysis.
de Cheveigné, Alain; Wong, Daniel D E; Di Liberto, Giovanni M; Hjortkjær, Jens; Slaney, Malcolm; Lalor, Edmund
2018-05-15
The relation between a stimulus and the evoked brain response can shed light on perceptual processes within the brain. Signals derived from this relation can also be harnessed to control external devices for Brain Computer Interface (BCI) applications. While the classic event-related potential (ERP) is appropriate for isolated stimuli, more sophisticated "decoding" strategies are needed to address continuous stimuli such as speech, music or environmental sounds. Here we describe an approach based on Canonical Correlation Analysis (CCA) that finds the optimal transform to apply to both the stimulus and the response to reveal correlations between the two. Compared to prior methods based on forward or backward models for stimulus-response mapping, CCA finds significantly higher correlation scores, thus providing increased sensitivity to relatively small effects, and supports classifier schemes that yield higher classification scores. CCA strips the brain response of variance unrelated to the stimulus, and the stimulus representation of variance that does not affect the response, and thus improves observations of the relation between stimulus and response. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
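A minimal CCA can be written directly from covariance matrices (a whitening-plus-SVD sketch on synthetic data; the study's stimulus-response pipeline involves additional preprocessing and cross-validation):

```python
import numpy as np

def canonical_correlations(X, Y, reg=1e-8):
    """Canonical correlations between X (n x p) and Y (n x q), computed
    by whitening the within-set covariances and taking the SVD of the
    whitened cross-covariance. Minimal sketch, not the paper's pipeline."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = X.shape[0]
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)  # canonical correlations

# synthetic "stimulus" and "response" sets sharing one latent component
rng = np.random.default_rng(0)
z = rng.standard_normal((1000, 1))
X = z @ np.ones((1, 5)) + 0.5 * rng.standard_normal((1000, 5))
Y = z @ np.ones((1, 8)) + 0.5 * rng.standard_normal((1000, 8))
rho = canonical_correlations(X, Y)
```

The first canonical correlation is close to 1 because both sets share the latent `z`, while the remaining correlations stay near chance level, which is how CCA "strips variance" unrelated to the shared signal.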
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ridgley, Jennie
2001-08-21
The purpose of the phase 2 Mesaverde study part of the Department of Energy funded project ''Analysis of oil-bearing Cretaceous Sandstone Hydrocarbon Reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico'' was to define the facies of the oil-producing units within the subsurface units of the Mesaverde Group and integrate these results with outcrop studies that defined the depositional environments of these facies within a sequence stratigraphic context. This report focuses on (1) integration of subsurface correlations with outcrop correlations of components of the Mesaverde, (2) application of the sequence stratigraphic model determined in the phase one study to these correlations, (3) determination of the facies distribution of the Mesaverde Group and their relationship to sites of oil and gas accumulation, (4) evaluation of the thermal maturity and potential source rocks for oil and gas in the Mesaverde Group, and (5) evaluation of the structural features on the Reservation as they may control sites of oil accumulation.
Astray, G; Soto, B; Lopez, D; Iglesias, M A; Mejuto, J C
2016-01-01
Transit data analysis and artificial neural networks (ANNs) have proven to be useful tools for characterizing and modelling non-linear hydrological processes. In this paper, these methods have been used to characterize and to predict the discharge of Lor River (North Western Spain), 1, 2 and 3 days ahead. Transit data analyses show a coefficient of correlation of 0.53 for a lag between precipitation and discharge of 1 day. On the other hand, temperature and discharge have a negative coefficient of correlation (-0.43) for a delay of 19 days. The ANNs developed provide good results for the validation period, with R(2) between 0.92 and 0.80. Furthermore, these prediction models have been tested with discharge data from a period 16 years later. Results of this testing period also show a good correlation, with R(2) between 0.91 and 0.64. Overall, results indicate that ANNs are a good tool to predict river discharge with a small number of input variables.
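The lag analysis behind the 1-day precipitation-discharge correlation can be sketched generically (synthetic data below; the real study used observed Lor River records):

```python
import numpy as np

def best_lag(precip, discharge, max_lag=10):
    """Return the lag (in days) at which the precipitation-discharge
    correlation peaks, plus the whole lag-correlation curve."""
    corrs = []
    for lag in range(max_lag + 1):
        a = precip[:len(precip) - lag]  # for lag=0 this is the full series
        b = discharge[lag:]
        corrs.append(np.corrcoef(a, b)[0, 1])
    return int(np.argmax(corrs)), corrs

# synthetic series: discharge responds to rainfall one day later
rng = np.random.default_rng(2)
precip = rng.exponential(2.0, 400)
discharge = np.roll(precip, 1) + 0.1 * rng.standard_normal(400)
lag, corrs = best_lag(precip, discharge)
```

The recovered lag (1 day here, by construction) is the kind of diagnostic that then informs which lagged inputs to feed an ANN discharge model.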
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ziatdinov, Maxim A.; Fujii, Shintaro; Kiguchi, Manabu
The link between changes in the material crystal structure and its mechanical, electronic, magnetic, and optical functionalities known as the structure-property relationship is the cornerstone of contemporary materials science research. The recent advances in scanning transmission electron and scanning probe microscopies (STEM and SPM) have opened an unprecedented path towards examining the materials structure-property relationships on the single-impurity and atomic-configuration levels. Lacking, however, are the statistics-based approaches for cross-correlation of structure and property variables obtained in different information channels of the STEM and SPM experiments. Here we have designed an approach based on a combination of sliding window Fast Fourier Transform, Pearson correlation matrix, and linear and kernel canonical correlation to study the relationship between lattice distortions and electron scattering from the SPM data on graphene with defects. Our analysis revealed that the strength of coupling to strain is altered between different scattering channels, which can explain the coexistence of several quasiparticle interference patterns in the nanoscale regions of interest. In addition, the application of the kernel functions allowed us to extract a non-linear component of the relationship between the lattice strain and scattering intensity in graphene. Lastly, the outlined approach can be further utilized to analyze correlations in various multi-modal imaging techniques where the information of interest is spatially distributed and usually has a complex multidimensional nature.
[Nitrogen status diagnosis of rice by using a digital camera].
Jia, Liang-Liang; Fan, Ming-Sheng; Zhang, Fu-Suo; Chen, Xin-Ping; Lü, Shi-Hua; Sun, Yan-Ming
2009-08-01
In the present research, a field experiment with different N application rates was conducted to study the possibility of using visible band color analysis methods to monitor the N status of rice canopy. The correlations of visible spectrum band color intensity between rice canopy images acquired from a digital camera and conventional nitrogen status diagnosis parameters of leaf SPAD chlorophyll meter readings, total N content, upland biomass and N uptake were studied. The results showed that the red color intensity (R), green color intensity (G) and normalized redness intensity (NRI) have significant inverse linear correlations with the conventional N diagnosis parameters of SPAD readings, total N content, upland biomass and total N uptake. The correlation coefficient values (r) were from -0.561 to -0.714 for red band (R), from -0.452 to -0.505 for green band (G), and from -0.541 to -0.817 for normalized redness intensity (NRI). But the normalized greenness intensity (NGI) showed a significant positive correlation with conventional N parameters and the correlation coefficient values (r) were from 0.505 to 0.559. Compared with SPAD readings, the normalized redness intensity (NRI), with a high r value of 0.541-0.780 with conventional N parameters, could better express the N status of rice. The digital image color analysis method showed the potential of being used in rice N status diagnosis in the future.
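For illustration, the normalized intensities can be computed from mean channel values, assuming the common definitions NRI = R/(R+G+B) and NGI = G/(R+G+B); the exact formulas used in the study may differ:

```python
import numpy as np

def normalized_rgb_indices(image):
    """Mean normalized redness (NRI) and greenness (NGI) of an RGB image,
    assuming NRI = R/(R+G+B) and NGI = G/(R+G+B) per pixel (assumed
    definitions, not necessarily the study's)."""
    img = np.asarray(image, dtype=float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    total = r + g + b + 1e-9  # guard against division by zero
    return (r / total).mean(), (g / total).mean()

# hypothetical patches: a well-fertilized (darker green) canopy vs a
# paler, N-deficient one with relatively higher redness
green_patch = np.tile([40, 120, 30], (8, 8, 1))
pale_patch = np.tile([90, 110, 40], (8, 8, 1))
nri_green, ngi_green = normalized_rgb_indices(green_patch)
nri_pale, ngi_pale = normalized_rgb_indices(pale_patch)
```

The higher NRI of the pale patch is consistent with the abstract's inverse NRI-nitrogen correlation: redness rises as canopy N status declines.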
Zhang, Qingyang
2018-05-16
Differential co-expression analysis, as a complement of differential expression analysis, offers significant insights into the changes in molecular mechanisms of different phenotypes. A prevailing approach to detecting differentially co-expressed genes is to compare Pearson's correlation coefficients in two phenotypes. However, due to the limitations of Pearson's correlation measure, this approach lacks the power to detect nonlinear changes in gene co-expression, which are common in gene regulatory networks. In this work, a new nonparametric procedure is proposed to search for differentially co-expressed gene pairs in different phenotypes from large-scale data. Our computational pipeline consists of two main steps: a screening step and a testing step. The screening step reduces the search space by filtering out all the independent gene pairs using the distance correlation measure. In the testing step, we compare the gene co-expression patterns in different phenotypes by a recently developed edge-count test. Both steps are distribution-free and target nonlinear relations. We illustrate the promise of the new approach by analyzing the Cancer Genome Atlas data and the METABRIC data for breast cancer subtypes. Compared with some existing methods, the new method is more powerful in detecting nonlinear types of differential co-expression. The distance correlation screening can greatly improve computational efficiency, facilitating its application to large data sets.
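The screening statistic can be implemented compactly. The sketch below is the standard biased sample distance correlation (Székely-style double centering), shown detecting a quadratic dependence that Pearson's r misses; it is an illustration, not the authors' code:

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation: zero (in population) only under
    independence, so it detects nonlinear association that the Pearson
    coefficient can miss entirely."""
    x = np.asarray(x, float).reshape(-1, 1)
    y = np.asarray(y, float).reshape(-1, 1)
    a = np.abs(x - x.T)  # pairwise distance matrices
    b = np.abs(y - y.T)
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()  # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y_nonlinear = x**2 + 0.1 * rng.standard_normal(500)  # dependent, Pearson near 0
y_indep = rng.standard_normal(500)
dc_nl = distance_correlation(x, y_nonlinear)
dc_ind = distance_correlation(x, y_indep)
```

A screening rule would keep the pair `(x, y_nonlinear)` (clearly nonzero distance correlation) and filter out `(x, y_indep)`, exactly the behavior the pipeline relies on.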
Koerner, Tess K.; Zhang, Yang
2017-01-01
Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining the strength of association between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures, as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages as well as the necessity to apply mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers. PMID:28264422
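The core pitfall, pooling repeated measures as if they were independent, can be demonstrated in a few lines: with subject-specific baselines, the pooled Pearson correlation can even reverse the sign of the within-subject relationship (a constructed illustration, not data from the reviewed studies; mixed-effects models address this by treating subject as a random effect):

```python
import numpy as np

# Each "subject" shows a perfectly positive within-subject relationship
# between a neural measure x and behavior y, but baselines differ across
# subjects so the pooled correlation comes out strongly negative.
t = np.linspace(0.0, 1.0, 20)  # within-subject condition variable
x_all, y_all, within = [], [], []
for subj in range(5):
    xs = 10.0 * subj + t   # higher baseline x for later subjects
    ys = -10.0 * subj + t  # lower baseline y for later subjects
    x_all.append(xs)
    y_all.append(ys)
    within.append(np.corrcoef(xs, ys)[0, 1])  # = 1 within each subject

pooled_r = np.corrcoef(np.concatenate(x_all), np.concatenate(y_all))[0, 1]
mean_within_r = float(np.mean(within))
```

Here `mean_within_r` is +1 while `pooled_r` is near −1: the between-subject baseline variance dominates the pooled estimate, which is precisely the structure an LME random intercept absorbs.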
Design and analysis of photonic crystal coupled cavity arrays for quantum simulation
NASA Astrophysics Data System (ADS)
Majumdar, Arka; Rundquist, Armand; Bajcsy, Michal; Dasika, Vaishno D.; Bank, Seth R.; Vučković, Jelena
2012-11-01
We performed an experimental study of coupled optical cavity arrays in a photonic crystal platform. We find that the coupling between the cavities is significantly larger than the fabrication-induced disorder in the cavity frequencies. Satisfying this condition is necessary for using such cavity arrays to generate strongly correlated photons, which has potential application in the quantum simulation of many-body systems.
Christoper J. Schmitt; A. Dennis Lemly; Parley V. Winger
1993-01-01
Data from several sources were collated and analyzed by correlation, regression, and principal components analysis to define surrogate variables for use in the brook trout (Salvelinus fontinalis) habitat suitability index (HSI) model, and to evaluate the applicability of the model for assessing habitat in high elevation streams of the southern Blue Ridge Province (...
ERIC Educational Resources Information Center
Omel'chenko, E. L.
2015-01-01
The article looks at the experience of studying young people in today's Russia and the way the experience correlates with Western traditions of research. The analysis that is proposed is oriented toward understanding the analytical and empirical potential of the concept of solidarity applicable to the current agenda. [This article was translated…
Statistical modeling of space shuttle environmental data
NASA Technical Reports Server (NTRS)
Tubbs, J. D.; Brewer, D. W.
1983-01-01
Statistical models which use a class of bivariate gamma distributions are examined. Topics discussed include: (1) the ratio of positively correlated gamma variates; (2) a method to determine if unequal shape parameters are necessary in the bivariate gamma distribution; (3) differential equations for the modal location of a family of bivariate gamma distributions; and (4) analysis of some wind gust data using the analytical results developed for modeling applications.
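Positively correlated gamma variates of the kind analyzed in topic (1) can be constructed with the classic shared-component device (one standard construction of a bivariate gamma; the abstract does not specify which construction the report uses):

```python
import numpy as np

def correlated_gammas(shape1, shape2, shared, size, seed=0):
    """Positively correlated gamma variates via a shared component:
    X = G1 + G3, Y = G2 + G3 with independent unit-scale gammas.
    corr(X, Y) = shared / sqrt((shape1 + shared) * (shape2 + shared))."""
    rng = np.random.default_rng(seed)
    g1 = rng.gamma(shape1, size=size)
    g2 = rng.gamma(shape2, size=size)
    g3 = rng.gamma(shared, size=size)  # common component induces correlation
    return g1 + g3, g2 + g3

x, y = correlated_gammas(shape1=2.0, shape2=2.0, shared=3.0, size=50_000)
r = np.corrcoef(x, y)[0, 1]  # theoretical value: 3/5 = 0.6
```

Because only the shared component contributes to the covariance, the correlation is tunable through the `shared` shape parameter, which makes this construction convenient for simulating quantities such as correlated wind gust magnitudes.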
Amplitude envelope correlations measure synchronous cortical oscillations in performing musicians.
Zamm, Anna; Debener, Stefan; Bauer, Anna-Katharina R; Bleichner, Martin G; Demos, Alexander P; Palmer, Caroline
2018-05-14
A major question facing cognitive neuroscience is measurement of interbrain synchrony between individuals performing joint actions. We describe the application of a novel method for measuring musicians' interbrain synchrony: amplitude envelope correlations (AECs). Amplitude envelopes (AEs) reflect energy fluctuations in cortical oscillations over time; AE correlations measure the degree to which two envelope fluctuations are temporally correlated, such as cortical oscillations arising from two individuals performing a joint action. Wireless electroencephalography was recorded from two pianists performing a musical duet; an analysis pipeline is described for computing AEs of cortical oscillations at the duet performance frequency (number of tones produced per second) to test whether these oscillations reflect the temporal dynamics of partners' performances. The pianists' AE correlations were compared with correlations based on a distribution of AEs simulated from white noise signals using the same methods. The AE method was also applied to the temporal characteristics of the pianists' performances, to show that the observed pair's AEs reflect the temporal dynamics of their performance. AE correlations offer a promising approach for assessing interbrain correspondences in cortical activity associated with performing joint tasks. © 2018 New York Academy of Sciences.
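The AE computation can be sketched with an FFT-based Hilbert transform (a minimal illustration on synthetic amplitude-modulated signals; the published pipeline works on band-passed EEG at the duet performance frequency):

```python
import numpy as np

def amplitude_envelope(x):
    """Amplitude envelope |analytic signal| via an FFT-based Hilbert
    transform (minimal sketch; real pipelines band-pass filter first)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0  # double positive frequencies
    return np.abs(np.fft.ifft(X * h))

# two "channels": carriers at different frequencies sharing one slow
# modulator, as when two performers' oscillations track the same tempo
fs, dur = 500, 4.0
t = np.arange(int(fs * dur)) / fs
modulator = 1.0 + 0.5 * np.sin(2 * np.pi * 1.0 * t)  # 1 Hz "tempo"
ch1 = modulator * np.sin(2 * np.pi * 40.0 * t)
ch2 = modulator * np.sin(2 * np.pi * 47.0 * t)
aec = np.corrcoef(amplitude_envelope(ch1), amplitude_envelope(ch2))[0, 1]
```

Although the carriers differ, the envelopes track the common modulator, so the AE correlation is near 1; comparing such values against AEs of white-noise surrogates (as the authors do) guards against spurious correlation.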
Optical diffusion property of chicken tissue
NASA Astrophysics Data System (ADS)
Schneider, Patricia S.; Flamholz, Alex; Wong, Peter K.; Lieberman, David H.; Cheung, Tak D.; Itoka, Harriet; Minott, Troy; Quizhpi, Janie; Rodriguez, Jacquelin
2004-11-01
Chicken tissue acts as a turbid medium at optical wavelengths. Optical characterization data of fresh chicken dark and white meat were studied using the theory of light diffusion. The Gaussian-like transmission profile was used to determine the transport mean free path and absorption. The refractive index, a fundamental parameter, was extracted via transmission correlation function analysis without using index-matching fluid. The variation in refractive index also produced various small shifts in the oscillatory feature of the intensity spatial correlation function at distances shorter than the transport mean free path. The optical system was calibrated with porous silicate slabs containing different water contents and also with a solid alumina slab. The result suggested that the selective scattering/absorption of myoglobin and mitochondria in the dark tissues is consistent with the transmission data. The refractive index was similar for dark and white tissues at the He-Ne wavelength, suggesting that the index could serve as a marker for quality control. Application to chicken lunchmeat samples revealed that higher protein and lower carbohydrate would shift the correlation toward smaller distance. The pure fat refractive index was different from that of the meat tissue. Application of the refractive index as a fat marker is also discussed.
Predicting Intention to Perform Breast Self-Examination: Application of the Theory of Reasoned Action
Dewi, Triana Kesuma; Zein, Rizqy Amelia
2017-01-01
Objective: The present study aimed to examine the applicability of the theory of reasoned action to explain intention to perform breast self-examination (BSE). Methods: A questionnaire was constructed to collect data. The hypothesis was tested in two steps. First, to assess the strength of the correlation among the constructs of the theory of reasoned action (TRA), Pearson's product moment correlations were applied. Second, multivariate relationships among the constructs were examined by performing hierarchical multiple linear regression analysis. Result: The findings supported the TRA model, explaining 45.8% of the variance in the students' BSE intention, which was significantly correlated with attitude (r = 0.609, p < 0.001) and subjective norms (r = 0.420, p < 0.001). Conclusion: TRA could be a suitable model to predict BSE intentions. Participants who believed that doing BSE regularly is beneficial for early diagnosis of breast cancer, and who also believed that their significant referents think that doing BSE would detect breast cancer earlier, were more likely to intend to perform BSE regularly. Therefore, the research findings support the conclusion that promoting the importance of BSE at the community/social level would encourage individuals to perform BSE routinely. PMID:29172263
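The two-step analysis described above (zero-order Pearson correlations, then hierarchical regression blocks whose R² gain shows each construct's incremental contribution) can be sketched as follows. The data here are simulated for illustration; the coefficients are not the study's.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def r_squared(X, y):
    """Variance explained by an OLS fit of y on the columns of X
    (an intercept column is added automatically)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean())))

# Simulated TRA-style constructs: intention driven by attitude and norms.
rng = np.random.default_rng(1)
attitude = rng.normal(size=200)
norms = 0.4 * attitude + rng.normal(scale=0.9, size=200)
intention = 0.6 * attitude + 0.3 * norms + rng.normal(scale=0.8, size=200)

# Step 1: zero-order correlation of intention with each construct.
r_att = pearson_r(intention, attitude)
# Step 2: hierarchical blocks; the R^2 increase quantifies what
# subjective norms add beyond attitude alone.
r2_step1 = r_squared(attitude[:, None], intention)
r2_step2 = r_squared(np.column_stack([attitude, norms]), intention)
```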
NASA Technical Reports Server (NTRS)
Coulbourn, W. C.; Egan, W. G.; Olsen, D. A. (Principal Investigator); Heaslip, G. B.
1973-01-01
The author has identified the following significant results. The boundaries of application of ERTS-1 and aircraft data are established for St. Thomas harbor within which useful water quality information can be obtained. In situ physical, chemical, and biological water quality and benthic data were collected. Moored current meters were employed. Optical measurements of solar irradiance, color test panel radiance and water absorption were taken. Procedures for correlating in situ optical, biological, and chemical data with underflight aircraft I2S data and ERTS-1 MSS scanner data are presented. Comparison of bulk and precision CCT computer printout data for this application is made, and a simple method for geometrically locating bulk data individual pixels based on land-water interface is described. ERTS spacecraft data and I2S aircraft imagery are correlated with optical in situ measurements of the harbor water, with the aircraft green photographic and ERTS-1 MSS-4 bands being the most useful. The biological pigments correlate inversely with the optical data for inshore areas and directly further seaward. Automated computer data processing facilitated analysis.
NASA Astrophysics Data System (ADS)
Shiryaev, A. A.; Voloshchuk, A. M.; Volkov, V. V.; Averin, A. A.; Artamonova, S. D.
2017-05-01
Furfural-derived sorbents and activated carbonaceous fibers were studied using small- and wide-angle X-ray scattering (SWAXS), X-ray diffraction and multiwavelength Raman spectroscopy after storage at ambient conditions. Correlations of structural features with degree of activation and with sorption parameters are observed for samples obtained from a common precursor and differing in duration of activation. However, the correlations are not necessarily applicable to carbons obtained from different precursors. Using two independent approaches, we show that treatment of SWAXS results should be performed with careful analysis of the applicability of the Porod law to the sample under study. In the general case of a pore with a rough or corrugated surface, deviations from the Porod law may become significant and reflect the structure of the pore-carbon interface. Ignoring these features may invalidate the extraction of closed-porosity values. In most cases the pore-matrix interface in the studied samples is not atomically sharp, but is characterized by 1D or 2D fluctuations of electronic density responsible for deviations from the Porod law. The intensity of the pore-related small-angle scattering correlates positively with SBET values obtained from N2 adsorption.
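A quick way to check the applicability of the Porod law, as the abstract recommends: fit the high-q tail of the scattering curve in log-log coordinates; a sharp, smooth pore-matrix interface gives I(q) ∝ q⁻⁴, while a rough or diffuse interface shifts the exponent away from 4. The curves below are synthetic power laws for illustration only.

```python
import numpy as np

def porod_exponent(q, intensity):
    """Fit log I = log A - n log q over the supplied high-q tail and
    return n. Porod's law for a sharp, smooth interface predicts n ~ 4;
    a significant deviation flags a rough/corrugated pore surface."""
    slope, _ = np.polyfit(np.log(q), np.log(intensity), 1)
    return -slope

# Synthetic high-q tails (arbitrary units).
q = np.linspace(0.5, 2.0, 50)
i_sharp = 7.0 * q ** -4.0    # ideal Porod scattering
i_rough = 7.0 * q ** -3.4    # deviation typical of a corrugated interface

n_sharp = porod_exponent(q, i_sharp)
n_rough = porod_exponent(q, i_rough)
```

In practice one would fit over a q-range chosen after inspecting the data, and only trust closed-porosity estimates when n stays close to 4.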
Fluctuation-dissipation theory of input-output interindustrial relations
NASA Astrophysics Data System (ADS)
Iyetomi, Hiroshi; Nakayama, Yasuhiro; Aoyama, Hideaki; Fujiwara, Yoshi; Ikeda, Yuichi; Souma, Wataru
2011-01-01
In this study, the fluctuation-dissipation theory is invoked to shed light on input-output interindustrial relations at a macroscopic level by its application to indices of industrial production (IIP) data for Japan. Statistical noise arising from finiteness of the time series data is carefully removed by making use of the random matrix theory in an eigenvalue analysis of the correlation matrix; as a result, two dominant eigenmodes are detected. Our previous study successfully used these two modes to demonstrate the existence of intrinsic business cycles. Here a correlation matrix constructed from the two modes describes genuine interindustrial correlations in a statistically meaningful way. Furthermore, it enables us to quantitatively discuss the relationship between shipments of final demand goods and production of intermediate goods in a linear response framework. We also investigate distinctive external stimuli for the Japanese economy exerted by the current global economic crisis. These stimuli are derived from residuals of moving-average fluctuations of the IIP remaining after subtracting the long-period components arising from inherent business cycles. The observation reveals that the fluctuation-dissipation theory is applicable to an economic system that is supposed to be far from physical equilibrium.
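The noise-removal step described above can be sketched with random matrix theory: eigenvalues of the correlation matrix that exceed the Marchenko-Pastur upper edge expected from pure sampling noise are retained as genuine collective modes. The data below are simulated (one common "business cycle" factor across series); the thresholding rule is the standard RMT recipe, not necessarily the authors' exact procedure.

```python
import numpy as np

def dominant_modes(series):
    """Eigenmodes of the correlation matrix whose eigenvalues exceed the
    Marchenko-Pastur upper bound (1 + sqrt(N/T))^2 expected for
    uncorrelated unit-variance data with T observations of N series."""
    T, N = series.shape
    lam_max = (1 + np.sqrt(N / T)) ** 2
    z = (series - series.mean(axis=0)) / series.std(axis=0)
    corr = (z.T @ z) / T
    evals, evecs = np.linalg.eigh(corr)      # ascending eigenvalues
    keep = evals > lam_max
    return evals[keep], evecs[:, keep]

# Simulated production indices: 30 sectors, 400 months, one shared mode.
rng = np.random.default_rng(2)
T, N = 400, 30
common = rng.standard_normal(T)
data = 0.8 * common[:, None] + rng.standard_normal((T, N))

evals, evecs = dominant_modes(data)
```

With one planted common factor, a single eigenvalue should emerge well above the noise edge; the eigenvector gives each sector's loading on that collective mode.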
Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong
2014-07-01
Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS was explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts, and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Dos Santos, Rafael Aparecido; Derhon, Viviane; Brandalize, Michelle; Brandalize, Danielle; Rossi, Luciano Pavan
2017-07-01
Goniometers are commonly used to measure range of motion in the musculoskeletal system, and smartphone goniometry applications have recently become available to clinicians. The aim was to compare angular measurements obtained with a universal goniometer and a smartphone application. Thirty-four healthy women with at least 20° of limited range of motion regarding knee extension were recruited. Knee flexion angles of the dominant limb were measured with a universal goniometer and the ROM© goniometric application for the smartphone. Three trained examiners compared the two assessment tools. Strong correlations were found between the measures of the universal goniometer and the smartphone application (Pearson's correlation and intraclass correlation coefficient > 0.93). The measurements with both devices demonstrated low dispersion and little variation. Measurements obtained using the smartphone goniometric application are as reliable as those of a universal goniometer. This application is therefore a useful tool for the evaluation of knee range of motion. Copyright © 2016 Elsevier Ltd. All rights reserved.
Flight motor set 360L001 (STS-26R). (Reconstructed dynamic loads analysis)
NASA Technical Reports Server (NTRS)
Call, V. B.
1989-01-01
A transient analysis was performed to correlate the predicted versus measured behavior of the Redesigned Solid Rocket Booster (RSRB) during Flight 360L001 (STS-26R) liftoff. Approximately 9 accelerometers, 152 strain gages, and 104 girth gages were bonded to the motors during this event. Prior to Flight 360L001, a finite element model of the RSRB was analyzed to predict the accelerations, strains, and displacements measured by this developmental flight instrumentation (DFI) within an order of magnitude. Subsequently, an analysis has been performed which uses actual Flight 360L001 liftoff loading conditions, and makes more precise predictions for the RSRB structural behavior. Essential information describing the analytical model, analytical techniques used, correlation of the predicted versus measured RSRB behavior, and conclusions, are presented. A detailed model of the RSRB was developed and correlated for use in analyzing the motor behavior during liftoff loading conditions. This finite element model, referred to as the RSRB global model, uses super-element techniques to model all components of the RSRB. The objective of the RSRB global model is to accurately predict deflections and gap openings in the field joints to an accuracy of approximately 0.001 inch. The model of the field joint component was correlated to Referee and Joint Environment Simulation (JES) tests. The accuracy of the assembled RSRB global model was validated by correlation to static-fire tests such as DM-8, DM-9, QM-7, and QM-8. This validated RSRB global model was used to predict RSRB structural behavior and joint gap opening during Flight 360L001 liftoff. The results of a transient analysis of the RSRB global model with imposed liftoff loading conditions are presented. Rockwell used many gage measurements to reconstruct the load parameters which were imposed on the RSRB during the Flight 360L001 liftoff. Each load parameter, and its application, is described.
Also presented are conclusions and recommendations based on the analysis of this load case and the resulting correlation between predicted and measured RSRB structural behavior.
Achour, Brahim; Dantonio, Alyssa; Niosi, Mark; Novak, Jonathan J; Al-Majdoub, Zubida M; Goosen, Theunis C; Rostami-Hodjegan, Amin; Barber, Jill
2018-06-01
Quantitative proteomic methods require optimization at several stages, including sample preparation, liquid chromatography-tandem mass spectrometry (LC-MS/MS), and data analysis, with the final analysis stage being less widely appreciated by end-users. Previously reported measurement of eight uridine-5'-diphospho-glucuronosyltransferases (UGT) generated by two laboratories [using stable isotope-labeled (SIL) peptides or quantitative concatemer (QconCAT)] reflected significant disparity between proteomic methods. Initial analysis of QconCAT data showed lack of correlation with catalytic activity for several UGTs (1A4, 1A6, 1A9, 2B15) and moderate correlations for UGTs 1A1, 1A3, and 2B7 (Rs = 0.40-0.79, P < 0.05; R2 = 0.30); good correlations were demonstrated between cytochrome P450 activities and abundances measured in the same experiments. Consequently, a systematic review of data analysis, starting from unprocessed LC-MS/MS data, was undertaken, with the aim of improving accuracy, defined by correlation against activity. Three main criteria were found to be important: choice of monitored peptides and fragments, correction for isotope-label incorporation, and abundance normalization using fractional protein mass. Upon optimization, abundance-activity correlations improved significantly for six UGTs (Rs = 0.53-0.87, P < 0.01; R2 = 0.48-0.73); UGT1A9 showed moderate correlation (Rs = 0.47, P = 0.02; R2 = 0.34). No spurious abundance-activity relationships were identified. However, methods remained suboptimal for UGT1A3 and UGT1A9; here hydrophobicity of standard peptides is believed to be limiting. This commentary provides a detailed data analysis strategy and indicates, using examples, the significance of systematic data processing following acquisition. The proposed strategy offers significant improvement on existing guidelines applicable to clinically relevant proteins quantified using QconCAT.
Copyright © 2018 by The American Society for Pharmacology and Experimental Therapeutics.
NASA Astrophysics Data System (ADS)
Facheris, L.; Tanelli, S.; Giuli, D.
A method is presented for analyzing storm motion through the application of a nowcasting technique based on radar echo tracking through multiscale correlation. The application of the correlation principle to weather radar image processing, the so-called TREC (Tracking Radar Echoes by Correlation) and derived algorithms, is described in [1] and in the references cited therein. The block matching approach exploited there is typical of video compression applications, whose purpose is to remove the temporal correlation between two subsequent frames of a sequence of images. In particular, the wavelet decomposition approach to motion estimation seems particularly suitable for weather radar maps. In fact, block matching is particularly efficient when the images have a sufficient level of contrast. Though this does not hold for original-resolution radar maps, sufficient contrast can easily be obtained by changing the resolution level by means of the wavelet decomposition. The technique first proposed in [2] (TREMC, Tracking of Radar Echoes by means of Multiscale Correlation) adopts a multiscale, multiresolution, and partially overlapped block grid which adapts to the radar reflectivity pattern. Multiresolution decomposition is performed through 2D wavelet-based filtering. Correlation coefficients are calculated after preliminary screening of unreliable data (e.g. those affected by ground clutter or beam shielding), so as to avoid strong undesired motion estimation biases due to the presence of stationary features. Such features are detected by a previous analysis carried out as discussed in [2]. In this paper, motion fields obtained by analyzing precipitation events over the Arno river basin are compared to the related Doppler velocity fields in order to identify growth and decay areas and orographic effects.
Data presented have been collected by the weather radar station POLAR 55C sited in Montagnana (Firenze, Italy), a polarimetric C-band system providing absolute and differential reflectivity maps, mean Doppler velocity and Doppler spread maps with a resolution of 125/250 m [3].
[1] Li L., Schmid W. and Joss J., "Nowcasting of motion and growth of precipitation with radar over a complex orography", Journal of Applied Meteorology, vol. 34, pp. 1286-1300, 1995.
[2] L. Facheris, S. Tanelli, F. Argenti, D. Giuli, "Wavelet Applications to Multiparameter Weather Radar Analysis", in Information Processing for Remote Sensing, C. H. Chen Ed., World Scientific Publishing Co., pp. 187-207, 1999.
[3] Scarchilli G., Gorgucci E., Giuli D., Facheris L., Freni A. and Vezzani G., "Arno Project: Radar System and objectives", Proceedings 25th International Conference on Radar Meteorology, Paris, France, 24-28 June 1991, pp. 805-808.
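The core block-matching step of TREC-style echo tracking can be sketched as follows: slide a reference block from the earlier scan over a search window in the later scan and keep the displacement maximizing the correlation coefficient. This toy omits the wavelet multiscale layer and data screening described above; the field is a synthetic reflectivity map advected by a known shift.

```python
import numpy as np

def match_block(prev, curr, y0, x0, size, search):
    """TREC-style block matching: slide the (size x size) reference block
    of `prev` (top-left corner at y0, x0) over `curr` within +/- `search`
    pixels and return the displacement with the highest correlation."""
    ref = prev[y0:y0 + size, x0:x0 + size].astype(float)
    ref = ref - ref.mean()
    best_r, best = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = curr[y0 + dy:y0 + dy + size,
                        x0 + dx:x0 + dx + size].astype(float)
            cand = cand - cand.mean()
            denom = np.sqrt((ref ** 2).sum() * (cand ** 2).sum())
            r = (ref * cand).sum() / denom if denom > 0 else 0.0
            if r > best_r:
                best_r, best = r, (dy, dx)
    return best, best_r

# Synthetic reflectivity field advected by (2, 3) pixels between two scans.
rng = np.random.default_rng(3)
prev = rng.random((40, 40))
curr = np.roll(np.roll(prev, 2, axis=0), 3, axis=1)

(dy, dx), r = match_block(prev, curr, 10, 10, 12, 5)
```

Dividing the recovered pixel displacement by the scan interval yields the local echo-motion vector; TREMC repeats this on a partially overlapped, multiscale block grid.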
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, X; Li, Z; Zheng, D
Purpose: In the context of evaluating dosimetric impacts of a variety of uncertainties involved in HDR Tandem-and-Ovoid treatment, to study the correlations between conventional point doses and 3D volumetric doses. Methods: For 5 cervical cancer patients treated with HDR T&O, 150 plans were retrospectively created to study dosimetric impacts of the following uncertainties: (1) inter-fractional applicator displacement between two treatment fractions within a single insertion by applying the Fraction#1 plan to the Fraction#2 CT; (2) positional dwell error simulated from −5mm to 5mm in 1mm steps; (3) simulated temporal dwell error of 0.05s, 0.1s, 0.5s, and 1s. The original plans were based on point dose prescription, from which the volume covered by the prescription dose was generated as the pseudo target volume to study the 3D target dose effect. OARs were contoured. The point and volumetric dose errors were calculated by taking the differences between original and simulated plans. The correlations between the point and volumetric dose errors were analyzed. Results: For the most clinically relevant positional dwell uncertainty of 1mm, temporal uncertainty of 0.05s, and inter-fractional applicator displacement within the same insertion, the mean target D90 and V100 deviations were within 1%. Among these uncertainties, the applicator displacement showed the largest potential target coverage impact (2.6% on D90) as well as the OAR dose impact (2.5% and 3.4% on bladder D2cc and rectum D2cc). The Spearman correlation analysis shows a correlation coefficient of 0.43 with a p-value of 0.11 between target D90 coverage and H point dose. Conclusion: With the most clinically relevant positional and temporal dwell uncertainties and patient inter-fractional applicator displacement within the same insertion, the dose error is within the clinically acceptable range.
The lack of correlation between H point and 3D volumetric dose errors is a motivator for the use of 3D treatment planning in cervical HDR brachytherapy.
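The Spearman analysis used above (rank correlation between per-plan point-dose errors and volumetric-dose errors) can be sketched as follows. The per-plan error values are invented for illustration; only the rank-correlation mechanics are shown.

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with midranks assigned to tied values."""
    def ranks(v):
        order = np.argsort(v)
        r = np.empty(len(v), float)
        r[order] = np.arange(1, len(v) + 1)
        for val in np.unique(v):            # average ranks over ties
            mask = v == val
            r[mask] = r[mask].mean()
        return r

    rx = ranks(np.asarray(x, float))
    ry = ranks(np.asarray(y, float))
    rx, ry = rx - rx.mean(), ry - ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical per-plan errors (%): H-point dose vs target D90.
h_point_err = [0.2, -1.1, 0.5, 1.8, -0.4, 0.9, -2.0, 1.2]
d90_err = [0.1, -0.3, 0.8, 1.0, 0.2, -0.5, -1.4, 1.5]

rho = spearman_rho(h_point_err, d90_err)
```

A rho near zero (with a non-significant p-value, computed separately) is what would indicate, as in the abstract, that point-dose errors do not track volumetric-dose errors.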
Hebert, Benedict; Costantino, Santiago; Wiseman, Paul W
2005-05-01
We introduce a new extension of image correlation spectroscopy (ICS) and image cross-correlation spectroscopy (ICCS) that relies on complete analysis of both the temporal and spatial correlation lags for intensity fluctuations from a laser-scanning microscopy image series. This new approach allows measurement of both diffusion coefficients and velocity vectors (magnitude and direction) for fluorescently labeled membrane proteins in living cells through monitoring of the time evolution of the full space-time correlation function. By using filtering in Fourier space to remove frequencies associated with immobile components, we are able to measure the protein transport even in the presence of a large fraction (>90%) of immobile species. We present the background theory, computer simulations, and analysis of measurements on fluorescent microspheres to demonstrate proof of principle, capabilities, and limitations of the method. We demonstrate mapping of flow vectors for mixed samples containing fluorescent microspheres with different emission wavelengths using space-time image cross-correlation. We also present results from two-photon laser-scanning microscopy studies of alpha-actinin/enhanced green fluorescent protein fusion constructs at the basal membrane of living CHO cells. Using space-time image correlation spectroscopy (STICS), we are able to measure protein fluxes with magnitudes of μm/min from retracting lamellar regions and protrusions for adherent cells. We also demonstrate the measurement of correlated directed flows (magnitudes of μm/min) and diffusion of interacting alpha5 integrin/enhanced cyan fluorescent protein and alpha-actinin/enhanced yellow fluorescent protein within living CHO cells. The STICS method permits us to generate complete transport maps of proteins within subregions of the basal membrane even if the protein concentration is too high to perform single particle tracking measurements.
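The flow-mapping principle behind STICS can be sketched in miniature: for two frames separated by a time lag, the spatial cross-correlation function peaks at the mean displacement of the mobile population, and dividing that offset by the lag gives the velocity vector. This toy uses a periodic synthetic field and FFT-based circular correlation; it omits the immobile-component Fourier filtering and the diffusion analysis described above.

```python
import numpy as np

def flow_from_correlation_peak(frame_a, frame_b, dt):
    """Locate the peak of the spatial cross-correlation between two frames
    taken `dt` apart; the peak offset from zero lag gives the mean flow
    velocity (in pixels per unit time)."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    # Circular cross-correlation via FFT (adequate for a periodic toy field).
    cc = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
    n_y, n_x = cc.shape
    dy = dy - n_y if dy > n_y // 2 else dy   # wrap to signed shifts
    dx = dx - n_x if dx > n_x // 2 else dx
    return dy / dt, dx / dt

# Synthetic image pair: the whole field flows by (1, -2) pixels per frame.
rng = np.random.default_rng(4)
frame_a = rng.random((32, 32))
frame_b = np.roll(np.roll(frame_a, 1, axis=0), -2, axis=1)

vy, vx = flow_from_correlation_peak(frame_a, frame_b, dt=1.0)
```

In the full method this is repeated per subregion and per time lag, so that the time evolution of the correlation peak separates directed flow from isotropic diffusive broadening.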
Revuelta Menéndez, Javier; Ximénez Gómez, Carmen
2012-11-01
The application of mean and covariance structure analysis with quantitative data is increasing. However, latent means analysis with qualitative data is not as widespread. This article summarizes the procedures to conduct an analysis of latent means of dichotomous data from an item response theory approach. We illustrate the implementation of these procedures in an empirical example referring to the organizational context, where a multi-group analysis was conducted to compare the latent means of three employee groups in two factors measuring personal preferences and the perceived degree of rewards from the organization. Results show that higher personal motivations are associated with higher perceived importance of the organization, and that these perceptions differ across groups, so that higher-level employees have a lower level of personal and perceived motivation. The article shows how to estimate the factor means and the factor correlation from dichotomous data, and how to assess goodness of fit. Lastly, we provide the M-Plus syntax code in order to facilitate the latent means analyses for applied researchers.