Sample records for robust correlation coefficient

  1. Robust Approximations to the Non-Null Distribution of the Product Moment Correlation Coefficient I: The Phi Coefficient.

    ERIC Educational Resources Information Center

    Edwards, Lynne K.; Meyers, Sarah A.

    Correlation coefficients are frequently reported in educational and psychological research. The robustness properties and optimality among practical approximations when phi does not equal 0 with moderate sample sizes are not well documented. Three major approximations and their variations are examined: (1) a normal approximation of Fisher's Z,…

  2. The Robustness of Designs for Trials with Nested Data against Incorrect Initial Intracluster Correlation Coefficient Estimates

    ERIC Educational Resources Information Center

    Korendijk, Elly J. H.; Moerbeek, Mirjam; Maas, Cora J. M.

    2010-01-01

    In the case of trials with nested data, the optimal allocation of units depends on the budget, the costs, and the intracluster correlation coefficient. In general, the intracluster correlation coefficient is unknown in advance and an initial guess has to be made based on published values or subject matter knowledge. This initial estimate is likely…

  3. A robust bayesian estimate of the concordance correlation coefficient.

    PubMed

    Feng, Dai; Baumgartner, Richard; Svetnik, Vladimir

    2015-01-01

    A need for assessment of agreement arises in many situations, including statistical biomarker qualification and assay or method validation. The concordance correlation coefficient (CCC) is one of the most popular scaled indices reported in the evaluation of agreement. Robust methods for CCC estimation currently present an important statistical challenge. Here, we propose a novel Bayesian method for robust estimation of the CCC based on the multivariate Student's t-distribution and compare it with its alternatives. Furthermore, we extend the method to practically relevant settings, enabling incorporation of confounding covariates and replications. The superiority of the new approach is demonstrated using simulations as well as real datasets from a biomarker application in electroencephalography (EEG). This biomarker is relevant in neuroscience for the development of treatments for insomnia.

  4. Quantitative Structure-Activity Relationship of Insecticidal Activity of Benzyl Ether Diamidine Derivatives

    NASA Astrophysics Data System (ADS)

    Zhai, Mengting; Chen, Yan; Li, Jing; Zhou, Jun

    2017-12-01

    The molecular electronegativity distance vector (MEDV-13) was used to describe the molecular structure of benzyl ether diamidine derivatives in this paper. Based on MEDV-13, a three-parameter (M3, M15, M47) QSAR model of insecticidal activity (pIC50) for 60 benzyl ether diamidine derivatives was constructed by leaps-and-bounds regression (LBR). The traditional correlation coefficient (R) and the cross-validation correlation coefficient (RCV) were 0.975 and 0.971, respectively. The robustness of the regression model was validated by the jackknife method; the correlation coefficients R ranged between 0.971 and 0.983. Meanwhile, the independent variables in the model were tested and showed no autocorrelation. The regression results indicate that the model has good robustness and predictive capability. The research would provide theoretical guidance for the development of a new generation of efficient, low-toxicity anti-African-trypanosomiasis drugs.

  5. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.

    PubMed

    Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J

    2010-07-27

    Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is often used to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the quantized correlation coefficient (QCC), which is much less dependent on the amount of signal. This involves discretization of the data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, which then focuses properly on the reproducibility of the signal. The performance of this procedure is tested on both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used; QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
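
    A minimal Python sketch of the quantization idea described above, under the assumption of a fixed number of quantiles and a fixed background fraction; the published QCC derives the background/signal split and merging from the data, so this is an illustration rather than the authors' implementation.

      import numpy as np
      from scipy.stats import rankdata, pearsonr

      def quantized_correlation(x, y, n_quantiles=20, background_fraction=0.6):
          # Discretize each replicate into quantile bins, pool the lowest bins
          # (assumed background) into one group, then compute Pearson's r on
          # the resulting bin labels.
          def quantize(v):
              q = np.floor(rankdata(v) / (len(v) + 1.0) * n_quantiles).astype(int)
              cutoff = int(background_fraction * n_quantiles)
              return np.where(q < cutoff, 0, q - cutoff + 1)
          return pearsonr(quantize(x), quantize(y))[0]

      # toy replicates: shared signal in 10% of probes, noise elsewhere
      rng = np.random.default_rng(0)
      signal = np.concatenate([rng.normal(3.0, 1.0, 1000), np.zeros(9000)])
      rep1 = signal + rng.normal(0.0, 1.0, 10000)
      rep2 = signal + rng.normal(0.0, 1.0, 10000)
      print(quantized_correlation(rep1, rep2))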

  6. An asymptotic theory for cross-correlation between auto-correlated sequences and its application on neuroimaging data.

    PubMed

    Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng

    2018-04-20

    Functional connectivity is among the most important tools for studying the brain. The correlation coefficient between time series of different brain areas is the most popular method for quantifying functional connectivity. In practical use, the correlation coefficient assumes the data to be temporally independent; however, brain time series can manifest significant temporal auto-correlation. A widely applicable method is proposed for correcting temporal auto-correlation. We considered two types of time series models, (1) the auto-regressive moving-average model and (2) the nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are the most commonly used in neuroscience studies. We show that the respective asymptotic distributions share a unified expression. We have verified the validity of our method and shown in numerical experiments that it has sufficient statistical power for detecting true correlation. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlation in numerical experiments where existing methods measuring association (linear and nonlinear) fail. In this work, we proposed a widely applicable approach for correcting the effect of temporal auto-correlation on functional connectivity. Empirical results favor the use of our method in functional network analysis. Copyright © 2018. Published by Elsevier B.V.

  7. Genome-scale cluster analysis of replicated microarrays using shrinkage correlation coefficient.

    PubMed

    Yao, Jianchao; Chang, Chunqi; Salmi, Mari L; Hung, Yeung Sam; Loraine, Ann; Roux, Stanley J

    2008-06-18

    Currently, clustering with some form of correlation coefficient as the gene similarity metric has become a popular method for profiling genomic data. The Pearson correlation coefficient and the standard deviation (SD)-weighted correlation coefficient are the two most widely used correlations as similarity metrics in clustering microarray data. However, these two correlations are not optimal for analyzing replicated microarray data generated by most laboratories. An effective correlation coefficient is needed to provide statistically sufficient analysis of replicated microarray data. In this study, we describe a novel correlation coefficient, the shrinkage correlation coefficient (SCC), that fully exploits the similarity between the replicated microarray experimental samples. The methodology considers both the number of replicates and the variance within each experimental group in clustering expression data, and provides a robust statistical estimation of the error of replicated microarray data. The value of SCC is revealed by its comparison with the two other correlation coefficients that are currently the most widely used (the Pearson correlation coefficient and the SD-weighted correlation coefficient), using statistical measures on both synthetic expression data and real gene expression data from Saccharomyces cerevisiae. Two leading clustering methods, hierarchical and k-means clustering, were applied for the comparison. The comparison indicated that using SCC achieves better clustering performance. Applying SCC-based hierarchical clustering to the replicated microarray data obtained from germinating spores of the fern Ceratopteris richardii, we discovered two clusters of genes with shared expression patterns during spore germination. Functional analysis suggested that some of the genetic mechanisms that control germination in such diverse plant lineages as mosses and angiosperms are also conserved among ferns. This study shows that SCC is an alternative to the Pearson correlation coefficient and the SD-weighted correlation coefficient, and is particularly useful for clustering replicated microarray data. This computational approach should be generally useful for proteomic data or other high-throughput analysis methodology.

  8. [Surface electromyography signal classification using gray system theory].

    PubMed

    Xie, Hongbo; Ma, Congbin; Wang, Zhizhong; Huang, Hai

    2004-12-01

    A new method based on gray correlation was introduced to improve the identification rate in artificial limb control. The electromyography (EMG) signal was first transformed into the time-frequency domain by wavelet transform. Singular value decomposition (SVD) was then used to extract feature vectors from the wavelet coefficients for pattern recognition. The decision was made according to the maximum gray correlation coefficient. Compared with neural network recognition, this robust method has an almost equivalent recognition rate but much lower computational cost and requires fewer training samples.
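
    A minimal sketch of classification by maximum gray relational grade, assuming Deng's standard formulation with distinguishing coefficient rho = 0.5 and per-pair extrema (the full formulation takes the extrema over all comparison sequences); the template vectors and class names below are made up and stand in for the paper's wavelet/SVD feature vectors.

      import numpy as np

      def gray_relational_grade(reference, comparison, rho=0.5):
          # Deng's gray relational coefficient, averaged over vector components.
          delta = np.abs(np.asarray(reference, float) - np.asarray(comparison, float))
          d_min, d_max = delta.min(), delta.max()
          xi = (d_min + rho * d_max) / (delta + rho * d_max)
          return xi.mean()

      # classify a feature vector by the template with the maximum relational grade
      templates = {"hand_open": np.array([0.9, 0.2, 0.4]),
                   "hand_close": np.array([0.3, 0.8, 0.5])}
      feature = np.array([0.85, 0.25, 0.45])
      label = max(templates, key=lambda k: gray_relational_grade(feature, templates[k]))
      print(label)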

  9. Chaotic CDMA watermarking algorithm for digital image in FRFT domain

    NASA Astrophysics Data System (ADS)

    Liu, Weizhong; Yang, Wentao; Feng, Zhuoming; Zou, Xuecheng

    2007-11-01

    A digital image watermarking algorithm based on the fractional Fourier transform (FRFT) domain is presented in this paper, utilizing a chaotic CDMA technique. As a popular and typical transmission technique, CDMA has many advantages such as privacy, anti-jamming and low power spectral density, which can provide robustness against image distortions and malicious attempts to remove or tamper with the watermark. A super-hybrid chaotic map, with good auto-correlation and cross-correlation characteristics, is adopted to produce many quasi-orthogonal codes (QOC) that can replace the periodic PN-code used in the traditional CDMA system. The watermark data are divided into many segments, each corresponding to a different chaotic QOC; the segments are modulated into CDMA watermark data and embedded into the low-frequency amplitude coefficients of the FRFT domain of the cover image. During watermark detection, each chaotic QOC extracts its corresponding watermark segment by calculating correlation coefficients between the chaotic QOC and the watermarked data of the detected image. The CDMA technique not only enhances the robustness of the watermark but also compresses the data of the modulated watermark. Experimental results show that the watermarking algorithm performs well in three aspects: imperceptibility, robustness against attacks, and security.

  10. Robustness of networks formed from interdependent correlated networks under intentional attacks

    NASA Astrophysics Data System (ADS)

    Liu, Long; Meng, Ke; Dong, Zhaoyang

    2018-02-01

    We study the problem of intentional attacks targeting interdependent networks generated with a known degree distribution (in-degree-oriented model) or a known distribution of interlinks (out-degree-oriented model). In both models, each node's degree is correlated with the number of its links that connect to the other network. For both models, varying the correlation coefficient has a significant effect on the robustness of a system undergoing random attacks or attacks targeting nodes with low degree. For a system with an assortative relationship between in-degree and out-degree, reducing the broadness of the networks' degree distributions can increase the resistance of the system against intentional attacks.

  11. Challenge of assessing symptoms in seriously ill intensive care unit patients: can proxy reporters help?

    PubMed

    Puntillo, Kathleen A; Neuhaus, John; Arai, Shoshana; Paul, Steven M; Gropper, Michael A; Cohen, Neal H; Miaskowski, Christine

    2012-10-01

    Determine levels of agreement among intensive care unit patients and their family members, nurses, and physicians (proxies) regarding patients' symptoms and compare levels of mean intensity (i.e., the magnitude of a symptom sensation) and distress (i.e., the degree of emotionality that a symptom engenders) of symptoms among patients and proxy reporters. Prospective study of proxy reporters of symptoms in seriously ill patients. Two intensive care units in a tertiary medical center in the Western United States. Two hundred and forty-five intensive care unit patients, 243 family members, 103 nurses, and 92 physicians. None. On the basis of the magnitude of intraclass correlation coefficients, where coefficients from .35 to .78 are considered to be appropriately robust, correlation coefficients between patients' and family members' ratings met this criterion (≥.35) for intensity in six of ten symptoms. No intensity ratings between patients and nurses had intraclass correlation coefficients >.32. Three symptoms had intensity correlation coefficients of ≥.36 between patients' and physicians' ratings. Correlation coefficients between patients and family members were >.40 for five symptom-distress ratings. No symptoms had distress correlation coefficients of ≥.28 between patients' and nurses' ratings. Two symptoms had symptom-distress correlation coefficients between patients' and physicians' ratings at >.39. Family members, nurses, and physicians reported higher symptom-intensity scores than patients did for 80%, 60%, and 60% of the symptoms, respectively. Family members, nurses, and physicians reported higher symptom-distress scores than patients did for 90%, 70%, and 80% of the symptoms, respectively. Patient-family intraclass correlation coefficients were sufficiently close for us to consider using family members to help assess intensive care unit patients' symptoms. Relatively low intraclass correlation coefficients between intensive care unit clinicians' and patients' symptom ratings indicate that some proxy raters overestimate whereas others underestimate patients' symptoms. Proxy overestimation of patients' symptom scores warrants further study because this may influence decisions about treating patients' symptoms.
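
    As a hedged illustration of the agreement statistic used above, the sketch below computes a one-way random-effects ICC(1,1) from paired ratings with numpy; the study may have used a different ICC form, and the patient/family-member ratings shown are hypothetical.

      import numpy as np

      def icc_oneway(ratings):
          # ratings: n_subjects x k_raters matrix; one-way random-effects ICC(1,1)
          ratings = np.asarray(ratings, dtype=float)
          n, k = ratings.shape
          grand = ratings.mean()
          row_means = ratings.mean(axis=1)
          ms_between = k * np.sum((row_means - grand) ** 2) / (n - 1)
          ms_within = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))
          return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

      # hypothetical patient vs. family-member symptom-intensity ratings (0-10 scale)
      pairs = np.array([[7, 6], [3, 4], [8, 8], [2, 5], [6, 6], [9, 7]])
      print(round(icc_oneway(pairs), 2))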

  12. Simultaneous characterization of lateral lipid and prothrombin diffusion coefficients by z-scan fluorescence correlation spectroscopy.

    PubMed

    Stefl, Martin; Kułakowska, Anna; Hof, Martin

    2009-08-05

    A new (to our knowledge) robust approach for the determination of lateral diffusion coefficients of weakly bound proteins is applied to the phosphatidylserine-specific membrane interaction of bovine prothrombin. It is shown that z-scan fluorescence correlation spectroscopy in combination with pulsed interleaved dual excitation allows simultaneous monitoring of the lateral diffusion of labeled protein and phospholipids. Moreover, from the dependence of the particle numbers on the axial sample position at different protein concentrations, phosphatidylserine-dependent equilibrium dissociation constants are derived, confirming literature values. Increasing the amount of membrane-bound prothrombin retards the lateral protein and lipid diffusion, indicating coupling of both processes. The lateral diffusion coefficients of labeled lipids are considerably larger than the simultaneously determined lateral diffusion coefficients of prothrombin, which contradicts findings reported for the isolated N-terminus of prothrombin.

  13. DFT and 3D-QSAR Studies of Anti-Cancer Agents m-(4-Morpholinoquinazolin-2-yl) Benzamide Derivatives for Novel Compounds Design

    NASA Astrophysics Data System (ADS)

    Zhao, Siqi; Zhang, Guanglong; Xia, Shuwei; Yu, Liangmin

    2018-06-01

    As a group of diversified frameworks, quinazoline derivatives display a broad range of biological functions, especially as anticancer agents. To investigate the quantitative structure-activity relationship, 3D-QSAR models were generated with 24 quinazoline scaffold molecules. The experimental and predicted pIC50 values for both training and test set compounds showed good correlation, which proved the robustness and reliability of the generated QSAR models. The most effective CoMFA and CoMSIA models were obtained with non-cross-validated correlation coefficients r²ncv of 1.00 (both) and leave-one-out coefficients q² of 0.61 and 0.59, respectively. The predictive abilities of CoMFA and CoMSIA were quite good, with predictive correlation coefficients (r²pred) of 0.97 and 0.91. In addition, the statistical results of CoMFA and CoMSIA were used to design new quinazoline molecules.

  14. Asymptotic properties of Pearson's rank-variate correlation coefficient under contaminated Gaussian model.

    PubMed

    Ma, Rubao; Xu, Weichao; Zhang, Yun; Ye, Zhongfu

    2014-01-01

    This paper investigates the robustness properties of Pearson's rank-variate correlation coefficient (PRVCC) in scenarios where one channel is corrupted by impulsive noise and the other is impulsive-noise-free. As shown in our previous work, these scenarios, which are frequently encountered in radar and/or sonar, can be well emulated by a particular bivariate contaminated Gaussian model (CGM). Under this CGM, we establish the asymptotic closed forms of the expectation and variance of PRVCC by means of the well-known delta method. To gain a deeper understanding, we also compare PRVCC with two other classical correlation coefficients, i.e., Spearman's rho (SR) and Kendall's tau (KT), in terms of the root mean squared error (RMSE). Monte Carlo simulations not only verify our theoretical findings, but also reveal the advantage of PRVCC through an example of estimating the time delay in the particular impulsive noise environment.
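
    A minimal Monte Carlo sketch of an RMSE comparison under a bivariate contaminated Gaussian model, with impulsive noise on one channel only; the mixture parameters are illustrative, PRVCC itself is not reproduced here, and SR and KT are mapped back to the Pearson scale through the standard Gaussian relations.

      import numpy as np
      from scipy.stats import pearsonr, spearmanr, kendalltau

      rng = np.random.default_rng(1)
      rho, eps, sigma_imp, n, trials = 0.6, 0.05, 10.0, 500, 500

      est = {"pearson": [], "spearman->rho": [], "kendall->rho": []}
      for _ in range(trials):
          x, y = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n).T
          mask = rng.random(n) < eps          # impulsive noise corrupts channel x only
          x = x + mask * rng.normal(0.0, sigma_imp, n)
          est["pearson"].append(pearsonr(x, y)[0])
          # Gaussian relations mapping SR and KT back to the Pearson scale
          est["spearman->rho"].append(2.0 * np.sin(np.pi * spearmanr(x, y)[0] / 6.0))
          est["kendall->rho"].append(np.sin(np.pi * kendalltau(x, y)[0] / 2.0))

      for name, vals in est.items():
          rmse = np.sqrt(np.mean((np.array(vals) - rho) ** 2))
          print(f"{name}: RMSE = {rmse:.3f}")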

  15. An empirical comparison of methods for analyzing correlated data from a discrete choice survey to elicit patient preference for colorectal cancer screening

    PubMed Central

    2012-01-01

    Background A discrete choice experiment (DCE) is a preference survey which asks participants to make a choice among product portfolios comparing the key product characteristics by performing several choice tasks. Analyzing DCE data needs to account for within-participant correlation because choices from the same participant are likely to be similar. In this study, we empirically compared some commonly-used statistical methods for analyzing DCE data while accounting for within-participant correlation based on a survey of patient preference for colorectal cancer (CRC) screening tests conducted in Hamilton, Ontario, Canada in 2002. Methods A two-stage DCE design was used to investigate the impact of six attributes on participants' preferences for a CRC screening test and willingness to undertake the test. We compared six models for clustered binary outcomes (logistic and probit regressions using cluster-robust standard error (SE), random-effects and generalized estimating equation approaches) and three models for clustered nominal outcomes (multinomial logistic and probit regressions with cluster-robust SE and random-effects multinomial logistic model). We also fitted a bivariate probit model with cluster-robust SE treating the choices from two stages as two correlated binary outcomes. The rank of relative importance between attributes and the estimates of β coefficients within attributes were used to assess model robustness. Results In total, 468 participants, each completing 10 choice tasks, were analyzed. Similar results were reported for the rank of relative importance and β coefficients across models for stage-one data on evaluating participants' preferences for the test. The six attributes ranked from high to low as follows: cost, specificity, process, sensitivity, preparation and pain. However, the results differed across models for stage-two data on evaluating participants' willingness to undertake the tests. Little within-patient correlation (ICC ≈ 0) was found in stage-one data, but substantial within-patient correlation existed (ICC = 0.659) in stage-two data. Conclusions When a small clustering effect was present in the DCE data, results remained robust across statistical models. However, results varied when a larger clustering effect was present. Therefore, it is important to assess the robustness of the estimates via sensitivity analysis using different models for analyzing clustered data from DCE studies. PMID:22348526
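
    A minimal sketch of two of the compared approaches for clustered binary choices, using statsmodels on synthetic data; the attribute names (cost, sens), effect sizes, and sample sizes are made up and do not reproduce the CRC screening survey.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # synthetic choice data: 100 participants x 10 choice tasks each
      rng = np.random.default_rng(2)
      n_subj, n_tasks = 100, 10
      subj = np.repeat(np.arange(n_subj), n_tasks)
      u = rng.normal(0, 1, n_subj)[subj]                 # participant random effect
      cost = rng.uniform(0, 1, n_subj * n_tasks)
      sens = rng.uniform(0, 1, n_subj * n_tasks)
      p = 1 / (1 + np.exp(-(0.5 - 2.0 * cost + 1.0 * sens + u)))
      df = pd.DataFrame({"choice": rng.binomial(1, p), "cost": cost,
                         "sens": sens, "id": subj})

      # (a) ordinary logit with cluster-robust (sandwich) standard errors
      logit = smf.logit("choice ~ cost + sens", data=df).fit(
          cov_type="cluster", cov_kwds={"groups": df["id"]}, disp=0)

      # (b) GEE with an exchangeable working correlation structure
      gee = sm.GEE.from_formula("choice ~ cost + sens", groups="id", data=df,
                                family=sm.families.Binomial(),
                                cov_struct=sm.cov_struct.Exchangeable()).fit()

      print(logit.params, gee.params, sep="\n")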

  16. Change Detection via Selective Guided Contrasting Filters

    NASA Astrophysics Data System (ADS)

    Vizilter, Y. V.; Rubis, A. Y.; Zheltov, S. Y.

    2017-05-01

    A change detection scheme based on guided contrasting was previously proposed. The guided contrasting filter takes two images (test and sample) as input and outputs a filtered version of the test image. Such a filter preserves the similar details and smooths the non-similar details of the test image with respect to the sample image. Because of this, the difference between the test image and its filtered version (the difference map) can serve as a basis for robust change detection. Guided contrasting is performed in two steps: in the first step, some smoothing operator (SO) is applied to eliminate test image details; in the second step, all matched details are restored with local contrast proportional to the value of some local similarity coefficient (LSC). The original guided contrasting filter was based on local average smoothing as the SO and local linear correlation as the LSC. In this paper we propose and implement a new set of selective guided contrasting filters based on different combinations of various SOs and thresholded LSCs. Linear average and Gaussian smoothing, nonlinear median filtering, and morphological opening and closing are considered as SOs. The local linear correlation coefficient, the morphological correlation coefficient (MCC), mutual information, the mean square MCC and geometrical correlation coefficients are applied as LSCs. Thresholding the LSC allows operating with non-normalized LSCs and enhances the selective properties of guided contrasting filters: details are either totally recovered or not recovered at all after the smoothing. These different guided contrasting filters are tested as part of the previously proposed change detection pipeline, which contains the following stages: guided contrasting filtering on an image pyramid, calculation of the difference map, binarization, extraction of change proposals and testing of change proposals using the local MCC. Experiments on real and simulated image bases demonstrate the applicability of all proposed selective guided contrasting filters. All implemented filters provide robustness to weak geometric discrepancies between the compared images. Selective guided contrasting based on morphological opening/closing and thresholded morphological correlation demonstrates the best change detection results.

  17. A non-linear regression method for CT brain perfusion analysis

    NASA Astrophysics Data System (ADS)

    Bennink, E.; Oosterbroek, J.; Viergever, M. A.; Velthuis, B. K.; de Jong, H. W. A. M.

    2015-03-01

    CT perfusion (CTP) imaging allows for rapid diagnosis of ischemic stroke. Generation of perfusion maps from CTP data usually involves deconvolution algorithms providing estimates of the impulse response function in the tissue. We propose the use of a fast non-linear regression (NLR) method that we postulate has similar performance to the current academic state-of-the-art method (bSVD), but that has some important advantages, including the estimation of vascular permeability, improved robustness to tracer delay, and very few tuning parameters, all of which are important in stroke assessment. The aim of this study is to evaluate the fast NLR method against bSVD and a commercial clinical state-of-the-art method. The three methods were tested against a published digital perfusion phantom earlier used to illustrate the superiority of bSVD. In addition, the NLR and clinical methods were also tested against bSVD on 20 clinical scans. Pearson correlation coefficients were calculated for each of the tested methods. All three methods showed high correlation coefficients (>0.9) with the ground truth in the phantom. With respect to the clinical scans, the NLR perfusion maps showed higher correlation with bSVD than the perfusion maps from the clinical method. Furthermore, the perfusion maps showed that the fast NLR estimates are robust to tracer delay. In conclusion, the proposed fast NLR method provides a simple and flexible way of estimating perfusion parameters from CT perfusion scans, with high correlation coefficients. This suggests that it could be a better alternative to the current clinical and academic state-of-the-art methods.

  18. The structure and resilience of financial market networks

    NASA Astrophysics Data System (ADS)

    Kauê Dal'Maso Peron, Thomas; da Fontoura Costa, Luciano; Rodrigues, Francisco A.

    2012-03-01

    Financial markets can be viewed as a highly complex evolving system that is very sensitive to economic instabilities. The complex organization of the market can be represented in a suitable fashion in terms of complex networks, which can be constructed from stock prices such that each pair of stocks is connected by a weighted edge that encodes the distance between them. In this work, we propose an approach to analyze the topological and dynamic evolution of financial networks based on the stock correlation matrices. An entropy-related measurement is adopted to quantify the robustness of the evolving financial market organization. It is verified that the network topological organization suffers strong variation during financial instabilities and the networks in such periods become less robust. A statistical robust regression model is proposed to quantify the relationship between the network structure and resilience. The obtained coefficients of this model indicate that the average shortest path length is the measurement most related to the network resilience coefficient. This result indicates that a collective behavior is observed between stocks during financial crises. More specifically, stocks tend to synchronize their price evolution, leading to a high correlation between pairs of stock prices, which contributes to the increase in distance between them and, consequently, decreases the network resilience.

  19. Hybrid robust model based on an improved functional link neural network integrating with partial least square (IFLNN-PLS) and its application to predicting key process variables.

    PubMed

    He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong

    2016-03-01

    In this paper, a hybrid robust model based on an improved functional link neural network integrated with partial least squares (IFLNN-PLS) is proposed. Firstly, an improved functional link neural network with small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) was proposed for enhancing the generalization performance of FLNN. Unlike the traditional FLNN, the expanded variables of the original inputs are not directly used as the inputs in the proposed SNEWHIOC-FLNN model. The original inputs are instead attached to expanded weights with small norms. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced. The larger the correlation coefficient is, the more relevant the expanded variables tend to be. In the end, the expanded variables with larger correlation coefficients are selected as the inputs to improve the performance of the traditional FLNN. In order to test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets named Housing, Concrete Compressive Strength (CCS), and Yacht Hydro Dynamics (YHD) are selected. Then a hybrid model based on the improved FLNN integrated with partial least squares (IFLNN-PLS) was built. In the IFLNN-PLS model, the connection weights are calculated using the partial least squares method rather than the error back-propagation algorithm. Lastly, IFLNN-PLS was developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrated that IFLNN-PLS could significantly improve the prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Adaptive Green-Kubo estimates of transport coefficients from molecular dynamics based on robust error analysis.

    PubMed

    Jones, Reese E; Mandadapu, Kranthi K

    2012-04-21

    We present a rigorous Green-Kubo methodology for calculating transport coefficients based on on-the-fly estimates of: (a) statistical stationarity of the relevant process, and (b) error in the resulting coefficient. The methodology uses time samples efficiently across an ensemble of parallel replicas to yield accurate estimates, which is particularly useful for estimating the thermal conductivity of semi-conductors near their Debye temperatures where the characteristic decay times of the heat flux correlation functions are large. Employing and extending the error analysis of Zwanzig and Ailawadi [Phys. Rev. 182, 280 (1969)] and Frenkel [in Proceedings of the International School of Physics "Enrico Fermi", Course LXXV (North-Holland Publishing Company, Amsterdam, 1980)] to the integral of correlation, we are able to provide tight theoretical bounds for the error in the estimate of the transport coefficient. To demonstrate the performance of the method, four test cases of increasing computational cost and complexity are presented: the viscosity of Ar and water, and the thermal conductivity of Si and GaN. In addition to producing accurate estimates of the transport coefficients for these materials, this work demonstrates precise agreement of the computed variances in the estimates of the correlation and the transport coefficient with the extended theory based on the assumption that fluctuations follow a Gaussian process. The proposed algorithm in conjunction with the extended theory enables the calculation of transport coefficients with the Green-Kubo method accurately and efficiently.

  1. Adaptive Green-Kubo estimates of transport coefficients from molecular dynamics based on robust error analysis

    NASA Astrophysics Data System (ADS)

    Jones, Reese E.; Mandadapu, Kranthi K.

    2012-04-01

    We present a rigorous Green-Kubo methodology for calculating transport coefficients based on on-the-fly estimates of: (a) statistical stationarity of the relevant process, and (b) error in the resulting coefficient. The methodology uses time samples efficiently across an ensemble of parallel replicas to yield accurate estimates, which is particularly useful for estimating the thermal conductivity of semi-conductors near their Debye temperatures where the characteristic decay times of the heat flux correlation functions are large. Employing and extending the error analysis of Zwanzig and Ailawadi [Phys. Rev. 182, 280 (1969)], 10.1103/PhysRev.182.280 and Frenkel [in Proceedings of the International School of Physics "Enrico Fermi", Course LXXV (North-Holland Publishing Company, Amsterdam, 1980)] to the integral of correlation, we are able to provide tight theoretical bounds for the error in the estimate of the transport coefficient. To demonstrate the performance of the method, four test cases of increasing computational cost and complexity are presented: the viscosity of Ar and water, and the thermal conductivity of Si and GaN. In addition to producing accurate estimates of the transport coefficients for these materials, this work demonstrates precise agreement of the computed variances in the estimates of the correlation and the transport coefficient with the extended theory based on the assumption that fluctuations follow a Gaussian process. The proposed algorithm in conjunction with the extended theory enables the calculation of transport coefficients with the Green-Kubo method accurately and efficiently.
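
    A bare-bones sketch of a Green-Kubo estimate, integrating the autocorrelation of a flux time series; the flux here is a synthetic process with an exponentially decaying correlation, the prefactor is left generic, and the paper's actual contribution (on-the-fly stationarity checks and error bounds across parallel replicas) is not reproduced.

      import numpy as np

      def flux_autocorrelation(flux, max_lag):
          # biased sample autocorrelation <J(0) J(t)> for lags 0..max_lag-1
          flux = np.asarray(flux) - np.mean(flux)
          n = len(flux)
          return np.array([np.dot(flux[:n - k], flux[k:]) / n for k in range(max_lag)])

      def green_kubo(flux, dt, prefactor, max_lag):
          # transport coefficient = prefactor * time integral of <J(0) J(t)>
          acf = flux_autocorrelation(flux, max_lag)
          return prefactor * dt * (np.sum(acf) - 0.5 * (acf[0] + acf[-1]))  # trapezoid rule

      # synthetic flux with exponentially decaying correlations (OU-like process)
      rng = np.random.default_rng(3)
      dt, tau, n = 0.01, 0.5, 100_000
      flux = np.zeros(n)
      for i in range(1, n):
          flux[i] = flux[i - 1] * np.exp(-dt / tau) + rng.normal(0.0, 0.1)
      print(green_kubo(flux, dt, prefactor=1.0, max_lag=1500))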

  2. Robust spectral-domain optical coherence tomography speckle model and its cross-correlation coefficient analysis

    PubMed Central

    Liu, Xuan; Ramella-Roman, Jessica C.; Huang, Yong; Guo, Yuan; Kang, Jin U.

    2013-01-01

    In this study, we proposed a generic speckle simulation for the optical coherence tomography (OCT) signal, obtained by convolving the point spread function (PSF) of the OCT system with a numerically synthesized random sample field. We validated our model and used the simulation method to study the statistical properties of cross-correlation coefficients (XCC) between A-scans, which have recently been applied to transverse motion analysis by our group. The simulation results show that oversampling is essential for accurate motion tracking; that exponential decay of the OCT signal leads to an underestimate of motion, which can be corrected; and that lateral heterogeneity of the sample leads to an overestimate of motion for the few pixels corresponding to structural boundaries. PMID:23456001

  3. Tabu Search enhances network robustness under targeted attacks

    NASA Astrophysics Data System (ADS)

    Sun, Shi-wen; Ma, Yi-lin; Li, Rui-qi; Wang, Li; Xia, Cheng-yi

    2016-03-01

    We focus on the optimization of network robustness with respect to intentional attacks on high-degree nodes. Given an existing network, this problem can be considered a typical single-objective combinatorial optimization problem. Based on the heuristic Tabu Search optimization algorithm, a link-rewiring method is applied to reconstruct the network while keeping the degree of every node unchanged. Through numerical simulations, a BA scale-free network and two real-world networks are investigated to verify the effectiveness of the proposed optimization method. Meanwhile, we analyze how the optimization affects other topological properties of the networks, including natural connectivity, clustering coefficient and degree-degree correlation. The current results can help to improve the robustness of existing complex real-world systems, as well as to provide some insights into the design of robust networks.
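
    A minimal sketch of degree-preserving rewiring toward higher attack robustness, using networkx; for simplicity it uses greedy acceptance rather than a full Tabu Search with memory lists, and the network size, attack fraction, and iteration count are arbitrary choices.

      import networkx as nx
      import random

      def robustness(G, frac=0.1):
          # size of the largest connected component after removing the
          # top `frac` highest-degree nodes (targeted attack)
          H = G.copy()
          k = int(frac * H.number_of_nodes())
          targets = sorted(H.degree, key=lambda kv: kv[1], reverse=True)[:k]
          H.remove_nodes_from(node for node, _ in targets)
          return max(len(c) for c in nx.connected_components(H))

      random.seed(4)
      G = nx.barabasi_albert_graph(300, 3, seed=4)
      best = robustness(G)
      for _ in range(2000):
          H = G.copy()
          nx.double_edge_swap(H, nswap=1, max_tries=100)   # keeps every node degree fixed
          score = robustness(H)
          if score >= best:                                # greedy acceptance
              G, best = H, score
      print("largest component surviving targeted attack:", best)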

  4. Parametric Method Performance for Dynamic 3'-Deoxy-3'-18F-Fluorothymidine PET/CT in Epidermal Growth Factor Receptor-Mutated Non-Small Cell Lung Carcinoma Patients Before and During Therapy.

    PubMed

    Kramer, Gerbrand Maria; Frings, Virginie; Heijtel, Dennis; Smit, E F; Hoekstra, Otto S; Boellaard, Ronald

    2017-06-01

    The objective of this study was to validate several parametric methods for quantification of 3'-deoxy-3'-18F-fluorothymidine (18F-FLT) PET in advanced-stage non-small cell lung carcinoma (NSCLC) patients with an activating epidermal growth factor receptor mutation who were treated with gefitinib or erlotinib. Furthermore, we evaluated the impact of noise on accuracy and precision of the parametric analyses of dynamic 18F-FLT PET/CT to assess the robustness of these methods. Methods: Ten NSCLC patients underwent dynamic 18F-FLT PET/CT at baseline and 7 and 28 d after the start of treatment. Parametric images were generated using plasma input Logan graphic analysis and 2 basis functions-based methods: a 2-tissue-compartment basis function model (BFM) and spectral analysis (SA). Whole-tumor-averaged parametric pharmacokinetic parameters were compared with those obtained by nonlinear regression of the tumor time-activity curve using a reversible 2-tissue-compartment model with blood volume fraction. In addition, 2 statistically equivalent datasets were generated by countwise splitting the original list-mode data, each containing 50% of the total counts. Both new datasets were reconstructed, and parametric pharmacokinetic parameters were compared between the 2 replicates and the original data. Results: After the settings of each parametric method were optimized, distribution volumes (VT) obtained with Logan graphic analysis, BFM, and SA all correlated well with those derived using nonlinear regression at baseline and during therapy (R² ≥ 0.94; intraclass correlation coefficient > 0.97). SA-based VT images were most robust to increased noise on a voxel level (repeatability coefficient, 16% vs. >26%). Yet BFM generated the most accurate K1 values (R² = 0.94; intraclass correlation coefficient, 0.96). Parametric K1 data showed a larger variability in general; however, no differences were found in robustness between methods (repeatability coefficient, 80%-84%). Conclusion: Both BFM and SA can generate quantitatively accurate parametric 18F-FLT VT images in NSCLC patients before and during therapy. SA was more robust to noise, yet BFM provided more accurate parametric K1 data. We therefore recommend BFM as the preferred parametric method for analysis of dynamic 18F-FLT PET/CT studies; however, SA can also be used. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  5. Iterative Ellipsoidal Trimming.

    DTIC Science & Technology

    1980-02-11

    …to above. Iterative ellipsoidal trimming has been investigated before by other statisticians, most notably by Gnanadesikan and his coworkers. … Gnanadesikan, R., and Kettenring, J. R. (1975). "Robust estimation and outlier detection with correlation coefficients." Biometrika 62, 531-45. … Duda, Richard, and Hart, Peter (1973). Pattern Classification and Scene Analysis. Wiley, New York. … Gnanadesikan, R. (1977). Methods for…

  6. Statistical procedures for evaluating daily and monthly hydrologic model predictions

    USGS Publications Warehouse

    Coffey, M.E.; Workman, S.R.; Taraba, J.L.; Fogle, A.W.

    2004-01-01

    The overall study objective was to evaluate the applicability of different qualitative and quantitative methods for comparing daily and monthly SWAT computer model hydrologic streamflow predictions to observed data, and to recommend statistical methods for use in future model evaluations. Statistical methods were tested using daily streamflows and monthly equivalent runoff depths. The statistical techniques included linear regression, Nash-Sutcliffe efficiency, nonparametric tests, t-test, objective functions, autocorrelation, and cross-correlation. None of the methods specifically applied to the non-normal distribution and dependence between data points for the daily predicted and observed data. Of the tested methods, median objective functions, sign test, autocorrelation, and cross-correlation were most applicable for the daily data. The robust coefficient of determination (CD*) and robust modeling efficiency (EF*) objective functions were the preferred methods for daily model results due to the ease of comparing these values with a fixed ideal reference value of one. Predicted and observed monthly totals were more normally distributed, and there was less dependence between individual monthly totals than was observed for the corresponding predicted and observed daily values. More statistical methods were available for comparing SWAT model-predicted and observed monthly totals. The 1995 monthly SWAT model predictions and observed data had a regression R² of 0.70, a Nash-Sutcliffe efficiency of 0.41, and the t-test failed to reject the equal data means hypothesis. The Nash-Sutcliffe coefficient and the R² coefficient were the preferred methods for monthly results due to the ability to compare these coefficients to a set ideal value of one.
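
    A short sketch of two of the statistics named above, Nash-Sutcliffe efficiency and the coefficient of determination, computed with numpy on made-up monthly runoff values; the robust CD*/EF* variants used in the study are not reproduced here.

      import numpy as np

      def nash_sutcliffe(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def r_squared(obs, sim):
          # square of the Pearson correlation between observed and simulated values
          return np.corrcoef(obs, sim)[0, 1] ** 2

      obs = np.array([12.0, 30.0, 8.0, 22.0, 15.0, 40.0])   # hypothetical monthly runoff depths
      sim = np.array([10.0, 28.0, 11.0, 20.0, 18.0, 35.0])
      print(round(nash_sutcliffe(obs, sim), 2), round(r_squared(obs, sim), 2))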

  7. Comparing the structure of an emerging market with a mature one under global perturbation

    NASA Astrophysics Data System (ADS)

    Namaki, A.; Jafari, G. R.; Raei, R.

    2011-09-01

    In this paper we investigate the Tehran Stock Exchange (TSE) and the Dow Jones Industrial Average (DJIA) in terms of perturbed correlation matrices. To perturb a stock market, there are two methods, namely local and global perturbation. In the local method, we replace a correlation coefficient of the cross-correlation matrix with one calculated from two Gaussian-distributed time series, whereas in the global method, we reconstruct the correlation matrix after replacing the original return series with Gaussian-distributed time series. The local perturbation is just a technical study. We analyze these markets through two statistical approaches, random matrix theory (RMT) and the correlation coefficient distribution. By using RMT, we find that the largest eigenvalue is an influence that is common to all stocks and that this eigenvalue has a peak during financial shocks. We find that there are a few correlated stocks that account for the essential robustness of the stock market, but we see that by replacing these return time series with Gaussian-distributed time series, the mean values of the correlation coefficients, the largest eigenvalues of the stock markets and the fraction of eigenvalues that deviate from the RMT prediction fall sharply in both markets. By comparing these two markets, we can see that the DJIA is more sensitive to global perturbations. These findings are crucial for risk management and portfolio selection.

  8. The Assessment of Reliability Under Range Restriction: A Comparison of [Alpha], [Omega], and Test-Retest Reliability for Dichotomous Data

    ERIC Educational Resources Information Center

    Fife, Dustin A.; Mendoza, Jorge L.; Terry, Robert

    2012-01-01

    Though much research and attention has been directed at assessing the correlation coefficient under range restriction, the assessment of reliability under range restriction has been largely ignored. This article uses item response theory to simulate dichotomous item-level data to assess the robustness of KR-20 ([alpha]), [omega], and test-retest…

  9. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.

  10. State and group dynamics of world stock market by principal component analysis

    NASA Astrophysics Data System (ADS)

    Nobi, Ashadun; Lee, Jae Woo

    2016-05-01

    We study the dynamic interactions and structural changes in the world stock market by applying principal component analysis (PCA) to the cross-correlation coefficients of global financial indices over the years 1998-2012. The variances explained by the first PC increase with time and show a drastic change during the crisis. A sharp change in the PC coefficients implies a transition of market state, a situation which occurs frequently in the American and Asian indices. However, the European indices remain stable over time. Using the first two PC coefficients, we identify indices that are similar and more strongly correlated than the others. We observe that the European indices form a robust group over the observation period. The dynamics of the individual indices within the group increase in similarity with time, and the dynamics of indices are more similar during the crises. Furthermore, the group formation of indices changes position in the two-dimensional PC space due to crises. Finally, after a financial crisis, the difference in PCs between the European and American indices narrows.
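
    A minimal sketch of this kind of analysis: build the cross-correlation matrix of index returns, eigendecompose it, and read off the variance explained by the first PC and its coefficients; the returns below are synthetic (one common factor plus noise), not the 1998-2012 index data.

      import numpy as np

      rng = np.random.default_rng(5)
      # synthetic daily returns: one global factor plus idiosyncratic noise
      n_days, n_indices = 750, 20
      market = rng.normal(0, 1, n_days)
      returns = 0.6 * market[:, None] + rng.normal(0, 1, (n_days, n_indices))

      C = np.corrcoef(returns, rowvar=False)          # cross-correlation matrix
      eigvals, eigvecs = np.linalg.eigh(C)            # eigenvalues in ascending order
      explained = eigvals[::-1] / eigvals.sum()
      print("variance explained by PC1:", round(explained[0], 3))
      print("PC1 coefficients:", np.round(eigvecs[:, -1], 2))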

  11. [Quantitative structure-gas chromatographic retention relationship of polycyclic aromatic sulfur heterocycles using molecular electronegativity-distance vector].

    PubMed

    Li, Zhenghua; Cheng, Fansheng; Xia, Zhining

    2011-01-01

    The chemical structures of 114 polycyclic aromatic sulfur heterocycles (PASHs) have been studied by the molecular electronegativity-distance vector (MEDV). The linear relationships between the gas chromatographic retention index and the MEDV have been established by a multiple linear regression (MLR) model. The results of variable selection by stepwise multiple regression (SMR), and the predictive abilities of the optimized model appraised by leave-one-out cross-validation, showed that the optimized model, with a correlation coefficient (R) of 0.9947 and a cross-validated correlation coefficient (Rcv) of 0.9940, possessed the best statistical quality. Furthermore, when the 114 PASH compounds were divided into calibration and test sets in the ratio of 2:1, the statistical analysis showed that our models possess almost equal statistical quality, very similar regression coefficients and good robustness. The quantitative structure-retention relationship (QSRR) model established may provide a convenient and powerful method for predicting the gas chromatographic retention of PASHs.

  12. Comparison of co-expression measures: mutual information, correlation, and model based indices.

    PubMed

    Song, Lin; Langfelder, Peter; Horvath, Steve

    2012-12-09

    Co-expression measures are often used to define networks among genes. Mutual information (MI) is often used as a generalized correlation measure. It is not clear how much MI adds beyond standard (robust) correlation measures or regression model based association measures. Further, it is important to assess what transformations of these and other co-expression measures lead to biologically meaningful modules (clusters of genes). We provide a comprehensive comparison between mutual information and several correlation measures in 8 empirical data sets and in simulations. We also study different approaches for transforming an adjacency matrix, e.g. using the topological overlap measure. Overall, we confirm close relationships between MI and correlation in all data sets which reflects the fact that most gene pairs satisfy linear or monotonic relationships. We discuss rare situations when the two measures disagree. We also compare correlation and MI based approaches when it comes to defining co-expression network modules. We show that a robust measure of correlation (the biweight midcorrelation transformed via the topological overlap transformation) leads to modules that are superior to MI based modules and maximal information coefficient (MIC) based modules in terms of gene ontology enrichment. We present a function that relates correlation to mutual information which can be used to approximate the mutual information from the corresponding correlation coefficient. We propose the use of polynomial or spline regression models as an alternative to MI for capturing non-linear relationships between quantitative variables. The biweight midcorrelation outperforms MI in terms of elucidating gene pairwise relationships. Coupled with the topological overlap matrix transformation, it often leads to more significantly enriched co-expression modules. Spline and polynomial networks form attractive alternatives to MI in case of non-linear relationships. Our results indicate that MI networks can safely be replaced by correlation networks when it comes to measuring co-expression relationships in stationary data.
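
    A minimal sketch of the biweight midcorrelation (bicor) highlighted above, in its standard form without the maximum-proportion-of-outliers refinement used in WGCNA; the toy data simply illustrate that a few outliers depress Pearson's r much more than bicor.

      import numpy as np

      def biweight_midcorrelation(x, y, c=9.0):
          # Tukey-biweight-based robust correlation; outlying points get weight zero
          def weighted_deviation(v):
              med = np.median(v)
              mad = np.median(np.abs(v - med))
              u = (v - med) / (c * mad)
              w = (1 - u ** 2) ** 2 * (np.abs(u) < 1)
              return (v - med) * w
          a = weighted_deviation(np.asarray(x, float))
          b = weighted_deviation(np.asarray(y, float))
          return np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))

      rng = np.random.default_rng(6)
      x = rng.normal(0, 1, 200)
      y = 0.8 * x + rng.normal(0, 0.6, 200)
      y[:3] += 15                       # a few outliers wreck Pearson's r, not bicor
      print(np.corrcoef(x, y)[0, 1], biweight_midcorrelation(x, y))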

  13. A determinant-based criterion for working correlation structure selection in generalized estimating equations.

    PubMed

    Jaman, Ajmery; Latif, Mahbub A H M; Bari, Wasimul; Wahed, Abdus S

    2016-05-20

    In generalized estimating equations (GEE), the correlation between the repeated observations on a subject is specified with a working correlation matrix. Correct specification of the working correlation structure ensures efficient estimators of the regression coefficients. Among the criteria used, in practice, for selecting working correlation structure, Rotnitzky-Jewell, Quasi Information Criterion (QIC) and Correlation Information Criterion (CIC) are based on the fact that if the assumed working correlation structure is correct then the model-based (naive) and the sandwich (robust) covariance estimators of the regression coefficient estimators should be close to each other. The sandwich covariance estimator, used in defining the Rotnitzky-Jewell, QIC and CIC criteria, is biased downward and has a larger variability than the corresponding model-based covariance estimator. Motivated by this fact, a new criterion is proposed in this paper based on the bias-corrected sandwich covariance estimator for selecting an appropriate working correlation structure in GEE. A comparison of the proposed and the competing criteria is shown using simulation studies with correlated binary responses. The results revealed that the proposed criterion generally performs better than the competing criteria. An example of selecting the appropriate working correlation structure has also been shown using the data from Madras Schizophrenia Study. Copyright © 2015 John Wiley & Sons, Ltd.
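
    A numpy-only sketch of the idea behind CIC-type criteria: compare the model-based (naive) and sandwich (robust) covariance estimates of the regression coefficients through a trace, with the model-based covariance conventionally computed under the independence working structure. The two matrices below are hypothetical stand-ins for what a fitted GEE would supply; the criterion proposed in the paper simply swaps in a bias-corrected sandwich estimate.

      import numpy as np

      def cic(cov_naive, cov_robust):
          # trace of (model-based covariance)^-1 times the sandwich covariance;
          # smaller values indicate a working correlation closer to the truth
          return float(np.trace(np.linalg.solve(cov_naive, cov_robust)))

      # hypothetical covariance estimates for 3 regression coefficients under
      # an exchangeable working correlation structure
      cov_naive = np.array([[0.040, 0.002, 0.001],
                            [0.002, 0.030, 0.003],
                            [0.001, 0.003, 0.050]])
      cov_robust = np.array([[0.045, 0.003, 0.001],
                             [0.003, 0.033, 0.004],
                             [0.001, 0.004, 0.056]])
      print(cic(cov_naive, cov_robust))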

  14. More accurate, calibrated bootstrap confidence intervals for correlating two autocorrelated climate time series

    NASA Astrophysics Data System (ADS)

    Olafsdottir, Kristin B.; Mudelsee, Manfred

    2013-04-01

    Estimation of Pearson's correlation coefficient between two time series, to evaluate the influence of one time-dependent variable on another, is one of the most often used statistical methods in climate sciences. Various methods are used to estimate confidence intervals to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), where the main intention was to get an accurate confidence interval for the correlation coefficient between two time series by taking into account the serial dependence of the process that generated the data. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version, called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique which basically performs a second bootstrap loop, resampling from the bootstrap resamples. It offers, like the non-calibrated bootstrap confidence intervals, robustness against the data distribution. Pairwise moving block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard-error-based bootstrap Student's t confidence intervals. The performance of the calibrated confidence intervals is examined with Monte Carlo simulations and compared with the performance of confidence intervals without calibration, that is, PearsonT. The coverage accuracy is evidently better for the calibrated confidence intervals, where the coverage error is acceptably small (i.e., within a few percentage points) already for data sizes as small as 20. One form of climate time series is output from numerical models which simulate the climate system. The method is applied to model data from the high-resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show significant correlation between the two variables when there is a 10-year lag between them, which is more or less the time it takes the Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.
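
    A minimal sketch of a pairwise moving block bootstrap confidence interval for Pearson's r on two autocorrelated series; the block length, the percentile interval (rather than the Student's t interval mentioned above), and the AR(1) test data are arbitrary choices here, and the calibration loop and data-driven block length selection of PearsonT3 are omitted.

      import numpy as np
      from scipy.stats import pearsonr

      def block_bootstrap_ci(x, y, block_len=8, n_boot=2000, alpha=0.05, seed=0):
          # pairwise moving block bootstrap: resample the same blocks from both
          # series so their serial and cross dependence is preserved
          rng = np.random.default_rng(seed)
          x, y = np.asarray(x), np.asarray(y)
          n = len(x)
          n_blocks = int(np.ceil(n / block_len))
          stats = np.empty(n_boot)
          for b in range(n_boot):
              starts = rng.integers(0, n - block_len + 1, n_blocks)
              idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
              stats[b] = pearsonr(x[idx], y[idx])[0]
          return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

      # two autocorrelated AR(1) series sharing a common signal
      rng = np.random.default_rng(7)
      n, phi = 200, 0.7
      e1, e2, common = rng.normal(0, 1, (3, n))
      x, y = np.zeros(n), np.zeros(n)
      for t in range(1, n):
          x[t] = phi * x[t - 1] + common[t] + e1[t]
          y[t] = phi * y[t - 1] + common[t] + e2[t]
      print(pearsonr(x, y)[0], block_bootstrap_ci(x, y))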

  15. A quantitative property-property relationship for the internal diffusion coefficients of organic compounds in solid materials.

    PubMed

    Huang, L; Fantke, P; Ernstoff, A; Jolliet, O

    2017-11-01

    Indoor releases of organic chemicals encapsulated in solid materials are major contributors to human exposures and are directly related to the internal diffusion coefficient in solid materials. Existing correlations to estimate the diffusion coefficient are only valid for a limited number of chemical-material combinations. This paper develops and evaluates a quantitative property-property relationship (QPPR) to predict diffusion coefficients for a wide range of organic chemicals and materials. We first compiled a training dataset of 1103 measured diffusion coefficients for 158 chemicals in 32 consolidated material types. Following a detailed analysis of the temperature influence, we developed a multiple linear regression model to predict diffusion coefficients as a function of chemical molecular weight (MW), temperature, and material type (adjusted R² of .93). The internal validations showed the model to be robust, stable and not a result of chance correlation. The external validation against two separate prediction datasets demonstrated that the model has good predictive ability within its applicability domain (R²ext > .8), namely MW between 30 and 1178 g/mol and temperature between 4 and 180°C. By covering a much wider range of organic chemicals and materials, this QPPR facilitates high-throughput estimates of human exposures for chemicals encapsulated in solid materials. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. The influence of base rates on correlations: An evaluation of proposed alternative effect sizes with real-world data.

    PubMed

    Babchishin, Kelly M; Helmus, Leslie-Maaike

    2016-09-01

    Correlations are the simplest and most commonly understood effect size statistic in psychology. The purpose of the current paper was to use a large sample of real-world data (109 correlations with 60,415 participants) to illustrate the base rate dependence of correlations when applied to dichotomous or ordinal data. Specifically, we examined the influence of the base rate on different effect size metrics. Correlations decreased when the dichotomous variable did not have a 50% base rate. The higher the deviation from a 50% base rate, the smaller the observed Pearson's point-biserial and Kendall's tau correlation coefficients. In contrast, the relationship between base rate deviations and the more commonly proposed alternatives (i.e., polychoric correlation coefficients, AUCs, Pearson/Thorndike adjusted correlations, and Cohen's d) was less remarkable, with AUCs being most robust to attenuation due to base rates. In other words, the base rate makes a marked difference in the magnitude of the correlation. As such, when using dichotomous data, the correlation may be more sensitive to base rates than is optimal for the researcher's goals. Given the magnitude of the association between the base rate and point-biserial correlations (r = -.81) and Kendall's tau (r = -.80), we recommend that AUCs, Pearson/Thorndike adjusted correlations, Cohen's d, or polychoric correlations be considered as alternative effect size statistics in many contexts.
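
    A minimal simulation of the attenuation described above: a latent bivariate normal with true correlation 0.5 is dichotomized at progressively more extreme base rates and the point-biserial correlation shrinks; the latent-normal setup and parameters are illustrative and do not reproduce the paper's real-world dataset.

      import numpy as np
      from scipy.stats import pointbiserialr, norm

      rng = np.random.default_rng(8)
      n, rho = 100_000, 0.5
      # latent bivariate normal with true correlation 0.5
      z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
      continuous, latent = z[:, 0], z[:, 1]

      for base_rate in (0.5, 0.3, 0.1, 0.02):
          threshold = norm.ppf(1 - base_rate)
          dichotomous = (latent > threshold).astype(int)   # dichotomize at this base rate
          r_pb = pointbiserialr(dichotomous, continuous)[0]
          print(f"base rate {base_rate:4.2f}: point-biserial r = {r_pb:.3f}")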

  17. The power metric: a new statistically robust enrichment-type metric for virtual screening applications with early recovery capability.

    PubMed

    Lopes, Julio Cesar Dias; Dos Santos, Fábio Mendes; Martins-José, Andrelly; Augustyns, Koen; De Winter, Hans

    2017-01-01

    A new metric for the evaluation of model performance in the field of virtual screening and quantitative structure-activity relationship applications is described. This metric, termed the power metric, is defined as the true positive rate divided by the sum of the true positive and false positive rates at a given cutoff threshold. The performance of this metric is compared with alternative metrics such as the enrichment factor, the relative enrichment factor, the receiver operating curve enrichment factor, the correct classification rate, Matthews correlation coefficient and Cohen's kappa coefficient. The performance of this new metric is found to be quite robust with respect to variations in the applied cutoff threshold and in the ratio of the number of active compounds to the total number of compounds, while remaining sensitive to variations in model quality. It possesses the correct characteristics for its application in early-recognition virtual screening problems.
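    A minimal sketch of the power metric as defined above, PM = TPR / (TPR + FPR) at a chosen score threshold; the scores, labels and threshold below are made up.

```python
# Power metric sketch: TPR / (TPR + FPR) at a given cutoff threshold.
def power_metric(scores, labels, threshold):
    tp = sum(1 for s, l in zip(scores, labels) if s >= threshold and l == 1)
    fp = sum(1 for s, l in zip(scores, labels) if s >= threshold and l == 0)
    p = sum(labels)              # number of actives
    n = len(labels) - p          # number of inactives
    tpr = tp / p if p else 0.0
    fpr = fp / n if n else 0.0
    return tpr / (tpr + fpr) if (tpr + fpr) > 0 else 0.0

scores = [0.95, 0.90, 0.80, 0.70, 0.60, 0.40, 0.30, 0.20]
labels = [1,    1,    0,    1,    0,    0,    0,    0]   # 1 = active compound
print(power_metric(scores, labels, threshold=0.65))      # ~0.83 for this toy ranking
```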

  18. Influence of Climate Oscillations on Extreme Precipitation in Texas

    NASA Astrophysics Data System (ADS)

    Bhatia, N.; Singh, V. P.; Srivastav, R. K.

    2016-12-01

    Much research in the field of hydroclimatology is focusing on the impact of climate variability on hydrologic extremes. Recent studies show that the unique geographical location and the enormous areal extent, coupled with extensive variations in climate oscillations, have intensified the regional hydrologic cycle of Texas. The state-wide extreme precipitation events can actually be attributed to sea-surface pressure and temperature anomalies, such as the Bermuda High and Jet Streams, which are further triggered by such climate oscillations. This study aims to quantify the impact of five major Atlantic and Pacific Ocean related climate oscillations: (i) Atlantic Multidecadal Oscillation (AMO), (ii) North Atlantic Oscillation (NAO), (iii) Pacific Decadal Oscillation (PDO), (iv) Pacific North American Pattern (PNA), and (v) Southern Oscillation Index (SOI), on extreme precipitation in Texas. Their respective effects will be determined both for climate divisions delineated by the National Climatic Data Center (NCDC) and for climate regions defined by the Köppen Climate Classification System. This study will adopt a weighted correlation approach to obtain robust correlation coefficients while addressing the regionally variable outliers in the extreme-precipitation data. Further, the variation of the robust correlation coefficients across Texas is found to be related to station elevation, historical average temperature, and total precipitation in the months of extremes. The research will shed light on the relationship between precipitation extremes and climate variability, thus aiding regional water boards in planning, designing, and managing the respective systems under future climate change.
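    As a hedged illustration of what a weighted correlation can look like (the study's actual weighting scheme for outliers is not specified here), a generic weighted Pearson correlation with arbitrary example weights:

```python
# Generic weighted Pearson correlation; the index values, precipitation values
# and downweighting of the suspected outlier are purely illustrative.
import numpy as np

def weighted_pearson(x, y, w):
    x, y, w = map(np.asarray, (x, y, w))
    w = w / w.sum()
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    sx = np.sqrt(np.sum(w * (x - mx) ** 2))
    sy = np.sqrt(np.sum(w * (y - my) ** 2))
    return cov / (sx * sy)

index = [0.3, -0.1, 0.5, 1.2, -0.8, 0.9]        # e.g. a climate-oscillation index
precip = [40.0, 22.0, 55.0, 80.0, 300.0, 60.0]  # extreme-precipitation values, one outlier
weights = [1, 1, 1, 1, 0.2, 1]                  # downweight the suspected outlier
print(weighted_pearson(index, precip, weights))
print(np.corrcoef(index, precip)[0, 1])         # unweighted value, pulled by the outlier
```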

  19. Using spatial information about recurrence risk for robust optimization of dose-painting prescription functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, Edward T.

    Purpose: To develop a robust method for deriving dose-painting prescription functions using spatial information about the risk for disease recurrence. Methods: Spatial distributions of radiobiological model parameters are derived from distributions of recurrence risk after uniform irradiation. These model parameters are then used to derive optimal dose-painting prescription functions given a constant mean biologically effective dose. Results: An estimate for the optimal dose distribution can be derived based on spatial information about recurrence risk. Dose painting based on imaging markers that are moderately or poorly correlated with recurrence risk is predicted to potentially result in inferior disease control when compared with the same mean biologically effective dose delivered uniformly. A robust optimization approach may partially mitigate this issue. Conclusions: The methods described here can be used to derive an estimate for a robust, patient-specific prescription function for use in dose painting. Two approximate scaling relationships were observed: First, the optimal choice for the maximum dose differential when using either a linear or two-compartment prescription function is proportional to R, where R is the Pearson correlation coefficient between a given imaging marker and recurrence risk after uniform irradiation. Second, the predicted maximum possible gain in tumor control probability for any robust optimization technique is nearly proportional to the square of R.

  20. Heterogeneity in Intratumor Correlations of 18F-FDG, 18F-FLT, and 61Cu-ATSM PET in Canine Sinonasal Tumors

    PubMed Central

    Bradshaw, Tyler J.; Bowen, Stephen R.; Jallow, Ngoneh; Forrest, Lisa J.; Jeraj, Robert

    2014-01-01

    Intratumor heterogeneity in biologic properties and in relationships between various phenotypes may present a challenge for biologically targeted therapies. Understanding the relationships between different phenotypes in individual tumor types could help inform treatment selection. The goal of this study was to characterize spatial correlations of glucose metabolism, proliferation, and hypoxia in 2 histologic types of tumors. Methods: Twenty canine veterinary patients with spontaneously occurring sinonasal tumors (13 carcinomas and 7 sarcomas) were imaged with 18F-FDG, 18F-labeled 3'-deoxy-3'-fluorothymidine (18F-FLT), and 61Cu-labeled diacetyl-bis(N4-methylthiosemicarbazone) (61Cu-ATSM) PET/CT on 3 consecutive days. Precise positioning and immobilization techniques coupled with anesthesia enabled motionless scans with repeatable positioning. Standardized uptake values (SUVs) of gross sarcoma and carcinoma volumes were compared by use of Mann–Whitney U tests. Patient images were rigidly registered together, and intratumor tracer uptake distributions were compared. Voxel-based Spearman correlation coefficients were used to quantify intertracer correlations, and the correlation coefficients of sarcomas and carcinomas were compared. The relative overlap of the highest uptake volumes of the 3 tracers was quantified, and the values were compared for sarcomas and carcinomas. Results: Large degrees of heterogeneity in SUV measures and phenotype correlations were observed. Carcinoma and sarcoma tumors differed significantly in SUV measures, with carcinoma tumors having significantly higher 18F-FDG maximum SUVs than sarcoma tumors (11.1 vs. 5.0; P = 0.01) as well as higher 61Cu-ATSM mean SUVs (2.6 vs. 1.2; P = 0.02). Carcinomas had significantly higher population-averaged Spearman correlation coefficients than sarcomas in comparisons of 18F-FDG and 18F-FLT (0.80 vs. 0.61; P = 0.02), 18F-FLT and 61Cu-ATSM (0.83 vs. 0.38; P < 0.0001), and 18F-FDG and 61Cu-ATSM (0.82 vs. 0.69; P = 0.04). Additionally, the highest uptake volumes of the 3 tracers had significantly greater overlap in carcinomas than in sarcomas. Conclusion: The relationships of glucose metabolism, proliferation, and hypoxia were heterogeneous across different tumors, with carcinomas tending to have high correlations and sarcomas having low correlations. Consequently, canine carcinoma tumors are robust targets for therapies that target a single biologic property, whereas sarcoma tumors may not be well suited for such therapies. Histology-specific PET correlations have far-reaching implications for the robustness of biologic target definition. PMID:24042031

  1. Comparing the Pearson and Spearman correlation coefficients across distributions and sample sizes: A tutorial using simulations and empirical data.

    PubMed

    de Winter, Joost C F; Gosling, Samuel D; Potter, Jeff

    2016-09-01

    The Pearson product–moment correlation coefficient (rp) and the Spearman rank correlation coefficient (rs) are widely used in psychological research. We compare rp and rs on 3 criteria: variability, bias with respect to the population value, and robustness to an outlier. Using simulations across low (N = 5) to high (N = 1,000) sample sizes we show that, for normally distributed variables, rp and rs have similar expected values but rs is more variable, especially when the correlation is strong. However, when the variables have high kurtosis, rp is more variable than rs. Next, we conducted a sampling study of a psychometric dataset featuring symmetrically distributed data with light tails, and of 2 Likert-type survey datasets, 1 with light-tailed and the other with heavy-tailed distributions. Consistent with the simulations, rp had lower variability than rs in the psychometric dataset. In the survey datasets with heavy-tailed variables in particular, rs had lower variability than rp, and often corresponded more accurately to the population Pearson correlation coefficient (Rp) than rp did. The simulations and the sampling studies showed that variability in terms of standard deviations can be reduced by about 20% by choosing rs instead of rp. In comparison, increasing the sample size by a factor of 2 results in a 41% reduction of the standard deviations of rs and rp. In conclusion, rp is suitable for light-tailed distributions, whereas rs is preferable when variables feature heavy-tailed distributions or when outliers are present, as is often the case in psychological research. PsycINFO Database Record (c) 2016 APA, all rights reserved
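    A small Monte Carlo in the spirit of the comparison above; the sample size, population correlation and number of replicates are illustrative settings, not the authors' exact simulation grid.

```python
# Compare sampling variability of Pearson's r and Spearman's r_s under a
# bivariate normal population (where Pearson is expected to be less variable).
import numpy as np

def rank(a):
    # simple ranking (ties are not expected for continuous draws)
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(len(a))
    return r

rng = np.random.default_rng(42)
rho, n, reps = 0.6, 50, 5000
cov = [[1, rho], [rho, 1]]
rp = np.empty(reps)
rs = np.empty(reps)
for i in range(reps):
    xy = rng.multivariate_normal([0, 0], cov, size=n)
    rp[i] = np.corrcoef(xy[:, 0], xy[:, 1])[0, 1]
    rs[i] = np.corrcoef(rank(xy[:, 0]), rank(xy[:, 1]))[0, 1]

print(f"SD of Pearson rp:  {rp.std():.4f}")
print(f"SD of Spearman rs: {rs.std():.4f}")   # typically larger for normal data
```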

  2. Bayesian power spectrum inference with foreground and target contamination treatment

    NASA Astrophysics Data System (ADS)

    Jasche, J.; Lavaux, G.

    2017-10-01

    This work presents a joint and self-consistent Bayesian treatment of various foreground and target contaminations when inferring cosmological power spectra and three-dimensional density fields from galaxy redshift surveys. This is achieved by introducing additional block-sampling procedures for unknown coefficients of foreground and target contamination templates to the previously presented ARES framework for Bayesian large-scale structure analyses. As a result, the method infers jointly and fully self-consistently three-dimensional density fields, cosmological power spectra, luminosity-dependent galaxy biases, noise levels of the respective galaxy distributions, and coefficients for a set of a priori specified foreground templates. In addition, this fully Bayesian approach permits detailed quantification of correlated uncertainties amongst all inferred quantities and correctly marginalizes over observational systematic effects. We demonstrate the validity and efficiency of our approach in obtaining unbiased estimates of power spectra via applications to realistic mock galaxy observations that are subject to stellar contamination and dust extinction. While simultaneously accounting for galaxy biases and unknown noise levels, our method reliably and robustly infers three-dimensional density fields and corresponding cosmological power spectra from deep galaxy surveys. Furthermore, our approach correctly accounts for joint and correlated uncertainties between unknown coefficients of foreground templates and the amplitudes of the power spectrum. This effect amounts to correlations and anti-correlations of up to 10 per cent across wide ranges in Fourier space.

  3. Stochastic dynamics of uncoupled neural oscillators: Fokker-Planck studies with the finite element method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galan, Roberto F.; Urban, Nathaniel N.; Center for the Neural Basis of Cognition, Mellon Institute, Pittsburgh, Pennsylvania 15213

    We have investigated the effect of the phase response curve on the dynamics of oscillators driven by noise in two limit cases that are especially relevant for neuroscience. Using the finite element method to solve the Fokker-Planck equation we have studied (i) the impact of noise on the regularity of the oscillations quantified as the coefficient of variation, (ii) stochastic synchronization of two uncoupled phase oscillators driven by correlated noise, and (iii) their cross-correlation function. We show that, in general, the limit of type II oscillators is more robust to noise and more efficient at synchronizing by correlated noise than type I.

  4. Identifying in vivo DCE MRI markers associated with microvessel architecture and gleason grades of prostate cancer.

    PubMed

    Singanamalli, Asha; Rusu, Mirabela; Sparks, Rachel E; Shih, Natalie N C; Ziober, Amy; Wang, Li-Ping; Tomaszewski, John; Rosen, Mark; Feldman, Michael; Madabhushi, Anant

    2016-01-01

    To identify computer-extracted in vivo dynamic contrast-enhanced (DCE) MRI markers associated with quantitative histomorphometric (QH) characteristics of microvessels and Gleason scores (GS) in prostate cancer. This study considered retrospective data from 23 biopsy-confirmed prostate cancer patients who underwent 3 Tesla multiparametric MRI before radical prostatectomy (RP). Representative slices from RP specimens were stained with the vascular marker CD31. Tumor extent was mapped from RP sections onto DCE MRI using nonlinear registration methods. Seventy-seven microvessel QH features and 18 DCE MRI kinetic features were extracted and evaluated for their ability to distinguish low from intermediate and high GS. The effect of temporal sampling on kinetic features was assessed, and correlations between those robust to temporal resolution and microvessel features discriminative of GS were examined. A total of 12 microvessel architectural features were discriminative of low and intermediate/high grade tumors with area under the receiver operating characteristic curve (AUC) > 0.7. These features were most highly correlated with mean washout gradient (WG) (max rho = -0.62). Independent analysis revealed WG to be moderately robust to temporal resolution (intraclass correlation coefficient [ICC] = 0.63) and WG variance, which was poorly correlated with microvessel features, to be predictive of low grade tumors (AUC = 0.77). Enhancement ratio was the most robust (ICC = 0.96) and discriminative (AUC = 0.78) kinetic feature but was moderately correlated with microvessel features (max rho = -0.52). Computer-extracted features of prostate DCE MRI appear to be correlated with microvessel architecture and may be discriminative of low versus intermediate and high GS. © 2015 Wiley Periodicals, Inc.

  5. The Evaluation on the Cadmium Net Concentration for Soil Ecosystems.

    PubMed

    Yao, Yu; Wang, Pei-Fang; Wang, Chao; Hou, Jun; Miao, Ling-Zhan

    2017-03-12

    Yixing, known as the "City of Ceramics", is facing a new dilemma: a raw material crisis. Cadmium (Cd) exists in extremely high concentrations in soil due to the considerable input of industrial wastewater into the soil ecosystem. The in situ technique of diffusive gradients in thin film (DGT), the ex situ static equilibrium approach (HAc, EDTA and CaCl2), and the dissolved concentration in soil solution, as well as microwave digestion, were applied to predict the Cd bioavailability of soil, aiming to provide a robust and accurate method for Cd bioavailability evaluation in Yixing. Moreover, the typical local cash crops, paddy and Zizania aquatica, were selected for Cd accumulation, aiming to select the ideal plants with tolerance to the soil Cd contamination. The results indicated that the biomasses of the two applied plants were sufficiently sensitive to reflect the stark regional differences among sampling sites. Zizania aquatica could effectively reduce the total Cd concentration, as indicated by the high accumulation coefficients. However, the fact that Zizania aquatica has extremely high transfer coefficients, and its stem, as the edible part, might accumulate large amounts of Cd, led to the conclusion that Zizania aquatica was not an ideal cash crop in Yixing. Furthermore, the labile Cd concentrations which were obtained by the DGT technique and dissolved in the soil solution showed a significant correlation with the Cd concentrations of the biota accumulation. However, the ex situ methods and the microwave digestion-obtained Cd concentrations showed a poor correlation with the accumulated Cd concentration in plant tissue. Correspondingly, multiple linear regression models were built for fundamental analysis of the performance of the different methods available for Cd bioavailability evaluation. The correlation coefficients of DGT obtained by the improved multiple linear regression model did not improve significantly compared to the coefficients obtained by the simple linear regression model. The results revealed that DGT was a robust measurement, which could obtain the labile Cd concentrations independently of variations in the physicochemical features of the soil ecosystem. Consequently, these findings provide stronger evidence that DGT is an effective and ideal tool for labile Cd evaluation in Yixing.

  6. The Evaluation on the Cadmium Net Concentration for Soil Ecosystems

    PubMed Central

    Yao, Yu; Wang, Pei-Fang; Wang, Chao; Hou, Jun; Miao, Ling-Zhan

    2017-01-01

    Yixing, known as the “City of Ceramics”, is facing a new dilemma: a raw material crisis. Cadmium (Cd) exists in extremely high concentrations in soil due to the considerable input of industrial wastewater into the soil ecosystem. The in situ technique of diffusive gradients in thin film (DGT), the ex situ static equilibrium approach (HAc, EDTA and CaCl2), and the dissolved concentration in soil solution, as well as microwave digestion, were applied to predict the Cd bioavailability of soil, aiming to provide a robust and accurate method for Cd bioavailability evaluation in Yixing. Moreover, the typical local cash crops—paddy and zizania aquatica—were selected for Cd accumulation, aiming to select the ideal plants with tolerance to the soil Cd contamination. The results indicated that the biomasses of the two applied plants were sufficiently sensitive to reflect the stark regional differences of different sampling sites. The zizania aquatica could effectively reduce the total Cd concentration, as indicated by the high accumulation coefficients. However, the fact that the zizania aquatica has extremely high transfer coefficients, and its stem, as the edible part, might accumulate large amounts of Cd, led to the conclusion that zizania aquatica was not an ideal cash crop in Yixing. Furthermore, the labile Cd concentrations which were obtained by the DGT technique and dissolved in the soil solution showed a significant correlation with the Cd concentrations of the biota accumulation. However, the ex situ methods and the microwave digestion-obtained Cd concentrations showed a poor correlation with the accumulated Cd concentration in plant tissue. Correspondingly, the multiple linear regression models were built for fundamental analysis of the performance of different methods available for Cd bioavailability evaluation. The correlation coefficients of DGT obtained by the improved multiple linear regression model have not significantly improved compared to the coefficients obtained by the simple linear regression model. The results revealed that DGT was a robust measurement, which could obtain the labile Cd concentrations independent of the physicochemical features’ variation in the soil ecosystem. Consequently, these findings provide stronger evidence that DGT is an effective and ideal tool for labile Cd evaluation in Yixing. PMID:28287500

  7. Revealing how network structure affects accuracy of link prediction

    NASA Astrophysics Data System (ADS)

    Yang, Jin-Xuan; Zhang, Xiao-Dong

    2017-08-01

    Link prediction plays an important role in network reconstruction and network evolution. The network structure affects the accuracy of link prediction, which is an interesting problem. In this paper we use common neighbors and the Gini coefficient to reveal the relation between them, which can provide a good reference for the choice of a suitable link prediction algorithm according to the network structure. Moreover, the statistical analysis reveals correlation between the common neighbors index, Gini coefficient index and other indices to describe the network structure, such as Laplacian eigenvalues, clustering coefficient, degree heterogeneity, and assortativity of network. Furthermore, a new method to predict missing links is proposed. The experimental results show that the proposed algorithm yields better prediction accuracy and robustness to the network structure than existing currently used methods for a variety of real-world networks.
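    A hedged sketch of the two quantities discussed above, the common-neighbours link-prediction score and the Gini coefficient of the degree distribution; the example graph (Zachary's karate club from networkx) is only for illustration and is not the paper's data.

```python
# Common-neighbours scores for link prediction plus a Gini coefficient of the
# degree sequence as a simple summary of structural heterogeneity.
import networkx as nx
import numpy as np

def gini(values):
    v = np.sort(np.asarray(values, dtype=float))
    n = len(v)
    cum = np.cumsum(v)
    # Standard Gini formula for non-negative values
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

G = nx.karate_club_graph()
degrees = [d for _, d in G.degree()]
print("Gini coefficient of degrees:", round(gini(degrees), 3))

# Common-neighbours score for every currently unconnected pair of nodes
scores = {(u, v): len(list(nx.common_neighbors(G, u, v)))
          for u, v in nx.non_edges(G)}
top = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:5]
print("Top predicted links:", top)
```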

  8. A Robust Image Watermarking in the Joint Time-Frequency Domain

    NASA Astrophysics Data System (ADS)

    Öztürk, Mahmut; Akan, Aydın; Çekiç, Yalçın

    2010-12-01

    With the rapid development of computers and internet applications, copyright protection of multimedia data has become an important problem. Watermarking techniques are proposed as a solution to copyright protection of digital media files. In this paper, a new, robust, and high-capacity watermarking method that is based on spatiofrequency (SF) representation is presented. We use the discrete evolutionary transform (DET) calculated by the Gabor expansion to represent an image in the joint SF domain. The watermark is embedded onto selected coefficients in the joint SF domain. Hence, by combining the advantages of spatial and spectral domain watermarking methods, a robust, invisible, secure, and high-capacity watermarking method is presented. A correlation-based detector is also proposed to detect and extract any possible watermarks on an image. The proposed watermarking method was tested on some commonly used test images under different signal processing attacks like additive noise, Wiener and Median filtering, JPEG compression, rotation, and cropping. Simulation results show that our method is robust against all of the attacks.

  9. A Robust Post-Processing Workflow for Datasets with Motion Artifacts in Diffusion Kurtosis Imaging

    PubMed Central

    Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X.; Wan, Mingxi

    2014-01-01

    Purpose: The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). Materials and methods: The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifacts rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of rejected artifacts with information of gradient directions and b values for the parameter estimation was investigated by using mean square error (MSE). The variance of noise was used as the criterion for MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). Results: The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05). It indicated that LPCC was more sensitive in detecting motion artifacts. MSEs of all derived parameters from the reserved data after the artifacts rejection were smaller than the variance of the noise. It suggested that influence of rejected artifacts was less than influence of noise on the precision of derived parameters. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets (p<0.05). Conclusion: The proposed post-processing workflow was reliable to improve the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets. The workflow provided an effective post-processing method for clinical applications of DKI in subjects with involuntary movements. PMID:24727862
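    A hedged sketch of a local-Pearson-correlation style check of the kind described above (the paper's exact window size, reference volume and rejection threshold may differ); the volumes below are synthetic stand-ins.

```python
# Correlate a candidate volume with a reference volume inside small local
# patches and average the patch-wise Pearson correlations; a volume whose
# score falls below a chosen threshold would be flagged as motion-corrupted.
import numpy as np

def local_pearson(vol_a, vol_b, win=8):
    assert vol_a.shape == vol_b.shape
    coeffs = []
    nx_, ny, nz = vol_a.shape
    for i in range(0, nx_ - win + 1, win):
        for j in range(0, ny - win + 1, win):
            for k in range(0, nz - win + 1, win):
                a = vol_a[i:i+win, j:j+win, k:k+win].ravel()
                b = vol_b[i:i+win, j:j+win, k:k+win].ravel()
                if a.std() > 0 and b.std() > 0:
                    coeffs.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(coeffs))

rng = np.random.default_rng(0)
reference = rng.random((32, 32, 16))
clean     = reference + 0.1 * rng.normal(size=reference.shape)
corrupted = np.roll(clean, shift=4, axis=0)      # crude stand-in for motion

print("clean vs reference:    ", round(local_pearson(reference, clean), 3))
print("corrupted vs reference:", round(local_pearson(reference, corrupted), 3))
```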

  10. A robust post-processing workflow for datasets with motion artifacts in diffusion kurtosis imaging.

    PubMed

    Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X; Wan, Mingxi

    2014-01-01

    The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifacts rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of rejected artifacts with information of gradient directions and b values for the parameter estimation was investigated by using mean square error (MSE). The variance of noise was used as the criterion for MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that of the conventional correlation coefficient (p<0.05). It indicated that LPCC was more sensitive in detecting motion artifacts. MSEs of all derived parameters from the reserved data after the artifacts rejection were smaller than the variance of the noise. It suggested that influence of rejected artifacts was less than influence of noise on the precision of derived parameters. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets (p<0.05). The proposed post-processing workflow was reliable to improve the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets. The workflow provided an effective post-processing method for clinical applications of DKI in subjects with involuntary movements.

  11. QR code-based non-linear image encryption using Shearlet transform and spiral phase transform

    NASA Astrophysics Data System (ADS)

    Kumar, Ravi; Bhaduri, Basanta; Hennelly, Bryan

    2018-02-01

    In this paper, we propose a new quick response (QR) code-based non-linear technique for image encryption using Shearlet transform (ST) and spiral phase transform. The input image is first converted into a QR code and then scrambled using the Arnold transform. The scrambled image is then decomposed into five coefficients using the ST, and the first Shearlet coefficient, C1, is interchanged with a security key before performing the inverse ST. The output after the inverse ST is then modulated with a random phase mask and further spiral phase transformed to get the final encrypted image. The first coefficient, C1, is used as a private key for decryption. The sensitivity of the security keys is analysed in terms of correlation coefficient and peak signal-to-noise ratio. The robustness of the scheme is also checked against various attacks such as noise, occlusion and special attacks. Numerical simulation results are shown in support of the proposed technique and an optoelectronic set-up for encryption is also proposed.

  12. Do foreign exchange and equity markets co-move in Latin American region? Detrended cross-correlation approach

    NASA Astrophysics Data System (ADS)

    Bashir, Usman; Yu, Yugang; Hussain, Muntazir; Zebende, Gilney F.

    2016-11-01

    This paper investigates the dynamics of the relationship between foreign exchange markets and stock markets through time-varying co-movements. In this sense, we analyzed monthly time series of Latin American countries for the period from 1991 to 2015. Furthermore, we apply Granger causality to verify the direction of causality between foreign exchange and stock markets, and the detrended cross-correlation approach (ρDCCA) to detect co-movements at different time scales. Our empirical results suggest a positive cross-correlation between exchange rate and stock price for all Latin American countries. The findings reveal two clear patterns of correlation. First, Brazil and Argentina have positive correlation in both short and long time frames. Second, the remaining countries are negatively correlated at shorter time scales, gradually moving to positive. This paper contributes to the field in three ways. First, we verified the co-movements of exchange rate and stock prices that were rarely discussed in previous empirical studies. Second, the ρDCCA coefficient is a robust and powerful methodology for measuring cross-correlation when dealing with non-stationarity of time series. Third, most of the studies employed one or two time scales using co-integration and vector autoregressive approaches. Not much is known about the co-movements at varying time scales between foreign exchange and stock markets. The ρDCCA coefficient facilitates the understanding of its explanatory depth.
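    A hedged sketch of a detrended cross-correlation coefficient ρDCCA(s); window handling details vary across papers, so this non-overlapping-window version with linear detrending and synthetic series is only illustrative.

```python
# rho_DCCA(s): detrended covariance of two integrated series divided by the
# geometric mean of their detrended variances, at a given window scale s.
import numpy as np

def rho_dcca(x, y, scale):
    x, y = np.asarray(x, float), np.asarray(y, float)
    X = np.cumsum(x - x.mean())           # integrated profiles
    Y = np.cumsum(y - y.mean())
    n_win = len(X) // scale
    t = np.arange(scale)
    f_xy = f_xx = f_yy = 0.0
    for w in range(n_win):
        seg = slice(w * scale, (w + 1) * scale)
        # detrend each window with a linear fit
        rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
        ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(0)
common = rng.normal(size=300).cumsum()          # shared slow component
fx = common + rng.normal(size=300) * 2
fy = common + rng.normal(size=300) * 2
for s in (4, 16, 64):
    print(f"rho_DCCA at scale {s}: {rho_dcca(fx, fy, s):.3f}")
```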

  13. Validation of the French version of the Burn Specific Health Scale-Brief (BSHS-B) questionnaire.

    PubMed

    Gandolfi, S; Auquit-Auckbur, I; Panunzi, S; Mici, E; Grolleau, J-L; Chaput, B

    2016-11-01

    The Burn Specific Health Scale-Brief questionnaire is a widely validated tool for estimating health-related quality of life and for assessing the best multidisciplinary management of burn patients. The aim of this study was to translate the BSHS-B into French and to investigate its reliability and validity. According to the procedure proposed by the Scientific Advisory Committee of the Medical Outcomes Trust, the Burn Specific Health Scale-Brief (BSHS-B) was translated from the English version into French. In order to test the reliability of the French version of the BSHS-B, 53 French-speaking burn patients completed the BSHS-B and SF-36 questionnaires two to four years after burn. Ten of them were re-tested 6 months after the first evaluation. To evaluate the clinical utility of the BSHS-F, internal consistency, construct validity (using SF-36) and stability in time were assessed using Cronbach's alpha statistic, the Spearman rank test, and the intra-class correlation coefficient, respectively. The Cronbach's alpha coefficient of the French version of the BSHS-B was 0.93 and was >0.80 for all the sub-domains. The French version of the BSHS-B and the SF-36 were positively correlated, and all the associations were statistically significant (p<0.01). Intra-class correlation coefficients for test-retest ranged between 0.95 and 0.99 for the sub-domains. The intra-class correlation coefficient (ICC) for the total score was 0.98. The French version of the BSHS-B shows a robust rate of internal consistency, construct validity and stability in time, supporting its application in routine clinical practice as well as in international studies. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.

  14. Skeletal Correlates for Body Mass Estimation in Modern and Fossil Flying Birds

    PubMed Central

    Field, Daniel J.; Lynner, Colton; Brown, Christian; Darroch, Simon A. F.

    2013-01-01

    Scaling relationships between skeletal dimensions and body mass in extant birds are often used to estimate body mass in fossil crown-group birds, as well as in stem-group avialans. However, useful statistical measurements for constraining the precision and accuracy of fossil mass estimates are rarely provided, which prevents the quantification of robust upper and lower bound body mass estimates for fossils. Here, we generate thirteen body mass correlations and associated measures of statistical robustness using a sample of 863 extant flying birds. By providing robust body mass regressions with upper- and lower-bound prediction intervals for individual skeletal elements, we address the longstanding problem of body mass estimation for highly fragmentary fossil birds. We demonstrate that the most precise proxy for estimating body mass in the overall dataset, measured both as the coefficient of determination of ordinary least squares regression and as percent prediction error, is the maximum diameter of the coracoid’s humeral articulation facet (the glenoid). We further demonstrate that this result is consistent among the majority of investigated avian orders (10 out of 18). As a result, we suggest that, in the majority of cases, this proxy may provide the most accurate estimates of body mass for volant fossil birds. Additionally, by presenting statistical measurements of body mass prediction error for thirteen different body mass regressions, this study provides a much-needed quantitative framework for the accurate estimation of body mass and associated ecological correlates in fossil birds. The application of these regressions will enhance the precision and robustness of many mass-based inferences in future paleornithological studies. PMID:24312392

  15. Automated brain tumor segmentation in magnetic resonance imaging based on sliding-window technique and symmetry analysis.

    PubMed

    Lian, Yanyun; Song, Zhijian

    2014-01-01

    Brain tumor segmentation from magnetic resonance imaging (MRI) is an important step toward surgical planning, treatment planning, and monitoring of therapy. However, manual tumor segmentation, commonly used in the clinic, is time-consuming and challenging, and none of the existing automated methods are highly robust, reliable and efficient in clinical application. An accurate and automated tumor segmentation method has been developed that provides reproducible and objective results close to manual segmentation. Based on the symmetry of the human brain, we employed a sliding-window technique and the correlation coefficient to locate the tumor position. At first, the image to be segmented was normalized, rotated, denoised, and bisected. Subsequently, vertical and horizontal sliding windows were applied in turn: two windows in the left and right parts of the brain image moved simultaneously, pixel by pixel, while the correlation coefficient between them was calculated, and the pair of windows with the minimal correlation coefficient was obtained; the window with the larger average gray value gives the tumor location, and the pixel with the largest gray value is the tumor locating point. Finally, the segmentation threshold was determined from the average gray value of the pixels in a square centered at the locating point with a side length of 10 pixels, and threshold segmentation and morphological operations were used to acquire the final tumor region. The method was evaluated on 3D FSPGR brain MR images of 10 patients. As a result, the average ratio of correct location was 93.4% for 575 slices containing tumor, the average Dice similarity coefficient was 0.77 for one scan, and the average time spent on one scan was 40 seconds. A fully automated, simple and efficient segmentation method for brain tumor is proposed and is promising for future clinical use. The correlation coefficient is a new and effective feature for tumor location.
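    A hedged 2D sketch of the symmetry-based localization step described above (the original method normalizes, rotates and denoises the slice and slides the windows pixel by pixel; this toy version uses a coarse stride on a synthetic image):

```python
# Mirror the right half of a slice, slide paired windows over both halves,
# and keep the window pair with the lowest left-right correlation; the side
# with the larger mean intensity is taken as the asymmetric (tumor-like) side.
import numpy as np

def locate_asymmetry(slice2d, win=16, stride=4):
    h, w = slice2d.shape
    left, right = slice2d[:, : w // 2], slice2d[:, w // 2 :][:, ::-1]  # mirror right half
    best = (1.0, None)
    for i in range(0, h - win + 1, stride):
        for j in range(0, left.shape[1] - win + 1, stride):
            a = left[i:i+win, j:j+win].ravel()
            b = right[i:i+win, j:j+win].ravel()
            if a.std() == 0 or b.std() == 0:
                continue
            r = np.corrcoef(a, b)[0, 1]
            if r < best[0]:
                side = "left" if a.mean() > b.mean() else "right"
                best = (r, (i, j, side))
    return best

rng = np.random.default_rng(0)
img = rng.normal(100, 5, size=(128, 128))
img[:, 64:] = img[:, :64][:, ::-1]                        # make the slice symmetric
img[40:60, 80:100] = rng.normal(180, 5, size=(20, 20))    # bright "tumor" on the right
print(locate_asymmetry(img))
```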

  16. Characteristics of Venture Capital Network and Its Correlation with Regional Economy: Evidence from China.

    PubMed

    Jin, Yonghong; Zhang, Qi; Shan, Lifei; Li, Sai-Ping

    2015-01-01

    Financial networks have been extensively studied as examples of real world complex networks. In this paper, we establish and study the network of venture capital (VC) firms in China. We compute and analyze the statistical properties of the network, including parameters such as degrees, mean lengths of the shortest paths, clustering coefficient and robustness. We further study the topology of the network and find that it has small-world behavior. A multiple linear regression model is introduced to study the relation between network parameters and major regional economic indices in China. From the result of regression, we find that, economic aggregate (including the total GDP, investment, consumption and net export), upgrade of industrial structure, employment and remuneration of a region are all positively correlated with the degree and the clustering coefficient of the VC sub-network of the region, which suggests that the development of the VC industry has substantial effects on regional economy in China.

  17. Characteristics of Venture Capital Network and Its Correlation with Regional Economy: Evidence from China

    PubMed Central

    Jin, Yonghong; Zhang, Qi; Shan, Lifei; Li, Sai-Ping

    2015-01-01

    Financial networks have been extensively studied as examples of real world complex networks. In this paper, we establish and study the network of venture capital (VC) firms in China. We compute and analyze the statistical properties of the network, including parameters such as degrees, mean lengths of the shortest paths, clustering coefficient and robustness. We further study the topology of the network and find that it has small-world behavior. A multiple linear regression model is introduced to study the relation between network parameters and major regional economic indices in China. From the result of regression, we find that, economic aggregate (including the total GDP, investment, consumption and net export), upgrade of industrial structure, employment and remuneration of a region are all positively correlated with the degree and the clustering coefficient of the VC sub-network of the region, which suggests that the development of the VC industry has substantial effects on regional economy in China. PMID:26340555

  18. Quantitative structure-activity relationship for the partition coefficient of hydrophobic compounds between silicone oil and air.

    PubMed

    Qu, Yanfei; Ma, Yongwen; Wan, Jinquan; Wang, Yan

    2018-06-01

    The silicone oil-air partition coefficients (K_SiO/A) of hydrophobic compounds are vital parameters for applying silicone oil as a non-aqueous-phase liquid in partitioning bioreactors. Due to the limited number of K_SiO/A values determined by experiment for hydrophobic compounds, there is an urgent need to model the K_SiO/A values for unknown chemicals. In the present study, we developed a universal quantitative structure-activity relationship (QSAR) model using a sequential approach with macro-constitutional and micromolecular descriptors for silicone oil-air partition coefficients (K_SiO/A) of hydrophobic compounds with large structural variance. The geometry optimization and vibrational frequencies of each chemical were calculated using hybrid density functional theory at the B3LYP/6-311G** level. Several quantum chemical parameters that reflect various intermolecular interactions as well as hydrophobicity were selected to develop the QSAR model. The result indicates that a regression model relating log K_SiO/A to the number of non-hydrogen atoms (#nonHatoms) and the energy gap between E_LUMO and E_HOMO (E_LUMO - E_HOMO) could explain the partitioning mechanism of hydrophobic compounds between silicone oil and air. The correlation coefficient R² of the model is 0.922, and the internal and external validation coefficients, Q²_LOO and Q²_ext, are 0.91 and 0.89 respectively, implying that the model has satisfactory goodness-of-fit, robustness, and predictive ability and thus provides a robust predictive tool to estimate the log K_SiO/A values for chemicals in the applicability domain. The applicability domain of the model was visualized by the Williams plot.

  19. Quantum-memory-assisted entropic uncertainty in spin models with Dzyaloshinskii-Moriya interaction

    NASA Astrophysics Data System (ADS)

    Huang, Zhiming

    2018-02-01

    In this article, we investigate the dynamics and correlations of quantum-memory-assisted entropic uncertainty, the tightness of the uncertainty, entanglement, quantum correlation and mixedness for various spin chain models with Dzyaloshinskii-Moriya (DM) interaction, including the XXZ model with DM interaction, the XY model with DM interaction and the Ising model with DM interaction. We find that the uncertainty grows to a stable value with growing temperature but decreases as the coupling coefficient, anisotropy parameter and DM values increase. It is found that the entropic uncertainty is closely correlated with the mixedness of the system. Increasing quantum correlation can result in a decrease in the uncertainty, and the robustness of quantum correlation is better than that of entanglement, since entanglement undergoes sudden birth and death. The tightness of the uncertainty drops to zero, apart from slight volatility, as various parameters increase. Furthermore, we propose an effective approach to steering the uncertainty by weak measurement reversal.

  20. Saving time and money: a validation of the self ratings on the prospective NIMH Life-Chart Method (NIMH-LCM).

    PubMed

    Born, Christoph; Amann, Benedikt L; Grunze, Heinz; Post, Robert M; Schärer, Lars

    2014-05-07

    Careful observation of the longitudinal course of bipolar disorders is pivotal to finding optimal treatments and improving outcome. A useful tool is the daily prospective Life-Chart Method, developed by the National Institute of Mental Health. However, it remains unclear whether the patient version is as valid as the clinician version. We compared the patient-rated version of the Lifechart (LC-self) with the Young Mania Rating Scale (YMRS), Inventory of Depressive Symptoms-Clinician version (IDS-C), and Clinical Global Impression-Bipolar version (CGI-BP) in 108 bipolar I and II patients who participated in the Naturalistic Follow-up Study (NFS) of the German centres of the Bipolar Collaborative Network (BCN; formerly Stanley Foundation Bipolar Network). For statistical evaluation, levels of severity of mood states on the Lifechart were transformed numerically and comparison with affective scales was performed using chi-square and t tests. For testing correlations, Pearson's coefficient was calculated. Ratings for depression on the LC-self and total scores on the IDS-C were found to be highly correlated (Pearson coefficient r = -.718; p < .001), whilst the correlation of ratings for mania on the YMRS with the LC-self was slightly less robust (Pearson coefficient r = .491; p = .001). These results were confirmed by good correlations between the CGI-BP IA (mania), IB (depression) and IC (overall mood state) and the LC-self ratings (Pearson coefficient r = .488, r = .721 and r = .65, respectively; all p < .001). The LC-self shows a significant correlation and good concordance with standard cross-sectional affective rating scales, suggesting that the LC-self is a valid and time- and money-saving alternative to the clinician-rated version, which should be incorporated in future clinical research in bipolar disorder. Generalizability of the results is limited by the selection of highly motivated patients in specialized bipolar centres and by the open design of the study.

  1. Robustness analysis of interdependent networks under multiple-attacking strategies

    NASA Astrophysics Data System (ADS)

    Gao, Yan-Li; Chen, Shi-Ming; Nie, Sen; Ma, Fei; Guan, Jun-Jie

    2018-04-01

    The robustness of complex networks under attacks largely depends on the structure of a network and the nature of the attacks. Previous research on interdependent networks has focused on two types of initial attack: random attack and degree-based targeted attack. In this paper, a deliberate attack function is proposed, from which six kinds of deliberate attacking strategies can be derived by adjusting the tunable parameters. Moreover, the robustness of four types of interdependent networks (BA-BA, ER-ER, BA-ER and ER-BA) with different coupling modes (random, positive and negative correlation) is evaluated under the different attacking strategies. It is found that the positive coupling mode makes the vulnerability of the interdependent network depend entirely on the most vulnerable sub-network under deliberate attacks, whereas the random and negative coupling modes make the vulnerability of the interdependent network depend mainly on the sub-network being attacked. The robustness of the interdependent network is enhanced as the degree-degree correlation coefficient varies from positive to negative. Therefore, the negative coupling mode is relatively more optimal than the others, and it can substantially improve the robustness of the ER-ER network and the ER-BA network. In terms of attacking strategies on interdependent networks, the degree information of a node is more valuable than its betweenness. In addition, we found a more efficient attacking strategy for each coupled interdependent network and proposed the corresponding protection strategy for suppressing cascading failure. Our results can be very useful for the safety design and protection of interdependent networks.

  2. Validity of EuroQOL-5D, time trade-off, and standard gamble for age-related macular degeneration in the Singapore population

    PubMed Central

    Au Eong, K G; Chan, E W; Luo, N; Wong, S H; Tan, N W H; Lim, T H; Wagle, A M

    2012-01-01

    Background/aims: Utility values of age-related macular degeneration (AMD) in Asian patients are unknown. This study aims to assess utility values and construct validity of the EuroQOL-5D (EQ-5D), time trade-off (TTO), and standard gamble (SG) instruments in the Singapore multi-ethnic AMD population. Methods: Cross-sectional, two-centre, institution-based study. Visual acuity (VA), clinical AMD severity, and utility scores on the EQ-5D, TTO, and SG were obtained from 338 AMD patients. VA was analysed in terms of the better-seeing eye (BEVA), worse-seeing eye (WEVA), and weighted average of both eyes (WVA). We evaluated SG on the perfect health-death (SG(death)) and binocular perfect vision-binocular blindness (SG(blindness)) scales. Construct validity was determined by testing a priori hypotheses relating the EQ-5D, TTO, and SG utility scores to VA and clinical AMD severity. Results: The mean utilities on the EQ-5D, TTO, SG(death), and SG(blindness) were 0.89, 0.81, 0.86, and 0.90, respectively. EQ-5D scores correlated weakly with BEVA, WEVA, and WVA (Pearson's correlation coefficients −0.291, −0.247, and −0.305 respectively, P<0.001 for all). SG(death) and SG(blindness) demonstrated no correlation with BEVA, WEVA, or WVA (Pearson's correlation coefficients, range −0.06 to −0.125). TTO showed weak association only with WEVA and WVA (correlation coefficients −0.237, −0.228, P<0.0001), but not with BEVA (correlation coefficient −0.161). Clinical AMD severity correlated with EQ-5D and SG(death), but not with TTO and SG(blindness) (P=0.004, 0.002, 0.235, and 0.069, respectively). Conclusions: AMD has a negative impact on utilities, although utility scores were high compared with Western cohorts. EQ-5D, TTO, and SG showed suboptimal construct validity, suggesting that health status utilities may not be sufficiently robust for cost-utility analyses in this population. PMID:22222257

  3. Precipitation, temperature, and teleconnection signals across the combined North American, Monsoon Asia, and Old World Drought Atlases

    NASA Astrophysics Data System (ADS)

    Smerdon, J. E.; Baek, S. H.; Coats, S.; Williams, P.; Cook, B.; Cook, E. R.; Seager, R.

    2017-12-01

    The tree-ring-based North American Drought Atlas (NADA), Monsoon Asia Drought Atlas (MADA), and Old World Drought Atlas (OWDA) collectively yield a near-hemispheric gridded reconstruction of hydroclimate variability over the last millennium. To test the robustness of the large-scale representation of hydroclimate variability across the drought atlases, the joint expression of seasonal climate variability and teleconnections in the NADA, MADA, and OWDA are compared against two global, observation-based PDSI products. Predominantly positive (negative) correlations are determined between seasonal precipitation (surface air temperature) and collocated tree-ring-based PDSI, with average Pearson's correlation coefficients increasing in magnitude from boreal winter to summer. For precipitation, these correlations tend to be stronger in the boreal winter and summer when calculated for the observed PDSI record, while remaining similar for temperature. Notwithstanding these differences, the drought atlases robustly express teleconnection patterns associated with the El Niño-Southern Oscillation (ENSO), North Atlantic Oscillation (NAO), Pacific Decadal Oscillation (PDO), and Atlantic Multidecadal Oscillation (AMO). These expressions exist in the drought atlas estimates of boreal summer PDSI despite the fact that these modes of climate variability are dominant in boreal winter, with the exception of the Atlantic Multidecadal Oscillation. ENSO and NAO teleconnection patterns in the drought atlases are particularly consistent with their well-known dominant expressions in boreal winter and over the OWDA domain, respectively. Collectively, our findings confirm that the joint Northern Hemisphere drought atlases robustly reflect large-scale patterns of hydroclimate variability on seasonal to multidecadal timescales over the 20th century and are likely to provide similarly robust estimates of hydroclimate variability prior to the existence of widespread instrumental data.

  4. Assessing the utility of gene co-expression stability in combination with correlation in the analysis of protein-protein interaction networks

    PubMed Central

    2011-01-01

    Background: Gene co-expression, in the form of a correlation coefficient, has been valuable in the analysis, classification and prediction of protein-protein interactions. However, it is susceptible to bias from a few samples having a large effect on the correlation coefficient. Gene co-expression stability is a means of quantifying this bias, with high stability indicating robust, unbiased co-expression correlation coefficients. We assess the utility of gene co-expression stability as an additional measure to support the co-expression correlation in the analysis of protein-protein interaction networks. Results: We studied the patterns of co-expression correlation and stability in interacting proteins with respect to their interaction promiscuity, levels of intrinsic disorder, and essentiality or disease-relatedness. Co-expression stability, along with co-expression correlation, acts as a better classifier of hub proteins in interaction networks, than co-expression correlation alone, enabling the identification of a class of hubs that are functionally distinct from the widely accepted transient (date) and obligate (party) hubs. Proteins with high levels of intrinsic disorder have low co-expression correlation and high stability with their interaction partners suggesting their involvement in transient interactions, except for a small group that have high co-expression correlation and are typically subunits of stable complexes. Similar behavior was seen for disease-related and essential genes. Interacting proteins that are both disordered have higher co-expression stability than ordered protein pairs. Using co-expression correlation and stability, we found that transient interactions are more likely to occur between an ordered and a disordered protein while obligate interactions primarily occur between proteins that are either both ordered, or disordered. Conclusions: We observe that co-expression stability shows distinct patterns in structurally and functionally different groups of proteins and interactions. We conclude that it is a useful and important measure to be used in concert with gene co-expression correlation for further insights into the characteristics of proteins in the context of their interaction network. PMID:22369639

  5. Robust check loss-based variable selection of high-dimensional single-index varying-coefficient model

    NASA Astrophysics Data System (ADS)

    Song, Yunquan; Lin, Lu; Jian, Ling

    2016-07-01

    The single-index varying-coefficient model is an important mathematical tool for modeling nonlinear phenomena in science and engineering. In this paper, we develop a variable selection method for high-dimensional single-index varying-coefficient models using a shrinkage idea. The proposed procedure can simultaneously select significant nonparametric components and parametric components. Under defined regularity conditions, with appropriate selection of tuning parameters, the consistency of the variable selection procedure and the oracle property of the estimators are established. Moreover, due to the robustness of the check loss function to outliers in finite samples, our proposed variable selection method is more robust than those based on the least squares criterion. Finally, the method is illustrated with numerical simulations.

  6. Analysis, calculation and utilization of the k-balance attribute in interdependent networks

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Li, Qing; Wang, Dan; Xu, Mingwei

    2018-05-01

    Interdependent networks, where two networks depend on each other, are becoming more and more significant in modern systems. From previous work, it can be concluded that interdependent networks are more vulnerable than a single network. The robustness of interdependent networks therefore deserves special attention. In this paper, we propose a metric of robustness from a new perspective: the balance. First, we define the balance-coefficient of the interdependent system. Based on precise analysis and derivation, we prove several significant theorems and provide an efficient algorithm to compute the balance-coefficient. Finally, we propose an optimal solution that reduces the balance-coefficient to enhance the robustness of the given system. Comprehensive experiments confirm the efficiency of our algorithms.

  7. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  8. Statistical analysis of co-occurrence patterns in microbial presence-absence datasets.

    PubMed

    Mainali, Kumar P; Bewick, Sharon; Thielen, Peter; Mehoke, Thomas; Breitwieser, Florian P; Paudel, Shishir; Adhikari, Arjun; Wolfe, Joshua; Slud, Eric V; Karig, David; Fagan, William F

    2017-01-01

    Drawing on a long history in macroecology, correlation analysis of microbiome datasets is becoming a common practice for identifying relationships or shared ecological niches among bacterial taxa. However, many of the statistical issues that plague such analyses in macroscale communities remain unresolved for microbial communities. Here, we discuss problems in the analysis of microbial species correlations based on presence-absence data. We focus on presence-absence data because this information is more readily obtainable from sequencing studies, especially for whole-genome sequencing, where abundance estimation is still in its infancy. First, we show how Pearson's correlation coefficient (r) and Jaccard's index (J)-two of the most common metrics for correlation analysis of presence-absence data-can contradict each other when applied to a typical microbiome dataset. In our dataset, for example, 14% of species-pairs predicted to be significantly correlated by r were not predicted to be significantly correlated using J, while 37.4% of species-pairs predicted to be significantly correlated by J were not predicted to be significantly correlated using r. Mismatch was particularly common among species-pairs with at least one rare species (<10% prevalence), explaining why r and J might differ more strongly in microbiome datasets, where there are large numbers of rare taxa. Indeed 74% of all species-pairs in our study had at least one rare species. Next, we show how Pearson's correlation coefficient can result in artificial inflation of positive taxon relationships and how this is a particular problem for microbiome studies. We then illustrate how Jaccard's index of similarity (J) can yield improvements over Pearson's correlation coefficient. However, the standard null model for Jaccard's index is flawed, and thus introduces its own set of spurious conclusions. We thus identify a better null model based on a hypergeometric distribution, which appropriately corrects for species prevalence. This model is available from recent statistics literature, and can be used for evaluating the significance of any value of an empirically observed Jaccard's index. The resulting simple, yet effective method for handling correlation analysis of microbial presence-absence datasets provides a robust means of testing and finding relationships and/or shared environmental responses among microbial taxa.
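    A hedged sketch of the recommended approach: Jaccard's index for a pair of presence-absence vectors together with a hypergeometric null that conditions on each species' prevalence. The toy vectors are made up, and the paper's exact test construction may differ in detail.

```python
# Jaccard's index for two presence-absence vectors plus a one-sided
# hypergeometric p-value for observing at least k co-occurrences.
import numpy as np
from scipy.stats import hypergeom

def jaccard_and_pvalue(pa_a, pa_b):
    pa_a, pa_b = np.asarray(pa_a, bool), np.asarray(pa_b, bool)
    N = len(pa_a)
    a, b = pa_a.sum(), pa_b.sum()
    k = np.sum(pa_a & pa_b)                       # observed co-occurrences
    j = k / np.sum(pa_a | pa_b)                   # Jaccard's index
    # P(X >= k) when b "present" samples are drawn at random from N samples,
    # of which a contain species A (conditions on both prevalences).
    p_pos = hypergeom.sf(k - 1, N, a, b)
    return j, p_pos

species_a = [1, 1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0]
species_b = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0]
print(jaccard_and_pvalue(species_a, species_b))
```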

  9. Collinear Latent Variables in Multilevel Confirmatory Factor Analysis

    PubMed Central

    van de Schoot, Rens; Hox, Joop

    2014-01-01

    Because variables may be correlated in the social and behavioral sciences, multicollinearity might be problematic. This study investigates the effect of collinearity manipulated at the within and between levels of a two-level confirmatory factor analysis by Monte Carlo simulation. Furthermore, the influence of the size of the intraclass correlation coefficient (ICC) and of the estimation method (maximum likelihood estimation with robust chi-squares and standard errors versus Bayesian estimation) on the convergence rate is investigated. The other variables of interest were the rate of inadmissible solutions and the relative parameter and standard error bias at the between level. The results showed that inadmissible solutions were obtained when there was between-level collinearity and the estimation method was maximum likelihood. In the within-level multicollinearity condition, all of the solutions were admissible but the bias values were higher compared with the between-level collinearity condition. Bayesian estimation appeared to be robust in obtaining admissible parameters but the relative bias was higher than for maximum likelihood estimation. Finally, as expected, high ICC produced less biased results compared to medium ICC conditions. PMID:29795827

  10. Collinear Latent Variables in Multilevel Confirmatory Factor Analysis: A Comparison of Maximum Likelihood and Bayesian Estimations.

    PubMed

    Can, Seda; van de Schoot, Rens; Hox, Joop

    2015-06-01

    Because variables may be correlated in the social and behavioral sciences, multicollinearity might be problematic. This study investigates the effect of collinearity manipulated at the within and between levels of a two-level confirmatory factor analysis by Monte Carlo simulation. Furthermore, the influence of the size of the intraclass correlation coefficient (ICC) and of the estimation method (maximum likelihood estimation with robust chi-squares and standard errors versus Bayesian estimation) on the convergence rate is investigated. The other variables of interest were the rate of inadmissible solutions and the relative parameter and standard error bias at the between level. The results showed that inadmissible solutions were obtained when there was between-level collinearity and the estimation method was maximum likelihood. In the within-level multicollinearity condition, all of the solutions were admissible but the bias values were higher compared with the between-level collinearity condition. Bayesian estimation appeared to be robust in obtaining admissible parameters but the relative bias was higher than for maximum likelihood estimation. Finally, as expected, high ICC produced less biased results compared to medium ICC conditions.
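
    For readers unfamiliar with the intraclass correlation coefficient varied in these simulations, a minimal sketch follows (an assumed one-way ANOVA estimator for balanced two-level data, not the authors' Monte Carlo code):

      import numpy as np

      def icc_oneway(data):
          """data: (J, n) array, J clusters each with n observations."""
          J, n = data.shape
          cluster_means = data.mean(axis=1)
          grand_mean = data.mean()
          msb = n * np.sum((cluster_means - grand_mean) ** 2) / (J - 1)        # between-cluster MS
          msw = np.sum((data - cluster_means[:, None]) ** 2) / (J * (n - 1))   # within-cluster MS
          return (msb - msw) / (msb + (n - 1) * msw)

      rng = np.random.default_rng(1)
      between, within = 0.3, 0.7            # true ICC = 0.3 / (0.3 + 0.7) = 0.3
      y = rng.normal(0, np.sqrt(between), (100, 1)) + rng.normal(0, np.sqrt(within), (100, 20))
      print(icc_oneway(y))                  # should be close to 0.3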

  11. Quality Control and Reproducibility in M-mode, Two-dimensional, and Speckle Tracking Echocardiography Acquisition and Analysis: The CARDIA Study, Year-25 Examination Experience

    PubMed Central

    Armstrong, Anderson C.; Ricketts, Erin P.; Cox, Christopher; Adler, Paul; Arynchyn, Alexander; Liu, Kiang; Stengel, Ellen; Sidney, Stephen; Lewis, Cora E.; Schreiner, Pamela J.; Shikany, James M.; Keck, Kimberly; Merlo, Jamie; Gidding, Samuel S.; Lima, João A. C.

    2014-01-01

    Introduction: Few large studies describe quality control procedures and reproducibility findings in cardiovascular ultrasound, particularly in novel techniques such as speckle tracking echocardiography (STE). We evaluate the echocardiography assessment performance in the CARDIA study Y25 examination (2010-2011) and report findings from a quality control and reproducibility program conducted to assess Field Center image acquisition and Reading Center (RC) accuracy. Methods: The CARDIA Y25 examination had 3,475 echocardiograms performed in 4 US Field Centers and analyzed in a Reading Center, assessing standard echocardiography (LA dimension, aortic root, LV mass, LV end-diastolic volume [LVEDV], ejection fraction [LVEF]), and STE (2- and 4-chamber longitudinal, circumferential, and radial strains). Reproducibility was assessed using intra-class correlation coefficients (ICC), coefficients of variation (CV), and Bland-Altman plots. Results: For standard echocardiography reproducibility, LV mass and LVEDV consistently had CV above 10% and aortic root below 6%. Intra-sonographer aortic root and LV mass had the most robust values of ICC in standard echocardiography. For STE, the number of properly tracking segments was above 80% in short-axis and 4-chamber views and 58% in 2-chamber views. Longitudinal strain parameters were the most robust and radial strain showed the highest variation. Comparing Field Centers with Echo RC STE readings, mean differences ranged from 0.4% to 4.1% and ICC from 0.37 to 0.66, with robust results for longitudinal strains. Conclusion: Echocardiography image acquisition and reading processes in the CARDIA study were highly reproducible, including robust results for STE analysis. Consistent quality control may increase the reliability of echocardiography measurements in large cohort studies. PMID:25382818

  12. Quality Control and Reproducibility in M-Mode, Two-Dimensional, and Speckle Tracking Echocardiography Acquisition and Analysis: The CARDIA Study, Year 25 Examination Experience.

    PubMed

    Armstrong, Anderson C; Ricketts, Erin P; Cox, Christopher; Adler, Paul; Arynchyn, Alexander; Liu, Kiang; Stengel, Ellen; Sidney, Stephen; Lewis, Cora E; Schreiner, Pamela J; Shikany, James M; Keck, Kimberly; Merlo, Jamie; Gidding, Samuel S; Lima, João A C

    2015-08-01

    Few large studies describe quality control procedures and reproducibility findings in cardiovascular ultrasound, particularly in novel techniques such as speckle tracking echocardiography (STE). We evaluate the echocardiography assessment performance in the Coronary Artery Risk Development in Young Adults (CARDIA) study Year 25 (Y25) examination (2010-2011) and report findings from a quality control and reproducibility program conducted to assess Field Center image acquisition and reading center (RC) accuracy. The CARDIA Y25 examination had 3475 echocardiograms performed in 4 US Field Centers and analyzed in a RC, assessing standard echocardiography (LA dimension, aortic root, LV mass, LV end-diastolic volume [LVEDV], ejection fraction [LVEF]), and STE (two- and four-chamber longitudinal, circumferential, and radial strains). Reproducibility was assessed using intraclass correlation coefficients (ICC), coefficients of variation (CV), and Bland-Altman plots. For standard echocardiography reproducibility, LV mass and LVEDV consistently had CV above 10% and aortic root below 6%. Intra-sonographer aortic root and LV mass had the most robust values of ICC in standard echocardiography. For STE, the number of properly tracking segments was above 80% in short-axis and four-chamber and 58% in two-chamber views. Longitudinal strain parameters were the most robust and radial strain showed the highest variation. Comparing Field Centers with echocardiography RC STE readings, mean differences ranged from 0.4% to 4.1% and ICC from 0.37 to 0.66, with robust results for longitudinal strains. Echocardiography image acquisition and reading processes in the CARDIA study were highly reproducible, including robust results for STE analysis. Consistent quality control may increase the reliability of echocardiography measurements in large cohort studies. © 2014, Wiley Periodicals, Inc.
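
    The reproducibility statistics named above can be illustrated with a short sketch (hypothetical paired re-readings; one common formulation of the within-subject coefficient of variation is assumed):

      import numpy as np

      def bland_altman_and_cv(m1, m2):
          """m1, m2: paired repeated measurements of the same quantity."""
          m1, m2 = np.asarray(m1, float), np.asarray(m2, float)
          diff = m1 - m2
          bias = diff.mean()
          sd = diff.std(ddof=1)
          loa = (bias - 1.96 * sd, bias + 1.96 * sd)          # limits of agreement
          within_sd = sd / np.sqrt(2.0)                       # within-subject SD from paired diffs
          cv = 100.0 * within_sd / np.mean((m1 + m2) / 2.0)   # coefficient of variation, %
          return bias, loa, cv

      m1 = np.array([150.0, 180.0, 165.0, 200.0, 175.0])      # e.g. LV mass, first reading
      m2 = np.array([155.0, 172.0, 170.0, 195.0, 181.0])      # second reading
      print(bland_altman_and_cv(m1, m2))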

  13. Robust real-time extraction of respiratory signals from PET list-mode data.

    PubMed

    Salomon, Andre; Zhang, Bin; Olivier, Patrick; Goedicke, Andreas

    2018-05-01

    Respiratory motion, which typically cannot simply be suspended during PET image acquisition, affects lesions' detection and quantitative accuracy inside or in close vicinity to the lungs. Some motion compensation techniques address this issue via pre-sorting ("binning") of the acquired PET data into a set of temporal gates, where each gate is assumed to be minimally affected by respiratory motion. Tracking respiratory motion is typically realized using dedicated hardware (e.g. using respiratory belts and digital cameras). Extracting respiratory signals directly from the acquired PET data simplifies the clinical workflow as it avoids handling additional signal measurement equipment. We introduce a new data-driven method, "Combined Local Motion Detection" (CLMD). It uses the Time-of-Flight (TOF) information provided by state-of-the-art PET scanners in order to enable real-time respiratory signal extraction without additional hardware resources. CLMD applies center-of-mass detection in overlapping regions based on simple back-positioned TOF event sets acquired in short time frames. Following a signal filtering and quality-based pre-selection step, the remaining extracted individual position information over time is then combined to generate a global respiratory signal. The method is evaluated using 7 measured FDG studies from single and multiple scan positions of the thorax region, and it is compared to other software-based methods regarding quantitative accuracy and statistical noise stability. Correlation coefficients around 90% between the reference and the extracted signal have been found for those PET scans where motion-affected features such as tumors or hot regions were present in the PET field-of-view. For PET scans with a quarter of typically applied radiotracer doses, the CLMD method still provides similarly high correlation coefficients, which indicates its robustness to noise. Each CLMD processing needed less than 0.4 s in total on a standard multi-core CPU and thus provides a robust and accurate approach enabling real-time processing capabilities using standard PC hardware. © 2018 Institute of Physics and Engineering in Medicine.

  14. Robust real-time extraction of respiratory signals from PET list-mode data

    NASA Astrophysics Data System (ADS)

    Salomon, André; Zhang, Bin; Olivier, Patrick; Goedicke, Andreas

    2018-06-01

    Respiratory motion, which typically cannot simply be suspended during PET image acquisition, affects lesions’ detection and quantitative accuracy inside or in close vicinity to the lungs. Some motion compensation techniques address this issue via pre-sorting (‘binning’) of the acquired PET data into a set of temporal gates, where each gate is assumed to be minimally affected by respiratory motion. Tracking respiratory motion is typically realized using dedicated hardware (e.g. using respiratory belts and digital cameras). Extracting respiratory signals directly from the acquired PET data simplifies the clinical workflow as it avoids handling additional signal measurement equipment. We introduce a new data-driven method ‘combined local motion detection’ (CLMD). It uses the time-of-flight (TOF) information provided by state-of-the-art PET scanners in order to enable real-time respiratory signal extraction without additional hardware resources. CLMD applies center-of-mass detection in overlapping regions based on simple back-positioned TOF event sets acquired in short time frames. Following a signal filtering and quality-based pre-selection step, the remaining extracted individual position information over time is then combined to generate a global respiratory signal. The method is evaluated using seven measured FDG studies from single and multiple scan positions of the thorax region, and it is compared to other software-based methods regarding quantitative accuracy and statistical noise stability. Correlation coefficients around 90% between the reference and the extracted signal have been found for those PET scans where motion affected features such as tumors or hot regions were present in the PET field-of-view. For PET scans with a quarter of typically applied radiotracer doses, the CLMD method still provides similar high correlation coefficients which indicates its robustness to noise. Each CLMD processing needed less than 0.4 s in total on a standard multi-core CPU and thus provides a robust and accurate approach enabling real-time processing capabilities using standard PC hardware.
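
    A highly simplified sketch of the data-driven idea described in these two records follows (event handling, TOF back-positioning, overlapping regions, filtering and pre-selection are all omitted; arrays are hypothetical): a per-frame center-of-mass trace is computed and then correlated with a reference respiratory signal.

      import numpy as np

      def com_trace(event_z, event_t, frame_len=0.5):
          """event_z: axial positions of back-positioned events; event_t: time stamps (s)."""
          edges = np.arange(event_t.min(), event_t.max() + frame_len, frame_len)
          trace = []
          for t0, t1 in zip(edges[:-1], edges[1:]):
              sel = (event_t >= t0) & (event_t < t1)
              trace.append(event_z[sel].mean() if sel.any() else np.nan)
          return np.asarray(trace)

      def corr_with_reference(trace, reference):
          """Pearson correlation between the extracted trace and a reference signal."""
          ok = ~np.isnan(trace)
          return np.corrcoef(trace[ok], reference[ok])[0, 1]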

  15. Cocaine profiling for strategic intelligence, a cross-border project between France and Switzerland: part II. Validation of the statistical methodology for the profiling of cocaine.

    PubMed

    Lociciro, S; Esseiva, P; Hayoz, P; Dujourdy, L; Besacier, F; Margot, P

    2008-05-20

    Harmonisation and optimization of analytical and statistical methodologies were carried out between two forensic laboratories (Lausanne, Switzerland and Lyon, France) in order to provide drug intelligence for cross-border cocaine seizures. Part I dealt with the optimization of the analytical method and its robustness. This second part investigates statistical methodologies that provide reliable comparison of cocaine seizures analysed on two different gas chromatographs interfaced with flame ionisation detectors (GC-FIDs) in two distinct laboratories. Sixty-six statistical combinations (ten data pre-treatments followed by six different distance measurements and correlation coefficients) were applied. One pre-treatment (N+S: the area of each peak is divided by its standard deviation calculated from the whole data set) followed by the Cosine or Pearson correlation coefficient was found to be the best statistical compromise for optimal discrimination of linked and non-linked samples. Centralising the analyses in a single laboratory is therefore no longer a required condition for comparing samples seized in different countries. This allows not only collaboration but also jurisdictional control over data.
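
    The reported pre-treatment and comparison metrics can be sketched as follows (illustrative only; the peak-area matrix is simulated, not real seizure data):

      import numpy as np

      def ns_pretreat(peak_areas):
          """peak_areas: (n_samples, n_peaks) matrix of GC-FID peak areas."""
          return peak_areas / peak_areas.std(axis=0, ddof=1)   # divide each peak by its SD

      def pearson(x, y):
          return np.corrcoef(x, y)[0, 1]

      def cosine(x, y):
          return np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

      rng = np.random.default_rng(2)
      areas = rng.lognormal(mean=2.0, sigma=0.5, size=(50, 12))   # 50 seizures, 12 peaks
      z = ns_pretreat(areas)
      print(pearson(z[0], z[1]), cosine(z[0], z[1]))              # similarity of two profiles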

  16. An effective approach to quantitative analysis of ternary amino acids in foxtail millet substrate based on terahertz spectroscopy.

    PubMed

    Lu, Shao Hua; Li, Bao Qiong; Zhai, Hong Lin; Zhang, Xin; Zhang, Zhuo Yong

    2018-04-25

    Terahertz time-domain spectroscopy (THz-TDS) has been applied to many fields; however, it still encounters drawbacks in multicomponent mixture analysis due to serious spectral overlap. Here, an effective approach to quantitative analysis was proposed and applied to the determination of ternary amino acids in a foxtail millet substrate. Utilizing three parameters derived from the THz-TDS, images were constructed and Tchebichef image moments were used to extract the information of the target components. The quantitative models were then obtained by stepwise regression. The leave-one-out cross-validated correlation coefficients (R2loo-cv) were above 0.9595. For the external test set, the predictive correlation coefficients (R2p) were above 0.8026 and the root mean square errors of prediction (RMSEp) were below 1.2601. Compared with the traditional methods (PLS and N-PLS), our approach is more accurate, robust, and reliable, and can be an excellent approach for quantifying multiple components with THz-TDS spectroscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Free wake analysis of hover performance using a new influence coefficient method

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.; Ong, Ching Cho

    1990-01-01

    A new approach to the prediction of helicopter rotor performance using a free wake analysis was developed. This new method uses a relaxation process that does not suffer from the convergence problems associated with previous time-marching simulations. This wake relaxation procedure was coupled to a vortex-lattice, lifting surface loads analysis to produce a novel, self-contained performance prediction code: EHPIC (Evaluation of Helicopter Performance using Influence Coefficients). The major technical features of the EHPIC code are described and a substantial amount of background information on the capabilities and proper operation of the code is supplied. Sample problems were undertaken to demonstrate the robustness and flexibility of the basic approach. Also, a performance correlation study was carried out to establish the breadth of applicability of the code, with very favorable results.

  18. A signature of 12 microRNAs is robustly associated with growth rate in a variety of CHO cell lines.

    PubMed

    Klanert, Gerald; Jadhav, Vaibhav; Shanmukam, Vinoth; Diendorfer, Andreas; Karbiener, Michael; Scheideler, Marcel; Bort, Juan Hernández; Grillari, Johannes; Hackl, Matthias; Borth, Nicole

    2016-10-10

    As Chinese Hamster Ovary (CHO) cells are the cell line of choice for the production of human-like recombinant proteins, there is interest in genetic optimization of host cell lines to overcome certain limitations in their growth rate and protein secretion. At the same time, a detailed understanding of these processes could be used to advantage by identifying marker transcripts that characterize states of performance. In this context, microRNAs (miRNAs) that exhibit a robust correlation to the growth rate of CHO cells were determined by analyzing miRNA expression profiles in a comprehensive collection of 46 samples including CHO-K1, CHO-S and CHO-DUKXB11, which were adapted to various culture conditions, and analyzed in different growth stages using microarrays. By applying Spearman or Pearson correlation coefficient criteria of >|0.6|, miRNAs with high correlation to the overall growth, or to growth rates observed in the exponential, serum-free, and serum-free exponential phases, were identified. An overlap of twelve miRNAs common to all sample sets was revealed, with nine positively and three negatively correlating miRNAs. The panel of miRNAs identified here can help to understand growth regulation in CHO cells and contains putative engineering targets as well as biomarkers for cell lines with advantageous growth characteristics. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
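
    The correlation-based selection described above can be illustrated with a small sketch (not the published pipeline; matrix and threshold names are assumptions):

      import numpy as np
      from scipy.stats import spearmanr, pearsonr

      def growth_correlated(expr, growth, threshold=0.6):
          """expr: (n_mirnas, n_samples) expression matrix; growth: (n_samples,) growth rates."""
          hits = []
          for i, profile in enumerate(expr):
              rho, _ = spearmanr(profile, growth)
              r, _ = pearsonr(profile, growth)
              if abs(rho) > threshold or abs(r) > threshold:
                  hits.append((i, rho, r))          # miRNA index plus both coefficients
          return hits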

  19. Exploiting structure: Introduction and motivation

    NASA Technical Reports Server (NTRS)

    Xu, Zhong Ling

    1993-01-01

    Research activities performed during the period of 29 June 1993 through 31 August 1993 are summarized. The robust stability of systems whose transfer function or characteristic polynomial is a multilinear affine function of the parameters of interest was developed in two directions, algorithmic and theoretical. In the algorithmic direction, a new approach that reduces the computational burden of checking the robust stability of a system with multilinear uncertainty was found; this technique, called 'stability by linear process,' directly yields an algorithm. On the theoretical side, we obtained a robustness criterion for the family of polynomials whose coefficients are multilinear affine functions in the coefficient space, and also obtained a result on the robust stability of diamond families of polynomials with complex coefficients. We obtained limited results for SPR design and provide a framework for solving ACS. Finally, an outline of our results is provided in the appendix, along with an administrative note.

  20. [Income inequality, corruption, and life expectancy at birth in Mexico].

    PubMed

    Idrovo, Alvaro Javier

    2005-01-01

    To ascertain whether the effect of income inequality on life expectancy at birth in Mexico is mediated by corruption, used as a proxy for social capital, an ecological study was carried out with the 32 Mexican federative entities. Overall and sex-specific correlations of life expectancy at birth by federative entity were estimated with the Gini coefficient, the Corruption and Good Government Index, the percentage of Catholics, and the percentage of the population speaking an indigenous language. Robust linear regressions, with and without instrumental variables, were used to explore whether corruption acts as an intermediate variable in the studied relationship. Negative correlations between life expectancy at birth and the Gini coefficient and the population speaking an indigenous language were observed, with Spearman's rho near -0.60 (p < 0.05) and beyond -0.66 (p < 0.05), respectively. Moreover, the Corruption and Good Government Index correlated with men's life expectancy at birth with a Spearman's rho of -0.3592 (p < 0.05). Regressions with instruments were more consistent than conventional ones and showed a strong negative effect (p < 0.05) of income inequality on life expectancy at birth. This effect was greater among men. The findings suggest a negative effect of income inequality on life expectancy at birth in Mexico, mediated by corruption levels and other related cultural factors.

  1. Psychometric testing of the modified Care Dependency Scale (Neuro-CDS).

    PubMed

    Piredda, Michela; Biagioli, Valentina; Gambale, Giulia; Porcelli, Elisa; Barbaranelli, Claudio; Palese, Alvisa; De Marinis, Maria Grazia

    2016-01-01

    Effective measures of nursing care dependency in neurorehabilitation are warranted to plan nursing interventions to help patients avoid increasing dependency. The Care Dependency Scale (CDS) is a theory-based, comprehensive tool to evaluate functional disability. This study aimed to modify the CDS for neurological and neurorehabilitation patients (Neuro-CDS) and to test its psychometric properties in adult neurorehabilitation inpatients. Exploratory factor analysis (EFA) was performed using a Maximum Likelihood robust (MLR) estimator. The Barthel Index (BI) was used to evaluate concurrent validity. Stability was measured using the Intra-class Correlation Coefficient (ICC). The sample included 124 patients (mean age = 69.7 years, 54% male). The EFA revealed a two-factor structure with good fit indexes, Factor 1 (Physical care dependence) loaded by 11 items and Factor 2 (Psycho-social care dependence) loaded by 4 items. The correlation between factors was 0.61. Correlations between Factor 1 and the BI and between Factor 2 and the BI were r = 0.843 and r = 0.677, respectively (p <  0.001). The Cronbach's alpha coefficients were 0.99 and 0.88 (Factor 1 and 2). The ICC was 0.98. The Neuro-CDS is multidimensional, valid, reliable, straightforward, and able to measure care dependence in neurorehabilitation patients as a basis for individualized and holistic care.

  2. Robust adaptive multichannel SAR processing based on covariance matrix reconstruction

    NASA Astrophysics Data System (ADS)

    Tan, Zhen-ya; He, Feng

    2018-04-01

    With the combination of digital beamforming (DBF) processing, multichannel synthetic aperture radar (SAR) systems in azimuth show great promise for high-resolution and wide-swath imaging, whereas conventional processing methods do not take the nonuniformity of the scattering coefficient into consideration. This paper presents a robust adaptive multichannel SAR processing method which first utilizes the Capon spatial spectrum estimator to obtain the spatial spectrum distribution over all ambiguous directions, and then reconstructs the interference-plus-noise covariance matrix from its definition to acquire the multichannel SAR processing filter. This novel method improves processing performance under a nonuniform scattering coefficient and is robust against array errors. Experiments with real measured data demonstrate the effectiveness and robustness of the proposed method.
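
    A generic narrowband analogue of the reconstruction idea described above is sketched below (a simplified uniform-linear-array version, not the SAR-specific processing; the steering model and grid of ambiguous directions are assumptions):

      import numpy as np

      def steering(theta, n_elem, d=0.5):
          """ULA steering vector; d is element spacing in wavelengths, theta in radians."""
          return np.exp(2j * np.pi * d * np.arange(n_elem) * np.sin(theta))

      def reconstruction_filter(R, theta0, ambiguous_thetas, n_elem):
          R_inv = np.linalg.inv(R)
          R_in = np.zeros_like(R)
          for th in ambiguous_thetas:                       # directions other than the desired one
              a = steering(th, n_elem)[:, None]
              p = 1.0 / np.real(a.conj().T @ R_inv @ a)     # Capon spatial spectrum estimate
              R_in += p * (a @ a.conj().T)                  # reconstructed interference+noise covariance
          a0 = steering(theta0, n_elem)[:, None]
          w = np.linalg.solve(R_in, a0)
          return w / (a0.conj().T @ w)                      # distortionless response toward theta0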

  3. Identifying tectonic parameters that influence tsunamigenesis

    NASA Astrophysics Data System (ADS)

    van Zelst, Iris; Brizzi, Silvia; van Dinther, Ylona; Heuret, Arnauld; Funiciello, Francesca

    2017-04-01

    The role of tectonics in tsunami generation is at present poorly understood. However, the fact that some regions produce more tsunamis than others indicates that tectonics could influence tsunamigenesis. Here, we complement a global earthquake database that contains geometrical, mechanical, and seismicity parameters of subduction zones with tsunami data. We statistically analyse the database to identify the tectonic parameters that affect tsunamigenesis. Pearson's product-moment correlation coefficients reveal high positive correlations of 0.65 between, amongst others, the maximum water height of tsunamis and the seismic coupling in a subduction zone. However, these correlations are mainly caused by outliers. Spearman's rank correlation coefficient yields more robust correlations of 0.60 between the number of tsunamis in a subduction zone and the subduction velocity (positive correlation) and the sediment thickness at the trench (negative correlation). Interestingly, there is a positive correlation between the latter and tsunami magnitude. In an effort towards multivariate statistics, a binary decision tree analysis is conducted with one variable; however, this shows that the amount of data is too scarce. To complement this limited amount of data and to assess the physical causality of the tectonic parameters with regard to tsunamigenesis, we conduct a numerical study of the most promising parameters using a geodynamic seismic cycle model. We show that an increase in sediment thickness on the subducting plate results in a shift in seismic activity from outer-rise normal faults to splay faults. We also show that the splay fault is the preferred rupture path for a strongly velocity-strengthening friction regime in the shallow part of the subduction zone, which increases the tsunamigenic potential. A larger updip limit of the seismogenic zone results in larger vertical surface displacement.

  4. Efficient moving target analysis for inverse synthetic aperture radar images via joint speeded-up robust features and regular moment

    NASA Astrophysics Data System (ADS)

    Yang, Hongxin; Su, Fulin

    2018-01-01

    We propose a moving target analysis algorithm using speeded-up robust features (SURF) and regular moment in inverse synthetic aperture radar (ISAR) image sequences. In our study, we first extract interest points from ISAR image sequences by SURF. Different from traditional feature point extraction methods, SURF-based feature points are invariant to scattering intensity, target rotation, and image size. Then, we employ a bilateral feature registering model to match these feature points. The feature registering scheme can not only search the isotropic feature points to link the image sequences but also reduce the error matching pairs. After that, the target centroid is detected by regular moment. Consequently, a cost function based on correlation coefficient is adopted to analyze the motion information. Experimental results based on simulated and real data validate the effectiveness and practicability of the proposed method.

  5. Media violence exposure and physical aggression in fifth-grade children.

    PubMed

    Coker, Tumaini R; Elliott, Marc N; Schwebel, David C; Windle, Michael; Toomey, Sara L; Tortolero, Susan R; Hertz, Marci F; Peskin, Melissa F; Schuster, Mark A

    2015-01-01

    To examine the association of media violence exposure and physical aggression in fifth graders across 3 media types. We analyzed data from a population-based, cross-sectional survey of 5,147 fifth graders and their parents in 3 US metropolitan areas. We used multivariable linear regression and report partial correlation coefficients to examine associations between children's exposure to violence in television/film, video games, and music (reported time spent consuming media and reported frequency of violent content: physical fighting, hurting, shooting, or killing) and the Problem Behavior Frequency Scale. Child-reported media violence exposure was associated with physical aggression after multivariable adjustment for sociodemographics, family and community violence, and child mental health symptoms (partial correlation coefficients: TV, 0.17; video games, 0.15; music, 0.14). This association was significant and independent for television, video games, and music violence exposure in a model including all 3 media types (partial correlation coefficients: TV, 0.11; video games, 0.09; music, 0.09). There was a significant positive interaction between media time and media violence for video games and music but not for television. Effect sizes for the association of media violence exposure and physical aggression were greater in magnitude than for most of the other examined variables. The association between physical aggression and media violence exposure is robust and persistent; the strength of this association of media violence may be at least as important as that of other factors with physical aggression in children, such as neighborhood violence, home violence, child mental health, and male gender. Copyright © 2015 Academic Pediatric Association. All rights reserved.
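
    A residual-based partial correlation coefficient, the kind of statistic reported above, can be sketched as follows (hypothetical variable names; not the study's multivariable model):

      import numpy as np

      def partial_corr(x, y, covariates):
          """x, y: (n,) arrays; covariates: (n, k) matrix of adjustment variables."""
          Z = np.column_stack([np.ones(len(x)), covariates])
          rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # residual of x given covariates
          ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]   # residual of y given covariates
          return np.corrcoef(rx, ry)[0, 1]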

  6. A Common Calibration Source Framework for Fully-Polarimetric and Interferometric Radiometers

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; Davis, Brynmor; Piepmeier, Jeff; Zukor, Dorothy J. (Technical Monitor)

    2000-01-01

    Two types of microwave radiometry--synthetic thinned array radiometry (STAR) and fully-polarimetric (FP) radiometry--have received increasing attention during the last several years. STAR radiometers offer a technological solution to achieving high spatial resolution imaging from orbit without requiring a filled aperture or a moving antenna, and FP radiometers measure extra polarization state information upon which entirely new or more robust geophysical retrieval algorithms can be based. Radiometer configurations used for both STAR and FP instruments share one fundamental feature that distinguishes them from more 'standard' radiometers, namely, they measure correlations between pairs of microwave signals. The calibration requirements for correlation radiometers are broader than those for standard radiometers. Quantities of interest include total powers, complex correlation coefficients, various offsets, and possible nonlinearities. A candidate for an ideal calibration source would be one that injects test signals with precisely controllable correlation coefficients and absolute powers simultaneously into a pair of receivers, permitting all of these calibration quantities to be measured. The complex nature of correlation radiometer calibration, coupled with certain inherent similarities between STAR and FP instruments, suggests significant leverage in addressing both problems together. Recognizing this, a project was recently begun at NASA Goddard Space Flight Center to develop a compact low-power subsystem for spaceflight STAR or FP receiver calibration. We present a common theoretical framework for the design of signals for a controlled correlation calibration source. A statistical model is described, along with temporal and spectral constraints on such signals. Finally, a method for realizing these signals is demonstrated using a Matlab-based implementation.
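
    The core idea of a calibration source with a controllable correlation coefficient can be illustrated with a minimal sketch (real-valued baseband noise and ideal injection are assumed; this is not the flight design):

      import numpy as np

      def correlated_pair(n, rho, rng=None):
          """Two zero-mean, unit-variance noise signals whose correlation tends to rho."""
          if rng is None:
              rng = np.random.default_rng()
          x = rng.standard_normal(n)
          z = rng.standard_normal(n)
          y = rho * x + np.sqrt(1.0 - rho ** 2) * z
          return x, y

      x, y = correlated_pair(100_000, 0.3)
      print(np.corrcoef(x, y)[0, 1])    # close to the prescribed 0.3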

  7. The Validation of a Case-Based, Cumulative Assessment and Progressions Examination

    PubMed Central

    Coker, Adeola O.; Copeland, Jeffrey T.; Gottlieb, Helmut B.; Horlen, Cheryl; Smith, Helen E.; Urteaga, Elizabeth M.; Ramsinghani, Sushma; Zertuche, Alejandra; Maize, David

    2016-01-01

    Objective. To assess content and criterion validity, as well as reliability of an internally developed, case-based, cumulative, high-stakes third-year Annual Student Assessment and Progression Examination (P3 ASAP Exam). Methods. Content validity was assessed through the writing-reviewing process. Criterion validity was assessed by comparing student scores on the P3 ASAP Exam with the nationally validated Pharmacy Curriculum Outcomes Assessment (PCOA). Reliability was assessed with psychometric analysis comparing student performance over four years. Results. The P3 ASAP Exam showed content validity through representation of didactic courses and professional outcomes. Similar scores on the P3 ASAP Exam and PCOA with Pearson correlation coefficient established criterion validity. Consistent student performance using Kuder-Richardson coefficient (KR-20) since 2012 reflected reliability of the examination. Conclusion. Pharmacy schools can implement internally developed, high-stakes, cumulative progression examinations that are valid and reliable using a robust writing-reviewing process and psychometric analyses. PMID:26941435

  8. Fully automated contour detection of the ascending aorta in cardiac 2D phase-contrast MRI.

    PubMed

    Codari, Marina; Scarabello, Marco; Secchi, Francesco; Sforza, Chiarella; Baselli, Giuseppe; Sardanelli, Francesco

    2018-04-01

    In this study we propose a fully automated method for localizing and segmenting the ascending aortic lumen with phase-contrast magnetic resonance imaging (PC-MRI). Twenty-five phase-contrast series were randomly selected out of a large population dataset of patients whose cardiac MRI examination, performed from September 2008 to October 2013, was unremarkable. The local Ethical Committee approved this retrospective study. The ascending aorta was automatically identified on each phase of the cardiac cycle using a priori knowledge of aortic geometry. The frame that maximized the area, eccentricity, and solidity parameters was chosen for unsupervised initialization. Aortic segmentation was performed on each frame using the active contour without edges technique. The entire algorithm was developed using Matlab R2016b. To validate the proposed method, the manual segmentation performed by a highly experienced operator was used. The Dice similarity coefficient, Bland-Altman analysis, and Pearson's correlation coefficient were used as performance metrics. Comparing automated and manual segmentation of the aortic lumen on 714 images, Bland-Altman analysis showed a bias of -6.68 mm², a coefficient of repeatability of 91.22 mm², a mean area measurement of 581.40 mm², and a reproducibility of 85%. Automated and manual segmentation were highly correlated (R=0.98). The Dice similarity coefficient versus the manual reference standard was 94.6±2.1% (mean±standard deviation). A fully automated and robust method for identification and segmentation of the ascending aorta on PC-MRI was developed. Its application in patients with a variety of pathologic conditions is advisable. Copyright © 2017 Elsevier Inc. All rights reserved.
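
    The Dice similarity coefficient used above to compare automated and manual lumen masks reduces to a one-line computation (boolean masks of equal shape; names are illustrative):

      import numpy as np

      def dice(mask_a, mask_b):
          a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())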

  9. LCC-Demons: a robust and accurate symmetric diffeomorphic registration algorithm.

    PubMed

    Lorenzi, M; Ayache, N; Frisoni, G B; Pennec, X

    2013-11-01

    Non-linear registration is a key instrument for computational anatomy to study the morphology of organs and tissues. However, in order to be an effective instrument for clinical practice, registration algorithms must be computationally efficient, accurate and, most importantly, robust to the multiple biases affecting medical images. In this work we propose a fast and robust registration framework based on the log-Demons diffeomorphic registration algorithm. The transformation is parameterized by stationary velocity fields (SVFs), and the similarity metric implements a symmetric local correlation coefficient (LCC). Moreover, we show how the SVF setting provides a stable and consistent numerical scheme for the computation of the Jacobian determinant and the flux of the deformation across the boundaries of a given region. Thus, it provides a robust evaluation of spatial changes. We tested the LCC-Demons in the inter-subject registration setting, by comparing with state-of-the-art registration algorithms on publicly available datasets, and in the intra-subject longitudinal registration problem, for statistically powered measurements of longitudinal atrophy in Alzheimer's disease. Experimental results show that LCC-Demons is a generic, flexible, efficient and robust algorithm for the accurate non-linear registration of images, which can find several applications in the field of medical imaging. Without any additional optimization, it solves intra- and inter-subject registration problems equally well, and compares favorably to state-of-the-art methods. Copyright © 2013 Elsevier Inc. All rights reserved.
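
    A local correlation coefficient of the type used as the LCC-Demons similarity metric can be sketched with Gaussian-weighted local statistics (a simplified scalar-image version; the sigma and eps parameters are assumptions):

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def local_correlation(I, J, sigma=3.0, eps=1e-8):
          I, J = np.asarray(I, float), np.asarray(J, float)
          mI, mJ = gaussian_filter(I, sigma), gaussian_filter(J, sigma)
          cov = gaussian_filter(I * J, sigma) - mI * mJ          # local covariance
          vI = gaussian_filter(I * I, sigma) - mI ** 2           # local variances
          vJ = gaussian_filter(J * J, sigma) - mJ ** 2
          return cov / np.sqrt(vI * vJ + eps)                    # one LCC value per voxel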

  10. Statistical analysis of co-occurrence patterns in microbial presence-absence datasets

    PubMed Central

    Bewick, Sharon; Thielen, Peter; Mehoke, Thomas; Breitwieser, Florian P.; Paudel, Shishir; Adhikari, Arjun; Wolfe, Joshua; Slud, Eric V.; Karig, David; Fagan, William F.

    2017-01-01

    Drawing on a long history in macroecology, correlation analysis of microbiome datasets is becoming a common practice for identifying relationships or shared ecological niches among bacterial taxa. However, many of the statistical issues that plague such analyses in macroscale communities remain unresolved for microbial communities. Here, we discuss problems in the analysis of microbial species correlations based on presence-absence data. We focus on presence-absence data because this information is more readily obtainable from sequencing studies, especially for whole-genome sequencing, where abundance estimation is still in its infancy. First, we show how Pearson’s correlation coefficient (r) and Jaccard’s index (J)–two of the most common metrics for correlation analysis of presence-absence data–can contradict each other when applied to a typical microbiome dataset. In our dataset, for example, 14% of species-pairs predicted to be significantly correlated by r were not predicted to be significantly correlated using J, while 37.4% of species-pairs predicted to be significantly correlated by J were not predicted to be significantly correlated using r. Mismatch was particularly common among species-pairs with at least one rare species (<10% prevalence), explaining why r and J might differ more strongly in microbiome datasets, where there are large numbers of rare taxa. Indeed 74% of all species-pairs in our study had at least one rare species. Next, we show how Pearson’s correlation coefficient can result in artificial inflation of positive taxon relationships and how this is a particular problem for microbiome studies. We then illustrate how Jaccard’s index of similarity (J) can yield improvements over Pearson’s correlation coefficient. However, the standard null model for Jaccard’s index is flawed, and thus introduces its own set of spurious conclusions. We thus identify a better null model based on a hypergeometric distribution, which appropriately corrects for species prevalence. This model is available from recent statistics literature, and can be used for evaluating the significance of any value of an empirically observed Jaccard’s index. The resulting simple, yet effective method for handling correlation analysis of microbial presence-absence datasets provides a robust means of testing and finding relationships and/or shared environmental responses among microbial taxa. PMID:29145425

  11. A comparison of different synchronization measures in electroencephalogram during propofol anesthesia.

    PubMed

    Liang, Zhenhu; Ren, Ye; Yan, Jiaqing; Li, Duan; Voss, Logan J; Sleigh, Jamie W; Li, Xiaoli

    2016-08-01

    Electroencephalogram (EEG) synchronization is becoming an essential tool to describe neurophysiological mechanisms of communication between brain regions under general anesthesia. Different synchronization measures have their own properties to reflect the changes of EEG activities during different anesthetic states. However, the performance characteristics and the relations of different synchronization measures in evaluating synchronization changes during propofol-induced anesthesia are not fully elucidated. Two-channel EEG data from seven volunteers who had undergone a brief standardized propofol anesthesia were then adopted to calculate eight synchronization indexes. We computed the prediction probability (PK) of synchronization indexes with Bispectral Index (BIS) and propofol effect-site concentration (Ceff) to quantify the ability of the indexes to predict BIS and Ceff. Also, box plots and coefficient of variation were used to reflect the different synchronization changes and their robustness to noise in awake, unconscious and recovery states, and the Pearson correlation coefficient (R) was used for assessing the relationship among synchronization measures, BIS and Ceff. Permutation cross mutual information (PCMI) and determinism (DET) could predict BIS and follow Ceff better than nonlinear interdependence (NI), mutual information based on kernel estimation (KerMI) and cross correlation. Wavelet transform coherence (WTC) in α and β frequency bands followed BIS and Ceff better than that in other frequency bands. There was a significant decrease in unconscious state and a significant increase in recovery state for PCMI and NI, while the trends were opposite for KerMI, DET and WTC. Phase synchronization based on phase locking value (PSPLV) in δ, θ, α and γ1 frequency bands dropped significantly in unconscious state, whereas it had no significant synchronization in recovery state. Moreover, PCMI, NI, DET correlated closely with each other and they had a better robustness to noise and higher correlation with BIS and Ceff than other synchronization indexes. Propofol caused EEG synchronization changes during the anesthetic period. Different synchronization measures had individual properties in evaluating synchronization changes in different anesthetic states, which might be related to various forms of neural activities and neurophysiological mechanisms under general anesthesia.

  12. Assessment of biological half life using in silico QSPkR approach: a self organizing molecular field analysis (SOMFA) on a series of antimicrobial quinolone drugs.

    PubMed

    Goel, Honey; Sinha, V R; Thareja, Suresh; Aggarwal, Saurabh; Kumar, Manoj

    2011-08-30

    The quinolones belong to a family of synthetic, potent, broad-spectrum antibiotics that are particularly active against gram-negative organisms, especially Pseudomonas aeruginosa. A 3D-QSPkR approach has been used to obtain the quantitative structure-pharmacokinetic relationship for a series of quinolone drugs using SOMFA. The series, consisting of 28 molecules, has been investigated for pharmacokinetic performance using the biological half life (t(1/2)). A statistically validated, robust model for a diverse group of quinolone drugs with flexibility in structure and pharmacokinetic profile (t(1/2)) was obtained using SOMFA, having a good cross-validated correlation coefficient r(cv)(2) (0.6847), non-cross-validated correlation coefficient r(2) value (0.7310), and high F-test value (33.9663). Analysis of the 3D-QSPkR models through electrostatic and shape grids provides useful information about the shape and electrostatic potential contributions to t(1/2). The analysis of the SOMFA results provides insight for the generation of novel molecular architectures of quinolones with optimal half life and improved biological profile. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Automated cerebral infarct volume measurement in follow-up noncontrast CT scans of patients with acute ischemic stroke.

    PubMed

    Boers, A M; Marquering, H A; Jochem, J J; Besselink, N J; Berkhemer, O A; van der Lugt, A; Beenen, L F; Majoie, C B

    2013-08-01

    Cerebral infarct volume (CIV) as observed in follow-up CT is an important radiologic outcome measure of the effectiveness of treatment of patients with acute ischemic stroke. However, manual measurement of CIV is time-consuming and operator-dependent. The purpose of this study was to develop and evaluate a robust automated measurement of the CIV. The CIV in early follow-up CT images of 34 consecutive patients with acute ischemic stroke was segmented with an automated intensity-based region-growing algorithm, which includes partial volume effect correction near the skull, midline determination, and ventricle and hemorrhage exclusion. Two observers manually delineated the CIV. Interobserver variability of the manual assessments and the accuracy of the automated method were evaluated using the Pearson correlation, Bland-Altman analysis, and Dice coefficients. The accuracy was defined as the correlation with the manual assessment as a reference standard. The Pearson correlation for the automated method compared with the reference standard was similar to the manual correlation (R = 0.98). The accuracy of the automated method was excellent, with a mean difference of 0.5 mL and limits of agreement of -38.0 to 39.1 mL, which were more consistent than the interobserver variability of the 2 observers (-40.9 to 44.1 mL). However, the Dice coefficients were higher for the manual delineation. The automated method showed a strong correlation and accuracy with the manual reference measurement. This approach has the potential to become the standard in assessing the infarct volume as a secondary outcome measure for evaluating the effectiveness of treatment.

  14. Predicting enteric methane emission of dairy cows with milk Fourier-transform infrared spectra and gas chromatography-based milk fatty acid profiles.

    PubMed

    van Gastelen, S; Mollenhorst, H; Antunes-Fernandes, E C; Hettinga, K A; van Burgsteden, G G; Dijkstra, J; Rademaker, J L W

    2018-06-01

    The objective of the present study was to compare the prediction potential of milk Fourier-transform infrared spectroscopy (FTIR) for CH4 emissions of dairy cows with that of gas chromatography (GC)-based milk fatty acids (MFA). Data from 9 experiments with lactating Holstein-Friesian cows, with a total of 30 dietary treatments and 218 observations, were used. Methane emissions were measured for 3 consecutive days in climate respiration chambers and expressed as production (g/d), yield (g/kg of dry matter intake; DMI), and intensity (g/kg of fat- and protein-corrected milk; FPCM). Dry matter intake was 16.3 ± 2.18 kg/d (mean ± standard deviation), FPCM yield was 25.9 ± 5.06 kg/d, CH4 production was 366 ± 53.9 g/d, CH4 yield was 22.5 ± 2.10 g/kg of DMI, and CH4 intensity was 14.4 ± 2.58 g/kg of FPCM. Milk was sampled during the same days and analyzed by GC and by FTIR. Multivariate GC-determined MFA-based and FTIR-based CH4 prediction models were developed, and subsequently, the final CH4 prediction models were evaluated with root mean squared error of prediction and concordance correlation coefficient analysis. Further, we performed a random 10-fold cross validation to calculate the performance parameters of the models (e.g., the coefficient of determination of cross validation). The final GC-determined MFA-based CH4 prediction models estimate CH4 production, yield, and intensity with a root mean squared error of prediction of 35.7 g/d, 1.6 g/kg of DMI, and 1.6 g/kg of FPCM and with a concordance correlation coefficient of 0.72, 0.59, and 0.77, respectively. The final FTIR-based CH4 prediction models estimate CH4 production, yield, and intensity with a root mean squared error of prediction of 43.2 g/d, 1.9 g/kg of DMI, and 1.7 g/kg of FPCM and with a concordance correlation coefficient of 0.52, 0.40, and 0.72, respectively. The GC-determined MFA-based prediction models described a greater part of the observed variation in CH4 emission than did the FTIR-based models. The cross validation results indicate that all CH4 prediction models (both GC-determined MFA-based and FTIR-based models) are robust; the difference between the coefficient of determination and the coefficient of determination of cross validation ranged from 0.01 to 0.07. The results indicate that GC-determined MFA have a greater potential than FTIR spectra to estimate CH4 production, yield, and intensity. Both techniques hold potential but may not yet be ready to predict CH4 emission of dairy cows in practice. Additional CH4 measurements are needed to improve the accuracy and robustness of GC-determined MFA and FTIR spectra for CH4 prediction. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
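
    The two evaluation statistics named above, the root mean squared error of prediction and the concordance correlation coefficient, can be computed as follows (a sketch using sample variances; the observed/predicted arrays are hypothetical):

      import numpy as np

      def rmsep(observed, predicted):
          o, p = np.asarray(observed, float), np.asarray(predicted, float)
          return np.sqrt(np.mean((o - p) ** 2))

      def concordance_cc(observed, predicted):
          o, p = np.asarray(observed, float), np.asarray(predicted, float)
          sxy = np.cov(o, p, ddof=1)[0, 1]
          return 2.0 * sxy / (o.var(ddof=1) + p.var(ddof=1) + (o.mean() - p.mean()) ** 2)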

  15. The Dichotomy in Degree Correlation of Biological Networks

    PubMed Central

    Hao, Dapeng; Li, Chuanxing

    2011-01-01

    Most complex networks from different areas, such as biology, sociology, or technology, show a correlation in node degree, where the probability of a link between two nodes depends on their connectivity. It is widely believed that complex networks are either disassortative (links between hubs are systematically suppressed) or assortative (links between hubs are enhanced). In this paper, we analyze a variety of biological networks and find that they generally show a dichotomous degree correlation. We find that many properties of biological networks can be explained by this dichotomy in degree correlation, including the neighborhood connectivity, the sickle-shaped clustering coefficient distribution and the modularity structure. This dichotomy distinguishes biological networks from real disassortative networks or assortative networks such as the Internet and social networks. We suggest that the modular structure of networks accounts for the dichotomy in degree correlation and vice versa, shedding light on the source of modularity in biological networks. We further show that a robust and well-connected network necessitates the dichotomy of degree correlation, suggestive of an evolutionary motivation for its existence. Finally, we suggest that a dichotomous degree correlation favors a centrally connected modular network, by which the integrity of the network and the specificity of modules might be reconciled. PMID:22164269

  16. Validation of an instrument to measure quality of life in British children with inflammatory bowel disease.

    PubMed

    Ogden, C A; Akobeng, A K; Abbott, J; Aggett, P; Sood, M R; Thomas, A G

    2011-09-01

    To validate IMPACT-III (UK), a health-related quality of life (HRQoL) instrument, in British children with inflammatory bowel disease (IBD). One hundred six children and parents were invited to participate. IMPACT-III (UK) was validated by inspection by health professionals and children to assess face and content validity, factor analysis to determine optimum domain structure, use of Cronbach alpha coefficients to test internal reliability, ANOVA to assess discriminant validity, correlation with the Child Health Questionnaire to assess concurrent validity, and use of intraclass correlation coefficients to assess test-retest reliability. The independent samples t test was used to measure differences between sexes and age groups, and between paper and computerised versions of IMPACT-III (UK). IMPACT-III (UK) had good face and content validity. The most robust factor solution was a 5-domain structure: body image, embarrassment, energy, IBD symptoms, and worries/concerns about IBD, all of which demonstrated good internal reliability (α = 0.74-0.88). Discriminant validity was demonstrated by significant (P  < 0.05, P < 0.01) differences in HRQoL scores between the severe, moderate, and inactive/mild symptom severity groups for the embarrassment scale (63.7 vs 81.0 vs 81.2), IBD symptom scale (45.0 vs 64.2 vs 80.6), and the energy scale (46.4 vs 62.1 vs 77.7). Concurrent validity of IMPACT-III (UK) with comparable domains of the Child Health Questionnaire was confirmed. Test-retest reliability was confirmed with good intraclass correlation coefficients of 0.66 to 0.84. Paper and computer versions of IMPACT-III (UK) collected comparable scores, and there were no differences between the sexes and age groups. IMPACT-III (UK) appears to be a useful tool to measure HRQoL in British children with IBD.

  17. Semantic similarity measures in the biomedical domain by leveraging a web search engine.

    PubMed

    Hsieh, Sheau-Ling; Chang, Wen-Yung; Chen, Chi-Huang; Weng, Yung-Ching

    2013-07-01

    Various studies of web-based semantic similarity measures have been carried out. However, measuring semantic similarity between two terms remains a challenging task. Traditional ontology-based methodologies have the limitation that both concepts must reside in the same ontology tree(s). Unfortunately, in practice, this assumption is not always applicable. On the other hand, if the corpus is sufficiently large, corpus-based methodologies can overcome the limitation, and the web is a continuously and enormously growing corpus. Therefore, a method of estimating semantic similarity is proposed that exploits the page counts of two biomedical concepts returned by the Google AJAX web search engine. The features are extracted as the co-occurrence patterns of two given terms P and Q, by querying P, Q, as well as P AND Q, and the web search hit counts of the defined lexico-syntactic patterns. The similarity scores of the different patterns are evaluated, by adapting support vector machines for classification, to leverage the robustness of semantic similarity measures. Experimental results validated against two datasets (dataset 1 provided by A. Hliaoutakis; dataset 2 provided by T. Pedersen) are presented and discussed. In dataset 1, the proposed approach achieves the best correlation coefficient (0.802) under SNOMED-CT. In dataset 2, the proposed method obtains the best correlation coefficient with physician scores (SNOMED-CT: 0.705; MeSH: 0.723) compared with other methods. However, the correlation coefficients with coder scores (SNOMED-CT: 0.496; MeSH: 0.539) showed the opposite outcome. In conclusion, the semantic similarity findings of the proposed method are close to physicians' ratings. Furthermore, the study provides a cornerstone investigation for extracting fully relevant information from digitized, free-text medical records in the National Taiwan University Hospital database.

  18. Robust, Adaptive Functional Regression in Functional Mixed Model Framework.

    PubMed

    Zhu, Hongxiao; Brown, Philip J; Morris, Jeffrey S

    2011-09-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets.

  19. Robust, Adaptive Functional Regression in Functional Mixed Model Framework

    PubMed Central

    Zhu, Hongxiao; Brown, Philip J.; Morris, Jeffrey S.

    2012-01-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets. PMID:22308015

  20. A new correlation coefficient for bivariate time-series data

    NASA Astrophysics Data System (ADS)

    Erdem, Orhan; Ceyhan, Elvan; Varli, Yusuf

    2014-11-01

    The correlation in time series has received considerable attention in the literature. Its use has attained an important role in the social sciences and finance. For example, pair trading in finance is concerned with the correlation between stock prices, returns, etc. In general, Pearson’s correlation coefficient is employed in these areas although it has many underlying assumptions which restrict its use. Here, we introduce a new correlation coefficient which takes into account the lag difference of data points. We investigate the properties of this new correlation coefficient. We demonstrate that it is more appropriate for showing the direction of the covariation of the two variables over time. We also compare the performance of the new correlation coefficient with Pearson’s correlation coefficient and Detrended Cross-Correlation Analysis (DCCA) via simulated examples.
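
    The coefficient proposed by the authors is not reproduced here, but the following Python sketch illustrates the kind of comparison the abstract describes: Pearson's r computed on the raw levels of two series versus a simple lag-aware view based on correlating first differences at several lags. The series, the 3-step lag and the lag-scanning scheme are hypothetical illustrations, not the authors' definition.

      import numpy as np

      def lagged_difference_correlations(x, y, max_lag=5):
          # Illustrative only: correlate first differences of y shifted by k steps
          # against first differences of x, to expose lagged co-movement.
          dx, dy = np.diff(x), np.diff(y)
          out = {0: float(np.corrcoef(dx, dy)[0, 1])}
          for k in range(1, max_lag + 1):
              out[k] = float(np.corrcoef(dx[:-k], dy[k:])[0, 1])
          return out

      rng = np.random.default_rng(0)
      x = np.cumsum(rng.normal(size=500))                  # random-walk "price" series
      y = np.roll(x, 3) + rng.normal(scale=0.5, size=500)  # y follows x with a 3-step lag
      print("Pearson r on levels:", round(np.corrcoef(x, y)[0, 1], 3))
      print("difference correlations by lag:",
            {k: round(v, 3) for k, v in lagged_difference_correlations(x, y).items()})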

  1. Mass-improvement of the vector current in three-flavor QCD

    NASA Astrophysics Data System (ADS)

    Fritzsch, P.

    2018-06-01

    We determine two improvement coefficients which are relevant to cancel mass-dependent cutoff effects in correlation functions with operator insertions of the non-singlet local QCD vector current. This determination is based on degenerate three-flavor QCD simulations of non-perturbatively O(a) improved Wilson fermions with tree-level improved gauge action. Employing a very robust strategy that has been pioneered in the quenched approximation leads to an accurate estimate of a counterterm cancelling dynamical quark cutoff effects linear in the trace of the quark mass matrix. To our knowledge this is the first time that such an effect has been determined systematically with large significance.

  2. Mechanistic Insights into the Binding of Class IIa HDAC Inhibitors toward Spinocerebellar Ataxia Type-2: A 3D-QSAR and Pharmacophore Modeling Approach

    PubMed Central

    Sinha, Siddharth; Goyal, Sukriti; Somvanshi, Pallavi; Grover, Abhinav

    2017-01-01

    Spinocerebellar ataxia type-2 (SCA-2) is a rare neurological disorder among the nine polyglutamine disorders, mainly caused by polyQ (CAG) trinucleotide repeat expansion within the gene coding for the ataxin-2 protein. The expanded trinucleotide repeats within the ataxin-2 protein sequester transcriptional cofactors, i.e., CREB-binding protein (CBP) and Ataxin-2 binding protein 1 (A2BP1), leading to a state of hypo-acetylation and transcriptional repression. Histone deacetylase inhibitors (HDACi) have been reported to restore transcriptional balance through inhibition of class IIa HDACs, which leads to increased acetylation and transcription, as demonstrated through in-vivo studies on mouse models of Huntington's disease. In this study, 61 di-aryl cyclo-propanehydroxamic acid derivatives were used for developing three-dimensional (3D) QSAR and pharmacophore models. These models were then employed for screening and selection of anti-ataxia compounds. The chosen QSAR model was observed to be statistically robust, with a correlation coefficient (r2) value of 0.6774, a cross-validated correlation coefficient (q2) of 0.6157 and a correlation coefficient for the external test set (pred_r2) of 0.7570. A high F-test value of 77.7093 signified the robustness of the model. Two potential drug leads, ZINC 00608101 (SEI) and ZINC 00329110 (ACI), were selected after a combined procedure of pharmacophore-based screening using the pharmacophore model ADDRR.20 and structural analysis using molecular docking and dynamics simulations. The pharmacophore and 3D-QSAR models generated were further validated for their screening and prediction ability using the enrichment factor (EF), goodness of hit (GH), and receiver operating characteristic (ROC) curve analysis. The compounds SEI and ACI exhibited docking scores of −10.097 and −9.182 kcal/mol, respectively. An evaluation of the binding conformations of the ligand-bound protein complexes was performed with MD simulations over a time period of 30 ns, along with free-energy binding calculations using the g_mmpbsa technique. Prediction of the inhibitory activities of the two lead compounds SEI (7.53) and ACI (6.84) using the 3D-QSAR model reaffirmed their inhibitory characteristics as potential anti-ataxia compounds. PMID:28119557
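
    For readers unfamiliar with the reported statistics, the sketch below shows how r2, a leave-one-out q2 and a test-set pred_r2 can be computed for a generic regression model in Python. Random descriptors and a plain linear model stand in for the actual descriptors and QSAR software; the numbers are illustrative only.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import LeaveOneOut

      def r2_score_simple(y, yhat):
          y, yhat = np.asarray(y, float), np.asarray(yhat, float)
          return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

      def q2_loo(X, y):
          # leave-one-out cross-validated r2 (q2)
          y = np.asarray(y, float)
          preds = np.empty_like(y)
          for train, test in LeaveOneOut().split(X):
              preds[test] = LinearRegression().fit(X[train], y[train]).predict(X[test])
          return 1.0 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)

      # hypothetical data: 61 "compounds" with 4 random descriptors
      rng = np.random.default_rng(1)
      X = rng.normal(size=(61, 4))
      y = X @ np.array([0.8, -0.5, 0.3, 0.0]) + rng.normal(scale=0.3, size=61)
      X_tr, y_tr, X_te, y_te = X[:45], y[:45], X[45:], y[45:]

      model = LinearRegression().fit(X_tr, y_tr)
      print("r2      :", round(r2_score_simple(y_tr, model.predict(X_tr)), 3))
      print("q2 (LOO):", round(q2_loo(X_tr, y_tr), 3))
      # simplified pred_r2: coefficient of determination on the external test set
      print("pred_r2 :", round(r2_score_simple(y_te, model.predict(X_te)), 3))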

  3. Mechanistic Insights into the Binding of Class IIa HDAC Inhibitors toward Spinocerebellar Ataxia Type-2: A 3D-QSAR and Pharmacophore Modeling Approach.

    PubMed

    Sinha, Siddharth; Goyal, Sukriti; Somvanshi, Pallavi; Grover, Abhinav

    2016-01-01

    Spinocerebellar ataxia type-2 (SCA-2) is a rare neurological disorder among the nine polyglutamine disorders, mainly caused by polyQ (CAG) trinucleotide repeat expansion within the gene coding for the ataxin-2 protein. The expanded trinucleotide repeats within the ataxin-2 protein sequester transcriptional cofactors, i.e., CREB-binding protein (CBP) and Ataxin-2 binding protein 1 (A2BP1), leading to a state of hypo-acetylation and transcriptional repression. Histone deacetylase inhibitors (HDACi) have been reported to restore transcriptional balance through inhibition of class IIa HDACs, which leads to increased acetylation and transcription, as demonstrated through in-vivo studies on mouse models of Huntington's disease. In this study, 61 di-aryl cyclo-propanehydroxamic acid derivatives were used for developing three-dimensional (3D) QSAR and pharmacophore models. These models were then employed for screening and selection of anti-ataxia compounds. The chosen QSAR model was observed to be statistically robust, with a correlation coefficient (r2) value of 0.6774, a cross-validated correlation coefficient (q2) of 0.6157 and a correlation coefficient for the external test set (pred_r2) of 0.7570. A high F-test value of 77.7093 signified the robustness of the model. Two potential drug leads, ZINC 00608101 (SEI) and ZINC 00329110 (ACI), were selected after a combined procedure of pharmacophore-based screening using the pharmacophore model ADDRR.20 and structural analysis using molecular docking and dynamics simulations. The pharmacophore and 3D-QSAR models generated were further validated for their screening and prediction ability using the enrichment factor (EF), goodness of hit (GH), and receiver operating characteristic (ROC) curve analysis. The compounds SEI and ACI exhibited docking scores of -10.097 and -9.182 kcal/mol, respectively. An evaluation of the binding conformations of the ligand-bound protein complexes was performed with MD simulations over a time period of 30 ns, along with free-energy binding calculations using the g_mmpbsa technique. Prediction of the inhibitory activities of the two lead compounds SEI (7.53) and ACI (6.84) using the 3D-QSAR model reaffirmed their inhibitory characteristics as potential anti-ataxia compounds.

  4. Simultaneous Quantification of Syringic Acid and Kaempferol in Extracts of Bergenia Species Using Validated High-Performance Thin-Layer Chromatographic-Densitometric Method.

    PubMed

    Srivastava, Nishi; Srivastava, Amit; Srivastava, Sharad; Rawat, Ajay Kumar Singh; Khan, Abdul Rahman

    2016-03-01

    A rapid, sensitive, selective and robust quantitative densitometric high-performance thin-layer chromatographic method was developed and validated for separation and quantification of syringic acid (SYA) and kaempferol (KML) in the hydrolyzed extracts of Bergenia ciliata and Bergenia stracheyi. The separation was performed on silica gel 60F254 high-performance thin-layer chromatography plates using toluene:ethyl acetate:formic acid (5:4:1, v/v/v) as the mobile phase. The quantification of SYA and KML was carried out using a densitometric reflection/absorption mode at 290 nm. Dense spots of SYA and KML appeared on the developed plate at retention factor values of 0.61 ± 0.02 and 0.70 ± 0.01, respectively. Precise and accurate quantification was performed using linear regression analysis by plotting peak area vs. concentration over 100-600 ng/band (correlation coefficient r = 0.997, coefficient of determination R2 = 0.996) for SYA and 100-600 ng/band (correlation coefficient r = 0.995, coefficient of determination R2 = 0.991) for KML. The developed method was validated in terms of accuracy, recovery and inter- and intraday precision as per International Conference on Harmonisation guidelines. The limits of detection for SYA and KML were determined as 91.63 and 142.26 ng, respectively, and the corresponding limits of quantification as 277.67 and 431.09 ng. The statistical data analysis showed that the method is reproducible and selective for the estimation of SYA and KML in extracts of B. ciliata and B. stracheyi. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
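
    The reported figures of merit follow from an ordinary linear calibration. A minimal sketch, assuming hypothetical peak-area data and the ICH Q2(R1) conventions LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope:

      import numpy as np
      from scipy import stats

      # hypothetical calibration data: concentration (ng/band) vs. peak area
      conc = np.array([100, 200, 300, 400, 500, 600], dtype=float)
      area = np.array([1020, 2110, 3050, 4150, 5040, 6120], dtype=float)

      fit = stats.linregress(conc, area)                # least-squares calibration line
      resid = area - (fit.intercept + fit.slope * conc)
      sigma = resid.std(ddof=2)                         # residual standard deviation

      lod = 3.3 * sigma / fit.slope                     # ICH Q2(R1) conventions
      loq = 10.0 * sigma / fit.slope

      print(f"r = {fit.rvalue:.4f}, R2 = {fit.rvalue ** 2:.4f}")
      print(f"LOD ~ {lod:.0f} ng/band, LOQ ~ {loq:.0f} ng/band")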

  5. Modified free volume theory of self-diffusion and molecular theory of shear viscosity of liquid carbon dioxide.

    PubMed

    Nasrabad, Afshin Eskandari; Laghaei, Rozita; Eu, Byung Chan

    2005-04-28

    In previous work on the density fluctuation theory of transport coefficients of liquids, it was necessary to use empirical self-diffusion coefficients to calculate the transport coefficients (e.g., shear viscosity of carbon dioxide). In this work, the necessity of empirical input of the self-diffusion coefficients in the calculation of shear viscosity is removed, and the theory is thus made a self-contained molecular theory of transport coefficients of liquids, albeit with an empirical parameter in the subcritical regime. The required self-diffusion coefficients of liquid carbon dioxide are calculated by using the modified free volume theory, for which the generic van der Waals equation of state and Monte Carlo simulations are combined to accurately compute the mean free volume by means of statistical mechanics. They have been computed as a function of density along four different isotherms and isobars. A Lennard-Jones site-site interaction potential was used to model the molecular carbon dioxide interaction. The density and temperature dependence of the theoretical self-diffusion coefficients is shown to be in excellent agreement with experimental data when the minimum critical free volume is identified with the molecular volume. The self-diffusion coefficients thus computed are then used to compute the density and temperature dependence of the shear viscosity of liquid carbon dioxide by employing the density fluctuation theory formula for shear viscosity as reported in an earlier paper (J. Chem. Phys. 2000, 112, 7118). The theoretical shear viscosity is shown to be robust and yields excellent density and temperature dependence for carbon dioxide. The pair correlation function appearing in the theory has been computed by Monte Carlo simulations.

  6. Tests of Hypotheses Arising In the Correlated Random Coefficient Model*

    PubMed Central

    Heckman, James J.; Schmierer, Daniel

    2010-01-01

    This paper examines the correlated random coefficient model. It extends the analysis of Swamy (1971), who pioneered the uncorrelated random coefficient model in economics. We develop the properties of the correlated random coefficient model and derive a new representation of the variance of the instrumental variable estimator for that model. We develop tests of the validity of the correlated random coefficient model against the null hypothesis of the uncorrelated random coefficient model. PMID:21170148

  7. Design of Robust Adaptive Unbalance Response Controllers for Rotors with Magnetic Bearings

    NASA Technical Reports Server (NTRS)

    Knospe, Carl R.; Tamer, Samir M.; Fedigan, Stephen J.

    1996-01-01

    Experimental results have recently demonstrated that an adaptive open loop control strategy can be highly effective in the suppression of unbalance induced vibration on rotors supported in active magnetic bearings. This algorithm, however, relies upon a predetermined gain matrix. Typically, this matrix is determined by an optimal control formulation resulting in the choice of the pseudo-inverse of the nominal influence coefficient matrix as the gain matrix. This solution may result in problems with stability and performance robustness since the estimated influence coefficient matrix is not equal to the actual influence coefficient matrix. Recently, analysis tools have been developed to examine the robustness of this control algorithm with respect to structured uncertainty. Herein, these tools are extended to produce a design procedure for determining the adaptive law's gain matrix. The resulting control algorithm has a guaranteed convergence rate and steady state performance in spite of the uncertainty in the rotor system. Several examples are presented which demonstrate the effectiveness of this approach and its advantages over the standard optimal control formulation.

  8. A Secure and Robust Object-Based Video Authentication System

    NASA Astrophysics Data System (ADS)

    He, Dajun; Sun, Qibin; Tian, Qi

    2004-12-01

    An object-based video authentication system, which combines watermarking, error correction coding (ECC), and digital signature techniques, is presented for protecting the authenticity between video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART) coefficients is selected as the feature to represent the video object and the background, respectively. ECC and cryptographic hashing are applied to those selected coefficients to generate the robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT) coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling while securely preventing malicious object modifications. The proposed solution can be further incorporated into public key infrastructure (PKI).

  9. A robust recognition and accurate locating method for circular coded diagonal target

    NASA Astrophysics Data System (ADS)

    Bao, Yunna; Shang, Yang; Sun, Xiaoliang; Zhou, Jiexin

    2017-10-01

    As a category of special control points which can be automatically identified, artificial coded targets have been widely developed in the fields of computer vision, photogrammetry, augmented reality, etc. In this paper, a new circular coded target designed by RockeTech Technology Corp. Ltd., called the circular coded diagonal target (CCDT), is analyzed and studied. A novel detection and recognition method with good robustness is proposed in the paper and implemented in Visual Studio. In this algorithm, firstly, the ellipse features of the center circle are used for rough positioning. Then, according to the characteristics of the center diagonal target, a circular frequency filter is designed to choose the correct center circle and to eliminate non-target noise. Precise positioning of the coded target is achieved by the correlation-coefficient extreme-value fitting method. Finally, the coded target recognition is achieved by decoding the binary sequence in the outer ring of the extracted target. To test the proposed algorithm, both simulation experiments and real experiments were carried out. The results show that the CCDT recognition and accurate locating method proposed in this paper can robustly recognize and accurately locate targets in complex and noisy backgrounds.

  10. A robust color image watermarking algorithm against rotation attacks

    NASA Astrophysics Data System (ADS)

    Han, Shao-cheng; Yang, Jin-feng; Wang, Rui; Jia, Gui-min

    2018-01-01

    A robust digital watermarking algorithm is proposed based on quaternion wavelet transform (QWT) and discrete cosine transform (DCT) for copyright protection of color images. The luminance component Y of a host color image in YIQ space is decomposed by QWT, and then the coefficients of four low-frequency subbands are transformed by DCT. An original binary watermark scrambled by Arnold map and iterated sine chaotic system is embedded into the mid-frequency DCT coefficients of the subbands. In order to improve the performance of the proposed algorithm against rotation attacks, a rotation detection scheme is implemented before watermark extracting. The experimental results demonstrate that the proposed watermarking scheme shows strong robustness not only against common image processing attacks but also against arbitrary rotation attacks.

  11. Modified Regression Correlation Coefficient for Poisson Regression Model

    NASA Astrophysics Data System (ADS)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study focuses on indicators of the predictive power of the Generalized Linear Model (GLM), which are widely used but often subject to restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model, where the dependent variable follows a Poisson distribution. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and in the presence of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient outperforms the traditional one in terms of bias and root mean square error (RMSE).
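
    A minimal sketch of the baseline quantity being modified, the regression correlation coefficient corr(Y, E(Y|X)) for a Poisson GLM, using statsmodels on simulated data; the proposed modification itself is not reproduced here.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 500
      X = rng.normal(size=(n, 2))
      mu = np.exp(0.3 + 0.6 * X[:, 0] - 0.4 * X[:, 1])
      y = rng.poisson(mu)                                # Poisson-distributed response

      fit = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
      mu_hat = fit.fittedvalues                          # estimate of E(Y | X)

      # regression correlation coefficient: corr(Y, E(Y|X));
      # the modified version proposed in the paper is not reproduced here
      print("regression correlation coefficient:", round(np.corrcoef(y, mu_hat)[0, 1], 3))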

  12. Fractal dimension analysis for robust ultrasonic non-destructive evaluation (NDE) of coarse grained materials

    NASA Astrophysics Data System (ADS)

    Li, Minghui; Hayward, Gordon

    2018-04-01

    Over recent decades, there has been a growing demand for reliable and robust non-destructive evaluation (NDE) of structures and components made from coarse-grained materials such as alloys, stainless steels, carbon-reinforced composites and concrete; however, when inspected using ultrasound, the flaw echoes are usually contaminated by high-level, time-invariant, and correlated grain noise originating from the microstructure and grain boundaries, leading to a very low signal-to-noise ratio (SNR), with the flaw information obscured or completely hidden by the grain noise. In this paper, fractal dimension analysis of the A-scan echoes is investigated as a measure of time-series complexity to distinguish echoes originating from real defects from the grain noise, and the normalized fractal dimension coefficients are then applied to the amplitudes as weighting factors to enhance the SNR and defect detection. Experiments on industrial samples of mild steel and stainless steel are conducted, and the results confirm the clear benefits of the method.

  13. Distance correlation methods for discovering associations in large astrophysical databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martínez-Gómez, Elizabeth; Richards, Mercedes T.; Richards, Donald St. P., E-mail: elizabeth.martinez@itam.mx, E-mail: mrichards@astro.psu.edu, E-mail: richards@stat.psu.edu

    2014-01-20

    High-dimensional, large-sample astrophysical databases of galaxy clusters, such as the Chandra Deep Field South COMBO-17 database, provide measurements on many variables for thousands of galaxies and a range of redshifts. Current understanding of galaxy formation and evolution rests sensitively on relationships between different astrophysical variables; hence an ability to detect and verify associations or correlations between variables is important in astrophysical research. In this paper, we apply a recently defined statistical measure called the distance correlation coefficient, which can be used to identify new associations and correlations between astrophysical variables. The distance correlation coefficient applies to variables of any dimension, can be used to determine smaller sets of variables that provide equivalent astrophysical information, is zero only when variables are independent, and is capable of detecting nonlinear associations that are undetectable by the classical Pearson correlation coefficient. Hence, the distance correlation coefficient provides more information than the Pearson coefficient. We analyze numerous pairs of variables in the COMBO-17 database with the distance correlation method and with the maximal information coefficient. We show that the Pearson coefficient can be estimated with higher accuracy from the corresponding distance correlation coefficient than from the maximal information coefficient. For given values of the Pearson coefficient, the distance correlation method has a greater ability than the maximal information coefficient to resolve astrophysical data into highly concentrated horseshoe- or V-shapes, which enhances classification and pattern identification. These results are observed over a range of redshifts beyond the local universe and for galaxies from elliptical to spiral.
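
    A short sketch of the sample distance correlation of Szekely et al., computed from double-centered pairwise distance matrices; the quadratic toy relation below stands in for an astrophysical variable pair and shows a nonlinear association that Pearson's r largely misses.

      import numpy as np

      def _double_centered_distances(z):
          z = np.asarray(z, float)
          if z.ndim == 1:
              z = z[:, None]
          d = np.sqrt(((z[:, None, :] - z[None, :, :]) ** 2).sum(-1))   # pairwise distances
          return d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()

      def distance_correlation(x, y):
          A, B = _double_centered_distances(x), _double_centered_distances(y)
          dcov2 = (A * B).mean()
          dvarx, dvary = (A * A).mean(), (B * B).mean()
          return 0.0 if dvarx * dvary == 0 else float(np.sqrt(dcov2 / np.sqrt(dvarx * dvary)))

      rng = np.random.default_rng(4)
      u = rng.uniform(-1, 1, 1000)
      v = u ** 2 + 0.1 * rng.normal(size=u.size)          # nonlinear relation, Pearson r near 0
      print("Pearson r           :", round(np.corrcoef(u, v)[0, 1], 3))
      print("distance correlation:", round(distance_correlation(u, v), 3))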

  14. Clustering Coefficients for Correlation Networks.

    PubMed

    Masuda, Naoki; Sakaki, Michiko; Ezaki, Takahiro; Watanabe, Takamitsu

    2018-01-01

    Graph theory is a useful tool for deciphering structural and functional networks of the brain on various spatial and temporal scales. The clustering coefficient quantifies the abundance of connected triangles in a network and is a major descriptive statistic of networks. For example, it finds an application in the assessment of small-worldness of brain networks, which is affected by attentional and cognitive conditions, age, psychiatric disorders and so forth. However, it remains unclear how the clustering coefficient should be measured in a correlation-based network, which is among major representations of brain networks. In the present article, we propose clustering coefficients tailored to correlation matrices. The key idea is to use three-way partial correlation or partial mutual information to measure the strength of the association between the two neighboring nodes of a focal node relative to the amount of pseudo-correlation expected from indirect paths between the nodes. Our method avoids the difficulties of previous applications of clustering coefficient (and other) measures in defining correlational networks, i.e., thresholding on the correlation value, discarding of negative correlation values, the pseudo-correlation problem and full partial correlation matrices whose estimation is computationally difficult. For proof of concept, we apply the proposed clustering coefficient measures to functional magnetic resonance imaging data obtained from healthy participants of various ages and compare them with conventional clustering coefficients. We show that the clustering coefficients decline with age. The proposed clustering coefficients are more strongly correlated with age than the conventional ones are. We also show that the local variants of the proposed clustering coefficients (i.e., abundance of triangles around a focal node) are useful in characterizing individual nodes. In contrast, the conventional local clustering coefficients were strongly correlated with and therefore may be confounded by the node's connectivity. The proposed methods are expected to help us to understand clustering and lack thereof in correlational brain networks, such as those derived from functional time series and across-participant correlation in neuroanatomical properties.
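
    The key ingredient described above is the first-order partial correlation between two neighbors of a focal node, controlling for that node. The sketch below builds an illustrative local coefficient from it; it follows the spirit of the proposal, but the weighting and normalization are assumptions rather than the exact estimator defined in the paper.

      import numpy as np

      def partial_corr(R, j, k, i):
          # first-order partial correlation between j and k, controlling for node i
          num = R[j, k] - R[i, j] * R[i, k]
          return num / np.sqrt((1 - R[i, j] ** 2) * (1 - R[i, k] ** 2))

      def local_coefficient(R, i):
          # illustrative local coefficient for focal node i: weighted average partial
          # correlation of neighbor pairs given i (the weights are an assumption)
          n = R.shape[0]
          others = [j for j in range(n) if j != i]
          num = den = 0.0
          for a, j in enumerate(others):
              for k in others[a + 1:]:
                  w = abs(R[i, j] * R[i, k])
                  num += w * partial_corr(R, j, k, i)
                  den += w
          return num / den if den > 0 else 0.0

      rng = np.random.default_rng(5)
      X = rng.normal(size=(200, 6))
      X[:, 1] += X[:, 0]
      X[:, 2] += X[:, 0]                                  # nodes 1 and 2 correlate through node 0
      R = np.corrcoef(X, rowvar=False)
      print([round(local_coefficient(R, i), 3) for i in range(6)])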

  15. Clustering Coefficients for Correlation Networks

    PubMed Central

    Masuda, Naoki; Sakaki, Michiko; Ezaki, Takahiro; Watanabe, Takamitsu

    2018-01-01

    Graph theory is a useful tool for deciphering structural and functional networks of the brain on various spatial and temporal scales. The clustering coefficient quantifies the abundance of connected triangles in a network and is a major descriptive statistic of networks. For example, it finds an application in the assessment of small-worldness of brain networks, which is affected by attentional and cognitive conditions, age, psychiatric disorders and so forth. However, it remains unclear how the clustering coefficient should be measured in a correlation-based network, which is among major representations of brain networks. In the present article, we propose clustering coefficients tailored to correlation matrices. The key idea is to use three-way partial correlation or partial mutual information to measure the strength of the association between the two neighboring nodes of a focal node relative to the amount of pseudo-correlation expected from indirect paths between the nodes. Our method avoids the difficulties of previous applications of clustering coefficient (and other) measures in defining correlational networks, i.e., thresholding on the correlation value, discarding of negative correlation values, the pseudo-correlation problem and full partial correlation matrices whose estimation is computationally difficult. For proof of concept, we apply the proposed clustering coefficient measures to functional magnetic resonance imaging data obtained from healthy participants of various ages and compare them with conventional clustering coefficients. We show that the clustering coefficients decline with age. The proposed clustering coefficients are more strongly correlated with age than the conventional ones are. We also show that the local variants of the proposed clustering coefficients (i.e., abundance of triangles around a focal node) are useful in characterizing individual nodes. In contrast, the conventional local clustering coefficients were strongly correlated with and therefore may be confounded by the node's connectivity. The proposed methods are expected to help us to understand clustering and lack thereof in correlational brain networks, such as those derived from functional time series and across-participant correlation in neuroanatomical properties. PMID:29599714

  16. [Correlation coefficient-based classification method of hydrological dependence variability: With auto-regression model as example].

    PubMed

    Zhao, Yu Xi; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi

    2018-04-01

    Hydrological process evaluation is temporally dependent. Hydrological time series that include dependence components do not meet the data consistency assumption for hydrological computation. Both of these factors cause great difficulty for water research. Given the existence of hydrological dependence variability, we proposed a correlation-coefficient-based method for significance evaluation of hydrological dependence based on an auto-regression model. By calculating the correlation coefficient between the original series and its dependence component and selecting reasonable thresholds of the correlation coefficient, this method divides the significance degree of dependence into no variability, weak variability, mid variability, strong variability, and drastic variability. By deducing the relationship between the correlation coefficient and the auto-correlation coefficients of each order of the series, we found that the correlation coefficient is mainly determined by the magnitude of the auto-correlation coefficients from order 1 to order p, which clarifies the theoretical basis of the method. With first-order and second-order auto-regression models as examples, the reasonability of the deduced formula was verified through Monte Carlo experiments classifying the relationship between the correlation coefficient and the auto-correlation coefficient. The method was used to analyze three observed hydrological time series. The results indicate the coexistence of stochastic and dependence characteristics in the hydrological process.
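
    A rough Python sketch of the idea: fit an AR(1) model, correlate the series with its dependence component, and map the correlation onto variability classes. The thresholds below are placeholders, not the values derived in the paper.

      import numpy as np

      def dependence_correlation_ar1(x):
          # fit an AR(1) model and correlate the series with its dependence
          # component phi * x[t-1] (a sketch of the paper's idea)
          xc = np.asarray(x, float) - np.mean(x)
          phi = np.sum(xc[1:] * xc[:-1]) / np.sum(xc[:-1] ** 2)
          dependence = phi * xc[:-1]
          return float(np.corrcoef(xc[1:], dependence)[0, 1]), float(phi)

      def classify(r, thresholds=(0.2, 0.4, 0.6, 0.8)):
          # placeholder thresholds; the paper derives its own values
          labels = ["no", "weak", "mid", "strong", "drastic"]
          return labels[int(np.searchsorted(thresholds, abs(r)))] + " variability"

      rng = np.random.default_rng(6)
      x = np.zeros(600)
      for t in range(1, 600):
          x[t] = 0.7 * x[t - 1] + rng.normal()
      r, phi = dependence_correlation_ar1(x)
      print(f"phi = {phi:.2f}, correlation with dependence component = {r:.2f}, {classify(r)}")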

  17. Cascading failure in scale-free networks with tunable clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Xue-Jun; Gu, Bo; Guan, Xiang-Min; Zhu, Yan-Bo; Lv, Ren-Li

    2016-02-01

    Cascading failure is ubiquitous in many networked infrastructure systems, such as power grids, Internet and air transportation systems. In this paper, we extend the cascading failure model to a scale-free network with tunable clustering and focus on the effect of clustering coefficient on system robustness. It is found that the network robustness undergoes a nonmonotonic transition with the increment of clustering coefficient: both highly and lowly clustered networks are fragile under the intentional attack, and the network with moderate clustering coefficient can better resist the spread of cascading. We then provide an extensive explanation for this constructive phenomenon via the microscopic point of view and quantitative analysis. Our work can be useful to the design and optimization of infrastructure systems.

  18. Estimation of the biserial correlation and its sampling variance for use in meta-analysis.

    PubMed

    Jacobs, Perke; Viechtbauer, Wolfgang

    2017-06-01

    Meta-analyses are often used to synthesize the findings of studies examining the correlational relationship between two continuous variables. When only dichotomous measurements are available for one of the two variables, the biserial correlation coefficient can be used to estimate the product-moment correlation between the two underlying continuous variables. Unlike the point-biserial correlation coefficient, biserial correlation coefficients can therefore be integrated with product-moment correlation coefficients in the same meta-analysis. The present article describes the estimation of the biserial correlation coefficient for meta-analytic purposes and reports simulation results comparing different methods for estimating the coefficient's sampling variance. The findings indicate that commonly employed methods yield inconsistent estimates of the sampling variance across a broad range of research situations. In contrast, consistent estimates can be obtained using two methods that appear to be unknown in the meta-analytic literature. A variance-stabilizing transformation for the biserial correlation coefficient is described that allows for the construction of confidence intervals for individual coefficients with close to nominal coverage probabilities in most of the examined conditions. Copyright © 2016 John Wiley & Sons, Ltd.
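
    For reference, the biserial coefficient can be obtained from the point-biserial coefficient by the standard conversion r_b = r_pb * sqrt(p*q) / h, where p is the proportion in the upper group and h is the standard normal ordinate at the corresponding cut point. A small simulation sketch (this illustrates the estimator itself, not the variance methods studied in the paper):

      import numpy as np
      from scipy import stats

      def biserial(y, d):
          # biserial correlation between continuous y and a dichotomized variable d (0/1),
          # via the standard conversion from the point-biserial coefficient
          y, d = np.asarray(y, float), np.asarray(d, int)
          p = d.mean()
          q = 1.0 - p
          r_pb = (y[d == 1].mean() - y[d == 0].mean()) * np.sqrt(p * q) / y.std(ddof=0)
          h = stats.norm.pdf(stats.norm.ppf(p))           # normal ordinate at the cut point
          return r_pb * np.sqrt(p * q) / h

      rng = np.random.default_rng(7)
      z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=5000)
      d = (z[:, 1] > 0.5).astype(int)                     # dichotomize the second variable
      print("biserial estimate of rho = 0.6:", round(biserial(z[:, 0], d), 3))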

  19. A multivariate extension of mutual information for growing neural networks.

    PubMed

    Ball, Kenneth R; Grant, Christopher; Mundy, William R; Shafer, Timothy J

    2017-11-01

    Recordings of neural network activity in vitro are increasingly being used to assess the development of neural network activity and the effects of drugs, chemicals and disease states on neural network function. The high-content nature of the data derived from such recordings can be used to infer effects of compounds or disease states on a variety of important neural functions, including network synchrony. Historically, synchrony of networks in vitro has been assessed by determination of correlation coefficients (e.g. Pearson's correlation), by statistics estimated from cross-correlation histograms between pairs of active electrodes, and/or by pairwise mutual information and related measures. The present study examines the application of Normalized Multiinformation (NMI) as a scalar measure of shared information content in a multivariate network that is robust with respect to changes in network size. Theoretical simulations are designed to investigate NMI as a measure of complexity and synchrony in a developing network relative to several alternative approaches. The NMI approach is applied to these simulations and also to data collected during exposure of in vitro neural networks to neuroactive compounds during the first 12 days in vitro, and compared to other common measures, including correlation coefficients and mean firing rates of neurons. NMI is shown to be more sensitive to developmental effects than first-order synchronous and nonsynchronous measures of network complexity. Finally, NMI is a scalar measure of global (rather than pairwise) mutual information in a multivariate network, and hence relies on fewer assumptions for cross-network comparisons than historical approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Estimation of influential points in any data set from coefficient of determination and its leave-one-out cross-validated counterpart.

    PubMed

    Tóth, Gergely; Bodai, Zsolt; Héberger, Károly

    2013-10-01

    The coefficient of determination (R2) and its leave-one-out cross-validated analogue (denoted by Q2 or R2cv) are the most frequently published values used to characterize the predictive performance of models. In this article we use R2 and Q2 in a reversed aspect, to determine uncommon points, i.e., influential points, in any data set. The term (1 - Q2)/(1 - R2) corresponds to the ratio of the predictive residual sum of squares to the residual sum of squares. The ratio correlates with the number of influential points in experimental and random data sets. We propose an (approximate) F-test on the (1 - Q2)/(1 - R2) term to quickly pre-estimate the presence of influential points in the training sets of models. The test is founded upon the routinely calculated Q2 and R2 values and warns the model builders to verify the training set, to perform influence analysis or even to change to robust modeling.
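
    A minimal sketch of the quantities involved for an ordinary least-squares model: R2, the leave-one-out Q2 obtained through the PRESS shortcut, and their ratio, which grows when an influential point is planted. The data are synthetic and the approximate F-test itself is not reproduced.

      import numpy as np

      def r2_q2_ratio(X, y):
          # R2, leave-one-out Q2 via the PRESS shortcut for ordinary least squares,
          # and the (1 - Q2)/(1 - R2) ratio used to flag influential points
          X1 = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
          resid = y - X1 @ beta
          H = X1 @ np.linalg.pinv(X1.T @ X1) @ X1.T       # hat matrix
          loo_resid = resid / (1.0 - np.diag(H))          # leave-one-out residuals
          tss = np.sum((y - y.mean()) ** 2)
          r2 = 1.0 - np.sum(resid ** 2) / tss
          q2 = 1.0 - np.sum(loo_resid ** 2) / tss
          return r2, q2, (1.0 - q2) / (1.0 - r2)

      rng = np.random.default_rng(8)
      X = rng.normal(size=(40, 2))
      y = 1.0 + X @ np.array([2.0, -1.0]) + rng.normal(scale=0.5, size=40)
      print("clean data     :", np.round(r2_q2_ratio(X, y), 3))
      X[0] = [6.0, -6.0]
      y[0] = 20.0                                         # plant one influential point
      print("with an outlier:", np.round(r2_q2_ratio(X, y), 3))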

  1. Estimation of the simple correlation coefficient.

    PubMed

    Shieh, Gwowen

    2010-11-01

    This article investigates some unfamiliar properties of the Pearson product-moment correlation coefficient for the estimation of simple correlation coefficient. Although Pearson's r is biased, except for limited situations, and the minimum variance unbiased estimator has been proposed in the literature, researchers routinely employ the sample correlation coefficient in their practical applications, because of its simplicity and popularity. In order to support such practice, this study examines the mean squared errors of r and several prominent formulas. The results reveal specific situations in which the sample correlation coefficient performs better than the unbiased and nearly unbiased estimators, facilitating recommendation of r as an effect size index for the strength of linear association between two variables. In addition, related issues of estimating the squared simple correlation coefficient are also considered.
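
    One of the prominent formulas in this literature is the Olkin-Pratt estimator G(r) = r * 2F1(1/2, 1/2; (n - 2)/2; 1 - r^2), which largely removes the downward bias of r in small samples. A small simulation sketch follows; whether this exact variant is among those examined in the article is an assumption.

      import numpy as np
      from scipy.special import hyp2f1

      def olkin_pratt(r, n):
          # Olkin-Pratt (approximately) unbiased estimator:
          # G(r) = r * 2F1(1/2, 1/2; (n - 2)/2; 1 - r^2)
          return r * hyp2f1(0.5, 0.5, (n - 2) / 2.0, 1.0 - np.asarray(r) ** 2)

      rng = np.random.default_rng(9)
      rho, n, reps = 0.5, 15, 10000
      rs = np.empty(reps)
      for i in range(reps):
          x, y = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n).T
          rs[i] = np.corrcoef(x, y)[0, 1]
      print("mean of sample r    :", round(rs.mean(), 4))                 # biased toward zero
      print("mean of Olkin-Pratt :", round(float(olkin_pratt(rs, n).mean()), 4))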

  2. Cognitive and academic achievement changes associated with day hospital rehabilitation in children with acquired brain injury.

    PubMed

    Goldstein, Gerald; Mayfield, Joan; Thaler, Nicholas S; Walker, Jon; Allen, Daniel N

    2018-01-01

    An evaluation was made of the outcome of a day hospital rehabilitation program for children who experienced an acquired neurological illness, mainly traumatic brain injury. Participants were administered neuropsychological and academic evaluations upon entry to the program, immediately upon discharge, and several months after discharge. Repeated-measures ANOVA results for variables selected from the Reynolds Intellectual Assessment and the Delis-Kaplan Executive Function System showed that significant (p < .01) improvement occurred between the first and second assessments, generally with large effect sizes. There were some nonsignificant decrements in performance between the discharge and follow-up assessments. A correlational analysis showed that while the association between cognitive function and academic achievement was robust, correlation coefficients did not differ in strength before and following rehabilitation. The study demonstrates significant improvement in children with acquired neurological disorders following rehabilitation.

  3. iPcc: a novel feature extraction method for accurate disease class discovery and prediction

    PubMed Central

    Ren, Xianwen; Wang, Yong; Zhang, Xiang-Sun; Jin, Qi

    2013-01-01

    Gene expression profiling has gradually become a routine procedure for disease diagnosis and classification. In the past decade, many computational methods have been proposed, resulting in great improvements on various levels, including feature selection and algorithms for classification and clustering. In this study, we present iPcc, a novel method from the feature extraction perspective to further propel gene expression profiling technologies from bench to bedside. We define ‘correlation feature space’ for samples based on the gene expression profiles by iterative employment of Pearson’s correlation coefficient. Numerical experiments on both simulated and real gene expression data sets demonstrate that iPcc can greatly highlight the latent patterns underlying noisy gene expression data and thus greatly improve the robustness and accuracy of the algorithms currently available for disease diagnosis and classification based on gene expression profiles. PMID:23761440
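
    A rough sketch of the correlation feature space idea: each sample's profile is repeatedly replaced by its vector of Pearson correlations with all samples. The iteration count, the toy data and the exact update rule are illustrative assumptions; the published iPcc algorithm may differ in detail.

      import numpy as np

      def correlation_feature_space(E, n_iter=3):
          # map samples (columns of a genes-by-samples matrix) into a correlation
          # feature space by repeatedly replacing each sample's profile with its
          # vector of Pearson correlations to all samples (sketch of the idea only)
          F = np.asarray(E, float)
          for _ in range(n_iter):
              F = np.corrcoef(F, rowvar=False)
          return F                                        # columns are sample feature vectors

      rng = np.random.default_rng(10)
      E = rng.normal(size=(200, 30))                      # 200 genes, 30 samples
      E[:50, 15:] += 1.5                                  # two latent sample classes
      F = correlation_feature_space(E)
      print("within-class mean feature correlation :", round(F[15:, 15:].mean(), 2))
      print("between-class mean feature correlation:", round(F[15:, :15].mean(), 2))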

  4. Heart-rate monitoring by air pressure and causal analysis

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Naoki; Nakajima, Hiroshi; Hata, Yutaka

    2011-06-01

    Among the many vital signals, heart rate (HR) is an important index for diagnosing a person's health condition. For instance, HR provides early indications of cardiac disease, autonomic nerve behavior, and so forth. Currently, however, HR is measured only in medical checkups and clinical diagnosis during the resting state by using an electrocardiograph (ECG). Thus, some serious cardiac events in daily life could be missed, and continuous HR monitoring over 24 hours is desired. Considering use in daily life, the monitoring should be noninvasive and minimally intrusive. In this paper, HR monitoring during sleep using air pressure sensors is therefore proposed. The HR monitoring is realized by employing causal analysis between air pressure and HR, with the causality described by fuzzy logic. In an experiment on 7 males aged 22-25 (23 on average), the correlation coefficient against ECG was 0.73-0.97 (0.85 on average). In addition, the cause-effect structure for HR monitoring is rearranged by causal decomposition, and the rearranged causality is applied to HR monitoring in a sitting posture. In an additional experiment on 6 males, the correlation coefficient was 0.66-0.86 (0.76 on average). The proposed method is therefore suggested to have sufficient accuracy and robustness for some daily use cases.

  5. The Attenuation of Correlation Coefficients: A Statistical Literacy Issue

    ERIC Educational Resources Information Center

    Trafimow, David

    2016-01-01

    Much of the science reported in the media depends on correlation coefficients. But the size of correlation coefficients depends, in part, on the reliability with which the correlated variables are measured. Understanding this is a statistical literacy issue.
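
    The attenuation effect and Spearman's classical correction can be illustrated in a few lines: with reliabilities r_xx and r_yy, the observed correlation is roughly rho * sqrt(r_xx * r_yy), and dividing by that factor recovers an estimate of rho. The simulation below uses invented numbers for illustration.

      import numpy as np

      def disattenuate(r_observed, rel_x, rel_y):
          # Spearman's correction for attenuation
          return r_observed / np.sqrt(rel_x * rel_y)

      rng = np.random.default_rng(11)
      n, rho, rel = 100000, 0.6, 0.7
      t = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)   # true scores
      noise_sd = np.sqrt((1 - rel) / rel)                 # error level giving reliability 0.7
      x = t[:, 0] + rng.normal(scale=noise_sd, size=n)
      y = t[:, 1] + rng.normal(scale=noise_sd, size=n)
      r_obs = np.corrcoef(x, y)[0, 1]
      print("observed (attenuated) r:", round(r_obs, 3))                  # about rho * rel = 0.42
      print("disattenuated estimate :", round(disattenuate(r_obs, rel, rel), 3))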

  6. Robustness of Oscillatory Behavior in Correlated Networks

    PubMed Central

    Sasai, Takeyuki; Morino, Kai; Tanaka, Gouhei; Almendral, Juan A.; Aihara, Kazuyuki

    2015-01-01

    Understanding network robustness against failures of network units is useful for preventing large-scale breakdowns and damages in real-world networked systems. The tolerance of networked systems whose functions are maintained by collective dynamical behavior of the network units has recently been analyzed in the framework called dynamical robustness of complex networks. The effect of network structure on the dynamical robustness has been examined with various types of network topology, but the role of network assortativity, or degree–degree correlations, is still unclear. Here we study the dynamical robustness of correlated (assortative and disassortative) networks consisting of diffusively coupled oscillators. Numerical analyses for the correlated networks with Poisson and power-law degree distributions show that network assortativity enhances the dynamical robustness of the oscillator networks but the impact of network disassortativity depends on the detailed network connectivity. Furthermore, we theoretically analyze the dynamical robustness of correlated bimodal networks with two-peak degree distributions and show the positive impact of the network assortativity. PMID:25894574

  7. Visible and near-infrared bulk optical properties of raw milk.

    PubMed

    Aernouts, B; Van Beers, R; Watté, R; Huybrechts, T; Lammertyn, J; Saeys, W

    2015-10-01

    The implementation of optical sensor technology to monitor the milk quality on dairy farms and milk processing plants would support the early detection of altering production processes. Basic visible and near-infrared spectroscopy is already widely used to measure the composition of agricultural and food products. However, to obtain maximal performance, the design of such optical sensors should be optimized with regard to the optical properties of the samples to be measured. Therefore, the aim of this study was to determine the visible and near-infrared bulk absorption coefficient, bulk scattering coefficient, and scattering anisotropy spectra for a diverse set of raw milk samples originating from individual cow milkings, representing the milk variability present on dairy farms. Accordingly, this database of bulk optical properties can be used in future simulation studies to efficiently optimize and validate the design of an optical milk quality sensor. In a next step of the current study, the relation between the obtained bulk optical properties and milk quality properties was analyzed in detail. The bulk absorption coefficient spectra were found to mainly contain information on the water, fat, and casein content, whereas the bulk scattering coefficient spectra were found to be primarily influenced by the quantity and the size of the fat globules. Moreover, a strong positive correlation (r ≥ 0.975) was found between the fat content in raw milk and the measured bulk scattering coefficients in the 1,300 to 1,400 nm wavelength range. Relative to the bulk scattering coefficient, the variability on the scattering anisotropy factor was found to be limited. This is because the milk scattering anisotropy is nearly independent of the fat globule and casein micelle quantity, while it is mainly determined by the size of the fat globules. As this study shows high correlations between the sample's bulk optical properties and the milk composition and fat globule size, a sensor that allows for robust separation between the absorption and scattering properties would enable accurate prediction of the raw milk quality parameters. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  8. Three-dimensional segmentation of luminal and adventitial borders in serial intravascular ultrasound images

    NASA Technical Reports Server (NTRS)

    Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.

    1999-01-01

    Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.

  9. Application of Partial Least Square (PLS) Analysis on Fluorescence Data of 8-Anilinonaphthalene-1-Sulfonic Acid, a Polarity Dye, for Monitoring Water Adulteration in Ethanol Fuel.

    PubMed

    Kumar, Keshav; Mishra, Ashok Kumar

    2015-07-01

    The fluorescence characteristics of 8-anilinonaphthalene-1-sulfonic acid (ANS) in ethanol-water mixtures, in combination with partial least squares (PLS) analysis, were used to propose a simple and sensitive analytical procedure for monitoring the adulteration of ethanol by water. The proposed analytical procedure was found to be capable of detecting even small levels of adulteration of ethanol by water. The robustness of the procedure is evident from statistical parameters such as the squared correlation coefficient (R2), root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP), which were found to be well within the acceptable limits.
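
    A minimal sketch of this kind of workflow with scikit-learn's PLSRegression on synthetic "spectra"; the band shape, noise level and water contents are invented for illustration, and RMSEC and RMSEP are computed on the calibration and prediction subsets, respectively.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(12)
      n, n_points = 60, 200
      water = rng.uniform(0, 10, n)                       # hypothetical % water in ethanol
      band = np.exp(-0.5 * ((np.arange(n_points) - 120) / 15.0) ** 2)
      spectra = np.outer(water, band) + rng.normal(scale=0.05, size=(n, n_points))

      def rmse(y, yhat):
          return float(np.sqrt(np.mean((np.ravel(y) - np.ravel(yhat)) ** 2)))

      cal, pred = slice(0, 40), slice(40, None)
      pls = PLSRegression(n_components=3).fit(spectra[cal], water[cal])
      print("R2   :", round(pls.score(spectra[cal], water[cal]), 3))
      print("RMSEC:", round(rmse(water[cal], pls.predict(spectra[cal])), 3))
      print("RMSEP:", round(rmse(water[pred], pls.predict(spectra[pred])), 3))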

  10. Multipole Vector Anomalies in the First-Year WMAP Data: A Cut-Sky Analysis

    NASA Astrophysics Data System (ADS)

    Bielewicz, P.; Eriksen, H. K.; Banday, A. J.; Górski, K. M.; Lilje, P. B.

    2005-12-01

    We apply the recently defined multipole vector framework to the frequency-specific first-year WMAP sky maps, estimating the low-l multipole coefficients from the high-latitude sky by means of a power equalization filter. While most previous analyses of this type have considered only heavily processed (and foreground-contaminated) full-sky maps, the present approach allows for greater control of residual foregrounds and therefore potentially also for cosmologically important conclusions. The low-l spherical harmonic coefficients and corresponding multipole vectors are tabulated for easy reference. Using this formalism, we reassess a set of earlier claims of both cosmological and noncosmological low-l correlations on the basis of multipole vectors. First, we show that the apparent l=3 and 8 correlation claimed by Copi and coworkers is present only in the heavily processed map produced by Tegmark and coworkers and must therefore be considered an artifact of that map. Second, the well-known quadrupole-octopole correlation is confirmed at the 99% significance level and shown to be robust with respect to frequency and sky cut. Previous claims are thus supported by our analysis. Finally, the low-l alignment with respect to the ecliptic claimed by Schwarz and coworkers is nominally confirmed in this analysis, but also shown to be very dependent on severe a posteriori choices. Indeed, we show that given the peculiar quadrupole-octopole arrangement, finding such a strong alignment with the ecliptic is not unusual.

  11. Early warning of critical transitions in biodiversity from compositional disorder.

    PubMed

    Doncaster, C Patrick; Alonso Chávez, Vasthi; Viguier, Clément; Wang, Rong; Zhang, Enlou; Dong, Xuhui; Dearing, John A; Langdon, Peter G; Dyke, James G

    2016-11-01

    Global environmental change presents a clear need for improved leading indicators of critical transitions, especially those that can be generated from compositional data and that work in empirical cases. Ecological theory of community dynamics under environmental forcing predicts an early replacement of slowly replicating and weakly competitive "canary" species by slowly replicating but strongly competitive "keystone" species. Further forcing leads to the eventual collapse of the keystone species as they are replaced by weakly competitive but fast-replicating "weedy" species in a critical transition to a significantly different state. We identify a diagnostic signal of these changes in the coefficients of a correlation between compositional disorder and biodiversity. Compositional disorder measures unpredictability in the composition of a community, while biodiversity measures the amount of species in the community. In a stochastic simulation, sequential correlations over time switch from positive to negative as keystones prevail over canaries, and back to positive with domination of weedy species. The model finds support in empirical tests on multi-decadal time series of fossil diatom and chironomid communities from lakes in China. The characteristic switch from positive to negative correlation coefficients occurs for both communities up to three decades preceding a critical transition to a sustained alternate state. This signal is robust to unequal time increments that beset the identification of early-warning signals from other metrics. © 2016 The Authors. Ecology, published by Wiley Periodicals, Inc., on behalf of the Ecological Society of America.

  12. Modeling Earth's surface topography: decomposition of the static and dynamic components

    NASA Astrophysics Data System (ADS)

    Guerri, M.; Cammarano, F.; Tackley, P. J.

    2017-12-01

    Isolating the portion of topography supported by mantle convection, the so-called dynamic topography, would give us valuable information on the vigor and style of the convection itself. Contrasting results on the estimate of dynamic topography motivate us to analyse the sources of uncertainty affecting its modeling. We obtain models of mantle and crust density, leveraging seismic and mineral physics constraints. We use the models to compute isostatic topography and residual topography maps. Estimates of dynamic topography and associated synthetic geoid are obtained by instantaneous mantle flow modeling. We test various viscosity profiles and 3D viscosity distributions accounting for inferred lateral variations in temperature. We find that the patterns of residual and dynamic topography are robust, with an average correlation coefficient of 0.74 and 0.71, respectively. The amplitudes are however poorly constrained. For the static component, the considered lithospheric mantle density models result in topographies that differ, on average, by 720 m, with peaks reaching 1.7 km. The crustal density models produce variations in isostatic topography averaging 350 m, with peaks of 1 km. For the dynamic component, we obtain peak-to-peak topography amplitude exceeding 3 km for all the tested mantle density and viscosity models. Such values of dynamic topography produce geoid undulations that are not in agreement with observations. Assuming chemical heterogeneities in the lower mantle, in correspondence with the LLSVPs (Large Low Shear wave Velocity Provinces), helps to decrease the amplitudes of dynamic topography and geoid, but reduces the correlation between synthetic and observed geoid. The correlation coefficient between the residual and dynamic topography maps is always less than 0.55. In general, our results indicate that, i) current knowledge of crust density, mantle density and mantle viscosity is still limited, ii) it is important to account for all the various sources of uncertainty when computing static and dynamic topography. In conclusion, a multidisciplinary approach, which involves multiple geophysics observations and constraints from mineral physics, is necessary for obtaining robust density models and, consequently, for properly estimating the dynamic topography.

  13. Identification of Correlated GRACE Monthly Harmonic Coefficients Using Pattern Recognition and Neural Networks

    NASA Astrophysics Data System (ADS)

    Piretzidis, D.; Sra, G.; Sideris, M. G.

    2016-12-01

    This study explores new methods for identifying correlation errors in harmonic coefficients derived from monthly solutions of the Gravity Recovery and Climate Experiment (GRACE) satellite mission using pattern recognition and neural network algorithms. These correlation errors are evidenced in the differences between monthly solutions and can be suppressed using a de-correlation filter. In all studies so far, the implementation of the de-correlation filter starts from a specific minimum order (i.e., 11 for RL04 and 38 for RL05) until the maximum order of the monthly solution examined. This implementation method has two disadvantages, namely, the omission of filtering correlated coefficients of order less than the minimum order and the filtering of uncorrelated coefficients of order higher than the minimum order. In the first case, the filtered solution is not completely free of correlated errors, whereas the second case results in a monthly solution that suffers from loss of geophysical signal. In the present study, a new method of implementing the de-correlation filter is suggested, by identifying and filtering only the coefficients that show indications of high correlation. Several numerical and geometric properties of the harmonic coefficient series of all orders are examined. Extreme cases of both correlated and uncorrelated coefficients are selected, and their corresponding properties are used to train a two-layer feed-forward neural network. The objective of the neural network is to identify and quantify the correlation by providing the probability of an order of coefficients to be correlated. Results show good performance of the neural network, both in the validation stage of the training procedure and in the subsequent use of the trained network to classify independent coefficients. The neural network is also capable of identifying correlated coefficients even when a small number of training samples and neurons are used (e.g.,100 and 10, respectively).

  14. SU-E-T-654: Quantifying Plan Quality Can Effectively Distinguish Between Competing Equivocal IMRT Prostate Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, A; Lo, J; Department of Radiology, Duke University Medical Center, Durham, NC

    2015-06-15

    Purpose: The purpose of this study was to create a prostate IMRT plan quality index (PQI) that may be used to quantitatively compare competing plans using a methodology that mimics physician preference. This methodology allows planners to choose between plans with equivocal characteristics, prior to physician scrutiny. Methods: An observer study was conducted to gather data from 3 radiation oncology physicians who ranked a set of 20 patients (each with 5 plans). The rankings were used to optimize a PQI that combined weighted portions of the rectum, bladder, and planning target volume DVHs, such that the relative PQI mimicked physician rankings as best as possible. Once optimized, a test study assessed the PQI by comparison to physician rankings in a new set of 25 patients (each with 4 plans). The physician group in the test study included 6 physicians, 5 of whom were not included in the modeling study. PQI scores were evaluated against the physicians’ rank list using Spearman rank correlation. Results: The optimized plan quality index combined the following DVH features: high dose regions above 40Gy/60Gy (rectum/bladder), organ weightings, and PTV shoulder coverage. Mean correlation of the PQI vs. physicians’ rankings in the modeling study was 0.507 (range: 0.345–0.706). By comparison, the mean correlation between physicians was 0.301 (range: 0.242–0.334). The mean correlation of the PQI vs. physician rankings in the test study was 0.726 (range: 0.416–0.936), indicating robustness of the PQI by virtue of producing similar results to the modeling study. Intra-physician correlation was 0.564 (range: 0.398–0.689). Conclusion: The correlation coefficients of the PQI vs. physicians were similar to the correlation coefficients of the physicians with each other, implying that the PQI developed in this work shows promise in reflecting physician clinical preference when selecting between competing, dosimetrically equivocal plans.

  15. The degree-related clustering coefficient and its application to link prediction

    NASA Astrophysics Data System (ADS)

    Liu, Yangyang; Zhao, Chengli; Wang, Xiaojie; Huang, Qiangjuan; Zhang, Xue; Yi, Dongyun

    2016-07-01

    Link prediction plays a significant role in explaining the evolution of networks. However it is still a challenging problem that has been addressed only with topological information in recent years. Based on the belief that network nodes with a great number of common neighbors are more likely to be connected, many similarity indices have achieved considerable accuracy and efficiency. Motivated by the natural assumption that the effect of missing links on the estimation of a node's clustering ability could be related to node degree, in this paper, we propose a degree-related clustering coefficient index to quantify the clustering ability of nodes. Unlike the classical clustering coefficient, our new coefficient is highly robust when the observed bias of links is considered. Furthermore, we propose a degree-related clustering ability path (DCP) index, which applies the proposed coefficient to the link prediction problem. Experiments on 12 real-world networks show that our proposed method is highly accurate and robust compared with four common-neighbor-based similarity indices (Common Neighbors(CN), Adamic-Adar(AA), Resource Allocation(RA), and Preferential Attachment(PA)), and the recently introduced clustering ability (CA) index.
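
    For context, the four common-neighbor baselines named above can be computed in a few lines with networkx; the proposed degree-related clustering-ability (DCP) index itself is not reproduced here.

      import numpy as np
      import networkx as nx

      def similarity_scores(G, u, v):
          # classical common-neighbor indices for a candidate link (u, v):
          # Common Neighbors, Adamic-Adar, Resource Allocation, Preferential Attachment
          cn = set(G[u]) & set(G[v])
          return {
              "CN": len(cn),
              "AA": sum(1.0 / np.log(G.degree(w)) for w in cn if G.degree(w) > 1),
              "RA": sum(1.0 / G.degree(w) for w in cn),
              "PA": G.degree(u) * G.degree(v),
          }

      G = nx.karate_club_graph()
      G.remove_edge(0, 2)                                 # hide a known link and try to recover it
      print("hidden link (0, 2):", similarity_scores(G, 0, 2))
      print("non-link (15, 23) :", similarity_scores(G, 15, 23))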

  16. The Correlation Between Atmospheric Dust Deposition to the Surface Ocean and SeaWiFS Ocean Color: A Global Satellite-Based Analysis

    NASA Technical Reports Server (NTRS)

    Erickson, D. J., III; Hernandez, J.; Ginoux, P.; Gregg, W.; Kawa, R.; Behrenfeld, M.; Esaias, W.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Since the atmospheric deposition of iron has been linked to primary productivity in various oceanic regions, we have conducted an objective study of the correlation between dust deposition and satellite remotely sensed surface ocean chlorophyll concentrations. We present a global analysis of the correlation between atmospheric dust deposition derived from a satellite-based 3-D atmospheric transport model and SeaWiFS estimates of ocean color. We use the monthly mean dust deposition fields of Ginoux et al., which are based on a global model of dust generation and transport. This model is driven by atmospheric circulation from the Data Assimilation Office (DAO) for the period 1995-1998. This global dust model is constrained by several satellite estimates of standard circulation characteristics. We then perform an analysis of the correlation between the dust deposition and the 1998 SeaWiFS ocean color data for each 2.0 deg x 2.5 deg lat/long grid point, for each month of the year. The results are surprisingly robust. The region between 40 S and 60 S has correlation coefficients from 0.6 to 0.95, statistically significant at the 0.05 level. There are swaths of high correlation at the edges of some major ocean current systems. We interpret these correlations as reflecting areas where shear-related turbulence brings nitrogen and phosphorus from depth into the surface ocean, so that the atmospheric supply of iron provides the limiting nutrient and the correlation between iron deposition and surface ocean chlorophyll is high. There is a region in the western North Pacific with high correlation, reflecting the input of Asian dust to that region. The southern hemisphere has an average correlation coefficient of 0.72, compared with 0.42 in the northern hemisphere, consistent with present conceptual models of where atmospheric iron deposition may play a role in surface ocean biogeochemical cycles. The spatial structure of the correlation fields will be discussed within the context of guiding the design of field programs.
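
    A minimal sketch of the per-grid-cell correlation computation described above, assuming monthly dust-deposition and chlorophyll fields stored as arrays of shape (months, lat, lon); the array contents and grid dimensions below are placeholders, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    dust = rng.gamma(2.0, size=(12, 90, 144))       # placeholder: 12 months on a 2.0 x 2.5 deg grid
    chl = 0.5 * dust + rng.normal(size=dust.shape)  # placeholder chlorophyll field

    def gridwise_pearson(a, b):
        a = a - a.mean(axis=0)                      # remove the monthly mean at each grid cell
        b = b - b.mean(axis=0)
        num = (a * b).sum(axis=0)
        den = np.sqrt((a * a).sum(axis=0) * (b * b).sum(axis=0))
        return num / den                            # correlation map, shape (n_lat, n_lon)

    r_map = gridwise_pearson(dust, chl)
    ```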

  17. Least median of squares and iteratively re-weighted least squares as robust linear regression methods for fluorimetric determination of α-lipoic acid in capsules in ideal and non-ideal cases of linearity.

    PubMed

    Korany, Mohamed A; Gazy, Azza A; Khamis, Essam F; Ragab, Marwa A A; Kamal, Miranda F

    2018-06-01

    This study outlines two robust regression approaches, namely least median of squares (LMS) and iteratively re-weighted least squares (IRLS) to investigate their application in instrument analysis of nutraceuticals (that is, fluorescence quenching of merbromin reagent upon lipoic acid addition). These robust regression methods were used to calculate calibration data from the fluorescence quenching reaction (∆F and F-ratio) under ideal or non-ideal linearity conditions. For each condition, data were treated using three regression fittings: Ordinary Least Squares (OLS), LMS and IRLS. Assessment of linearity, limits of detection (LOD) and quantitation (LOQ), accuracy and precision were carefully studied for each condition. LMS and IRLS regression line fittings showed significant improvement in correlation coefficients and all regression parameters for both methods and both conditions. In the ideal linearity condition, the intercept and slope changed insignificantly, but a dramatic change was observed for the non-ideal condition and linearity intercept. Under both linearity conditions, LOD and LOQ values after the robust regression line fitting of data were lower than those obtained before data treatment. The results obtained after statistical treatment indicated that the linearity ranges for drug determination could be expanded to lower limits of quantitation by enhancing the regression equation parameters after data treatment. Analysis results for lipoic acid in capsules, using both fluorimetric methods, treated by parametric OLS and after treatment by robust LMS and IRLS were compared for both linearity conditions. Copyright © 2018 John Wiley & Sons, Ltd.
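
    The IRLS approach named in the title can be illustrated with a short sketch for a straight-line calibration fit; the weight function (Tukey bisquare), tuning constant and data below are common defaults and placeholders, and may differ from the paper's exact choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0, 10, 12)                        # e.g., analyte concentrations (placeholder)
    y = 2.0 + 0.8 * x + rng.normal(0, 0.05, x.size)
    y[3] += 2.0                                       # one gross outlier

    def irls_line(x, y, c=4.685, n_iter=25):
        X = np.column_stack([np.ones_like(x), x])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]   # start from the OLS fit
        for _ in range(n_iter):
            r = y - X @ beta
            s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12   # MAD scale estimate
            u = r / (c * s)
            w = np.where(np.abs(u) < 1, (1 - u**2) ** 2, 0.0)          # bisquare weights
            beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        return beta                                    # [intercept, slope]

    print(irls_line(x, y))                             # close to the true (2.0, 0.8) despite the outlier
    ```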

  18. Robust Coefficients Alpha and Omega and Confidence Intervals With Outlying Observations and Missing Data: Methods and Software.

    PubMed

    Zhang, Zhiyong; Yuan, Ke-Hai

    2016-06-01

    Cronbach's coefficient alpha is a widely used reliability measure in social, behavioral, and education sciences. It is reported in nearly every study that involves measuring a construct through multiple items. With non-tau-equivalent items, McDonald's omega has been used as a popular alternative to alpha in the literature. Traditional estimation methods for alpha and omega often implicitly assume that data are complete and normally distributed. This study proposes robust procedures to estimate both alpha and omega as well as corresponding standard errors and confidence intervals from samples that may contain potential outlying observations and missing values. The influence of outlying observations and missing data on the estimates of alpha and omega is investigated through two simulation studies. Results show that the newly developed robust method yields substantially improved alpha and omega estimates as well as better coverage rates of confidence intervals than the conventional nonrobust method. An R package coefficientalpha is developed and demonstrated to obtain robust estimates of alpha and omega.
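
    To fix ideas, the conventional (nonrobust, complete-data) sample form of Cronbach's alpha is sketched below; the paper's contribution is the robust, outlier- and missing-data-capable estimator implemented in the R package coefficientalpha, which this sketch does not reproduce, and the simulated items are placeholders.

    ```python
    import numpy as np

    def cronbach_alpha(items):                        # items: (n_subjects, k_items) array
        k = items.shape[1]
        item_var_sum = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)     # variance of the sum score
        return (k / (k - 1)) * (1 - item_var_sum / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    items = latent + 0.7 * rng.normal(size=(200, 6))  # six placeholder items measuring one construct
    print(cronbach_alpha(items))
    ```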

  19. Robust Coefficients Alpha and Omega and Confidence Intervals With Outlying Observations and Missing Data

    PubMed Central

    Zhang, Zhiyong; Yuan, Ke-Hai

    2015-01-01

    Cronbach’s coefficient alpha is a widely used reliability measure in social, behavioral, and education sciences. It is reported in nearly every study that involves measuring a construct through multiple items. With non-tau-equivalent items, McDonald’s omega has been used as a popular alternative to alpha in the literature. Traditional estimation methods for alpha and omega often implicitly assume that data are complete and normally distributed. This study proposes robust procedures to estimate both alpha and omega as well as corresponding standard errors and confidence intervals from samples that may contain potential outlying observations and missing values. The influence of outlying observations and missing data on the estimates of alpha and omega is investigated through two simulation studies. Results show that the newly developed robust method yields substantially improved alpha and omega estimates as well as better coverage rates of confidence intervals than the conventional nonrobust method. An R package coefficientalpha is developed and demonstrated to obtain robust estimates of alpha and omega. PMID:29795870

  20. Diagnosing cysts with correlation coefficient images from 2-dimensional freehand elastography.

    PubMed

    Booi, Rebecca C; Carson, Paul L; O'Donnell, Matthew; Richards, Michael S; Rubin, Jonathan M

    2007-09-01

    We compared the diagnostic potential of using correlation coefficient images versus elastograms from 2-dimensional (2D) freehand elastography to characterize breast cysts. In this preliminary study, which was approved by the Institutional Review Board and compliant with the Health Insurance Portability and Accountability Act, we imaged 4 consecutive human subjects (4 cysts, 1 biopsy-verified benign breast parenchyma) with freehand 2D elastography. Data were processed offline with conventional 2D phase-sensitive speckle-tracking algorithms. The correlation coefficient in the cyst and surrounding tissue was calculated, and appearances of the cysts in the correlation coefficient images and elastograms were compared. The correlation coefficient in the cysts was considerably lower (14%-37%) than in the surrounding tissue because of the lack of sufficient speckle in the cysts, as well as the prominence of random noise, reverberations, and clutter, which decorrelated quickly. Thus, the cysts were visible in all correlation coefficient images. In contrast, the elastograms associated with these cysts each had different elastographic patterns. The solid mass in this study did not have the same high decorrelation rate as the cysts, having a correlation coefficient only 2.1% lower than that of surrounding tissue. Correlation coefficient images may produce a more direct, reliable, and consistent method for characterizing cysts than elastograms.

  1. Robustness and structure of complex networks

    NASA Astrophysics Data System (ADS)

    Shao, Shuai

    This dissertation covers the two major parts of my PhD research on statistical physics and complex networks: i) modeling a new type of attack -- localized attack, and investigating robustness of complex networks under this type of attack; ii) discovering the clustering structure in complex networks and its influence on the robustness of coupled networks. Complex networks appear in every aspect of our daily life and are widely studied in Physics, Mathematics, Biology, and Computer Science. One important property of complex networks is their robustness under attacks, which depends crucially on the nature of attacks and the structure of the networks themselves. Previous studies have focused on two types of attack: random attack and targeted attack, which, however, are insufficient to describe many real-world damages. Here we propose a new type of attack -- localized attack, and study the robustness of complex networks under this type of attack, both analytically and via simulation. On the other hand, we also study the clustering structure in the network, and its influence on the robustness of a complex network system. In the first part, we propose a theoretical framework to study the robustness of complex networks under localized attack based on percolation theory and the generating function method. We investigate the percolation properties, including the critical threshold of the phase transition p_c and the size of the giant component P_∞. We compare localized attack with random attack and find that while random regular (RR) networks are more robust against localized attack, Erdős-Rényi (ER) networks are equally robust under both types of attacks. As for scale-free (SF) networks, their robustness depends crucially on the degree exponent λ. The simulation results show perfect agreement with theoretical predictions. We also test our model on two real-world networks: a peer-to-peer computer network and an airline network, and find that the real-world networks are much more vulnerable to localized attack compared with random attack. In the second part, we extend the tree-like generating function method to incorporate clustering structure in complex networks. We study the robustness of a complex network system, especially a network of networks (NON) with clustering structure in each network. We find that the system becomes less robust as we increase the clustering coefficient of each network. For a partially dependent network system, we also find that the influence of the clustering coefficient on network robustness decreases as we decrease the coupling strength, and the critical coupling strength q_c, at which the first-order phase transition changes to second-order, increases as we increase the clustering coefficient.
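
    For orientation, the familiar random-attack (random node removal) percolation threshold for uncorrelated networks, obtained with the same generating-function machinery, is

        p_c = \frac{1}{\kappa - 1}, \qquad \kappa = \frac{\langle k^{2}\rangle}{\langle k \rangle},

    where p_c is the critical occupation probability and ⟨k⟩, ⟨k²⟩ are the first two moments of the degree distribution. The localized-attack and clustered network-of-networks thresholds derived in the dissertation are not restated here.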

  2. A New Methodology of Spatial Cross-Correlation Analysis

    PubMed Central

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
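
    A short sketch of the quadratic-form reading of Moran's index mentioned above, together with the cross-variable analog it suggests; the scaling conventions (z-scores with the population standard deviation, weights rescaled to sum to one) and the ring-lattice example are illustrative and may differ from the author's exact formulation.

    ```python
    import numpy as np

    def zscore(v):
        return (v - v.mean()) / v.std()

    def morans_I(x, W):
        Wn = W / W.sum()                      # rescale weights so they sum to one
        z = zscore(x)
        return z @ Wn @ z                     # global spatial autocorrelation in quadratic form

    def cross_moran(x, y, W):
        Wn = W / W.sum()
        return zscore(x) @ Wn @ zscore(y)     # analogous global spatial cross-correlation

    n = 20
    W = np.zeros((n, n))
    idx = np.arange(n)
    W[idx, (idx + 1) % n] = W[idx, (idx - 1) % n] = 1.0   # toy ring-lattice contiguity weights
    x = np.sin(2 * np.pi * idx / n)
    y = np.cos(2 * np.pi * idx / n)
    print(morans_I(x, W), cross_moran(x, y, W))
    ```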

  3. A new methodology of spatial cross-correlation analysis.

    PubMed

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.

  4. Estimating correlation between multivariate longitudinal data in the presence of heterogeneity.

    PubMed

    Gao, Feng; Philip Miller, J; Xiong, Chengjie; Luo, Jingqin; Beiser, Julia A; Chen, Ling; Gordon, Mae O

    2017-08-17

    Estimating correlation coefficients among outcomes is one of the most important analytical tasks in epidemiological and clinical research. Availability of multivariate longitudinal data presents a unique opportunity to assess joint evolution of outcomes over time. The bivariate linear mixed model (BLMM) provides a versatile tool for assessing correlation. However, BLMMs often assume that all individuals are drawn from a single homogenous population where the individual trajectories are distributed smoothly around the population average. Using longitudinal mean deviation (MD) and visual acuity (VA) from the Ocular Hypertension Treatment Study (OHTS), we demonstrated strategies to better understand the correlation between multivariate longitudinal data in the presence of potential heterogeneity. Conditional correlation (i.e., marginal correlation given random effects) was calculated to describe how the association between longitudinal outcomes evolved over time within a specific subpopulation. The impact of heterogeneity on correlation was also assessed by simulated data. There was a significant positive correlation in both random intercepts (ρ = 0.278, 95% CI: 0.121-0.420) and random slopes (ρ = 0.579, 95% CI: 0.349-0.810) between longitudinal MD and VA, and the strength of correlation constantly increased over time. However, conditional correlation and simulation studies revealed that the correlation was induced primarily by participants with rapidly deteriorating MD who accounted for only a small fraction of the total sample. Conditional correlation given random effects provides a robust estimate to describe the correlation between multivariate longitudinal data in the presence of unobserved heterogeneity (NCT00000125).

  5. Random matrix theory analysis of cross-correlations in the US stock market: Evidence from Pearson’s correlation coefficient and detrended cross-correlation coefficient

    NASA Astrophysics Data System (ADS)

    Wang, Gang-Jin; Xie, Chi; Chen, Shou; Yang, Jiao-Jiao; Yang, Ming-Yan

    2013-09-01

    In this study, we first build two empirical cross-correlation matrices in the US stock market by two different methods, namely the Pearson’s correlation coefficient and the detrended cross-correlation coefficient (DCCA coefficient). Then, combining the two matrices with the method of random matrix theory (RMT), we mainly investigate the statistical properties of cross-correlations in the US stock market. We choose the daily closing prices of 462 constituent stocks of S&P 500 index as the research objects and select the sample data from January 3, 2005 to August 31, 2012. In the empirical analysis, we examine the statistical properties of cross-correlation coefficients, the distribution of eigenvalues, the distribution of eigenvector components, and the inverse participation ratio. From the two methods, we find some new results of the cross-correlations in the US stock market in our study, which are different from the conclusions reached by previous studies. The empirical cross-correlation matrices constructed by the DCCA coefficient show several interesting properties at different time scales in the US stock market, which are useful to the risk management and optimal portfolio selection, especially to the diversity of the asset portfolio. It will be an interesting and meaningful work to find the theoretical eigenvalue distribution of a completely random matrix R for the DCCA coefficient because it does not obey the Marčenko-Pastur distribution.
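
    A compact sketch of the DCCA cross-correlation coefficient used above, ρ_DCCA(n) = F²_DCCA(n) / (F_DFA,x(n) · F_DFA,y(n)), with non-overlapping boxes and linear detrending; the box scheme and polynomial order are choices that vary between implementations, and the series below are placeholders rather than the S&P 500 constituents studied.

    ```python
    import numpy as np

    def _detrended_cov(X, Y, n):
        t = np.arange(n)
        covs = []
        for start in range(0, len(X) - n + 1, n):        # non-overlapping boxes of size n
            xb, yb = X[start:start + n], Y[start:start + n]
            px = np.polyval(np.polyfit(t, xb, 1), t)     # local linear trends
            py = np.polyval(np.polyfit(t, yb, 1), t)
            covs.append(np.mean((xb - px) * (yb - py)))
        return np.mean(covs)

    def dcca_coefficient(x, y, n):
        X = np.cumsum(x - np.mean(x))                    # integrated profiles
        Y = np.cumsum(y - np.mean(y))
        f2_xy = _detrended_cov(X, Y, n)
        f_x = np.sqrt(_detrended_cov(X, X, n))
        f_y = np.sqrt(_detrended_cov(Y, Y, n))
        return f2_xy / (f_x * f_y)

    rng = np.random.default_rng(0)
    common = rng.normal(size=2000)
    x = common + rng.normal(size=2000)                   # two cross-correlated placeholder series
    y = common + rng.normal(size=2000)
    print(dcca_coefficient(x, y, n=32))
    ```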

  6. Inter- and Intrasubject Similarity of Muscle Synergies During Bench Press With Slow and Fast Velocity.

    PubMed

    Samani, Afshin; Kristiansen, Mathias

    2018-01-01

    We investigated the effect of low and high bar velocity on inter- and intrasubject similarity of muscle synergies during bench press. A total of 13 trained male subjects underwent two exercise conditions: a slow- and a fast-velocity bench press. Surface electromyography was recorded from 13 muscles, and muscle synergies were extracted using a nonnegative matrix factorization algorithm. The intrasubject similarity across conditions and intersubject similarity within conditions were computed for muscle synergy vectors and activation coefficients. Two muscle synergies were sufficient to describe the dataset variability. For the second synergy activation coefficient, the intersubject similarity within the fast-velocity condition was greater than the intrasubject similarity of the activation coefficient across the conditions. An opposite pattern was observed for the first muscle synergy vector. We concluded that the activation coefficients are robust within conditions, indicating a robust temporal pattern of muscular activity across individuals, but the muscle synergy vector seemed to be individually assigned.
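
    A minimal illustration of the synergy extraction step described above, using non-negative matrix factorization on an EMG envelope matrix with two components (the number found sufficient in the study); the random placeholder data and the omitted preprocessing (rectification, filtering, normalization) are assumptions, not the study's pipeline.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    emg = rng.random((500, 13))                  # placeholder envelopes: 500 samples x 13 muscles

    model = NMF(n_components=2, init="nndsvda", max_iter=1000)
    activations = model.fit_transform(emg)       # (500, 2) temporal activation coefficients
    synergy_vectors = model.components_          # (2, 13) muscle synergy vectors
    ```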

  7. Characterization of performance-emission indices of a diesel engine using ANFIS operating in dual-fuel mode with LPG

    NASA Astrophysics Data System (ADS)

    Chakraborty, Amitav; Roy, Sumit; Banerjee, Rahul

    2018-03-01

    This experimental work highlights the inherent capability of an adaptive neuro-fuzzy inference system (ANFIS) based model to act as a robust system identification tool (SIT) in prognosticating the performance and emission parameters of an existing diesel engine running in diesel-LPG dual fuel mode. The developed model proved its adeptness by successfully harnessing the effects of the input parameters of load, injection duration and LPG energy share on the output parameters of BSFCEQ, BTE, NOX, SOOT, CO and HC. Successive evaluation of the ANFIS model revealed high levels of resemblance with the previously forecasted ANN results for the same input parameters, and it was evident that, similar to ANN, ANFIS also has the innate ability to act as a robust SIT. The ANFIS-predicted data matched the experimental data with high overall accuracy. The correlation coefficient (R) values ranged from 0.99207 to 0.999988. The mean absolute percentage error (MAPE) values were in the range of 0.02-0.173%, with root mean square errors (RMSE) within acceptable margins. Hence the developed model is capable of emulating the actual engine parameters with commendable accuracy, and can act as a robust prediction platform for future optimization work.

  8. Robust Visual Tracking via Online Discriminative and Low-Rank Dictionary Learning.

    PubMed

    Zhou, Tao; Liu, Fanghui; Bhaskar, Harish; Yang, Jie

    2017-09-12

    In this paper, we propose a novel and robust tracking framework based on online discriminative and low-rank dictionary learning. The primary aim of this paper is to obtain compact and low-rank dictionaries that can provide good discriminative representations of both target and background. We accomplish this by exploiting the recovery ability of low-rank matrices. That is, if we assume that the data from the same class are linearly correlated, then the corresponding basis vectors learned from the training set of each class will render the dictionary approximately low-rank. The proposed dictionary learning technique incorporates a reconstruction error that improves the reliability of classification. Also, a multiconstraint objective function is designed to enable active learning of a discriminative and robust dictionary. Further, an optimal solution is obtained by iteratively computing the dictionary and coefficients while simultaneously learning the classifier parameters. Finally, a simple yet effective likelihood function is implemented to estimate the optimal state of the target during tracking. Moreover, to make the dictionary adaptive to the variations of the target and background during tracking, an online update criterion is employed while learning the new dictionary. Experimental results on a publicly available benchmark dataset have demonstrated that the proposed tracking algorithm performs better than other state-of-the-art trackers.

  9. Effect of random phase mask on input plane in photorefractive authentic memory with two-wave encryption method

    NASA Astrophysics Data System (ADS)

    Mita, Akifumi; Okamoto, Atsushi; Funakoshi, Hisatoshi

    2004-06-01

    We have proposed an all-optical authentic memory with the two-wave encryption method. In the recording process, the image data are encrypted to a white noise by the random phase masks added on the input beam carrying the image data and on the reference beam. Only a reading beam with the phase-conjugated distribution of the reference beam can decrypt the encrypted data. If the encrypted data are read out with an incorrect phase distribution, the output data are transformed into a white noise. Moreover, during readout, reconstructions of the encrypted data interfere destructively, resulting in zero intensity. Therefore our memory has the merit that unlawful accesses can be detected easily by measuring the output beam intensity. In our encryption method, the random phase mask on the input plane plays an important role in transforming the input image into a white noise and in preventing the white noise from being decrypted back to the input image by the blind deconvolution method. Without this mask, when unauthorized users observe the output beam with a CCD during readout with a plane wave, exactly the same intensity distribution as that of the Fourier transform of the input image is obtained. Therefore the encrypted image would be decrypted easily by the blind deconvolution method. With this mask, however, even if unauthorized users observe the output beam using the same method, the encrypted image cannot be decrypted because the observed intensity distribution is dispersed at random by the mask. Thus the mask increases the robustness of the encryption. In this report, we compare the correlation coefficient between the output image and the input image, which represents the degree to which the output image is a white noise, with and without this mask. We show that the robustness of this encryption method is increased by using the mask, as the correlation coefficient is improved from 0.3 to 0.1.

  10. Intravoxel water diffusion heterogeneity MR imaging of nasopharyngeal carcinoma using stretched exponential diffusion model.

    PubMed

    Lai, Vincent; Lee, Victor Ho Fun; Lam, Ka On; Sze, Henry Chun Kin; Chan, Queenie; Khong, Pek Lan

    2015-06-01

    To determine the utility of the stretched exponential diffusion model in characterisation of the water diffusion heterogeneity in different tumour stages of nasopharyngeal carcinoma (NPC). Fifty patients with newly diagnosed NPC were prospectively recruited. Diffusion-weighted MR imaging was performed using five b values (0-2,500 s/mm²). Respective stretched exponential parameters (DDC, distributed diffusion coefficient; and alpha (α), water heterogeneity) were calculated. Patients were stratified into low and high tumour stage groups based on the American Joint Committee on Cancer (AJCC) staging for determination of the predictive powers of DDC and α using t test and ROC curve analyses. The mean ± standard deviation values were DDC = 0.692 ± 0.199 (×10⁻³ mm²/s) for the low stage group vs 0.794 ± 0.253 (×10⁻³ mm²/s) for the high stage group; α = 0.792 ± 0.145 for the low stage group vs 0.698 ± 0.155 for the high stage group. α was significantly lower in the high stage group, and DDC was negatively correlated with α. DDC and α were both reliable independent predictors (p < 0.001), with α being more powerful. Optimal cut-off values were (sensitivity, specificity, positive likelihood ratio, negative likelihood ratio) DDC = 0.692 × 10⁻³ mm²/s (94.4 %, 64.3 %, 2.64, 0.09), α = 0.720 (72.2 %, 100 %, -, 0.28). The heterogeneity index α is robust and can potentially help in staging and grading prediction in NPC. • Stretched exponential diffusion models can help in tissue characterisation in nasopharyngeal carcinoma • α and distributed diffusion coefficient (DDC) are negatively correlated • α is a robust heterogeneity index marker • α can potentially help in staging and grading prediction.
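
    A brief sketch of fitting the stretched exponential signal model S(b) = S₀ exp(−(b·DDC)^α) to a multi-b-value decay curve; the specific b-values, signal values, starting values and bounds below are illustrative placeholders, not the study's acquisition.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def stretched_exp(b, s0, ddc, alpha):
        return s0 * np.exp(-(b * ddc) ** alpha)

    b = np.array([0, 500, 1000, 1500, 2500], float)       # s/mm² (assumed b-value spacing)
    signal = np.array([1.0, 0.72, 0.55, 0.44, 0.30])      # placeholder normalized signal
    p0 = (1.0, 1e-3, 0.8)                                 # initial guesses for S0, DDC (mm²/s), alpha
    params, _ = curve_fit(stretched_exp, b, signal, p0=p0,
                          bounds=([0, 1e-5, 0.1], [2, 1e-2, 1.0]))
    print(dict(zip(["S0", "DDC", "alpha"], params)))
    ```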

  11. SU-F-R-51: Radiomics in CT Perfusion Maps of Head and Neck Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nesteruk, M; Riesterer, O; Veit-Haibach, P

    2016-06-15

    Purpose: The aim of this study was to test the predictive value of radiomics features of CT perfusion (CTP) for tumor control, based on a preselection of radiomics features in a robustness study. Methods: 11 patients with head and neck cancer (HNC) and 11 patients with lung cancer were included in the robustness study to preselect stable radiomics parameters. Data from 36 HNC patients treated with definitive radiochemotherapy (median follow-up 30 months) was used to build a predictive model based on these parameters. All patients underwent pre-treatment CTP. 315 texture parameters were computed for three perfusion maps: blood volume, blood flow and mean transit time. The variability of texture parameters was tested with respect to non-standardizable perfusion computation factors (noise level and artery contouring) using intraclass correlation coefficients (ICC). The parameter with the highest ICC in the correlated group of parameters (inter-parameter Spearman correlations) was tested for its predictive value. The final model to predict tumor control was built using multivariate Cox regression analysis with backward selection of the variables. For comparison, a predictive model based on tumor volume was created. Results: Ten parameters were found to be stable in both HNC and lung cancer regarding potentially non-standardizable factors after the correction for inter-parameter correlations. In the multivariate backward selection of the variables, blood flow entropy showed a highly significant impact on tumor control (p=0.03) with concordance index (CI) of 0.76. Blood flow entropy was significantly lower in the patient group with controlled tumors at 18 months (p<0.1). The new model showed a higher concordance index compared to the tumor volume model (CI=0.68). Conclusion: The preselection of variables in the robustness study allowed building a predictive radiomics-based model of tumor control in HNC despite a small patient cohort. This model was found to be superior to the volume-based model. The project was supported by the KFSP Tumor Oxygenation of the University of Zurich, by a grant of the Center for Clinical Research, University and University Hospital Zurich and by a research grant from Merck (Schweiz) AG.

  12. Identifying presence of correlated errors in GRACE monthly harmonic coefficients using machine learning algorithms

    NASA Astrophysics Data System (ADS)

    Piretzidis, Dimitrios; Sra, Gurveer; Karantaidis, George; Sideris, Michael G.

    2017-04-01

    A new method for identifying correlated errors in Gravity Recovery and Climate Experiment (GRACE) monthly harmonic coefficients has been developed and tested. Correlated errors are present in the differences between monthly GRACE solutions, and can be suppressed using a de-correlation filter. In principle, the de-correlation filter should be implemented only on coefficient series with correlated errors to avoid losing useful geophysical information. In previous studies, two main methods of implementing the de-correlation filter have been utilized. In the first one, the de-correlation filter is implemented starting from a specific minimum order until the maximum order of the monthly solution examined. In the second one, the de-correlation filter is implemented only on specific coefficient series, the selection of which is based on statistical testing. The method proposed in the present study exploits the capabilities of supervised machine learning algorithms such as neural networks and support vector machines (SVMs). The pattern of correlated errors can be described by several numerical and geometric features of the harmonic coefficient series. The features of extreme cases of both correlated and uncorrelated coefficients are extracted and used for the training of the machine learning algorithms. The trained machine learning algorithms are later used to identify correlated errors and provide the probability that a coefficient series is correlated. Regarding SVM algorithms, an extensive study is performed with various kernel functions in order to find the optimal training model for prediction. The selection of the optimal training model is based on the classification accuracy of the trained SVM algorithm on the same samples used for training. Results show excellent performance of all algorithms, with a classification accuracy of 97%-100% on a pre-selected set of training samples, both in the validation stage of the training procedure and in the subsequent use of the trained algorithms to classify independent coefficients. This accuracy is also confirmed by the external validation of the trained algorithms using the hydrology model GLDAS NOAH. The proposed method meets the requirement of identifying and de-correlating only coefficients with correlated errors. Also, there is no need to apply statistical testing or other techniques that require prior de-correlation of the harmonic coefficients.
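
    A generic sketch of the supervised classification step described above: an SVM with probability outputs trained on labelled coefficient-series features; the feature vectors, labels and kernel choice below are placeholders, since the actual numerical and geometric features are not specified in this abstract.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    train_features = rng.normal(size=(200, 6))                 # placeholder feature vectors
    train_labels = (train_features[:, 0] > 0).astype(int)      # placeholder correlated/uncorrelated labels

    clf = SVC(kernel="rbf", probability=True)                  # RBF is one of several kernels one might try
    clf.fit(train_features, train_labels)
    p_correlated = clf.predict_proba(train_features[:5])[:, 1] # probability that a series is correlated
    print(p_correlated)
    ```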

  13. Factors That Attenuate the Correlation Coefficient and Its Analogs.

    ERIC Educational Resources Information Center

    Dolenz, Beverly

    The correlation coefficient is an integral part of many other statistical techniques (analysis of variance, t-tests, etc.), since all analytic methods are actually correlational (G. V. Glass and K. D. Hopkins, 1984). The correlation coefficient is a statistical summary that represents the degree and direction of relationship between two variables.…

  14. Revisiting the Robustness of PET-Based Textural Features in the Context of Multi-Centric Trials.

    PubMed

    Bailly, Clément; Bodet-Milin, Caroline; Couespel, Solène; Necib, Hatem; Kraeber-Bodéré, Françoise; Ansquer, Catherine; Carlier, Thomas

    2016-01-01

    This study aimed to investigate the variability of textural features (TF) as a function of acquisition and reconstruction parameters within the context of multi-centric trials. The robustness of 15 selected TFs was studied as a function of the number of iterations, the post-filtering level, input data noise, the reconstruction algorithm and the matrix size. A combination of several reconstruction and acquisition settings was devised to mimic multi-centric conditions. We retrospectively studied data from 26 patients enrolled in a diagnostic study that aimed to evaluate the performance of PET/CT 68Ga-DOTANOC in gastro-entero-pancreatic neuroendocrine tumors. Forty-one tumors were extracted and served as the database. The coefficient of variation (COV) or the absolute deviation (for the noise study) was derived and compared statistically with SUVmax and SUVmean results. The majority of investigated TFs can be used in a multi-centric context when each parameter is considered individually. The impact of voxel size and noise in the input data was predominant, as only 4 TFs presented high/intermediate robustness against SUV-based metrics (Entropy, Homogeneity, RP and ZP). When combining several reconstruction settings to mimic multi-centric conditions, most of the investigated TFs were robust enough against SUVmax except Correlation, Contrast, LGRE, LGZE and LZLGE. Considering previously published results on either reproducibility or sensitivity against the delineation approach, together with our findings, it is feasible to consider Homogeneity, Entropy, Dissimilarity, HGRE, HGZE and ZP as relevant for use in multi-centric trials.
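
    For reference, the coefficient of variation used above as a robustness score is simply the standard deviation over the mean of a feature across settings; a trivial sketch with placeholder values follows.

    ```python
    import numpy as np

    def cov_percent(values):                         # coefficient of variation, in percent
        v = np.asarray(values, float)
        return 100.0 * v.std(ddof=1) / v.mean()

    print(cov_percent([1.02, 0.98, 1.05, 0.97]))     # one feature across four reconstruction settings
    ```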

  15. Robust estimation for ordinary differential equation models.

    PubMed

    Cao, J; Wang, L; Xu, J

    2011-12-01

    Applied scientists often like to use ordinary differential equations (ODEs) to model complex dynamic processes that arise in biology, engineering, medicine, and many other areas. It is interesting but challenging to estimate ODE parameters from noisy data, especially when the data have some outliers. We propose a robust method to address this problem. The dynamic process is represented with a nonparametric function, which is a linear combination of basis functions. The nonparametric function is estimated by a robust penalized smoothing method. The penalty term is defined with the parametric ODE model, which controls the roughness of the nonparametric function and maintains the fidelity of the nonparametric function to the ODE model. The basis coefficients and ODE parameters are estimated in two nested levels of optimization. The coefficient estimates are treated as an implicit function of ODE parameters, which enables one to derive the analytic gradients for optimization using the implicit function theorem. Simulation studies show that the robust method gives satisfactory estimates for the ODE parameters from noisy data with outliers. The robust method is demonstrated by estimating a predator-prey ODE model from real ecological data. © 2011, The International Biometric Society.

  16. On interrelations of recurrences and connectivity trends between stock indices

    NASA Astrophysics Data System (ADS)

    Goswami, B.; Ambika, G.; Marwan, N.; Kurths, J.

    2012-09-01

    Financial data has been extensively studied for correlations using Pearson's cross-correlation coefficient ρ as the point of departure. We employ an estimator based on recurrence plots - the correlation of probability of recurrence (CPR) - to analyze connections between nine stock indices spread worldwide. We suggest a slight modification of the CPR approach in order to get more robust results. We examine trends in CPR for an approximately 19-month window moved along the time series and compare them to trends in ρ. Binning CPR into three levels of connectedness (strong, moderate, and weak), we extract the trends in number of connections in each bin over time. We also look at the behavior of CPR during the dot-com bubble by shifting the time series to align their peaks. CPR mainly uncovers that the markets move in and out of periods of strong connectivity erratically, instead of moving monotonically towards increasing global connectivity. This is in contrast to ρ, which gives a picture of ever-increasing correlation. CPR also exhibits that time-shifted markets have high connectivity around the dot-com bubble of 2000. We use significance tests using twin surrogates to interpret all the measures estimated in the study.

  17. ppcor: An R Package for a Fast Calculation to Semi-partial Correlation Coefficients.

    PubMed

    Kim, Seongho

    2015-11-01

    Lack of a general matrix formula hampers implementation of the semi-partial correlation, also known as part correlation, to the higher-order coefficient. This is because the higher-order semi-partial correlation calculation using a recursive formula requires an enormous number of recursive calculations to obtain the correlation coefficients. To resolve this difficulty, we derive a general matrix formula of the semi-partial correlation for fast computation. The semi-partial correlations are then implemented on an R package ppcor along with the partial correlation. Owing to the general matrix formulas, users can readily calculate the coefficients of both partial and semi-partial correlations without computational burden. The package ppcor further provides users with the level of the statistical significance with its test statistic.
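
    The well-known precision-matrix route to higher-order partial correlations is sketched below for orientation; the ppcor paper's contribution is an analogous closed-form matrix expression for semi-partial (part) correlations, which is not reproduced here, and the data are placeholders.

    ```python
    import numpy as np

    def partial_correlations(data):                            # data: (n_obs, n_vars)
        P = np.linalg.inv(np.corrcoef(data, rowvar=False))     # precision of the correlation matrix
        d = np.sqrt(np.diag(P))
        pcor = -P / np.outer(d, d)                             # partial correlation of each pair given the rest
        np.fill_diagonal(pcor, 1.0)
        return pcor

    rng = np.random.default_rng(0)
    data = rng.normal(size=(500, 4))
    data[:, 1] += data[:, 0]                                   # induce some dependence between variables 0 and 1
    print(partial_correlations(data).round(2))
    ```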

  18. Evaluating fMRI methods for assessing hemispheric language dominance in healthy subjects.

    PubMed

    Baciu, Monica; Juphard, Alexandra; Cousin, Emilie; Bas, Jean François Le

    2005-08-01

    We evaluated two methods for quantifying hemispheric language dominance in healthy subjects, using a rhyme detection task (deciding whether a pair of words rhyme) and a word fluency task (generating words starting with a given letter). One of the methods, called the "flip method" (FM), was based on the direct statistical comparison between the hemispheres' activity. The second one, the classical lateralization indices method (LIM), was based on calculating lateralization indices by taking into account the number of activated pixels within each hemisphere. The main difference between the methods is the statistical assessment of the inter-hemispheric difference: while FM shows whether the difference between the hemispheres' activity is statistically significant, LIM shows only whether there is a difference between hemispheres. The robustness of LIM and FM was assessed by calculating correlation coefficients between the LIs obtained with each of these methods and manual lateralization indices (MLI) obtained with the Edinburgh inventory. Our results showed significant correlation between the LIs provided by each method and the MLI, suggesting that both methods are robust for quantifying hemispheric dominance for language in healthy subjects. In the present study we also evaluated the effect of spatial normalization, smoothing and "clustering" (NSC) on the intra-hemispheric location of activated regions and the inter-hemispheric asymmetry of the activation. Our results have shown that NSC did not affect the hemispheric specialization but increased the value of the inter-hemispheric difference.
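
    The lateralization index that the LIM method rests on is the classical voxel-count ratio LI = (L − R) / (L + R); a trivial sketch with made-up activation counts follows.

    ```python
    def lateralization_index(n_left_voxels, n_right_voxels):
        # Positive values indicate left-hemisphere dominance, negative values right dominance.
        return (n_left_voxels - n_right_voxels) / (n_left_voxels + n_right_voxels)

    print(lateralization_index(420, 180))   # placeholder counts of supra-threshold voxels
    ```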

  19. Four-Component Relativistic State-Specific Multireference Perturbation Theory with a Simplified Treatment of Static Correlation.

    PubMed

    Ghosh, Anirban; Sinha Ray, Suvonil; Chaudhuri, Rajat K; Chattopadhyay, Sudip

    2017-02-23

    The relativistic multireference (MR) perturbative approach is one of the most successful tools for the description of computationally demanding molecular systems of heavy elements. We present here the ground state dissociation energy surfaces, equilibrium bond lengths, harmonic frequencies, and dissociation energies of Ag₂, Cu₂, Au₂, and I₂ computed using the four-component (4c) relativistic spinors based state-specific MR perturbation theory (SSMRPT) with improved virtual orbital complete active space configuration interaction (IVO-CASCI) functions. The IVO-CASCI method is a simple, robust, useful and lower cost alternative to the complete active space self-consistent field approach for treating quasidegenerate situations. The redeeming features of the resulting method, termed 4c-IVO-SSMRPT, lie in (i) manifest size-extensivity, (ii) exemption from intruder problems, (iii) the freedom of convenient multipartitionings of the Hamiltonian, (iv) flexibility of the relaxed and unrelaxed descriptions of the reference coefficients, and (v) a manageable cost/accuracy ratio. The present method delivers accurate descriptions of dissociation processes of heavy element systems. Close agreement with reference values has been found for the calculated molecular constants, indicating that our 4c-IVO-SSMRPT provides a robust and economic protocol for determining the structural properties of the ground state of heavy element molecules with pronounced MR character, as it treats correlation and relativity on an equal footing.

  20. Ecological analysis of the health effects of income inequality in Argentina.

    PubMed

    De Maio, Fernando G

    2008-05-01

    Despite a large body of empirical literature, a consensus has not been reached concerning the health effects of income inequality. This study contributes to ongoing debates by examining the robustness of the income inequality-population health relationship in Argentina, using five different income inequality indexes (each sensitive to inequalities in differing parts of the income spectrum) and five measures of population health. Cross-sectional, ecological study. Income and self-reported morbidity data from Argentina's 2001 Encuesta de Condiciones de Vida (Survey of living conditions) were analysed at the provincial level. Provincial rates of male/female life expectancy and infant mortality were drawn from the Instituto Nacional de Estadistica y Censos database. Life expectancy was correlated in the expected direction with provincial-level income inequality (operationalized as the Gini coefficient) for both males (r=-0.55, P<0.01) and females (r=-0.61, P<0.01), but this association was not robust for all five income inequality indexes. In contrast, infant mortality, self-reported poor health and self-reported activity limitation were not correlated with any of the income inequality indexes. This study adds further complexity to the literature on the health effects of income inequality by highlighting the important effects of operational definitions. Mortality and morbidity data cannot be used as reasonably interchangeable variables (a common practice in this literature), and the choice of income inequality indicator may influence the results.
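
    As one example of the inequality measures compared above, a common discrete-sample form of the Gini coefficient is sketched below; the other four indexes used in the study are not reproduced, and the toy incomes are placeholders.

    ```python
    import numpy as np

    def gini(incomes):
        x = np.sort(np.asarray(incomes, float))              # incomes sorted in ascending order
        n = x.size
        return ((2 * np.arange(1, n + 1) - n - 1) @ x) / (n * x.sum())

    print(gini([10, 20, 30, 40, 400]))                        # strongly unequal toy distribution
    ```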

  1. Tumour heterogeneity in glioblastoma assessed by MRI texture analysis: a potential marker of survival

    PubMed Central

    Pérez-Beteta, Julián; Luque, Belén; Arregui, Elena; Calvo, Manuel; Borrás, José M; López, Carlos; Martino, Juan; Velasquez, Carlos; Asenjo, Beatriz; Benavides, Manuel; Herruzo, Ismael; Martínez-González, Alicia; Pérez-Romasanta, Luis; Arana, Estanislao; Pérez-García, Víctor M

    2016-01-01

    Objective: The main objective of this retrospective work was the study of three-dimensional (3D) heterogeneity measures of post-contrast pre-operative MR images acquired with T1 weighted sequences of patients with glioblastoma (GBM) as predictors of clinical outcome. Methods: 79 patients from 3 hospitals were included in the study. 16 3D textural heterogeneity measures were computed including run-length matrix (RLM) features (regional heterogeneity) and co-occurrence matrix (CM) features (local heterogeneity). The significance of the results was studied using Kaplan–Meier curves and Cox proportional hazards analysis. Correlation between the variables of the study was assessed using Spearman's correlation coefficient. Results: Kaplan–Meier survival analysis showed that 4 of the 11 RLM features and 4 of the 5 CM features considered were robust predictors of survival. The median survival differences in the most significant cases were over 6 months. Conclusion: Heterogeneity measures computed on the post-contrast pre-operative T1 weighted MR images of patients with GBM are predictors of survival. Advances in knowledge: Texture analysis to assess tumour heterogeneity has been widely studied. However, most works develop a two-dimensional analysis, focusing only on one MRI slice to state tumour heterogeneity. The study of fully 3D heterogeneity textural features as predictors of clinical outcome is more robust and is not dependent on the selected slice of the tumour. PMID:27319577

  2. Tumour heterogeneity in glioblastoma assessed by MRI texture analysis: a potential marker of survival.

    PubMed

    Molina, David; Pérez-Beteta, Julián; Luque, Belén; Arregui, Elena; Calvo, Manuel; Borrás, José M; López, Carlos; Martino, Juan; Velasquez, Carlos; Asenjo, Beatriz; Benavides, Manuel; Herruzo, Ismael; Martínez-González, Alicia; Pérez-Romasanta, Luis; Arana, Estanislao; Pérez-García, Víctor M

    2016-07-04

    The main objective of this retrospective work was the study of three-dimensional (3D) heterogeneity measures of post-contrast pre-operative MR images acquired with T1 weighted sequences of patients with glioblastoma (GBM) as predictors of clinical outcome. 79 patients from 3 hospitals were included in the study. 16 3D textural heterogeneity measures were computed including run-length matrix (RLM) features (regional heterogeneity) and co-occurrence matrix (CM) features (local heterogeneity). The significance of the results was studied using Kaplan-Meier curves and Cox proportional hazards analysis. Correlation between the variables of the study was assessed using Spearman's correlation coefficient. Kaplan-Meier survival analysis showed that 4 of the 11 RLM features and 4 of the 5 CM features considered were robust predictors of survival. The median survival differences in the most significant cases were over 6 months. Heterogeneity measures computed on the post-contrast pre-operative T1 weighted MR images of patients with GBM are predictors of survival. Texture analysis to assess tumour heterogeneity has been widely studied. However, most works develop a two-dimensional analysis, focusing only on one MRI slice to state tumour heterogeneity. The study of fully 3D heterogeneity textural features as predictors of clinical outcome is more robust and is not dependent on the selected slice of the tumour.

  3. Cross-cultural adaptation of the international consultation incontinence questionnaire male lower urinary tract symptoms-long form (ICIQ-MLUTS-LF) in Persian.

    PubMed

    Pourmomeny, Abbas Ali; Mazdak, Hamid

    2017-06-01

    The purpose of this study was to translate the male lower urinary tract symptoms long form (MLUTS-LF) questionnaire and determine its psychometric properties in Persian-speaking subjects. Assessment instruments are essential for research, for making a diagnosis, and for evaluating treatment outcomes in subjects with lower urinary tract disorders of either gender. The long form of the MLUTS questionnaire is a robust self-report questionnaire that investigates the major aspects of lower urinary tract symptoms and their impact on quality of life. After obtaining permission from the International Consultation on Incontinence Questionnaire website, forward and backward translation of the MLUTS was carried out by the research team, and content/face/construct validity, reliability, quality rating and pilot testing were assessed in a sample of Iranian MLUTS patients. The irritative and obstructive lower urinary disorders were categorized as mild, moderate, and severe in the study sample. Twenty-two subjects were suffering from urinary incontinence and most of the participants had benign prostate hyperplasia (BPH). Cronbach's alpha coefficient was 0.819. The correlation between the MLUTS and the International Prostate Symptom Score (IPSS) was 0.753. The MLUTS questionnaire showed good internal consistency, content validity, and construct validity, as measured by correlation with scores on the IPSS. The Iranian version of the MLUTS questionnaire is a valid and robust instrument that can be used in clinical settings and in research. © 2016 Wiley Periodicals, Inc.

  4. Optimal portfolio strategy with cross-correlation matrix composed by DCCA coefficients: Evidence from the Chinese stock market

    NASA Astrophysics Data System (ADS)

    Sun, Xuelian; Liu, Zixian

    2016-02-01

    In this paper, a new estimator of correlation matrix is proposed, which is composed of the detrended cross-correlation coefficients (DCCA coefficients), to improve portfolio optimization. In contrast to Pearson's correlation coefficients (PCC), DCCA coefficients acquired by the detrended cross-correlation analysis (DCCA) method can describe the nonlinear correlation between assets, and can be decomposed in different time scales. These properties of DCCA make it possible to improve the investment effect and more valuable to investigate the scale behaviors of portfolios. The minimum variance portfolio (MVP) model and the Mean-Variance (MV) model are used to evaluate the effectiveness of this improvement. Stability analysis shows the effect of two kinds of correlation matrices on the estimation error of portfolio weights. The observed scale behaviors are significant to risk management and could be used to optimize the portfolio selection.
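
    A minimal sketch of the minimum variance portfolio (MVP) weights w = Σ⁻¹1 / (1ᵀΣ⁻¹1) that the paper evaluates; in the paper the covariance matrix would be assembled from DCCA (or Pearson) coefficients and asset volatilities, whereas the small matrix below is only a placeholder.

    ```python
    import numpy as np

    def mvp_weights(cov):
        ones = np.ones(cov.shape[0])
        w = np.linalg.solve(cov, ones)      # proportional to cov^{-1} * 1
        return w / w.sum()                  # normalize so the weights sum to one

    cov = np.array([[0.04, 0.01, 0.00],     # placeholder covariance of three assets
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.16]])
    print(mvp_weights(cov))
    ```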

  5. Relationships among the slopes of lines derived from various data analysis techniques and the associated correlation coefficient

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1980-01-01

    A technique for fitting a straight line to a collection of data points is described. The relationships between the slopes obtained from the various fitting techniques and the associated correlation coefficient, and between the corresponding standard deviations and the correlation coefficient, are given.
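
    For the ordinary least-squares case, the standard relationship between slope and correlation coefficient is

        b_{y|x} = r\,\frac{s_y}{s_x}, \qquad b_{x|y} = r\,\frac{s_x}{s_y}, \qquad b_{y|x}\,b_{x|y} = r^{2},

    where b_{y|x} is the slope of the regression of y on x, s_x and s_y are the sample standard deviations, and r is the Pearson correlation coefficient. The report also covers other fitting techniques, whose relations are not restated here.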

  6. ROBUST: an interactive FORTRAN-77 package for exploratory data analysis using parametric, ROBUST and nonparametric location and scale estimates, data transformations, normality tests, and outlier assessment

    NASA Astrophysics Data System (ADS)

    Rock, N. M. S.

    ROBUST calculates 53 statistics, plus significance levels for 6 hypothesis tests, on each of up to 52 variables. These together allow the following properties of the data distribution for each variable to be examined in detail: (1) Location. Three means (arithmetic, geometric, harmonic) are calculated, together with the midrange and 19 high-performance robust L-, M-, and W-estimates of location (combined, adaptive, trimmed estimates, etc.) (2) Scale. The standard deviation is calculated along with the H-spread/2 (≈ semi-interquartile range), the mean and median absolute deviations from both mean and median, and a biweight scale estimator. The 23 location and 6 scale estimators programmed cover all possible degrees of robustness. (3) Normality: Distributions are tested against the null hypothesis that they are normal, using the 3rd (√b1) and 4th (b2) moments, Geary's ratio (mean deviation/standard deviation), Filliben's probability plot correlation coefficient, and a more robust test based on the biweight scale estimator. These statistics collectively are sensitive to most usual departures from normality. (4) Presence of outliers. The maximum and minimum values are assessed individually or jointly using Grubbs' maximum Studentized residuals, Harvey's and Dixon's criteria, and the Studentized range. For a single input variable, outliers can be either winsorized or eliminated and all estimates recalculated iteratively as desired. The following data transformations can also be applied: linear, log 10, generalized Box-Cox power (including log, reciprocal, and square root), exponentiation, and standardization. For more than one variable, all results are tabulated in a single run of ROBUST. Further options are incorporated to assess ratios (of two variables) as well as discrete variables, and to handle missing data. Cumulative S-plots (for assessing normality graphically) can also be generated. The mutual consistency or inconsistency of all these measures helps to detect errors in data as well as to assess data distributions themselves.

  7. A robust control scheme for flexible arms with friction in the joints

    NASA Technical Reports Server (NTRS)

    Rattan, Kuldip S.; Feliu, Vicente; Brown, H. Benjamin, Jr.

    1988-01-01

    A general control scheme to control flexible arms with friction in the joints is proposed in this paper. This scheme presents the advantage of being robust in the sense that it minimizes the effects of the Coulomb friction existing in the motor and the effects of changes in the dynamic friction coefficient. A justification of the robustness properties of the scheme is given in terms of the sensitivity analysis.

  8. Correlation Between Minimum Apparent Diffusion Coefficient (ADCmin) and Tumor Cellularity: A Meta-analysis.

    PubMed

    Surov, Alexey; Meyer, Hans Jonas; Wienke, Andreas

    2017-07-01

    Diffusion-weighted imaging (DWI) is a magnetic resonance imaging (MRI) technique based on measuring water diffusion that can provide information about tissue microstructure, especially about cell count. An increase in cell density restricts water diffusion and decreases the apparent diffusion coefficient (ADC). ADC can be divided into three sub-parameters: minimum ADC (ADCmin), mean ADC (ADCmean) and maximum ADC (ADCmax). Some studies have suggested that ADCmin shows stronger correlations with cell count in comparison to other ADC fractions and may be used as a parameter for estimation of tumor cellularity. The aim of the present meta-analysis was to summarize correlation coefficients between ADCmin and cellularity in different tumors based on large patient data. For this analysis, the MEDLINE database was screened for associations between ADC and cell count in different tumors up to September 2016. For this work, only data regarding ADCmin were included. Overall, 12 publications with 317 patients were identified. Spearman's correlation coefficient was used to analyze associations between ADCmin and cellularity. The reported Pearson correlation coefficients in some publications were converted into Spearman correlation coefficients. The pooled correlation coefficient for all included studies was ρ = -0.59 (95% confidence interval (CI) = -0.72 to -0.45), heterogeneity Tau² = 0.04 (p < 0.0001), I² = 73%, test for overall effect Z = 8.67 (p < 0.00001). ADCmin correlated moderately with tumor cellularity. The calculated correlation coefficient is not stronger than the reported coefficient for ADCmean and, therefore, ADCmin does not represent a better means of reflecting cellularity. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  9. Correlation of human papillomavirus status with apparent diffusion coefficient of diffusion-weighted MRI in head and neck squamous cell carcinomas.

    PubMed

    Driessen, Juliette P; van Bemmel, Alexander J M; van Kempen, Pauline M W; Janssen, Luuk M; Terhaard, Chris H J; Pameijer, Frank A; Willems, Stefan M; Stegeman, Inge; Grolman, Wilko; Philippens, Marielle E P

    2016-04-01

    Identification of prognostic patient characteristics in head and neck squamous cell carcinoma (HNSCC) is of great importance. Human papillomavirus (HPV)-positive HNSCCs have favorable response to (chemo)radiotherapy. Apparent diffusion coefficient, derived from diffusion-weighted MRI, has also been shown to predict treatment response. The purpose of this study was to evaluate the correlation between HPV status and apparent diffusion coefficient. Seventy-three patients with histologically proven HNSCC were retrospectively analyzed. Mean pretreatment apparent diffusion coefficient was calculated by delineation of total tumor volume on diffusion-weighted MRI. HPV status was analyzed and correlated to apparent diffusion coefficient. Six HNSCCs were HPV-positive. HPV-positive HNSCC showed significantly lower apparent diffusion coefficients compared to HPV-negative HNSCC. This correlation was independent of other patient characteristics. In HNSCC, positive HPV status correlates with low mean apparent diffusion coefficient. The favorable prognostic value of low pretreatment apparent diffusion coefficient might be partially attributed to patients with a positive HPV status. © 2015 Wiley Periodicals, Inc. Head Neck 38: E613-E618, 2016. © 2015 Wiley Periodicals, Inc.

  10. Robust three-body water simulation model

    NASA Astrophysics Data System (ADS)

    Tainter, C. J.; Pieniazek, P. A.; Lin, Y.-S.; Skinner, J. L.

    2011-05-01

    The most common potentials used in classical simulations of liquid water assume a pairwise additive form. Although these models have been very successful in reproducing many properties of liquid water at ambient conditions, none is able to describe water accurately throughout its complicated phase diagram. The primary reason for this is the neglect of many-body interactions. To address this, a simulation model with explicit three-body interactions was introduced recently [R. Kumar and J. L. Skinner, J. Phys. Chem. B 112, 8311 (2008), 10.1021/jp8009468]. This model was parameterized to fit the experimental O-O radial distribution function and diffusion constant. Herein we reparameterize the model, fitting to a wider range of experimental properties (diffusion constant, rotational correlation time, density for the liquid, liquid/vapor surface tension, melting point, and the ice Ih density). The robustness of the model is then verified by comparing simulation to experiment for a number of other quantities (enthalpy of vaporization, dielectric constant, Debye relaxation time, temperature of maximum density, and the temperature-dependent second and third virial coefficients), with good agreement.

  11. Similarity analysis between chromosomes of Homo sapiens and monkeys with correlation coefficient, rank correlation coefficient and cosine similarity measures

    PubMed Central

    Someswara Rao, Chinta; Viswanadha Raju, S.

    2016-01-01

    In this paper, we consider correlation coefficient, rank correlation coefficient and cosine similarity measures for evaluating similarity between Homo sapiens and monkeys. We used DNA chromosomes of genome-wide genes to determine the correlation between chromosomal content and evolutionary relationship. The similarity among H. sapiens and monkeys is measured for a total of 210 chromosomes related to 10 species. The similarity measures of these different species show the relationship between H. sapiens and monkeys. This similarity will be helpful in theft identification, maternity identification, disease identification, etc. PMID:26981409
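
    For readers who want a concrete sense of the three measures compared here, the following minimal sketch computes the Pearson correlation coefficient, the Spearman rank correlation coefficient and the cosine similarity for two numeric sequences; the toy vectors below are arbitrary stand-ins for encoded chromosome features, not the data used in the paper.

        import numpy as np
        from scipy.stats import pearsonr, spearmanr

        # Toy numeric encodings standing in for two chromosome feature vectors
        x = np.array([2.0, 4.0, 4.0, 7.0, 9.0, 12.0])
        y = np.array([1.5, 3.0, 5.0, 6.5, 8.0, 13.0])

        pearson_r, _ = pearsonr(x, y)        # linear correlation
        spearman_rho, _ = spearmanr(x, y)    # rank correlation
        cosine = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))  # cosine similarity

        print(pearson_r, spearman_rho, cosine)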

  12. Similarity analysis between chromosomes of Homo sapiens and monkeys with correlation coefficient, rank correlation coefficient and cosine similarity measures.

    PubMed

    Someswara Rao, Chinta; Viswanadha Raju, S

    2016-03-01

    In this paper, we consider correlation coefficient, rank correlation coefficient and cosine similarity measures for evaluating similarity between Homo sapiens and monkeys. We used DNA chromosomes of genome-wide genes to determine the correlation between chromosomal content and evolutionary relationship. The similarity among H. sapiens and monkeys is measured for a total of 210 chromosomes related to 10 species. The similarity measures of these different species show the relationship between H. sapiens and monkeys. This similarity will be helpful in theft identification, maternity identification, disease identification, etc.

  13. Smoothing effect for spatially distributed renewable resources and its impact on power grid robustness.

    PubMed

    Nagata, Motoki; Hirata, Yoshito; Fujiwara, Naoya; Tanaka, Gouhei; Suzuki, Hideyuki; Aihara, Kazuyuki

    2017-03-01

    In this paper, we show that the spatial correlation of renewable energy outputs greatly influences the robustness of power grids against large fluctuations of the effective power. First, we evaluate the spatial correlation among renewable energy outputs. We find that the spatial correlation of renewable energy outputs depends on the locations, but its influence on power grids is not well understood. Thus, second, employing the topology of the power grid in eastern Japan, we analyze the robustness of the power grid under spatially correlated renewable energy outputs. The analysis is performed using a realistic differential-algebraic equation model. The results show that the spatial correlation of the energy resources strongly degrades the robustness of the power grid. Our results suggest that the spatial correlation of renewable energy outputs should be considered when estimating the stability of power grids.

  14. QSAR study of curcumine derivatives as HIV-1 integrase inhibitors.

    PubMed

    Gupta, Pawan; Sharma, Anju; Garg, Prabha; Roy, Nilanjan

    2013-03-01

    A QSAR study was performed on curcumine derivatives as HIV-1 integrase inhibitors using multiple linear regression. A statistically significant model was developed with a squared correlation coefficient (r²) of 0.891 and a cross-validated r² (r²cv) of 0.825. The developed model revealed that electronic, shape, size, geometry, substitution, and hydrophilicity information were important atomic properties for determining the inhibitory activity of these molecules. The model was also tested successfully for external validation (r²pred = 0.849) as well as Tropsha's test for model predictability. Furthermore, a domain analysis was carried out to evaluate the prediction reliability for external set molecules. The model was statistically robust and had good predictive power, and it can be utilized for screening of new molecules.
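
    The sketch below shows one common way to compute a leave-one-out cross-validated r² (often reported as q²) alongside the conventional r² for a multiple linear regression QSAR model; the descriptor matrix and activities are synthetic placeholders, not the curcumine data or descriptors used in the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 4))                                              # placeholder descriptors
        y = X @ np.array([1.0, -0.5, 0.3, 0.0]) + rng.normal(scale=0.2, size=30)  # placeholder activities

        press = 0.0
        for train, test in LeaveOneOut().split(X):
            model = LinearRegression().fit(X[train], y[train])
            press += ((y[test] - model.predict(X[test])) ** 2).sum()  # predictive residual sum of squares

        q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)    # leave-one-out cross-validated r^2
        r2 = LinearRegression().fit(X, y).score(X, y)     # conventional r^2
        print(r2, q2)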

  15. Reproducibility of resting state spinal cord networks in healthy volunteers at 7 Tesla.

    PubMed

    Barry, Robert L; Rogers, Baxter P; Conrad, Benjamin N; Smith, Seth A; Gore, John C

    2016-06-01

    We recently reported our findings of resting state functional connectivity in the human spinal cord: in a cohort of healthy volunteers we observed robust functional connectivity between left and right ventral (motor) horns and between left and right dorsal (sensory) horns (Barry et al., 2014). Building upon these results, we now quantify the within-subject reproducibility of bilateral motor and sensory networks (intraclass correlation coefficient = 0.54-0.56) and explore the impact of including frequencies up to 0.13 Hz. Our results suggest that frequencies above 0.08 Hz may enhance the detectability of these resting state networks, which would be beneficial for practical studies of spinal cord functional connectivity. Copyright © 2016 Elsevier Inc. All rights reserved.
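
    A minimal sketch of a two-session intraclass correlation computation is given below; it implements the consistency form ICC(3,1) from a two-way mean-squares decomposition, which may differ from the exact ICC variant used in the study, and the connectivity values are invented.

        import numpy as np

        def icc_3_1(data):
            """Consistency ICC(3,1) for an (n_subjects, k_sessions) array of measurements."""
            data = np.asarray(data, float)
            n, k = data.shape
            grand = data.mean()
            ss_rows = k * np.sum((data.mean(axis=1) - grand) ** 2)    # between-subjects SS
            ss_cols = n * np.sum((data.mean(axis=0) - grand) ** 2)    # between-sessions SS
            ss_err = np.sum((data - grand) ** 2) - ss_rows - ss_cols  # residual SS
            ms_rows = ss_rows / (n - 1)
            ms_err = ss_err / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

        # Hypothetical connectivity strengths for 6 subjects scanned twice
        scores = [[0.42, 0.45], [0.55, 0.50], [0.31, 0.36], [0.60, 0.57], [0.48, 0.52], [0.39, 0.35]]
        print(icc_3_1(scores))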

  16. Associations between personal exposures and ambient concentrations of nitrogen dioxide: A quantitative research synthesis

    NASA Astrophysics Data System (ADS)

    Meng, Q. Y.; Svendsgaard, D.; Kotchmar, D. J.; Pinto, J. P.

    2012-09-01

    Although positive associations between ambient NO2 concentrations and personal exposures have generally been found by exposure studies, the strength of the associations varied among studies. Differences in results could be related to differences in study design and in exposure factors. However, the effects of study design, exposure factors, and sampling and measurement errors on the strength of the personal-ambient associations have not been evaluated quantitatively in a systematic manner. A quantitative research synthesis was conducted to examine these issues based on peer-reviewed publications in the past 30 years. Factors affecting the strength of the personal-ambient associations across the studies were also examined with meta-regression. Ambient NO2 was found to be significantly associated with personal NO2 exposures, with estimates of 0.42, 0.16, and 0.72 for overall pooled, longitudinal and daily average correlation coefficients based on random-effects meta-analysis. This conclusion was robust after correction for publication bias with correlation coefficients of 0.37, 0.16 and 0.45. We found that season and some population characteristics, such as pre-existing disease, were significant factors affecting the strength of the personal-ambient associations. More meaningful and rigorous comparisons would be possible if greater detail were published on the study design (e.g. local and indoor sources, housing characteristics, etc.) and data quality (e.g., detection limits and percent of data above detection limits).

  17. Prediction of Moisture Content for Congou Black Tea Withering Leaves Using Image Features and Nonlinear Method.

    PubMed

    Liang, Gaozhen; Dong, Chunwang; Hu, Bin; Zhu, Hongkai; Yuan, Haibo; Jiang, Yongwen; Hao, Guoshuang

    2018-05-18

    Withering is the first step in the processing of congou black tea. To address the shortcomings of traditional water content detection methods, a machine vision based NDT (Non-Destructive Testing) method was established to detect the moisture content of withered leaves. First, visible-light images of the tea leaf surfaces were collected with a computer vision system according to the time sequence, and color and texture characteristics were extracted from the spatial changes of colors. Then, quantitative prediction models for moisture content detection of withered tea leaves were established through linear PLS (Partial Least Squares) and non-linear SVM (Support Vector Machine). The results showed correlation coefficients higher than 0.8 between the water contents and the green component mean value (G), lightness component mean value (L*) and uniformity (U), which means that the extracted characteristics have great potential to predict the water contents. The performance parameters of the SVM prediction model, namely the correlation coefficient of the prediction set (Rp), root-mean-square error of prediction (RMSEP), and relative standard deviation (RPD), are 0.9314, 0.0411 and 1.8004, respectively. The non-linear modeling method can better describe the quantitative analytical relations between the image and the water content. With superior generalization and robustness, the method provides a new approach and theoretical basis for online water content monitoring in the automated production of black tea.

  18. Comparison of Automated Brain Volume Measures obtained with NeuroQuant and FreeSurfer.

    PubMed

    Ochs, Alfred L; Ross, David E; Zannoni, Megan D; Abildskov, Tracy J; Bigler, Erin D

    2015-01-01

    To examine intermethod reliabilities and differences between FreeSurfer and the FDA-cleared congener, NeuroQuant, both fully automated methods for structural brain MRI measurements. MRI scans from 20 normal control subjects, 20 Alzheimer's disease patients, and 20 patients with mild traumatic brain injury were analyzed with NeuroQuant and with FreeSurfer. Intermethod reliability was evaluated. Pairwise correlation coefficients, intraclass correlation coefficients, and effect size differences were computed. NeuroQuant versus FreeSurfer measures showed excellent to good intermethod reliability for the 21 regions evaluated (r: .63 to .99/ICC: .62 to .99/ES: -.33 to 2.08) except for the pallidum (r/ICC/ES = .31/.29/-2.2) and cerebellar white matter (r/ICC/ES = .31/.31/.08). Volumes reported by NeuroQuant were generally larger than those reported by FreeSurfer, with the whole brain parenchyma volume reported by NeuroQuant 6.50% larger than the volume reported by FreeSurfer. There was no systematic difference in results between the 3 subgroups. NeuroQuant and FreeSurfer showed good to excellent intermethod reliability in volumetric measurements for all brain regions examined, with the only exceptions being the pallidum and cerebellar white matter. This finding was robust for normal individuals, patients with Alzheimer's disease, and patients with mild traumatic brain injury. Copyright © 2015 by the American Society of Neuroimaging.

  19. Cross-cultural application of the Korean version of Ureteral Stent Symptoms Questionnaire.

    PubMed

    Park, Jinsung; Shin, Dong Wook; You, Changhee; Chung, Kyung Jin; Han, Deok Hyun; Joshi, Hrishi B; Park, Hyung Keun

    2012-11-01

    We validated the Korean version of the Ureteral Stent Symptoms Questionnaire (USSQ) in patients with an indwelling ureteral stent. Linguistic validation of the original USSQ was performed through a standard process including translation, back translation, and a pilot study. A total of 65 patients who underwent ureteroscopic surgery were asked to complete the Korean USSQ as well as EuroQOL (male and female), the International Prostate Symptom Score (male), and the Urogenital Distress Inventory-6 (female). Patients were evaluated at weeks 1 and 2 after stent placement and at week 4 after removal. Sixty-four healthy subjects without a ureteral stent were also asked to complete the Korean USSQ once. The psychometric properties of the questionnaire were analyzed. Internal consistencies (Cronbach α coefficients: 0.73-0.83) and test-retest reliability (Spearman correlation coefficient: ≥0.6) were satisfactory for the urinary symptom, body pain, general health, and work performance domains. Most USSQ domains showed moderate correlations with each other. Convergent validity, determined by correlation between other instruments and the corresponding USSQ domain, was satisfactory. Sensitivity to change and discriminant validity were also good in most domains (P<0.01). Only a small proportion of the study population had an active sexual life with the stent in situ, limiting analysis of this domain. The Korean version of the USSQ is a reliable and valid instrument that can be self-administered by Korean patients with a ureteral stent in clinical and research settings. Further clinical studies in Korean settings would be useful to provide robust data on sensitivity to change.
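
    For illustration, the internal consistency of a multi-item domain can be computed with Cronbach's alpha as sketched below; the item responses are invented and the calculation is generic rather than specific to the USSQ scoring rules.

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents, k_items) array of item scores."""
            items = np.asarray(items, float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)       # per-item variances
            total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed score
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Invented responses of 5 patients to a 4-item domain
        responses = [[3, 4, 3, 4], [2, 2, 3, 2], [4, 5, 4, 4], [1, 2, 1, 2], [3, 3, 4, 3]]
        print(cronbach_alpha(responses))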

  20. Improving image segmentation performance and quantitative analysis via a computer-aided grading methodology for optical coherence tomography retinal image analysis.

    PubMed

    Debuc, Delia Cabrera; Salinas, Harry M; Ranganathan, Sudarshan; Tátrai, Erika; Gao, Wei; Shen, Meixiao; Wang, Jianhua; Somfai, Gábor M; Puliafito, Carmen A

    2010-01-01

    We demonstrate quantitative analysis and error correction of optical coherence tomography (OCT) retinal images by using a custom-built, computer-aided grading methodology. A total of 60 Stratus OCT (Carl Zeiss Meditec, Dublin, California) B-scans collected from ten normal healthy eyes are analyzed by two independent graders. The average retinal thickness per macular region is compared with the automated Stratus OCT results. Intergrader and intragrader reproducibility is calculated by Bland-Altman plots of the mean difference between both gradings and by Pearson correlation coefficients. In addition, the correlation between Stratus OCT and our methodology-derived thickness is also presented. The mean thickness difference between Stratus OCT and our methodology is 6.53 μm and 26.71 μm when using the inner segment/outer segment (IS/OS) junction and outer segment/retinal pigment epithelium (OS/RPE) junction as the outer retinal border, respectively. Overall, the median of the thickness differences as a percentage of the mean thickness is less than 1% and 2% for the intragrader and intergrader reproducibility test, respectively. The measurement accuracy range of the OCT retinal image analysis (OCTRIMA) algorithm is between 0.27 and 1.47 μm and 0.6 and 1.76 μm for the intragrader and intergrader reproducibility tests, respectively. Pearson correlation coefficients demonstrate R² > 0.98 for all Early Treatment Diabetic Retinopathy Study (ETDRS) regions. Our methodology facilitates a more robust and localized quantification of the retinal structure in normal healthy controls and patients with clinically significant intraretinal features.
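
    A minimal sketch of a Bland-Altman style agreement summary (mean difference and 95% limits of agreement) between two paired thickness measurements, alongside the Pearson correlation, is shown below; the paired values are fabricated for illustration only and do not come from the OCTRIMA study.

        import numpy as np

        # Fabricated paired retinal thickness measurements (micrometers) from two graders
        grader_a = np.array([250.1, 262.4, 248.7, 271.3, 255.0, 260.2])
        grader_b = np.array([251.0, 260.9, 250.2, 270.1, 256.4, 259.0])

        diff = grader_a - grader_b
        mean_diff = diff.mean()                   # bias
        sd_diff = diff.std(ddof=1)
        loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)  # 95% limits of agreement
        pearson_r = np.corrcoef(grader_a, grader_b)[0, 1]

        print(mean_diff, loa, pearson_r)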

  1. Improving image segmentation performance and quantitative analysis via a computer-aided grading methodology for optical coherence tomography retinal image analysis

    NASA Astrophysics Data System (ADS)

    Cabrera Debuc, Delia; Salinas, Harry M.; Ranganathan, Sudarshan; Tátrai, Erika; Gao, Wei; Shen, Meixiao; Wang, Jianhua; Somfai, Gábor M.; Puliafito, Carmen A.

    2010-07-01

    We demonstrate quantitative analysis and error correction of optical coherence tomography (OCT) retinal images by using a custom-built, computer-aided grading methodology. A total of 60 Stratus OCT (Carl Zeiss Meditec, Dublin, California) B-scans collected from ten normal healthy eyes are analyzed by two independent graders. The average retinal thickness per macular region is compared with the automated Stratus OCT results. Intergrader and intragrader reproducibility is calculated by Bland-Altman plots of the mean difference between both gradings and by Pearson correlation coefficients. In addition, the correlation between Stratus OCT and our methodology-derived thickness is also presented. The mean thickness difference between Stratus OCT and our methodology is 6.53 μm and 26.71 μm when using the inner segment/outer segment (IS/OS) junction and outer segment/retinal pigment epithelium (OS/RPE) junction as the outer retinal border, respectively. Overall, the median of the thickness differences as a percentage of the mean thickness is less than 1% and 2% for the intragrader and intergrader reproducibility test, respectively. The measurement accuracy range of the OCT retinal image analysis (OCTRIMA) algorithm is between 0.27 and 1.47 μm and 0.6 and 1.76 μm for the intragrader and intergrader reproducibility tests, respectively. Pearson correlation coefficients demonstrate R2>0.98 for all Early Treatment Diabetic Retinopathy Study (ETDRS) regions. Our methodology facilitates a more robust and localized quantification of the retinal structure in normal healthy controls and patients with clinically significant intraretinal features.

  2. Handheld echocardiography during hospitalization for acute myocardial infarction.

    PubMed

    Cullen, Michael W; Geske, Jeffrey B; Anavekar, Nandan S; Askew, J Wells; Lewis, Bradley R; Oh, Jae K

    2017-11-01

    Handheld echocardiography (HHE) is concordant with standard transthoracic echocardiography (TTE) in a variety of settings but has not been thoroughly compared to traditional TTE in patients with acute myocardial infarction (AMI). Completed by experienced operators, HHE provides accurate diagnostic capabilities compared with standard TTE in AMI patients. This study prospectively enrolled patients admitted to the coronary care unit with AMI. Experienced sonographers performed HHE with a V-scan. All patients underwent clinical TTE. Each HHE was interpreted by 2 experts blinded to standard TTE. Agreement was assessed with κ statistics and concordance correlation coefficients. Analysis included 82 patients (mean age, 66 years; 74% male). On standard TTE, mean left ventricular (LV) ejection fraction was 46%. Correlation coefficients between HHE and TTE were 0.75 (95% confidence interval: 0.66 to 0.82) for LV ejection fraction and 0.69 (95% confidence interval: 0.58 to 0.77) for wall motion score index. The κ statistics ranged from 0.47 to 0.56 for LV enlargement, 0.55 to 0.79 for mitral regurgitation, and 0.44 to 0.57 for inferior vena cava dilatation. The κ statistics were highest for the anterior (0.81) and septal (0.71) apex and lowest for the mid inferolateral (0.36) and basal inferoseptal (0.36) walls. In patients with AMI, HHE and standard TTE demonstrate good correlation for LV function and wall motion. Agreement was less robust for structural abnormalities and specific wall segments. In experienced hands, HHE can provide a focused assessment of LV function in patients hospitalized with AMI; however, HHE should not substitute for comprehensive TTE. © 2017 Wiley Periodicals, Inc.
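
    As context for the agreement statistics reported here, the sketch below computes Lin's concordance correlation coefficient for two paired continuous measurements (for example, ejection fractions from two devices); the numbers are illustrative only and this is not the authors' analysis code.

        import numpy as np

        def concordance_ccc(x, y):
            """Lin's concordance correlation coefficient for paired measurements."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()                 # population variances
            cov = ((x - mx) * (y - my)).mean()
            return 2.0 * cov / (vx + vy + (mx - my) ** 2)

        # Illustrative paired LV ejection fractions (%) from two imaging methods
        hhe = [45, 55, 38, 60, 50, 42, 35, 58]
        tte = [47, 53, 40, 62, 49, 44, 38, 55]
        print(concordance_ccc(hhe, tte))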

  3. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
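
    To make the two rank statistics concrete, the sketch below computes Kendall's tau between a quantitative ordering and one reviewer's ranking, and Kendall's W across several reviewers; the ranks are fabricated and the tie-correction used in the study is omitted.

        import numpy as np
        from scipy.stats import kendalltau

        # Fabricated rankings of 8 images (1 = least transillumination) by three raters
        ranks = np.array([
            [1, 2, 3, 4, 5, 6, 7, 8],   # quantitative (image-analysis) ordering
            [1, 3, 2, 4, 5, 7, 6, 8],   # reviewer A
            [2, 1, 3, 4, 6, 5, 7, 8],   # reviewer B
        ])

        tau, _ = kendalltau(ranks[0], ranks[1])   # agreement of one reviewer with the quantitative order

        m, n = ranks.shape                        # raters, items
        rank_sums = ranks.sum(axis=0)
        s = np.sum((rank_sums - rank_sums.mean()) ** 2)
        w = 12.0 * s / (m ** 2 * (n ** 3 - n))    # Kendall's coefficient of concordance (no tie correction)
        print(tau, w)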

  4. Correlation coefficient measurement of the mode-locked laser tones using four-wave mixing.

    PubMed

    Anthur, Aravind P; Panapakkam, Vivek; Vujicic, Vidak; Merghem, Kamel; Lelarge, Francois; Ramdane, Abderrahim; Barry, Liam P

    2016-06-01

    We use four-wave mixing to measure the correlation coefficient of comb tones in a quantum-dash mode-locked laser under passive and active locked regimes. We study the uncertainty in the measurement of the correlation coefficient of the proposed method.

  5. Empirical correlations for axial dispersion coefficient and Peclet number in fixed-bed columns.

    PubMed

    Rastegar, Seyed Omid; Gu, Tingyue

    2017-03-24

    In this work, a new correlation for the axial dispersion coefficient was obtained using experimental data in the literature for axial dispersion in fixed-bed columns packed with particles. The Chung and Wen correlation and the De Ligny correlation are two popular empirical correlations. However, the former lacks a molecular diffusion term and the latter does not consider bed voidage. The new axial dispersion coefficient correlation in this work was based on additional experimental data in the literature and considers both molecular diffusion and bed voidage, making it more comprehensive and accurate. On average, the Peclet number correlation derived from the new axial dispersion coefficient correlation yields Peclet number values 12% lower than those from the Chung and Wen correlation, and in many cases much lower than those from the De Ligny correlation. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Data-Driven Robust M-LS-SVR-Based NARX Modeling for Estimation and Control of Molten Iron Quality Indices in Blast Furnace Ironmaking.

    PubMed

    Zhou, Ping; Guo, Dongwei; Wang, Hong; Chai, Tianyou

    2017-09-29

    Optimal operation of an industrial blast furnace (BF) ironmaking process largely depends on reliable measurement of molten iron quality (MIQ) indices, which is not feasible using conventional sensors. This paper proposes a novel data-driven robust modeling method for the online estimation and control of MIQ indices. First, a nonlinear autoregressive exogenous (NARX) model is constructed for the MIQ indices to completely capture the nonlinear dynamics of the BF process. Then, considering that the standard least-squares support vector regression (LS-SVR) cannot directly cope with the multioutput problem, a multitask transfer learning is proposed to design a novel multioutput LS-SVR (M-LS-SVR) for the learning of the NARX model. Furthermore, a novel M-estimator is proposed to reduce the interference of outliers and improve the robustness of the M-LS-SVR model. Since the weights of different outlier data are properly given by the weight function, their corresponding contributions to modeling can properly be distinguished, and thus a robust modeling result can be achieved. Finally, a novel multiobjective evaluation index of the modeling performance is developed by comprehensively considering the root-mean-square error of modeling and the correlation coefficient on trend fitting, based on which the nondominated sorting genetic algorithm II is used to globally optimize the model parameters. Both experiments using industrial data and industrial applications illustrate that the proposed method can efficiently eliminate the adverse effect caused by the fluctuation of data in the BF process. This indicates its stronger robustness and higher accuracy. Moreover, control testing shows that the developed model can be well applied to realize data-driven control of the BF process.
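
    The paper's robust M-LS-SVR is specific to its kernel formulation, but the general idea of down-weighting outliers with an M-estimator can be illustrated with a simple Huber-weighted iteratively reweighted least squares fit on synthetic data, as sketched below; this is only an analogy, not the authors' algorithm, and all data and parameter values are made up.

        import numpy as np

        def huber_irls(X, y, delta=1.35, iters=20):
            """Linear fit by iteratively reweighted least squares with Huber weights,
            illustrating how an M-estimator reduces the influence of outliers."""
            Xb = np.column_stack([np.ones(len(y)), X])   # add intercept
            beta = np.linalg.lstsq(Xb, y, rcond=None)[0]
            for _ in range(iters):
                r = y - Xb @ beta
                scale = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12  # robust scale (MAD)
                u = np.abs(r / scale)
                w = np.where(u <= delta, 1.0, delta / u)                      # Huber weights
                sw = np.sqrt(w)
                beta = np.linalg.lstsq(Xb * sw[:, None], y * sw, rcond=None)[0]
            return beta

        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 2))
        y = X @ np.array([2.0, -1.0]) + 0.5 + rng.normal(scale=0.1, size=100)
        y[:5] += 10.0                                    # inject a few outliers
        print(huber_irls(X, y))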

  7. Data-Driven Robust M-LS-SVR-Based NARX Modeling for Estimation and Control of Molten Iron Quality Indices in Blast Furnace Ironmaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ping; Guo, Dongwei; Wang, Hong

    Optimal operation of an industrial blast furnace (BF) ironmaking process largely depends on reliable measurement of molten iron quality (MIQ) indices, which is not feasible using conventional sensors. This paper proposes a novel data-driven robust modeling method for the online estimation and control of MIQ indices. First, a nonlinear autoregressive exogenous (NARX) model is constructed for the MIQ indices to completely capture the nonlinear dynamics of the BF process. Then, considering that the standard least-squares support vector regression (LS-SVR) cannot directly cope with the multioutput problem, a multitask transfer learning is proposed to design a novel multioutput LS-SVR (M-LS-SVR) for the learning of the NARX model. Furthermore, a novel M-estimator is proposed to reduce the interference of outliers and improve the robustness of the M-LS-SVR model. Since the weights of different outlier data are properly given by the weight function, their corresponding contributions to modeling can properly be distinguished, and thus a robust modeling result can be achieved. Finally, a novel multiobjective evaluation index of the modeling performance is developed by comprehensively considering the root-mean-square error of modeling and the correlation coefficient on trend fitting, based on which the nondominated sorting genetic algorithm II is used to globally optimize the model parameters. Both experiments using industrial data and industrial applications illustrate that the proposed method can efficiently eliminate the adverse effect caused by the fluctuation of data in the BF process. In conclusion, this indicates its stronger robustness and higher accuracy. Moreover, control testing shows that the developed model can be well applied to realize data-driven control of the BF process.

  8. Data-Driven Robust M-LS-SVR-Based NARX Modeling for Estimation and Control of Molten Iron Quality Indices in Blast Furnace Ironmaking

    DOE PAGES

    Zhou, Ping; Guo, Dongwei; Wang, Hong; ...

    2017-09-29

    Optimal operation of an industrial blast furnace (BF) ironmaking process largely depends on reliable measurement of molten iron quality (MIQ) indices, which is not feasible using conventional sensors. This paper proposes a novel data-driven robust modeling method for the online estimation and control of MIQ indices. First, a nonlinear autoregressive exogenous (NARX) model is constructed for the MIQ indices to completely capture the nonlinear dynamics of the BF process. Then, considering that the standard least-squares support vector regression (LS-SVR) cannot directly cope with the multioutput problem, a multitask transfer learning is proposed to design a novel multioutput LS-SVR (M-LS-SVR) for the learning of the NARX model. Furthermore, a novel M-estimator is proposed to reduce the interference of outliers and improve the robustness of the M-LS-SVR model. Since the weights of different outlier data are properly given by the weight function, their corresponding contributions to modeling can properly be distinguished, and thus a robust modeling result can be achieved. Finally, a novel multiobjective evaluation index of the modeling performance is developed by comprehensively considering the root-mean-square error of modeling and the correlation coefficient on trend fitting, based on which the nondominated sorting genetic algorithm II is used to globally optimize the model parameters. Both experiments using industrial data and industrial applications illustrate that the proposed method can efficiently eliminate the adverse effect caused by the fluctuation of data in the BF process. In conclusion, this indicates its stronger robustness and higher accuracy. Moreover, control testing shows that the developed model can be well applied to realize data-driven control of the BF process.

  9. SAGE III Aerosol Extinction Validation in the Arctic Winter: Comparisons with SAGE II and POAM III

    NASA Technical Reports Server (NTRS)

    Thomason, L. W.; Poole, L. R.; Randall, C. E.

    2007-01-01

    The use of SAGE III multiwavelength aerosol extinction coefficient measurements to infer PSC type is contingent on the robustness of both the extinction magnitude and its spectral variation. Past validation with SAGE II and other similar measurements has shown that the SAGE III extinction coefficient measurements are reliable, though the comparisons have been greatly weighted toward measurements made at mid-latitudes. Some aerosol comparisons made in the Arctic winter as a part of SOLVE II suggested that SAGE III values, particularly at longer wavelengths, are too small, with the implication that both the magnitude and the wavelength dependence are not reliable. Comparisons with POAM III have also suggested a similar discrepancy. Herein, we use SAGE II data as a common standard for comparison of SAGE III and POAM III measurements in the Arctic winters of 2002/2003 through 2004/2005. During the winter, SAGE II measurements are made infrequently at the same latitudes as these instruments. We have mitigated this problem through the use of potential vorticity as a spatial coordinate and thus greatly increased the number of coincident events. We find that SAGE II and III extinction coefficient measurements show a high degree of compatibility at both 1020 nm and 450 nm except for a 10-20% bias at both wavelengths. In addition, the 452 to 1020-nm extinction ratio shows a consistent bias of approx. 30% throughout the lower stratosphere. We also find that SAGE II and POAM III are on average consistent, though the comparisons show a much higher variability and larger bias than the SAGE II/III comparisons. In addition, we find that the two data sets are not well correlated below 18 km. Overall, we find that both the extinction values and the spectral dependence from SAGE III are robust, and we find no evidence of a significant defect within the Arctic vortex.

  10. Proposed method to estimate the liquid-vapor accommodation coefficient based on experimental sonoluminescence data.

    PubMed

    Puente, Gabriela F; Bonetto, Fabián J

    2005-05-01

    We used the temporal evolution of the bubble radius in single-bubble sonoluminescence to estimate the water liquid-vapor accommodation coefficient. The rapid changes in the bubble radius that occur during the bubble collapse and rebounds are a function of the actual value of the accommodation coefficient. We selected bubble radius measurements obtained from two different experimental techniques in conjunction with a robust parameter estimation strategy, and we found that for water at room temperature the mass accommodation coefficient lies in the confidence interval [0.217, 0.329].

  11. Testing the Difference of Correlated Agreement Coefficients for Statistical Significance

    ERIC Educational Resources Information Center

    Gwet, Kilem L.

    2016-01-01

    This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…

  12. Sparse coding for flexible, robust 3D facial-expression synthesis.

    PubMed

    Lin, Yuxu; Song, Mingli; Quynh, Dao Thi Phuong; He, Ying; Chen, Chun

    2012-01-01

    Computer animation researchers have been extensively investigating 3D facial-expression synthesis for decades. However, flexible, robust production of realistic 3D facial expressions is still technically challenging. A proposed modeling framework applies sparse coding to synthesize 3D expressive faces, using specified coefficients or expression examples. It also robustly recovers facial expressions from noisy and incomplete data. This approach can synthesize higher-quality expressions in less time than the state-of-the-art techniques.

  13. Lack of robustness of textural measures obtained from 3D brain tumor MRIs impose a need for standardization.

    PubMed

    Molina, David; Pérez-Beteta, Julián; Martínez-González, Alicia; Martino, Juan; Velasquez, Carlos; Arana, Estanislao; Pérez-García, Víctor M

    2017-01-01

    Textural measures have been widely explored as imaging biomarkers in cancer. However, their robustness under dynamic range and spatial resolution changes in brain 3D magnetic resonance images (MRI) has not been assessed. The aim of this work was to study potential variations of textural measures due to changes in MRI protocols. Twenty patients harboring glioblastoma with pretreatment 3D T1-weighted MRIs were included in the study. Four different spatial resolution combinations and three dynamic ranges were studied for each patient. Sixteen three-dimensional textural heterogeneity measures were computed for each patient and configuration, including co-occurrence matrix (CM) features and run-length matrix (RLM) features. The coefficient of variation was used to assess the robustness of the measures in two series of experiments corresponding to (i) changing the dynamic range and (ii) changing the matrix size. No textural measures were robust under dynamic range changes. Entropy was the only textural feature robust under spatial resolution changes (coefficient of variation under 10% in all cases). Textural measures of three-dimensional brain tumor images are robust neither under dynamic range nor under matrix size changes. Standards should be harmonized to use textural features as imaging biomarkers in radiomic-based studies. The implications of this work go beyond the specific tumor type studied here and pose the need for standardization in textural feature calculation of oncological images.

  14. Covariate selection with group lasso and doubly robust estimation of causal effects

    PubMed Central

    Koch, Brandon; Vock, David M.; Wolfson, Julian

    2017-01-01

    The efficiency of doubly robust estimators of the average causal effect (ACE) of a treatment can be improved by including in the treatment and outcome models only those covariates which are related to both treatment and outcome (i.e., confounders) or related only to the outcome. However, it is often challenging to identify such covariates among the large number that may be measured in a given study. In this paper, we propose GLiDeR (Group Lasso and Doubly Robust Estimation), a novel variable selection technique for identifying confounders and predictors of outcome using an adaptive group lasso approach that simultaneously performs coefficient selection, regularization, and estimation across the treatment and outcome models. The selected variables and corresponding coefficient estimates are used in a standard doubly robust ACE estimator. We provide asymptotic results showing that, for a broad class of data generating mechanisms, GLiDeR yields a consistent estimator of the ACE when either the outcome or treatment model is correctly specified. A comprehensive simulation study shows that GLiDeR is more efficient than doubly robust methods using standard variable selection techniques and has substantial computational advantages over a recently proposed doubly robust Bayesian model averaging method. We illustrate our method by estimating the causal treatment effect of bilateral versus single-lung transplant on forced expiratory volume in one year after transplant using an observational registry. PMID:28636276
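
    For readers unfamiliar with doubly robust estimation, the sketch below implements a plain augmented inverse-probability-weighted (AIPW) estimator of the average causal effect with off-the-shelf scikit-learn models on simulated data; it omits the GLiDeR group-lasso variable selection step, and all variable names and parameter values are invented.

        import numpy as np
        from sklearn.linear_model import LinearRegression, LogisticRegression

        rng = np.random.default_rng(0)
        n = 2000
        X = rng.normal(size=(n, 3))                                        # covariates
        p = 1.0 / (1.0 + np.exp(-(0.5 * X[:, 0] - 0.25 * X[:, 1])))
        A = rng.binomial(1, p)                                             # treatment assignment
        Y = 1.0 + 2.0 * A + X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=n)   # outcome, true effect = 2

        e = LogisticRegression().fit(X, A).predict_proba(X)[:, 1]          # propensity model
        m1 = LinearRegression().fit(X[A == 1], Y[A == 1]).predict(X)       # outcome model, treated
        m0 = LinearRegression().fit(X[A == 0], Y[A == 0]).predict(X)       # outcome model, control

        # Augmented inverse-probability-weighted (doubly robust) estimate of the ACE
        ace = np.mean(m1 - m0 + A * (Y - m1) / e - (1 - A) * (Y - m0) / (1 - e))
        print(ace)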

  15. Covariate selection with group lasso and doubly robust estimation of causal effects.

    PubMed

    Koch, Brandon; Vock, David M; Wolfson, Julian

    2018-03-01

    The efficiency of doubly robust estimators of the average causal effect (ACE) of a treatment can be improved by including in the treatment and outcome models only those covariates which are related to both treatment and outcome (i.e., confounders) or related only to the outcome. However, it is often challenging to identify such covariates among the large number that may be measured in a given study. In this article, we propose GLiDeR (Group Lasso and Doubly Robust Estimation), a novel variable selection technique for identifying confounders and predictors of outcome using an adaptive group lasso approach that simultaneously performs coefficient selection, regularization, and estimation across the treatment and outcome models. The selected variables and corresponding coefficient estimates are used in a standard doubly robust ACE estimator. We provide asymptotic results showing that, for a broad class of data generating mechanisms, GLiDeR yields a consistent estimator of the ACE when either the outcome or treatment model is correctly specified. A comprehensive simulation study shows that GLiDeR is more efficient than doubly robust methods using standard variable selection techniques and has substantial computational advantages over a recently proposed doubly robust Bayesian model averaging method. We illustrate our method by estimating the causal treatment effect of bilateral versus single-lung transplant on forced expiratory volume in one year after transplant using an observational registry. © 2017, The International Biometric Society.

  16. Correlation transfer from basal ganglia to thalamus in Parkinson's disease

    PubMed Central

    Reitsma, Pamela; Doiron, Brent; Rubin, Jonathan

    2011-01-01

    Spike trains from neurons in the basal ganglia of parkinsonian primates show increased pairwise correlations, oscillatory activity, and burst rate compared to those from neurons recorded during normal brain activity. However, it is not known how these changes affect the behavior of downstream thalamic neurons. To understand how patterns of basal ganglia population activity may affect thalamic spike statistics, we study pairs of model thalamocortical (TC) relay neurons receiving correlated inhibitory input from the internal segment of the globus pallidus (GPi), a primary output nucleus of the basal ganglia. We observe that the strength of correlations of TC neuron spike trains increases with the GPi correlation level, and bursty firing patterns such as those seen in the parkinsonian GPi allow for stronger transfer of correlations than do firing patterns found under normal conditions. We also show that the T-current in the TC neurons does not significantly affect correlation transfer, despite its pronounced effects on spiking. Oscillatory firing patterns in GPi are shown to affect the timescale at which correlations are best transferred through the system. To explain this last result, we analytically compute the spike count correlation coefficient for oscillatory cases in a reduced point process model. Our analysis indicates that the dependence of the timescale of correlation transfer is robust to different levels of input spike and rate correlations and arises due to differences in instantaneous spike correlations, even when the long timescale rhythmic modulations of neurons are identical. Overall, these results show that parkinsonian firing patterns in GPi do affect the transfer of correlations to the thalamus. PMID:22355287

  17. Estimating Seven Coefficients of Pairwise Relatedness Using Population-Genomic Data

    PubMed Central

    Ackerman, Matthew S.; Johri, Parul; Spitze, Ken; Xu, Sen; Doak, Thomas G.; Young, Kimberly; Lynch, Michael

    2017-01-01

    Population structure can be described by genotypic-correlation coefficients between groups of individuals, the most basic of which are the pairwise relatedness coefficients between any two individuals. There are nine pairwise relatedness coefficients in the most general model, and we show that these can be reduced to seven coefficients for biallelic loci. Although all nine coefficients can be estimated from pedigrees, six coefficients have been beyond empirical reach. We provide a numerical optimization procedure that estimates all seven reduced coefficients from population-genomic data. Simulations show that the procedure is nearly unbiased, even at 3× coverage, and errors in five of the seven coefficients are statistically uncorrelated. The remaining two coefficients have a negative correlation of errors, but their sum provides an unbiased assessment of the overall correlation of heterozygosity between two individuals. Application of these new methods to four populations of the freshwater crustacean Daphnia pulex reveal the occurrence of half siblings in our samples, as well as a number of identical individuals that are likely obligately asexual clone mates. Statistically significant negative estimates of these pairwise relatedness coefficients, including inbreeding coefficients that were typically negative, underscore the difficulties that arise when interpreting genotypic correlations as estimations of the probability that alleles are identical by descent. PMID:28341647

  18. Development of a robust modeling tool for radiation-induced segregation in austenitic stainless steels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Ying; Field, Kevin G; Allen, Todd R.

    2015-09-01

    Irradiation-assisted stress corrosion cracking (IASCC) of austenitic stainless steels in Light Water Reactor (LWR) components has been linked to changes in grain boundary composition due to irradiation-induced segregation (RIS). This work developed a robust RIS modeling tool to account for the thermodynamics and kinetics of atom and defect transport under combined thermal and radiation conditions. The diffusion flux equations were based on the Perks model formulated through the linear theory of the thermodynamics of irreversible processes. Both cross and non-cross phenomenological diffusion coefficients in the flux equations were considered and correlated to tracer diffusion coefficients through Manning's relation. The preferential atom-vacancy coupling was described by the mobility model, whereas the preferential atom-interstitial coupling was described by the interstitial binding model. The composition dependence of the thermodynamic factor was modeled using the CALPHAD approach. Detailed analysis of the diffusion fluxes near and at grain boundaries of irradiated austenitic stainless steels suggested the dominant diffusion mechanism for chromium and iron is via vacancies, while that for nickel can swing from the vacancy-dominant to the interstitial-dominant mechanism. The diffusion flux in the vicinity of a grain boundary was found to be greatly influenced by the composition gradient formed from the transient state, leading to the oscillatory behavior of alloy compositions in this region. This work confirms that both vacancy and interstitial diffusion, and segregation itself, have important roles in determining the microchemistry of Fe, Cr, and Ni at irradiated grain boundaries in austenitic stainless steels.

  19. Attenuation of the Squared Canonical Correlation Coefficient under Varying Estimates of Score Reliability

    ERIC Educational Resources Information Center

    Wilson, Celia M.

    2010-01-01

    Research pertaining to the distortion of the squared canonical correlation coefficient has traditionally been limited to the effects of sampling error and associated correction formulas. The purpose of this study was to compare the degree of attenuation of the squared canonical correlation coefficient under varying conditions of score reliability.…

  20. Thin and Slow Smoke Detection by Using Frequency Image

    NASA Astrophysics Data System (ADS)

    Zheng, Guang; Oe, Shunitiro

    In this paper, a new method to detect thin and slow smoke for early fire alarm by using frequency images is proposed. The correlation coefficient of the frequency image between the current stage and the initial stage is calculated, as is the gray-image correlation coefficient of the color image. When thin smoke that is close to transparent enters the camera view, the correlation coefficient of the frequency image becomes small, while the gray-image correlation coefficient of the color image hardly changes and remains large. When something that is not transparent, such as a human being, enters the camera view, the correlation coefficients of both the frequency image and the color image become small. Based on this difference in behavior between the frequency-image and color-image correlation coefficients in different situations, thin smoke can be detected. Also, by considering the movement of the thin smoke, missed detections caused by illumination changes or noise can be avoided. Several experiments in different situations were carried out, and the experimental results show the effectiveness of the proposed method.
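
    A rough sketch of the comparison described above is given below: it computes the correlation coefficient between the log-magnitude FFT (a stand-in for the paper's frequency image) of a current frame and a reference frame, alongside the plain gray-level correlation; the frames are synthetic and the exact frequency-image construction used in the paper may differ.

        import numpy as np

        def corr2d(a, b):
            """Pearson correlation coefficient between two images of equal shape."""
            a, b = a.ravel().astype(float), b.ravel().astype(float)
            return np.corrcoef(a, b)[0, 1]

        def frequency_image(frame):
            """Log-magnitude spectrum used here as a stand-in for the frequency image."""
            return np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(frame))))

        rng = np.random.default_rng(0)
        reference = rng.random((64, 64))
        current = reference + 0.05 * rng.random((64, 64))   # e.g., a faint, semi-transparent haze

        gray_corr = corr2d(reference, current)
        freq_corr = corr2d(frequency_image(reference), frequency_image(current))
        print(gray_corr, freq_corr)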

  1. Statistical Study of Turbulence: Spectral Functions and Correlation Coefficients

    NASA Technical Reports Server (NTRS)

    Frenkiel, Francois N.

    1958-01-01

    In reading the publications on turbulence of different authors, one often runs the risk of confusing the various correlation coefficients and turbulence spectra. We have made a point of defining, by appropriate concepts, the differences which exist between these functions. Besides, we introduce in the symbols a few new characteristics of turbulence. In the first chapter, we study some relations between the correlation coefficients and the different turbulence spectra. Certain relations are given by means of demonstrations which could be called intuitive rather than mathematical. In this way we demonstrate that the correlation coefficients between the simultaneous turbulent velocities at two points are identical, whether studied in Lagrange's or in Euler's systems. We then consider new spectra of turbulence, obtained by study of the simultaneous velocities along a straight line of given direction. We determine some relations between these spectra and the correlation coefficients. Examining the relation between the spectrum of the turbulence measured at a fixed point and the longitudinal-correlation curve given by G. I. Taylor, we find that this equation is exact only when the coefficient is very small.

  2. [Electroencephalogram Feature Selection Based on Correlation Coefficient Analysis].

    PubMed

    Zhou, Jinzhi; Tang, Xiaofang

    2015-08-01

    In order to improve the accuracy of classification with a small amount of motor imagery training data in the development of brain-computer interface (BCI) systems, we proposed an analysis method to automatically select the characteristic parameters based on correlation coefficient analysis. Using the five sample data sets of dataset IVa from the 2005 BCI Competition, we utilized the short-time Fourier transform (STFT) and correlation coefficient calculation to reduce the dimensionality of the raw electroencephalogram data, then introduced feature extraction based on common spatial patterns (CSP) and classified by linear discriminant analysis (LDA). Simulation results showed that the average classification accuracy was higher with the correlation coefficient feature selection method than without it. Compared with a support vector machine (SVM) based feature optimization algorithm, correlation coefficient analysis can select better parameters and improve the accuracy of classification.

  3. Quantifying the range of cross-correlated fluctuations using a q-L dependent AHXA coefficient

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Wang, Lin; Chen, Yuming

    2018-03-01

    Recently, based on analogous height cross-correlation analysis (AHXA), a cross-correlation coefficient ρ×(L) has been proposed to quantify the levels of cross-correlation on different temporal scales for bivariate series. A limitation of this coefficient is that it cannot capture the full information of cross-correlations in the amplitude of fluctuations. In fact, it only detects the cross-correlation at a specific order of fluctuation, which might neglect important information inherited from other order fluctuations. To overcome this disadvantage, in this work, based on the scaling of the qth order covariance and time delay L, we define a two-parameter dependent cross-correlation coefficient ρq(L) to detect and quantify the range and level of cross-correlations. This new version of the coefficient leads to the formation of a ρq(L) surface, which not only is able to quantify the level of cross-correlations, but also allows us to identify the range of fluctuation amplitudes that are correlated in two given signals. Applications to the classical ARFIMA models and the binomial multifractal series illustrate the feasibility of this new coefficient ρq(L). In addition, a statistical test is proposed to quantify the existence of cross-correlations between two given series. Applying our method to real-life empirical data from the 1999-2000 California electricity market, we find that the California power crisis in 2000 destroyed the cross-correlation between the price and load series but did not affect the correlation of the load series during and before the crisis.

  4. Quantifying colocalization by correlation: the Pearson correlation coefficient is superior to the Mander's overlap coefficient.

    PubMed

    Adler, Jeremy; Parmryd, Ingela

    2010-08-01

    The Pearson correlation coefficient (PCC) and the Mander's overlap coefficient (MOC) are used to quantify the degree of colocalization between fluorophores. The MOC was introduced to overcome perceived problems with the PCC. The two coefficients are mathematically similar, differing in the use of either the absolute intensities (MOC) or of the deviation from the mean (PCC). A range of correlated datasets, which extend to the limits of the PCC, only evoked a limited response from the MOC. The PCC is unaffected by changes to the offset while the MOC increases when the offset is positive. Both coefficients are independent of gain. The MOC is a confusing hybrid measurement, that combines correlation with a heavily weighted form of co-occurrence, favors high intensity combinations, downplays combinations in which either or both intensities are low and ignores blank pixels. The PCC only measures correlation. A surprising finding was that the addition of a second uncorrelated population can substantially increase the measured correlation, demonstrating the importance of excluding background pixels. Overall, since the MOC is unresponsive to substantial changes in the data and is hard to interpret, it is neither an alternative to nor a useful substitute for the PCC. The MOC is not suitable for making measurements of colocalization either by correlation or co-occurrence.
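
    The difference between the two coefficients is easy to see in code: the sketch below computes both the PCC and the MOC for two fluorescence channels using their standard formulas on toy intensity arrays, and the last line illustrates the offset sensitivity discussed above; the values are invented.

        import numpy as np

        def pearson_cc(r, g):
            """Pearson correlation coefficient: uses deviations from the channel means."""
            r, g = np.asarray(r, float).ravel(), np.asarray(g, float).ravel()
            rd, gd = r - r.mean(), g - g.mean()
            return np.sum(rd * gd) / np.sqrt(np.sum(rd ** 2) * np.sum(gd ** 2))

        def overlap_coefficient(r, g):
            """Mander's overlap coefficient: uses absolute intensities, so it is offset sensitive."""
            r, g = np.asarray(r, float).ravel(), np.asarray(g, float).ravel()
            return np.sum(r * g) / np.sqrt(np.sum(r ** 2) * np.sum(g ** 2))

        # Toy two-channel pixel intensities
        red = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
        green = np.array([12.0, 18.0, 33.0, 39.0, 52.0])

        print(pearson_cc(red, green), overlap_coefficient(red, green))
        print(pearson_cc(red, green + 100), overlap_coefficient(red, green + 100))  # offset changes only the MOC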

  5. Correlation between malaria incidence and prevalence of soil-transmitted helminths in Colombia: an ecologic evaluation.

    PubMed

    Valencia, Carlos Andrés; Fernández, Julián Alfredo; Cucunubá, Zulma Milena; Reyes, Patricia; López, Myriam Consuelo; Duque, Sofía

    2010-01-01

    Recent studies have suggested an association between soil-transmitted helminth infections and malaria incidence. However, published evidence is still insufficient and diverging. Since 1977, new ecologic studies have not been carried out to explore this association. Ecologic studies could explore this correlation at a population level, assessing its potential importance for public health. The aim of this evaluation is to explore the association between soil-transmitted helminth prevalence and malaria incidence at an ecologic level in Colombia. Using data from the National Health Survey, which was carried out in 1980 in Colombia, we calculated Spearman correlation coefficients between the prevalences of Ascaris lumbricoides, Trichuris trichiura and hookworm and the malaria incidence data of the same year provided by the Colombian Malaria National Eradication Service. A robust regression analysis with least trimmed squares was performed. Falciparum malaria incidence and Ascaris lumbricoides prevalence had a low correlation (R² = 0.086), but the correlation was stronger when only the clusters of towns with a prevalence of Ascaris lumbricoides infection above 30% were included (R² = 0.916). This work showed an ecologic correlation in Colombia between malaria incidence and soil-transmitted helminth prevalence. This could suggest either that there is an association between these two groups of parasites, or that the finding is explained by the presence of common structural determinants for both diseases.

  6. Can Imaging Parameters Provide Information Regarding Histopathology in Head and Neck Squamous Cell Carcinoma? A Meta-Analysis.

    PubMed

    Surov, Alexey; Meyer, Hans Jonas; Wienke, Andreas

    2018-04-01

    Our purpose was to provide data regarding relationships between different imaging and histopathological parameters in HNSCC. The MEDLINE library was screened for associations between different imaging parameters and histopathological features in HNSCC up to December 2017. Only papers containing correlation coefficients between different imaging parameters and histopathological findings were acquired for the analysis. Associations between 18F-FDG positron emission tomography (PET) and Ki-67 were reported in 8 studies (236 patients). The pooled correlation coefficient was 0.20 (95% CI = [-0.04; 0.44]). Furthermore, in 4 studies (64 patients), associations between 18F-fluorothymidine PET and Ki-67 were analyzed. The pooled correlation coefficient between SUVmax and Ki-67 was 0.28 (95% CI = [-0.06; 0.94]). In 2 studies (23 patients), relationships between Ki-67 and dynamic contrast-enhanced magnetic resonance imaging were reported. The pooled correlation coefficient between Ktrans and Ki-67 was -0.68 (95% CI = [-0.91; -0.44]). Two studies (31 patients) investigated the correlation between the apparent diffusion coefficient (ADC) and Ki-67. The pooled correlation coefficient was -0.61 (95% CI = [-0.84; -0.38]). In 2 studies (117 patients), relationships between 18F-FDG PET and p53 were analyzed. The pooled correlation coefficient was 0.0 (95% CI = [-0.87; 0.88]). There were 3 studies (48 patients) that investigated associations between ADC and tumor cell count in HNSCC. The pooled correlation coefficient was -0.53 (95% CI = [-0.74; -0.32]). Associations between 18F-FDG PET and HIF-1α were investigated in 3 studies (72 patients). The pooled correlation coefficient was 0.44 (95% CI = [-0.20; 1.08]). ADC may predict cell count and proliferation activity, and SUVmax may predict expression of HIF-1α in HNSCC. SUVmax cannot be used as a surrogate marker for expression of Ki-67 and p53. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Optical temperature compensation schemes of spectral modulation sensors for aircraft engine control

    NASA Astrophysics Data System (ADS)

    Berkcan, Ertugrul

    1993-02-01

    Optical temperature compensation schemes for the ratiometric interrogation of spectral modulation sensors for source temperature robustness are presented. Using these types of compensation, we have obtained better than a 50-100X decrease in the temperature coefficient of the sensitivity. We have also developed a spectrographic interrogation scheme that provides increased source temperature robustness; this affords significantly improved accuracy over FADEC temperature ranges as well as a temperature coefficient of the sensitivity that is substantially further reduced. The latter compensation scheme can be integrated in a small E/O package including the detection, analog, and digital signal processing. We find that these interrogation schemes can be used within a detector spatially multiplexed architecture.

  8. A novel coefficient for detecting and quantifying asymmetry of California electricity market based on asymmetric detrended cross-correlation analysis.

    PubMed

    Wang, Fang

    2016-06-01

    In order to detect and quantify asymmetry of two time series, a novel cross-correlation coefficient is proposed based on recent asymmetric detrended cross-correlation analysis (A-DXA), which we called the A-DXA coefficient. The A-DXA coefficient, as an important extension of the DXA coefficient ρDXA, contains two directional asymmetric cross-correlated indexes, describing upwards and downwards asymmetric cross-correlations, respectively. By using the information of the directional covariance function of the two time series and the directional variance function of each series itself, instead of the power law between the covariance function and the time scale, the proposed A-DXA coefficient can well detect asymmetry between the two series no matter whether the cross-correlation is significant or not. Applying the proposed A-DXA coefficient to the asymmetry of the California electricity market, we found that the asymmetry between the prices and loads is not significant for daily average data in the 1999 market (before the electricity crisis) but extremely significant in the 2000 market (during the crisis). To further uncover the difference of asymmetry between the years 1999 and 2000, a modified H statistic (MH) and a ΔMH statistic are proposed. One contribution of the present work is the finding that high MH values calculated from hourly data occur in the majority of months in the 2000 market. Another important conclusion is that downward cross-correlation dominates throughout 1999, whereas upward cross-correlation dominates throughout 2000.
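
    The A-DXA coefficient extends the symmetric DCCA (DXA) cross-correlation coefficient ρDXA. As background, the sketch below computes the standard ρDCCA at a single scale with linear detrending in non-overlapping windows; it is a minimal illustration rather than the paper's asymmetric method, and the price/load series are synthetic stand-ins.

    import numpy as np

    def _detrended_windows(profile, n):
        """Yield (window, fitted linear trend) for non-overlapping windows of size n."""
        t = np.arange(n)
        for start in range(0, len(profile) - n + 1, n):
            w = profile[start:start + n]
            coef = np.polyfit(t, w, 1)
            yield w, np.polyval(coef, t)

    def dcca_coefficient(x, y, n):
        """rho_DCCA(n) = F2_DCCA(n) / (F_DFA_x(n) * F_DFA_y(n))."""
        X = np.cumsum(x - np.mean(x))        # integrated profiles
        Y = np.cumsum(y - np.mean(y))
        f_xy, f_xx, f_yy = [], [], []
        for (wx, tx), (wy, ty) in zip(_detrended_windows(X, n), _detrended_windows(Y, n)):
            dx, dy = wx - tx, wy - ty        # detrended residuals
            f_xy.append(np.mean(dx * dy))
            f_xx.append(np.mean(dx * dx))
            f_yy.append(np.mean(dy * dy))
        return np.mean(f_xy) / np.sqrt(np.mean(f_xx) * np.mean(f_yy))

    rng = np.random.default_rng(1)
    common = rng.standard_normal(2000)
    prices = common + 0.5 * rng.standard_normal(2000)    # hypothetical coupled series
    loads = common + 0.5 * rng.standard_normal(2000)
    print(dcca_coefficient(prices, loads, n=50))          # strongly positive for these coupled series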

  9. Detecting PM2.5's Correlations between Neighboring Cities Using a Time-Lagged Cross-Correlation Coefficient.

    PubMed

    Wang, Fang; Wang, Lin; Chen, Yuming

    2017-08-31

    In order to investigate the time-dependent cross-correlations of fine particulate (PM2.5) series among neighboring cities in Northern China, in this paper we propose a new cross-correlation coefficient, the time-lagged q-L dependent height cross-correlation coefficient (denoted by ρq(τ, L)), which incorporates the time-lag factor and the fluctuation amplitude information into the analogous height cross-correlation analysis coefficient. Numerical tests are performed to illustrate that the newly proposed coefficient ρq(τ, L) can be used to detect cross-correlations between two series with time lags and to identify the different ranges of fluctuations at which two series possess cross-correlations. Applying the new coefficient to analyze the time-dependent cross-correlations of PM2.5 series between Beijing and the three neighboring cities of Tianjin, Zhangjiakou, and Baoding, we find that time lags between the PM2.5 series with larger fluctuations are longer than those between PM2.5 series with smaller fluctuations. Our analysis also shows that cross-correlations between the PM2.5 series of two neighboring cities are significant and the time lags between two PM2.5 series of neighboring cities are significantly non-zero. These findings provide new scientific support for the view that air pollution in neighboring cities can affect one another not simultaneously but with a time lag.

  10. Revisiting the Robustness of PET-Based Textural Features in the Context of Multi-Centric Trials

    PubMed Central

    Bailly, Clément; Bodet-Milin, Caroline; Couespel, Solène; Necib, Hatem; Kraeber-Bodéré, Françoise; Ansquer, Catherine; Carlier, Thomas

    2016-01-01

    Purpose This study aimed to investigate the variability of textural features (TF) as a function of acquisition and reconstruction parameters within the context of multi-centric trials. Methods The robustness of 15 selected TFs was studied as a function of the number of iterations, the post-filtering level, input data noise, the reconstruction algorithm and the matrix size. A combination of several reconstruction and acquisition settings was devised to mimic multi-centric conditions. We retrospectively studied data from 26 patients enrolled in a diagnostic study that aimed to evaluate the performance of PET/CT 68Ga-DOTANOC in gastro-entero-pancreatic neuroendocrine tumors. Forty-one tumors were extracted and served as the database. The coefficient of variation (COV) or the absolute deviation (for the noise study) was derived and compared statistically with SUVmax and SUVmean results. Results The majority of investigated TFs can be used in a multi-centric context when each parameter is considered individually. The impact of voxel size and noise in the input data was predominant as only 4 TFs presented a high/intermediate robustness against SUV-based metrics (Entropy, Homogeneity, RP and ZP). When combining several reconstruction settings to mimic multi-centric conditions, most of the investigated TFs were robust enough against SUVmax except Correlation, Contrast, LGRE, LGZE and LZLGE. Conclusion Considering previously published results on either reproducibility or sensitivity against delineation approach and our findings, it is feasible to consider Homogeneity, Entropy, Dissimilarity, HGRE, HGZE and ZP as relevant for being used in multi-centric trials. PMID:27467882

  11. Estimating EQ-5D values from the Oswestry Disability Index and numeric rating scales for back and leg pain.

    PubMed

    Carreon, Leah Y; Bratcher, Kelly R; Das, Nandita; Nienhuis, Jacob B; Glassman, Steven D

    2014-04-15

    Cross-sectional cohort. The purpose of this study is to determine whether the EuroQOL-5D (EQ-5D) can be derived from commonly available low back disease-specific health-related quality of life measures. The Oswestry Disability Index (ODI) and numeric rating scales (0-10) for back pain (BP) and leg pain (LP) are widely used disease-specific measures in patients with lumbar degenerative disorders. Increasingly, the EQ-5D is being used as a measure of utility due to ease of administration and scoring. The EQ-5D, ODI, BP, and LP were prospectively collected in 14,544 patients seen in clinic for lumbar degenerative disorders. Pearson correlation coefficients for paired observations from multiple time points between ODI, BP, LP, and EQ-5D were determined. Regression modeling was done to compute the EQ-5D score from the ODI, BP, and LP. The mean age was 53.3 ± 16.4 years and 41% were male. Correlations between the EQ-5D and the ODI, BP, and LP were statistically significant (P < 0.0001) with correlation coefficients of -0.77, -0.50, and -0.57, respectively. The regression equation [0.97711 + (-0.00687 × ODI) + (-0.01488 × LP) + (-0.01008 × BP)] to predict the EQ-5D had an R² of 0.61 and a root mean square error of 0.149. The model using ODI alone had an R² of 0.57 and a root mean square error of 0.156. The model using the individual ODI items had an R² of 0.64 and a root mean square error of 0.143. The correlation coefficient between the observed and estimated EQ-5D score was 0.78. There was no statistically significant difference between the actual EQ-5D (0.553 ± 0.238) and the estimated EQ-5D score (0.553 ± 0.186) using the ODI, BP, and LP regression model. However, rounding off the coefficients to less than 5 decimal places produced less accurate results. Unlike previous studies showing a robust relationship between low back-specific measures and the Short Form-6D, a similar relationship was not seen between the ODI, BP, LP, and the EQ-5D. Thus, the EQ-5D cannot be accurately estimated from the ODI, BP, and LP. Level of Evidence: 2.
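
    For reference, the reported regression equation can be transcribed directly, bearing in mind the authors' conclusion that it does not estimate the EQ-5D accurately enough (R² = 0.61); the sketch below is only a literal transcription of the published coefficients.

    def estimate_eq5d(odi: float, back_pain: float, leg_pain: float) -> float:
        """Estimate the EQ-5D index from ODI (0-100) and 0-10 numeric rating
        scales for back pain (BP) and leg pain (LP), using the reported model."""
        return 0.97711 + (-0.00687 * odi) + (-0.01488 * leg_pain) + (-0.01008 * back_pain)

    # Example: ODI = 40, leg pain = 6, back pain = 7 -> roughly 0.54
    print(round(estimate_eq5d(40, back_pain=7, leg_pain=6), 3))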

  12. Limits of the memory coefficient in measuring correlated bursts

    NASA Astrophysics Data System (ADS)

    Jo, Hang-Hyun; Hiraoka, Takayuki

    2018-03-01

    Temporal inhomogeneities in event sequences of natural and social phenomena have been characterized in terms of interevent times and correlations between interevent times. The inhomogeneities of interevent times have been extensively studied, while the correlations between interevent times, often called correlated bursts, are far from being fully understood. For measuring the correlated bursts, two relevant approaches were suggested, i.e., memory coefficient and burst size distribution. Here a burst size denotes the number of events in a bursty train detected for a given time window. Empirical analyses have revealed that the larger memory coefficient tends to be associated with the heavier tail of the burst size distribution. In particular, empirical findings in human activities appear inconsistent, such that the memory coefficient is close to 0, while burst size distributions follow a power law. In order to comprehend these observations, by assuming the conditional independence between consecutive interevent times, we derive the analytical form of the memory coefficient as a function of parameters describing interevent time and burst size distributions. Our analytical result can explain the general tendency of the larger memory coefficient being associated with the heavier tail of burst size distribution. We also find that the apparently inconsistent observations in human activities are compatible with each other, indicating that the memory coefficient has limits to measure the correlated bursts.
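
    A minimal sketch of the memory coefficient discussed above, computed as the Pearson correlation between consecutive interevent times; the heavy-tailed sample data are illustrative only.

    import numpy as np

    def memory_coefficient(interevent_times):
        tau = np.asarray(interevent_times, dtype=float)
        t1, t2 = tau[:-1], tau[1:]          # consecutive interevent-time pairs
        return np.mean((t1 - t1.mean()) * (t2 - t2.mean())) / (t1.std() * t2.std())

    # Independent heavy-tailed interevent times give a memory coefficient near 0,
    # even though burst sizes can still be broadly distributed.
    rng = np.random.default_rng(0)
    tau = rng.pareto(2.0, size=10000) + 1.0
    print(memory_coefficient(tau))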

  13. Robust spike classification based on frequency domain neural waveform features.

    PubMed

    Yang, Chenhui; Yuan, Yuan; Si, Jennie

    2013-12-01

    We introduce a new spike classification algorithm based on frequency domain features of the spike snippets. The goal for the algorithm is to provide high classification accuracy, low false misclassification, ease of implementation, robustness to signal degradation, and objectivity in classification outcomes. In this paper, we propose a spike classification algorithm based on frequency domain features (CFDF). It makes use of frequency domain contents of the recorded neural waveforms for spike classification. The self-organizing map (SOM) is used as a tool to determine the cluster number intuitively and directly by viewing the SOM output map. After that, spike classification can be easily performed using clustering algorithms such as the k-Means. In conjunction with our previously developed multiscale correlation of wavelet coefficient (MCWC) spike detection algorithm, we show that the MCWC and CFDF detection and classification system is robust when tested on several sets of artificial and real neural waveforms. The CFDF is comparable to or outperforms some popular automatic spike classification algorithms with artificial and real neural data. The detection and classification of neural action potentials or neural spikes is an important step in single-unit-based neuroscientific studies and applications. After the detection of neural snippets potentially containing neural spikes, a robust classification algorithm is applied for the analysis of the snippets to (1) extract similar waveforms into one class for them to be considered coming from one unit, and to (2) remove noise snippets if they do not contain any features of an action potential. Usually, a snippet is a small 2 or 3 ms segment of the recorded waveform, and differences in neural action potentials can be subtle from one unit to another. Therefore, a robust, high performance classification system like the CFDF is necessary. In addition, the proposed algorithm does not require any assumptions on statistical properties of the noise and proves to be robust under noise contamination.
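
    The sketch below is a generic illustration of the idea, not the published CFDF implementation: magnitude spectra of spike snippets serve as features and are clustered with k-means, with the cluster number assumed known here (the paper selects it by inspecting a SOM).

    import numpy as np
    from sklearn.cluster import KMeans

    def frequency_features(snippets, n_bins=16):
        """snippets: array of shape (n_spikes, n_samples). Returns normalized
        magnitude spectra truncated to the first n_bins frequency bins."""
        spectra = np.abs(np.fft.rfft(snippets, axis=1))[:, :n_bins]
        return spectra / (np.linalg.norm(spectra, axis=1, keepdims=True) + 1e-12)

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 48)
    unit_a = np.sin(2 * np.pi * 3 * t)       # two hypothetical spike shapes
    unit_b = np.sin(2 * np.pi * 7 * t)
    snippets = np.vstack([unit_a + 0.2 * rng.standard_normal((100, 48)),
                          unit_b + 0.2 * rng.standard_normal((100, 48))])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(frequency_features(snippets))
    print(np.bincount(labels))               # roughly 100 snippets per cluster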

  14. Correlation Coefficients: Appropriate Use and Interpretation.

    PubMed

    Schober, Patrick; Boer, Christa; Schwarte, Lothar A

    2018-05-01

    Correlation in the broadest sense is a measure of an association between variables. In correlated data, the change in the magnitude of 1 variable is associated with a change in the magnitude of another variable, either in the same (positive correlation) or in the opposite (negative correlation) direction. Most often, the term correlation is used in the context of a linear relationship between 2 continuous variables and expressed as Pearson product-moment correlation. The Pearson correlation coefficient is typically used for jointly normally distributed data (data that follow a bivariate normal distribution). For nonnormally distributed continuous data, for ordinal data, or for data with relevant outliers, a Spearman rank correlation can be used as a measure of a monotonic association. Both correlation coefficients are scaled such that they range from -1 to +1, where 0 indicates that there is no linear or monotonic association, and the relationship gets stronger and ultimately approaches a straight line (Pearson correlation) or a constantly increasing or decreasing curve (Spearman correlation) as the coefficient approaches an absolute value of 1. Hypothesis tests and confidence intervals can be used to address the statistical significance of the results and to estimate the strength of the relationship in the population from which the data were sampled. The aim of this tutorial is to guide researchers and clinicians in the appropriate use and interpretation of correlation coefficients.
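
    A short illustration of the distinction drawn above, using SciPy: for a monotonic but nonlinear association, the Spearman coefficient stays near 1 while the Pearson coefficient is noticeably lower.

    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    rng = np.random.default_rng(42)
    x = rng.uniform(0, 5, 200)
    y = np.exp(x) + rng.normal(0, 5, 200)    # monotonic but strongly nonlinear

    r_pearson, _ = pearsonr(x, y)
    r_spearman, _ = spearmanr(x, y)
    print(f"Pearson r = {r_pearson:.2f}, Spearman rho = {r_spearman:.2f}")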

  15. Statistical Considerations in Choosing a Test Reliability Coefficient. ACT Research Report Series, 2012 (10)

    ERIC Educational Resources Information Center

    Woodruff, David; Wu, Yi-Fang

    2012-01-01

    The purpose of this paper is to illustrate alpha's robustness and usefulness, using actual and simulated educational test data. The sampling properties of alpha are compared with the sampling properties of several other reliability coefficients: Guttman's lambda[subscript 2], lambda[subscript 4], and lambda[subscript 6]; test-retest reliability;…

  16. Prediction of octanol-air partition coefficients for polychlorinated biphenyls (PCBs) using 3D-QSAR models.

    PubMed

    Chen, Ying; Cai, Xiaoyu; Jiang, Long; Li, Yu

    2016-02-01

    Based on the experimental data of octanol-air partition coefficients (KOA) for 19 polychlorinated biphenyl (PCB) congeners, two types of QSAR methods, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA), are used to establish 3D-QSAR models using the structural parameters as independent variables and using logKOA values as the dependent variable with the Sybyl software to predict the KOA values of the remaining 190 PCB congeners. The whole data set (19 compounds) was divided into a training set (15 compounds) for model generation and a test set (4 compounds) for model validation. As a result, the cross-validation correlation coefficient (q²) obtained by the CoMFA and CoMSIA models (shuffled 12 times) was in the range of 0.825-0.969 (>0.5), the correlation coefficient (r²) obtained was in the range of 0.957-1.000 (>0.9), and the SEP (standard error of prediction) of the test set was within the range of 0.070-0.617, indicating that the models were robust and predictive. Randomly selected from a set of models, CoMFA analysis revealed that the corresponding percentages of the variance explained by steric and electrostatic fields were 23.9% and 76.1%, respectively, while CoMSIA analysis by steric, electrostatic and hydrophobic fields were 0.6%, 92.6%, and 6.8%, respectively. The electrostatic field was determined as a primary factor governing the logKOA. The correlation analysis of the relationship between the number of Cl atoms and the average logKOA values of PCBs indicated that logKOA values gradually increased as the number of Cl atoms increased. Simultaneously, related studies on PCB detection in the Arctic and Antarctic areas revealed that higher logKOA values indicate a stronger PCB migration ability. From CoMFA and CoMSIA contour maps, logKOA decreased when substituents possessed electropositive groups at the 2-, 3-, 3'-, 5- and 6- positions, which could reduce the PCB migration ability. These results are expected to be beneficial in predicting logKOA values of PCB homologues and derivatives and in providing a theoretical foundation for further elucidation of the global migration behaviour of PCBs. Copyright © 2015 Elsevier Inc. All rights reserved.
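
    As a generic illustration of the cross-validated q² reported above (not the CoMFA/CoMSIA workflow, which runs on field descriptors inside Sybyl), the sketch below computes a leave-one-out q² = 1 - PRESS/SS for a plain linear model on hypothetical descriptors.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut

    rng = np.random.default_rng(0)
    X = rng.standard_normal((15, 3))                   # 15 training compounds, 3 descriptors
    log_koa = X @ np.array([1.5, -0.8, 0.3]) + 0.1 * rng.standard_normal(15)

    press = 0.0
    for train, test in LeaveOneOut().split(X):
        model = LinearRegression().fit(X[train], log_koa[train])
        press += ((log_koa[test] - model.predict(X[test])) ** 2).item()

    q2 = 1.0 - press / np.sum((log_koa - log_koa.mean()) ** 2)
    print(f"LOO q^2 = {q2:.3f}")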

  17. UV Spectrophotometric Method for Estimation of Polypeptide-K in Bulk and Tablet Dosage Forms

    NASA Astrophysics Data System (ADS)

    Kaur, P.; Singh, S. Kumar; Gulati, M.; Vaidya, Y.

    2016-01-01

    An analytical method for estimation of polypeptide-k using UV spectrophotometry has been developed and validated for bulk as well as tablet dosage form. The developed method was validated for linearity, precision, accuracy, specificity, robustness, detection, and quantitation limits. The method has shown good linearity over the range from 100.0 to 300.0 μg/ml with a correlation coefficient of 0.9943. The percentage recovery of 99.88% showed that the method was highly accurate. The precision demonstrated relative standard deviation of less than 2.0%. The LOD and LOQ of the method were found to be 4.4 and 13.33, respectively. The study established that the proposed method is reliable, specific, reproducible, and cost-effective for the determination of polypeptide-k.

  18. Local versus global knowledge in the Barabási-Albert scale-free network model.

    PubMed

    Gómez-Gardeñes, Jesús; Moreno, Yamir

    2004-03-01

    The scale-free model of Barabási and Albert (BA) gave rise to a burst of activity in the field of complex networks. In this paper, we revisit one of the main assumptions of the model, the preferential attachment (PA) rule. We study a model in which the PA rule is applied to a neighborhood of newly created nodes and thus no global knowledge of the network is assumed. We numerically show that global properties of the BA model such as the connectivity distribution and the average shortest path length are quite robust when there is some degree of local knowledge. In contrast, other properties such as the clustering coefficient and degree-degree correlations differ and approach the values measured for real-world networks.
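
    A toy sketch of the restricted-knowledge idea: preferential attachment applied only to a random sample of "known" nodes rather than the whole network. This is not the exact neighborhood-based rule of the published model, and the parameters are hypothetical.

    import random
    import networkx as nx

    def local_pa_network(n_nodes=2000, m=3, sample_size=30, seed=0):
        random.seed(seed)
        g = nx.complete_graph(m + 1)                   # small seed graph
        for new in range(m + 1, n_nodes):
            known = random.sample(list(g.nodes), min(sample_size, g.number_of_nodes()))
            weights = [g.degree(v) for v in known]     # PA restricted to the sampled nodes
            targets = set()
            while len(targets) < m:
                targets.add(random.choices(known, weights=weights, k=1)[0])
            g.add_node(new)
            g.add_edges_from((new, t) for t in targets)
        return g

    g = local_pa_network()
    print(nx.average_clustering(g), nx.degree_assortativity_coefficient(g))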

  19. Correlates of anxiety and depression among patients with type 2 diabetes mellitus.

    PubMed

    Balhara, Yatan Pal Singh; Sagar, Rajesh

    2011-07-01

    Research has established the relation between diabetes and depression. Both diabetes and anxiety/depression are independently associated with increased morbidity and mortality. The present study aims at assessing the prevalence of anxiety/depression among outpatients receiving treatment for type 2 diabetes. The study was conducted in the endocrinology outpatient department of an urban tertiary care center. The instruments used included a semi-structured questionnaire, HbA1c levels, fasting blood glucose and postprandial blood glucose, the Brief Patient Health Questionnaire, and the Hospital Anxiety and Depression Scale (HADS). Analysis was carried out using SPSS version 16.0. Pearson's correlation coefficients were calculated, and ANOVA was carried out for between-group comparisons. There was a significant correlation between the HADS-Anxiety scale and Body Mass Index (BMI), with a correlation coefficient of 0.34 (P = 0.008). Also, a significant correlation existed between the HADS-Depression scale and BMI (correlation coefficient, 0.36; P = 0.004). A significant correlation was observed between the duration of daily physical exercise and HADS-Anxiety scores (correlation coefficient, -0.25; P = 0.04). HADS-Anxiety scores were found to be related to HbA1c levels (correlation coefficient, 0.41; P = 0.03) and postprandial blood glucose levels (correlation coefficient, 0.51; P = 0.02). Monitoring of biochemical parameters such as HbA1c, postprandial blood glucose levels, and BMI could serve as a guide to the development of anxiety in these patients. Also, physical exercise seems to have a protective effect on anxiety in those with type 2 diabetes mellitus.

  20. Prediction of friction coefficients for gases

    NASA Technical Reports Server (NTRS)

    Taylor, M. F.

    1969-01-01

    Empirical relations are used for correlating laminar and turbulent friction coefficients for gases, with large variations in the physical properties, flowing through smooth tubes. These relations have been used to correlate friction coefficients for hydrogen, helium, nitrogen, carbon dioxide and air.

  1. QSPR modeling of octanol/water partition coefficient of antineoplastic agents by balance of correlations.

    PubMed

    Toropov, Andrey A; Toropova, Alla P; Raska, Ivan; Benfenati, Emilio

    2010-04-01

    Three different splits of 55 antineoplastic agents into a subtraining set (n = 22), a calibration set (n = 21), and a test set (n = 12) have been examined. Using the correlation balance of SMILES-based optimal descriptors, quite satisfactory models for the octanol/water partition coefficient were obtained for all three splits. The correlation balance is the optimization of a one-variable model with a target function that provides both maximal values of the correlation coefficient for the subtraining and calibration sets and a minimal difference between the above-mentioned correlation coefficients. Thus, the calibration set serves as a preliminary test set. Copyright (c) 2009 Elsevier Masson SAS. All rights reserved.

  2. Lack of robustness of textural measures obtained from 3D brain tumor MRIs impose a need for standardization

    PubMed Central

    Pérez-Beteta, Julián; Martínez-González, Alicia; Martino, Juan; Velasquez, Carlos; Arana, Estanislao; Pérez-García, Víctor M.

    2017-01-01

    Purpose Textural measures have been widely explored as imaging biomarkers in cancer. However, their robustness under dynamic range and spatial resolution changes in brain 3D magnetic resonance images (MRI) has not been assessed. The aim of this work was to study potential variations of textural measures due to changes in MRI protocols. Materials and methods Twenty patients harboring glioblastoma with pretreatment 3D T1-weighted MRIs were included in the study. Four different spatial resolution combinations and three dynamic ranges were studied for each patient. Sixteen three-dimensional textural heterogeneity measures were computed for each patient and configuration including co-occurrence matrices (CM) features and run-length matrices (RLM) features. The coefficient of variation was used to assess the robustness of the measures in two series of experiments corresponding to (i) changing the dynamic range and (ii) changing the matrix size. Results No textural measures were robust under dynamic range changes. Entropy was the only textural feature robust under spatial resolution changes (coefficient of variation under 10% in all cases). Conclusion Textural measures of three-dimensional brain tumor images are not robust neither under dynamic range nor under matrix size changes. Standards should be harmonized to use textural features as imaging biomarkers in radiomic-based studies. The implications of this work go beyond the specific tumor type studied here and pose the need for standardization in textural feature calculation of oncological images. PMID:28586353

  3. How expressions of forgiveness, purpose, and religiosity relate to emotional intelligence and self-concept in urban fifth-grade students.

    PubMed

    Van Dyke, Cydney J; Elias, Maurice J

    2008-10-01

    This study investigated how the tendency to express forgiveness, purpose, and religiosity in a free-response essay relates to emotional intelligence and self-concept in 89 5th-graders (mean age = 10.84 years) from an urban public school district in New Jersey. Readers coded essays for expressions of forgiveness, purpose, and religiosity using originally developed rubrics. These data were compared with self-reports on scales of emotional intelligence and self-concept. It was hypothesized that expressions of the predictor variables would correlate positively with emotional intelligence and self-concept. In contrast to expressions of purpose, which were common among students, expressions of forgiveness and religiosity were infrequent. Furthermore, forgiveness was not significantly related to either criterion variable; purpose was positively related to self-concept (but not to emotional intelligence); and religiosity was negatively related to emotional intelligence (but not to self-concept). Correlational analyses by gender revealed a possible trend toward more robust relationships being observed among females than males; however, the differences between the correlation coefficients observed among males and females failed to reach statistical significance. Several of the study's unanticipated findings suggest the need for further empirical work investigating the psychological correlates of these constructs in children. PsycINFO Database Record 2009 APA.

  4. "Seeing" electroencephalogram through the skull: imaging prefrontal cortex with fast optical signal

    NASA Astrophysics Data System (ADS)

    Medvedev, Andrei V.; Kainerstorfer, Jana M.; Borisov, Sergey V.; Gandjbakhche, Amir H.; Vanmeter, John

    2010-11-01

    Near-infrared spectroscopy is a novel imaging technique potentially sensitive to both brain hemodynamics (slow signal) and neuronal activity (fast optical signal, FOS). The big challenge of measuring FOS noninvasively lies in the presumably low signal-to-noise ratio. Thus, detectability of the FOS has been controversially discussed. We present reliable detection of FOS from 11 individuals concurrently with electroencephalogram (EEG) during a Go-NoGo task. Probes were placed bilaterally over prefrontal cortex. Independent component analysis (ICA) was used for artifact removal. Correlation coefficient in the best correlated FOS-EEG ICA pairs was highly significant (p < 10−8), and event-related optical signal (EROS) was found in all subjects. Several EROS components were similar to the event-related potential (ERP) components. The most robust "optical N200" at t = 225 ms coincided with the N200 ERP; both signals showed significant difference between targets and nontargets, and their timing correlated with subject's reaction time. Correlation between FOS and EEG even in single trials provides further evidence that at least some FOS components "reflect" electrical brain processes directly. The data provide evidence for the early involvement of prefrontal cortex in rapid object recognition. EROS is highly localized and can provide cost-effective imaging tools for cortical mapping of cognitive processes.

  5. “Seeing” electroencephalogram through the skull: imaging prefrontal cortex with fast optical signal

    PubMed Central

    Medvedev, Andrei V.; Kainerstorfer, Jana M.; Borisov, Sergey V.; Gandjbakhche, Amir H.; VanMeter, John

    2010-01-01

    Near-infrared spectroscopy is a novel imaging technique potentially sensitive to both brain hemodynamics (slow signal) and neuronal activity (fast optical signal, FOS). The big challenge of measuring FOS noninvasively lies in the presumably low signal-to-noise ratio. Thus, detectability of the FOS has been controversially discussed. We present reliable detection of FOS from 11 individuals concurrently with electroencephalogram (EEG) during a Go-NoGo task. Probes were placed bilaterally over prefrontal cortex. Independent component analysis (ICA) was used for artifact removal. Correlation coefficient in the best correlated FOS–EEG ICA pairs was highly significant (p < 10−8), and event-related optical signal (EROS) was found in all subjects. Several EROS components were similar to the event-related potential (ERP) components. The most robust “optical N200” at t = 225 ms coincided with the N200 ERP; both signals showed significant difference between targets and nontargets, and their timing correlated with subject’s reaction time. Correlation between FOS and EEG even in single trials provides further evidence that at least some FOS components “reflect” electrical brain processes directly. The data provide evidence for the early involvement of prefrontal cortex in rapid object recognition. EROS is highly localized and can provide cost-effective imaging tools for cortical mapping of cognitive processes. PMID:21198150

  6. Dynamics analysis of SIR epidemic model with correlation coefficients and clustering coefficient in networks.

    PubMed

    Zhang, Juping; Yang, Chan; Jin, Zhen; Li, Jia

    2018-07-14

    In this paper, the correlation coefficients between node states are used as dynamic variables, and we construct SIR epidemic dynamic models with correlation coefficients by using the pair approximation method in static and dynamic networks, respectively. Taking the clustering coefficient of the network into account, we analytically investigate the existence and the local asymptotic stability of each equilibrium of these models and derive threshold values for the prevalence of diseases. Additionally, we obtain two equivalent epidemic thresholds in dynamic networks, which are compared with the results of the mean field equations. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. A novel coefficient for detecting and quantifying asymmetry of California electricity market based on asymmetric detrended cross-correlation analysis

    NASA Astrophysics Data System (ADS)

    Wang, Fang

    2016-06-01

    In order to detect and quantify asymmetry of two time series, a novel cross-correlation coefficient is proposed based on recent asymmetric detrended cross-correlation analysis (A-DXA), which we called the A-DXA coefficient. The A-DXA coefficient, as an important extension of the DXA coefficient ρDXA, contains two directional asymmetric cross-correlated indexes, describing upwards and downwards asymmetric cross-correlations, respectively. By using the information of the directional covariance function of the two time series and the directional variance function of each series itself, instead of the power law between the covariance function and the time scale, the proposed A-DXA coefficient can well detect asymmetry between the two series no matter whether the cross-correlation is significant or not. Applying the proposed A-DXA coefficient to the asymmetry of the California electricity market, we found that the asymmetry between the prices and loads is not significant for daily average data in the 1999 market (before the electricity crisis) but extremely significant in the 2000 market (during the crisis). To further uncover the difference of asymmetry between the years 1999 and 2000, a modified H statistic (MH) and a ΔMH statistic are proposed. One contribution of the present work is the finding that high MH values calculated from hourly data occur in the majority of months in the 2000 market. Another important conclusion is that downward cross-correlation dominates throughout 1999, whereas upward cross-correlation dominates throughout 2000.

  8. Evaluation of the Correlation Coefficient of Polyethylene Glycol Treated and Direct Prolactin Results and Comparability with Different Assay System Results.

    PubMed

    Pal, Shyamali

    2017-12-01

    The presence of macroprolactin is a significant cause of elevated prolactin, resulting in misdiagnosis on all automated systems. Polyethylene glycol (PEG) pretreatment is the preventive process, but it carries the risk of losing a fraction of bioactive prolactin. Surprisingly, PEG-treated EQAS and IQAS samples run on the Cobas e 411 were found to correlate with the direct results of at least three immunoassay systems, and treated and untreated Cobas e 411 results are related by a correlation coefficient. EQAS, IQAS, and patient samples were compared to assess the trueness of this correlation factor. The study of patient results established that the correlation coefficient is also valid for very small prolactin concentrations. EQAS, IQAS, and 150 patient samples were treated with PEG, and prolactin results of treated and untreated samples were obtained on the Roche Cobas e 411. Twenty-five treated patient results were compared with direct results on the Advia Centaur, Architect I, and Access 2 systems. The correlation coefficient was obtained from the trend line of the treated and untreated results, and the two-tailed p-value was obtained from the regression coefficient (r) and the sample size. The correlation coefficient was in the range 0.761-0.771; the reverse correlation range was 1.289-1.301. The r value between the two sets of calculated results was 0.995, and the two-tailed p-value was zero, supporting rejection of the null hypothesis. The z-score of EQAS does not always assure authenticity of results; PEG precipitation is correlated by the factor 0.761 even at very small concentrations. Abbreviations: GFC, gel filtration chromatography; PEG, polyethylene glycol; EQAS, external quality assurance system; M-PRL, macroprolactin; PRL, prolactin; ECLIA, electro-chemiluminescence immunoassay; CLIA, clinical laboratory improvement amendments; IQAS, internal quality assurance system; r, regression coefficient.

  9. Reproducibility of graph metrics of human brain structural networks.

    PubMed

    Duda, Jeffrey T; Cook, Philip A; Gee, James C

    2014-01-01

    Recent interest in human brain connectivity has led to the application of graph theoretical analysis to human brain structural networks, in particular white matter connectivity inferred from diffusion imaging and fiber tractography. While these methods have been used to study a variety of patient populations, there has been less examination of the reproducibility of these methods. A number of tractography algorithms exist and many of these are known to be sensitive to user-selected parameters. The methods used to derive a connectivity matrix from fiber tractography output may also influence the resulting graph metrics. Here we examine how these algorithm and parameter choices influence the reproducibility of proposed graph metrics on a publicly available test-retest dataset consisting of 21 healthy adults. The dice coefficient is used to examine topological similarity of constant density subgraphs both within and between subjects. Seven graph metrics are examined here: mean clustering coefficient, characteristic path length, largest connected component size, assortativity, global efficiency, local efficiency, and rich club coefficient. The reproducibility of these network summary measures is examined using the intraclass correlation coefficient (ICC). Graph curves are created by treating the graph metrics as functions of a parameter such as graph density. Functional data analysis techniques are used to examine differences in graph measures that result from the choice of fiber tracking algorithm. The graph metrics consistently showed good levels of reproducibility as measured with ICC, with the exception of some instability at low graph density levels. The global and local efficiency measures were the most robust to the choice of fiber tracking algorithm.
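
    Several of the graph summary measures named above can be computed with networkx once a connectivity matrix has been thresholded to a fixed edge density, as in the sketch below; the connectivity matrix is synthetic, and the ICC-based reproducibility analysis itself is not reproduced.

    import numpy as np
    import networkx as nx

    def threshold_to_density(weights, density=0.2):
        """Keep the strongest edges of a symmetric connectivity matrix so the
        resulting binary graph has roughly the requested edge density."""
        n = weights.shape[0]
        iu = np.triu_indices(n, k=1)
        k = int(density * len(iu[0]))
        cutoff = np.sort(weights[iu])[::-1][k - 1]
        g = nx.Graph()
        g.add_nodes_from(range(n))
        g.add_edges_from((int(i), int(j)) for i, j in zip(*iu) if weights[i, j] >= cutoff)
        return g

    rng = np.random.default_rng(0)
    w = rng.random((90, 90))
    w = (w + w.T) / 2                                  # hypothetical connectivity matrix
    g = threshold_to_density(w, density=0.2)

    print("clustering:", nx.average_clustering(g))
    print("assortativity:", nx.degree_assortativity_coefficient(g))
    print("global efficiency:", nx.global_efficiency(g))
    print("local efficiency:", nx.local_efficiency(g))
    print("largest component:", len(max(nx.connected_components(g), key=len)))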

  10. Absolute measurement of cerebral optical coefficients, hemoglobin concentration and oxygen saturation in old and young adults with near-infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Hallacoglu, Bertan; Sassaroli, Angelo; Wysocki, Michael; Guerrero-Berroa, Elizabeth; Schnaider Beeri, Michal; Haroutunian, Vahram; Shaul, Merav; Rosenberg, Irwin H.; Troen, Aron M.; Fantini, Sergio

    2012-08-01

    We present near-infrared spectroscopy measurement of absolute cerebral hemoglobin concentration and saturation in a large sample of 36 healthy elderly (mean age, 85±6 years) and 19 young adults (mean age, 28±4 years). Non-invasive measurements were obtained on the forehead using a commercially available multi-distance frequency-domain system and analyzed using a diffusion theory model for a semi-infinite, homogeneous medium with semi-infinite boundary conditions. Our study included repeat measurements, taken five months apart, on 16 elderly volunteers that demonstrate intra-subject reproducibility of the absolute measurements with cross-correlation coefficients of 0.9 for absorption coefficient (μa), oxy-hemoglobin concentration ([HbO2]), and total hemoglobin concentration ([HbT]), 0.7 for deoxy-hemoglobin concentration ([Hb]), 0.8 for hemoglobin oxygen saturation (StO2), and 0.7 for reduced scattering coefficient (μ's). We found significant differences between the two age groups. Compared to young subjects, elderly subjects had lower cerebral [HbO2], [Hb], [HbT], and StO2 by 10±4 μM, 4±3 μM, 14±5 μM, and 6%±5%, respectively. Our results demonstrate the reliability and robustness of multi-distance near-infrared spectroscopy measurements based on a homogeneous model in the human forehead on a large sample of human subjects. Absolute, non-invasive optical measurements on the brain, such as those presented here, can significantly advance the development of NIRS technology as a tool for monitoring resting/basal cerebral perfusion, hemodynamics, oxygenation, and metabolism.

  11. Absolute measurement of cerebral optical coefficients, hemoglobin concentration and oxygen saturation in old and young adults with near-infrared spectroscopy.

    PubMed

    Hallacoglu, Bertan; Sassaroli, Angelo; Wysocki, Michael; Guerrero-Berroa, Elizabeth; Schnaider Beeri, Michal; Haroutunian, Vahram; Shaul, Merav; Rosenberg, Irwin H; Troen, Aron M; Fantini, Sergio

    2012-08-01

    We present near-infrared spectroscopy measurement of absolute cerebral hemoglobin concentration and saturation in a large sample of 36 healthy elderly (mean age, 85 ± 6 years) and 19 young adults (mean age, 28 ± 4 years). Non-invasive measurements were obtained on the forehead using a commercially available multi-distance frequency-domain system and analyzed using a diffusion theory model for a semi-infinite, homogeneous medium with semi-infinite boundary conditions. Our study included repeat measurements, taken five months apart, on 16 elderly volunteers that demonstrate intra-subject reproducibility of the absolute measurements with cross-correlation coefficients of 0.9 for absorption coefficient (μa), oxy-hemoglobin concentration ([HbO2]), and total hemoglobin concentration ([HbT]), 0.7 for deoxy-hemoglobin concentration ([Hb]), 0.8 for hemoglobin oxygen saturation (StO2), and 0.7 for reduced scattering coefficient (μ's). We found significant differences between the two age groups. Compared to young subjects, elderly subjects had lower cerebral [HbO2], [Hb], [HbT], and StO2 by 10 ± 4 μM, 4 ± 3 μM, 14 ± 5 μM, and 6%±5%, respectively. Our results demonstrate the reliability and robustness of multi-distance near-infrared spectroscopy measurements based on a homogeneous model in the human forehead on a large sample of human subjects. Absolute, non-invasive optical measurements on the brain, such as those presented here, can significantly advance the development of NIRS technology as a tool for monitoring resting/basal cerebral perfusion, hemodynamics, oxygenation, and metabolism.

  12. A global method for identifying dependences between helio-geophysical and biological series by filtering the precedents (outliers)

    NASA Astrophysics Data System (ADS)

    Ozheredov, V. A.; Breus, T. K.; Gurfinkel, Yu. I.; Matveeva, T. A.

    2014-12-01

    A new approach to finding the dependence between heliophysical and meteorological factors and physiological parameters is considered, based on the preliminary filtering of precedents (outliers). The sought-after dependence is masked by extraneous influences which cannot be taken into account. Therefore, the typically calculated correlation between the external-influence (x) and physiology (y) parameters is extremely low and does not allow their interdependence to be conclusively proved. A robust method for removing the precedents (outliers) from the database is proposed that is based on the intelligent sorting of the polynomial curves of possible dependences y(x), followed by filtering out the precedents which are far away from y(x) and optimizing the coefficient of nonlinear correlation over the regular, i.e., remaining, precedents. This optimization problem is shown to be a search for a maximum in the absence of a gradient and requires the use of a genetic algorithm based on the Gray code. The relationships between the various medical and biological parameters and characteristics of space and terrestrial weather are obtained and verified using the cross-validation method. It is shown that, by filtering out no more than 20% of precedents, it is possible to obtain a nonlinear correlation coefficient of no less than 0.5. A comparison of the proposed precedent (outlier) filtering method with the least-squares method (LSM) for determining the optimal polynomial, using multiple independent tests (Monte Carlo method) of models as close as possible to real dependences, showed that the LSM performs considerably worse than the proposed method.

  13. The role of remote wind forcing in the subinertial current variability in the central and northern parts of the South Brazil Bight

    NASA Astrophysics Data System (ADS)

    Dottori, Marcelo; Castro, Belmiro Mendes

    2018-06-01

    Data analysis of continental shelf currents and coastal sea level, together with the application of a semi-analytical model, are used to estimate the importance of remote wind forcing on the subinertial variability of the current in the central and northern areas of the South Brazil Bight. Results from both the data analysis and from the semi-analytical model are robust in showing subinertial variability that propagates along-shelf leaving the coast to the left in accordance with theoretical studies of Continental Shelf Waves (CSW). Both the subinertial variability observed in along-shelf currents and sea level oscillations present different propagation speeds for the narrow northern part of the SBB ( 6-7 m/s) and the wide central SBB region ( 11 m/s), those estimates being in agreement with the modeled CSW propagation speed. On the inner and middle shelf, observed along-shelf subinertial currents show higher correlation coefficients with the winds located southward and earlier in time than with the local wind at the current meter mooring position and at the time of measurement. The inclusion of the remote (located southwestward) wind forcing improves the prediction of the subinertial currents when compared to the currents forced only by the local wind, since the along-shelf-modeled currents present correlation coefficients with observed along-shelf currents up to 20% higher on the inner and middle shelf when the remote wind is included. For most of the outer shelf, on the other hand, this is not observed since usually, the correlation between the currents and the synoptic winds is not statistically significant.

  14. The IMPACT-III (HR) questionnaire: a valid measure of health-related quality of life in Croatian children with inflammatory bowel disease.

    PubMed

    Abdovic, Slaven; Mocic Pavic, Ana; Milosevic, Milan; Persic, Mladen; Senecic-Cala, Irena; Kolacek, Sanja

    2013-12-01

    To assess the reliability and validity of IMPACT-III (HR), a disease-specific, health-related quality of life instrument in Croatian children with inflammatory bowel disease. In a multicenter study, 104 children participated in a validation study of IMPACT-III (HR) cross-culturally adapted for Croatia. Factor analysis was used to determine the optimal domain structure for this cohort, analysis of Cronbach's alpha coefficients to test internal reliability, ANOVA to assess discriminant validity, and correlation with the Pediatric Quality of Life Inventory, Version 4.0 (PedsQL) using Pearson correlation coefficients to assess concurrent validity. Cronbach's alpha for the IMPACT-III (HR) total score was 0.92. The most robust factor solution was a 5-domain structure: Symptoms, Concerns, Socializing, Body Image, and Worry about Stool, all of which demonstrated good internal reliability (α=0.60-0.89), but two items were dropped to achieve this. Discriminant validity was demonstrated by significant differences (P<0.001) in mean IMPACT-III (HR) scores between quiescent and mild or moderate-severe disease activity groups for total (148 vs. 139 or 125) and the following factor scores: Symptoms (84 vs. 71 or 61), Socializing (91 vs. 83 or 76), and Worry about Stool (significant only between quiescent and moderate-severe groups, 90 vs. 62, respectively). Concurrent validity of IMPACT-III (HR) with PedsQL showed significant correlation, which was strongest when similar domains were compared. IMPACT-III (HR) appears to be a useful tool to measure health-related quality of life in Croatian children with Crohn's disease and ulcerative colitis. Copyright © 2012 European Crohn's and Colitis Organisation. Published by Elsevier B.V. All rights reserved.

  15. The relationship between affective state and the rhythmicity of activity in bipolar disorder.

    PubMed

    Gonzalez, Robert; Tamminga, Carol A; Tohen, Mauricio; Suppes, Trisha

    2014-04-01

    The aim of this study was to test the relationships between mood state and rhythm disturbances as measured via actigraphy in bipolar disorder by assessing the correlations between manic and depressive symptoms as measured via Young Mania Rating Scale (YMRS) and 30-item Inventory for Depressive Symptomatology, Clinician-Rated (IDS-C-30) scores and the actigraphic measurements of rhythm, the 24-hour autocorrelation coefficient and circadian quotient. The research was conducted at the University of Texas Southwestern Medical Center at Dallas from February 2, 2009, to March 30, 2010. 42 patients with a DSM-IV-TR diagnosis of bipolar I disorder were included in the study. YMRS and the IDS-C-30 were used to determine symptom severity. Subjects wore the actigraph continuously for 7 days. The 24-hour autocorrelation coefficient was used as an indicator of overall rhythmicity. The circadian quotient was used to characterize the strength of a circadian rhythm. A greater severity of manic symptoms correlated with a lower degree of rhythmicity and less robust rhythms of locomotor activity as indicated by lower 24-hour autocorrelation (r = -0.3406, P = .03) and circadian quotient (r = -0.5485, P = .0002) variables, respectively. No relationship was noted between the degree of depression and 24-hour autocorrelation scores (r = -0.1190, P = .45) or circadian quotient (r = 0.0083, P = .96). Correlation was noted between the 24-hour autocorrelation and circadian quotient scores (r = 0.6347, P < .0001). These results support the notion that circadian rhythm disturbances are associated with bipolar disorder and that these disturbances may be associated with clinical signatures of the disorder. Further assessment of rhythm disturbances in bipolar disorder is warranted. © Copyright 2014 Physicians Postgraduate Press, Inc.
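
    A minimal sketch of the 24-hour autocorrelation coefficient used above: the correlation of an activity series with itself shifted by 24 hours. The epoch length and simulated activity pattern are assumptions for illustration.

    import numpy as np

    def autocorrelation_24h(activity, epochs_per_day):
        a = np.asarray(activity, dtype=float)
        lead, lag = a[:-epochs_per_day], a[epochs_per_day:]
        return np.corrcoef(lead, lag)[0, 1]

    epochs_per_day = 24 * 60                           # 1-minute epochs
    t = np.arange(7 * epochs_per_day)                  # 7 days of recording
    rng = np.random.default_rng(0)
    # a roughly 24-h rhythmic activity pattern plus noise
    activity = 100 * (1 + np.sin(2 * np.pi * t / epochs_per_day)) + 30 * rng.standard_normal(t.size)
    print(autocorrelation_24h(activity, epochs_per_day))   # high (about 0.85) for this strong daily rhythm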

  16. The role of remote wind forcing in the subinertial current variability in the central and northern parts of the South Brazil Bight

    NASA Astrophysics Data System (ADS)

    Dottori, Marcelo; Castro, Belmiro Mendes

    2018-05-01

    Data analysis of continental shelf currents and coastal sea level, together with the application of a semi-analytical model, are used to estimate the importance of remote wind forcing on the subinertial variability of the current in the central and northern areas of the South Brazil Bight. Results from both the data analysis and from the semi-analytical model are robust in showing subinertial variability that propagates along-shelf leaving the coast to the left in accordance with theoretical studies of Continental Shelf Waves (CSW). Both the subinertial variability observed in along-shelf currents and sea level oscillations present different propagation speeds for the narrow northern part of the SBB ( 6-7 m/s) and the wide central SBB region ( 11 m/s), those estimates being in agreement with the modeled CSW propagation speed. On the inner and middle shelf, observed along-shelf subinertial currents show higher correlation coefficients with the winds located southward and earlier in time than with the local wind at the current meter mooring position and at the time of measurement. The inclusion of the remote (located southwestward) wind forcing improves the prediction of the subinertial currents when compared to the currents forced only by the local wind, since the along-shelf-modeled currents present correlation coefficients with observed along-shelf currents up to 20% higher on the inner and middle shelf when the remote wind is included. For most of the outer shelf, on the other hand, this is not observed since usually, the correlation between the currents and the synoptic winds is not statistically significant.

  17. Image-adaptive and robust digital wavelet-domain watermarking for images

    NASA Astrophysics Data System (ADS)

    Zhao, Yi; Zhang, Liping

    2018-03-01

    We propose a new frequency-domain, wavelet-based watermarking technique. The key idea of our scheme is twofold: a multi-tier representation of the image and odd-even quantization for embedding and extracting the watermark. Because many complementary watermarks need to be hidden, the designed watermark image is image-adaptive. The meaningful and complementary watermark images were embedded into the original (host) image by odd-even quantization of selected coefficients, chosen from the detail wavelet coefficients of the original image whose magnitudes are larger than their corresponding Just Noticeable Difference (JND) thresholds. The tests show good robustness against best-known attacks such as noise addition, image compression, median filtering, and clipping, as well as geometric transforms.
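
    A minimal sketch of odd-even (parity) quantization for embedding and extracting watermark bits in selected coefficients, with a fixed step size standing in for the JND threshold; this illustrates the quantization idea only, not the paper's full wavelet-domain scheme.

    import numpy as np

    def embed_bits(coeffs, bits, delta=8.0):
        """Quantize each coefficient to the nearest multiple of delta whose
        parity (even/odd) encodes the corresponding watermark bit."""
        k = np.round(coeffs / delta).astype(int)
        mismatch = (np.abs(k) % 2) != bits                 # parity disagrees with the bit
        step = np.where(coeffs >= k * delta, 1, -1)        # move toward the nearer valid multiple
        k = k + np.where(mismatch, step, 0)
        return k * delta

    def extract_bits(coeffs, delta=8.0):
        return (np.abs(np.round(coeffs / delta).astype(int)) % 2).astype(int)

    rng = np.random.default_rng(0)
    coeffs = rng.normal(0, 40, size=64)     # stand-in for detail wavelet coefficients
    bits = rng.integers(0, 2, size=64)

    marked = embed_bits(coeffs, bits)
    noisy = marked + rng.normal(0, 1.0, size=64)   # mild attack
    print("bit errors:", int(np.sum(extract_bits(noisy) != bits)))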

  18. Anthropometry of Women of the U.S. Army--1977. Report Number 4. Correlation Coefficients

    DTIC Science & Technology

    1980-02-01

    Technical Report NATICK/TR-80/016: Anthropometry of Women of the U.S. Army--1977, Report No. 4, Correlation Coefficients. Descriptors: anthropometry, surveys, correlation coefficients, measurements, U.S. Army, body size, military personnel, equations, sizes.

  19. Robust Coefficients Alpha and Omega and Confidence Intervals with Outlying Observations and Missing Data Methods and Software

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Yuan, Ke-Hai

    2016-01-01

    Cronbach's coefficient alpha is a widely used reliability measure in social, behavioral, and education sciences. It is reported in nearly every study that involves measuring a construct through multiple items. With non-tau-equivalent items, McDonald's omega has been used as a popular alternative to alpha in the literature. Traditional estimation…

  20. Robust Coefficients Alpha and Omega and Confidence Intervals with Outlying Observations and Missing Data: Methods and Software

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Yuan, Ke-Hai

    2016-01-01

    Cronbach's coefficient alpha is a widely used reliability measure in social, behavioral, and education sciences. It is reported in nearly every study that involves measuring a construct through multiple items. With non-tau-equivalent items, McDonald's omega has been used as a popular alternative to alpha in the literature. Traditional estimation…

  1. Real-Time Tracking of Selective Auditory Attention From M/EEG: A Bayesian Filtering Approach

    PubMed Central

    Miran, Sina; Akram, Sahar; Sheikhattar, Alireza; Simon, Jonathan Z.; Zhang, Tao; Babadi, Behtash

    2018-01-01

    Humans are able to identify and track a target speaker amid a cacophony of acoustic interference, an ability which is often referred to as the cocktail party phenomenon. Results from several decades of studying this phenomenon have culminated in recent years in various promising attempts to decode the attentional state of a listener in a competing-speaker environment from non-invasive neuroimaging recordings such as magnetoencephalography (MEG) and electroencephalography (EEG). To this end, most existing approaches compute correlation-based measures by either regressing the features of each speech stream to the M/EEG channels (the decoding approach) or vice versa (the encoding approach). To produce robust results, these procedures require multiple trials for training purposes. Also, their decoding accuracy drops significantly when operating at high temporal resolutions. Thus, they are not well-suited for emerging real-time applications such as smart hearing aid devices or brain-computer interface systems, where training data might be limited and high temporal resolutions are desired. In this paper, we close this gap by developing an algorithmic pipeline for real-time decoding of the attentional state. Our proposed framework consists of three main modules: (1) Real-time and robust estimation of encoding or decoding coefficients, achieved by sparse adaptive filtering, (2) Extracting reliable markers of the attentional state, and thereby generalizing the widely-used correlation-based measures thereof, and (3) Devising a near real-time state-space estimator that translates the noisy and variable attention markers to robust and statistically interpretable estimates of the attentional state with minimal delay. Our proposed algorithms integrate various techniques including forgetting factor-based adaptive filtering, ℓ1-regularization, forward-backward splitting algorithms, fixed-lag smoothing, and Expectation Maximization. We validate the performance of our proposed framework using comprehensive simulations as well as application to experimentally acquired M/EEG data. Our results reveal that the proposed real-time algorithms perform nearly as accurately as the existing state-of-the-art offline techniques, while providing a significant degree of adaptivity, statistical robustness, and computational savings. PMID:29765298
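
    Module (1) above rests on adaptive estimation of the encoding/decoding coefficients. The sketch below shows a plain forgetting-factor recursive least-squares (RLS) update as a minimal stand-in; the authors' pipeline additionally uses ℓ1-regularization and a state-space estimator of the attentional state.

    import numpy as np

    class ForgettingRLS:
        def __init__(self, dim, forgetting=0.98, delta=100.0):
            self.w = np.zeros(dim)           # current coefficient estimate
            self.P = delta * np.eye(dim)     # inverse correlation matrix
            self.lam = forgetting

        def update(self, x, d):
            """One step with feature vector x (dim,) and desired response d."""
            Px = self.P @ x
            k = Px / (self.lam + x @ Px)     # gain vector
            err = d - self.w @ x             # a priori error
            self.w = self.w + k * err
            self.P = (self.P - np.outer(k, Px)) / self.lam
            return err

    rng = np.random.default_rng(0)
    true_w = np.array([0.5, -1.0, 2.0])
    rls = ForgettingRLS(dim=3)
    for _ in range(500):
        x = rng.standard_normal(3)
        d = true_w @ x + 0.05 * rng.standard_normal()
        rls.update(x, d)
    print(np.round(rls.w, 2))                # approaches [0.5, -1.0, 2.0]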

  2. Real-Time Tracking of Selective Auditory Attention From M/EEG: A Bayesian Filtering Approach.

    PubMed

    Miran, Sina; Akram, Sahar; Sheikhattar, Alireza; Simon, Jonathan Z; Zhang, Tao; Babadi, Behtash

    2018-01-01

    Humans are able to identify and track a target speaker amid a cacophony of acoustic interference, an ability which is often referred to as the cocktail party phenomenon. Results from several decades of studying this phenomenon have culminated in recent years in various promising attempts to decode the attentional state of a listener in a competing-speaker environment from non-invasive neuroimaging recordings such as magnetoencephalography (MEG) and electroencephalography (EEG). To this end, most existing approaches compute correlation-based measures by either regressing the features of each speech stream to the M/EEG channels (the decoding approach) or vice versa (the encoding approach). To produce robust results, these procedures require multiple trials for training purposes. Also, their decoding accuracy drops significantly when operating at high temporal resolutions. Thus, they are not well-suited for emerging real-time applications such as smart hearing aid devices or brain-computer interface systems, where training data might be limited and high temporal resolutions are desired. In this paper, we close this gap by developing an algorithmic pipeline for real-time decoding of the attentional state. Our proposed framework consists of three main modules: (1) Real-time and robust estimation of encoding or decoding coefficients, achieved by sparse adaptive filtering, (2) Extracting reliable markers of the attentional state, and thereby generalizing the widely-used correlation-based measures thereof, and (3) Devising a near real-time state-space estimator that translates the noisy and variable attention markers to robust and statistically interpretable estimates of the attentional state with minimal delay. Our proposed algorithms integrate various techniques including forgetting factor-based adaptive filtering, ℓ 1 -regularization, forward-backward splitting algorithms, fixed-lag smoothing, and Expectation Maximization. We validate the performance of our proposed framework using comprehensive simulations as well as application to experimentally acquired M/EEG data. Our results reveal that the proposed real-time algorithms perform nearly as accurately as the existing state-of-the-art offline techniques, while providing a significant degree of adaptivity, statistical robustness, and computational savings.

  3. DCCA cross-correlation coefficients reveals the change of both synchronization and oscillation in EEG of Alzheimer disease patients

    NASA Astrophysics Data System (ADS)

    Chen, Yingyuan; Cai, Lihui; Wang, Ruofan; Song, Zhenxi; Deng, Bin; Wang, Jiang; Yu, Haitao

    2018-01-01

    Alzheimer's disease (AD) is a degenerative disorder of the nervous system that affects mainly the older population. Recent studies show that the EEG of AD patients can be characterized by EEG slowing, enhanced complexity of the EEG signals, and EEG synchrony. In order to examine neural synchrony at multiple scales, and to find a biomarker that helps detect AD in diagnosis, detrended cross-correlation analysis (DCCA) of EEG signals is applied in this paper. Several parameters were extracted to examine synchronization: DCCA coefficients over the whole brain, DCCA coefficients at a specific scale, the maximum DCCA coefficient over the span of all time scales, and the scale at which that maximum occurs. The results show that DCCA coefficients tend to increase as the scale increases and to decrease as the electrode distance increases. Comparing DCCA coefficients in AD patients with healthy controls, a decrease of synchronization over the whole brain and a larger scale corresponding to the maximum correlation are found in AD patients. The change in the max-correlation scale may relate to the slowing of oscillatory activities. A linear combination of the maximum DCCA coefficient and the max-correlation scale reaches a classification accuracy of 90%. From these results, it is reasonable to conclude that the DCCA coefficient reveals the change of both oscillation and synchrony in AD, and is thus a powerful tool to differentiate AD patients from healthy elderly individuals.
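
    For readers unfamiliar with the DCCA coefficient used above, the sketch below computes Zebende's rho_DCCA at a single scale: integrate the demeaned series, remove a linear trend inside non-overlapping boxes, and take the ratio of the detrended covariance to the detrended variances. It is a generic illustration on synthetic data, not the authors' EEG analysis code, and all names and parameter values are assumptions.

```python
import numpy as np

def rho_dcca(x, y, scale):
    """Detrended cross-correlation coefficient rho_DCCA at one time scale."""
    x = np.asarray(x, float); y = np.asarray(y, float)
    X = np.cumsum(x - x.mean())          # integrated (profile) series
    Y = np.cumsum(y - y.mean())
    n_boxes = len(x) // scale
    t = np.arange(scale)
    f_xy = f_xx = f_yy = 0.0
    for b in range(n_boxes):
        seg = slice(b * scale, (b + 1) * scale)
        # residuals after removing the local linear trend in each box
        rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
        ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

# toy usage: two noisy signals sharing a slow common component;
# rho_DCCA increases with scale for this pair
rng = np.random.default_rng(1)
common = np.cumsum(rng.standard_normal(4000))
a = common + 15 * rng.standard_normal(4000)
b = common + 15 * rng.standard_normal(4000)
print([round(rho_dcca(a, b, s), 2) for s in (8, 32, 128)])
```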

  4. Changes in government spending on healthcare and population mortality in the European union, 1995–2010: a cross-sectional ecological study

    PubMed Central

    Watkins, Johnathan; Atun, Rifat; Williams, Callum; Zeltner, Thomas; Maruthappu, Mahiben

    2015-01-01

    Objective: Economic measures such as unemployment and gross domestic product are correlated with changes in health outcomes. We aimed to examine the effects of changes in government healthcare spending, an increasingly important measure given constrained government budgets in several European Union countries. Design: Multivariate regression analysis was used to assess the effect of changes in healthcare spending as a proportion of total government expenditure, government healthcare spending as a proportion of gross domestic product, and government healthcare spending measured in purchasing power parity per capita, on five mortality indicators. Additional variables were controlled for to ensure robustness of the data. Lag analyses of one to five years were conducted. Setting and participants: European Union countries, 1995–2010. Main outcome measures: Neonatal mortality, postneonatal mortality, one to five years of age mortality, under five years of age mortality, adult male mortality, and adult female mortality. Results: A 1% decrease in government healthcare spending was associated with a significant increase in all mortality metrics: neonatal mortality (coefficient −0.1217, p = 0.0001), postneonatal mortality (coefficient −0.0499, p = 0.0018), one to five years of age mortality (coefficient −0.0185, p = 0.0002), under five years of age mortality (coefficient −0.1897, p = 0.0003), adult male mortality (coefficient −2.5398, p = 0.0000) and adult female mortality (coefficient −1.4492, p = 0.0000). A 1% decrease in healthcare spending, measured both as a proportion of gross domestic product and in purchasing power parity, was associated with significant increases (p < 0.05) in all metrics. Five years after the 1% decrease in healthcare spending, significant increases (p < 0.05) continued to be observed in all mortality metrics. Conclusions: Decreased government healthcare spending is associated with increased population mortality in the short and long term. Policy interventions implemented in response to the financial crisis may be associated with worsening population health. PMID:26510733

  5. Changes in government spending on healthcare and population mortality in the European union, 1995-2010: a cross-sectional ecological study.

    PubMed

    Budhdeo, Sanjay; Watkins, Johnathan; Atun, Rifat; Williams, Callum; Zeltner, Thomas; Maruthappu, Mahiben

    2015-12-01

    Economic measures such as unemployment and gross domestic product are correlated with changes in health outcomes. We aimed to examine the effects of changes in government healthcare spending, an increasingly important measure given constrained government budgets in several European Union countries. Multivariate regression analysis was used to assess the effect of changes in healthcare spending as a proportion of total government expenditure, government healthcare spending as a proportion of gross domestic product, and government healthcare spending measured in purchasing power parity per capita, on five mortality indicators. Additional variables were controlled for to ensure robustness of the data. Lag analyses of one to five years were conducted. The setting was European Union countries, 1995-2010. The outcome measures were neonatal mortality, postneonatal mortality, one to five years of age mortality, under five years of age mortality, adult male mortality, and adult female mortality. A 1% decrease in government healthcare spending was associated with a significant increase in all mortality metrics: neonatal mortality (coefficient -0.1217, p = 0.0001), postneonatal mortality (coefficient -0.0499, p = 0.0018), one to five years of age mortality (coefficient -0.0185, p = 0.0002), under five years of age mortality (coefficient -0.1897, p = 0.0003), adult male mortality (coefficient -2.5398, p = 0.0000) and adult female mortality (coefficient -1.4492, p = 0.0000). A 1% decrease in healthcare spending, measured both as a proportion of gross domestic product and in purchasing power parity, was associated with significant increases (p < 0.05) in all metrics. Five years after the 1% decrease in healthcare spending, significant increases (p < 0.05) continued to be observed in all mortality metrics. Decreased government healthcare spending is associated with increased population mortality in the short and long term. Policy interventions implemented in response to the financial crisis may be associated with worsening population health. © The Royal Society of Medicine.

  6. Statistics corner: A guide to appropriate use of correlation coefficient in medical research.

    PubMed

    Mukaka, M M

    2012-09-01

    Correlation is a statistical method used to assess a possible linear association between two continuous variables. It is simple both to calculate and to interpret. However, misuse of correlation is so common among researchers that some statisticians have wished that the method had never been devised at all. The aim of this article is to provide a guide to the appropriate use of correlation in medical research and to highlight some of its misuse. Examples of applications of the correlation coefficient are provided using data from statistical simulations as well as real data. A rule of thumb for interpreting the size of a correlation coefficient is also provided.

  7. [Correlation of molecular weight and nanofiltration mass transfer coefficient of phenolic acid composition from Salvia miltiorrhiza].

    PubMed

    Li, Cun-Yu; Wu, Xin; Gu, Jia-Mei; Li, Hong-Yang; Peng, Guo-Ping

    2018-04-01

    Based on the molecular sieving and solution-diffusion effects in nanofiltration separation, the correlation between initial concentration and mass transfer coefficient of three typical phenolic acids from Salvia miltiorrhiza was fitted to analyze the relationship among mass transfer coefficient, molecular weight and concentration. The experiment showed a linear relationship between operating pressure and membrane flux. Meanwhile, the membrane flux gradually decayed with increasing solute concentration. On the basis of the molecular sieving and solution-diffusion effects, the mass transfer coefficient and initial concentration of the three phenolic acids showed a power-function relationship, and the regression coefficients were all greater than 0.9. The mass transfer coefficient and molecular weight of the three phenolic acids were negatively correlated, and the order from high to low is protocatechualdehyde > rosmarinic acid > salvianolic acid B. The separation mechanism of nanofiltration for phenolic acids was further clarified through analysis of the correlation between molecular weight and nanofiltration mass transfer coefficient. The findings provide references for nanofiltration separation, especially for traditional Chinese medicine containing phenolic acids. Copyright© by the Chinese Pharmaceutical Association.

  8. Biostatistics Series Module 6: Correlation and Linear Regression.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Correlation and linear regression are the most commonly used techniques for quantifying the association between two numeric variables. Correlation quantifies the strength of the linear relationship between paired variables, expressing this as a correlation coefficient. If both variables x and y are normally distributed, we calculate Pearson's correlation coefficient ( r ). If normality assumption is not met for one or both variables in a correlation analysis, a rank correlation coefficient, such as Spearman's rho (ρ) may be calculated. A hypothesis test of correlation tests whether the linear relationship between the two variables holds in the underlying population, in which case it returns a P < 0.05. A 95% confidence interval of the correlation coefficient can also be calculated for an idea of the correlation in the population. The value r 2 denotes the proportion of the variability of the dependent variable y that can be attributed to its linear relation with the independent variable x and is called the coefficient of determination. Linear regression is a technique that attempts to link two correlated variables x and y in the form of a mathematical equation ( y = a + bx ), such that given the value of one variable the other may be predicted. In general, the method of least squares is applied to obtain the equation of the regression line. Correlation and linear regression analysis are based on certain assumptions pertaining to the data sets. If these assumptions are not met, misleading conclusions may be drawn. The first assumption is that of linear relationship between the two variables. A scatter plot is essential before embarking on any correlation-regression analysis to show that this is indeed the case. Outliers or clustering within data sets can distort the correlation coefficient value. Finally, it is vital to remember that though strong correlation can be a pointer toward causation, the two are not synonymous.
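
    The workflow described in this module (inspect a scatter plot, compute Pearson's r or Spearman's rho, and fit the least-squares line y = a + bx) can be reproduced in a few lines. The snippet below is an illustrative example with simulated data, not code from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = rng.normal(50, 10, size=100)             # e.g., a predictor variable
y = 2.0 + 0.5 * x + rng.normal(0, 5, 100)    # linearly related outcome plus noise
# in practice, plot x against y first to check that a linear model is sensible

r, p_r = stats.pearsonr(x, y)                # assumes both variables are roughly normal
rho, p_rho = stats.spearmanr(x, y)           # rank-based alternative if normality fails
fit = stats.linregress(x, y)                 # least-squares line y = a + b*x

print(f"Pearson r = {r:.2f} (p = {p_r:.3g}), r^2 = {r**2:.2f}")
print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3g})")
print(f"regression: y = {fit.intercept:.2f} + {fit.slope:.2f} x")
```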

  9. Biostatistics Series Module 6: Correlation and Linear Regression

    PubMed Central

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Correlation and linear regression are the most commonly used techniques for quantifying the association between two numeric variables. Correlation quantifies the strength of the linear relationship between paired variables, expressing this as a correlation coefficient. If both variables x and y are normally distributed, we calculate Pearson's correlation coefficient (r). If normality assumption is not met for one or both variables in a correlation analysis, a rank correlation coefficient, such as Spearman's rho (ρ) may be calculated. A hypothesis test of correlation tests whether the linear relationship between the two variables holds in the underlying population, in which case it returns a P < 0.05. A 95% confidence interval of the correlation coefficient can also be calculated for an idea of the correlation in the population. The value r2 denotes the proportion of the variability of the dependent variable y that can be attributed to its linear relation with the independent variable x and is called the coefficient of determination. Linear regression is a technique that attempts to link two correlated variables x and y in the form of a mathematical equation (y = a + bx), such that given the value of one variable the other may be predicted. In general, the method of least squares is applied to obtain the equation of the regression line. Correlation and linear regression analysis are based on certain assumptions pertaining to the data sets. If these assumptions are not met, misleading conclusions may be drawn. The first assumption is that of linear relationship between the two variables. A scatter plot is essential before embarking on any correlation-regression analysis to show that this is indeed the case. Outliers or clustering within data sets can distort the correlation coefficient value. Finally, it is vital to remember that though strong correlation can be a pointer toward causation, the two are not synonymous. PMID:27904175

  10. Comparison of global optimization approaches for robust calibration of hydrologic model parameters

    NASA Astrophysics Data System (ADS)

    Jung, I. W.

    2015-12-01

    Robustness of the calibrated parameters of hydrologic models is necessary to provide reliable predictions of future watershed behavior under varying climate conditions. This study investigated calibration performance according to the length of the calibration period, the objective function, the hydrologic model structure and the optimization method. To do this, the combination of three global optimization methods (i.e. SCE-UA, Micro-GA, and DREAM) and four hydrologic models (i.e. SAC-SMA, GR4J, HBV, and PRMS) was tested with different calibration periods and objective functions. Our results showed that the three global optimization methods provided similar calibration performance under different calibration periods, objective functions, and hydrologic models. However, using the index of agreement, the normalized root mean square error, or the Nash-Sutcliffe efficiency as the objective function gave better performance than using the correlation coefficient or percent bias. Calibration performance for calibration periods ranging from one to seven years was hard to generalize because the four hydrologic models differ in complexity and different years carry different information content in the hydrological observations. Acknowledgements This research was supported by a grant (14AWMP-B082564-01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
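
    The objective functions compared in this study (index of agreement, normalized RMSE, Nash-Sutcliffe efficiency, correlation coefficient, and percent bias) are straightforward to compute from observed and simulated flows. Below is a sketch using common textbook definitions (sign and normalization conventions vary across references), applied to synthetic data rather than the study's catchments.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency (1 is perfect; below 0 is worse than the mean)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def index_of_agreement(obs, sim):
    """Willmott's index of agreement d, between 0 and 1."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    denom = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - np.sum((sim - obs) ** 2) / denom

def nrmse(obs, sim):
    """Root mean square error normalised by the observed standard deviation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((sim - obs) ** 2)) / obs.std()

def pbias(obs, sim):
    """Percent bias; the sign convention differs between references."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

# toy usage with synthetic "observed" and "simulated" streamflow
rng = np.random.default_rng(0)
q_obs = np.exp(rng.normal(1.0, 0.8, 365))
q_sim = q_obs * (1 + 0.1 * rng.standard_normal(365)) + 0.2
print(round(nse(q_obs, q_sim), 3), round(index_of_agreement(q_obs, q_sim), 3),
      round(nrmse(q_obs, q_sim), 3), round(pbias(q_obs, q_sim), 2),
      round(np.corrcoef(q_obs, q_sim)[0, 1], 3))
```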

  11. A comparison of confidence interval methods for the intraclass correlation coefficient in community-based cluster randomization trials with a binary outcome.

    PubMed

    Braschel, Melissa C; Svec, Ivana; Darlington, Gerarda A; Donner, Allan

    2016-04-01

    Many investigators rely on previously published point estimates of the intraclass correlation coefficient rather than on their associated confidence intervals to determine the required size of a newly planned cluster randomized trial. Although confidence interval methods for the intraclass correlation coefficient that can be applied to community-based trials have been developed for a continuous outcome variable, fewer methods exist for a binary outcome variable. The aim of this study is to evaluate confidence interval methods for the intraclass correlation coefficient applied to binary outcomes in community intervention trials enrolling a small number of large clusters. Existing methods for confidence interval construction are examined and compared to a new ad hoc approach based on dividing clusters into a large number of smaller sub-clusters and subsequently applying existing methods to the resulting data. Monte Carlo simulation is used to assess the width and coverage of confidence intervals for the intraclass correlation coefficient based on Smith's large sample approximation of the standard error of the one-way analysis of variance estimator, an inverted modified Wald test for the Fleiss-Cuzick estimator, and intervals constructed using a bootstrap-t applied to a variance-stabilizing transformation of the intraclass correlation coefficient estimate. In addition, a new approach is applied in which clusters are randomly divided into a large number of smaller sub-clusters with the same methods applied to these data (with the exception of the bootstrap-t interval, which assumes large cluster sizes). These methods are also applied to a cluster randomized trial on adolescent tobacco use for illustration. When applied to a binary outcome variable in a small number of large clusters, existing confidence interval methods for the intraclass correlation coefficient provide poor coverage. However, confidence intervals constructed using the new approach combined with Smith's method provide nominal or close to nominal coverage when the intraclass correlation coefficient is small (<0.05), as is the case in most community intervention trials. This study concludes that when a binary outcome variable is measured in a small number of large clusters, confidence intervals for the intraclass correlation coefficient may be constructed by dividing existing clusters into sub-clusters (e.g. groups of 5) and using Smith's method. The resulting confidence intervals provide nominal or close to nominal coverage across a wide range of parameters when the intraclass correlation coefficient is small (<0.05). Application of this method should provide investigators with a better understanding of the uncertainty associated with a point estimator of the intraclass correlation coefficient used for determining the sample size needed for a newly designed community-based trial. © The Author(s) 2015.
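
    As background to the interval methods compared above, the snippet below sketches the standard one-way ANOVA (moment) estimator of the intraclass correlation coefficient for a clustered binary outcome. It does not reproduce Smith's standard-error interval, the Fleiss-Cuzick estimator, or the bootstrap-t interval from the study; the function and the toy data are illustrative assumptions.

```python
import numpy as np

def icc_anova(cluster_ids, y):
    """One-way ANOVA (moment) estimator of the intraclass correlation.

    Works for a binary (0/1) or continuous outcome y grouped by cluster.
    """
    cluster_ids = np.asarray(cluster_ids)
    y = np.asarray(y, float)
    clusters = np.unique(cluster_ids)
    k, N = len(clusters), len(y)
    n_i = np.array([np.sum(cluster_ids == c) for c in clusters])
    means = np.array([y[cluster_ids == c].mean() for c in clusters])
    grand = y.mean()
    msb = np.sum(n_i * (means - grand) ** 2) / (k - 1)                       # between-cluster mean square
    msw = sum(np.sum((y[cluster_ids == c] - m) ** 2)
              for c, m in zip(clusters, means)) / (N - k)                    # within-cluster mean square
    n0 = (N - np.sum(n_i ** 2) / N) / (k - 1)                                # "average" cluster size
    return (msb - msw) / (msb + (n0 - 1) * msw)

# toy usage: 10 large clusters with a binary outcome and a small true ICC
rng = np.random.default_rng(3)
rows, outcome = [], []
for c in range(10):
    p_c = np.clip(0.3 + rng.normal(0, 0.05), 0, 1)   # cluster-specific prevalence
    rows += [c] * 200
    outcome += list(rng.binomial(1, p_c, 200))
print(round(icc_anova(rows, outcome), 4))
```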

  12. Snapping Sharks, Maddening Mindreaders, and Interactive Images: Teaching Correlation.

    ERIC Educational Resources Information Center

    Mitchell, Mark L.

    Understanding correlation coefficients is difficult for students. A free computer program that helps introductory psychology students distinguish between positive and negative correlation, and which also teaches them to understand the differences between correlation coefficients of different size is described in this paper. The program is…

  13. Predicting carcinogenicity of diverse chemicals using probabilistic neural network modeling approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Kunwar P., E-mail: kpsingh_52@yahoo.com; Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001; Gupta, Shikha

    Robust global models capable of discriminating positive and non-positive carcinogens, and predicting carcinogenic potency of chemicals in rodents, were developed. The dataset of 834 structurally diverse chemicals extracted from the Carcinogenic Potency Database (CPDB) was used, which contained 466 positive and 368 non-positive carcinogens. Twelve non-quantum mechanical molecular descriptors were derived. Structural diversity of the chemicals and nonlinearity in the data were evaluated using the Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using internal and external procedures employing a wide series of statistical checks. The PNN constructed using five descriptors rendered a classification accuracy of 92.09% in the complete rat data. The PNN model rendered classification accuracies of 91.77%, 80.70% and 92.08% in mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency with a mean squared error (MSE) of 0.44 in the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficients and MSE of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest wide applicability of the inter-species models in predicting carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by the optimal PNN model. Figure (b) shows generalization and predictive abilities of the interspecies GRNN model to predict the carcinogenic potency of diverse chemicals. - Highlights: • Global robust models constructed for carcinogenicity prediction of diverse chemicals. • Tanimoto/BDS test revealed structural diversity of chemicals and nonlinearity in data. • PNN/GRNN successfully predicted carcinogenicity/carcinogenic potency of chemicals. • Developed interspecies PNN/GRNN models for carcinogenicity prediction. • Proposed models can be used as tools to predict carcinogenicity of new chemicals.

  14. Correlation-based perfusion mapping using time-resolved MR angiography: A feasibility study for patients with suspicions of steno-occlusive craniocervical arteries.

    PubMed

    Nam, Yoonho; Jang, Jinhee; Park, Sonya Youngju; Choi, Hyun Seok; Jung, So-Lyung; Ahn, Kook-Jin; Kim, Bum-Soo

    2018-05-22

    To explore the feasibility of using correlation-based time-delay (CTD) maps produced from time-resolved MR angiography (TRMRA) to diagnose perfusion abnormalities in patients suspected to have steno-occlusive lesions in the craniocervical arteries. Twenty-seven patients who were suspected to have steno-occlusive lesions in the craniocervical arteries underwent both TRMRA and brain single-photon emission computed tomography (SPECT). TRMRA was performed on the supra-aortic area after intravenous injection of a 0.03 mmol/kg gadolinium-based contrast agent. Time-to-peak (TTP) maps and CTD maps of the brain were automatically generated from TRMRA data, and their quality was assessed. Detection of perfusion abnormalities was compared between CTD maps and the time-series maximal intensity projection (MIP) images from TRMRA and TTP maps. Correlation coefficients between quantitative changes in SPECT and parametric maps for the abnormal perfusion areas were calculated. The CTD maps were of significantly superior quality than TTP maps (p < 0.01). For perfusion abnormality detection, CTD maps (kappa 0.84, 95% confidence interval [CI] 0.67-1.00) showed better agreement with SPECT than TTP maps (0.66, 0.46-0.85). For perfusion deficit detection, CTD maps showed higher accuracy (85.2%, 95% CI 66.3-95.8) than MIP images (66.7%, 46-83.5), with marginal significance (p = 0.07). In abnormal perfusion areas, correlation coefficients between SPECT and CTD (r = 0.74, 95% CI 0.34-0.91) were higher than those between SPECT and TTP (r = 0.66, 0.20-0.88). CTD maps generated from TRMRA were of high quality and offered good diagnostic performance for detecting perfusion abnormalities associated with steno-occlusive arterial lesions in the craniocervical area. • Generation of perfusion parametric maps from time-resolved MR angiography is clinically useful. • Correlation-based delay maps can be used to detect perfusion abnormalities associated with steno-occlusive craniocervical arteries. • Estimation of correlation-based delay is robust for low signal-to-noise 4D MR data.
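
    A simple analogue of the correlation-based time-delay (CTD) idea is to cross-correlate each voxel time course with a reference curve over a range of lags and keep the lag of maximum correlation. The sketch below does exactly that on synthetic bolus-like curves; it is not the authors' TRMRA processing chain, and all names and parameters are assumptions.

```python
import numpy as np

def correlation_time_delay(signals, reference, max_lag=10):
    """For each signal, find the lag (in frames) that maximises its Pearson
    correlation with a reference time course, plus the peak correlation."""
    signals = np.asarray(signals, float)
    reference = np.asarray(reference, float)
    lags = np.arange(-max_lag, max_lag + 1)
    delays = np.zeros(len(signals), int)
    peaks = np.zeros(len(signals))
    for i, s in enumerate(signals):
        corrs = []
        for lag in lags:
            if lag >= 0:                     # signal delayed relative to reference
                a, b = s[lag:], reference[:len(reference) - lag]
            else:                            # signal leads the reference
                a, b = s[:lag], reference[-lag:]
            corrs.append(np.corrcoef(a, b)[0, 1])
        corrs = np.array(corrs)
        j = np.argmax(corrs)
        delays[i], peaks[i] = lags[j], corrs[j]
    return delays, peaks

# toy usage: three "voxels" that are shifted, noisy copies of a bolus-like curve
t = np.arange(120)
bolus = np.exp(-0.5 * ((t - 40) / 8.0) ** 2)
rng = np.random.default_rng(7)
vox = np.stack([np.roll(bolus, d) + 0.05 * rng.standard_normal(120) for d in (0, 3, 6)])
print(correlation_time_delay(vox, bolus, max_lag=10)[0])   # approximately [0, 3, 6]
```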

  15. Surrogate endpoints for overall survival in metastatic melanoma: a meta-analysis of randomised controlled trials

    PubMed Central

    Flaherty, Keith T; Hennig, Michael; Lee, Sandra J; Ascierto, Paolo A; Dummer, Reinhard; Eggermont, Alexander M M; Hauschild, Axel; Kefford, Richard; Kirkwood, John M; Long, Georgina V; Lorigan, Paul; Mackensen, Andreas; McArthur, Grant; O'Day, Steven; Patel, Poulam M; Robert, Caroline; Schadendorf, Dirk

    2015-01-01

    Summary Background Recent phase 3 trials have shown an overall survival benefit in metastatic melanoma. We aimed to assess whether progression-free survival (PFS) could be regarded as a reliable surrogate for overall survival through a meta-analysis of randomised trials. Methods We systematically reviewed randomised trials comparing treatment regimens in metastatic melanoma that included dacarbazine as the control arm, and which reported both PFS and overall survival with a standard hazard ratio (HR). We correlated HRs for overall survival and PFS, weighted by sample size or by precision of the HR estimate, assuming fixed and random effects. We did sensitivity analyses according to presence of crossover, trial size, and dacarbazine dose. Findings After screening 1649 reports and meeting abstracts published before Sept 8, 2013, we identified 12 eligible randomised trials that enrolled 4416 patients with metastatic melanoma. Irrespective of weighting strategy, we noted a strong correlation between the treatment effects for PFS and overall survival, which seemed independent of treatment type. Pearson correlation coefficients were 0.71 (95% CI 0.29–0.90) with a random-effects assumption, 0.85 (0.59–0.95) with a fixed-effects assumption, and 0.89 (0.68–0.97) with sample-size weighting. For nine trials without crossover, the correlation coefficient was 0.96 (0.81–0.99), which decreased to 0.93 (0.74–0.98) when two additional trials with less than 50% crossover were included. Inclusion of mature follow-up data after at least 50% crossover (in vemurafenib and dabrafenib phase 3 trials) weakened the PFS to overall survival correlation (0.55, 0.03–0.84). Inclusion of trials with no or little crossover with the random-effects assumption yielded a conservative statement of the PFS to overall survival correlation of 0.85 (0.51–0.96). Interpretation PFS can be regarded as a robust surrogate for overall survival in dacarbazine-controlled randomised trials of metastatic melanoma; we postulate that this association will hold as treatment standards evolve and are adopted as the control arm in future trials. Funding None. PMID:24485879

  16. Relationship between indices of obesity obtained by anthropometry and dual-energy X-ray absorptiometry: The Fourth and Fifth Korea National Health and Nutrition Examination Survey (KNHANES IV and V, 2008-2011).

    PubMed

    Kim, Seul Gi; Ko, Ki dong; Hwang, In Cheol; Suh, Heuy Sun; Kay, Shelley; Caterson, Ian; Kim, Kyoung Kon

    2015-01-01

    Body mass index (BMI), waist circumference (WC), and even dual-energy X-ray absorptiometry (DXA) are used for obesity diagnosis. However, it is not known which DXA-derived index of obesity correlates best with BMI and/or WC, and it is not clear whether such an index is accurate. The aim of this study is to show the relationship between anthropometric measurements (BMI, WC) and body fat indices from DXA and to determine which DXA indices are strongly related to BMI and WC. This study was based on data obtained from the Fourth and Fifth Korea National Health and Nutrition Examination Survey (KNHANES IV-V). DXA measurements were performed on survey subjects over 10 years old from July 2008 through to May 2011. Of these, 18 198 individuals, aged 19 years and older for whom DXA data were available, were included. Weighted Pearson correlation coefficients (r) were calculated among indices, according to sex, age group and menopause, and the coefficients were compared with each other. BMI correlates most with trunk body fat mass in kg (r=0.831) and then with total body fat in kg (r=0.774, P<0.00043 for difference of r). In the older age group, BMI correlates with total body fat mass (r=0.822) better than with trunk fat mass (r=0.817, P<0.00043). WC correlates most with trunk body fat mass in both genders and all age groups (0.804≤r≤0.906). Correlations of BMI (r=0.645 for men, 0.689 for women) and WC (r=0.678 for men, 0.634 for women) to body fat percentages (%) were less robust than those to body fat mass. BMI and WC reflect trunk and total body fat in kg more than body fat percentage derived by DXA. Copyright © 2014 Asian Oceanian Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.
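
    The weighted Pearson correlation used for the survey-weighted analysis can be written directly from the weighted covariance and variances. The snippet below is a generic sketch with made-up weights and data, not the KNHANES analysis code.

```python
import numpy as np

def weighted_pearson(x, y, w):
    """Pearson correlation with observation weights (e.g., survey weights)."""
    x, y, w = (np.asarray(a, float) for a in (x, y, w))
    w = w / w.sum()
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    vx, vy = np.sum(w * (x - mx) ** 2), np.sum(w * (y - my) ** 2)
    return cov / np.sqrt(vx * vy)

# toy usage: BMI-like and trunk-fat-like variables with unequal sampling weights
rng = np.random.default_rng(5)
bmi = rng.normal(24, 3, 1000)
trunk_fat = 0.9 * bmi + rng.normal(0, 1.5, 1000)
weights = rng.uniform(0.5, 2.0, 1000)
print(round(weighted_pearson(bmi, trunk_fat, weights), 3))
```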

  17. 40 CFR 53.34 - Test procedure for methods for PM10 and Class I methods for PM2.5.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... linear regression parameters (slope, intercept, and correlation coefficient) describing the relationship... correlation coefficient. (2) To pass the test for comparability, the slope, intercept, and correlation...

  18. Functional connectivity and structural covariance between regions of interest can be measured more accurately using multivariate distance correlation.

    PubMed

    Geerligs, Linda; Cam-Can; Henson, Richard N

    2016-07-15

    Studies of brain-wide functional connectivity or structural covariance typically use measures like the Pearson correlation coefficient, applied to data that have been averaged across voxels within regions of interest (ROIs). However, averaging across voxels may result in biased connectivity estimates when there is inhomogeneity within those ROIs, e.g., sub-regions that exhibit different patterns of functional connectivity or structural covariance. Here, we propose a new measure based on "distance correlation"; a test of multivariate dependence of high dimensional vectors, which allows for both linear and non-linear dependencies. We used simulations to show how distance correlation out-performs Pearson correlation in the face of inhomogeneous ROIs. To evaluate this new measure on real data, we use resting-state fMRI scans and T1 structural scans from 2 sessions on each of 214 participants from the Cambridge Centre for Ageing & Neuroscience (Cam-CAN) project. Pearson correlation and distance correlation showed similar average connectivity patterns, for both functional connectivity and structural covariance. Nevertheless, distance correlation was shown to be 1) more reliable across sessions, 2) more similar across participants, and 3) more robust to different sets of ROIs. Moreover, we found that the similarity between functional connectivity and structural covariance estimates was higher for distance correlation compared to Pearson correlation. We also explored the relative effects of different preprocessing options and motion artefacts on functional connectivity. Because distance correlation is easy to implement and fast to compute, it is a promising alternative to Pearson correlations for investigating ROI-based brain-wide connectivity patterns, for functional as well as structural data. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
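
    The distance correlation measure proposed here is easy to implement from double-centred pairwise distance matrices. The sketch below follows the standard Székely-Rizzo sample definition and contrasts it with Pearson correlation applied to ROI-averaged signals for a toy non-linear dependence; it is illustrative only and not the authors' implementation.

```python
import numpy as np

def distance_correlation(X, Y):
    """Sample distance correlation between two multivariate samples.

    X : (n, p) and Y : (n, q) arrays with one row per observation
    (e.g., the voxel patterns of two ROIs at each time point).
    """
    X = np.asarray(X, float)
    Y = np.asarray(Y, float)

    def centred_dist(Z):
        d = np.sqrt(((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1))   # pairwise Euclidean distances
        return d - d.mean(0, keepdims=True) - d.mean(1, keepdims=True) + d.mean()

    A, B = centred_dist(X), centred_dist(Y)
    dcov2_xy = (A * B).mean()
    dcov2_xx = (A * A).mean()
    dcov2_yy = (B * B).mean()
    return np.sqrt(dcov2_xy) / (dcov2_xx * dcov2_yy) ** 0.25

# toy usage: a non-linear dependence that Pearson on ROI means largely misses
rng = np.random.default_rng(11)
t = rng.uniform(-1, 1, 200)
roi1 = np.column_stack([t, rng.normal(0, 0.1, 200)])          # 2-voxel "ROI"
roi2 = np.column_stack([t ** 2, rng.normal(0, 0.1, 200)])     # depends on roi1 non-linearly
print(round(distance_correlation(roi1, roi2), 2),
      round(abs(np.corrcoef(roi1.mean(1), roi2.mean(1))[0, 1]), 2))
```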

  19. [Prediction of soil adsorption coefficients of organic compounds in a wide range of soil types by soil column liquid chromatography].

    PubMed

    Guo, Rongbo; Chen, Jiping; Zhang, Qing; Wu, Wenzhong; Liang, Xinmiao

    2004-01-01

    Using the methanol-water mixtures as mobile phases of soil column liquid chromatography (SCLC), prediction of soil adsorption coefficients (K(d)) by SCLC was validated in a wide range of soil types. The correlations between the retention factors measured by SCLC and soil adsorption coefficients measured by batch experiments were studied for five soils with different properties, i.e., Eurosoil 1#, 2#, 3#, 4# and 5#. The results show that good correlations existed between the retention factors and soil adsorption coefficients for Eurosoil 1#, 2#, 3# and 4#. For Eurosoil 5# which has a pH value of near 3, the correlation between retention factors and soil adsorption coefficients was unsatisfactory using methanol-water as mobile phase of SCLC. However, a good correlation was obtained using a methanol-buffer mixture with pH 3 as the mobile phase. This study proved that the SCLC is suitable for the prediction of soil adsorption coefficients.

  20. Temporal correlation coefficient for directed networks.

    PubMed

    Büttner, Kathrin; Salau, Jennifer; Krieter, Joachim

    2016-01-01

    Previous studies dealing with network theory focused mainly on the static aggregation of edges over specific time window lengths. Thus, most of the dynamic information gets lost. To assess the quality of such a static aggregation the temporal correlation coefficient can be calculated. It measures the overall possibility for an edge to persist between two consecutive snapshots. Up to now, this measure is only defined for undirected networks. Therefore, we introduce the adaption of the temporal correlation coefficient to directed networks. This new methodology enables the distinction between ingoing and outgoing edges. Besides a small example network presenting the single calculation steps, we also calculated the proposed measurements for a real pig trade network to emphasize the importance of considering the edge direction. The farm types at the beginning of the pork supply chain showed clearly higher values for the outgoing temporal correlation coefficient compared to the farm types at the end of the pork supply chain. These farm types showed higher values for the ingoing temporal correlation coefficient. The temporal correlation coefficient is a valuable tool to understand the structural dynamics of these systems, as it assesses the consistency of the edge configuration. The adaption of this measure for directed networks may help to preserve meaningful additional information about the investigated network that might get lost if the edge directions are ignored.
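
    A minimal implementation of the directed temporal correlation coefficient described above is sketched below: for each pair of consecutive snapshots, the topological overlap of each node's outgoing (or incoming) edges is computed and averaged. The handling of nodes with no edges and the toy trade-like network are assumptions made for illustration.

```python
import numpy as np

def temporal_correlation(snapshots, direction="out"):
    """Temporal correlation coefficient for a sequence of adjacency matrices.

    snapshots : list of (N, N) binary adjacency matrices; A[t][i, j] = 1 means
                an edge from i to j in snapshot t.
    direction : "out" uses each node's outgoing edges (rows),
                "in" uses its incoming edges (columns).
    Node/snapshot pairs with no edges contribute 0 (one common convention).
    """
    A = [np.asarray(a, float) for a in snapshots]
    if direction == "in":
        A = [a.T for a in A]
    N, M = A[0].shape[0], len(A)
    total = 0.0
    for m in range(M - 1):
        num = (A[m] * A[m + 1]).sum(axis=1)                       # persisting edges per node
        den = np.sqrt(A[m].sum(axis=1) * A[m + 1].sum(axis=1))
        total += np.divide(num, den, out=np.zeros(N), where=den > 0).sum()
    return total / (N * (M - 1))

# toy usage: a small directed trade-like network over three snapshots
A1 = np.array([[0, 1, 1], [0, 0, 1], [0, 0, 0]])
A2 = np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]])
A3 = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]])
print(round(temporal_correlation([A1, A2, A3], "out"), 3),
      round(temporal_correlation([A1, A2, A3], "in"), 3))
```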

  1. Correlation characteristics of phase and amplitude chimeras in an ensemble of nonlocally coupled maps

    NASA Astrophysics Data System (ADS)

    Vadivasova, T. E.; Strelkova, G. I.; Bogomolov, S. A.; Anishchenko, V. S.

    2017-01-01

    Correlation characteristics of chimera states have been calculated using the coefficient of mutual correlation of elements in a closed-ring ensemble of nonlocally coupled chaotic maps. Quantitative differences between the coefficients of mutual correlation for phase and amplitude chimeras are established for the first time.

  2. Reliability testing of a portfolio assessment tool for postgraduate family medicine training in South Africa

    PubMed Central

    Mash, Bob; Derese, Anselme

    2013-01-01

    Abstract Background Competency-based education and the validity and reliability of workplace-based assessment of postgraduate trainees have received increasing attention worldwide. Family medicine was recognised as a speciality in South Africa six years ago and a satisfactory portfolio of learning is a prerequisite to sit the national exit exam. A massive scaling up of the number of family physicians is needed in order to meet the health needs of the country. Aim The aim of this study was to develop a reliable, robust and feasible portfolio assessment tool (PAT) for South Africa. Methods Six raters each rated nine portfolios from the Stellenbosch University programme, using the PAT, to test for inter-rater reliability. This rating was repeated three months later to determine test–retest reliability. Following initial analysis and feedback the PAT was modified and the inter-rater reliability again assessed on nine new portfolios. An acceptable intra-class correlation was considered to be > 0.80. Results The total score was found to be reliable, with a coefficient of 0.92. For test–retest reliability, the difference in mean total score was 1.7%, which was not statistically significant. Amongst the subsections, only assessment of the educational meetings and the logbook showed reliability coefficients > 0.80. Conclusion This was the first attempt to develop a reliable, robust and feasible national portfolio assessment tool to assess postgraduate family medicine training in the South African context. The tool was reliable for the total score, but the low reliability of several sections in the PAT helped us to develop 12 recommendations regarding the use of the portfolio, the design of the PAT and the training of raters.

  3. Comparison of RNFL thickness and RPE-normalized RNFL attenuation coefficient for glaucoma diagnosis

    NASA Astrophysics Data System (ADS)

    Vermeer, K. A.; van der Schoot, J.; Lemij, H. G.; de Boer, J. F.

    2013-03-01

    Recently, a method to determine the retinal nerve fiber layer (RNFL) attenuation coefficient, based on normalization on the retinal pigment epithelium, was introduced. In contrast to conventional RNFL thickness measures, this novel measure represents a scattering property of the RNFL tissue. In this paper, we compare the RNFL thickness and the RNFL attenuation coefficient on 10 normal and 8 glaucomatous eyes by analyzing the correlation coefficient and the receiver operator curves (ROCs). The thickness and attenuation coefficient showed moderate correlation (r=0.82). Smaller correlation coefficients were found within normal (r=0.55) and glaucomatous (r=0.48) eyes. The full separation between normal and glaucomatous eyes based on the RNFL attenuation coefficient yielded an area under the ROC (AROC) of 1.0. The AROC for the RNFL thickness was 0.9875. No statistically significant difference between the two measures was found by comparing the AROC. RNFL attenuation coefficients may thus replace current RNFL thickness measurements or be combined with it to improve glaucoma diagnosis.

  4. Impact of aerosols and adverse atmospheric conditions on the data quality for spectral analysis of the H.E.S.S. telescopes

    NASA Astrophysics Data System (ADS)

    Hahn, J.; de los Reyes, R.; Bernlöhr, K.; Krüger, P.; Lo, Y. T. E.; Chadwick, P. M.; Daniel, M. K.; Deil, C.; Gast, H.; Kosack, K.; Marandon, V.

    2014-02-01

    The Earth's atmosphere is an integral part of the detector in ground-based imaging atmospheric Cherenkov telescope (IACT) experiments and has to be taken into account in the calibration. Atmospheric and hardware-related deviations from simulated conditions can result in the mis-reconstruction of primary particle energies and therefore of source spectra. During the eight years of observations with the High Energy Stereoscopic System (H.E.S.S.) in Namibia, the overall yield in Cherenkov photons has varied strongly with time due to gradual hardware aging, together with adjustments of the hardware components, and natural, as well as anthropogenic, variations of the atmospheric transparency. Here we present robust data selection criteria that minimize these effects over the full data set of the H.E.S.S. experiment and introduce the Cherenkov transparency coefficient as a new atmospheric monitoring quantity. The influence of atmospheric transparency, as quantified by this coefficient, on energy reconstruction and spectral parameters is examined and its correlation with the aerosol optical depth (AOD) of independent MISR satellite measurements and local measurements of atmospheric clarity is investigated.

  5. Fetal lung apparent diffusion coefficient measurement using diffusion-weighted MRI at 3 Tesla: Correlation with gestational age.

    PubMed

    Afacan, Onur; Gholipour, Ali; Mulkern, Robert V; Barnewolt, Carol E; Estroff, Judy A; Connolly, Susan A; Parad, Richard B; Bairdain, Sigrid; Warfield, Simon K

    2016-12-01

    To evaluate the feasibility of using diffusion-weighted magnetic resonance imaging (DW-MRI) to assess the fetal lung apparent diffusion coefficient (ADC) at 3 Tesla (T). Seventy-one pregnant women (32 second trimester, 39 third trimester) were scanned with a twice-refocused Echo-planar diffusion-weighted imaging sequence with 6 different b-values in 3 orthogonal diffusion orientations at 3T. After each scan, a region-of-interest (ROI) mask was drawn to select a region in the fetal lung and an automated robust maximum likelihood estimation algorithm was used to compute the ADC parameter. The amount of motion in each scan was visually rated. When scans with unacceptable levels of motion were eliminated, the lung ADC values showed a strong association with gestational age (P < 0.01), increasing dramatically between 16 and 27 weeks and then achieving a plateau around 27 weeks. We show that to get reliable estimates of ADC values of fetal lungs, a multiple b-value acquisition, where motion is either corrected or considered, can be performed. J. Magn. Reson. Imaging 2016;44:1650-1655. © 2016 International Society for Magnetic Resonance in Medicine.
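
    For context, the ADC in a mono-exponential model S(b) = S0·exp(-b·ADC) can be estimated from multi-b-value signals. The sketch below uses a simple log-linear least-squares fit rather than the robust maximum-likelihood estimator used in the study, and the numbers are purely illustrative.

```python
import numpy as np

def fit_adc(b_values, signals):
    """Estimate the apparent diffusion coefficient from S(b) = S0 * exp(-b * ADC)
    by log-linear least squares; b in s/mm^2, signals are mean ROI intensities."""
    b = np.asarray(b_values, float)
    s = np.asarray(signals, float)
    slope, intercept = np.polyfit(b, np.log(s), 1)
    return -slope, np.exp(intercept)          # (ADC in mm^2/s, S0)

# toy usage: nearly noiseless signals with an illustrative tissue-like ADC
b_vals = np.array([0, 50, 100, 200, 400, 600])
true_adc = 1.8e-3                             # mm^2/s (illustrative value only)
sig = 1000 * np.exp(-b_vals * true_adc) * (1 + 0.01 * np.random.default_rng(2).standard_normal(6))
adc, s0 = fit_adc(b_vals, sig)
print(f"ADC = {adc:.2e} mm^2/s, S0 = {s0:.0f}")
```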

  6. Depression correlates with quality of life in people with epilepsy independent of the measures used.

    PubMed

    Agrawal, Niruj; Bird, Jacob S; von Oertzen, Tim J; Cock, Hannah; Mitchell, Alex J; Mula, Marco

    2016-09-01

    A number of studies have suggested that depressed mood is one of the most important predictors of quality of life (QoL) in patients with epilepsy. However, the QoL measure used in previous studies was limited to the Quality of Life in Epilepsy (QOLIE) scales. It could be questioned whether correlation of QOLIE with measures of depression is influenced by the properties of the instruments used rather than being a valid effect. By using visual analogue scales, the current study aimed to clarify whether depression and QoL are truly correlated in patients with epilepsy. Data from a sample of 261 outpatients with epilepsy attending the Epilepsy Clinics of the Atkinson Morley Outpatient Department, St George's Hospital in London, were analyzed. Patients were screened using the European Quality-of-Life scale (EQ-5D-3L) which includes an overall visual analogue score (EQ-VAS), the Emotional Thermometer (ET7), the Beck Depression inventory-II (BDI-II), the Hospital Anxiety and Depression scale (HADS), and the Major Depression inventory (MDI). Depression was found to significantly correlate with EQ-VAS score with r coefficient ranging from 0.42 to 0.51 and r(2) coefficients ranging between 0.18 and 0.26. In addition, we identified patients who were depressed according to DSM-IV criteria (MD) and those with atypical forms of depression (AD). The EQ-5D-3L scores in these subjects compared with those without depression (ND) showed a different impact of AD and MD on QoL. The relationship between depression and QoL in people with epilepsy has been demonstrated to be a robust and valid effect, not a result of potential bias of the specific measures used. However, the strength of the association is influenced by the individual instrument. Atypical or subsyndromic forms of depression are as relevant as DSM-based depression in terms of impact on QoL. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Channel correlation of free space optical communication systems with receiver diversity in non-Kolmogorov atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Ma, Jing; Fu, Yulong; Tan, Liying; Yu, Siyuan; Xie, Xiaolong

    2018-05-01

    Spatial diversity is an effective technique for mitigating turbulence-induced fading and has been widely used in free space optical (FSO) communication systems. The received signals, however, suffer from channel correlation when the spacing between component antennas is insufficient. In this paper, new expressions for the channel correlation coefficient and, specifically, its components (the large- and small-scale channel correlation coefficients) for a plane wave with aperture effects are derived for a horizontal link in moderate-to-strong turbulence, using a non-Kolmogorov spectrum that has a generalized power law in the range of 3-4 instead of the fixed classical Kolmogorov power law of 11/3. The influence of power law variations on the channel correlation coefficient and its components is then analysed. The numerical results indicate that different values of the power law lead to different effects on the channel correlation coefficient and its components. This work will help further investigation of fading correlation in spatial diversity systems.

  8. The Robustness of LISREL Estimates in Structural Equation Models with Categorical Variables.

    ERIC Educational Resources Information Center

    Ethington, Corinna A.

    This study examined the effect of type of correlation matrix on the robustness of LISREL maximum likelihood and unweighted least squares structural parameter estimates for models with categorical manifest variables. Two types of correlation matrices were analyzed; one containing Pearson product-moment correlations and one containing tetrachoric,…

  9. The Physical Significance of the Synthetic Running Correlation Coefficient and Its Applications in Oceanic and Atmospheric Studies

    NASA Astrophysics Data System (ADS)

    Zhao, Jinping; Cao, Yong; Wang, Xin

    2018-06-01

    To study the temporal variation of the correlation between two time series, a running correlation coefficient (RCC) can be used. An RCC is calculated for a given time window, and the window is then moved sequentially through time. The current calculation method for RCCs is based on the general definition of the Pearson product-moment correlation coefficient, calculated with the data within the time window, which we call the local running correlation coefficient (LRCC). The LRCC is calculated from the two anomalies about the two local means, but those local means themselves vary from window to window. We show that the LRCC reflects only the correlation between the two anomalies within the time window and fails to capture the contributions of the two varying means. To address this problem, two unchanged means obtained from all available data are adopted to calculate an RCC, which is called the synthetic running correlation coefficient (SRCC). When the anomaly variations are dominant, the two RCCs are similar. However, when the variations of the means are dominant, the difference between the two RCCs becomes obvious. The SRCC reflects the correlations of both the anomaly variations and the variations of the means; therefore, SRCC values from different time points are directly comparable. A criterion for the superiority of an RCC algorithm is that the average value of the RCC should be close to the global correlation coefficient calculated using all data. The SRCC always meets this criterion, while the LRCC sometimes fails. The SRCC is therefore preferable to the LRCC for running correlations, and we suggest using the SRCC to calculate RCCs.
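
    The difference between the two running correlation coefficients can be made concrete in code: the LRCC uses the window means, whereas the SRCC uses the unchanged global means in both the covariance and the normalisation. The sketch below follows that description on synthetic series; the function names and the toy example are assumptions.

```python
import numpy as np

def running_correlations(x, y, window):
    """Local (LRCC) and synthetic (SRCC) running correlation coefficients."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    gx, gy = x.mean(), y.mean()                  # global (unchanged) means
    lrcc, srcc = [], []
    for start in range(len(x) - window + 1):
        xs, ys = x[start:start + window], y[start:start + window]
        # local version: anomalies about the window means
        ax, ay = xs - xs.mean(), ys - ys.mean()
        lrcc.append(np.sum(ax * ay) / np.sqrt(np.sum(ax**2) * np.sum(ay**2)))
        # synthetic version: anomalies about the global means
        bx, by = xs - gx, ys - gy
        srcc.append(np.sum(bx * by) / np.sqrt(np.sum(bx**2) * np.sum(by**2)))
    return np.array(lrcc), np.array(srcc)

# toy usage: two series whose high-frequency anomalies are uncorrelated but whose
# slowly varying means move together, a case where LRCC and SRCC disagree
rng = np.random.default_rng(4)
t = np.arange(600)
trend = np.sin(2 * np.pi * t / 300)
x = trend + 0.3 * rng.standard_normal(600)
y = trend + 0.3 * rng.standard_normal(600)
lrcc, srcc = running_correlations(x, y, window=60)
print(round(lrcc.mean(), 2), round(srcc.mean(), 2),
      round(np.corrcoef(x, y)[0, 1], 2))   # the SRCC mean tracks the global value
```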

  10. Observed intra-cluster correlation coefficients in a cluster survey sample of patient encounters in general practice in Australia

    PubMed Central

    Knox, Stephanie A; Chondros, Patty

    2004-01-01

    Background Cluster sample study designs are cost effective, however cluster samples violate the simple random sample assumption of independence of observations. Failure to account for the intra-cluster correlation of observations when sampling through clusters may lead to an under-powered study. Researchers therefore need estimates of intra-cluster correlation for a range of outcomes to calculate sample size. We report intra-cluster correlation coefficients observed within a large-scale cross-sectional study of general practice in Australia, where the general practitioner (GP) was the primary sampling unit and the patient encounter was the unit of inference. Methods Each year the Bettering the Evaluation and Care of Health (BEACH) study recruits a random sample of approximately 1,000 GPs across Australia. Each GP completes details of 100 consecutive patient encounters. Intra-cluster correlation coefficients were estimated for patient demographics, morbidity managed and treatments received. Intra-cluster correlation coefficients were estimated for descriptive outcomes and for associations between outcomes and predictors and were compared across two independent samples of GPs drawn three years apart. Results Between April 1999 and March 2000, a random sample of 1,047 Australian general practitioners recorded details of 104,700 patient encounters. Intra-cluster correlation coefficients for patient demographics ranged from 0.055 for patient sex to 0.451 for language spoken at home. Intra-cluster correlations for morbidity variables ranged from 0.005 for the management of eye problems to 0.059 for management of psychological problems. Intra-cluster correlation for the association between two variables was smaller than the descriptive intra-cluster correlation of each variable. When compared with the April 2002 to March 2003 sample (1,008 GPs) the estimated intra-cluster correlation coefficients were found to be consistent across samples. Conclusions The demonstrated precision and reliability of the estimated intra-cluster correlations indicate that these coefficients will be useful for calculating sample sizes in future general practice surveys that use the GP as the primary sampling unit. PMID:15613248

  11. Effect of improper scan alignment on retinal nerve fiber layer thickness measurements using Stratus optical coherence tomograph.

    PubMed

    Vizzeri, Gianmarco; Bowd, Christopher; Medeiros, Felipe A; Weinreb, Robert N; Zangwill, Linda M

    2008-08-01

    Misalignment of the Stratus optical coherence tomograph scan circle placed by the operator around the optic nerve head (ONH) during each retinal nerve fiber layer (RNFL) examination can affect the instrument reproducibility and its theoretical ability to detect true structural changes in the RNFL thickness over time. We evaluated the effect of scan circle placement on RNFL measurements. Observational clinical study. Sixteen eyes of 8 normal participants were examined using the Stratus optical coherence tomograph Fast RNFL thickness acquisition protocol (software version 4.0.7; Carl Zeiss Meditec, Dublin, CA). Four consecutive images were taken by the same operator with the circular scan centered on the optic nerve head. Four images each with the scan displaced superiorly, inferiorly, temporally, and nasally were also acquired. Differences in average and sectoral RNFL thicknesses were determined. For the centered scans, the coefficients of variation (CV) and the intraclass correlation coefficient for the average RNFL thickness measured were calculated. When the average RNFL thickness of the centered scans was compared with the average RNFL thickness of the displaced scans individually using analysis of variance with post-hoc analysis, no difference was found between the average RNFL thickness of the nasally (105.2 microm), superiorly (106.2 microm), or inferiorly (104.1 microm) displaced scans and the centered scans (106.4 microm). However, a significant difference (analysis of variance with Dunnett's test: F=8.82, P<0.0001) was found between temporally displaced scans (115.8 microm) and centered scans. Significant differences in sectoral RNFL thickness measurements were found between centered and each displaced scan. The coefficient of variation for average RNFL thickness was 1.75% and intraclass correlation coefficient was 0.95. In normal eyes, average RNFL thickness measurements are robust and similar with significant superior, inferior, and nasal scan displacement, but average RNFL thickness is greater when scans are displaced temporally. Parapapillary scan misalignment produces significant changes in RNFL assessment characterized by an increase in measured RNFL thickness in the quadrant in which the scan is closer to the disc, and a significant decrease in RNFL thickness in the quadrant in which the scan is displaced further from the optic disc.

  12. Development of a robust and cost-effective 3D respiratory motion monitoring system using the kinect device: Accuracy comparison with the conventional stereovision navigation system.

    PubMed

    Bae, Myungsoo; Lee, Sangmin; Kim, Namkug

    2018-07-01

    To develop and validate a robust and cost-effective 3D respiratory monitoring system based on a Kinect device with a custom-made simple marker. A 3D respiratory monitoring system comprising the simple marker and the Microsoft Kinect v2 device was developed. The marker was designed for simple and robust detection, and the tracking algorithm was developed using the depth, RGB, and infra-red images acquired from the Kinect sensor. A Kalman filter was used to suppress movement noises. The major movements of the marker attached to the four different locations of body surface were determined from the initially collected tracking points of the marker while breathing. The signal level of respiratory motion with the tracking point was estimated along the major direction vector. The accuracy of the results was evaluated through a comparison with those of the conventional stereovision navigation system (NDI Polaris Spectra). Sixteen normal volunteers were enrolled to evaluate the accuracy of this system. The correlation coefficients between the respiratory motion signal from the Kinect device and conventional navigation system ranged from 0.970 to 0.999 and from 0.837 to 0.995 at the abdominal and thoracic surfaces, respectively. The respiratory motion signal from this system was obtained at 27-30 frames/s. This system with the Kinect v2 device and simple marker could be used for cost-effective, robust and accurate 3D respiratory motion monitoring. In addition, this system is as reliable for respiratory motion signal generation and as practically useful as the conventional stereovision navigation system and is less sensitive to patient posture. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Feature Genes Selection Using Supervised Locally Linear Embedding and Correlation Coefficient for Microarray Classification

    PubMed Central

    Wang, Yun; Huang, Fangzhou

    2018-01-01

    The selection of feature genes with high recognition ability from the gene expression profiles has gained great significance in biology. However, most of the existing methods have a high time complexity and poor classification performance. Motivated by this, an effective feature selection method, called supervised locally linear embedding and Spearman's rank correlation coefficient (SLLE-SC2), is proposed which is based on the concept of locally linear embedding and correlation coefficient algorithms. Supervised locally linear embedding takes into account class label information and improves the classification performance. Furthermore, Spearman's rank correlation coefficient is used to remove the coexpression genes. The experiment results obtained on four public tumor microarray datasets illustrate that our method is valid and feasible. PMID:29666661

  14. Feature Genes Selection Using Supervised Locally Linear Embedding and Correlation Coefficient for Microarray Classification.

    PubMed

    Xu, Jiucheng; Mu, Huiyu; Wang, Yun; Huang, Fangzhou

    2018-01-01

    The selection of feature genes with high recognition ability from the gene expression profiles has gained great significance in biology. However, most of the existing methods have a high time complexity and poor classification performance. Motivated by this, an effective feature selection method, called supervised locally linear embedding and Spearman's rank correlation coefficient (SLLE-SC2), is proposed which is based on the concept of locally linear embedding and correlation coefficient algorithms. Supervised locally linear embedding takes into account class label information and improves the classification performance. Furthermore, Spearman's rank correlation coefficient is used to remove the coexpression genes. The experiment results obtained on four public tumor microarray datasets illustrate that our method is valid and feasible.
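
    The co-expression filtering step of SLLE-SC2, using Spearman's rank correlation to drop redundant genes, can be sketched as below. This is a generic illustration of that one step, not the published SLLE-SC2 implementation; the threshold and toy data are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

def drop_coexpressed(X, threshold=0.9):
    """Remove redundant (co-expressed) features: whenever a pair of columns has
    |Spearman rho| above the threshold, keep only the first one.
    X is (n_samples, n_genes) with at least three columns."""
    rho, _ = spearmanr(X)                 # (n_genes, n_genes) correlation matrix
    keep = []
    for j in range(X.shape[1]):
        if all(abs(rho[j, k]) < threshold for k in keep):
            keep.append(j)
    return X[:, keep], keep

# toy usage: gene 2 is a monotone (hence rank-correlated) copy of gene 0
rng = np.random.default_rng(8)
g0 = rng.normal(size=100)
g1 = rng.normal(size=100)
g2 = np.exp(g0) + 0.01 * rng.normal(size=100)
X = np.column_stack([g0, g1, g2])
X_reduced, kept = drop_coexpressed(X, threshold=0.9)
print(kept)   # expected: [0, 1]
```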

  15. Reliable quantification of BOLD fMRI cerebrovascular reactivity despite poor breath-hold performance.

    PubMed

    Bright, Molly G; Murphy, Kevin

    2013-12-01

    Cerebrovascular reactivity (CVR) can be mapped using BOLD fMRI to provide a clinical insight into vascular health that can be used to diagnose cerebrovascular disease. Breath-holds are a readily accessible method for producing the required arterial CO2 increases but their implementation into clinical studies is limited by concerns that patients will demonstrate highly variable performance of breath-hold challenges. This study assesses the repeatability of CVR measurements despite poor task performance, to determine if and how robust results could be achieved with breath-holds in patients. Twelve healthy volunteers were scanned at 3 T. Six functional scans were acquired, each consisting of 6 breath-hold challenges (10, 15, or 20 s duration) interleaved with periods of paced breathing. These scans simulated the varying breath-hold consistency and ability levels that may occur in patient data. Uniform ramps, time-scaled ramps, and end-tidal CO2 data were used as regressors in a general linear model in order to measure CVR at the grey matter, regional, and voxelwise level. The intraclass correlation coefficient (ICC) quantified the repeatability of the CVR measurement for each breath-hold regressor type and scale of interest across the variable task performances. The ramp regressors did not fully account for variability in breath-hold performance and did not achieve acceptable repeatability (ICC<0.4) in several regions analysed. In contrast, the end-tidal CO2 regressors resulted in "excellent" repeatability (ICC=0.82) in the average grey matter data, and resulted in acceptable repeatability in all smaller regions tested (ICC>0.4). Further analysis of intra-subject CVR variability across the brain (ICCspatial and voxelwise correlation) supported the use of end-tidal CO2 data to extract robust whole-brain CVR maps, despite variability in breath-hold performance. We conclude that the incorporation of end-tidal CO2 monitoring into scanning enables robust, repeatable measurement of CVR that makes breath-hold challenges suitable for routine clinical practice. © 2013.

  16. Robust estimation of pulse wave transit time using group delay.

    PubMed

    Meloni, Antonella; Zymeski, Heather; Pepe, Alessia; Lombardi, Massimo; Wood, John C

    2014-03-01

    To evaluate the efficiency of a novel transit time (Δt) estimation method from cardiovascular magnetic resonance flow curves. Flow curves were estimated from phase contrast images of 30 patients. Our method (TT-GD: transit time group delay) operates in the frequency domain and models the ascending aortic waveform as an input passing through a discrete-component "filter," producing the observed descending aortic waveform. The GD of the filter represents the average time delay (Δt) across individual frequency bands of the input. This method was compared with two previously described time-domain methods: TT-point using the half-maximum of the curves and TT-wave using cross-correlation. High temporal resolution flow images were analyzed at multiple downsampling rates to study the impact of differences in temporal resolution. Mean Δts obtained with the three methods were comparable. The TT-GD method was the most robust to reduced temporal resolution. While the TT-GD and the TT-wave produced comparable results for velocity and flow waveforms, the TT-point resulted in significantly shorter Δts when calculated from velocity waveforms (difference: 1.8±2.7 msec; coefficient of variability: 8.7%). The TT-GD method was the most reproducible, with an intraobserver variability of 3.4% and an interobserver variability of 3.7%. Compared to the traditional TT-point and TT-wave methods, the TT-GD approach was more robust to the choice of temporal resolution, waveform type, and observer. Copyright © 2013 Wiley Periodicals, Inc.
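
    The group-delay idea can be sketched numerically: the phase of the cross-spectrum between the ascending and descending flow curves falls off roughly linearly with frequency, and the negative slope of that phase is the transit time. The sketch below, with made-up Gaussian flow pulses and a simple phase-slope fit over the first few harmonics, only approximates the published TT-GD method, which fits a discrete-component filter model.

```python
import numpy as np

def transit_time_group_delay(asc, desc, dt, n_harmonics=5):
    """Estimate the delay between ascending and descending aortic flow curves
    from the group delay (phase slope) of their cross-spectrum.

    Minimal sketch of the group-delay concept, not the published TT-GD filter fit.
    """
    asc = np.asarray(asc, float) - np.mean(asc)
    desc = np.asarray(desc, float) - np.mean(desc)
    n = len(asc)
    freqs = np.fft.rfftfreq(n, d=dt)

    # Cross-spectrum phase: the descending waveform lags the ascending one
    cross = np.fft.rfft(desc) * np.conj(np.fft.rfft(asc))
    phase = np.unwrap(np.angle(cross))

    # Group delay = -dphi/domega, estimated over the first few harmonics
    band = slice(1, 1 + n_harmonics)
    slope = np.polyfit(2 * np.pi * freqs[band], phase[band], 1)[0]
    return -slope  # seconds

# Hypothetical flow curves: a pulse and a delayed, attenuated copy of it
dt = 0.005                          # 5 ms temporal resolution
t = np.arange(0, 1.0, dt)
asc = np.exp(-((t - 0.15) / 0.05) ** 2)
delay = 0.04                        # 40 ms true transit time
desc = 0.8 * np.exp(-((t - 0.15 - delay) / 0.05) ** 2)
print(f"estimated transit time: {transit_time_group_delay(asc, desc, dt) * 1e3:.1f} ms")
```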

  17. Pearson's Correlation between Three Variables; Using Students' Basic Knowledge of Geometry for an Exercise in Mathematical Statistics

    ERIC Educational Resources Information Center

    Vos, Pauline

    2009-01-01

    When studying correlations, how do the three bivariate correlation coefficients between three variables relate? After transforming Pearson's correlation coefficient r into a Euclidean distance, undergraduate students can tackle this problem using their secondary school knowledge of geometry (Pythagoras' theorem and similarity of triangles).…

  18. Prediction of toxic metals concentration using artificial intelligence techniques

    NASA Astrophysics Data System (ADS)

    Gholami, R.; Kamkar-Rouhani, A.; Doulati Ardejani, F.; Maleki, Sh.

    2011-12-01

    Groundwater and soil pollution are noted to be the worst environmental problems related to the mining industry because of pyrite oxidation and the consequent generation of acid mine drainage and release and transport of toxic metals. The aim of this paper is to predict the concentration of Ni and Fe using a robust algorithm named support vector machine (SVM). Comparison of the results obtained with the SVM with those of the back-propagation neural network (BPNN) indicates that the SVM can be regarded as a proper algorithm for the prediction of toxic metal concentrations due to its relatively high correlation coefficient and shorter running time. In fact, the SVM method provided a better prediction of the toxic metals Fe and Ni and ran faster than the BPNN.
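
    A minimal regression sketch of the comparison idea, using scikit-learn's SVR on synthetic data; the predictor variables, kernel settings, and target relationship are invented for illustration and are not those of the cited study.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

# Hypothetical geochemical predictors (e.g., pH, sulfate, EC, depth) and a
# metal-concentration target; the cited study's actual inputs may differ.
rng = np.random.default_rng(1)
X = rng.uniform([2.0, 100.0, 0.5, 1.0], [8.0, 3000.0, 10.0, 50.0], size=(200, 4))
ni = 0.02 * X[:, 1] / (X[:, 0] + 1) + 0.5 * X[:, 2] + rng.normal(0, 2, 200)

X_train, X_test, y_train, y_test = train_test_split(X, ni, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X_train, y_train)

# Correlation coefficient between predicted and observed concentrations
r = np.corrcoef(model.predict(X_test), y_test)[0, 1]
print(f"test-set correlation coefficient: r = {r:.2f}")
```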

  19. Cerebral autoregulation in the preterm newborn using near-infrared spectroscopy: a comparison of time-domain and frequency-domain analyses

    NASA Astrophysics Data System (ADS)

    Eriksen, Vibeke R.; Hahn, Gitte H.; Greisen, Gorm

    2015-03-01

    The aim was to compare two conventional methods used to describe cerebral autoregulation (CA): frequency-domain analysis and time-domain analysis. We measured cerebral oxygenation (as a surrogate for cerebral blood flow) and mean arterial blood pressure (MAP) in 60 preterm infants. In the frequency domain, outcome variables were coherence and gain, whereas the cerebral oximetry index (COx) and the regression coefficient were the outcome variables in the time domain. Correlation between coherence and COx was poor. The disagreement between the two methods was due to the MAP and cerebral oxygenation signals being in counterphase in three cases. High gain and high coherence may arise spuriously when cerebral oxygenation decreases as MAP increases; hence, time-domain analysis appears to be a more robust, and simpler, method to describe CA.
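
    The cerebral oximetry index referred to above is, in essence, a moving-window Pearson correlation between MAP and the NIRS oxygenation signal. A minimal sketch follows, with illustrative window and step lengths and simulated pressure-passive data, not the protocol of the cited study.

```python
import numpy as np

def cox_index(map_signal, oxygenation, fs, window_s=300.0, step_s=60.0):
    """Cerebral oximetry index (COx): the mean of moving-window Pearson
    correlations between mean arterial pressure and cerebral oxygenation.

    Window and step lengths here are illustrative choices only.
    """
    win = int(window_s * fs)
    step = int(step_s * fs)
    coeffs = []
    for start in range(0, len(map_signal) - win + 1, step):
        seg_map = map_signal[start:start + win]
        seg_oxy = oxygenation[start:start + win]
        coeffs.append(np.corrcoef(seg_map, seg_oxy)[0, 1])
    return np.mean(coeffs), np.array(coeffs)

# Hypothetical 1 Hz recordings: pressure-passive oxygenation (impaired CA)
fs = 1.0
t = np.arange(0, 3600)
map_signal = 35 + 3 * np.sin(2 * np.pi * t / 600) + np.random.default_rng(2).normal(0, 1, t.size)
oxygenation = 70 + 0.8 * (map_signal - 35) + np.random.default_rng(3).normal(0, 1, t.size)

cox, _ = cox_index(map_signal, oxygenation, fs)
print(f"COx = {cox:.2f}  (values near +1 suggest impaired autoregulation)")
```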

  20. Two-Way Gene Interaction From Microarray Data Based on Correlation Methods.

    PubMed

    Alavi Majd, Hamid; Talebi, Atefeh; Gilany, Kambiz; Khayyer, Nasibeh

    2016-06-01

    Gene networks have generated a massive explosion in the development of high-throughput techniques for monitoring various aspects of gene activity. Networks offer a natural way to model interactions between genes, and extracting gene network information from high-throughput genomic data is an important and difficult task. The purpose of this study is to construct a two-way gene network based on parametric and nonparametric correlation coefficients. The first step in constructing a Gene Co-expression Network is to score all pairs of gene vectors. The second step is to select a score threshold and connect all gene pairs whose scores exceed this value. In the foundation-application study, we constructed two-way gene networks using nonparametric methods, such as Spearman's rank correlation coefficient and Blomqvist's measure, and compared them with Pearson's correlation coefficient. We surveyed six genes of venous thrombosis disease, made a matrix entry representing the score for the corresponding gene pair, and obtained two-way interactions using Pearson's correlation, Spearman's rank correlation, and Blomqvist's coefficient. Finally, these methods were compared with Cytoscape, based on BIND, and Gene Ontology, based on molecular function visual methods; R software version 3.2 and Bioconductor were used to perform these methods. Based on the Pearson and Spearman correlations, the results were the same and were confirmed by Cytoscape and GO visual methods; however, Blomqvist's coefficient was not confirmed by visual methods. Some of the correlation-coefficient results do not agree with the visualization methods, possibly because of the small amount of data.
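
    The two construction steps described above, scoring all gene pairs and then thresholding, can be sketched in a few lines; the expression matrix, gene names, and the 0.8 threshold below are illustrative only.

```python
import numpy as np
from scipy.stats import spearmanr

def coexpression_edges(expr, gene_names, threshold=0.8, method="spearman"):
    """Score all gene pairs and connect those whose |correlation| exceeds a
    threshold -- the two network-construction steps described above.
    Genes are columns of `expr`; names, data, and threshold are illustrative.
    """
    if method == "spearman":
        corr, _ = spearmanr(expr, axis=0)        # gene-by-gene rank correlations
    else:
        corr = np.corrcoef(expr, rowvar=False)   # Pearson correlations

    edges = []
    n = corr.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            if abs(corr[i, j]) >= threshold:
                edges.append((gene_names[i], gene_names[j], round(corr[i, j], 2)))
    return edges

# Hypothetical expression matrix: 30 samples x 6 genes, 3 of them co-expressed
rng = np.random.default_rng(4)
base = rng.normal(size=(30, 1))
expr = np.hstack([base + rng.normal(0, 0.3, (30, 1)) for _ in range(3)] +
                 [rng.normal(size=(30, 3))])
genes = ["gene1", "gene2", "gene3", "gene4", "gene5", "gene6"]
print(coexpression_edges(expr, genes))
```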

  1. Relationship of cranial robusticity to cranial form, geography and climate in Homo sapiens.

    PubMed

    Baab, Karen L; Freidline, Sarah E; Wang, Steven L; Hanson, Timothy

    2010-01-01

    Variation in cranial robusticity among modern human populations is widely acknowledged but not well-understood. While the use of "robust" cranial traits in hominin systematics and phylogeny suggests that these characters are strongly heritable, this hypothesis has not been tested. Alternatively, cranial robusticity may be a response to differences in diet/mastication or it may be an adaptation to cold, harsh environments. This study quantifies the distribution of cranial robusticity in 14 geographically widespread human populations, and correlates this variation with climatic variables, neutral genetic distances, cranial size, and cranial shape. With the exception of the occipital torus region, all traits were positively correlated with each other, suggesting that they should not be treated as individual characters. While males are more robust than females within each of the populations, among the independent variables (cranial shape, size, climate, and neutral genetic distances), only shape is significantly correlated with inter-population differences in robusticity. Two-block partial least-squares analysis was used to explore the relationship between cranial shape (captured by three-dimensional landmark data) and robusticity across individuals. Weak support was found for the hypothesis that robusticity was related to mastication as the shape associated with greater robusticity was similar to that described for groups that ate harder-to-process diets. Specifically, crania with more prognathic faces, expanded glabellar and occipital regions, and (slightly) longer skulls were more robust than those with rounder vaults and more orthognathic faces. However, groups with more mechanically demanding diets (hunter-gatherers) were not always more robust than groups practicing some form of agriculture.

  2. Correlation Between Geometric Similarity of Ice Shapes and the Resulting Aerodynamic Performance Degradation: A Preliminary Investigation Using WIND

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Chung, James

    1999-01-01

    Aerodynamic performance calculations were performed using WIND on ten experimental ice shapes and the corresponding ten ice shapes predicted by LEWICE 2.0. The resulting data for lift coefficient and drag coefficient are presented. The difference in aerodynamic results between the experimental ice shapes and the LEWICE ice shapes were compared to the quantitative difference in ice shape geometry presented in an earlier report. Correlations were generated to determine the geometric features which have the most effect on performance degradation. Results show that maximum lift and stall angle can be correlated to the upper horn angle and the leading edge minimum thickness. Drag coefficient can be correlated to the upper horn angle and the frequency-weighted average of the Fourier coefficients. Pitching moment correlated with the upper horn angle and to a much lesser extent to the upper and lower horn thicknesses.

  3. Confidence Intervals and "F" Tests for Intraclass Correlation Coefficients Based on Three-Way Mixed Effects Models

    ERIC Educational Resources Information Center

    Zhou, Hong; Muellerleile, Paige; Ingram, Debra; Wong, Seok P.

    2011-01-01

    Intraclass correlation coefficients (ICCs) are commonly used in behavioral measurement and psychometrics when a researcher is interested in the relationship among variables of a common class. The formulas for deriving ICCs, or generalizability coefficients, vary depending on which models are specified. This article gives the equations for…

  4. Uses and Misuses of the Correlation Coefficient.

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Daniel, Larry G.

    The purpose of this paper is to provide an in-depth critical analysis of the use and misuse of correlation coefficients. Various analytical and interpretational misconceptions are reviewed, beginning with the egregious assumption that correlational statistics may be useful in inferring causality. Additional misconceptions, stemming from…

  5. Evaluation of icing drag coefficient correlations applied to iced propeller performance prediction

    NASA Technical Reports Server (NTRS)

    Miller, Thomas L.; Shaw, R. J.; Korkan, K. D.

    1987-01-01

    Evaluation of three empirical icing drag coefficient correlations is accomplished through application to a set of propeller icing data. The various correlations represent the best means currently available for relating drag rise to various flight and atmospheric conditions for both fixed-wing and rotating airfoils, and the work presented here illustrates and evaluates one such application of the latter case. The origins of each of the correlations are discussed, and their apparent capabilities and limitations are summarized. These correlations have been made an integral part of a computer code, ICEPERF, which has been designed to calculate iced propeller performance. Comparison with experimental propeller icing data shows generally good agreement, with the quality of the predicted results seen to be directly related to the radial icing extent of each case. The code's capability to properly predict thrust coefficient, power coefficient, and propeller efficiency is shown to be strongly dependent on the correlation selected, as well as upon proper specification of radial icing extent.

  6. Family Reintegration Experiences of Soldiers with Mild Traumatic Brain Injury

    DTIC Science & Technology

    2014-02-26

    depression scores in the spouse. Weak within-couple correlations were indicated on the other measures. Table 3 presents the Spearman correlation matrix...separately. [The excerpt also contains table residue: Table 2, "Spearman Correlation Coefficients for Couples" (columns Spouse MAT, Spouse Depression, Spouse Anxiety; rows Soldier MAT -0.06, Soldier Depression -0.61, Soldier Anxiety -0.12), and Table 3, "Spearman Correlation Coefficients for Soldiers and ..."]

  7. Correlation tests of the engine performance parameter by using the detrended cross-correlation coefficient

    NASA Astrophysics Data System (ADS)

    Dong, Keqiang; Gao, You; Jing, Liming

    2015-02-01

    The presence of cross-correlation in complex systems has long been noted and studied in a broad range of physical applications. We here focus on an aero-engine system as an example of a complex system. By applying the detrended cross-correlation (DCCA) coefficient method to aero-engine time series, we investigate the effects of the data length and the time scale on the detrended cross-correlation coefficients ρDCCA(T, s). We then show, for a twin-engine aircraft, that the engine fuel flow time series derived from the left engine and the right engine exhibit much stronger cross-correlations than the engine exhaust-gas temperature series derived from the left engine and the right engine do.

  8. A robust nonparametric framework for reconstruction of stochastic differential equation models

    NASA Astrophysics Data System (ADS)

    Rajabzadeh, Yalda; Rezaie, Amir Hossein; Amindavar, Hamidreza

    2016-05-01

    In this paper, we employ a nonparametric framework to robustly estimate the functional forms of drift and diffusion terms from discrete stationary time series. The proposed method significantly improves the accuracy of the parameter estimation. In this framework, drift and diffusion coefficients are modeled through orthogonal Legendre polynomials. We employ the least squares regression approach along with the Euler-Maruyama approximation method to learn the coefficients of the stochastic model. Next, a numerical discrete construction of the mean squared prediction error (MSPE) is established to calculate the order of the Legendre polynomials in the drift and diffusion terms. We show numerically that the new method is robust against variation in sample size and sampling rate. The performance of our method in comparison with the kernel-based regression (KBR) method is demonstrated through simulation and real data. In the case of real data, we test our method for discriminating healthy electroencephalogram (EEG) signals from epileptic ones. We also demonstrate the efficiency of the method through prediction on financial data. In both simulation and real data, our algorithm outperforms the KBR method.
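
    A minimal sketch of the reconstruction idea, assuming the Euler-Maruyama relations drift(x) ≈ E[ΔX|x]/Δt and diffusion²(x) ≈ E[ΔX²|x]/Δt and a Legendre basis fitted by ordinary least squares; the polynomial orders are fixed here rather than selected by the MSPE criterion, and the Ornstein-Uhlenbeck test data are synthetic.

```python
import numpy as np
from numpy.polynomial import legendre

def fit_drift_diffusion(x, dt, order_drift=3, order_diff=2):
    """Least-squares estimates of drift and squared-diffusion coefficients in
    a Legendre-polynomial basis, using the Euler-Maruyama approximation.
    Orders are illustrative; in practice they would be chosen by the MSPE.
    """
    x0, dx = x[:-1], np.diff(x)
    # Map the state onto [-1, 1], the natural domain of Legendre polynomials
    lo, hi = x.min(), x.max()
    u = 2 * (x0 - lo) / (hi - lo) - 1

    V_drift = legendre.legvander(u, order_drift)
    V_diff = legendre.legvander(u, order_diff)
    c_drift, *_ = np.linalg.lstsq(V_drift, dx / dt, rcond=None)
    c_diff, *_ = np.linalg.lstsq(V_diff, dx**2 / dt, rcond=None)

    def drift(xq):
        return legendre.legval(2 * (xq - lo) / (hi - lo) - 1, c_drift)

    def diffusion(xq):
        val = legendre.legval(2 * (xq - lo) / (hi - lo) - 1, c_diff)
        return np.sqrt(np.maximum(val, 0.0))

    return drift, diffusion

# Hypothetical Ornstein-Uhlenbeck data: dX = -X dt + 0.5 dW
rng = np.random.default_rng(5)
dt, n = 0.01, 100_000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = x[i - 1] - x[i - 1] * dt + 0.5 * np.sqrt(dt) * rng.normal()

drift, diffusion = fit_drift_diffusion(x, dt)
print(f"drift(1.0) ~ {drift(1.0):.2f} (true -1.0), "
      f"diffusion(1.0) ~ {diffusion(1.0):.2f} (true 0.5)")
```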

  9. Prediction of Very High Reynolds Number Compressible Skin Friction

    NASA Technical Reports Server (NTRS)

    Carlson, John R.

    1998-01-01

    Flat plate skin friction calculations over a range of Mach numbers from 0.4 to 3.5 at Reynolds numbers from 16 million to 492 million using a Navier-Stokes method with advanced turbulence modeling are compared with incompressible skin friction coefficient correlations. The semi-empirical correlation theories of van Driest; Cope; Winkler and Cha; and Sommer and Short T' are used to transform the predicted skin friction coefficients of solutions using two algebraic Reynolds stress turbulence models in the Navier-Stokes method PAB3D. In general, the predicted skin friction coefficients scaled well with each reference temperature theory, though overall the theory by Sommer and Short appeared to best collapse the predicted coefficients. At the lower Reynolds numbers of 3 to 30 million, both the Girimaji and the Shih, Zhu and Lumley turbulence models predicted skin-friction coefficients within 2% of the semi-empirical correlation skin friction coefficients. At the higher Reynolds numbers of 100 to 500 million, the turbulence models by Shih, Zhu and Lumley and Girimaji predicted coefficients that were 6% less and 10% greater, respectively, than the semi-empirical coefficients.

  10. A Practical Theory of Micro-Solar Power Sensor Networks

    DTIC Science & Technology

    2009-04-20

    [Table residue from the excerpt: simulation platforms (TOSSIM, ns-2, Matlab, C++, AVRORA) and their reference hardware (Mica2, WINS, Medusa, Mica) with simulated power.] ...scale. From this raw data, we can [figure residue: histograms of the correlation coefficient with the solar radiation measurement (Turbidity at ...)]

  11. Insight into the structural requirements of proton pump inhibitors based on CoMFA and CoMSIA studies.

    PubMed

    Nayana, M Ravi Shashi; Sekhar, Y Nataraja; Nandyala, Haritha; Muttineni, Ravikumar; Bairy, Santosh Kumar; Singh, Kriti; Mahmood, S K

    2008-10-01

    In the present study, a series of 179 quinoline and quinazoline heterocyclic analogues exhibiting inhibitory activity against gastric (H+/K+)-ATPase were investigated using the comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) methods. Both models exhibited good correlation between the calculated 3D-QSAR fields and the observed biological activity for the respective training set compounds. The optimal CoMFA and CoMSIA models yielded significant leave-one-out cross-validation coefficients, q2, of 0.777 and 0.744 and conventional correlation coefficients, r2, of 0.927 and 0.914, respectively. The predictive ability of the generated models was tested on a set of 52 compounds having a broad range of activity. CoMFA and CoMSIA yielded predicted activities for the test set compounds with r2(pred) of 0.893 and 0.917, respectively. These validation tests not only revealed the robustness of the models but also demonstrated that for our models r2(pred) based on the mean activity of the test set compounds can accurately estimate external predictivity. The factors affecting activity were analyzed carefully according to standard coefficient contour maps of steric, electrostatic, hydrophobic, acceptor and donor fields derived from the CoMFA and CoMSIA. These contour plots identified several key features which explain the wide range of activities. The results obtained from the models offer important structural insight into designing novel peptic-ulcer inhibitors prior to their synthesis.

  12. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    NASA Astrophysics Data System (ADS)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.

  13. Repeatability of chemical-shift-encoded water-fat MRI and diffusion-tensor imaging in lower extremity muscles in children.

    PubMed

    Ponrartana, Skorn; Andrade, Kristine E; Wren, Tishya A L; Ramos-Platt, Leigh; Hu, Houchun H; Bluml, Stefan; Gilsanz, Vicente

    2014-06-01

    The purpose of this study was to assess the repeatability of water-fat MRI and diffusion-tensor imaging (DTI) as quantitative biomarkers of pediatric lower extremity skeletal muscle. MRI at 3 T of a randomly selected thigh and lower leg of seven healthy children was studied using water-fat separation and DTI techniques. Muscle-fat fraction, apparent diffusion coefficient (ADC), and fractional anisotropy (FA) values were calculated. Test-retest and interrater repeatability were assessed by calculating the Pearson correlation coefficient, intraclass correlation coefficient, and Bland-Altman analysis. Bland-Altman plots show that the mean difference between test-retest and interrater measurements of muscle-fat fraction, ADC, and FA was near 0. The correlation coefficients and intraclass correlation coefficients were all between 0.88 and 0.99 (p < 0.05), suggesting excellent reliability of the measurements. Muscle-fat fraction measurements from water-fat MRI exhibited the highest intraclass correlation coefficient. Interrater agreement was consistently better than test-retest comparisons. Water-fat MRI and DTI measurements in lower extremity skeletal muscles are objective repeatable biomarkers in children. This knowledge should aid in the understanding of the number of participants needed in clinical trials when using these determinations as an outcome measure to noninvasively monitor neuromuscular disease.

  14. The Gini Coefficient as a Tool for Image Family Identification in Strong Lensing Systems with Multiple Images

    NASA Astrophysics Data System (ADS)

    Florian, Michael K.; Gladders, Michael D.; Li, Nan; Sharon, Keren

    2016-01-01

    The sample of cosmological strong lensing systems has been steadily growing in recent years and with the advent of the next generation of space-based survey telescopes, the sample will reach into the thousands. The accuracy of strong lens models relies on robust identification of multiple image families of lensed galaxies. For the most massive lenses, often more than one background galaxy is magnified and multiply imaged, and even in the cases of only a single lensed source, identification of counter images is not always robust. Recently, we have shown that the Gini coefficient in space-telescope-quality imaging is a measurement of galaxy morphology that is relatively well-preserved by strong gravitational lensing. Here, we investigate its usefulness as a diagnostic for the purposes of image family identification and show that it can remove some of the degeneracies encountered when using color as the sole diagnostic, and can do so without the need for additional observations since whenever a color is available, two Gini coefficients are as well.
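
    For reference, the Gini coefficient of a galaxy's pixel fluxes can be computed with the standard morphology formula; the sketch below uses made-up flux distributions simply to show that light concentrated in a few pixels yields a high G and evenly spread light a low G.

```python
import numpy as np

def gini(pixel_fluxes):
    """Gini coefficient of a galaxy's pixel-flux distribution, using the
    standard morphology definition; shown only to illustrate the statistic
    discussed above, not the authors' measurement pipeline.
    """
    x = np.sort(np.abs(np.ravel(pixel_fluxes)))
    n = x.size
    i = np.arange(1, n + 1)
    return np.sum((2 * i - n - 1) * x) / (x.mean() * n * (n - 1))

# Hypothetical segmented cut-outs of a lensed image and its counter image
rng = np.random.default_rng(6)
concentrated = rng.exponential(1.0, 500) ** 3     # flux in a few bright pixels
diffuse = rng.uniform(0.5, 1.5, 500)              # flux spread evenly
print(f"concentrated image: G = {gini(concentrated):.2f}")
print(f"diffuse image:      G = {gini(diffuse):.2f}")
```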

  15. Observations of copolar correlation coefficient through a bright band at vertical incidence

    NASA Technical Reports Server (NTRS)

    Zrnic, D. S.; Raghavan, R.; Chandrasekar, V.

    1994-01-01

    This paper discusses an application of polarimetric measurements at vertical incidence. In particular, the correlation coefficients between linear copolar components are examined, and measurements obtained with the National Severe Storms Laboratory (NSSL)'s and National Center for Atmospheric Research (NCAR)'s polarimetric radars are presented. The data are from two well-defined bright bands. A sharp decrease of the correlation coefficient, confined to a height interval of a few hundred meters, marks the bottom of the bright band.

  16. Optimizing correlation techniques for improved earthquake location

    USGS Publications Warehouse

    Schaff, D.P.; Bokelmann, G.H.R.; Ellsworth, W.L.; Zanzerkia, E.; Waldhauser, F.; Beroza, G.C.

    2004-01-01

    Earthquake location using relative arrival time measurements can lead to dramatically reduced location errors and a view of fault-zone processes with unprecedented detail. There are two principal reasons why this approach reduces location errors. The first is that the use of differenced arrival times to solve for the vector separation of earthquakes removes from the earthquake location problem much of the error due to unmodeled velocity structure. The second reason, on which we focus in this article, is that waveform cross correlation can substantially reduce measurement error. While cross correlation has long been used to determine relative arrival times with subsample precision, we extend correlation measurements to less similar waveforms, and we introduce a general quantitative means to assess when correlation data provide an improvement over catalog phase picks. We apply the technique to local earthquake data from the Calaveras Fault in northern California. Tests for an example streak of 243 earthquakes demonstrate that relative arrival times with normalized cross correlation coefficients as low as approximately 70%, interevent separation distances as large as 2 km, and magnitudes up to 3.5 as recorded on the Northern California Seismic Network are more precise than relative arrival times determined from catalog phase data. Also discussed are improvements made to the correlation technique itself. We find that for large time offsets, our implementation of time-domain cross correlation is often more robust and that it recovers more observations than the cross-spectral approach. Longer time windows give better results than shorter ones. Finally, we explain how thresholds and empirical weighting functions may be derived to optimize the location procedure for any given region of interest, taking advantage of the respective strengths of diverse correlation and catalog phase data on different length scales.

  17. Shape and size of the body vs. musculoskeletal stress markers.

    PubMed

    Myszka, Anna; Piontek, Janusz

    2010-01-01

    The objective of this paper is to assess the relationship between the degree of development of muscle attachment sites (musculoskeletal stress markers, MSM) and the length and circumference measurements of long bones and the body build expressed with the reconstructed values of body height (BH) and body mass (BM). The bone material (102 male and 99 female skeletons) used in the study was collected in the medieval burial ground in Cedynia, Poland. The authors analyzed 10 musculoskeletal stress markers located on the scapula (2), humerus (2), radius (2), femur (2) and tibia (2). The frequency and degree of expression of muscle attachment size were assessed using the scale prepared by Myszka (2007). The scale encompassed three degrees of expression of muscle attachment size. Only changes of robusticity type (nonpathological changes) were taken into account. The assessment of body build of individuals was carried out according to the method proposed by Vancata & Charvátová (2001). Body height was reconstructed from the length of the humerus and femur using eight equations. Body mass was reconstructed from the measurements of the breadth of the proximal and distal sections of the femur and tibia (mechanical method) using twenty-one equations. The equations were developed for different reference populations. The same equations were used for men and women. The correlation between the MSM and the length and circumference measurements of the bones was analyzed using principal components analysis and the Gamma correlation coefficient. The strength of the correlation between the reconstructed body build traits (BH, BM) and the moderate degree of musculoskeletal stress marker expression was studied based on the principal components method and the Pearson correlation coefficient. A linear correlation was found between musculoskeletal stress markers and the circumference measurements and the reconstructed body mass, but no relationship with body height and the length measurements of long bones was revealed. From previous research it is evident that the relationship between the MSM and metric skeletal traits does not occur in every population. Divergent findings necessitate further corroboration of results on diverse skeletal material.

  18. Relationship of body mass index to percent body fat and waist circumference among schoolchildren in Japan--the influence of gender and obesity: a population-based cross-sectional study.

    PubMed

    Ochiai, Hirotaka; Shirasawa, Takako; Nishimura, Rimei; Morimoto, Aya; Shimada, Naoki; Ohtsu, Tadahiro; Kujirai, Emiko; Hoshino, Hiromi; Tajima, Naoko; Kokaze, Akatsuki

    2010-08-18

    Although the correlation coefficient between body mass index (BMI) and percent body fat (%BF) or waist circumference (WC) has been reported, studies conducted among population-based schoolchildren to date have been limited in Japan, where %BF and WC are not usually measured in annual health examinations at elementary schools or junior high schools. The aim of the present study was to investigate the relationship of BMI to %BF and WC and to examine the influence of gender and obesity on these relationships among Japanese schoolchildren. Subjects included 3,750 schoolchildren from the fourth and seventh grade in Ina-town, Saitama Prefecture, Japan between 2004 and 2008. Information about each subject's age, sex, height, weight, %BF, and WC was collected from annual physical examinations. %BF was measured with a bipedal bioelectrical impedance analysis device. Obesity was defined by the following two criteria: the obesity definition of the Centers for Disease Control and Prevention, and the definition of obesity for Japanese children. Pearson's correlation coefficients between BMI and %BF or WC were calculated separately by sex. Among fourth graders, the correlation coefficients between BMI and %BF were 0.74 for boys and 0.97 for girls, whereas those between BMI and WC were 0.94 for boys and 0.90 for girls. Similar results were observed in the analysis of seventh graders. The correlation coefficient between BMI and %BF varied by physique (obese or non-obese), with weaker correlations among the obese regardless of the definition of obesity; most correlation coefficients among obese boys were less than 0.5, whereas most correlations among obese girls were more than 0.7. On the other hand, the correlation coefficients between BMI and WC were more than 0.8 among boys and almost all coefficients were more than 0.7 among girls, regardless of physique. BMI was positively correlated with %BF and WC among Japanese schoolchildren. The correlations could be influenced by obesity as well as by gender. Accordingly, it is essential to consider gender and obesity when using BMI as a surrogate for %BF and WC for epidemiological use.

  19. A Robust State Estimation Framework Considering Measurement Correlations and Imperfect Synchronization

    DOE PAGES

    Zhao, Junbo; Wang, Shaobu; Mili, Lamine; ...

    2018-01-08

    Here, this paper develops a robust power system state estimation framework with the consideration of measurement correlations and imperfect synchronization. In the framework, correlations of SCADA and Phasor Measurements (PMUs) are calculated separately through unscented transformation and a Vector Auto-Regression (VAR) model. In particular, PMU measurements during the waiting period of two SCADA measurement scans are buffered to develop the VAR model with robustly estimated parameters using the projection statistics approach. The latter takes into account the temporal and spatial correlations of PMU measurements and provides redundant measurements to suppress bad data and mitigate imperfect synchronization. In cases where the SCADA and PMU measurements are not time synchronized, either the forecasted PMU measurements or the prior SCADA measurements from the last estimation run are leveraged to restore system observability. Then, a robust generalized maximum-likelihood (GM)-estimator is extended to integrate measurement error correlations and to handle the outliers in the SCADA and PMU measurements. Simulation results that stem from a comprehensive comparison with other alternatives under various conditions demonstrate the benefits of the proposed framework.

  20. A Robust State Estimation Framework Considering Measurement Correlations and Imperfect Synchronization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Junbo; Wang, Shaobu; Mili, Lamine

    Here, this paper develops a robust power system state estimation framework with the consideration of measurement correlations and imperfect synchronization. In the framework, correlations of SCADA and Phasor Measurements (PMUs) are calculated separately through unscented transformation and a Vector Auto-Regression (VAR) model. In particular, PMU measurements during the waiting period of two SCADA measurement scans are buffered to develop the VAR model with robustly estimated parameters using the projection statistics approach. The latter takes into account the temporal and spatial correlations of PMU measurements and provides redundant measurements to suppress bad data and mitigate imperfect synchronization. In cases where the SCADA and PMU measurements are not time synchronized, either the forecasted PMU measurements or the prior SCADA measurements from the last estimation run are leveraged to restore system observability. Then, a robust generalized maximum-likelihood (GM)-estimator is extended to integrate measurement error correlations and to handle the outliers in the SCADA and PMU measurements. Simulation results that stem from a comprehensive comparison with other alternatives under various conditions demonstrate the benefits of the proposed framework.

  1. Reciprocity Between Robustness of Period and Plasticity of Phase in Biological Clocks

    NASA Astrophysics Data System (ADS)

    Hatakeyama, Tetsuhiro S.; Kaneko, Kunihiko

    2015-11-01

    Circadian clocks exhibit the robustness of period and plasticity of phase against environmental changes such as temperature and nutrient conditions. Thus far, however, it is unclear how both are simultaneously achieved. By investigating distinct models of circadian clocks, we demonstrate reciprocity between robustness and plasticity: higher robustness in the period implies higher plasticity in the phase, where changes in period and in phase follow a linear relationship with a negative coefficient. The robustness of period is achieved by the adaptation on the limit cycle via a concentration change of a buffer molecule, whose temporal change leads to a phase shift following a shift of the limit-cycle orbit in phase space. Generality of reciprocity in clocks with the adaptation mechanism is confirmed with theoretical analysis of simple models, while biological significance is discussed.

  2. A Method for Approximating the Bivariate Normal Correlation Coefficient.

    ERIC Educational Resources Information Center

    Kirk, David B.

    Improvements of the Gaussian quadrature in conjunction with the Newton-Raphson iteration technique (TM 000 789) are discussed as effective methods of calculating the bivariate normal correlation coefficient. (CK)

  3. Semi-quantitative spectrographic analysis and rank correlation in geochemistry

    USGS Publications Warehouse

    Flanagan, F.J.

    1957-01-01

    The rank correlation coefficient, rs, which involves less computation than the product-moment correlation coefficient, r, can be used to indicate the degree of relationship between two elements. The method is applicable in situations where the assumptions underlying normal distribution correlation theory may not be satisfied. Semi-quantitative spectrographic analyses which are reported as grouped or partly ranked data can be used to calculate rank correlations between elements. ?? 1957.

  4. Effect of inhibitory feedback on correlated firing of spiking neural network.

    PubMed

    Xie, Jinli; Wang, Zhijie

    2013-08-01

    Understanding the properties and mechanisms that generate different forms of correlation is critical for determining their role in cortical processing. Research on the retina, visual cortex, sensory cortex, and computational models has suggested that fast correlation with high temporal precision is consistent with common input, and correlation on a slow time scale likely involves feedback. Based on a feedback spiking neural network model, we investigate the role of inhibitory feedback in shaping correlations on a time scale of 100 ms. Notably, the relationship between the correlation coefficient and inhibitory feedback strength is non-monotonic. Further, computational simulations show how firing rate and oscillatory activity form the basis of the mechanisms underlying this relationship. When the mean firing rate is held constant, the correlation coefficient increases monotonically with inhibitory feedback, but the correlation coefficient keeps decreasing when the network has no oscillatory activity. Our findings reveal that two opposing effects of the inhibitory feedback on the firing activity of the network contribute to the non-monotonic relationship between the correlation coefficient and the strength of the inhibitory feedback. The inhibitory feedback affects the correlated firing activity by modulating the intensity and regularity of the spike trains. Finally, the non-monotonic relationship is replicated with varying transmission delays and different spatial network structures, demonstrating the universality of the results.

  5. Correcting Coefficient Alpha for Correlated Errors: Is αK a Lower Bound to Reliability?

    ERIC Educational Resources Information Center

    Rae, Gordon

    2006-01-01

    When errors of measurement are positively correlated, coefficient alpha may overestimate the "true" reliability of a composite. To reduce this inflation bias, Komaroff (1997) has proposed an adjusted alpha coefficient, αK. This article shows that αK is only guaranteed to be a lower bound to reliability if the latter does not include correlated…

  6. Prediction of stream volatilization coefficients

    USGS Publications Warehouse

    Rathbun, Ronald E.

    1990-01-01

    Equations are developed for predicting the liquid-film and gas-film reference-substance parameters for quantifying volatilization of organic solutes from streams. Molecular weight and molecular-diffusion coefficients of the solute are used as correlating parameters. Equations for predicting molecular-diffusion coefficients of organic solutes in water and air are developed, with molecular weight and molal volume as parameters. Mean absolute errors of prediction for diffusion coefficients in water are 9.97% for the molecular-weight equation, 6.45% for the molal-volume equation. The mean absolute error for the diffusion coefficient in air is 5.79% for the molal-volume equation. Molecular weight is not a satisfactory correlating parameter for diffusion in air because two equations are necessary to describe the values in the data set. The best predictive equation for the liquid-film reference-substance parameter has a mean absolute error of 5.74%, with molal volume as the correlating parameter. The best equation for the gas-film parameter has a mean absolute error of 7.80%, with molecular weight as the correlating parameter.

  7. Measuring hepatic functional reserve using T1 mapping of Gd-EOB-DTPA enhanced 3T MR imaging: A preliminary study comparing with 99mTc GSA scintigraphy and signal intensity based parameters.

    PubMed

    Nakagawa, Masataka; Namimoto, Tomohiro; Shimizu, Kie; Morita, Kosuke; Sakamoto, Fumi; Oda, Seitaro; Nakaura, Takeshi; Utsunomiya, Daisuke; Shiraishi, Shinya; Yamashita, Yasuyuki

    2017-07-01

    To determine the utility of liver T1-mapping on gadolinium-ethoxybenzyl-diethylenetriamine-pentaacetic acid (Gd-EOB-DTPA) enhanced magnetic resonance (MR) imaging for the measurement of liver functional reserve compared with the signal intensity (SI) based parameters, technetium-99m-galactosyl serum albumin (99mTc-GSA) scintigraphy and indocyanine green (ICG) clearance. This retrospective study included 111 patients (Child-Pugh A, 90; B, 21) who underwent both Gd-EOB-DTPA enhanced liver MR imaging and 99mTc-GSA scintigraphy (76 patients also underwent ICG). Receiver operating characteristic (ROC) curve analysis was performed to compare the diagnostic performance of T1-relaxation-time parameters [pre- (T1pre) and post-contrast (T1hb) Gd-EOB-DTPA], SI based parameters [relative enhancement (RE), liver-to-muscle ratio (LMR), liver-to-spleen ratio (LSR)] and the 99mTc-GSA scintigraphy blood clearance index (HH15) for Child-Pugh classification. Pearson's correlation was used for comparisons among T1-relaxation-time parameters, SI-based parameters, HH15 and ICG. A significant difference was obtained for Child-Pugh classification with T1hb, ΔT1, all SI based parameters and HH15. T1hb had the highest AUC, followed by RE, LMR, LSR, ΔT1, HH15 and T1pre. The correlation coefficients with HH15 were T1pre 0.22, T1hb 0.53 and ΔT1 -0.38 for the T1 relaxation parameters, and RE -0.44, LMR -0.45 and LSR -0.43 for the SI-based parameters; T1hb showed the strongest correlation with HH15. The correlation coefficients with ICG were T1pre 0.29, T1hb 0.64 and ΔT1 -0.42 for the T1 relaxation parameters, RE -0.50, LMR -0.61 and LSR -0.58 for the SI-based parameters, and 0.64 for HH15; both T1hb and HH15 showed the strongest correlation with ICG. The T1 relaxation time post-contrast with Gd-EOB-DTPA (T1hb) was strongly correlated with ICG clearance and moderately correlated with HH15 from 99mTc-GSA scintigraphy. T1hb has the potential to provide a robust parameter of liver functional reserve. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Two-Way Gene Interaction From Microarray Data Based on Correlation Methods

    PubMed Central

    Alavi Majd, Hamid; Talebi, Atefeh; Gilany, Kambiz; Khayyer, Nasibeh

    2016-01-01

    Background Gene networks have generated a massive explosion in the development of high-throughput techniques for monitoring various aspects of gene activity. Networks offer a natural way to model interactions between genes, and extracting gene network information from high-throughput genomic data is an important and difficult task. Objectives The purpose of this study is to construct a two-way gene network based on parametric and nonparametric correlation coefficients. The first step in constructing a Gene Co-expression Network is to score all pairs of gene vectors. The second step is to select a score threshold and connect all gene pairs whose scores exceed this value. Materials and Methods In the foundation-application study, we constructed two-way gene networks using nonparametric methods, such as Spearman’s rank correlation coefficient and Blomqvist’s measure, and compared them with Pearson’s correlation coefficient. We surveyed six genes of venous thrombosis disease, made a matrix entry representing the score for the corresponding gene pair, and obtained two-way interactions using Pearson’s correlation, Spearman’s rank correlation, and Blomqvist’s coefficient. Finally, these methods were compared with Cytoscape, based on BIND, and Gene Ontology, based on molecular function visual methods; R software version 3.2 and Bioconductor were used to perform these methods. Results Based on the Pearson and Spearman correlations, the results were the same and were confirmed by Cytoscape and GO visual methods; however, Blomqvist’s coefficient was not confirmed by visual methods. Conclusions Some of the correlation-coefficient results do not agree with the visualization methods, possibly because of the small amount of data. PMID:27621916

  9. Python Waveform Cross-Correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Templeton, Dennise

    PyWCC is a tool to compute seismic waveform cross-correlation coefficients on single-component or multiple-component seismic data across a network of seismic sensors. PyWCC compares waveform data templates with continuous seismic data, associates the resulting detections, identifies the template with the highest cross-correlation coefficient, and outputs a catalog of detections above a user-defined absolute cross-correlation threshold value.
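
    A minimal single-channel sketch of the underlying template-matching step (normalized cross-correlation of a template against continuous data, thresholded to produce detections); the function below is illustrative only and is not the PyWCC interface itself.

```python
import numpy as np

def detect_by_template(continuous, template, threshold=0.7):
    """Slide a waveform template along continuous data, compute the normalized
    cross-correlation coefficient at each lag, and return samples where it
    exceeds a threshold.  Minimal sketch, not the actual PyWCC code.
    """
    m = len(template)
    t = (template - template.mean()) / (template.std() * m)
    cc = np.empty(len(continuous) - m + 1)
    for lag in range(cc.size):
        seg = continuous[lag:lag + m]
        cc[lag] = np.sum(t * (seg - seg.mean())) / (seg.std() + 1e-12)
    detections = np.flatnonzero(np.abs(cc) >= threshold)
    return detections, cc

# Hypothetical data: noise with two buried copies of the template waveform
rng = np.random.default_rng(7)
template = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)
data = rng.normal(0, 0.3, 5000)
data[1000:1200] += template
data[3500:3700] += 0.8 * template

hits, cc = detect_by_template(data, template, threshold=0.7)
print("detections near samples:", hits[:5], "...", "max CC =", round(cc.max(), 2))
```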

  10. Hadamard multimode optical imaging transceiver

    DOEpatents

    Cooke, Bradly J; Guenther, David C; Tiee, Joe J; Kellum, Mervyn J; Olivas, Nicholas L; Weisse-Bernstein, Nina R; Judd, Stephen L; Braun, Thomas R

    2012-10-30

    Disclosed is a method and system for simultaneously acquiring and producing results for multiple image modes using a common sensor without optical filtering, scanning, or other moving parts. The system and method utilize the Walsh-Hadamard correlation detection process (e.g., functions/matrix) to provide an all-binary structure that permits seamless bridging between analog and digital domains. An embodiment may capture an incoming optical signal at an optical aperture, convert the optical signal to an electrical signal, pass the electrical signal through a Low-Noise Amplifier (LNA) to create an LNA signal, pass the LNA signal through one or more correlators where each correlator has a corresponding Walsh-Hadamard (WH) binary basis function, calculate a correlation output coefficient for each correlator as a function of the corresponding WH binary basis function in accordance with Walsh-Hadamard mathematical principles, digitize each of the correlation output coefficient by passing each correlation output coefficient through an Analog-to-Digital Converter (ADC), and performing image mode processing on the digitized correlation output coefficients as desired to produce one or more image modes. Some, but not all, potential image modes include: multi-channel access, temporal, range, three-dimensional, and synthetic aperture.
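
    The correlation step can be illustrated digitally: each Walsh-Hadamard basis function is a ±1 sequence, and multiplying the signal by it and summing yields one output coefficient; because the basis is orthogonal, the signal can be recovered from the coefficients. The sizes and signal below are arbitrary, and the patented device performs this correlation in analog hardware before the ADC rather than in software.

```python
import numpy as np
from scipy.linalg import hadamard

# Each "correlator" multiplies the signal by one binary (+1/-1) Walsh-Hadamard
# basis function and integrates, yielding one output coefficient.  Sizes and
# the test signal are arbitrary example values.
n = 64                                   # number of basis functions / samples
H = hadamard(n)                          # rows are +1/-1 Walsh-Hadamard functions

rng = np.random.default_rng(8)
signal = np.sin(2 * np.pi * 3 * np.arange(n) / n) + 0.1 * rng.normal(size=n)

coefficients = H @ signal / n            # correlation output coefficients
reconstructed = H.T @ coefficients       # H is orthogonal up to a factor of n,
                                         # so the signal is recovered exactly
print("max reconstruction error:", np.max(np.abs(reconstructed - signal)))
```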

  11. Relative validity of an FFQ to estimate daily food and nutrient intakes for Chilean adults.

    PubMed

    Dehghan, Mahshid; Martinez, Solange; Zhang, Xiaohe; Seron, Pamela; Lanas, Fernando; Islam, Shofiqul; Merchant, Anwar T

    2013-10-01

    FFQ are commonly used to rank individuals by their food and nutrient intakes in large epidemiological studies. The purpose of the present study was to develop and validate an FFQ to rank individuals participating in an ongoing Prospective Urban and Rural Epidemiological (PURE) study in Chile. An FFQ and four 24 h dietary recalls were completed over 1 year. Pearson correlation coefficients, energy-adjusted and de-attenuated correlations and weighted kappa were computed between the dietary recalls and the FFQ. The level of agreement between the two dietary assessment methods was evaluated by Bland-Altman analysis. Temuco, Chile. Overall, 166 women and men enrolled in the present study. One hundred men and women participated in FFQ development and sixty-six individuals participated in FFQ validation. The FFQ consisted of 109 food items. For nutrients, the crude correlation coefficients between the dietary recalls and FFQ varied from 0.14 (protein) to 0.44 (fat). Energy adjustment and de-attenuation improved correlation coefficients and almost all correlation coefficients exceeded 0.40. Similar correlation coefficients were observed for food groups; the highest de-attenuated energy adjusted correlation coefficient was found for margarine and butter (0.75) and the lowest for potatoes (0.12). The FFQ showed moderate to high agreement for most nutrients and food groups, and can be used to rank individuals based on energy, nutrient and food intakes. The validation study was conducted in a unique setting and indicated that the tool is valid for use by adults in Chile.
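
    The de-attenuation mentioned above corrects the crude FFQ-versus-recall correlation for day-to-day variation in the 24 h recalls. A minimal sketch follows, assuming the standard correction r_true = r_obs * sqrt(1 + λ/n), where λ is the within- to between-person variance ratio of the recalls; the numbers are examples, not values from this study.

```python
import numpy as np

def deattenuated_r(r_observed, n_recalls, within_to_between_variance_ratio):
    """De-attenuate an FFQ-vs-recall correlation for within-person variation
    in the reference recalls: r_true = r_obs * sqrt(1 + ratio / n).
    The variance ratio below is a made-up example value.
    """
    return r_observed * np.sqrt(1 + within_to_between_variance_ratio / n_recalls)

# Example: crude r = 0.30 from 4 recalls, within/between variance ratio of 2
print(round(deattenuated_r(0.30, 4, 2.0), 2))   # ~0.37
```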

  12. [Habitat suitability index of larval Japanese Halfbeak (Hyporhamphus sajori) in Bohai Sea based on geographically weighted regression].

    PubMed

    Zhao, Yang; Zhang, Xue Qing; Bian, Xiao Dong

    2018-01-01

    To investigate the early supplementary processes of fish resources in the Bohai Sea, geographically weighted regression (GWR) was introduced into the habitat suitability index (HSI) model. The Bohai Sea larval Japanese Halfbeak HSI GWR model was established with four environmental variables, including sea surface temperature (SST), sea surface salinity (SSS), water depth (DEP), and chlorophyll a concentration (Chl a). Results of the simulation showed that the four variables performed differently in August 2015. SST and Chl a were global variables and had little impact on HSI, with regression coefficients of -0.027 and 0.006, respectively. SSS and DEP were local variables and had larger impacts on HSI, with average absolute regression coefficients of 0.075 and 0.129, respectively. In the central Bohai Sea, SSS showed a negative correlation with HSI, and the most negative correlation coefficient was -0.3. In contrast, SSS was correlated positively but weakly with HSI in the three bays of the Bohai Sea, and the largest correlation coefficient was 0.1. In particular, DEP and HSI were negatively correlated in the entire Bohai Sea, and they were more negatively correlated in the three bays than in the central Bohai Sea, with the most negative correlation coefficient of -0.16 in the three bays. The Poisson regression coefficient of the HSI GWR model was 0.705, consistent with field measurements. Therefore, the approach could provide a new method for research on fish habitats in the future.

  13. Psychometric properties of the modified RESIDE physical activity questionnaire among low-income overweight women.

    PubMed

    Jones, Sydney A; Evenson, Kelly R; Johnston, Larry F; Trost, Stewart G; Samuel-Hodge, Carmen; Jewell, David A; Kraschnewski, Jennifer L; Keyserling, Thomas C

    2015-01-01

    This study explored the criterion-related validity and test-retest reliability of the modified RESIDential Environment physical activity questionnaire and whether the instrument's validity varied by body mass index, education, race/ethnicity, or employment status. Validation study using baseline data collected for randomized trial of a weight loss intervention. Participants recruited from health departments wore an ActiGraph accelerometer and self-reported non-occupational walking, moderate and vigorous physical activity on the modified RESIDential Environment questionnaire. We assessed validity (n=152) using Spearman correlation coefficients, and reliability (n=57) using intraclass correlation coefficients. When compared to steps, moderate physical activity, and bouts of moderate/vigorous physical activity measured by accelerometer, these questionnaire measures showed fair evidence for validity: recreational walking (Spearman correlation coefficients 0.23-0.36), total walking (Spearman correlation coefficients 0.24-0.37), and total moderate physical activity (Spearman correlation coefficients 0.18-0.36). Correlations for self-reported walking and moderate physical activity were higher among unemployed participants and women with lower body mass indices. Generally no other variability in the validity of the instrument was found. Evidence for reliability of RESIDential Environment measures of recreational walking, total walking, and total moderate physical activity was substantial (intraclass correlation coefficients 0.56-0.68). Evidence for questionnaire validity and reliability varied by activity domain and was strongest for walking measures. The questionnaire may capture physical activity less accurately among women with higher body mass indices and employed participants. Capturing occupational activity, specifically walking at work, may improve questionnaire validity. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  14. Nonlinearity of the forward-backward correlation function in the model with string fusion

    NASA Astrophysics Data System (ADS)

    Vechernin, Vladimir

    2017-12-01

    The behavior of the forward-backward correlation functions and the corresponding correlation coefficients between multiplicities and transverse momenta of particles produced in high energy hadronic interactions is analyzed by analytical and MC calculations in models with and without string fusion. String fusion is taken into account in a simplified form by introducing a lattice in the transverse plane. The results obtained with two alternative definitions of the forward-backward correlation coefficient are compared. It is shown that the nonlinearity of the correlation functions increases with the width of the observation windows, leading, at small string density, to a strong dependence of the correlation coefficient value on the definition. The results of the modeling enable a qualitative explanation of the experimentally observed features in the behavior of the correlation functions between multiplicities and mean transverse momenta at small and large multiplicities.

  15. Scaling for Robust Empirical Modeling and Predictions of Net Ecosystem Exchange (NEE) from Diverse Wetland Ecosystems

    NASA Astrophysics Data System (ADS)

    Ishtiaq, K. S.; Abdul-Aziz, O. I.

    2014-12-01

    We developed a scaling-based, simple empirical model for spatio-temporally robust prediction of the diurnal cycles of wetland net ecosystem exchange (NEE) by using an extended stochastic harmonic algorithm (ESHA). A reference-time observation from each diurnal cycle was utilized as the scaling parameter to normalize and collapse hourly observed NEE of different days into a single, dimensionless diurnal curve. The modeling concept was tested by parameterizing the unique diurnal curve and predicting hourly NEE of May to October (summer growing and fall seasons) between 2002-12 for diverse wetland ecosystems, as available in the U.S. AmeriFLUX network. As an example, the Taylor Slough short hydroperiod marsh site in the Florida Everglades had data for four consecutive growing seasons from 2009-12; results showed impressive modeling efficiency (coefficient of determination, R2 = 0.66) and accuracy (ratio of root-mean-square-error to the standard deviation of observations, RSR = 0.58). Model validation was performed with an independent year of NEE data, indicating equally impressive performance (R2 = 0.68, RSR = 0.57). The model included a parsimonious set of estimated parameters, which exhibited spatio-temporal robustness by collapsing onto narrow ranges. Model robustness was further investigated by analytically deriving and quantifying parameter sensitivity coefficients and a first-order uncertainty measure. The relatively robust, empirical NEE model can be applied for simulating continuous (e.g., hourly) NEE time-series from a single reference observation (or a set of limited observations) at different wetland sites of comparable hydro-climatology, biogeochemistry, and ecology. The method can also be used for a robust gap-filling of missing data in observed time-series of periodic ecohydrological variables for wetland or other ecosystems.
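
    The scaling step described above, normalizing each day's hourly NEE by a reference-time observation so the days collapse onto one dimensionless curve, can be sketched as follows together with the R2 and RSR skill measures; the reference hour, diurnal shape, and noise levels are invented for illustration and are not from the AmeriFLUX sites.

```python
import numpy as np

def collapse_diurnal_cycles(nee, ref_hour=12):
    """Scale each day's hourly NEE by the observation at a reference hour,
    collapsing the days onto one dimensionless diurnal curve (the scaling
    idea described above; the reference hour is an arbitrary choice).
    `nee` has shape (n_days, 24).
    """
    ref = nee[:, ref_hour:ref_hour + 1]
    return nee / ref

def model_skill(obs, pred):
    """R2 and RSR (RMSE divided by the standard deviation of observations)."""
    resid = obs - pred
    r2 = 1 - np.sum(resid**2) / np.sum((obs - obs.mean())**2)
    rsr = np.sqrt(np.mean(resid**2)) / obs.std()
    return r2, rsr

# Hypothetical data: 30 days whose diurnal NEE differs mainly in amplitude
rng = np.random.default_rng(9)
hours = np.arange(24)
shape = -np.maximum(np.sin((hours - 6) / 12 * np.pi), 0)            # daytime uptake
nee = shape * rng.uniform(5, 15, size=(30, 1)) + rng.normal(0, 0.3, (30, 24))

scaled = collapse_diurnal_cycles(nee)
mean_curve = scaled.mean(axis=0)                 # the single dimensionless curve
r2, rsr = model_skill(scaled.ravel(), np.tile(mean_curve, 30))
print(f"R2 = {r2:.2f}, RSR = {rsr:.2f}")
```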

  16. Measuring effusion rates of obsidian lava flows by means of satellite thermal data

    NASA Astrophysics Data System (ADS)

    Coppola, D.; Laiolo, M.; Franchi, A.; Massimetti, F.; Cigolini, C.; Lara, L. E.

    2017-11-01

    Space-based thermal data are increasingly used for monitoring effusive eruptions, especially for calculating lava discharge rates and forecasting hazards related to basaltic lava flows. The application of this methodology to silicic, more viscous lava bodies (such as obsidian lava flows) is much less frequent, with only a few examples documented in recent decades. The 2011-2012 eruption of Cordón Caulle volcano (Chile) produced a voluminous obsidian lava flow (0.6 km³) and offers an exceptional opportunity to analyze the relationship between heat and volumetric flux for such types of viscous lava bodies. Based on a retrospective analysis of MODIS infrared data (MIROVA system), we found that the energy radiated by the active lava flow is robustly correlated with the erupted lava volume, measured independently. We found that after a transient time of about 15 days, the coefficient of proportionality between radiant and volumetric flux becomes almost steady and stabilizes around a value of 5 × 10⁶ J m⁻³. This coefficient (i.e. the radiant density) is much lower than those found for basalts (1 × 10⁸ J m⁻³) and likely reflects the spreading and cooling properties of the highly insulated, viscous flows. The effusion rate trend inferred from MODIS data correlates well with the tremor amplitude and with the plume elevation recorded throughout the eruption, thus suggesting a link between the effusive and the coeval explosive activity. Modelling of the eruptive trend indicates that the Cordón Caulle eruption occurred in two stages, either incompletely draining a single magma reservoir or more probably tapping multiple interconnected magmatic compartments.
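
    The radiant density acts as a conversion factor between satellite-derived radiative power and lava discharge rate (discharge ≈ radiative power / radiant density). The snippet below is only a worked example with an arbitrary radiative-power value, not a measurement from the eruption.

```python
# Illustrative use of the radiant-density relationship described above.
RADIANT_DENSITY_OBSIDIAN = 5e6    # J per cubic metre (silicic, insulated flows)
RADIANT_DENSITY_BASALT = 1e8      # J per cubic metre, shown for comparison

radiative_power_watts = 5e8       # example radiative power from one overpass

for name, c_rad in [("obsidian", RADIANT_DENSITY_OBSIDIAN),
                    ("basalt", RADIANT_DENSITY_BASALT)]:
    discharge = radiative_power_watts / c_rad          # m^3 per second
    print(f"{name}: {discharge:.1f} m^3/s "
          f"({discharge * 86400:.0f} m^3 per day)")
```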

  17. Reliability and Validity of the Persian Language Version of the International Consultation on Incontinence Questionnaire - Male Lower Urinary Tract Symptoms (ICIQ-MLUTS).

    PubMed

    Pourmomeny, Abbas Ali; Ghanei, Behnaz; Alizadeh, Farshid

    2018-05-01

    Assessment instruments are essential for research, allowing diagnosis and evaluation of treatment outcomes in subjects with lower urinary tract disorders of both genders. The purpose of this study was to translate the Male Lower Urinary Tract Symptoms (MLUTS) Questionnaire and determine its psychometric properties in Persian subjects. After obtaining permission from the International Consultation on Incontinence Modular Questionnaire (ICIQ) web site, the forward and backward translation of the MLUTS questionnaire was carried out by the research team. The content/face validity, construct validity and reliability were assessed in a sample of Iranian patients with MLUTS; internal consistency was measured with Cronbach's alpha. In total, 121 male patients were included in the study. The mean age of the patients was 60.5 years. The Cronbach's alpha value was 0.757, confirming the internal consistency of the instrument (r > 0.7). The internal consistency of each question was examined separately and found to be over 0.7. For the evaluation of test-retest reliability, the questionnaire was administered to 20% of the patients a second time after an interval of 1-2 weeks. The intraclass correlation coefficient (ICC) score was 0.901. The correlation coefficient between the MLUTS and the International Prostate Symptom Score (IPSS) was 0.879. The ICIQ-MLUTS is a robust instrument, which can be used for evaluating male LUTS in Persian patients. We believe that the Persian version of the MLUTS is an important tool for research and clinical settings. © 2017 John Wiley & Sons Australia, Ltd.

  18. Validation of the McClear clear-sky model in desert conditions with three stations in Israel

    NASA Astrophysics Data System (ADS)

    Lefèvre, Mireille; Wald, Lucien

    2016-03-01

    The new McClear clear-sky model, a fast model based on a radiative transfer solver, exploits the atmospheric properties provided by the EU-funded Copernicus Atmosphere Monitoring Service (CAMS) to estimate the solar direct and global irradiances received at ground level in cloud-free conditions at any place and any time. The work presented here focuses on desert conditions and compares the McClear irradiances to coincident 1 min measurements made in clear-sky conditions at three stations in Israel that are less than 100 km apart. The bias for global irradiance lies between 2 and 32 W m-2, i.e. between 0 and 4 % of the mean observed irradiance (approximately 830 W m-2). The RMSE ranges from 30 to 41 W m-2 (4 %) and the squared correlation coefficient is greater than 0.976. The bias for the direct irradiance at normal incidence (DNI) lies between -68 and +13 W m-2, i.e. between -8 and 2 % of the mean observed DNI (approximately 840 W m-2). The RMSE ranges from 53 (7 %) to 83 W m-2 (10 %). The squared correlation coefficient is close to 0.6. The performances are similar for the three sites for the global irradiance and, to a lesser extent, for the DNI, demonstrating the robustness of the McClear model combined with CAMS products. These results are discussed in the light of those obtained by McClear for other desert areas in Egypt and the United Arab Emirates.
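
    The bias, RMSE and squared correlation coefficient used above as validation statistics can be computed in a few lines; the irradiance arrays below are placeholders rather than McClear or station data.

```python
import numpy as np

def validation_stats(observed, modeled):
    """Bias, RMSE and squared correlation coefficient of model versus observations."""
    observed = np.asarray(observed, dtype=float)
    modeled = np.asarray(modeled, dtype=float)
    bias = np.mean(modeled - observed)
    rmse = np.sqrt(np.mean((modeled - observed) ** 2))
    r2 = np.corrcoef(observed, modeled)[0, 1] ** 2
    return bias, rmse, r2

# placeholder 1 min clear-sky global irradiances (W m-2), not real station data
obs = np.array([650.0, 720.0, 810.0, 870.0, 900.0])
mod = np.array([660.0, 735.0, 820.0, 875.0, 915.0])
print(validation_stats(obs, mod))
```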

  19. Design of Novel Chemotherapeutic Agents Targeting Checkpoint Kinase 1 Using 3D-QSAR Modeling and Molecular Docking Methods.

    PubMed

    Balupuri, Anand; Balasubramanian, Pavithra K; Cho, Seung J

    2016-01-01

    Checkpoint kinase 1 (Chk1) has emerged as a potential therapeutic target for design and development of novel anticancer drugs. Herein, we have performed three-dimensional quantitative structure-activity relationship (3D-QSAR) and molecular docking analyses on a series of diazacarbazoles to design potent Chk1 inhibitors. 3D-QSAR models were developed using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques. Docking studies were performed using AutoDock. The best CoMFA and CoMSIA models exhibited cross-validated correlation coefficient (q2) values of 0.631 and 0.585, and non-cross-validated correlation coefficient (r2) values of 0.933 and 0.900, respectively. CoMFA and CoMSIA models showed reasonable external predictabilities (r2 pred) of 0.672 and 0.513, respectively. A satisfactory performance in the various internal and external validation techniques indicated the reliability and robustness of the best model. Docking studies were performed to explore the binding mode of inhibitors inside the active site of Chk1. Molecular docking revealed that hydrogen bond interactions with Lys38, Glu85 and Cys87 are essential for Chk1 inhibitory activity. The binding interaction patterns observed during docking studies were complementary to 3D-QSAR results. Information obtained from the contour map analysis was utilized to design novel potent Chk1 inhibitors. Their activities and binding affinities were predicted using the derived model and docking studies. Designed inhibitors were proposed as potential candidates for experimental synthesis.
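
    For readers unfamiliar with the q² statistic reported above, the sketch below computes a leave-one-out cross-validated correlation coefficient, q² = 1 − PRESS/SS_tot, for a generic regression model; the random descriptor block, activities and the PLS regressor are illustrative stand-ins, not the published CoMFA/CoMSIA fields.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

def loo_q2(X, y, n_components=2):
    """Leave-one-out q2 = 1 - PRESS / SS_tot, a common 3D-QSAR validation statistic."""
    y = np.asarray(y, dtype=float)
    press = 0.0
    for train, test in LeaveOneOut().split(X):
        model = PLSRegression(n_components=n_components).fit(X[train], y[train])
        press += float((model.predict(X[test]).ravel()[0] - y[test][0]) ** 2)
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

# illustrative random descriptor block and activities (not real molecular fields)
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 8))
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=20)
print(round(loo_q2(X, y), 3))
```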

  20. The relationship between the managerial skills and results of "performance evaluation "tool among nursing managers in teaching hospitals of Iran University of Medical Science.

    PubMed

    Isfahani, Haleh Mousavi; Aryankhesal, Aidin; Haghani, Hamid

    2014-09-25

    Performance of different organizations, such as hospitals, is mainly influenced by their managers' performance. Nursing managers have an important role in hospital performance, and their managerial skills can improve the quality of services. Hence, the present study was conducted in order to assess the relationship between managerial skills and the results of performance evaluation in teaching hospitals of Iran University of Medical Sciences in 2013. The research used a cross-sectional method in 2013. It was done by distributing a managerial skills assessment questionnaire, with close-ended questions on a 5-choice Likert scale, among 181 managers and head nurses of hospitals of Iran University of Medical Sciences, of whom 131 answered the questions. Another data collection tool was a form to record evaluation marks from the personnel records. We used Pearson and Spearman correlation tests and SPSS for analysis and description (frequency, mean and standard deviation). Results showed that the managerial skills of the nursing managers were fair (2.57 out of 5) and the results of the performance evaluation were in good condition (98.44). The managers' evaluation results and the managerial skills scores were not meaningfully correlated (r = 0.047, p = 0.856). The research showed no correlation between different domains of managerial skills and the performance evaluation marks: decision-making skills (r = 0.074, p = 0.399), leadership (correlation coefficient 0.028, p = 0.654), motivation (correlation coefficient 0.118, p = 0.163), communication (correlation coefficient 0.116, p = 0.122), systematic thinking (correlation coefficient 0.028, p = 0.828), time management (correlation coefficient 0.077, p = 0.401) and strategic thinking (correlation coefficient 0.041, p = 0.756). The lack of any correlation between managers' managerial skills and their performance evaluation results indicates the need for a fundamental revision of the managers' performance evaluation form.

  1. Application of 3-D Urbanization Index to Assess Impact of Urbanization on Air Temperature

    NASA Astrophysics Data System (ADS)

    Wu, Chih-Da; Lung, Shih-Chun Candice

    2016-04-01

    The lack of appropriate methodologies and indicators to quantify three-dimensional (3-D) building constructions poses challenges to authorities and urban planners when formulating policies to reduce health risks due to heat stress. This study evaluated the applicability of an innovative three-dimensional Urbanization Index (3DUI), based on a remote sensing database with a 5 m spatial resolution of 3-D man-made constructions, in representing intra-urban variability of air temperature by assessing the correlation of 3DUI with air temperature from a 3-D perspective. The results showed robust, high correlation coefficients, ranging from 0.83 to 0.85, obtained within the 1,000 m circular buffer around weather stations regardless of season, year, or spatial location. Our findings demonstrated not only the strength of 3DUI in representing intra-urban air-temperature variability, but also its great potential for heat stress assessment within cities. In view of the maximum correlation between building volumes within the 1,000 m circular buffer and ambient air temperature, urban planning should consider setting ceilings on man-made construction volume in each 2 × 2 km2 residential community for thermal environment regulation, especially in Asian metropolises with high population density in city centers.

  2. Dual-color dual-focus line-scanning FCS for quantitative analysis of receptor-ligand interactions in living specimens.

    PubMed

    Dörlich, René M; Chen, Qing; Niklas Hedde, Per; Schuster, Vittoria; Hippler, Marc; Wesslowski, Janine; Davidson, Gary; Nienhaus, G Ulrich

    2015-05-07

    Cellular communication in multi-cellular organisms is mediated to a large extent by a multitude of cell-surface receptors that bind specific ligands. An in-depth understanding of cell signaling networks requires quantitative information on ligand-receptor interactions within living systems. In principle, fluorescence correlation spectroscopy (FCS) based methods can provide such data, but live-cell applications have proven extremely challenging. Here, we have developed an integrated dual-color dual-focus line-scanning fluorescence correlation spectroscopy (2c2f lsFCS) technique that greatly facilitates live-cell and tissue experiments. Absolute ligand and receptor concentrations and their diffusion coefficients within the cell membrane can be quantified without the need to perform additional calibration experiments. We also determine the concentration of ligands diffusing in the medium outside the cell within the same experiment by using a raster image correlation spectroscopy (RICS) based analysis. We have applied this robust technique to study the interactions of two Wnt antagonists, Dickkopf1 and Dickkopf2 (Dkk1/2), to their cognate receptor, low-density-lipoprotein-receptor related protein 6 (LRP6), in the plasma membrane of living HEK293T cells. We obtained significantly lower affinities than previously reported using in vitro studies, underscoring the need to measure such data on living cells or tissues.

  3. Application of 3-D Urbanization Index to Assess Impact of Urbanization on Air Temperature

    PubMed Central

    Wu, Chih-Da; Lung, Shih-Chun Candice

    2016-01-01

    The lack of appropriate methodologies and indicators to quantify three-dimensional (3-D) building constructions poses challenges to authorities and urban planners when formulating policies to reduce health risks due to heat stress. This study evaluated the applicability of an innovative three-dimensional Urbanization Index (3DUI), based on a remote sensing database with a 5 m spatial resolution of 3-D man-made constructions, in representing intra-urban variability of air temperature by assessing the correlation of 3DUI with air temperature from a 3-D perspective. The results showed robust, high correlation coefficients, ranging from 0.83 to 0.85, obtained within the 1,000 m circular buffer around weather stations regardless of season, year, or spatial location. Our findings demonstrated not only the strength of 3DUI in representing intra-urban air-temperature variability, but also its great potential for heat stress assessment within cities. In view of the maximum correlation between building volumes within the 1,000 m circular buffer and ambient air temperature, urban planning should consider setting ceilings on man-made construction volume in each 2 × 2 km2 residential community for thermal environment regulation, especially in Asian metropolises with high population density in city centers. PMID:27079537

  4. Just Noticeable Distortion Model and Its Application in Color Image Watermarking

    NASA Astrophysics Data System (ADS)

    Liu, Kuo-Cheng

    In this paper, a perceptually adaptive watermarking scheme for color images is proposed in order to achieve robustness and transparency. A new just noticeable distortion (JND) estimator for color images is first designed in the wavelet domain. The key issue of the JND model is to effectively integrate visual masking effects. The estimator is an extension of the perceptual model used in image coding for grayscale images. In addition to the visual masking effects computed coefficient by coefficient from the luminance content and texture of grayscale images, the cross-masking effect arising from the interaction between luminance and chrominance components and the effect of the variance within the local region of the target coefficient are investigated, such that the visibility threshold for the human visual system (HVS) can be evaluated. In a locally adaptive fashion based on the wavelet decomposition, the estimator applies to all subbands of luminance and chrominance components of color images and is used to measure the visibility of wavelet quantization errors. The subband JND profiles are then incorporated into the proposed color image watermarking scheme. Robustness and transparency of the watermarking scheme are obtained by using the proposed approach to embed the watermark at maximum strength while maintaining the perceptually lossless quality of the watermarked color image. Simulation results show that the proposed scheme, which inserts watermarks into both luminance and chrominance components, is more robust than the existing scheme while retaining watermark transparency.

  5. Adaptive low-rank subspace learning with online optimization for robust visual tracking.

    PubMed

    Liu, Risheng; Wang, Di; Han, Yuzhuo; Fan, Xin; Luo, Zhongxuan

    2017-04-01

    In recent years, sparse and low-rank models have been widely used to formulate appearance subspace for visual tracking. However, most existing methods only consider the sparsity or low-rankness of the coefficients, which is not sufficient for appearance subspace learning on complex video sequences. Moreover, as both the low-rank and the column sparse measures are tightly related to all the samples in the sequences, it is challenging to incrementally solve optimization problems with both nuclear norm and column sparse norm on sequentially obtained video data. To address the above limitations, this paper develops a novel low-rank subspace learning with adaptive penalization (LSAP) framework for subspace-based robust visual tracking. Different from previous work, which often simply decomposes observations as low-rank features and sparse errors, LSAP simultaneously learns the subspace basis, low-rank coefficients and column sparse errors to formulate appearance subspace. Within the LSAP framework, we introduce a Hadamard-product-based regularization to incorporate rich generative/discriminative structure constraints to adaptively penalize the coefficients for subspace learning. It is shown that such adaptive penalization can significantly improve the robustness of LSAP on severely corrupted datasets. To utilize LSAP for online visual tracking, we also develop an efficient incremental optimization scheme for nuclear norm and column sparse norm minimizations. Experiments on 50 challenging video sequences demonstrate that our tracker outperforms other state-of-the-art methods. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Pulmonary Catheterization Data Correlate Poorly with Renal Function in Heart Failure.

    PubMed

    Masha, Luke; Stone, James; Stone, Danielle; Zhang, Jun; Sheng, Luo

    2018-04-10

    The mechanisms of renal dysfunction in heart failure are poorly understood. We chose to explore the relationship of cardiac filling pressures and cardiac index (CI) in relation to renal dysfunction in advanced heart failure. To determine the relationship between renal function and cardiac filling pressures using the United Network of Organ Sharing (UNOS) pulmonary artery catheterization registry. Patients over the age of 18 years who were listed for single-organ heart transplantation were included. Exclusion criteria included a history of mechanical circulatory support, previous transplantation, any use of renal replacement therapy, prior history of malignancy, and cardiac surgery, amongst others. Correlations between serum creatinine (SCr) and CI, pulmonary capillary wedge pressure (PCWP), pulmonary artery systolic pressure (PASP), and pulmonary artery diastolic pressure (PADP) were assessed by Pearson correlation coefficients and simple linear regression coefficients. Pearson correlation coefficients between SCr and PCWP, PASP, and PADP were near zero with values of 0.1, 0.07, and 0.08, respectively (p < 0.0001). A weak negative correlation coefficient between SCr and CI was found (correlation coefficient, -0.045, p = 0.027). In a subgroup of young patients unlikely to have noncardiac etiologies, no significant correlations between these values were identified. These findings suggest that, as assessed by pulmonary artery catheterization, none of the factors - PCWP, PASP, PADP, or CI - plays a prominent role in cardiorenal syndromes. © 2018 S. Karger AG, Basel.

  7. Apparent Diffusion Coefficient and Dynamic Contrast-Enhanced Magnetic Resonance Imaging in Pancreatic Cancer: Characteristics and Correlation With Histopathologic Parameters.

    PubMed

    Ma, Wanling; Li, Na; Zhao, Weiwei; Ren, Jing; Wei, Mengqi; Yang, Yong; Wang, Yingmei; Fu, Xin; Zhang, Zhuoli; Larson, Andrew C; Huan, Yi

    2016-01-01

    To clarify diffusion and perfusion abnormalities and evaluate the correlation between the apparent diffusion coefficient (ADC), MR perfusion parameters and histopathologic parameters of pancreatic cancer (PC). Eighteen patients with PC underwent diffusion-weighted imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). Parameters of DCE-MRI and ADC of cancer and non-cancerous tissue were compared. Correlations between the rate constant that represents transfer of contrast agent from the arterial blood into the extravascular extracellular space (Ktrans), the volume of the extravascular extracellular space per unit volume of tissue (Ve), the ADC of PC, and histopathologic parameters were analyzed. The rate constant that represents transfer of contrast agent from the extravascular extracellular space into blood plasma (kep), Ktrans, the tissue volume fraction occupied by vascular space, and the ADC of PC were significantly lower than those of nontumoral pancreases. Ve of PC was significantly higher than that of nontumoral pancreas. Apparent diffusion coefficient and Ktrans values of PC were negatively correlated to fibrosis content and fibroblast activation protein staining score. Fibrosis content was positively correlated to Ve. Apparent diffusion coefficient values and parameters of DCE-MRI can differentiate PC from nontumoral pancreases. There are correlations between ADC, Ktrans, Ve, and fibrosis content of PC. Fibroblast activation protein staining score of PC is negatively correlated to ADC and Ktrans. Apparent diffusion coefficient, Ktrans, and Ve may be feasible to predict prognosis of PC.

  8. Catching ghosts with a coarse net: use and abuse of spatial sampling data in detecting synchronization

    PubMed Central

    2017-01-01

    Synchronization of population dynamics in different habitats is a frequently observed phenomenon. A common mathematical tool to reveal synchronization is the (cross)correlation coefficient between time courses of values of the population size of a given species where the population size is evaluated from spatial sampling data. The corresponding sampling net or grid is often coarse, i.e. it does not resolve all details of the spatial configuration, and the evaluation error—i.e. the difference between the true value of the population size and its estimated value—can be considerable. We show that this estimation error can make the value of the correlation coefficient very inaccurate or even irrelevant. We consider several population models to show that the value of the correlation coefficient calculated on a coarse sampling grid rarely exceeds 0.5, even if the true value is close to 1, so that the synchronization is effectively lost. We also observe ‘ghost synchronization’ when the correlation coefficient calculated on a coarse sampling grid is close to 1 but in reality the dynamics are not correlated. Finally, we suggest a simple test to check the sampling grid coarseness and hence to distinguish between the true and artifactual values of the correlation coefficient. PMID:28202589

  9. A comparison of two indices for the intraclass correlation coefficient.

    PubMed

    Shieh, Gwowen

    2012-12-01

    In the present study, we examined the behavior of two indices for measuring the intraclass correlation in the one-way random effects model: the prevailing ICC(1) (Fisher, 1938) and the corrected eta-squared (Bliese & Halverson, 1998). These two procedures differ both in their methods of estimating the variance components that define the intraclass correlation coefficient and in their bias and mean squared error when estimating that coefficient. In contrast with the natural unbiased principle used to construct ICC(1), in the present study it was analytically shown that the corrected eta-squared estimator is identical to the maximum likelihood estimator and the pairwise estimator under equal group sizes. Moreover, the empirical results obtained from the present Monte Carlo simulation study across various group structures revealed the mutual dominance relationship between their truncated versions for negative values. The corrected eta-squared estimator performs better than the ICC(1) estimator when the underlying population intraclass correlation coefficient is small. Conversely, ICC(1) has a clear advantage over the corrected eta-squared for medium and large magnitudes of population intraclass correlation coefficient. The conceptual description and numerical investigation provide guidelines to help researchers choose between the two indices for more accurate reliability analysis in multilevel research.
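
    As a companion to the comparison above, here is a minimal sketch of the classical ICC(1) computed from one-way random-effects ANOVA mean squares under equal group sizes; the grouped data are invented for illustration.

```python
import numpy as np

def icc1(groups):
    """ICC(1) = (MSB - MSW) / (MSB + (n - 1) * MSW) for equal group size n."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k, n = len(groups), len(groups[0])            # number of groups, size per group
    grand = np.mean(np.concatenate(groups))
    msb = n * sum((g.mean() - grand) ** 2 for g in groups) / (k - 1)
    msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

# illustrative clustered data: three groups of four observations each
print(round(icc1([[5, 6, 7, 6], [9, 10, 9, 11], [4, 5, 4, 5]]), 3))
```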

  10. Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares

    NASA Technical Reports Server (NTRS)

    Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.

    2012-01-01

    A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.

  11. Prediction of mass transfer coefficient in rotating bed contactor (Higee) using artificial neural network

    NASA Astrophysics Data System (ADS)

    Saha, Dipendu

    2009-02-01

    The feasibility of drastically reducing the contactor size in mass transfer processes utilizing a centrifugal field has generated a lot of interest in the rotating packed bed (Higee). Various investigators have proposed correlations to predict mass transfer coefficients in Higee, but none of these correlations was accurate to better than 20-30%. In this work, an artificial neural network (ANN) is employed for predicting mass transfer coefficient data. Results show that the ANN provides a better estimation of the mass transfer coefficient, with an accuracy of 5-15%.
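
    A minimal sketch of the approach described above: a small feed-forward neural network is fitted to map operating variables to a mass transfer coefficient. The input variables, their ranges and the target surrogate are placeholders, not the experimental Higee dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# placeholder inputs: liquid flow rate, rotational speed, packing radius ratio
X = rng.uniform([0.1, 300.0, 1.2], [1.0, 1500.0, 2.0], size=(200, 3))
# placeholder target: a smooth nonlinear surrogate for the mass transfer coefficient
y = 1e-4 * X[:, 0] ** 0.6 * (X[:, 1] / 1000.0) ** 0.4 * X[:, 2]

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                   random_state=0))
model.fit(X[:150], y[:150])                       # train on the first 150 samples
pred = model.predict(X[150:])                     # predict the held-out 50 samples
print(np.mean(np.abs(pred - y[150:]) / y[150:]))  # mean relative error
```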

  12. Intraclass Correlation Coefficients in Hierarchical Design Studies with Discrete Response Variables: A Note on a Direct Interval Estimation Procedure

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling procedure that can be used to evaluate intraclass correlation coefficients in two-level settings with discrete response variables is discussed. The approach is readily applied when the purpose is to furnish confidence intervals at prespecified confidence levels for these coefficients in setups with binary or ordinal…

  13. Precancerous esophageal epithelia are associated with significantly increased scattering coefficients

    PubMed Central

    Su, Jing-Wei; Lin, Yang-Hsien; Chiang, Chun-Ping; Lee, Jang-Ming; Hsieh, Chao-Mao; Hsieh, Min-Shu; Yang, Pei-Wen; Wang, Chen-Ping; Tseng, Ping-Huei; Lee, Yi-Chia; Sung, Kung-Bin

    2015-01-01

    The progression of epithelial precancers into cancer is accompanied by changes of tissue and cellular structures in the epithelium. Correlations between the structural changes and scattering coefficients of esophageal epithelia were investigated using quantitative phase images and the scattering-phase theorem. An ex vivo study of 14 patients demonstrated that the average scattering coefficient of precancerous epithelia was 37.8% higher than that of normal epithelia from the same patient. The scattering coefficients were highly correlated with morphological features including the cell density and the nuclear-to-cytoplasmic ratio. A high interpatient variability in scattering coefficients was observed and suggests identifying precancerous lesions based on the relative change in scattering coefficients. PMID:26504630

  14. Comparison between uroflowmetry and sonouroflowmetry in recording of urinary flow in healthy men.

    PubMed

    Krhut, Jan; Gärtner, Marcel; Sýkora, Radek; Hurtík, Petr; Burda, Michal; Luňáček, Libor; Zvarová, Katarína; Zvara, Peter

    2015-08-01

    To evaluate the accuracy of sonouroflowmetry in recording urinary flow parameters and voided volume. A total of 25 healthy male volunteers (age 18-63 years) were included in the study. All participants were asked to carry out uroflowmetry synchronously with recording of the sound generated by the urine stream hitting the water level in the urine collection receptacle, using a dedicated cell phone. From 188 recordings, 34 were excluded because of voided volume <150 mL or technical problems during recording. The sonouroflowmetry recording was visualized in the form of a trace representing sound intensity over time. Subsequently, the matching datasets of uroflowmetry and sonouroflowmetry were compared with respect to flow time, voided volume, maximum flow rate and average flow rate. Pearson's correlation coefficient was used to compare parameters recorded by uroflowmetry with those calculated based on sonouroflowmetry recordings. The flow pattern recorded by sonouroflowmetry showed a good correlation with the uroflowmetry trace. A strong correlation (Pearson's correlation coefficient 0.87) was documented between uroflowmetry-recorded flow time and duration of the sound signal recorded with sonouroflowmetry. A moderate correlation was observed in voided volume (Pearson's correlation coefficient 0.68) and average flow rate (Pearson's correlation coefficient 0.57). A weak correlation (Pearson's correlation coefficient 0.38) between maximum flow rate recorded using uroflowmetry and sonouroflowmetry-recorded peak sound intensity was documented. The present study shows that the basic concept utilizing sound analysis for estimation of urinary flow parameters and voided volume is valid. However, further development of this technology and standardization of the recording algorithm are required. © 2015 The Japanese Urological Association.

  15. Combinatorial Algorithms for Portfolio Optimization Problems - Case of Risk Moderate Investor

    NASA Astrophysics Data System (ADS)

    Juarna, A.

    2017-03-01

    The portfolio optimization problem is the problem of finding an optimal combination of n stocks from N ≥ n available stocks that gives maximal aggregate return and minimal aggregate risk. In this paper, N = 43 stocks were taken from the IDX (Indonesia Stock Exchange) group of the 45 most-traded stocks, known as the LQ45, with p = 24 monthly returns for each stock, spanning the interval 2013-2014. This is a combinatorial problem whose algorithm is constructed based on two considerations: a risk-moderate type of investor and a maximum allowed correlation coefficient between every two eligible stocks. The main output resulting from implementation of the algorithm is a set of curves of three portfolio attributes, namely the size, the ratio of return to risk, and the percentage of negative correlation coefficients among the chosen stocks, as functions of the maximum allowed correlation coefficient between each two stocks. The output curves show that the portfolio contains three stocks with a ratio of return to risk of 14.57 if the maximum allowed correlation coefficient between every two eligible stocks is negative, and contains 19 stocks with a maximum allowed correlation coefficient of 0.17 to get the maximum ratio of return to risk of 25.48.
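
    A simplified sketch of the combinatorial idea described above: stocks are screened greedily so that every pair in the portfolio keeps its correlation coefficient below a chosen maximum, and the return-to-risk ratio of the resulting equal-weight portfolio is reported. The synthetic return matrix, the greedy ordering and the threshold value are illustrative choices, not the LQ45 data or the authors' exact algorithm.

```python
import numpy as np

def select_stocks(returns, max_corr):
    """Greedy screen: keep a stock only if it stays below max_corr with all kept ones."""
    corr = np.corrcoef(returns, rowvar=False)
    order = np.argsort(-returns.mean(axis=0) / returns.std(axis=0))  # best ratio first
    kept = []
    for i in order:
        if all(corr[i, j] <= max_corr for j in kept):
            kept.append(i)
    return kept

def portfolio_return_to_risk(returns, kept):
    """Return/risk ratio of the equal-weight portfolio of the selected stocks."""
    port = returns[:, kept].mean(axis=1)
    return port.mean() / port.std()

rng = np.random.default_rng(3)
monthly_returns = rng.normal(0.01, 0.05, size=(24, 43))   # 24 months x 43 stocks, synthetic
chosen = select_stocks(monthly_returns, max_corr=0.17)
print(len(chosen), round(portfolio_return_to_risk(monthly_returns, chosen), 2))
```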

  16. An Empirical Investigation of the Effects of Nonnormality upon the Sampling Distribution of the Product Moment Correlation Coefficient.

    ERIC Educational Resources Information Center

    Hjelm, Howard; Norris, Raymond C.

    The study empirically determined the effects of nonnormality upon some sampling distributions of the product moment correlation coefficient (PMCC). Sampling distributions of the PMCC were obtained by drawing numerous samples from control and experimental populations having various degrees of nonnormality and by calculating correlation coefficients…

  17. Gender and Age Analyses of NIRS/STAI Pearson Correlation Coefficients at Resting State.

    PubMed

    Matsumoto, T; Fuchita, Y; Ichikawa, K; Fukuda, Y; Takemura, N; Sakatani, K

    2016-01-01

    According to the valence asymmetry hypothesis, the left/right asymmetry of PFC activity is correlated with specific emotional responses to mental stress and personality traits. In a previous study we measured spontaneous oscillation of oxy-Hb concentrations in the bilateral PFC at rest in normal adults employing two-channel portable NIRS and computed the laterality index at rest (LIR). We investigated the Pearson correlation coefficient between the LIR and anxiety levels evaluated by the State-Trait Anxiety Inventory (STAI) test. We found that subjects with right-dominant activity at rest showed higher STAI scores, while those with left dominant oxy-Hb changes at rest showed lower STAI scores such that the Pearson correlation coefficient between LIR and STAI was positive. This study performed Bootstrap analysis on the data and showed the following statistics of the target correlation coefficient: mean=0.4925 and lower confidence limit=0.177 with confidence level 0.05. Using the KS-test, we demonstrated that the correlation did not depend on age, whereas it did depend on gender.
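
    A small sketch of the kind of bootstrap used above to attach a mean and a lower confidence limit to a Pearson correlation coefficient; the paired values standing in for LIR and STAI scores are synthetic.

```python
import numpy as np

def bootstrap_pearson(x, y, n_boot=10000, alpha=0.05, seed=0):
    """Bootstrap mean and one-sided lower confidence limit of the Pearson correlation."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    rs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)                 # resample pairs with replacement
        rs[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    return rs.mean(), np.quantile(rs, alpha)

# synthetic positively correlated data standing in for LIR and STAI scores
rng = np.random.default_rng(4)
lir = rng.normal(size=40)
stai = 0.5 * lir + rng.normal(scale=0.9, size=40)
print(bootstrap_pearson(lir, stai))
```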

  18. Nonlinear Dynamics in Gene Regulation Promote Robustness and Evolvability of Gene Expression Levels.

    PubMed

    Steinacher, Arno; Bates, Declan G; Akman, Ozgur E; Soyer, Orkun S

    2016-01-01

    Cellular phenotypes underpinned by regulatory networks need to respond to evolutionary pressures to allow adaptation, but at the same time be robust to perturbations. This creates a conflict in which mutations affecting regulatory networks must both generate variance and be tolerated at the phenotype level. Here, we perform mathematical analyses and simulations of regulatory networks to better understand the potential trade-off between robustness and evolvability. Examining the phenotypic effects of mutations, we find an inverse correlation between robustness and evolvability that breaks only with nonlinearity in the network dynamics, through the creation of regions presenting sudden changes in phenotype with small changes in genotype. For genotypes embedding low levels of nonlinearity, robustness and evolvability correlate negatively and almost perfectly. By contrast, genotypes embedding nonlinear dynamics allow expression levels to be robust to small perturbations, while generating high diversity (evolvability) under larger perturbations. Thus, nonlinearity breaks the robustness-evolvability trade-off in gene expression levels by allowing disparate responses to different mutations. Using analytical derivations of robustness and system sensitivity, we show that these findings extend to a large class of gene regulatory network architectures and also hold for experimentally observed parameter regimes. Further, the effect of nonlinearity on the robustness-evolvability trade-off is ensured as long as key parameters of the system display specific relations irrespective of their absolute values. We find that within this parameter regime genotypes display low and noisy expression levels. Our results provide a possible solution to the robustness-evolvability trade-off, suggest an explanation for the ubiquity of nonlinear dynamics in gene expression networks, and generate useful guidelines for the design of synthetic gene circuits.

  19. Nuclear anxiety: a test-construction study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braunstein, A.L.

    1986-01-01

    The Nuclear Anxiety Scale was administered to 263 undergraduate and graduate students (on eight occasions in December, 1985 and January, 1986). (1) The obtained alpha coefficient was .91. This was significant at the .01 level, and demonstrated that the scale was internally homogeneous and consistent. (2) Item discrimination indices (point biserial correlation coefficients) computed for the thirty (30) items yielded a range of .25 to .64. All coefficients were significant at the .01 level, and all 30 items were retained as demonstrating significant discriminability. (3) The correlation between two administrations of the scale (with a 48-hour interval) was .83. This was significant at the .01 level, and demonstrated test-retest reliability and stability over time. (4) The point-biserial correlation coefficient between scores on the Nuclear Anxiety Scale, and the students' self-report of nuclear anxiety as being either a high or low ranked stressor, was .59. This was significant at the .01 level, and demonstrated concurrent validity. (5) The correlation coefficient between scores on the Nuclear Anxiety Scale and the Spielberger State-Trait Anxiety Inventory, A-Trait, (1970), was .41. This was significant at the .01 level, and demonstrated convergent validity. (6) The correlation coefficient between positively stated and negatively stated items (with scoring reversed) was .76. This was significant at the .01 level, and demonstrated freedom from response set bias.

  20. Stabilized determination of geopotential coefficients by the mixed hom-BLUP approach

    NASA Technical Reports Server (NTRS)

    Middel, B.; Schaffrin, B.

    1989-01-01

    For the determination of geopotential coefficients, data can be used from rather different sources, e.g., satellite tracking, gravimetry, or altimetry. As each data type is particularly sensitive to certain wavelengths of the spherical harmonic coefficients, it is of essential importance how they are treated in a combination solution. For example, the longer wavelengths are well described by the coefficients of a model derived by satellite tracking, while other observation types such as gravity anomalies, delta g, and geoid heights, N, from altimetry contain only poor information for these long wavelengths. Therefore, the lower coefficients of the satellite model should be treated as being superior in the combination. For the combination, a new method is presented which turns out to be highly suitable for this purpose due to its great flexibility combined with robustness.

  1. Channel correlation and BER performance analysis of coherent optical communication systems with receive diversity over moderate-to-strong non-Kolmogorov turbulence.

    PubMed

    Fu, Yulong; Ma, Jing; Tan, Liying; Yu, Siyuan; Lu, Gaoyuan

    2018-04-10

    In this paper, new expressions of the channel-correlation coefficient and its components (the large- and small-scale channel-correlation coefficients) for a plane wave are derived for a horizontal link in moderate-to-strong non-Kolmogorov turbulence using a generalized effective atmospheric spectrum which includes finite-turbulence inner and outer scales and high-wave-number "bump". The closed-form expression of the average bit error rate (BER) of the coherent free-space optical communication system is derived using the derived channel-correlation coefficients and an α-μ distribution to approximate the sum of the square root of arbitrarily correlated Gamma-Gamma random variables. Analytical results are provided to investigate the channel correlation and evaluate the average BER performance. The validity of the proposed approximation is illustrated by Monte Carlo simulations. This work will help with further investigation of the fading correlation in spatial diversity systems.

  2. Technical characterization of dialysis fluid flow and mass transfer rate in dialyzers with various filtration coefficients using dimensionless correlation equation.

    PubMed

    Fukuda, Makoto; Yoshimura, Kengo; Namekawa, Koki; Sakai, Kiyotaka

    2017-06-01

    The objective of the present study is to evaluate the effect of filtration coefficient and internal filtration on dialysis fluid flow and mass transfer coefficient in dialyzers using dimensionless mass transfer correlation equations. Aqueous-solution vitamin B12 clearances were obtained for the REXEED-15L as a low-flux dialyzer, and the APS-15EA and APS-15UA as high-flux dialyzers. All the other design specifications were identical for these dialyzers except for the filtration coefficient. The overall mass transfer coefficient was calculated; moreover, the exponents of the Reynolds number (Re) and the film mass transfer coefficient of the dialysis-side fluid (kD) for each flow rate were derived from the Wilson plot and the dimensionless correlation equation. The exponents of Re were 0.4 for the low-flux dialyzer and 0.5 for the high-flux dialyzers. Dialysis fluid flow in the low-flux dialyzer was close to laminar flow because of its low filtration coefficient. On the other hand, dialysis fluid flow in the high-flux dialyzers was assumed to be orthogonal flow. A higher filtration coefficient was associated with a higher kD, influenced by the mass transfer rate through diffusion and internal filtration. A higher filtration coefficient of dialyzers and internal filtration affect the orthogonal flow of dialysis fluid.

  3. An Energy-Based Similarity Measure for Time Series

    NASA Astrophysics Data System (ADS)

    Boudraa, Abdel-Ouahab; Cexus, Jean-Christophe; Groussat, Mathieu; Brunagel, Pierre

    2007-12-01

    A new similarity measure for time series analysis, called SimilB, based on a nonlinear cross-energy operator (2004), is introduced. The operator quantifies the interaction between two time series. Compared to Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series using the first and second derivatives of the time series. SimilB is well suited for both nonstationary and stationary time series and particularly those presenting discontinuities. Some new properties of the cross-energy operator are presented. In particular, we show that the operator, used as a similarity measure, is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset and compared to the CC and the ED measures.

  4. A new security solution to JPEG using hyper-chaotic system and modified zigzag scan coding

    NASA Astrophysics Data System (ADS)

    Ji, Xiao-yong; Bai, Sen; Guo, Yu; Guo, Hui

    2015-05-01

    Though JPEG is an excellent image compression standard, it does not provide any security. Thus, a security solution for JPEG was proposed in Zhang et al. (2014). However, there are some flaws in Zhang's scheme, and in this paper we propose a new scheme based on a discrete hyper-chaotic system and modified zigzag scan coding. By shuffling the identifiers of the zigzag-scan-encoded sequence with a hyper-chaotic sequence and precisely encrypting, in the zigzag-scan-encoded domain, certain coefficients that have little relationship with the correlation of the plain image, we achieve high compression performance and robust security simultaneously. Meanwhile, we present and analyze the flaws in Zhang's scheme through theoretical analysis and experimental verification, and give comparisons between our scheme and Zhang's. Simulation results verify that our method has better performance in security and efficiency.

  5. A Solution-Doped Polymer Semiconductor:Insulator Blend for Thermoelectrics.

    PubMed

    Kiefer, David; Yu, Liyang; Fransson, Erik; Gómez, Andrés; Primetzhofer, Daniel; Amassian, Aram; Campoy-Quiles, Mariano; Müller, Christian

    2017-01-01

    Poly(ethylene oxide) is demonstrated to be a suitable matrix polymer for the solution-doped conjugated polymer poly(3-hexylthiophene). The polarity of the insulator combined with carefully chosen processing conditions permits the fabrication of tens of micrometer-thick films that feature a fine distribution of the F4TCNQ dopant:semiconductor complex. Changes in electrical conductivity from 0.1 to 0.3 S cm⁻¹ and Seebeck coefficient from 100 to 60 μV K⁻¹ upon addition of the insulator correlate with an increase in doping efficiency from 20% to 40% for heavily doped ternary blends. An invariant bulk thermal conductivity of about 0.3 W m⁻¹ K⁻¹ gives rise to a thermoelectric Figure of merit ZT ∼ 10⁻⁴ that remains unaltered for an insulator content of more than 60 wt%. Free-standing, mechanically robust tapes illustrate the versatility of the developed dopant:semiconductor:insulator ternary blends.
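
    Plugging the reported numbers into the standard figure-of-merit expression ZT = S²σT/κ, evaluated at an assumed T ≈ 300 K (the abstract does not state the temperature), reproduces the quoted order of magnitude:

```python
def zt(seebeck_v_per_k, conductivity_s_per_m, thermal_cond_w_per_mk, temperature_k=300.0):
    """Thermoelectric figure of merit ZT = S^2 * sigma * T / kappa."""
    return seebeck_v_per_k ** 2 * conductivity_s_per_m * temperature_k / thermal_cond_w_per_mk

# values from the abstract: S = 60-100 uV/K, sigma = 0.1-0.3 S/cm, kappa = 0.3 W/m/K
print(zt(100e-6, 0.1 * 100, 0.3))   # ~1.0e-4 at the lightly doped end
print(zt(60e-6, 0.3 * 100, 0.3))    # ~1.1e-4 at the heavily doped end
```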

  6. Validated spectrofluorometric method for determination of gemfibrozil in self nanoemulsifying drug delivery systems (SNEDDS)

    NASA Astrophysics Data System (ADS)

    Sierra Villar, Ana M.; Calpena Campmany, Ana C.; Bellowa, Lyda Halbaut; Trenchs, Monserrat Aróztegui; Naveros, Beatriz Clares

    2013-09-01

    A spectrofluorometric method has been developed and validated for the determination of gemfibrozil. The method is based on the excitation and emission capacities of gemfibrozil, with excitation and emission wavelengths of 276 and 304 nm respectively. This method allows the determination of the drug in a self-nanoemulsifying drug delivery system (SNEDDS) designed to improve its intestinal absorption. Results obtained showed linear relationships with good correlation coefficients (r² > 0.999) and low limits of detection and quantification (LOD of 0.075 μg mL⁻¹ and LOQ of 0.226 μg mL⁻¹) in the range of 0.2-5 μg mL⁻¹; the method also showed good robustness and stability. Thus the amounts of gemfibrozil released from SNEDDS contained in gastro-resistant hard gelatine capsules were analysed, and release studies could be performed satisfactorily.
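
    A hedged sketch of how a linear calibration with figures of the kind reported above can be evaluated; the intensity values are invented, and the 3.3σ/slope and 10σ/slope criteria are the common ICH-style definitions of LOD and LOQ, which the abstract does not spell out.

```python
import numpy as np

# invented calibration standards: concentration (ug/mL) vs fluorescence intensity (a.u.)
conc = np.array([0.2, 0.5, 1.0, 2.0, 3.0, 5.0])
intensity = np.array([21.0, 52.0, 103.0, 205.0, 310.0, 512.0])

slope, intercept = np.polyfit(conc, intensity, 1)       # least-squares calibration line
pred = slope * conc + intercept
r2 = 1 - np.sum((intensity - pred) ** 2) / np.sum((intensity - intensity.mean()) ** 2)
sigma = np.std(intensity - pred, ddof=2)                # residual standard deviation

lod = 3.3 * sigma / slope                               # limit of detection
loq = 10.0 * sigma / slope                              # limit of quantification
print(round(r2, 4), round(lod, 3), round(loq, 3))
```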

  7. Validated spectrofluorometric method for determination of gemfibrozil in self nanoemulsifying drug delivery systems (SNEDDS).

    PubMed

    Sierra Villar, Ana M; Calpena Campmany, Ana C; Bellowa, Lyda Halbaut; Trenchs, Monserrat Aróztegui; Naveros, Beatriz Clares

    2013-09-01

    A spectrofluorometric method has been developed and validated for the determination of gemfibrozil. The method is based on the excitation and emission capacities of gemfibrozil, with excitation and emission wavelengths of 276 and 304 nm respectively. This method allows the determination of the drug in a self-nanoemulsifying drug delivery system (SNEDDS) designed to improve its intestinal absorption. Results obtained showed linear relationships with good correlation coefficients (r² > 0.999) and low limits of detection and quantification (LOD of 0.075 μg mL⁻¹ and LOQ of 0.226 μg mL⁻¹) in the range of 0.2-5 μg mL⁻¹; the method also showed good robustness and stability. Thus the amounts of gemfibrozil released from SNEDDS contained in gastro-resistant hard gelatine capsules were analysed, and release studies could be performed satisfactorily. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. A Solution‐Doped Polymer Semiconductor:Insulator Blend for Thermoelectrics

    PubMed Central

    Kiefer, David; Yu, Liyang; Fransson, Erik; Gómez, Andrés; Primetzhofer, Daniel; Amassian, Aram; Campoy‐Quiles, Mariano

    2016-01-01

    Poly(ethylene oxide) is demonstrated to be a suitable matrix polymer for the solution‐doped conjugated polymer poly(3‐hexylthiophene). The polarity of the insulator combined with carefully chosen processing conditions permits the fabrication of tens of micrometer‐thick films that feature a fine distribution of the F4TCNQ dopant:semiconductor complex. Changes in electrical conductivity from 0.1 to 0.3 S cm−1 and Seebeck coefficient from 100 to 60 μV K−1 upon addition of the insulator correlate with an increase in doping efficiency from 20% to 40% for heavily doped ternary blends. An invariant bulk thermal conductivity of about 0.3 W m−1 K−1 gives rise to a thermoelectric Figure of merit ZT ∼ 10−4 that remains unaltered for an insulator content of more than 60 wt%. Free‐standing, mechanically robust tapes illustrate the versatility of the developed dopant:semiconductor:insulator ternary blends. PMID:28105396

  9. The Zernike expansion--an example of a merit function for 2D/3D registration based on orthogonal functions.

    PubMed

    Dong, Shuo; Kettenbach, Joachim; Hinterleitner, Isabella; Bergmann, Helmar; Birkfellner, Wolfgang

    2008-01-01

    Current merit functions for 2D/3D registration usually rely on comparing pixels or small regions of images using some sort of statistical measure. A problem connected to this paradigm is the sometimes problematic behaviour of the method if noise or artefacts (for instance a guide wire) are present in the projective image. We present a merit function for 2D/3D registration which utilizes the decomposition of the X-ray and the DRR under comparison into orthogonal Zernike moments; the quality of the match is assessed by an iterative comparison of expansion coefficients. Results of an imaging study on a physical phantom show that, compared to standard cross-correlation, the Zernike moment based merit function shows better robustness if the histogram content in the images under comparison is different, and that time expenses are comparable if the merit function is constructed out of only a few significant moments.

  10. Detrended fluctuation analysis made flexible to detect range of cross-correlated fluctuations

    NASA Astrophysics Data System (ADS)

    Kwapień, Jarosław; Oświecimka, Paweł; Drożdż, Stanisław

    2015-11-01

    The detrended cross-correlation coefficient ρDCCA has recently been proposed to quantify the strength of cross-correlations on different temporal scales in bivariate, nonstationary time series. It is based on the detrended cross-correlation and detrended fluctuation analyses (DCCA and DFA, respectively) and can be viewed as an analog of the Pearson coefficient in the case of the fluctuation analysis. The coefficient ρDCCA works well in many practical situations but by construction its applicability is limited to detection of whether two signals are generally cross-correlated, without the possibility to obtain information on the amplitude of fluctuations that are responsible for those cross-correlations. In order to introduce some related flexibility, here we propose an extension of ρDCCA that exploits the multifractal versions of DFA and DCCA: multifractal detrended fluctuation analysis and multifractal detrended cross-correlation analysis, respectively. The resulting new coefficient ρq not only is able to quantify the strength of correlations but also allows one to identify the range of detrended fluctuation amplitudes that are correlated in two signals under study. We show how the coefficient ρq works in practical situations by applying it to stochastic time series representing processes with long memory: autoregressive and multiplicative ones. Such processes are often used to model signals recorded from complex systems and complex physical phenomena like turbulence, so we are convinced that this new measure can successfully be applied in time-series analysis. In particular, we present an example of such application to highly complex empirical data from financial markets. The present formulation can straightforwardly be extended to multivariate data in terms of the q -dependent counterpart of the correlation matrices and then to the network representation.
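
    A compact sketch of the detrended cross-correlation coefficient underlying the extension described above: profiles of both series are detrended window by window, and the detrended covariance is normalized by the two detrended variances, ρ_DCCA = F²_DCCA / (F_DFA,x · F_DFA,y). The linear detrending order, the toy signals and the chosen scales are illustrative; the q-dependent generalization of the paper is not reproduced here.

```python
import numpy as np

def rho_dcca(x, y, scale):
    """Detrended cross-correlation coefficient rho_DCCA at a single temporal scale."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())   # integrated profiles
    n_win = len(X) // scale
    t = np.arange(scale)
    cov = var_x = var_y = 0.0
    for w in range(n_win):
        sx, sy = X[w * scale:(w + 1) * scale], Y[w * scale:(w + 1) * scale]
        rx = sx - np.polyval(np.polyfit(t, sx, 1), t)          # detrend each window
        ry = sy - np.polyval(np.polyfit(t, sy, 1), t)
        cov += np.mean(rx * ry)
        var_x += np.mean(rx ** 2)
        var_y += np.mean(ry ** 2)
    return cov / np.sqrt(var_x * var_y)

# two noisy signals sharing a common random-walk component (toy data)
rng = np.random.default_rng(5)
common = np.cumsum(rng.normal(size=4096))
x = common + rng.normal(scale=5.0, size=4096)
y = common + rng.normal(scale=5.0, size=4096)
print([round(rho_dcca(x, y, s), 2) for s in (16, 64, 256)])   # grows with scale here
```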

  11. Development of a fuzzy-stochastic programming with Green Z-score criterion method for planning water resources systems with a trading mechanism.

    PubMed

    Zeng, X T; Huang, G H; Li, Y P; Zhang, J L; Cai, Y P; Liu, Z P; Liu, L R

    2016-12-01

    This study developed a fuzzy-stochastic programming with Green Z-score criterion (FSGZ) method for water resources allocation and water quality management with a trading-mechanism (WAQT) under uncertainties. FSGZ can handle uncertainties expressed as probability distributions, and it can also quantify objective/subjective fuzziness in the decision-making process. Risk-averse attitudes and robustness coefficient are joined to express the relationship between the expected target and outcome under various risk preferences of decision makers and systemic robustness. The developed method is applied to a real-world case of WAQT in the Kaidu-Kongque River Basin in northwest China, where an effective mechanism (e.g., market trading) to simultaneously confront severely diminished water availability and degraded water quality is required. Results of water transaction amounts, water allocation patterns, pollution mitigation schemes, and system benefits under various scenarios are analyzed, which indicate that a trading-mechanism is a more sustainable method to manage water-environment crisis in the study region. Additionally, consideration of anthropogenic (e.g., a risk-averse attitude) and systemic factors (e.g., the robustness coefficient) can support the generation of a robust plan associated with risk control for WAQT when uncertainty is present. These findings assist local policy and decision makers to gain insights into water-environment capacity planning to balance the basin's social and economic growth with protecting the region's ecosystems.

  12. Relations between Brain Structure and Attentional Function in Spina Bifida: Utilization of Robust Statistical Approaches

    PubMed Central

    Kulesz, Paulina A.; Tian, Siva; Juranek, Jenifer; Fletcher, Jack M.; Francis, David J.

    2015-01-01

    Objective: Weak structure-function relations for brain and behavior may stem from problems in estimating these relations in small clinical samples with frequently occurring outliers. In the current project, we focused on the utility of using alternative statistics to estimate these relations. Method: Fifty-four children with spina bifida meningomyelocele performed attention tasks and received MRI of the brain. Using a bootstrap sampling process, the Pearson product moment correlation was compared with four robust correlations: the percentage bend correlation, the Winsorized correlation, the skipped correlation using the Donoho-Gasko median, and the skipped correlation using the minimum volume ellipsoid estimator. Results: All methods yielded similar estimates of the relations between measures of brain volume and attention performance. The similarity of estimates across correlation methods suggested that the weak structure-function relations previously found in many studies are not readily attributable to the presence of outlying observations and other factors that violate the assumptions behind the Pearson correlation. Conclusions: Given the difficulty of assembling large samples for brain-behavior studies, estimating correlations using multiple, robust methods may enhance the statistical conclusion validity of studies yielding small, but often clinically significant, correlations. PMID:25495830

  13. Relations between volumetric measures of brain structure and attentional function in spina bifida: utilization of robust statistical approaches.

    PubMed

    Kulesz, Paulina A; Tian, Siva; Juranek, Jenifer; Fletcher, Jack M; Francis, David J

    2015-03-01

    Weak structure-function relations for brain and behavior may stem from problems in estimating these relations in small clinical samples with frequently occurring outliers. In the current project, we focused on the utility of using alternative statistics to estimate these relations. Fifty-four children with spina bifida meningomyelocele performed attention tasks and received MRI of the brain. Using a bootstrap sampling process, the Pearson product-moment correlation was compared with 4 robust correlations: the percentage bend correlation, the Winsorized correlation, the skipped correlation using the Donoho-Gasko median, and the skipped correlation using the minimum volume ellipsoid estimator. All methods yielded similar estimates of the relations between measures of brain volume and attention performance. The similarity of estimates across correlation methods suggested that the weak structure-function relations previously found in many studies are not readily attributable to the presence of outlying observations and other factors that violate the assumptions behind the Pearson correlation. Given the difficulty of assembling large samples for brain-behavior studies, estimating correlations using multiple, robust methods may enhance the statistical conclusion validity of studies yielding small, but often clinically significant, correlations. PsycINFO Database Record (c) 2015 APA, all rights reserved.
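
    One of the robust alternatives compared above, the Winsorized correlation, can be sketched in a few lines: each variable is Winsorized (values beyond chosen quantiles are pulled in to those quantiles) and the ordinary Pearson correlation of the Winsorized data is taken. The quantile-based Winsorization, the 20% level and the toy data are simplifying illustrative choices.

```python
import numpy as np

def winsorize(x, prop=0.2):
    """Replace values below/above the prop and 1-prop quantiles by those quantiles."""
    lo, hi = np.quantile(x, [prop, 1.0 - prop])
    return np.clip(x, lo, hi)

def winsorized_correlation(x, y, prop=0.2):
    """Pearson correlation of the marginally Winsorized variables."""
    return np.corrcoef(winsorize(np.asarray(x, dtype=float), prop),
                       winsorize(np.asarray(y, dtype=float), prop))[0, 1]

# toy data: a linear relation plus one gross outlier pair
rng = np.random.default_rng(6)
x = rng.normal(size=50)
y = 0.6 * x + rng.normal(scale=0.5, size=50)
x[0], y[0] = 6.0, -6.0                       # outlier that drags the Pearson estimate down
print(round(np.corrcoef(x, y)[0, 1], 2), round(winsorized_correlation(x, y), 2))
```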

  14. Influences of sampling size and pattern on the uncertainty of correlation estimation between soil water content and its influencing factors

    NASA Astrophysics Data System (ADS)

    Lai, Xiaoming; Zhu, Qing; Zhou, Zhiwen; Liao, Kaihua

    2017-12-01

    In this study, seven random combination sampling strategies were applied to investigate the uncertainties in estimating the hillslope mean soil water content (SWC) and correlation coefficients between the SWC and soil/terrain properties on a tea + bamboo hillslope. One of the sampling strategies is global random sampling and the other six are stratified random sampling on the top, middle, toe, top + mid, top + toe and mid + toe slope positions. When each sampling strategy was applied, sample sizes were gradually reduced and each sampling size contained 3000 replicates. Under each sampling size of each sampling strategy, the relative errors (REs) and coefficients of variation (CVs) of the estimated hillslope mean SWC and correlation coefficients between the SWC and soil/terrain properties were calculated to quantify the accuracy and uncertainty. The results showed that the uncertainty of the estimations decreased as the sampling size increased. However, larger sample sizes were required to reduce the uncertainty in correlation coefficient estimation than in hillslope mean SWC estimation. Under global random sampling, 12 randomly sampled sites on this hillslope were adequate to estimate the hillslope mean SWC with RE and CV ≤10%. However, at least 72 randomly sampled sites were needed to ensure that the estimated correlation coefficients had REs and CVs ≤10%. Compared with the other sampling strategies, reducing sampling sites on the middle slope had the least influence on the estimation of hillslope mean SWC and correlation coefficients. Under this strategy, 60 sites (10 on the middle slope and 50 on the top and toe slopes) were enough to ensure that the estimated correlation coefficients had REs and CVs ≤10%. This suggested that when designing the SWC sampling, the proportion of sites on the middle slope can be reduced to 16.7% of the total number of sites. Findings of this study will be useful for optimal SWC sampling design.
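
    A compact sketch of the resampling experiment described above: for a given sample size, many random subsets of sites are drawn and the relative error (RE) and coefficient of variation (CV) of the estimated hillslope mean are computed against the full-census value. The synthetic SWC values follow the abstract only loosely, and the 3000-replicate convention is kept as a default.

```python
import numpy as np

def subsample_uncertainty(values, size, n_rep=3000, seed=0):
    """Mean absolute relative error and CV of the mean estimated from random subsets."""
    rng = np.random.default_rng(seed)
    truth = values.mean()
    means = np.array([values[rng.choice(len(values), size, replace=False)].mean()
                      for _ in range(n_rep)])
    re = np.mean(np.abs(means - truth) / abs(truth))
    cv = means.std() / abs(means.mean())
    return re, cv

swc = np.random.default_rng(7).uniform(0.15, 0.45, size=100)   # synthetic site-level SWC
for n in (12, 30, 60):
    re, cv = subsample_uncertainty(swc, n)
    print(n, round(re, 3), round(cv, 3))                       # uncertainty shrinks with n
```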

  15. Image segmentation-based robust feature extraction for color image watermarking

    NASA Astrophysics Data System (ADS)

    Li, Mianjie; Deng, Zeyu; Yuan, Xiaochen

    2018-04-01

    This paper proposes a local digital image watermarking method based on robust feature extraction. Segmentation is performed with Simple Linear Iterative Clustering (SLIC), on top of which an Image Segmentation-based Robust Feature Extraction (ISRFE) method is proposed. The method adaptively extracts feature regions from the segments produced by SLIC, selecting the most robust feature region in each segmented image. Each feature region is decomposed into a low-frequency and a high-frequency domain by the Discrete Cosine Transform (DCT), and watermark images are embedded into the coefficients in the low-frequency domain. The Distortion-Compensated Dither Modulation (DC-DM) algorithm is chosen as the quantization method for embedding. The experimental results indicate that the method performs well under various attacks and achieves a trade-off between high robustness and good image quality.
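
    A minimal sketch of the quantization step is given below: one watermark bit is embedded into a low-frequency coefficient of a block's 2-D DCT by plain dither modulation (the simplest form of DC-DM, shown here without the distortion-compensation term). The block size, quantization step and coefficient index are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, step=12.0, coeff=(1, 1)):
    """Embed one bit into a low-frequency DCT coefficient by dither modulation."""
    c = dctn(block.astype(float), norm="ortho")
    dither = 0.0 if bit == 0 else step / 2.0           # per-bit dither offset
    c[coeff] = np.round((c[coeff] - dither) / step) * step + dither
    return idctn(c, norm="ortho")

def extract_bit(block, step=12.0, coeff=(1, 1)):
    """Decode the bit by finding the nearer of the two quantizer lattices."""
    c = dctn(block.astype(float), norm="ortho")
    err0 = abs(c[coeff] - np.round(c[coeff] / step) * step)
    err1 = abs(c[coeff] - step / 2.0
               - np.round((c[coeff] - step / 2.0) / step) * step)
    return 0 if err0 <= err1 else 1

rng = np.random.default_rng(1)
block = rng.integers(0, 256, (8, 8)).astype(float)      # stand-in for a feature region
marked = embed_bit(block, bit=1)
print("recovered bit:", extract_bit(marked))
```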

  16. Robustness of Type I Error and Power in Set Correlation Analysis of Contingency Tables.

    ERIC Educational Resources Information Center

    Cohen, Jacob; Nee, John C. M.

    1990-01-01

    The analysis of contingency tables via set correlation allows the assessment of subhypotheses involving contrast functions of the categories of the nominal scales. The robustness of such methods with regard to Type I error and statistical power was studied via a Monte Carlo experiment. (TJH)

  17. NT-pro-BNP levels in patients with acute pulmonary embolism are correlated to right but not left ventricular volume and function.

    PubMed

    Pasha, Sharif M; Klok, Frederikus A; van der Bijl, Noortje; de Roos, Albert; Kroft, Lucia J M; Huisman, Menno V

    2012-08-01

    N-terminal pro-Brain Natriuretic Peptide (NT-pro-BNP) is primarily secreted in response to left ventricular (LV) stretch and wall tension. Notably, NT-pro-BNP is a prognostic marker in acute pulmonary embolism (PE), which primarily stresses the right ventricle (RV). We sought to evaluate the relative contribution of the RV to NT-pro-BNP levels during PE. A post-hoc analysis of an observational prospective outcome study in 113 consecutive patients with computed tomography (CT)-proven PE and 226 patients in whom PE was clinically suspected but ruled out by CT. In all patients RV and LV function was established by assessing ECG-triggered-CT measured ventricular end-diastolic-volumes and ejection fraction (EF). NT-pro-BNP was assessed in all patients. The correlation between RV and LV end-diastolic-volumes and systolic function was evaluated by multiple linear regression corrected for known confounders. In the PE cohort, decreased RVEF (β-coefficient (95% confidence interval [CI]) -0.044 (± -0.011); p<0.001) and higher RV end-diastolic-volume (β-coefficient 0.005 (± 0.001); p<0.001) were significantly correlated to NT-pro-BNP, while no correlation was found with LVEF (β-coefficient 0.005 (± 0.010); p=0.587) and LV end-diastolic-volume (β-coefficient -0.003 (± 0.002); p=0.074). In control patients without PE we found a strong correlation between NT-pro-BNP levels and LVEF (β-coefficient -0.027 (± -0.006); p<0.001) although not LV end-diastolic-volume (β-coefficient 0.001 (± 0.001); p=0.418). RVEF (β-coefficient -0.002 (± -0.006); p=0.802) and RV end-diastolic-volume (β-coefficient <0.001 (± 0.001); p=0.730) were not correlated in patients without PE. In PE patients, lower RVEF and higher RV end-diastolic-volume were significantly correlated to NT-pro-BNP levels as compared to control patients without PE. These observations provide pathophysiological ground for the well-known prognostic value of NT-pro-BNP in acute PE.

  18. Quantitative analysis of spatial variability of geotechnical parameters

    NASA Astrophysics Data System (ADS)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic inputs of geotechnical engineering design, and they show strong regional characteristics. Their spatial variability has been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on geostatistical theory, the spatial variability of geotechnical parameters is quantitatively analyzed, and the correlation coefficients between geotechnical parameters are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object. The area contains 68 boreholes and 9 mechanically distinct strata. The parameters are water content, natural unit weight, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and the SP index. According to the principle of statistical correlation, the correlation coefficients of the geotechnical parameters are calculated, and the correlation structure among the parameters is obtained.
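
    Computing the pairwise correlation coefficients between such parameters is a routine step; a minimal sketch with pandas follows. The column names and random values are placeholders for the borehole data, which are not reproduced in the abstract.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 68  # number of boreholes in the surveyed district

# Placeholder borehole measurements (the real survey has 11 parameters).
df = pd.DataFrame({
    "water_content": rng.normal(25, 4, n),
    "void_ratio": rng.normal(0.75, 0.1, n),
    "liquid_limit": rng.normal(35, 5, n),
    "plasticity_index": rng.normal(15, 3, n),
    "cohesion": rng.normal(20, 6, n),
})

# Pairwise Pearson correlation coefficients between geotechnical parameters.
corr = df.corr(method="pearson")
print(corr.round(2))
```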

  19. Microplasma device architectures with various diamond nanostructures

    NASA Astrophysics Data System (ADS)

    Kunuku, Srinivasu; Jothiramalingam Sankaran, Kamatchi; Leou, Keh-Chyang; Lin, I.-Nan

    2017-02-01

    Diamond nanostructures (DNSs) were fabricated from three different morphological diamonds, microcrystalline diamond (MCD), nanocrystalline diamond (NCD), and ultrananocrystalline diamond (UNCD) films, using a reactive ion etching method. The plasma illumination (PI) behavior of microplasma devices using the DNSs and the diamond films as cathode were investigated. The Paschen curve approach revealed that the secondary electron emission coefficient (γ value) of diamond materials is similar irrespective of the microstructure (MCD, NCD, and UNCD) and geometry of the materials (DNSs and diamond films). The diamond materials show markedly larger γ-coefficient than conventional metallic cathode materials such as Mo that resulted in markedly better PI behavior for the corresponding microplasma devices. Moreover, the PI behavior, i.e. the voltage dependence of plasma current density (J_pl-V), plasma density (n_e-V), and the robustness of the devices, varied markedly with the microstructure and geometry of the cathode materials that was closely correlated to the electron field emission (EFE) properties of the cathode materials. The UNCD nanopillars, possessing good EFE properties, resulted in superior PI behavior, whereas the MCD diamond films with insufficient EFE properties led to inferior PI behavior. Consequently, enhancement of plasma characteristics is the collective effects of EFE behavior and secondary electron emission characteristics of diamond-based cathode materials.

  20. Acoustic auditing as a real-time, non-invasive quality control process for both source and assay plates.

    PubMed

    Olechno, Joseph; Ellson, Richard; Browning, Brent; Stearns, Richard; Mutz, Mitchell; Travis, Michael; Qureshi, Shehrzad; Shieh, Jean

    2005-08-01

    Acoustic auditing is a non-destructive, non-invasive technique to monitor the composition and volume of fluids in open or sealed microplates and storage tubes. When acoustic energy encounters an interface between two materials, some of the energy passes through the interface, while the remainder is reflected. Acoustic energy applied to the bottom of a multi-well plate or a storage tube is reflected by the fluid contents of the microplate or tube. The amplitude of these reflections or echoes correlates directly with properties of the fluid, including the speed of sound and the concentration of water in the fluid. Once the speed of sound in the solution is known from the analysis of these echoes, it is easy to determine the depth of liquid and, thereby, the volume by monitoring how long it takes for sound energy to reflect off the fluid meniscus. This technique is rapid (>100,000 samples per day), precise (<1% coefficient of variation for hydration measurements, <4% coefficient of variation for volume measurements), and robust. It does not require uncapping tubes or unsealing or unlidding microplates. The sound energy is extremely gentle and has no deleterious impact upon the fluid or compounds dissolved in it.
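
    The volume determination reduces to a time-of-flight calculation: once the speed of sound in the fluid is known, the echo delay from the meniscus gives the liquid depth, and depth times well cross-section gives volume. A minimal sketch follows; the well geometry and numbers are illustrative assumptions.

```python
def fluid_volume_from_echo(round_trip_time_s, speed_of_sound_m_s, well_area_mm2):
    """Estimate liquid volume in a well from an acoustic echo off the meniscus.

    The echo travels down and back, so depth = c * t / 2.
    """
    depth_mm = speed_of_sound_m_s * round_trip_time_s / 2.0 * 1000.0  # m -> mm
    return depth_mm * well_area_mm2  # volume in mm^3, i.e. microlitres

# Example: 6.7 microsecond round trip in an aqueous solution (~1500 m/s assumed),
# in a well with an assumed ~10 mm^2 cross-section.
vol_ul = fluid_volume_from_echo(6.7e-6, 1500.0, 10.0)
print(f"estimated volume: {vol_ul:.1f} uL")
```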

  1. Determination of solute descriptors by chromatographic methods.

    PubMed

    Poole, Colin F; Atapattu, Sanka N; Poole, Salwa K; Bell, Andrea K

    2009-10-12

    The solvation parameter model is now well established as a useful tool for obtaining quantitative structure-property relationships for chemical, biomedical and environmental processes. The model correlates a free-energy related property of a system to six free-energy derived descriptors describing molecular properties. These molecular descriptors are defined as L (gas-liquid partition coefficient on hexadecane at 298K), V (McGowan's characteristic volume), E (excess molar refraction), S (dipolarity/polarizability), A (hydrogen-bond acidity), and B (hydrogen-bond basicity). McGowan's characteristic volume is trivially calculated from structure and the excess molar refraction can be calculated for liquids from their refractive index and easily estimated for solids. The remaining four descriptors are derived by experiment using (largely) two-phase partitioning, chromatography, and solubility measurements. In this article, the use of gas chromatography, reversed-phase liquid chromatography, micellar electrokinetic chromatography, and two-phase partitioning for determining solute descriptors is described. A large database of experimental retention factors and partition coefficients is constructed after first applying selection tools to remove unreliable experimental values and an optimized collection of varied compounds with descriptor values suitable for calibrating chromatographic systems is presented. These optimized descriptors are demonstrated to be robust and more suitable than other groups of descriptors characterizing the separation properties of chromatographic systems.
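
    As a sketch of how retention or partition data connect to the descriptors, the solvation parameter model in the form log k = c + eE + sS + aA + bB + vV can be fit by ordinary least squares once descriptor values are known for a set of calibration solutes (the reverse fit yields descriptors for a new solute once systems are calibrated). The descriptor values and retention factors below are invented placeholders, not data from the article.

```python
import numpy as np

# Columns: E, S, A, B, V for a handful of calibration solutes (placeholder values).
descriptors = np.array([
    [0.61, 0.52, 0.00, 0.14, 0.72],
    [0.80, 0.60, 0.00, 0.10, 0.92],
    [0.37, 0.65, 0.33, 0.56, 0.92],
    [0.82, 0.89, 0.00, 0.40, 1.07],
    [0.72, 0.65, 0.00, 0.07, 1.02],
    [0.87, 1.00, 0.00, 0.33, 1.06],
    [0.99, 1.11, 0.26, 0.88, 1.06],
])
log_k = np.array([0.35, 0.61, -0.12, 0.44, 0.58, 0.51, 0.05])  # measured log retention factors

# Design matrix with an intercept (the c term of the model).
X = np.hstack([np.ones((len(log_k), 1)), descriptors])
coef, *_ = np.linalg.lstsq(X, log_k, rcond=None)
c, e, s, a, b, v = coef
print(f"c={c:+.3f} e={e:+.3f} s={s:+.3f} a={a:+.3f} b={b:+.3f} v={v:+.3f}")
```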

  2. Some correlations between sugar maple tree characteristics and sap and sugar yields

    Treesearch

    Barton M. Blum

    1971-01-01

    Simple correlation coefficients between various characteristics of sugar maple trees and sap sugar concentration, sap volume yield, and total sugar production are given for the 1968 sap season. Correlation coefficients in general indicated that individual tree characteristics that express tree and crown size are significantly related to sap volume yield and total sugar...

  3. Reducing Bias and Error in the Correlation Coefficient Due to Nonnormality

    ERIC Educational Resources Information Center

    Bishara, Anthony J.; Hittner, James B.

    2015-01-01

    It is more common for educational and psychological data to be nonnormal than to be approximately normal. This tendency may lead to bias and error in point estimates of the Pearson correlation coefficient. In a series of Monte Carlo simulations, the Pearson correlation was examined under conditions of normal and nonnormal data, and it was compared…

  4. The Use of Alternative Regression Methods in Social Sciences and the Comparison of Least Squares and M Estimation Methods in Terms of the Determination of Coefficient

    ERIC Educational Resources Information Center

    Coskuntuncel, Orkun

    2013-01-01

    The purpose of this study is two-fold; the first aim being to show the effect of outliers on the widely used least squares regression estimator in social sciences. The second aim is to compare the classical method of least squares with the robust M-estimator using the "determination of coefficient" (R[superscript 2]). For this purpose,…
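
    A minimal sketch of the comparison the study describes, ordinary least squares versus an M-estimator in the presence of outliers judged by the coefficient of determination, is given below using scikit-learn's HuberRegressor as the M-estimator. The simulated data with a few gross outliers are an illustrative assumption.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 80).reshape(-1, 1)
y = 2.0 * x.ravel() + 1.0 + rng.normal(0, 1, 80)
y[:6] += 40                                   # a few gross outliers

clean = np.arange(80) >= 6                    # judge fit quality on the clean points

ols = LinearRegression().fit(x, y)
huber = HuberRegressor().fit(x, y)            # M-estimation with Huber loss

print("R^2 on clean data, OLS  :", round(r2_score(y[clean], ols.predict(x[clean])), 3))
print("R^2 on clean data, Huber:", round(r2_score(y[clean], huber.predict(x[clean])), 3))
```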

  5. Relationship of body mass index to percent body fat and waist circumference among schoolchildren in Japan - the influence of gender and obesity: a population-based cross-sectional study

    PubMed Central

    2010-01-01

    Background Although the correlation coefficient between body mass index (BMI) and percent body fat (%BF) or waist circumference (WC) has been reported, studies conducted among population-based schoolchildren to date have been limited in Japan, where %BF and WC are not usually measured in annual health examinations at elementary schools or junior high schools. The aim of the present study was to investigate the relationship of BMI to %BF and WC and to examine the influence of gender and obesity on these relationships among Japanese schoolchildren. Methods Subjects included 3,750 schoolchildren from the fourth and seventh grade in Ina-town, Saitama Prefecture, Japan between 2004 and 2008. Information about subject's age, sex, height, weight, %BF, and WC was collected from annual physical examinations. %BF was measured with a bipedal biometrical impedance analysis device. Obesity was defined by the following two criteria: the obese definition of the Centers for Disease Control and Prevention, and the definition of obesity for Japanese children. Pearson's correlation coefficients between BMI and %BF or WC were calculated separately for sex. Results Among fourth graders, the correlation coefficients between BMI and %BF were 0.74 for boys and 0.97 for girls, whereas those between BMI and WC were 0.94 for boys and 0.90 for girls. Similar results were observed in the analysis of seventh graders. The correlation coefficient between BMI and %BF varied by physique (obese or non-obese), with weaker correlations among the obese regardless of the definition of obesity; most correlation coefficients among obese boys were less than 0.5, whereas most correlations among obese girls were more than 0.7. On the other hand, the correlation coefficients between BMI and WC were more than 0.8 among boys and almost all coefficients were more than 0.7 among girls, regardless of physique. Conclusions BMI was positively correlated with %BF and WC among Japanese schoolchildren. The correlations could be influenced by obesity as well as by gender. Accordingly, it is essential to consider gender and obesity when using BMI as a surrogate for %BF and WC for epidemiological use. PMID:20716379

  6. Zero Pearson coefficient for strongly correlated growing trees

    NASA Astrophysics Data System (ADS)

    Dorogovtsev, S. N.; Ferreira, A. L.; Goltsev, A. V.; Mendes, J. F. F.

    2010-03-01

    We obtained Pearson’s coefficient of strongly correlated recursive networks growing by preferential attachment of every new vertex by m edges. We found that the Pearson coefficient is exactly zero in the infinite network limit for the recursive trees (m=1) . If the number of connections of new vertices exceeds one (m>1) , then the Pearson coefficient in the infinite networks equals zero only when the degree distribution exponent γ does not exceed 4. We calculated the Pearson coefficient for finite networks and observed a slow power-law-like approach to an infinite network limit. Our findings indicate that Pearson’s coefficient strongly depends on size and details of networks, which makes this characteristic virtually useless for quantitative comparison of different networks.
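
    The quantity discussed here, the degree-degree Pearson coefficient of a growing network, can be checked numerically; a small sketch using networkx preferential-attachment graphs follows. The sizes and number of repetitions are illustrative assumptions, and finite-size values only slowly approach the infinite-network limit described in the abstract.

```python
import networkx as nx
import numpy as np

def mean_assortativity(n, m, reps=10, seed=0):
    """Average degree-degree Pearson coefficient over several grown networks."""
    vals = []
    for i in range(reps):
        g = nx.barabasi_albert_graph(n, m, seed=seed + i)   # preferential attachment
        vals.append(nx.degree_pearson_correlation_coefficient(g))
    return float(np.mean(vals))

for n in (1_000, 10_000):
    print(f"n={n:6d}  m=1 (tree): {mean_assortativity(n, 1):+.3f}   "
          f"m=2: {mean_assortativity(n, 2):+.3f}")
```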

  7. Zero Pearson coefficient for strongly correlated growing trees.

    PubMed

    Dorogovtsev, S N; Ferreira, A L; Goltsev, A V; Mendes, J F F

    2010-03-01

    We obtained Pearson's coefficient of strongly correlated recursive networks growing by preferential attachment of every new vertex by m edges. We found that the Pearson coefficient is exactly zero in the infinite network limit for the recursive trees (m=1). If the number of connections of new vertices exceeds one (m>1), then the Pearson coefficient in the infinite networks equals zero only when the degree distribution exponent gamma does not exceed 4. We calculated the Pearson coefficient for finite networks and observed a slow power-law-like approach to an infinite network limit. Our findings indicate that Pearson's coefficient strongly depends on size and details of networks, which makes this characteristic virtually useless for quantitative comparison of different networks.

  8. Determining Sample Size for Accurate Estimation of the Squared Multiple Correlation Coefficient.

    ERIC Educational Resources Information Center

    Algina, James; Olejnik, Stephen

    2000-01-01

    Discusses determining sample size for estimation of the squared multiple correlation coefficient and presents regression equations that permit determination of the sample size for estimating this parameter for up to 20 predictor variables. (SLD)

  9. Effect of degree correlations above the first shell on the percolation transition

    NASA Astrophysics Data System (ADS)

    Valdez, L. D.; Buono, C.; Braunstein, L. A.; Macri, P. A.

    2011-11-01

    The use of degree-degree correlations to model realistic networks, which are characterized by their Pearson's coefficient, has become widespread. However, the effect of different correlation algorithms on processes running on top of the resulting networks has not yet been discussed. In this letter, using different correlation algorithms to generate assortative networks, we show that for very assortative networks the behavior of the main observables in percolation processes depends on the algorithm used to build the network. The different algorithms used here introduce different inner structures that are missed by Pearson's coefficient. We explain the different behaviors through a generalization of Pearson's coefficient that allows the study of correlations at chemical distance l from a root node. We apply our findings to real networks.

  10. Evaluation of generalized heat-transfer coefficients in pilot AFBC units

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grewal, N.S.

    Experimental data for heat transfer rates as obtained in a 0.209 m^2 AFBC unit at the GFETC is examined in the light of the existing four correlations for heat transfer coefficient between an immersed staggered array of horizontal tubes and a gas-solid fluidized bed. The predicted values of heat transfer coefficient from the correlations proposed by Grewal and Bansal are found to be in good agreement with the experimental values of heat transfer coefficient when the contribution due to radiation is also included.

  11. Evaluation of generalized heat transfer coefficients in pilot AFBC units

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grewal, N.S.

    Experimental data for heat transfer rates as obtained in a 0.209 m^2 AFBC unit at the GFETC is examined in the light of the existing four correlations for heat transfer coefficient between an immersed staggered array of horizontal tubes and a gas-solid fluidized bed. The predicted values of heat transfer coefficient from the correlations proposed by Grewal and Bansal are found to be in good agreement with the experimental values of heat transfer coefficient when the contribution due to radiation is also included.

  12. Phase retrieval in generalized optical interferometry systems.

    PubMed

    Farriss, Wesley E; Fienup, James R; Malhotra, Tanya; Vamivakas, A Nick

    2018-02-05

    Modal analysis of an optical field via generalized interferometry (GI) is a novel technique that treats the field as a linear superposition of transverse modes and recovers the amplitudes of the modal weighting coefficients. We use phase retrieval by nonlinear optimization to recover the phase of these modal weighting coefficients. Information diversity increases the robustness of the algorithm by better constraining the solution. Additionally, multiple sets of random starting phase values assist the algorithm in overcoming local minima. The algorithm was able to recover nearly all coefficient phases for simulated fields consisting of up to 21 superposed Hermite-Gaussian modes and proved to be resilient to shot noise.
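
    A toy version of the phase-retrieval step can be written as a nonlinear least-squares problem: given known modal amplitudes and an intensity measurement of the superposed field, search over the unknown coefficient phases, with one phase fixed to remove the global phase. The 1-D Hermite-Gaussian modes, the three-mode field and the noiseless single measurement below are illustrative assumptions; the actual algorithm uses information diversity and multiple random starts as described above.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import eval_hermite
from math import factorial, pi

x = np.linspace(-6, 6, 400)

def hg_mode(n, x):
    """Normalized 1-D Hermite-Gaussian mode."""
    norm = 1.0 / np.sqrt(2.0**n * factorial(n) * np.sqrt(pi))
    return norm * eval_hermite(n, x) * np.exp(-x**2 / 2.0)

modes = np.array([hg_mode(n, x) for n in range(3)])
amps = np.array([1.0, 0.7, 0.5])                  # known modal amplitudes
true_phases = np.array([0.0, 1.1, -2.0])          # unknown phases (first fixed to 0)
intensity = np.abs((amps * np.exp(1j * true_phases)) @ modes) ** 2

def cost(p):
    phases = np.concatenate(([0.0], p))           # fix the global phase
    field = (amps * np.exp(1j * phases)) @ modes
    return np.sum((np.abs(field) ** 2 - intensity) ** 2)

# Several random starting points help escape local minima; intensity-only data
# leaves a global conjugation ambiguity (all phases may flip sign together).
best = min((minimize(cost, np.random.default_rng(s).uniform(-pi, pi, 2))
            for s in range(20)), key=lambda r: r.fun)
print("recovered phases:", np.round(best.x, 3), " true:", true_phases[1:])
```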

  13. Robust determination of mass attenuation coefficients of materials with unknown thickness and density

    NASA Astrophysics Data System (ADS)

    Kurudirek, M.; Medhat, M. E.

    2014-07-01

    An alternative approach is used to measure normalized mass attenuation coefficients (μ/ρ) of materials with unknown thickness and density. The adopted procedure is based on the use of simultaneous emission of Kα and Kβ X-ray lines as well as gamma peaks from radioactive sources in transmission geometry. 109Cd and 60Co radioactive sources were used for the purpose of the investigation. It has been observed that using the simultaneous X- and/or gamma rays of different energy allows accurate determination of relative mass attenuation coefficients by eliminating the dependence of μ/ρ on thickness and density of the material.
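
    The idea can be written directly from Beer-Lambert attenuation: for a slab of unknown areal density ρt measured simultaneously at two photon energies, ln(I0/I) = (μ/ρ)ρt at each energy, so the ratio of the two log-attenuations equals the ratio of the mass attenuation coefficients and ρt cancels. A numerical illustration with invented count rates follows.

```python
import numpy as np

def mu_rho_ratio(I0_a, I_a, I0_b, I_b):
    """Ratio of mass attenuation coefficients at two photon energies a and b.

    ln(I0/I) = (mu/rho) * rho * t at each energy; the areal density rho*t
    cancels in the ratio, so sample thickness and density need not be known.
    """
    return np.log(I0_a / I_a) / np.log(I0_b / I_b)

# Invented transmission measurement: two simultaneous lines through one sample.
I0_a, I_a = 10_000.0, 4_100.0     # counts without / with sample, energy a
I0_b, I_b = 8_500.0, 4_600.0      # counts without / with sample, energy b
print("(mu/rho)_a / (mu/rho)_b =", round(mu_rho_ratio(I0_a, I_a, I0_b, I_b), 3))
```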

  14. Variability in the Correlation between Asian Dust Storms and Chlorophyll a Concentration from the North to Equatorial Pacific

    PubMed Central

    Tan, Sai-Chun; Yao, Xiaohong; Gao, Hui-Wang; Shi, Guang-Yu; Yue, Xu

    2013-01-01

    A long-term record of Asian dust storms showed seven high-occurrence-frequency centers in China. The intrusion of Asian dust into the downwind seas, including the China seas, the Sea of Japan, the subarctic North Pacific, the North Pacific subtropical gyre, and the western and eastern Equatorial Pacific, has been shown to add nutrients to ocean ecosystems and enhance their biological activities. To explore the relationship between the transported dust from various sources to the six seas and oceanic biological activities with different nutrient conditions, the correlation between monthly chlorophyll a concentration in each sea and monthly dust storm occurrence frequencies reaching the sea during 1997–2007 was examined in this study. No correlations were observed between dust and chlorophyll a concentration in the <50 m China seas because atmospheric deposition is commonly believed to exert less impact on coastal seas. Significant correlations existed between dust sources and many sea areas, suggesting a link between dust and chlorophyll a concentration in those seas. However, the correlation coefficients were highly variable. In general, the correlation coefficients (0.54–0.63) for the Sea of Japan were highest, except for that between the subarctic Pacific and the Taklimakan Desert, where it was as high as 0.7. For the >50 m China seas and the North Pacific subtropical gyre, the correlation coefficients were in the range 0.32–0.57. The correlation coefficients for the western and eastern Equatorial Pacific were relatively low (<0.36). These correlation coefficients were further interpreted in terms of the geographical distributions of dust sources, the transport pathways, the dust deposition, the nutrient conditions of oceans, and the probability of dust storms reaching the seas. PMID:23460892

  15. Variability in the correlation between Asian dust storms and chlorophyll a concentration from the North to Equatorial Pacific.

    PubMed

    Tan, Sai-Chun; Yao, Xiaohong; Gao, Hui-Wang; Shi, Guang-Yu; Yue, Xu

    2013-01-01

    A long-term record of Asian dust storms showed seven high-occurrence-frequency centers in China. The intrusion of Asian dust into the downwind seas, including the China seas, the Sea of Japan, the subarctic North Pacific, the North Pacific subtropical gyre, and the western and eastern Equatorial Pacific, has been shown to add nutrients to ocean ecosystems and enhance their biological activities. To explore the relationship between the transported dust from various sources to the six seas and oceanic biological activities with different nutrient conditions, the correlation between monthly chlorophyll a concentration in each sea and monthly dust storm occurrence frequencies reaching the sea during 1997-2007 was examined in this study. No correlations were observed between dust and chlorophyll a concentration in the <50 m China seas because atmospheric deposition is commonly believed to exert less impact on coastal seas. Significant correlations existed between dust sources and many sea areas, suggesting a link between dust and chlorophyll a concentration in those seas. However, the correlation coefficients were highly variable. In general, the correlation coefficients (0.54-0.63) for the Sea of Japan were highest, except for that between the subarctic Pacific and the Taklimakan Desert, where it was as high as 0.7. For the >50 m China seas and the North Pacific subtropical gyre, the correlation coefficients were in the range 0.32-0.57. The correlation coefficients for the western and eastern Equatorial Pacific were relatively low (<0.36). These correlation coefficients were further interpreted in terms of the geographical distributions of dust sources, the transport pathways, the dust deposition, the nutrient conditions of oceans, and the probability of dust storms reaching the seas.

  16. Correlations between symptoms, nasal endoscopy, and in-office computed tomography in post-surgical chronic rhinosinusitis patients.

    PubMed

    Ryan, William R; Ramachandra, Tara; Hwang, Peter H

    2011-03-01

    To determine correlations between symptoms, nasal endoscopy findings, and computed tomography (CT) scan findings in post-surgical chronic rhinosinusitis (CRS) patients. Cross-sectional. A total of 51 CRS patients who had undergone endoscopic sinus surgery (ESS) completed symptom questionnaires, underwent endoscopy, and received an in-office sinus CT scan during one clinic visit. For metrics, we used the Sinonasal Outcomes Test-20 (SNOT-20) questionnaire, visual analog symptom scale (VAS), Lund-Kennedy endoscopy scoring scale, and Lund-MacKay (LM) CT scoring scale. We determined Pearson correlation coefficients, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) between scores for symptoms, endoscopy, and CT. The SNOT-20 score and most VAS symptoms had poor correlation coefficients with both endoscopy and CT scores (0.03-0.24). Nasal drainage of pus, nasal congestion, and impaired sense of smell had moderate correlation coefficients with endoscopy and CT (0.24-0.42). Endoscopy had a strong correlation coefficient with CT (0.76). Drainage, edema, and polyps had strong correlation coefficients with CT (0.80, 0.69, and 0.49, respectively). Endoscopy had a PPV of 92.5% and NPV of 45.5% for detecting an abnormal sinus CT (LM score ≥1). In post-ESS CRS patients, most symptoms do not correlate well with either endoscopy or CT findings. Endoscopy and CT scores correlate well. Abnormal endoscopy findings have the ability to confidently rule in the presence of CT opacification, thus validating the importance of endoscopy in clinical decision making. However, a normal endoscopy cannot assure a normal CT. Thus, symptoms, endoscopy, and CT are complementary in the evaluation of the post-ESS CRS patient. Copyright © 2011 The American Laryngological, Rhinological, and Otological Society, Inc.

  17. An experimental study on the noise correlation properties of CBCT projection data

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Ouyang, Luo; Ma, Jianhua; Huang, Jing; Chen, Wufan; Wang, Jing

    2014-03-01

    In this study, we systematically investigated the noise correlation properties among detector bins of CBCT projection data by analyzing repeated projection measurements. The measurements were performed on a TrueBeam on-board CBCT imaging system with a 4030CB flat panel detector. An anthropomorphic male pelvis phantom was used to acquire 500 repeated projection data at six different dose levels from 0.1 mAs to 1.6 mAs per projection at three fixed angles. To minimize the influence of the lag effect, lag correction was performed on the consecutively acquired projection data. The noise correlation coefficient between detector bin pairs was calculated from the corrected projection data. The noise correlation among CBCT projection data was then incorporated into the covariance matrix of the penalized weighted least-squares (PWLS) criterion for noise reduction of low-dose CBCT. The analyses of the repeated measurements show that noise correlation coefficients are non-zero between the nearest neighboring bins of CBCT projection data. The average noise correlation coefficients for the first- and second- order neighbors are 0.20 and 0.06, respectively. The noise correlation coefficients are independent of the dose level. Reconstruction of the pelvis phantom shows that the PWLS criterion with consideration of noise correlation results in a lower noise level as compared to the PWLS criterion without considering the noise correlation at the matched resolution.
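
    The estimation of bin-to-bin noise correlation from repeated acquisitions can be sketched as follows: stack the repeated projections, remove the mean at each detector bin, and correlate each bin with its first- and second-order neighbours. The synthetic repeated measurements below stand in for the 500 acquired projections and are an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(5)
repeats, bins = 500, 256

# Synthetic repeated line projections with correlated detector noise:
# each bin's noise mixes in a fraction of its neighbour's noise.
signal = 1000.0 + 200.0 * np.sin(np.linspace(0, np.pi, bins))
white = rng.normal(0, 20, (repeats, bins))
noise = white + 0.25 * np.roll(white, 1, axis=1)      # induces neighbour correlation
data = signal + noise

def neighbour_correlation(data, lag):
    """Mean correlation coefficient between bins separated by `lag`."""
    centred = data - data.mean(axis=0)
    coeffs = [np.corrcoef(centred[:, i], centred[:, i + lag])[0, 1]
              for i in range(data.shape[1] - lag)]
    return float(np.mean(coeffs))

print("first-order neighbours :", round(neighbour_correlation(data, 1), 3))
print("second-order neighbours:", round(neighbour_correlation(data, 2), 3))
```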

  18. Investigation on changes of modularity and robustness by edge-removal mutations in signaling networks.

    PubMed

    Truong, Cong-Doan; Kwon, Yung-Keun

    2017-12-21

    Biological networks consisting of molecular components and interactions are represented by a graph model. There have been some studies based on that model to analyze a relationship between structural characteristics and dynamical behaviors in signaling networks. However, little attention has been paid to changes of modularity and robustness in mutant networks. In this paper, we investigated the changes of modularity and robustness by edge-removal mutations in three signaling networks. We first observed that both the modularity and robustness increased on average in the mutant network by the edge-removal mutations. However, the modularity change was negatively correlated with the robustness change. This implies that it is unlikely that both the modularity and the robustness values simultaneously increase by the edge-removal mutations. Another interesting finding is that the modularity change was positively correlated with the degree, the number of feedback loops, and the edge betweenness of the removed edges whereas the robustness change was negatively correlated with them. We note that these results were consistently observed in randomly structured networks. Additionally, we identified two groups of genes which are incident to the highly-modularity-increasing and the highly-robustness-decreasing edges with respect to the edge-removal mutations, respectively, and observed that they are likely to be central by forming a connected component of a considerably large size. The gene-ontology enrichment of each of these gene groups was significantly different from the rest of genes. Finally, we showed that the highly-robustness-decreasing edges can be promising edgetic drug-targets, which validates the usefulness of our analysis. Taken together, the analysis of changes of robustness and modularity against edge-removal mutations can be useful to unravel novel dynamical characteristics underlying signaling networks.
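
    A small sketch of one of the measurements involved, the change of network modularity caused by removing a single edge, is shown below with networkx, using greedy community detection to score modularity before and after the removal. The example graph is an illustrative stand-in; the study's networks are signaling networks and its robustness measure is dynamical, which is not reproduced here.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

def modularity_change(g, edge):
    """Modularity difference caused by deleting one edge (positive = increase)."""
    before = modularity(g, greedy_modularity_communities(g))
    g2 = g.copy()
    g2.remove_edge(*edge)
    after = modularity(g2, greedy_modularity_communities(g2))
    return after - before

g = nx.karate_club_graph()                     # stand-in for a signaling network
ebc = nx.edge_betweenness_centrality(g)
edge = max(ebc, key=ebc.get)                   # pick a high-betweenness edge
print("removed edge:", edge,
      " delta modularity:", round(modularity_change(g, edge), 4))
```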

  19. Exponential smoothing weighted correlations

    NASA Astrophysics Data System (ADS)

    Pozzi, F.; Di Matteo, T.; Aste, T.

    2012-06-01

    In many practical applications, correlation matrices might be affected by the "curse of dimensionality" and by excessive sensitivity to outliers and remote observations. These shortcomings can cause problems of statistical robustness that are especially accentuated when a system of dynamic correlations over a running window is concerned. These drawbacks can be partially mitigated by assigning a structure of weights to observational events. In this paper, we discuss Pearson's ρ and Kendall's τ correlation matrices, weighted with an exponential smoothing, computed on moving windows using a data-set of daily returns for 300 NYSE highly capitalized companies in the period between 2001 and 2003. Criteria for jointly determining optimal weights together with the optimal length of the running window are proposed. We find that the exponential smoothing can provide more robust and reliable dynamic measures, and we discuss how a careful choice of the parameters can reduce the autocorrelation of dynamic correlations whilst keeping significance and robustness of the measure. Weighted correlations are found to be smoother and to recover faster from market turbulence than their unweighted counterparts, also helping to discriminate genuine from spurious correlations more effectively.
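
    A minimal sketch of the weighting scheme is given below: observations in a running window receive exponentially decaying weights, and the weighted means, variances and covariance yield a weighted Pearson coefficient. The decay constant and the synthetic return series are illustrative assumptions; the paper additionally treats Kendall's tau and the joint choice of window length and decay.

```python
import numpy as np

def exp_weights(window, theta):
    """Exponential weights over a window; the most recent point is weighted most."""
    ages = np.arange(window)[::-1]            # age 0 = most recent observation
    w = np.exp(-ages / theta)
    return w / w.sum()

def weighted_pearson(x, y, w):
    """Pearson correlation with observation weights that sum to one."""
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    vx = np.sum(w * (x - mx) ** 2)
    vy = np.sum(w * (y - my) ** 2)
    return cov / np.sqrt(vx * vy)

rng = np.random.default_rng(11)
window, theta = 250, 60.0                     # ~one trading year, illustrative decay
ret_a = rng.normal(0, 0.01, window)           # oldest-to-newest daily returns
ret_b = 0.6 * ret_a + rng.normal(0, 0.008, window)

w = exp_weights(window, theta)
print("unweighted:", round(np.corrcoef(ret_a, ret_b)[0, 1], 3))
print("weighted  :", round(weighted_pearson(ret_a, ret_b, w), 3))
```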

  20. A methodology for assessing the effect of correlations among muscle synergy activations on task-discriminating information.

    PubMed

    Delis, Ioannis; Berret, Bastien; Pozzo, Thierry; Panzeri, Stefano

    2013-01-01

    Muscle synergies have been hypothesized to be the building blocks used by the central nervous system to generate movement. According to this hypothesis, the accomplishment of various motor tasks relies on the ability of the motor system to recruit a small set of synergies on a single-trial basis and combine them in a task-dependent manner. It is conceivable that this requires a fine tuning of the trial-to-trial relationships between the synergy activations. Here we develop an analytical methodology to address the nature and functional role of trial-to-trial correlations between synergy activations, which is designed to help to better understand how these correlations may contribute to generating appropriate motor behavior. The algorithm we propose first divides correlations between muscle synergies into types (noise correlations, quantifying the trial-to-trial covariations of synergy activations at fixed task, and signal correlations, quantifying the similarity of task tuning of the trial-averaged activation coefficients of different synergies), and then uses single-trial methods (task-decoding and information theory) to quantify their overall effect on the task-discriminating information carried by muscle synergy activations. We apply the method to both synchronous and time-varying synergies and exemplify it on electromyographic data recorded during performance of reaching movements in different directions. Our method reveals the robust presence of information-enhancing patterns of signal and noise correlations among pairs of synchronous synergies, and shows that they enhance by 9-15% (depending on the set of tasks) the task-discriminating information provided by the synergy decompositions. We suggest that the proposed methodology could be useful for assessing whether single-trial activations of one synergy depend on activations of other synergies and quantifying the effect of such dependences on the task-to-task differences in muscle activation patterns.
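
    The division into correlation types described above can be sketched numerically: for two synergy activation coefficients recorded over trials of several tasks, the noise correlation is the correlation of the trial-to-trial fluctuations around each task mean, and the signal correlation is the correlation of the task-averaged activations across tasks. The simulated activations below are an illustrative assumption; the decoding and information-theoretic analyses are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
n_tasks, n_trials = 8, 40

# Simulated single-trial activation coefficients of two synergies.
tuning1 = np.linspace(0.2, 1.0, n_tasks)           # task tuning of synergy 1
tuning2 = tuning1[::-1]                             # synergy 2 tuned oppositely
shared = rng.normal(0, 0.1, (n_tasks, n_trials))    # shared trial-to-trial noise
act1 = tuning1[:, None] + shared + rng.normal(0, 0.05, (n_tasks, n_trials))
act2 = tuning2[:, None] + shared + rng.normal(0, 0.05, (n_tasks, n_trials))

# Noise correlation: correlate fluctuations around each task mean.
resid1 = (act1 - act1.mean(axis=1, keepdims=True)).ravel()
resid2 = (act2 - act2.mean(axis=1, keepdims=True)).ravel()
noise_corr = np.corrcoef(resid1, resid2)[0, 1]

# Signal correlation: correlate the trial-averaged task tuning curves.
signal_corr = np.corrcoef(act1.mean(axis=1), act2.mean(axis=1))[0, 1]

print(f"noise correlation : {noise_corr:+.2f}")
print(f"signal correlation: {signal_corr:+.2f}")
```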

  1. Automated Quantification of Volumetric Optic Disc Swelling in Papilledema Using Spectral-Domain Optical Coherence Tomography

    PubMed Central

    Wang, Jui-Kai; Kardon, Randy H.; Kupersmith, Mark J.; Garvin, Mona K.

    2012-01-01

    Purpose. To develop an automated method for the quantification of volumetric optic disc swelling in papilledema subjects using spectral-domain optical coherence tomography (SD-OCT) and to determine the extent that such volumetric measurements correlate with Frisén scale grades (from fundus photographs) and two-dimensional (2-D) peripapillary retinal nerve fiber layer (RNFL) and total retinal (TR) thickness measurements from SD-OCT. Methods. A custom image-analysis algorithm was developed to obtain peripapillary circular RNFL thickness, TR thickness, and TR volume measurements from SD-OCT volumes of subjects with papilledema. In addition, peripapillary RNFL thickness measures from the commercially available Zeiss SD-OCT machine were obtained. Expert Frisén scale grades were independently obtained from corresponding fundus photographs. Results. In 71 SD-OCT scans, the mean (± standard deviation) resulting TR volumes for Frisén scale 0 to scale 4 were 11.36 ± 0.56, 12.53 ± 1.21, 14.42 ± 2.11, 17.48 ± 2.63, and 21.81 ± 3.16 mm3, respectively. The Spearman's rank correlation coefficient was 0.737. Using 55 eyes with valid Zeiss RNFL measurements, Pearson's correlation coefficient (r) between the TR volume and the custom algorithm's TR thickness, the custom algorithm's RNFL thickness, and Zeiss' RNFL thickness was 0.980, 0.929, and 0.946, respectively. Between Zeiss' RNFL and the custom algorithm's RNFL, and the study's TR thickness, r was 0.901 and 0.961, respectively. Conclusions. Volumetric measurements of the degree of disc swelling in subjects with papilledema can be obtained from SD-OCT volumes, with the mean volume appearing to be roughly linearly related to the Frisén scale grade. Using such an approach can provide a more continuous, objective, and robust means for assessing the degree of disc swelling compared with presently available approaches. PMID:22599584

  2. Breakdown of interdependent directed networks.

    PubMed

    Liu, Xueming; Stanley, H Eugene; Gao, Jianxi

    2016-02-02

    Increasing evidence shows that real-world systems interact with one another via dependency connectivities. Failing connectivities are the mechanism behind the breakdown of interacting complex systems, e.g., blackouts caused by the interdependence of power grids and communication networks. Previous research analyzing the robustness of interdependent networks has been limited to undirected networks. However, most real-world networks are directed, their in-degrees and out-degrees may be correlated, and they are often coupled to one another as interdependent directed networks. To understand the breakdown and robustness of interdependent directed networks, we develop a theoretical framework based on generating functions and percolation theory. We find that for interdependent Erdős-Rényi networks the directionality within each network increases their vulnerability and exhibits hybrid phase transitions. We also find that the percolation behavior of interdependent directed scale-free networks with and without degree correlations is so complex that two criteria are needed to quantify and compare their robustness: the percolation threshold and the integrated size of the giant component during an entire attack process. Interestingly, we find that the in-degree and out-degree correlations in each network layer increase the robustness of interdependent degree heterogeneous networks that most real networks are, but decrease the robustness of interdependent networks with homogeneous degree distribution and with strong coupling strengths. Moreover, by applying our theoretical analysis to real interdependent international trade networks, we find that the robustness of these real-world systems increases with the in-degree and out-degree correlations, confirming our theoretical analysis.

  3. Parametric synthesis of a robust controller on a base of mathematical programming method

    NASA Astrophysics Data System (ADS)

    Khozhaev, I. V.; Gayvoronskiy, S. A.; Ezangina, T. A.

    2018-05-01

    This paper derives sufficient conditions linking root indices of robust control quality to the coefficients of an interval characteristic polynomial, on the basis of a mathematical programming method. Building on these conditions, a method was developed for synthesizing PI- and PID-controllers that provide an aperiodic transient process with an acceptable stability degree and, consequently, an acceptable settling time. The method was applied to the problem of synthesizing a controller for the depth control system of an unmanned underwater vehicle.

  4. Enhancing robustness and immunization in geographical networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang Liang; Department of Physics, Lanzhou University, Lanzhou 730000; Yang Kongqing

    2007-03-15

    We find that different geographical structures of networks lead to varied percolation thresholds, although these networks may have similar abstract topological structures. Thus, strategies for enhancing robustness and immunization of a geographical network are proposed. Using the generating function formalism, we obtain an explicit form of the percolation threshold q_c for networks containing arbitrary order cycles. For three-cycles, the dependence of q_c on the clustering coefficients is ascertained. The analysis substantiates the validity of the strategies with analytical evidence.

  5. Closure and ratio correlation analysis of lunar chemical and grain size data

    NASA Technical Reports Server (NTRS)

    Butler, J. C.

    1976-01-01

    Major element and major element plus trace element analyses were selected from the lunar data base for Apollo 11, 12 and 15 basalt and regolith samples. Summary statistics for each of the six data sets were compiled, and the effects of closure on the Pearson product moment correlation coefficient were investigated using the Chayes and Kruskal approximation procedure. In general, there are two types of closure effects evident in these data sets: negative correlations of intermediate size which are solely the result of closure, and correlations of small absolute value which depart significantly from their expected closure correlations which are of intermediate size. It is shown that a positive closure correlation will arise only when the product of the coefficients of variation is very small (less than 0.01 for most data sets) and, in general, trace elements in the lunar data sets exhibit relatively large coefficients of variation.

  6. Correlation Between Posttraumatic Growth and Posttraumatic Stress Disorder Symptoms Based on Pearson Correlation Coefficient: A Meta-Analysis.

    PubMed

    Liu, An-Nuo; Wang, Lu-Lu; Li, Hui-Ping; Gong, Juan; Liu, Xiao-Hong

    2017-05-01

    The literature on posttraumatic growth (PTG) is burgeoning, with the inconsistencies in the literature of the relationship between PTG and posttraumatic stress disorder (PTSD) symptoms becoming a focal point of attention. Thus, this meta-analysis aims to explore the relationship between PTG and PTSD symptoms through the Pearson correlation coefficient. A systematic search of the literature from January 1996 to November 2015 was completed. We retrieved reports on 63 studies that involved 26,951 patients. The weighted correlation coefficient revealed an effect size of 0.22 with a 95% confidence interval of 0.18 to 0.25. Meta-analysis provides evidence that PTG may be positively correlated with PTSD symptoms and that this correlation may be modified by age, trauma type, and time since trauma. Accordingly, people with high levels of PTG should not be ignored, but rather, they should continue to receive help to alleviate their PTSD symptoms.

  7. Effective Visual Tracking Using Multi-Block and Scale Space Based on Kernelized Correlation Filters

    PubMed Central

    Jeong, Soowoong; Kim, Guisik; Lee, Sangkeun

    2017-01-01

    Accurate scale estimation and occlusion handling is a challenging problem in visual tracking. Recently, correlation filter-based trackers have shown impressive results in terms of accuracy, robustness, and speed. However, the model is not robust to scale variation and occlusion. In this paper, we address the problems associated with scale variation and occlusion by employing a scale space filter and multi-block scheme based on a kernelized correlation filter (KCF) tracker. Furthermore, we develop a more robust algorithm using an appearance update model that approximates the change of state of occlusion and deformation. In particular, an adaptive update scheme is presented to make each process robust. The experimental results demonstrate that the proposed method outperformed 29 state-of-the-art trackers on 100 challenging sequences. Specifically, the results obtained with the proposed scheme were improved by 8% and 18% compared to those of the KCF tracker for 49 occlusion and 64 scale variation sequences, respectively. Therefore, the proposed tracker can be a robust and useful tool for object tracking when occlusion and scale variation are involved. PMID:28241475

  8. Effective Visual Tracking Using Multi-Block and Scale Space Based on Kernelized Correlation Filters.

    PubMed

    Jeong, Soowoong; Kim, Guisik; Lee, Sangkeun

    2017-02-23

    Accurate scale estimation and occlusion handling is a challenging problem in visual tracking. Recently, correlation filter-based trackers have shown impressive results in terms of accuracy, robustness, and speed. However, the model is not robust to scale variation and occlusion. In this paper, we address the problems associated with scale variation and occlusion by employing a scale space filter and multi-block scheme based on a kernelized correlation filter (KCF) tracker. Furthermore, we develop a more robust algorithm using an appearance update model that approximates the change of state of occlusion and deformation. In particular, an adaptive update scheme is presented to make each process robust. The experimental results demonstrate that the proposed method outperformed 29 state-of-the-art trackers on 100 challenging sequences. Specifically, the results obtained with the proposed scheme were improved by 8% and 18% compared to those of the KCF tracker for 49 occlusion and 64 scale variation sequences, respectively. Therefore, the proposed tracker can be a robust and useful tool for object tracking when occlusion and scale variation are involved.

  9. [Examination of diagnosis procedure combination survey data that influence function evaluation coefficient II].

    PubMed

    Nakajima, Hisato; Yano, Kouya; Nagasawa, Kaoko; Kobayashi, Eiji; Yokota, Kuninobu

    2015-01-01

    On the basis of Diagnosis Procedure Combination (DPC) survey data, the factors that increase the value of function evaluation coefficient II were considered. A total of 1,505 hospitals were divided into groups I, II, and III, and the following items were considered. 1. Significant differences in function evaluation coefficient II and DPC survey data. 2. Examination using the Mahalanobis-Taguchi (MT) method. 3. Correlation between function evaluation coefficient II and each DPC survey data item. 1. Function evaluation coefficient II was highest in group II. Group I hospitals showed the highest bed capacity, and numbers of hospitalization days, operations, chemotherapies, radiotherapies and general anesthesia procedures. 2. Using the MT method, we found that the number of ambulance conveyances was an effective factor in group I hospitals, the number of general anesthesia procedures was an effective factor in group II hospitals, and the bed capacity was an effective factor in group III hospitals. 3. In group I hospitals, function evaluation coefficient II significantly correlated to the numbers of ambulance conveyances and chemotherapies. In group II hospitals, function evaluation coefficient II significantly correlated to bed capacity, the numbers of ambulance conveyances, hospitalization days, operations, general anesthesia procedures, and mean hospitalization days. In group III hospitals, function evaluation coefficient II significantly correlated to all items. The factors that improve the value of function evaluation coefficient II were increases in the numbers of ambulance conveyances, chemotherapies and radiotherapies in group I hospitals, increases in the numbers of hospitalization days, operations, ambulance conveyances and general anesthesia procedures in group II hospitals, and increases in the numbers of hospitalization days, operations and ambulance conveyances in group III hospitals. These results indicate that the profit of a hospital will increase, which will lead to medical services of good quality.

  10. Estimating a graphical intra-class correlation coefficient (GICC) using multivariate probit-linear mixed models.

    PubMed

    Yue, Chen; Chen, Shaojie; Sair, Haris I; Airan, Raag; Caffo, Brian S

    2015-09-01

    Data reproducibility is a critical issue in all scientific experiments. In this manuscript, the problem of quantifying the reproducibility of graphical measurements is considered. The image intra-class correlation coefficient (I2C2) is generalized and the graphical intra-class correlation coefficient (GICC) is proposed for such purpose. The concept for GICC is based on multivariate probit-linear mixed effect models. A Markov Chain Monte Carlo EM (MCMC-EM) algorithm is used for estimating the GICC. Simulation results with varied settings are demonstrated and our method is applied to the KIRBY21 test-retest dataset.
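
    For intuition about the scalar quantity that the GICC generalizes, a one-way random-effects intra-class correlation can be computed directly from a subjects-by-replicates table as ICC(1) = (MSB - MSW) / (MSB + (k - 1) MSW). A minimal sketch with simulated test-retest data follows; the multivariate probit-linear mixed model and the MCMC-EM estimation of the paper are not reproduced.

```python
import numpy as np

def icc_oneway(data):
    """ICC(1) from an (n_subjects x k_replicates) array, one-way random effects."""
    n, k = data.shape
    grand = data.mean()
    subj_means = data.mean(axis=1)
    msb = k * np.sum((subj_means - grand) ** 2) / (n - 1)              # between subjects
    msw = np.sum((data - subj_means[:, None]) ** 2) / (n * (k - 1))    # within subjects
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(8)
n_subjects, k = 21, 2                     # e.g. a test-retest design
subject_effect = rng.normal(0, 1.0, n_subjects)
data = subject_effect[:, None] + rng.normal(0, 0.5, (n_subjects, k))
print("ICC(1) =", round(icc_oneway(data), 3))
```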

  11. Molecular dimensions and surface diffusion assisted mechanically robust slippery perfluoropolyether impregnated mesoporous alumina interfaces

    NASA Astrophysics Data System (ADS)

    Rowthu, Sriharitha; Balic, Edin E.; Hoffmann, Patrik

    2017-12-01

    Accomplishing mechanically robust omniphobic surfaces is a long-existing challenge, and such surfaces can potentially find applications in bioengineering, tribology and the paint industry. Slippery liquid impregnated mesoporous α-Al2O3 interfaces are achieved with water, alkanes, and water based and oil based high viscosity acrylic paints. Incredibly high abrasion-resistance (wear coefficients ≤ 10^-8 mm^3 N^-1 m^-1) and ultra-low friction coefficients (≥ 0.025) are attained, owing to the hard alumina matrix and continuous replenishment of perfluoropolyether aided by capillarity and surface diffusion processes. The variety of impregnating liquids employed suggests that large molecules, faster surface diffusion and the lowest evaporation rate generate the rare combination of high wear-resistance and omniphobicity. It is noteworthy that these novel liquid impregnated Al2O3 composites exhibit outstanding load bearing capacity up to 350 MPa, three orders of magnitude higher than achievable by state of the art omniphobic surfaces. Further, our thermodynamic calculations suggest that the relative thermodynamic stability of liquid impregnated composites is linearly proportional to the spreading coefficient (S) of the impregnating liquid with the matrix material and is an important tool for the selection of an appropriate matrix material for a given liquid.

  12. Simulation-based coefficients for adjusting climate impact on energy consumption of commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Makhmalbaf, Atefe; Srivastava, Viraj

    This paper presents a new technique for and the results of normalizing building energy consumption to enable a fair comparison among various types of buildings located near different weather stations across the U.S. The method was developed for the U.S. Building Energy Asset Score, a whole-building energy efficiency rating system focusing on building envelope, mechanical systems, and lighting systems. The Asset Score is calculated based on simulated energy use under standard operating conditions. Existing weather normalization methods such as those based on heating and cooling degree days are not robust enough to adjust all climatic factors such as humidity and solar radiation. In this work, over 1000 sets of climate coefficients were developed to separately adjust building heating, cooling, and fan energy use at each weather station in the United States. This paper also presents a robust, standardized weather station mapping based on climate similarity rather than choosing the closest weather station. This proposed simulation-based climate adjustment was validated through testing on several hundreds of thousands of modeled buildings. Results indicated the developed climate coefficients can isolate and adjust for the impacts of local climate for asset rating.

  13. Stress Field Variation after the 2001 Skyros Earthquake, Greece, Derived from Seismicity Rate Changes

    NASA Astrophysics Data System (ADS)

    Leptokaropoulos, K.; Papadimitriou, E.; Orlecka-Sikora, B.; Karakostas, V.

    2012-04-01

    The spatial variation of the stress field (ΔCFF) after the 2001 strong (Mw=6.4) Skyros earthquake in the North Aegean Sea, Greece, is investigated in association with the changes of earthquake production rates. A detailed slip model is considered in which the causative fault consists of several sub-faults with different coseismic slip on each of them. First, the spatial distribution of aftershock productivity is compared with the static stress changes due to the coseismic slip. Calculations of ΔCFF are performed at different depths inside the seismogenic layer, defined from the vertical distribution of the aftershocks. Seismicity rates of the smaller magnitude events with M≥Mc for different time increments before and after the main shock are then derived from the application of a Probability Density Function (PDF). These rates are computed by spatially smoothing the seismicity; for this purpose a normal grid of rectangular cells is superimposed onto the area and the PDF determines seismicity rate values at the center of each cell. The differences between the earthquake occurrence rates before and after the main shock are compared and used as input data in a stress inversion algorithm based upon the Rate/State dependent friction concept in order to provide an independent estimation of stress changes. This model incorporates the physical properties of the fault zones (characteristic relaxation time, fault constitutive parameters, effective friction coefficient) with a probabilistic estimation of the spatial distribution of seismicity rates, derived from the application of the PDF. The stress patterns derived from the previously mentioned approaches are compared, and the quantitative correlation between the respective results is accomplished by evaluating the Pearson linear correlation coefficient and its confidence intervals to quantify their significance. Different assumptions and combinations of the physical and statistical parameters are tested so that the model performance and robustness can be evaluated. Simulations will provide a measure of how robust the use of seismicity rate changes is as a stress meter for both positive and negative stress steps. This work was partially prepared within the framework of the research projects No. N N307234937 and 3935/B/T02/2010/39 financed by the Ministry of Education and Science of Poland during the period 2009 to 2011 and 2010 to 2012, respectively.

  14. QSAR modeling of flotation collectors using principal components extracted from topological indices.

    PubMed

    Natarajan, R; Nirdosh, Inderjit; Basak, Subhash C; Mills, Denise R

    2002-01-01

    Several topological indices were calculated for substituted-cupferrons that were tested as collectors for the froth flotation of uranium. The principal component analysis (PCA) was used for data reduction. Seven principal components (PC) were found to account for 98.6% of the variance among the computed indices. The principal components thus extracted were used in stepwise regression analyses to construct regression models for the prediction of separation efficiencies (Es) of the collectors. A two-parameter model with a correlation coefficient of 0.889 and a three-parameter model with a correlation coefficient of 0.913 were formed. PCs were found to be better than partition coefficient to form regression equations, and inclusion of an electronic parameter such as Hammett sigma or quantum mechanically derived electronic charges on the chelating atoms did not improve the correlation coefficient significantly. The method was extended to model the separation efficiencies of mercaptobenzothiazoles (MBT) and aminothiophenols (ATP) used in the flotation of lead and zinc ores, respectively. Five principal components were found to explain 99% of the data variability in each series. A three-parameter equation with correlation coefficient of 0.985 and a two-parameter equation with correlation coefficient of 0.926 were obtained for MBT and ATP, respectively. The amenability of separation efficiencies of chelating collectors to QSAR modeling using PCs based on topological indices might lead to the selection of collectors for synthesis and testing from a virtual database.
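
    The modeling pipeline described, extracting principal components from a block of topological indices and regressing the separation efficiency on a few of them, can be sketched with scikit-learn as below. The random "indices" and responses are placeholders for the computed descriptors and measured separation efficiencies, which are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n_collectors, n_indices = 20, 40

# Placeholder topological indices and separation efficiencies for the collectors.
X = rng.normal(size=(n_collectors, n_indices))
es = 60 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(0, 2, n_collectors)

# Standardize, extract principal components, then regress Es on the first few PCs.
Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=7).fit(Xs)
scores = pca.transform(Xs)
print("variance explained by 7 PCs:", round(pca.explained_variance_ratio_.sum(), 3))

model = LinearRegression().fit(scores[:, :3], es)
r = np.corrcoef(model.predict(scores[:, :3]), es)[0, 1]
print("correlation coefficient of 3-PC model:", round(r, 3))
```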

  15. Different hip and knee priority score systems: are they good for the same thing?

    PubMed

    Escobar, Antonio; Quintana, Jose Maria; Espallargues, Mireia; Allepuz, Alejandro; Ibañez, Berta

    2010-10-01

    The aim of the present study was to compare two priority tools used for joint replacement for patients on waiting lists, which use two different methods. Two prioritization tools developed and validated by different methodologies were used on the same cohort of patients. The first, an IRYSS hip and knee priority score (IHKPS) developed by the RAND method, was applied while patients were on the waiting list. The other, a Catalonia hip-knee priority score (CHKPS) developed by conjoint analysis, was adapted and applied retrospectively. In addition, all patients completed the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) before the intervention. Correlation between the two tools was studied by the Pearson correlation coefficient (r). Agreement was analysed by means of the intra-class correlation coefficient (ICC), Kendall coefficient and Cohen's kappa. The relationships of IHKPS and CHKPS with baseline WOMAC scores were also examined using the r coefficient. The sample consisted of 774 consecutive patients. The Pearson correlation coefficient between IHKPS and CHKPS was 0.79. The agreement study showed that the ICC was 0.74, the Kendall coefficient 0.86 and kappa 0.66. Finally, correlations between CHKPS and baseline WOMAC ranged from 0.43 to 0.64, and the corresponding correlations between IHKPS and WOMAC ranged from 0.50 to 0.74. Results support the hypothesis that, if the final objective of the prioritization tools is to organize and sort patients on the waiting list, they yield similar results even though they use different methodologies. © 2010 Blackwell Publishing Ltd.

  16. Improved workflow for quantification of left ventricular volumes and mass using free-breathing motion corrected cine imaging.

    PubMed

    Cross, Russell; Olivieri, Laura; O'Brien, Kendall; Kellman, Peter; Xue, Hui; Hansen, Michael

    2016-02-25

    Traditional cine imaging for cardiac functional assessment requires breath-holding, which can be problematic in some situations. Free-breathing techniques have relied on multiple averages or real-time imaging, producing images that can be spatially and/or temporally blurred. To overcome this, methods have been developed to acquire real-time images over multiple cardiac cycles, which are subsequently motion corrected and reformatted to yield a single image series displaying one cardiac cycle with high temporal and spatial resolution. Application of these algorithms has required significant additional reconstruction time. The use of distributed computing was recently proposed as a way to improve clinical workflow with such algorithms. In this study, we have deployed a distributed computing version of motion corrected re-binning reconstruction for free-breathing evaluation of cardiac function. Twenty five patients and 25 volunteers underwent cardiovascular magnetic resonance (CMR) for evaluation of left ventricular end-systolic volume (ESV), end-diastolic volume (EDV), and end-diastolic mass. Measurements using motion corrected re-binning were compared to those using breath-held SSFP and to free-breathing SSFP with multiple averages, and were performed by two independent observers. Pearson correlation coefficients and Bland-Altman plots tested agreement across techniques. Concordance correlation coefficient and Bland-Altman analysis tested inter-observer variability. Total scan plus reconstruction times were tested for significant differences using paired t-test. Measured volumes and mass obtained by motion corrected re-binning and by averaged free-breathing SSFP compared favorably to those obtained by breath-held SSFP (r = 0.9863/0.9813 for EDV, 0.9550/0.9685 for ESV, 0.9952/0.9771 for mass). Inter-observer variability was good with concordance correlation coefficients between observers across all acquisition types suggesting substantial agreement. Both motion corrected re-binning and averaged free-breathing SSFP acquisition and reconstruction times were shorter than breath-held SSFP techniques (p < 0.0001). On average, motion corrected re-binning required 3 min less than breath-held SSFP imaging, a 37% reduction in acquisition and reconstruction time. The motion corrected re-binning image reconstruction technique provides robust cardiac imaging that can be used for quantification that compares favorably to breath-held SSFP as well as multiple average free-breathing SSFP, but can be obtained in a fraction of the time when using cloud-based distributed computing reconstruction.

  17. Reliability and validity of the Positive Mental Health Questionnaire in a sample of Spanish university students.

    PubMed

    Roldán-Merino, J; Lluch-Canut, M T; Casas, I; Sanromà-Ortíz, M; Ferré-Grau, C; Sequeira, C; Falcó-Pegueroles, A; Soares, D; Puig-Llobet, M

    2017-03-01

    WHAT IS KNOWN ON THE SUBJECT?: In general, the current studies of positive mental health use questionnaires or parts thereof. However, while these questionnaires evaluate aspects of positive mental health, they fail to measure the construct itself. WHAT DOES THIS PAPER ADD TO EXISTING KNOWLEDGE?: The widespread use and the lack of specific questionnaires for evaluating the positive mental health construct justify the need to measure the robustness of the Positive Mental Health Questionnaire. Also, six factors are proposed to measure positive mental health. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: The availability of a good questionnaire to measure positive mental health in university students is useful not only to promote mental health but also to strengthen the curricula of future professionals. Introduction Nursing has a relevant role in managing mental health. It is important to identify and thereafter to enhance positive aspects of mental health among university nursing students. Aim The aim of the present study was to analyse the psychometric properties of the Positive Mental Health Questionnaire (PMHQ) in terms of reliability and validity using confirmatory factor analysis in a sample of university students. Method A cross-sectional study was carried out in a sample of 1091 students at 4 nursing schools in Catalonia, Spain. The reliability of the PMHQ was measured by means of Cronbach's alpha coefficient, and the test-retest stability was measured with the intraclass correlation coefficient (ICC). Confirmatory factor analysis was used to determine the validity of the factorial structure. Results Cronbach's alpha coefficient was satisfactory (>0.70) for four of the six subscales or dimensions and ranged from 0.54 to 0.79. ICC analysis was satisfactory for the six subscales or dimensions. The hypothesis was confirmed in the analysis of the correlations between the subscales and the overall scale, with the strongest correlations being found between the majority of the subscales and the overall scale. Confirmatory factor analysis showed that the model proposed for the factors fit the data satisfactorily. Discussion This scale is a valid and reliable instrument for evaluating positive mental health in university students. Implications for Practice A good questionnaire to measure positive mental health in university students is useful not only to promote mental health but also to strengthen the curricula of future professionals. © 2017 John Wiley & Sons Ltd.

  18. High-resolution correlation

    NASA Astrophysics Data System (ADS)

    Nelson, D. J.

    2007-09-01

    In the basic correlation process, a sequence of time-lag-indexed correlation coefficients is computed as the inner or dot product of segments of two signals. The time-lag(s) for which the magnitude of the correlation coefficient sequence is maximized is the estimated relative time delay of the two signals. For discretely sampled signals, the delay estimated in this manner is quantized with the same relative accuracy as the clock used in sampling the signals. In addition, the correlation coefficients are real if the input signals are real. Many methods have been proposed to estimate signal delay with greater accuracy than the sample interval of the digitizer clock, with some success. These methods include interpolation of the correlation coefficients, estimation of the signal delay from the group delay function, and beamforming techniques such as the MUSIC algorithm. For spectral estimation, techniques based on phase differentiation have been popular, but these techniques have apparently not been applied to the correlation problem. We propose a phase-based delay estimation method (PBDEM), based on the phase of the correlation function, that provides a significant improvement in the accuracy of time delay estimation. In the process, the standard correlation function is first calculated. A time lag error function is then calculated from the correlation phase and is used to interpolate the correlation function. The signal delay is shown to be accurately estimated as the zero crossing of the correlation phase near the index of the peak correlation magnitude. This process is nearly as fast as the conventional correlation function on which it is based. For real-valued signals, a simple modification is provided, which results in the same correlation accuracy as is obtained for complex-valued signals.
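
    A minimal sketch of the idea behind phase-based delay refinement follows: compute the standard cross-correlation, locate the peak, then interpolate the zero crossing of the correlation phase near that peak. The signal model, sampling values and variable names are illustrative assumptions, not the PBDEM implementation itself.

```python
# Sub-sample delay estimation from the phase of the cross-correlation
# (a sketch of the idea described above, not the author's PBDEM code).
import numpy as np

fs, f0, true_delay = 1000.0, 50.0, 12.37         # sample rate (Hz), carrier (Hz), delay (samples)
n = 1024
t = np.arange(n) / fs
x = np.exp(-((t - 0.3) ** 2) / (2 * 0.02 ** 2)) * np.exp(2j * np.pi * f0 * t)

# Delay x by a fractional number of samples in the frequency domain.
freqs = np.fft.fftfreq(n, d=1 / fs)
y = np.fft.ifft(np.fft.fft(x) * np.exp(-2j * np.pi * freqs * true_delay / fs))

# Cross-correlation via FFT; indices 0..n-1 are lags 0..n-1, the rest are negative lags.
R = np.fft.ifft(np.fft.fft(y, 2 * n) * np.conj(np.fft.fft(x, 2 * n)))
lags = np.concatenate((np.arange(n), np.arange(-n, 0)))
k = np.argmax(np.abs(R))                          # coarse (integer) lag estimate

# Refine: the correlation phase crosses zero at the true delay near the peak.
phase = np.angle(R)
k2 = k + 1 if phase[k] * phase[(k + 1) % (2 * n)] < 0 else k - 1
frac = phase[k] / (phase[k] - phase[k2])          # linear interpolation of the zero crossing
estimate = lags[k] + frac * (lags[k2] - lags[k])
print(f"coarse lag = {lags[k]}, refined delay = {estimate:.3f} samples (true {true_delay})")
```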

  19. A Robust and Multi-Weighted Approach to Estimating Topographically Correlated Tropospheric Delays in Radar Interferograms

    PubMed Central

    Zhu, Bangyan; Li, Jiancheng; Chu, Zhengwei; Tang, Wei; Wang, Bin; Li, Dawei

    2016-01-01

    Spatial and temporal variations in the vertical stratification of the troposphere introduce significant propagation delays in interferometric synthetic aperture radar (InSAR) observations. Observations of small amplitude surface deformations and regional subsidence rates are plagued by tropospheric delays, which are strongly correlated with topographic height variations. Phase-based tropospheric correction techniques assuming a linear relationship between interferometric phase and topography have been exploited and developed, with mixed success. Producing robust estimates of tropospheric phase delay, however, plays a critical role in increasing the accuracy of InSAR measurements. Meanwhile, few phase-based correction methods account for the spatially variable tropospheric delay over larger study regions. Here, we present a robust and multi-weighted approach to estimate the correlation between phase and topography that is relatively insensitive to confounding processes such as regional subsidence over larger regions as well as under varying tropospheric conditions. An expanded form of robust least squares is introduced to estimate the spatially variable correlation between phase and topography by splitting the interferograms into multiple blocks. Within each block, the correlation is robustly estimated from the band-filtered phase and topography. Phase-elevation ratios are multiply weighted and extrapolated to each persistent scatterer (PS) pixel. We applied the proposed method to Envisat ASAR images over the Southern California area, USA, and found that our method mitigated the atmospheric noise better than the conventional phase-based method. The corrected ground surface deformation agreed better with that measured by GPS. PMID:27420066

  20. A Robust and Multi-Weighted Approach to Estimating Topographically Correlated Tropospheric Delays in Radar Interferograms.

    PubMed

    Zhu, Bangyan; Li, Jiancheng; Chu, Zhengwei; Tang, Wei; Wang, Bin; Li, Dawei

    2016-07-12

    Spatial and temporal variations in the vertical stratification of the troposphere introduce significant propagation delays in interferometric synthetic aperture radar (InSAR) observations. Observations of small amplitude surface deformations and regional subsidence rates are plagued by tropospheric delays, which are strongly correlated with topographic height variations. Phase-based tropospheric correction techniques assuming a linear relationship between interferometric phase and topography have been exploited and developed, with mixed success. Producing robust estimates of tropospheric phase delay, however, plays a critical role in increasing the accuracy of InSAR measurements. Meanwhile, few phase-based correction methods account for the spatially variable tropospheric delay over larger study regions. Here, we present a robust and multi-weighted approach to estimate the correlation between phase and topography that is relatively insensitive to confounding processes such as regional subsidence over larger regions as well as under varying tropospheric conditions. An expanded form of robust least squares is introduced to estimate the spatially variable correlation between phase and topography by splitting the interferograms into multiple blocks. Within each block, the correlation is robustly estimated from the band-filtered phase and topography. Phase-elevation ratios are multiply weighted and extrapolated to each persistent scatterer (PS) pixel. We applied the proposed method to Envisat ASAR images over the Southern California area, USA, and found that our method mitigated the atmospheric noise better than the conventional phase-based method. The corrected ground surface deformation agreed better with that measured by GPS.
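
    The core ingredient of the approach, a robust (outlier-insensitive) fit of the phase-elevation ratio within one block, can be sketched with a simple Huber-weighted iteratively reweighted least squares. Block splitting, band filtering and the multi-weighting and extrapolation steps of the paper are omitted, and all data here are synthetic.

```python
# Robust (Huber-weighted) estimate of a phase-elevation ratio for one block (sketch only).
import numpy as np

def huber_fit(h, phi, c=1.345, n_iter=20):
    """Iteratively reweighted least squares for phi ~ k*h + b with Huber weights."""
    X = np.column_stack([h, np.ones_like(h)])
    w = np.ones_like(phi)
    beta = np.zeros(2)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(X * sw[:, None], phi * sw, rcond=None)
        r = phi - X @ beta
        s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12   # robust scale (MAD)
        w = np.minimum(1.0, c / (np.abs(r) / s + 1e-12))           # Huber weights
    return beta                                                    # (ratio, intercept)

rng = np.random.default_rng(1)
h = rng.uniform(0, 2000, 500)                      # elevations (m)
phi = 0.01 * h + rng.normal(0, 0.5, 500)           # topography-correlated phase + noise
phi[:25] += 15.0                                   # a deforming patch acting as outliers
print(huber_fit(h, phi))                           # slope should stay near 0.01
```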

  1. Development of stock correlation networks using mutual information and financial big data.

    PubMed

    Guo, Xue; Zhang, Hu; Tian, Tianhai

    2018-01-01

    Stock correlation networks use stock price data to explore the relationship between different stocks listed in the stock market. Currently this relationship is dominantly measured by the Pearson correlation coefficient. However, financial data suggest that nonlinear relationships may exist in the stock prices of different shares. To address this issue, this work uses mutual information to characterize the nonlinear relationship between stocks. Using 280 stocks traded at the Shanghai Stocks Exchange in China during the period of 2014-2016, we first compare the effectiveness of the correlation coefficient and mutual information for measuring stock relationships. Based on these two measures, we then develop two stock networks using the Minimum Spanning Tree method and study the topological properties of these networks, including degree, path length and the power-law distribution. The relationship network based on mutual information has a better distribution of the degree and larger value of the power-law distribution than those using the correlation coefficient. Numerical results show that mutual information is a more effective approach than the correlation coefficient to measure the stock relationship in a stock market that may undergo large fluctuations of stock prices.

  2. Development of stock correlation networks using mutual information and financial big data

    PubMed Central

    Guo, Xue; Zhang, Hu

    2018-01-01

    Stock correlation networks use stock price data to explore the relationship between different stocks listed in the stock market. Currently this relationship is dominantly measured by the Pearson correlation coefficient. However, financial data suggest that nonlinear relationships may exist in the stock prices of different shares. To address this issue, this work uses mutual information to characterize the nonlinear relationship between stocks. Using 280 stocks traded at the Shanghai Stocks Exchange in China during the period of 2014-2016, we first compare the effectiveness of the correlation coefficient and mutual information for measuring stock relationships. Based on these two measures, we then develop two stock networks using the Minimum Spanning Tree method and study the topological properties of these networks, including degree, path length and the power-law distribution. The relationship network based on mutual information has a better distribution of the degree and larger value of the power-law distribution than those using the correlation coefficient. Numerical results show that mutual information is a more effective approach than the correlation coefficient to measure the stock relationship in a stock market that may undergo large fluctuations of stock prices. PMID:29668715
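
    The network construction outlined above can be sketched as follows: estimate pairwise mutual information from discretized returns, convert it to a distance, and extract a Minimum Spanning Tree. The discretization, the MI-to-distance mapping and the synthetic returns are assumptions for illustration, not the paper's exact procedure.

```python
# Sketch: a stock network from a mutual-information-based distance and the Minimum Spanning Tree.
import numpy as np
import networkx as nx
from sklearn.metrics import mutual_info_score

def mi_matrix(returns, bins=8):
    """Pairwise mutual information of quantile-discretized return series (columns = stocks)."""
    q = np.array([np.digitize(r, np.quantile(r, np.linspace(0, 1, bins + 1)[1:-1]))
                  for r in returns.T])
    n = q.shape[0]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            mi[i, j] = mi[j, i] = mutual_info_score(q[i], q[j])
    return mi

rng = np.random.default_rng(0)
returns = rng.normal(size=(500, 20))               # 500 days x 20 hypothetical stocks
mi = mi_matrix(returns)
dist = np.sqrt(np.maximum(mi.max() - mi, 0.0))     # one simple monotone MI-to-distance map

G = nx.from_numpy_array(dist)
mst = nx.minimum_spanning_tree(G)
print(sorted(dict(mst.degree()).values(), reverse=True)[:5])   # largest node degrees
```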

  3. SU-F-R-31: Identification of Robust Normal Lung CT Texture Features for the Prediction of Radiation-Induced Lung Disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, W; Riyahi, S; Lu, W

    Purpose: Normal lung CT texture features have been used for the prediction of radiation-induced lung disease (radiation pneumonitis and radiation fibrosis). For these features to be clinically useful, they need to be relatively invariant (robust) to tumor size and not correlated with normal lung volume. Methods: The free-breathing CTs of 14 lung SBRT patients were studied. Different sizes of GTVs were simulated with spheres placed in the upper lobe and lower lobe, respectively, in the normal lung (contralateral to tumor). 27 texture features (9 from the intensity histogram, 8 from the grey-level co-occurrence matrix [GLCM] and 10 from the grey-level run-length matrix [GLRM]) were extracted from [normal lung-GTV]. To measure the variability of a feature F, the relative difference D = |Fref - Fsim|/Fref * 100% was calculated, where Fref was for the entire normal lung and Fsim was for [normal lung-GTV]. A feature was considered robust if the largest non-outlier (Q3+1.5*IQR) D was less than 5%, and considered not correlated with normal lung volume when its Pearson correlation was lower than 0.50. Results: Only 11 features were robust. All first-order intensity-histogram features (mean, max, etc.) were robust, while most higher-order features (skewness, kurtosis, etc.) were unrobust. Only two of the GLCM and four of the GLRM features were robust. Larger GTVs resulted in greater feature variation; this was particularly true for unrobust features. None of the robust features was correlated with normal lung volume, while three unrobust features showed high correlation. Excessive variations were observed in two low grey-level run features and were later identified to come from one patient with local lung disease (atelectasis) in the normal lung. There was no dependence on GTV location. Conclusion: We identified 11 robust normal lung CT texture features that can be further examined for the prediction of radiation-induced lung disease. Interestingly, low grey-level run features identified normal lung diseases. This work was supported in part by National Cancer Institute Grant R01CA172638.
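
    The robustness criterion described in the Methods (largest non-outlier relative difference below 5%, with outliers defined by the Q3+1.5*IQR rule) can be expressed compactly; the numbers below are made up for illustration.

```python
# Sketch of the robustness check described above (illustrative values, not study data).
import numpy as np

def is_robust(f_ref, f_sim, tol=5.0):
    """f_ref: feature on the whole normal lung; f_sim: values for each simulated GTV."""
    d = np.abs(f_ref - np.asarray(f_sim)) / abs(f_ref) * 100.0     # relative differences (%)
    q1, q3 = np.percentile(d, [25, 75])
    non_outlier = d[d <= q3 + 1.5 * (q3 - q1)]                     # Tukey rule: drop values above Q3+1.5*IQR
    return non_outlier.max() < tol

print(is_robust(100.0, [99.2, 100.8, 101.3, 97.0, 108.0]))         # the 108.0 is treated as an outlier
```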

  4. Important LiDAR metrics for discriminating forest tree species in Central Europe

    NASA Astrophysics Data System (ADS)

    Shi, Yifang; Wang, Tiejun; Skidmore, Andrew K.; Heurich, Marco

    2018-03-01

    Numerous airborne LiDAR-derived metrics have been proposed for classifying tree species. Yet an in-depth ecological and biological understanding of the significance of these metrics for tree species mapping remains largely unexplored. In this paper, we evaluated the performance of 37 frequently used LiDAR metrics derived under leaf-on and leaf-off conditions, respectively, for discriminating six different tree species in a natural forest in Germany. We firstly assessed the correlation between these metrics. Then we applied a Random Forest algorithm to classify the tree species and evaluated the importance of the LiDAR metrics. Finally, we identified the most important LiDAR metrics and tested their robustness and transferability. Our results indicated that about 60% of LiDAR metrics were highly correlated to each other (|r| > 0.7). There was no statistically significant difference in tree species mapping accuracy between the use of leaf-on and leaf-off LiDAR metrics. However, combining leaf-on and leaf-off LiDAR metrics significantly increased the overall accuracy from 58.2% (leaf-on) and 62.0% (leaf-off) to 66.5% as well as the kappa coefficient from 0.47 (leaf-on) and 0.51 (leaf-off) to 0.58. Radiometric features, especially intensity related metrics, provided more consistent and significant contributions than geometric features for tree species discrimination. Specifically, the mean intensity of first-or-single returns as well as the mean value of echo width were identified as the most robust LiDAR metrics for tree species discrimination. These results indicate that metrics derived from airborne LiDAR data, especially radiometric metrics, can aid in discriminating tree species in a mixed temperate forest, and represent candidate metrics for tree species classification and monitoring in Central Europe.

  5. Spaced antenna diversity in temperate latitude meteor burst systems operating near 40 MHz - Variation of signal cross-correlation coefficients with antenna separation

    NASA Astrophysics Data System (ADS)

    Cannon, Paul S.; Shukla, Anil K.; Lester, Mark

    1993-04-01

    We have studied 37-MHz signals received over an 800-km temperate latitude path using 400-W continuous wave transmissions. Signals collected during a 9-day period in February 1990 on two antennas at separations of 5, 10, and 20 lambda were analyzed. Three signal categories were identified (overdense, underdense, and not known (NK)) and cross-correlation coefficients between the signals received by the two antennas were calculated for each signal category. No spatial variation, and in particular no decrease, in average cross-correlation coefficient was observed for underdense or NK signals as the antenna spacing was increased from 5 to 20 lambda. At each antenna separation the cross-correlation coefficients of these two categories were strongly dependent on time. Overdense signals, however, showed no cross-correlation time dependency at 5 and 10 lambda, but there was a strong time dependency at 20 lambda. Recommendations are made in regard to the optimum antenna spacing for a meteor burst communication system using spaced antenna diversity.

  6. Transcultural adaptation and psychometric properties of Spanish version of Pregnancy Physical Activity Questionnaire: the PregnActive project.

    PubMed

    Oviedo-Caro, Miguel Ángel; Bueno-Antequera, Javier; Munguía-Izquierdo, Diego

    2018-03-19

    To transculturally adapt the Spanish version of the Pregnancy Physical Activity Questionnaire (PPAQ) and analyze its psychometric properties. The PPAQ was transculturally adapted into Spanish. Test-retest reliability was evaluated in a subsample of 109 pregnant women. The validity was evaluated in a sample of 208 pregnant women who answered the questionnaire and wore a multi-sensor monitor for 7 valid days. The reliability (intraclass correlation coefficient), concordance (concordance correlation coefficient), correlation (Pearson correlation coefficient), agreement (Bland-Altman plots) and relative activity levels (Jonckheere-Terpstra test) between both administrations and methods were examined. Intraclass correlation coefficients between both administrations were good for all categories except transportation. A low but significant correlation was found for total activity (light and above), whereas no correlation was found for other intensities between both methods. Relative activity level analysis showed a significant linear trend for increased total activity between both methods. The Spanish version of the PPAQ is a brief and easily interpretable questionnaire with good reliability and ability to rank individuals, and poor validity compared with the multi-sensor monitor. The use of the PPAQ provides information on pregnancy-specific activities in order to establish physical activity levels of pregnant women and adapt health promotion interventions. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  7. QSAR Study on the anti-tumor activity of levofloxacin-thiadiazole HDACi conjugates

    NASA Astrophysics Data System (ADS)

    Tang, Ziqiang; Feng, Hui; Chen, Yan; Yue, Wei; Feng, Changjun

    2017-12-01

    A molecular electronegativity distance vector (Mt) based on 13 atomic types is used to describe the structures of 19 conjugates (LHCc) of levofloxacin-thiadiazole HDAC inhibitors (HDACi), and is related to the anti-tumor activity (MF and PC) of the LHCc against MCF-7 and PC-3. Quantitative structure-activity relationship (QSAR) models were established using leaps-and-bounds regression analysis for the anti-tumor activities (MF and PC) of the 19 compounds against MCF-7 and PC-3 using the Mt descriptors. The correlation coefficients (R2) and the leave-one-out (LOO) cross-validation coefficients (Rcv2) for the MF and PC models were 0.792 and 0.679, and 0.773 and 0.565, respectively. The QSAR models show favorable correlation, robustness and prediction capability according to the R2, F, Rcv2, AIC, FIT and VIF tests. The results indicate that the molecular structural units -CHg- (g = 1, 2), -NH2, -NH-, -OH, O=, -O-, -S- and -X are the main factors that directly affect the MF and PC anti-tumor activities of these compounds.
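
    The leave-one-out cross-validated coefficient reported above (Rcv2, often written Q2) can be computed as sketched below; the descriptor matrix and response are synthetic stand-ins, not the LHCc data.

```python
# Leave-one-out cross-validated R^2 (Q^2) for a small descriptor-based regression model.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(19, 3))                 # 19 compounds, 3 hypothetical MEDV descriptors
y = X @ np.array([0.8, -0.5, 0.3]) + rng.normal(scale=0.3, size=19)

model = LinearRegression()
y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())
press = np.sum((y - y_loo) ** 2)             # predictive residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = model.fit(X, y).score(X, y)             # conventional R^2
q2 = 1.0 - press / ss_tot                    # LOO cross-validated R^2
print(f"R2 = {r2:.3f}, Q2 = {q2:.3f}")
```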

  8. Finger vein recognition based on the hyperinformation feature

    NASA Astrophysics Data System (ADS)

    Xi, Xiaoming; Yang, Gongping; Yin, Yilong; Yang, Lu

    2014-01-01

    The finger vein is a promising biometric pattern for personal identification due to its advantages over other existing biometrics. In finger vein recognition, feature extraction is a critical step, and many feature extraction methods have been proposed to extract the gray, texture, or shape of the finger vein. We treat them as low-level features and present a high-level feature extraction framework. Under this framework, base attribute is first defined to represent the characteristics of a certain subcategory of a subject. Then, for an image, the correlation coefficient is used for constructing the high-level feature, which reflects the correlation between this image and all base attributes. Since the high-level feature can reveal characteristics of more subcategories and contain more discriminative information, we call it hyperinformation feature (HIF). Compared with low-level features, which only represent the characteristics of one subcategory, HIF is more powerful and robust. In order to demonstrate the potential of the proposed framework, we provide a case study to extract HIF. We conduct comprehensive experiments to show the generality of the proposed framework and the efficiency of HIF on our databases, respectively. Experimental results show that HIF significantly outperforms the low-level features.

  9. Cerebella segmentation on MR images of pediatric patients with medulloblastoma

    NASA Astrophysics Data System (ADS)

    Shan, Zu Y.; Ji, Qing; Glass, John; Gajjar, Amar; Reddick, Wilburn E.

    2005-04-01

    In this study, an automated method was developed to identify the cerebellum in T1-weighted MR brain images of patients with medulloblastoma. A new objective function, similar to the Gibbs free energy of classical physics, was defined, and brain structure delineation was viewed as a process of minimizing this free energy. We used a rigid-body registration and an active contour (snake) method to minimize the Gibbs free energy in this study. The method was applied to 20 patient data sets to generate cerebellum images and volumetric results. The generated cerebellum images were compared with two manually drawn results. Strong correlations were found between the automatically and manually generated volumetric results; the correlation coefficients with each of the manual results were 0.971 and 0.974, respectively. The average Jaccard similarities with each of the two manual results were 0.89 and 0.88, respectively. The average Kappa indexes with each of the two manual results were 0.94 and 0.93, respectively. These results showed that this method was both robust and accurate for cerebellum segmentation. The method may be applied to various research and clinical investigations in which cerebellum segmentation and quantitative MR measurement of the cerebellum are needed.

  10. Quantifying the impact of Teleconnections on Hydrologic Regimes in Texas

    NASA Astrophysics Data System (ADS)

    Bhatia, N.; Singh, V. P.; Srivastav, R. K.

    2016-12-01

    Climate change has been implicated in the increased frequency of extreme flooding events, and the resulting damages are severe, especially where flood-plain population densities are high. Much research in the field of hydroclimatology is focusing on improving real-time flood forecasting models. Recent studies show that, in the state of Texas, extreme regional floods are actually triggered by abruptly higher precipitation intensities. Such intensities are further driven by sea-surface temperature and pressure anomalies, defined by certain patterns of teleconnections. In this study, climate variability is defined on the basis of five major Atlantic and Pacific Ocean related teleconnections: (i) Atlantic Multidecadal Oscillation (AMO), (ii) North Atlantic Oscillation (NAO), (iii) Pacific Decadal Oscillation (PDO), (iv) Pacific North American Pattern (PNA), and (v) Southern Oscillation Index (SOI). Hydrologic extremes will be modeled using probabilistic distributions. A Leave-One-Out Test (LOOT) will be employed to address outliers in the extremes and to obtain a robust correlation coefficient. The variation in the effect of the most correlated teleconnection with respect to hydrologic attributes will be investigated for the entire state. This study will attempt to identify potential teleconnection inputs for data-driven hydrologic models under varying climatic conditions.

  11. A robust method of computing finite difference coefficients based on Vandermonde matrix

    NASA Astrophysics Data System (ADS)

    Zhang, Yijie; Gao, Jinghuai; Peng, Jigen; Han, Weimin

    2018-05-01

    When the finite difference (FD) method is employed to simulate wave propagation, a high-order FD method is preferred in order to achieve better accuracy. However, if the order of the FD scheme is high enough, the coefficient matrix of the formula for calculating the finite difference coefficients is close to singular. In this case, when the FD coefficients are computed with the matrix inverse operator of MATLAB, inaccuracies can be produced. To overcome this problem, we suggest an algorithm based on the Vandermonde matrix. After a specified mathematical transformation, the coefficient matrix is transformed into a Vandermonde matrix. The FD coefficients of the high-order FD method can then be computed by the Vandermonde algorithm, which avoids inverting a near-singular matrix. Dispersion analysis and numerical results for a homogeneous elastic model and a geophysical model of an oil and gas reservoir demonstrate that the Vandermonde-based algorithm has better accuracy than the matrix inverse operator of MATLAB.
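
    The kind of Vandermonde system the paper exploits can be illustrated with the standard construction of finite difference weights: for stencil offsets x_j, the weights of the m-th derivative satisfy a Vandermonde linear system. The sketch below shows only that textbook construction and is not the authors' MATLAB algorithm.

```python
# Finite-difference weights from a Vandermonde system: solve sum_j w_j * x_j**k = k! * delta(k, m).
import numpy as np
from math import factorial

def fd_weights(offsets, m):
    """Weights w such that f^(m)(0) ~ sum_j w[j] * f(offsets[j] * h) / h**m."""
    x = np.asarray(offsets, dtype=float)
    n = len(x)
    V = np.vander(x, n, increasing=True).T      # V[k, j] = x_j**k
    b = np.zeros(n)
    b[m] = factorial(m)
    return np.linalg.solve(V, b)

print(fd_weights([-1, 0, 1], 2))                # -> [ 1. -2.  1.], the classic 3-point second derivative
print(fd_weights([-2, -1, 0, 1, 2], 1))         # 5-point first-derivative weights
```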

  12. Constraining the surface properties of effective Skyrme interactions

    NASA Astrophysics Data System (ADS)

    Jodon, R.; Bender, M.; Bennaceur, K.; Meyer, J.

    2016-08-01

    Background: Deformation energy surfaces map how the total binding energy of a nuclear system depends on the geometrical properties of intrinsic configurations, thereby providing a powerful tool to interpret nuclear spectroscopy and large-amplitude collective-motion phenomena such as fission. The global behavior of the deformation energy is known to be directly connected to the surface properties of the effective interaction used for its calculation. Purpose: The precise control of surface properties during the parameter adjustment of an effective interaction is key to obtain a reliable and predictive description of nuclear properties. The most relevant indicator is the surface-energy coefficient asurf. There are several possibilities for its definition and estimation, which are not fully equivalent and require a computational effort that can differ by orders of magnitude. The purpose of this study is threefold: first, to identify a scheme for the determination of asurf that offers the best compromise between robustness, precision, and numerical efficiency; second, to analyze the correlation between values for asurf and the characteristic energies of the fission barrier of 240Pu; and third, to lay out an efficient and robust procedure for how the deformation properties of the Skyrme energy density functional (EDF) can be constrained during the parameter fit. Methods: There are several frequently used possibilities to define and calculate the surface energy coefficient asurf of effective interactions built for the purpose of self-consistent mean-field calculations. The most direct access is provided by the model system of semi-infinite nuclear matter, but asurf can also be extracted from the systematics of binding energies of finite nuclei. Calculations can be carried out either self-consistently [Hartree-Fock (HF)], which incorporates quantal shell effects, or in one of the semiclassical extended Thomas-Fermi (ETF) or modified Thomas-Fermi (MTF) approximations. The latter is of particular interest because it provides asurf as a numerical integral without the need to solve self-consistent equations. Results for semi-infinite nuclear matter obtained with the HF, ETF, and MTF methods will be compared with one another and with asurf, as deduced from ETF calculations of very heavy fictitious nuclei. Results: The surface energy coefficient of 76 parametrizations of the Skyrme EDF have been calculated. Values obtained with the HF, ETF, and MTF methods are not identical, but differ by fairly constant systematic offsets. By contrast, extracting asurf from the binding energy of semi-infinite matter or of very large nuclei within the same method gives the same result within the numerical uncertainties. Conclusions: Despite having some drawbacks compared to the other methods studied here, the MTF approach provides sufficiently precise values for asurf such that it can be used as a very robust constraint on surface properties during a parameter fit at negligible additional cost. While the excitation energy of superdeformed states and the height of fission barriers is obviously strongly correlated to asurf, the presence of shell effects prevents a one-to-one correspondence between them. As in addition the value of asurf providing realistic fission barriers depends on the choices made for corrections for spurious motion, its "best value" (within a given scheme to calculate it) depends on the fit protocol. 
Through the construction of a series of eight parametrizations SLy5s1-SLy5s8 of the standard Skyrme EDF with systematically varied asurf value, it is shown how to arrive at a fit with realistic deformation properties.

  13. Phononic glass: a robust acoustic-absorption material.

    PubMed

    Jiang, Heng; Wang, Yuren

    2012-08-01

    In order to achieve strong wideband acoustic absorption under high hydrostatic pressure, an interpenetrating network structure is introduced into a locally resonant phononic crystal to fabricate a type of phononic composite material called "phononic glass." Underwater acoustic absorption measurements show that the material exhibits sound absorption coefficients above 0.9 over 12-30 kHz. Moreover, its quasi-static compressive behavior shows that the phononic glass has a compressive strength of over 5 MPa, which is crucial for underwater applications.

  14. Cross-Correlations and Structures of Aero-Engine Gas Path System Based on DCCA Coefficient and Rooted Tree

    NASA Astrophysics Data System (ADS)

    Dong, Keqiang; Fan, Jie; Gao, You

    2015-12-01

    Identifying mutual interactions is a crucial problem in understanding the structures that emerge in complex systems. Here we focus on aero-engine dynamics as an example of a complex system. By applying the detrended cross-correlation analysis (DCCA) coefficient method to the aero-engine gas path system, we find that the low-spool rotor speed (N1) and high-spool rotor speed (N2) fluctuation series exhibit cross-correlation. Further, we employ the detrended cross-correlation coefficient matrix and a rooted tree to investigate the mutual interactions of the other gas path variables. The results indicate that the exhaust gas temperature (EGT), N1, N2, fuel flow (WF) and engine pressure ratio (EPR) are the main gas path parameters.
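
    The DCCA cross-correlation coefficient used above can be sketched as follows (non-overlapping windows, linear detrending); the synthetic series stand in for the N1 and N2 records, which are not available here.

```python
# A compact sketch of the DCCA cross-correlation coefficient rho_DCCA(n).
import numpy as np

def dcca_coefficient(x, y, n):
    """rho_DCCA for window size n: detrended covariance over the two detrended variances."""
    X = np.cumsum(x - np.mean(x))
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(n)
    f_xy = f_xx = f_yy = 0.0
    n_win = len(X) // n
    for w in range(n_win):
        sl = slice(w * n, (w + 1) * n)
        rx = X[sl] - np.polyval(np.polyfit(t, X[sl], 1), t)   # residuals of the linear trend
        ry = Y[sl] - np.polyval(np.polyfit(t, Y[sl], 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(2)
common = np.cumsum(rng.normal(size=5000))                     # shared long-range component
n1 = np.diff(common, prepend=0) + rng.normal(size=5000)       # e.g. a low-spool speed proxy
n2 = np.diff(common, prepend=0) + rng.normal(size=5000)       # e.g. a high-spool speed proxy
print(dcca_coefficient(n1, n2, n=50))                         # positive cross-correlation
```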

  15. Colocalization analysis in fluorescence micrographs: verification of a more accurate calculation of Pearson's correlation coefficient.

    PubMed

    Barlow, Andrew L; Macleod, Alasdair; Noppen, Samuel; Sanderson, Jeremy; Guérin, Christopher J

    2010-12-01

    One of the most routine uses of fluorescence microscopy is colocalization, i.e., the demonstration of a relationship between pairs of biological molecules. Frequently this is presented simplistically by the use of overlays of red and green images, with areas of yellow indicating colocalization of the molecules. Colocalization data are rarely quantified and can be misleading. Our results from both synthetic and biological datasets demonstrate that the generation of Pearson's correlation coefficient between pairs of images can overestimate positive correlation and fail to demonstrate negative correlation. We have demonstrated that the calculation of a thresholded Pearson's correlation coefficient using only intensity values over a determined threshold in both channels produces numerical values that more accurately describe both synthetic datasets and biological examples. Its use will bring clarity and accuracy to colocalization studies using fluorescent microscopy.
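
    The thresholded Pearson's coefficient described above restricts the calculation to pixels that exceed a threshold in both channels. A minimal sketch with synthetic two-channel images is shown below; the paper determines the thresholds from the data, whereas here they are fixed by assumption.

```python
# Thresholded Pearson's correlation for two-channel colocalization: only pixels above a
# threshold in BOTH channels enter the calculation (sketch with synthetic images).
import numpy as np

def thresholded_pearson(ch1, ch2, t1, t2):
    mask = (ch1 > t1) & (ch2 > t2)
    if mask.sum() < 2:
        return np.nan
    return np.corrcoef(ch1[mask], ch2[mask])[0, 1]

rng = np.random.default_rng(0)
red = rng.poisson(5, size=(256, 256)).astype(float)          # background in channel 1
green = rng.poisson(5, size=(256, 256)).astype(float)        # background in channel 2
structure = rng.gamma(shape=2.0, scale=15.0, size=(50, 50))
red[100:150, 100:150] += structure                           # a genuinely colocalized structure
green[100:150, 100:150] += structure

print(np.corrcoef(red.ravel(), green.ravel())[0, 1])         # ordinary Pearson over all pixels
print(thresholded_pearson(red, green, t1=20.0, t2=20.0))     # restricted to bright pixels
```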

  16. Estimation of soil-soil solution distribution coefficient of radiostrontium using soil properties.

    PubMed

    Ishikawa, Nao K; Uchida, Shigeo; Tagami, Keiko

    2009-02-01

    We propose a new approach for estimation of soil-soil solution distribution coefficient (K(d)) of radiostrontium using some selected soil properties. We used 142 Japanese agricultural soil samples (35 Andosol, 25 Cambisol, 77 Fluvisol, and 5 others) for which Sr-K(d) values had been determined by a batch sorption test and listed in our database. Spearman's rank correlation test was carried out to investigate correlations between Sr-K(d) values and soil properties. Electrical conductivity and water soluble Ca had good correlations with Sr-K(d) values for all soil groups. Then, we found a high correlation between the ratio of exchangeable Ca to Ca concentration in water soluble fraction and Sr-K(d) values with correlation coefficient R=0.72. This pointed us toward a relatively easy way to estimate Sr-K(d) values.

  17. Reducing Bias and Error in the Correlation Coefficient Due to Nonnormality.

    PubMed

    Bishara, Anthony J; Hittner, James B

    2015-10-01

    It is more common for educational and psychological data to be nonnormal than to be approximately normal. This tendency may lead to bias and error in point estimates of the Pearson correlation coefficient. In a series of Monte Carlo simulations, the Pearson correlation was examined under conditions of normal and nonnormal data, and it was compared with its major alternatives, including the Spearman rank-order correlation, the bootstrap estimate, the Box-Cox transformation family, and a general normalizing transformation (i.e., rankit), as well as to various bias adjustments. Nonnormality caused the correlation coefficient to be inflated by up to +.14, particularly when the nonnormality involved heavy-tailed distributions. Traditional bias adjustments worsened this problem, further inflating the estimate. The Spearman and rankit correlations eliminated this inflation and provided conservative estimates. Rankit also minimized random error for most sample sizes, except for the smallest samples ( n = 10), where bootstrapping was more effective. Overall, results justify the use of carefully chosen alternatives to the Pearson correlation when normality is violated.

  18. Reducing Bias and Error in the Correlation Coefficient Due to Nonnormality

    PubMed Central

    Hittner, James B.

    2014-01-01

    It is more common for educational and psychological data to be nonnormal than to be approximately normal. This tendency may lead to bias and error in point estimates of the Pearson correlation coefficient. In a series of Monte Carlo simulations, the Pearson correlation was examined under conditions of normal and nonnormal data, and it was compared with its major alternatives, including the Spearman rank-order correlation, the bootstrap estimate, the Box–Cox transformation family, and a general normalizing transformation (i.e., rankit), as well as to various bias adjustments. Nonnormality caused the correlation coefficient to be inflated by up to +.14, particularly when the nonnormality involved heavy-tailed distributions. Traditional bias adjustments worsened this problem, further inflating the estimate. The Spearman and rankit correlations eliminated this inflation and provided conservative estimates. Rankit also minimized random error for most sample sizes, except for the smallest samples (n = 10), where bootstrapping was more effective. Overall, results justify the use of carefully chosen alternatives to the Pearson correlation when normality is violated. PMID:29795841
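
    One of the alternatives examined above, the rankit (normal-scores) transformation followed by an ordinary Pearson correlation, can be sketched as follows; the heavy-tailed example data are illustrative only.

```python
# Rankit transform: map ranks to normal quantiles, then correlate (sketch on synthetic data).
import numpy as np
from scipy import stats

def rankit(x):
    """Map values to normal quantiles of (rank - 0.5) / n."""
    ranks = stats.rankdata(x)
    return stats.norm.ppf((ranks - 0.5) / len(x))

rng = np.random.default_rng(0)
z = rng.standard_t(df=2, size=(200, 2))            # heavy-tailed, weakly dependent data
x = z[:, 0]
y = 0.3 * z[:, 0] + z[:, 1]

print("Pearson :", stats.pearsonr(x, y)[0])
print("Spearman:", stats.spearmanr(x, y)[0])
print("Rankit  :", stats.pearsonr(rankit(x), rankit(y))[0])
```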

  19. Correlation and prediction of gaseous diffusion coefficients.

    NASA Technical Reports Server (NTRS)

    Marrero, T. R.; Mason, E. A.

    1973-01-01

    A new correlation method for binary gaseous diffusion coefficients from very low temperatures to 10,000 K is proposed, based on an extended principle of corresponding states and having greater range and accuracy than previous correlations. There are two correlation parameters that are related to other physical quantities and that are predictable in the absence of diffusion measurements. Quantum effects and composition dependence are included, but high-pressure effects are not. The results are directly applicable to multicomponent mixtures.

  20. Robust and sparse correlation matrix estimation for the analysis of high-dimensional genomics data.

    PubMed

    Serra, Angela; Coretto, Pietro; Fratello, Michele; Tagliaferri, Roberto; Stegle, Oliver

    2018-02-15

    Microarray technology can be used to study the expression of thousands of genes across a number of different experimental conditions, usually hundreds. The underlying principle is that genes sharing similar expression patterns, across different samples, can be part of the same co-expression system, or they may share the same biological functions. Groups of genes are usually identified based on cluster analysis. Clustering methods rely on the similarity matrix between genes. A common choice to measure similarity is to compute the sample correlation matrix. Dimensionality reduction is another popular data analysis task which is also based on covariance/correlation matrix estimates. Unfortunately, covariance/correlation matrix estimation suffers from the intrinsic noise present in high-dimensional data. Sources of noise are sampling variation, the presence of outlying sample units, and the fact that in most cases the number of genes is much larger than the number of sample units. In this paper, we propose a robust correlation matrix estimator that is regularized based on adaptive thresholding. The resulting method jointly tames the effects of high dimensionality and data contamination. Computations are easy to implement and do not require hand tuning. Both simulated and real data are analyzed. A Monte Carlo experiment shows that the proposed method is capable of remarkable performance. Our correlation metric is more robust to outliers compared with the existing alternatives in two gene expression datasets. It is also shown how the regularization allows spurious correlations to be automatically detected and filtered. The same regularization is also extended to other less robust correlation measures. Finally, we apply the ARACNE algorithm to the SyNTreN gene expression data. Sensitivity and specificity of the reconstructed network are compared with the gold standard. We show that ARACNE performs better when it takes the proposed correlation matrix estimator as input. The R software is available at https://github.com/angy89/RobustSparseCorrelation. aserra@unisa.it or robtag@unisa.it. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
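
    A highly simplified sketch of a robust, sparsified correlation matrix is shown below: a rank-based (Spearman) estimate followed by thresholding of weak entries. The published estimator uses adaptive thresholding and is more involved; the fixed threshold and synthetic data here are assumptions.

```python
# Robust (rank-based) correlation matrix with hard thresholding of weak entries (sketch only).
import numpy as np
from scipy.stats import spearmanr

def robust_sparse_corr(X, threshold=0.3):
    """X: samples x genes. Returns a thresholded Spearman correlation matrix."""
    R, _ = spearmanr(X)                          # rank-based, resistant to outlying units
    np.fill_diagonal(R, 1.0)
    R[np.abs(R) < threshold] = 0.0               # drop weak (possibly spurious) correlations
    return R

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))                   # 60 samples, 200 hypothetical genes
X[:, 1] = X[:, 0] + 0.2 * rng.normal(size=60)    # one truly co-expressed pair
X[0, :] += 20                                    # an outlying sample unit
R = robust_sparse_corr(X)
print(R[0, 1], np.count_nonzero(R) - 200)        # retained pair, off-diagonal entries kept
```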

  1. iPhone Sensors in Tracking Outcome Variables of the 30-Second Chair Stand Test and Stair Climb Test to Evaluate Disability: Cross-Sectional Pilot Study

    PubMed Central

    Samaan, Michael A; Schultz, Brooke; Popovic, Tijana; Souza, Richard B; Majumdar, Sharmila

    2017-01-01

    Background Performance tests are important to characterize patient disabilities and functional changes. The Osteoarthritis Research Society International and others recommend the 30-second Chair Stand Test and Stair Climb Test, among others, as core tests that capture two distinct types of disability during activities of daily living. However, these two tests are limited by current protocols of testing in clinics. There is a need for an alternative that allows remote testing of functional capabilities during these tests in the osteoarthritis patient population. Objective Objectives are to (1) develop an app for testing the functionality of an iPhone’s accelerometer and gravity sensor and (2) conduct a pilot study objectively evaluating the criterion validity and test-retest reliability of outcome variables obtained from these sensors during the 30-second Chair Stand Test and Stair Climb Test. Methods An iOS app was developed with data collection capabilities from the built-in iPhone accelerometer and gravity sensor tools and linked to Google Firebase. A total of 24 subjects performed the 30-second Chair Stand Test with an iPhone accelerometer collecting data and an external rater manually counting sit-to-stand repetitions. A total of 21 subjects performed the Stair Climb Test with an iPhone gravity sensor turned on and an external rater timing the duration of the test on a stopwatch. App data from Firebase were converted into graphical data and exported into MATLAB for data filtering. Multiple iterations of a data processing algorithm were used to increase robustness and accuracy. MATLAB-generated outcome variables were compared to the manually determined outcome variables of each test. Pearson’s correlation coefficients (PCCs), Bland-Altman plots, intraclass correlation coefficients (ICCs), standard errors of measurement, and repeatability coefficients were generated to evaluate criterion validity, agreement, and test-retest reliability of iPhone sensor data against gold-standard manual measurements. Results App accelerometer data during the 30-second Chair Stand Test (PCC=.890) and gravity sensor data during the Stair Climb Test (PCC=.865) were highly correlated to gold-standard manual measurements. Greater than 95% of values on Bland-Altman plots comparing the manual data to the app data fell within the 95% limits of agreement. Strong intraclass correlation was found for trials of the 30-second Chair Stand Test (ICC=.968) and Stair Climb Test (ICC=.902). Standard errors of measurement for both tests were found to be within acceptable thresholds for MATLAB. Repeatability coefficients for the 30-second Chair Stand Test and Stair Climb Test were 0.629 and 1.20, respectively. Conclusions App-based performance testing of the 30-second Chair Stand Test and Stair Climb Test is valid and reliable, suggesting its applicability to future, larger-scale studies in the osteoarthritis patient population. PMID:29079549
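
    The agreement statistics used in the study (Pearson's correlation and Bland-Altman limits of agreement) can be reproduced on synthetic counts as sketched below; the numbers are invented and the intraclass and repeatability calculations are omitted.

```python
# Pearson's r and Bland-Altman limits of agreement between two measurement methods (sketch).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
manual = rng.integers(8, 20, size=24).astype(float)          # rater's sit-to-stand counts
app = manual + rng.normal(0, 0.8, size=24)                   # sensor-derived counts

r, _ = pearsonr(manual, app)
diff = app - manual
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                                # 95% limits of agreement
within = np.mean(np.abs(diff - bias) <= loa)
print(f"PCC = {r:.3f}, bias = {bias:.2f}, limits = [{bias - loa:.2f}, {bias + loa:.2f}], "
      f"{within:.0%} of points inside")
```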

  2. Robust adaptive tracking control for nonholonomic mobile manipulator with uncertainties.

    PubMed

    Peng, Jinzhu; Yu, Jie; Wang, Jie

    2014-07-01

    In this paper, mobile manipulator is divided into two subsystems, that is, nonholonomic mobile platform subsystem and holonomic manipulator subsystem. First, the kinematic controller of the mobile platform is derived to obtain a desired velocity. Second, regarding the coupling between the two subsystems as disturbances, Lyapunov functions of the two subsystems are designed respectively. Third, a robust adaptive tracking controller is proposed to deal with the unknown upper bounds of parameter uncertainties and disturbances. According to the Lyapunov stability theory, the derived robust adaptive controller guarantees global stability of the closed-loop system, and the tracking errors and adaptive coefficient errors are all bounded. Finally, simulation results show that the proposed robust adaptive tracking controller for nonholonomic mobile manipulator is effective and has good tracking capacity. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  3. Equations of prediction for abdominal fat in brown egg-laying hens fed different diets.

    PubMed

    Souza, C; Jaimes, J J B; Gewehr, C E

    2017-06-01

    The objective was to use noninvasive external measurements to formulate equations for predicting the abdominal fat weight of laying hens. Hens were fed different diets, and the external body measurements of the birds were used as regressors. We used 288 Hy-Line Brown laying hens, distributed in a completely randomized design in a factorial arrangement, submitted for 16 wk to 2 metabolizable energy levels (2,550 and 2,800 kcal/kg) and 3 levels of crude protein in the diet (150, 160, and 170 g/kg), totaling 6 treatments, with 48 hens each. Sixteen hens per treatment, 92 wk of age, were used to evaluate body weight, bird length, tarsus and sternum, greater and lesser diameter of the tarsus, and abdominal fat weight after slaughter. The equations were obtained by using the evaluated measures as regressors in simple and multiple linear regression with the stepwise method of backward elimination, with P < 0.10 for all variables remaining in the model. The abdominal fat weight predicted by the equations and the observed values for each bird were subjected to Pearson's correlation analysis. The equations generated by energy level showed coefficients of determination of 0.50 and 0.74 for 2,800 and 2,550 kcal/kg of metabolizable energy, respectively, with correlation coefficients of 0.71 and 0.84, and a highly significant correlation between the calculated and observed values of abdominal fat. For protein levels of 150, 160, and 170 g/kg in the diet, it was possible to obtain coefficients of determination of 0.75, 0.57, and 0.61, with correlation coefficients of 0.86, 0.75, and 0.78, respectively. For the general equation for predicting abdominal fat weight, the coefficient of determination was 0.62 and the correlation coefficient was 0.79. The equations for predicting abdominal fat weight in laying hens, based on external measurements of the birds, showed positive coefficients of determination and correlation coefficients, thus allowing researchers to determine abdominal fat weight in vivo. © 2016 Poultry Science Association Inc.

  4. Virial Coefficients for the Liquid Argon

    NASA Astrophysics Data System (ADS)

    Korth, Micheal; Kim, Saesun

    2014-03-01

    We begin with a geometric model of hard colliding spheres and calculate probability densities in an iterative sequence of calculations that lead to the pair correlation function. The model is based on a kinetic theory approach developed by Shinomoto, to which we added an interatomic potential for argon based on the model of Aziz. From values of the pair correlation function at various densities, we were able to find virial coefficients of liquid argon. The low-order coefficients are in good agreement with theoretical hard-sphere coefficients, but appropriate data for argon to which these results might be compared are difficult to find.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, A.B.; Clothiaux, E.

    Because of Earth's gravitational field, its atmosphere is strongly anisotropic with respect to the vertical; the effect of the Earth's rotation on synoptic wind patterns also causes a more subtle form of anisotropy in the horizontal plane. The authors survey various approaches to statistically robust anisotropy from a wavelet perspective and present a new one adapted to strongly non-isotropic fields that are sampled on a rectangular grid with a large aspect ratio. This novel technique uses an anisotropic version of Multi-Resolution Analysis (MRA) in image analysis; the authors form a tensor product of the standard dyadic Haar basis, where the dividing ratio is lambda_z = 2, and a nonstandard triadic counterpart, where the dividing ratio is lambda_x = 3. The natural support of the field is therefore 2^n pixels (vertically) by 3^n pixels (horizontally), where n is the number of levels in the MRA. The natural triadic basis includes the French top-hat wavelet, which resonates with bumps in the field, whereas the Haar wavelet responds to ramps or steps. The complete 2D basis has one scaling function and five wavelets. The resulting anisotropic MRA is designed for application to the liquid water content (LWC) field in boundary-layer clouds, as the prevailing wind advects them by a vertically pointing mm-radar system. Spatial correlations are notoriously long-range in cloud structure, and the authors use the wavelet coefficients from the new MRA to characterize these correlations in a multifractal analysis scheme. In the present study, the MRA is used (in synthesis mode) to generate fields that mimic cloud structure quite realistically, although only a few parameters are used to control the randomness of the LWC's wavelet coefficients.

  6. 2D-QSAR study of fullerene nanostructure derivatives as potent HIV-1 protease inhibitors

    NASA Astrophysics Data System (ADS)

    Barzegar, Abolfazl; Jafari Mousavi, Somaye; Hamidi, Hossein; Sadeghi, Mehdi

    2017-09-01

    The protease of human immunodeficiency virus 1 (HIV-PR) is an essential enzyme for antiviral treatments. Carbon nanostructures of fullerene derivatives have nanoscale dimensions, with a diameter comparable to that of the HIV-PR active site, which allows them to inhibit the enzyme. In this research, two-dimensional quantitative structure-activity relationships (2D-QSAR) of fullerene derivatives against HIV-PR activity were employed as a powerful tool for elucidating the relationships between structure and experimental observations. A QSAR study of 49 fullerene derivatives was performed by employing stepwise-MLR, GAPLS-MLR, and PCA-MLR models for variable (descriptor) selection and model construction. The QSAR models predicted the activity of the fullerene derivatives against HIV-PR with training correlation coefficients (R2training) of 0.942, 0.89, and 0.87 and R2test values of 0.791, 0.67, and 0.674 for the stepwise-MLR, GAPLS-MLR, and PCA-MLR models, respectively. The leave-one-out cross-validated correlation coefficient (R2CV) and Y-randomization methods confirmed the models' robustness. The descriptors indicated that HIV-PR inhibition depends on the van der Waals volumes, polarizability, bond orders between atoms and electronegativities of the fullerene derivatives. The 2D-QSAR simulation, which does not require the geometry of the receptor's active site, resulted in useful descriptors mainly denoting "C60 backbone-functional group" and "C60 functional group" properties. Both properties relate to ligand fit and improved van der Waals interactions with the HIV-PR active site. Therefore, the QSAR models can be used in the search for novel HIV-PR inhibitors based on fullerene derivatives.

  7. Tracking variable sedimentation rates in orbitally forced paleoclimate proxy series

    NASA Astrophysics Data System (ADS)

    Li, M.; Kump, L. R.; Hinnov, L.

    2017-12-01

    This study addresses two fundamental issues in cyclostratigraphy: quantitative testing of orbital forcing in cyclic sedimentary sequences and tracking variable sedimentation rates. The methodology proposed here addresses these issues as an inverse problem and estimates the product-moment correlation coefficient between the frequency spectra of orbital solutions and paleoclimate proxy series over a range of "test" sedimentation rates. It is inspired by the ASM method (1). The number of orbital parameters involved in the estimation is also considered. The method relies on the hypothesis that orbital forcing had a significant impact on the paleoclimate proxy variations, and this hypothesis is itself tested. The null hypothesis of no astronomical forcing is evaluated using the Beta distribution, for which the shape parameters are estimated using a Monte Carlo simulation approach. We introduce a metric to estimate the most likely sedimentation rate using the product-moment correlation coefficient, H0 significance level, and the number of contributing orbital parameters, i.e., the CHO value. The CHO metric is applied with a sliding window to track variable sedimentation rates along the paleoclimate proxy series. Two forward models with uniform and variable sedimentation rates are evaluated to demonstrate the robustness of the method. The CHO method is applied to the classical Late Triassic Newark depth rank series; the estimated sedimentation rates match closely with previously published sedimentation rates and provide a more highly time-resolved estimate (2,3). References: (1) Meyers, S.R., Sageman, B.B., Amer. J. Sci., 307, 773-792, 2007; (2) Kent, D.V., Olsen, P.E., Muttoni, G., Earth-Sci. Rev., 166, 153-180, 2017; (3) Li, M., Zhang, Y., Huang, C., Ogg, J., Hinnov, L., Wang, Y., Zou, Z., Li, L., 2017, Earth Planet. Sci. Lett., doi:10.1016/j.epsl.2017.07.015.

  8. Accuracy of data buoys for measurement of cyanobacteria, chlorophyll, and turbidity in a large lake (Lake Erie, North America): implications for estimation of cyanobacterial bloom parameters from water quality sonde measurements.

    PubMed

    Chaffin, Justin D; Kane, Douglas D; Stanislawczyk, Keara; Parker, Eric M

    2018-06-25

    Microcystin (MCY)-producing harmful cyanobacterial blooms (cHABs) are an annual occurrence in Lake Erie, and buoys equipped with water quality sondes have been deployed to help researchers and resource managers track cHABs. The objective of this study was to determine how well water quality sondes attached to buoys measure total algae and cyanobacterial biomass and water turbidity. Water samples were collected next to two data buoys in western Lake Erie (near Gibraltar Island and in the Sandusky subbasin) throughout summers 2015, 2016, and 2017 to determine correlations between buoy sonde data and water sample data. MCY and nutrient concentrations were also measured. Significant (P < 0.001) linear relationships (R2 > 0.75) occurred between cyanobacteria buoy and water sample data at the Gibraltar buoy, but not at the Sandusky buoy; however, the coefficients at the Gibraltar buoy differed significantly across years. There was a significant correlation between buoy and water sample total chlorophyll data at both buoys, but the coefficient varied considerably between buoys and among years. Total MCY concentrations at the Gibraltar buoy followed similar temporal patterns as buoy and water sample cyanobacterial biomass data, and the ratio of MCY to cyanobacteria-chlorophyll decreased with decreased ambient nitrate concentrations. These results suggest that buoy data are difficult to compare across time and space. Additionally, the inclusion of nitrate concentration data can lead to more robust predictions on the relative toxicity of blooms. Overall, deployed buoys with sondes that are routinely cleaned and calibrated can track relative cyanobacteria abundance and be used as an early warning system for potentially toxic blooms.

  9. Reading Ability as an Estimator of Premorbid Intelligence: Does It Remain Stable Among Ethnically Diverse HIV+ Adults?

    PubMed Central

    Olsen, J. Pat; Fellows, Robert P.; Rivera-Mindt, Monica; Morgello, Susan; Byrd, Desiree A.

    2015-01-01

    The Wide Range Achievement Test, 3rd edition, Reading-Recognition subtest (WRAT-3 RR) is an established measure of premorbid ability. However, its long-term reliability is not well documented, particularly in diverse populations with CNS-relevant disease. Objective: We examined test-retest reliability of the WRAT-3 RR over time in an HIV+ sample of predominantly racial/ethnic minority adults. Method: Participants (N = 88) completed a comprehensive neuropsychological battery, including the WRAT-3 RR, on at least two separate study visits. Intraclass correlation coefficients (ICCs) were computed using scores from baseline and follow-up assessments to determine the test-retest reliability of the WRAT-3 RR across racial/ethnic groups and changes in medical (immunological) and clinical (neurocognitive) factors. Additionally, Fisher’s Z tests were used to determine the significance of the differences between ICCs. Results: The average test-retest interval was 58.7 months (SD=36.4). The overall WRAT-3 RR test-retest reliability was high (r = .97, p < .001), and remained robust across all demographic, medical, and clinical variables (all r’s > .92). Intraclass correlation coefficients did not differ significantly between the subgroups tested (all Fisher’s Z p’s > .05). Conclusions: Overall, this study supports the appropriateness of word-reading tests, such as the WRAT-3 RR, for use as stable premorbid IQ estimates among ethnically diverse groups. Moreover, this study supports the reliability of this measure in the context of change in health and neurocognitive status, and in lengthy inter-test intervals. These findings offer strong rationale for reading as a “hold” test, even in the presence of a chronic, variable disease such as HIV. PMID:26689235
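
    The Fisher's Z comparison used here to test whether two correlation (or intraclass correlation) coefficients differ follows the standard independent-samples procedure. The sketch below is my illustration, not the study's code, and the sample sizes shown are hypothetical.

```python
import numpy as np
from scipy import stats

def fisher_z_compare(r1, n1, r2, n2):
    """Two-sided test of H0: rho1 == rho2 for two independent samples,
    using Fisher's r-to-z transformation."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    return z, 2.0 * stats.norm.sf(abs(z))

# e.g., comparing test-retest coefficients of two hypothetical subgroups
print(fisher_z_compare(0.97, 45, 0.93, 43))
```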

  10. Reproducibility of food consumption frequencies derived from the Children's Eating Habits Questionnaire used in the IDEFICS study.

    PubMed

    Lanfer, A; Hebestreit, A; Ahrens, W; Krogh, V; Sieri, S; Lissner, L; Eiben, G; Siani, A; Huybrechts, I; Loit, H-M; Papoutsou, S; Kovács, E; Pala, V

    2011-04-01

    To investigate the reproducibility of food consumption frequencies derived from the food frequency section of the Children's Eating Habits Questionnaire (CEHQ-FFQ) that was developed and used in the IDEFICS (Identification and prevention of dietary- and lifestyle-induced health effects in children and infants) project to assess food habits in 2- to 9-year-old European children. From a subsample of 258 children who participated in the IDEFICS baseline examination, parental questionnaires of the CEHQ were collected twice to assess reproducibility of questionnaire results from 0 to 354 days after the first examination. Weighted Cohen's kappa coefficients (κ) and Spearman's correlation coefficients (r) were calculated to assess agreement between the first and second questionnaires for each food item of the CEHQ-FFQ. Stratification was performed for sex, age group, geographical region and length of period between the first and second administrations. Fisher's Z transformation was applied to test correlation coefficients for significant differences between strata. For all food items analysed, weighted Cohen's kappa coefficients (κ) and Spearman's correlation coefficients (r) were significant and positive (P<0.001). Reproducibility was lowest for diet soft drinks (κ=0.23, r=0.32) and highest for sweetened milk (κ=0.68, r=0.76). Correlation coefficients were comparable to those of previous studies on FFQ reproducibility in children and adults. Stratification did not reveal systematic differences in reproducibility by sex and age group. Spearman's correlation coefficients differed significantly between northern and southern European countries for 10 food items. In nine of them, the lower respective coefficient was still high enough to conclude acceptable reproducibility. As expected, longer time (>128 days) between the first and second administrations resulted in a generally lower, yet still acceptable, reproducibility. Results indicate that the CEHQ-FFQ gives reproducible estimates of the consumption frequency of 43 food items from 14 food groups in European children.
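
    The agreement statistics reported above (weighted Cohen's kappa and Spearman's correlation between the first and second administrations) are straightforward to compute for a single food item. The sketch below is purely illustrative: the consumption-frequency categories are made up, and whether linear or quadratic kappa weights were used is an assumption.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Hypothetical ordinal consumption-frequency categories (0-7) reported at the
# first and second administration of one food item for ten children.
first = np.array([0, 2, 3, 5, 1, 4, 7, 2, 3, 6])
second = np.array([0, 3, 3, 4, 1, 4, 6, 2, 2, 6])

kappa_w = cohen_kappa_score(first, second, weights="linear")  # weighted kappa
rho, p = spearmanr(first, second)                             # Spearman's r
print(f"weighted kappa = {kappa_w:.2f}, Spearman r = {rho:.2f} (p = {p:.3g})")
```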

  11. [Cross-cultural adaptation and validation of the Health and Taste Attitude Scale (HTAS) in Portuguese].

    PubMed

    Koritar, Priscila; Philippi, Sonia Tucunduva; Alvarenga, Marle dos Santos; Santos, Bernardo dos

    2014-08-01

    The scope of this study was to present the cross-cultural adaptation and validation of the Health and Taste Attitude Scale in Portuguese. The methodology included translation of the scale; evaluation of conceptual, operational and item-based equivalence by 14 experts and 51 female undergraduates; assessment of semantic equivalence and measurement by 12 bilingual women using the paired t-test, the Pearson correlation coefficient and the intraclass correlation coefficient; internal consistency and test-retest reliability by Cronbach's alpha and the intraclass correlation coefficient, respectively, after application to 216 female undergraduates; and assessment of discriminant and concurrent validity via the t-test and Spearman's correlation coefficient, respectively, in addition to Confirmatory Factor and Exploratory Factor Analysis. The scale was considered adequate and easily understood by the experts and university students and presented good internal consistency and reliability (α = 0.86, ICC = 0.84). The results show that the scale is valid and can be used in studies with women to better understand attitudes related to taste.

  12. Negative Correlation between the Diffusion Coefficient and Transcriptional Activity of the Glucocorticoid Receptor.

    PubMed

    Mikuni, Shintaro; Yamamoto, Johtaro; Horio, Takashi; Kinjo, Masataka

    2017-08-25

    The glucocorticoid receptor (GR) is a transcription factor, which interacts with DNA and other cofactors to regulate gene transcription. Binding to other partners in the cell nucleus alters the diffusion properties of GR. Raster image correlation spectroscopy (RICS) was applied to quantitatively characterize the diffusion properties of EGFP-labeled human GR (EGFP-hGR) and its mutants in the cell nucleus. RICS is an image correlation technique that evaluates the spatial distribution of the diffusion coefficient as a diffusion map. Interestingly, we observed that the averaged diffusion coefficient of EGFP-hGR correlated strongly and negatively with transcriptional activity when wild-type EGFP-hGR and mutants with various transcriptional activities were compared. This result suggests that the decrease in the diffusion coefficient of hGR reflects high-affinity binding to DNA. Moreover, hyper-phosphorylation of hGR can enhance transcriptional activity by reducing the interaction between hGR and nuclear corepressors.

  13. Comprehensive analysis of correlation coefficients estimated from pooling heterogeneous microarray data

    PubMed Central

    2013-01-01

    Background The synthesis of information across microarray studies has been performed by combining statistical results of individual studies (as in a mosaic), or by combining data from multiple studies into a large pool to be analyzed as a single data set (as in a melting pot of data). Specific issues relating to data heterogeneity across microarray studies, such as differences within and between labs or differences among experimental conditions, could lead to equivocal results in a melting pot approach. Results We applied statistical theory to determine the specific effect of different means and heteroskedasticity across 19 groups of microarray data on the sign and magnitude of gene-to-gene Pearson correlation coefficients obtained from the pool of 19 groups. We quantified the biases of the pooled coefficients and compared them to the biases of correlations estimated by an effect-size model. Mean differences across the 19 groups were the main factor determining the magnitude and sign of the pooled coefficients, which showed largest values of bias as they approached ±1. Only heteroskedasticity across the pool of 19 groups resulted in less efficient estimations of correlations than did a classical meta-analysis approach of combining correlation coefficients. These results were corroborated by simulation studies involving either mean differences or heteroskedasticity across a pool of N > 2 groups. Conclusions The combination of statistical results is best suited for synthesizing the correlation between expression profiles of a gene pair across several microarray studies. PMID:23822712
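
    The central point of this record, that group-wise mean differences alone can push a pooled Pearson correlation toward ±1 even when the variables are uncorrelated within every group, is easy to reproduce in simulation. The sketch below is illustrative only; the number of groups and the size of the mean shifts are arbitrary choices, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_groups, n_per_group = 19, 30

pooled_x, pooled_y, z_per_group = [], [], []
for _ in range(n_groups):
    shift = rng.normal(scale=3.0)                # group-specific mean difference
    x = rng.normal(loc=shift, size=n_per_group)
    y = rng.normal(loc=shift, size=n_per_group)  # independent of x within the group
    pooled_x.append(x)
    pooled_y.append(y)
    z_per_group.append(np.arctanh(np.corrcoef(x, y)[0, 1]))

r_pooled = np.corrcoef(np.concatenate(pooled_x), np.concatenate(pooled_y))[0, 1]
r_meta = np.tanh(np.mean(z_per_group))           # effect-size style combination
print(f"pooled r = {r_pooled:.2f} (inflated by mean differences), "
      f"meta-analytic r = {r_meta:.2f}")
```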

  14. Examination of the relationship between theory-driven policies and allowed lost-time back claims in workers' compensation: a system dynamics model.

    PubMed

    Wong, Jessica J; McGregor, Marion; Mior, Silvano A; Loisel, Patrick

    2014-01-01

    The purpose of this study was to develop a model that evaluates the impact of policy changes on the number of workers' compensation lost-time back claims in Ontario, Canada, over a 30-year timeframe. The model was used to test the hypothesis that a theory- and policy-driven model would be sufficient in reproducing historical claims data in a robust manner and that policy changes would have a major impact on modeled data. The model was developed using system dynamics methods in the Vensim simulation program. The theoretical effects of policies for compensation benefit levels and experience rating fees were modeled. The model was built and validated using historical claims data from 1980 to 2009. Sensitivity analysis was used to evaluate the modeled data at extreme end points of variable input and timeframes. The degree of predictive value of the modeled data was measured by the coefficient of determination, root mean square error, and Theil's inequality coefficients. Correlation between modeled data and actual data was found to be meaningful (R² = 0.934), and the modeled data were stable at extreme end points. Among the effects explored, policy changes were found to be relatively minor drivers of back claims data, accounting for a 13% improvement in error. Simulation results suggested that unemployment, the number of no-lost-time claims, the number of injuries per worker, and the recovery rate from back injuries outside of claims management are sensitive drivers of back claims data. A robust systems-based model was developed and tested for use in future policy research in Ontario's workers' compensation. The study findings suggest that certain areas within and outside the workers' compensation system need to be considered when evaluating and changing policies around back claims. © 2014. Published by National University of Health Sciences All rights reserved.

  15. Correlation between National Influenza Surveillance Data and Google Trends in South Korea

    PubMed Central

    Jo, Min Woo; Shin, Soo-Yong; Lee, Jae Ho; Ryoo, Seoung Mok; Kim, Won Young; Seo, Dong-Woo

    2013-01-01

    Background In South Korea, there is currently no syndromic surveillance system using internet search data, including Google Flu Trends. The purpose of this study was to investigate the correlation between national influenza surveillance data and Google Trends in South Korea. Methods Our study was based on a publicly available search engine database, Google Trends, using 12 influenza-related queries, from September 9, 2007 to September 8, 2012. National surveillance data were obtained from the Korea Centers for Disease Control and Prevention (KCDC) influenza-like illness (ILI) and virologic surveillance system. Pearson's correlation coefficients were calculated to compare the national surveillance and the Google Trends data for the overall period and for 5 influenza seasons. Results The correlation coefficient between the KCDC ILI and virologic surveillance data was 0.72 (p<0.05). The highest correlation was between the Google Trends query of H1N1 and the ILI data, with a correlation coefficient of 0.53 (p<0.05), for the overall study period. When compared with the KCDC virologic data, the Google Trends query of bird flu had the highest correlation with a correlation coefficient of 0.93 (p<0.05) in the 2010-11 season. The following queries showed a statistically significant correlation coefficient compared with ILI data for three consecutive seasons: Tamiflu (r = 0.59, 0.86, 0.90, p<0.05), new flu (r = 0.64, 0.43, 0.70, p<0.05) and flu (r = 0.68, 0.43, 0.77, p<0.05). Conclusions In our study, we found that Google Trends data for certain influenza-related queries correlated with national surveillance data in South Korea. The results of this study showed that Google Trends in the Korean language can be used as complementary data for influenza surveillance but is insufficient for use in predictive models, such as Google Flu Trends. PMID:24339927

  16. Correlation between national influenza surveillance data and google trends in South Korea.

    PubMed

    Cho, Sungjin; Sohn, Chang Hwan; Jo, Min Woo; Shin, Soo-Yong; Lee, Jae Ho; Ryoo, Seoung Mok; Kim, Won Young; Seo, Dong-Woo

    2013-01-01

    In South Korea, there is currently no syndromic surveillance system using internet search data, including Google Flu Trends. The purpose of this study was to investigate the correlation between national influenza surveillance data and Google Trends in South Korea. Our study was based on a publicly available search engine database, Google Trends, using 12 influenza-related queries, from September 9, 2007 to September 8, 2012. National surveillance data were obtained from the Korea Centers for Disease Control and Prevention (KCDC) influenza-like illness (ILI) and virologic surveillance system. Pearson's correlation coefficients were calculated to compare the national surveillance and the Google Trends data for the overall period and for 5 influenza seasons. The correlation coefficient between the KCDC ILI and virologic surveillance data was 0.72 (p<0.05). The highest correlation was between the Google Trends query of H1N1 and the ILI data, with a correlation coefficient of 0.53 (p<0.05), for the overall study period. When compared with the KCDC virologic data, the Google Trends query of bird flu had the highest correlation with a correlation coefficient of 0.93 (p<0.05) in the 2010-11 season. The following queries showed a statistically significant correlation coefficient compared with ILI data for three consecutive seasons: Tamiflu (r = 0.59, 0.86, 0.90, p<0.05), new flu (r = 0.64, 0.43, 0.70, p<0.05) and flu (r = 0.68, 0.43, 0.77, p<0.05). In our study, we found that Google Trends data for certain influenza-related queries correlated with national surveillance data in South Korea. The results of this study showed that Google Trends in the Korean language can be used as complementary data for influenza surveillance but is insufficient for use in predictive models, such as Google Flu Trends.

  17. SU-E-J-212: Identifying Bones From MRI: A Dictionary Learning and Sparse Regression Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruan, D; Yang, Y; Cao, M

    2014-06-01

    Purpose: To develop an efficient and robust scheme to identify bony anatomy based on MRI-only simulation images. Methods: MRI offers important soft tissue contrast and functional information, yet its lack of correlation to electron density has placed it as an auxiliary modality to CT in radiotherapy simulation and adaptation. An effective scheme to identify bony anatomy is an important first step towards an MR-only simulation/treatment paradigm and would satisfy most practical purposes. We utilize a UTE acquisition sequence to achieve visibility of the bone. In contrast to manual + bulk assignment or registration-based approaches to identify bones, we propose a novel learning-based approach for improved robustness to MR artefacts and environmental changes. Specifically, local information is encoded with an MR image patch, and the corresponding label is extracted (during training) from simulation CT aligned to the UTE. Within each class (bone vs. nonbone), an overcomplete dictionary is learned so that typical patches within the proper class can be represented as a sparse combination of the dictionary entries. For testing, an acquired UTE-MRI is divided into patches using a sliding scheme, where each patch is sparsely regressed against both bone and nonbone dictionaries, and subsequently claimed to be associated with the class with the smaller residual. Results: The proposed method has been applied to the pilot site of brain imaging and has shown generally good performance, with a Dice similarity coefficient of greater than 0.9 in a cross-validation study using 4 datasets. Importantly, it is robust towards consistent foreign objects (e.g., headset) and the artefacts related to Gibbs ringing and field heterogeneity. Conclusion: A learning perspective has been developed for inferring bone structures based on UTE MRI. The imaging setting is subject to minimal motion effects and the post-processing is efficient. The improved efficiency and robustness enable a first translation to an MR-only routine. The scheme generalizes to multiple tissue classes.
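
    A minimal sketch of the patch-classification idea described above, assuming flattened UTE-MRI patches arranged as rows of feature matrices. This is not the authors' code; the scikit-learn estimator, hyperparameters, and sparsity level are placeholders chosen for illustration.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode

def train_dictionaries(bone_patches, nonbone_patches, n_atoms=128):
    """Learn one overcomplete dictionary per class from flattened image patches
    (each input is an array of shape (n_patches, patch_size))."""
    def learn(X):
        return MiniBatchDictionaryLearning(
            n_components=n_atoms, alpha=1.0, random_state=0).fit(X).components_
    return learn(bone_patches), learn(nonbone_patches)

def classify_patch(patch, dict_bone, dict_nonbone, n_nonzero=5):
    """Sparse-code the patch against each dictionary (OMP) and pick the class
    with the smaller reconstruction residual."""
    x = patch.reshape(1, -1)

    def residual(D):
        code = sparse_encode(x, D, algorithm="omp", n_nonzero_coefs=n_nonzero)
        return np.linalg.norm(x - code @ D)

    return "bone" if residual(dict_bone) < residual(dict_nonbone) else "nonbone"
```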

  18. Supersymmetric contributions to weak decay correlation coefficients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Profumo, S.; Ramsey-Musolf, M. J.; Tulin, S.

    2007-04-01

    We study supersymmetric contributions to correlation coefficients that characterize the spectral shape and angular distribution for polarized μ- and β-decays. In the minimal supersymmetric standard model (MSSM), one-loop box graphs containing superpartners can give rise to non-(V−A)×(V−A) four-fermion operators in the presence of left-right or flavor mixing between sfermions. We analyze the present phenomenological constraints on such mixing and determine the range of allowed contributions to the weak decay correlation coefficients. We discuss the prospective implications for future μ- and β-decay experiments, and argue that they may provide unique probes of left-right mixing in the first generation scalar fermion sector.

  19. Ecotoxicology of phenylphosphonothioates.

    PubMed Central

    Francis, B M; Hansen, L G; Fukuto, T R; Lu, P Y; Metcalf, R L

    1980-01-01

    The phenylphosphonothioate insecticides EPN and leptophos, and several analogs, were evaluated with respect to their delayed neurotoxic effects in hens and their environmental behavior in a terrestrial-aquatic model ecosystem. Acute toxicity to insects was highly correlated with Σσ of the substituted phenyl group (r = -0.91), while acute toxicity to mammals was slightly less well correlated (r = -0.71), and neurotoxicity was poorly correlated with Σσ (r = -0.35). Both EPN and leptophos were markedly more persistent and bioaccumulative in the model ecosystem than parathion. Desbromoleptophos, a contaminant and metabolite of leptophos, was seen to be a highly stable and persistent terminal residue of leptophos. PMID:6159210

  20. Choosing the best index for the average score intraclass correlation coefficient.

    PubMed

    Shieh, Gwowen

    2016-09-01

    The intraclass correlation coefficient ICC(2) index from a one-way random effects model is widely used to describe the reliability of mean ratings in behavioral, educational, and psychological research. Despite its apparent utility, the essential property of ICC(2) as a point estimator of the average score intraclass correlation coefficient is seldom mentioned. This article considers several potential measures and compares their performance with ICC(2). Analytical derivations and numerical examinations are presented to assess the bias and mean square error of the alternative estimators. The results suggest that more advantageous indices can be recommended over ICC(2) for their theoretical implication and computational ease.
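
    For reference, the single-score and average-score (ICC(2)) estimators from the one-way random effects ANOVA can be computed directly from the mean squares. The sketch below is a generic illustration of those textbook formulas, not the alternative indices proposed in the article; the ratings matrix is hypothetical.

```python
import numpy as np

def icc_oneway(scores):
    """scores: (n_targets, k_measurements) matrix, one-way random effects model."""
    n, k = scores.shape
    target_means = scores.mean(axis=1)
    ms_between = k * np.var(target_means, ddof=1)
    ms_within = np.sum((scores - target_means[:, None]) ** 2) / (n * (k - 1))
    icc_single = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc_average = (ms_between - ms_within) / ms_between   # average score index, ICC(2)
    return icc_single, icc_average

ratings = np.array([[7, 8, 8], [5, 5, 6], [9, 9, 8], [4, 5, 4], [6, 7, 7]], dtype=float)
print(icc_oneway(ratings))
```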

  1. Hunter-gatherer postcranial robusticity relative to patterns of mobility, climatic adaptation, and selection for tissue economy.

    PubMed

    Stock, J T

    2006-10-01

    Human skeletal robusticity is influenced by a number of factors, including habitual behavior, climate, and physique. Conflicting evidence as to the relative importance of these factors complicates our ability to interpret variation in robusticity in the past. It remains unclear how the pattern of robusticity in the skeleton relates to adaptive constraints on skeletal morphology. This study investigates variation in robusticity in claviculae, humeri, ulnae, femora, and tibiae among human foragers, relative to climate and habitual behavior. Cross-sectional geometric properties of the diaphyses are compared among hunter-gatherers from southern Africa (n = 83), the Andaman Islands (n = 32), Tierra del Fuego (n = 34), and the Great Lakes region (n = 15). The robusticity of both proximal and distal limb segments correlates negatively with climate and positively with patterns of terrestrial and marine mobility among these groups. However, the relative correspondence between robusticity and these factors varies throughout the body. In the lower limb, partial correlations between polar second moment of area (J(0.73)) and climate decrease from proximal to distal section locations, while this relationship increases from proximal to distal in the upper limb. Patterns of correlation between robusticity and mobility, either terrestrial or marine, generally increase from proximal to distal in the lower and upper limbs, respectively. This suggests that there may be a stronger relationship between observed patterns of diaphyseal hypertrophy and behavioral differences between populations in distal elements. Despite this trend, strength circularity indices at the femoral midshaft show the strongest correspondence with terrestrial mobility, particularly among males.

  2. The Evolution of Pearson's Correlation Coefficient

    ERIC Educational Resources Information Center

    Kader, Gary D.; Franklin, Christine A.

    2008-01-01

    This article describes an activity for developing the notion of association between two quantitative variables. By exploring a collection of scatter plots, the authors propose a nonstandard "intuitive" measure of association; and by examining properties of this measure, they develop the more standard measure, Pearson's Correlation Coefficient. The…

  3. Modeling Concordance Correlation Coefficient for Longitudinal Study Data

    ERIC Educational Resources Information Center

    Ma, Yan; Tang, Wan; Yu, Qin; Tu, X. M.

    2010-01-01

    Measures of agreement are used in a wide range of behavioral, biomedical, psychosocial, and health-care related research to assess reliability of diagnostic test, psychometric properties of instrument, fidelity of psychosocial intervention, and accuracy of proxy outcome. The concordance correlation coefficient (CCC) is a popular measure of…

  4. Neonates with Bartter syndrome have enormous fluid and sodium requirements.

    PubMed

    Azzi, Antonio; Chehade, Hassib; Deschênes, Georges

    2015-07-01

    Managing neonatal Bartter syndrome by achieving adequate weight gain is challenging. We assessed the correlation between weight gain in neonatal Bartter syndrome and the introduction of fluid and sodium supplementations and indomethacin during the first 4 weeks of life. Daily fluid and electrolytes requirements were analysed using linear regression and Spearman correlation coefficients. The weight gain coefficient was calculated as daily weight gain after physiological neonatal weight loss. We studied seven infants. The highest weight gain coefficients occurred between weeks two and four in the five neonates who either received prompt amounts of fluid (maximum 810 mL/kg/day) and sodium (maximum 70 mmol/kg/day) or were treated with indomethacin. For the two patients with the highest weight gain coefficient, water and sodium supplementations were decreased in weeks two to four leading to a significant negative Spearman correlation between weight gain and fluid supplements (r = -0.55 and -0.68) and weight gain and sodium supplementations (r = -0.96 and -0.72). The two patients with the lowest weight gain coefficient had positive Spearman correlation coefficients between weight gain and fluid and sodium supplementations. Infants with neonatal Bartter syndrome required rapid and enormous fluid and sodium supplementations or the early introduction of indomethacin treatment to achieve adequate weight gain during the early postnatal period. ©2015 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  5. [The appraisal of reliability and validity of subjective workload assessment technique and NASA-task load index].

    PubMed

    Xiao, Yuan-mei; Wang, Zhi-ming; Wang, Mian-zhen; Lan, Ya-jia

    2005-06-01

    To test the reliability and validity of two mental workload assessment scales, i.e. the subjective workload assessment technique (SWAT) and the NASA task load index (NASA-TLX). One thousand two hundred and sixty-eight mental workers were sampled from various kinds of occupations, such as scientific research, education, administration and medicine, etc., with randomized cluster sampling. The re-test reliability, split-half reliability, Cronbach's alpha coefficient and correlation coefficients between item score and total score were adopted to test the reliability. Validity testing included structural validity. The re-test reliability coefficients of these two scales and their items ranged from 0.516 to 0.753 (P < 0.01), indicating that the two scales had good re-test reliability; the split-half reliability of SWAT was 0.645, its Cronbach's alpha coefficient was more than 0.80, and all the correlation coefficients between its item scores and total score were more than 0.70; as for NASA-TLX, both the split-half reliability and Cronbach's alpha coefficient were more than 0.80, and the correlation coefficients between its item scores and total score were all more than 0.60 (P < 0.01) except for the performance item. Both scales had good internal consistency. The Pearson correlation coefficient between the two scales was 0.492 (P < 0.01), implying that the results of the two scales had good consistency. Factor analysis showed that the two scales had good structural validity. Both SWAT and NASA-TLX have good reliability and validity and may be used as a valid tool to assess mental workload in China after being revised properly.
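
    Cronbach's alpha, used above for internal consistency, is simple to compute from a respondents-by-items score matrix. The sketch below is a generic illustration of the standard formula, not the scales' own scoring code, and the item scores are made up.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

scores = np.array([[4, 5, 4, 3], [2, 3, 2, 2], [5, 5, 4, 5], [3, 4, 3, 3], [1, 2, 2, 1]])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```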

  6. Pharmaceuticals' sorptions relative to properties of thirteen different soils.

    PubMed

    Kodešová, Radka; Grabic, Roman; Kočárek, Martin; Klement, Aleš; Golovko, Oksana; Fér, Miroslav; Nikodem, Antonín; Jakšík, Ondřej

    2015-04-01

    Transport of human and veterinary pharmaceuticals in soils and consequent ground-water contamination are influenced by many factors, including compound sorption on soil particles. Here we evaluate the sorption isotherms for 7 pharmaceuticals on 13 soils, described by Freundlich equations, and assess the impact of soil properties on various pharmaceuticals' sorption on soils. Sorption of ionizable pharmaceuticals was, in many cases, highly affected by soil pH. The sorption coefficient of sulfamethoxazole was negatively correlated to soil pH, and thus positively related to hydrolytic acidity and exchangeable acidity. Sorption coefficients for clindamycin and clarithromycin were positively related to soil pH and thus negatively related to hydrolytic acidity and exchangeable acidity, and positively related to base cation saturation. The sorption coefficients for the remaining pharmaceuticals (trimethoprim, metoprolol, atenolol, and carbamazepine) were also positively correlated with the base cation saturation and cation exchange capacity. Positive correlations between sorption coefficients and clay content were found for clindamycin, clarithromycin, atenolol, and metoprolol. Positive correlations between sorption coefficients and organic carbon content were obtained for trimethoprim and carbamazepine. Pedotransfer rules for predicting sorption coefficients of various pharmaceuticals included hydrolytic acidity (sulfamethoxazole), organic carbon content (trimethoprim and carbamazepine), base cation saturation (atenolol and metoprolol), exchangeable acidity and clay content (clindamycin), and soil active pH and clay content (clarithromycin). Pedotransfer rules, predicting the Freundlich sorption coefficients, could be applied for prediction of pharmaceutical mobility in soils with similar soil properties. Predicted sorption coefficients together with pharmaceutical half-lives and other inputs (e.g., soil-hydraulic, geological, hydro-geological, climatic) may be used for assessing potential ground-water contamination. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Statistical analysis on multifractal detrended cross-correlation coefficient for return interval by oriented percolation

    NASA Astrophysics Data System (ADS)

    Deng, Wei; Wang, Jun

    2015-06-01

    We investigate and quantify the multifractal detrended cross-correlation of return interval series for Chinese stock markets and a proposed price model; the price model is established by oriented percolation. The return interval describes the waiting time between two successive price volatilities that exceed some threshold, and the present work is an attempt to quantify the level of multifractal detrended cross-correlation for the return intervals. Further, the concept of the MF-DCCA coefficient of return intervals is introduced, and the corresponding empirical research is performed. The empirical results show that the return intervals of SSE and SZSE are weakly positively multifractal power-law cross-correlated and exhibit the fluctuation patterns of MF-DCCA coefficients. Similar behaviors of the return intervals are also demonstrated for the price model.

  8. Information-theoretic indices usage for the prediction and calculation of octanol-water partition coefficient.

    PubMed

    Persona, Marek; Kutarov, Vladimir V; Kats, Boris M; Persona, Andrzej; Marczewska, Barbara

    2007-01-01

    The paper describes a new method for predicting the octanol-water partition coefficient, based on molecular graph theory. The results obtained using the new method are well correlated with experimental values. These results were compared with the ones obtained using ten other structure-correlation methods. The comparison shows that graph theory can be very useful in structure-correlation research.

  9. Analytic posteriors for Pearson's correlation coefficient.

    PubMed

    Ly, Alexander; Marsman, Maarten; Wagenmakers, Eric-Jan

    2018-02-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.

  10. Validity of a self-administered food frequency questionnaire in the 5-year follow-up survey of the JPHC Study Cohort I to assess sodium and potassium intake: comparison with dietary records and 24-hour urinary excretion level.

    PubMed

    Sasaki, Satoshi; Ishihara, Junko; Tsugane, Shoichiro

    2003-01-01

    We compared the intake levels of sodium and potassium assessed with a self-administered semi-quantitative food frequency questionnaire (FFQ) used in a 5-year follow-up survey of the JPHC study and a 28-day dietary record (DR), and the corresponding two 24-hour urinary excretion levels (32 men and 57 women) in 3 areas, i.e., the Ninohe, Yokote, and Saku Public Health Center areas. The Spearman rank correlation coefficients between dietary sodium assessed with the FFQ and the urinary excretion for crude values were 0.24 and -0.10 in men and women, respectively. After adjusting for energy and creatinine, the sodium correlation coefficients were 0.35 and 0.25 in men and women, respectively. The correlation coefficients for crude potassium values were 0.18 and -0.13 in men and women, respectively. After adjusting for energy and creatinine, the potassium correlation coefficients were 0.48 and 0.18 in men and women, respectively. In conclusion, a weak correlation was observed both for sodium and potassium after energy and creatinine adjustment in men, whereas no meaningful correlation was observed in women.

  11. Direct numerical simulation of variable surface tension flows using a Volume-of-Fluid method

    NASA Astrophysics Data System (ADS)

    Seric, Ivana; Afkhami, Shahriar; Kondic, Lou

    2018-01-01

    We develop a general methodology for the inclusion of a variable surface tension coefficient into a Volume-of-Fluid based Navier-Stokes solver. This new numerical model provides a robust and accurate method for computing the surface gradients directly by finding the tangent directions on the interface using height functions. The implementation is applicable to both temperature- and concentration-dependent surface tension coefficients, along with setups involving a large jump in temperature between the fluid and its surroundings, as well as situations where the concentration should be strictly confined to the fluid domain, such as the mixing of fluids with different surface tension coefficients. We demonstrate the applicability of our method to the thermocapillary migration of bubbles and the coalescence of drops characterized by different surface tension coefficients.

  12. 28 CFR 50.14 - Guidelines on employee selection procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... out are: Assumptions of validity based on a procedure's name or descriptive labels; all forms of... relationship (e.g., correlation coefficient) between performance on a selection procedure and one or more... upon a study involving a large number of subjects and has a low correlation coefficient will be subject...

  13. 28 CFR 50.14 - Guidelines on employee selection procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... out are: Assumptions of validity based on a procedure's name or descriptive labels; all forms of... relationship (e.g., correlation coefficient) between performance on a selection procedure and one or more... upon a study involving a large number of subjects and has a low correlation coefficient will be subject...

  14. 28 CFR 50.14 - Guidelines on employee selection procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... out are: Assumptions of validity based on a procedure's name or descriptive labels; all forms of... relationship (e.g., correlation coefficient) between performance on a selection procedure and one or more... upon a study involving a large number of subjects and has a low correlation coefficient will be subject...

  15. 28 CFR 50.14 - Guidelines on employee selection procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... out are: Assumptions of validity based on a procedure's name or descriptive labels; all forms of... relationship (e.g., correlation coefficient) between performance on a selection procedure and one or more... upon a study involving a large number of subjects and has a low correlation coefficient will be subject...

  16. 28 CFR 50.14 - Guidelines on employee selection procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... out are: Assumptions of validity based on a procedure's name or descriptive labels; all forms of... relationship (e.g., correlation coefficient) between performance on a selection procedure and one or more... upon a study involving a large number of subjects and has a low correlation coefficient will be subject...

  17. Inferential Procedures for Correlation Coefficients Corrected for Attenuation.

    ERIC Educational Resources Information Center

    Hakstian, A. Ralph; And Others

    1988-01-01

    A model and computation procedure based on classical test score theory are presented for determination of a correlation coefficient corrected for attenuation due to unreliability. Delta and Monte Carlo method applications are discussed. A power analysis revealed no serious loss in efficiency resulting from correction for attenuation. (TJH)
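
    The classical correction for attenuation divides the observed correlation by the geometric mean of the two reliabilities; a one-line sketch follows, with hypothetical values chosen only to show the arithmetic (this is the textbook formula, not the article's inferential procedure).

```python
import math

def disattenuated_r(r_xy, reliability_x, reliability_y):
    # Classical correction for attenuation: divide the observed correlation
    # by the geometric mean of the two reliabilities.
    return r_xy / math.sqrt(reliability_x * reliability_y)

print(disattenuated_r(0.42, 0.80, 0.70))  # ~0.56
```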

  18. Meta-Analysis of the Correlation between Apparent Diffusion Coefficient and Standardized Uptake Value in Malignant Disease

    PubMed Central

    Deng, Shengming; Wu, Zhifang; Wu, Yiwei; Zhang, Wei; Li, Jihui; Dai, Na

    2017-01-01

    The objective of this meta-analysis is to explore the correlation between the apparent diffusion coefficient (ADC) on diffusion-weighted MR and the standard uptake value (SUV) of 18F-FDG on PET/CT in patients with cancer. Databases such as PubMed (MEDLINE included), EMBASE, and Cochrane Database of Systematic Review were searched for relevant original articles that explored the correlation between SUV and ADC in English. After applying Fisher's r-to-z transformation, correlation coefficient (r) values were extracted from each study and 95% confidence intervals (CIs) were calculated. Sensitivity and subgroup analyses based on tumor type were performed to investigate the potential heterogeneity. Forty-nine studies were eligible for the meta-analysis, comprising 1927 patients. Pooled r for all studies was −0.35 (95% CI: −0.42 to −0.28) and exhibited a notable heterogeneity (I² = 78.4%; P < 0.01). In terms of the cancer type subgroup analysis, combined correlation coefficients of ADC/SUV ranged from −0.12 (lymphoma, n = 5) to −0.59 (pancreatic cancer, n = 2). We concluded that there is an average negative correlation between ADC and SUV in patients with cancer. Higher correlations were found for brain tumors, cervix carcinoma, and pancreas cancer. However, a larger, prospective study is warranted to validate these findings in different cancer types. PMID:29097924

  19. Meta-Analysis of the Correlation between Apparent Diffusion Coefficient and Standardized Uptake Value in Malignant Disease.

    PubMed

    Deng, Shengming; Wu, Zhifang; Wu, Yiwei; Zhang, Wei; Li, Jihui; Dai, Na; Zhang, Bin; Yan, Jianhua

    2017-01-01

    The objective of this meta-analysis is to explore the correlation between the apparent diffusion coefficient (ADC) on diffusion-weighted MR and the standard uptake value (SUV) of 18F-FDG on PET/CT in patients with cancer. Databases such as PubMed (MEDLINE included), EMBASE, and Cochrane Database of Systematic Review were searched for relevant original articles that explored the correlation between SUV and ADC in English. After applying Fisher's r-to-z transformation, correlation coefficient (r) values were extracted from each study and 95% confidence intervals (CIs) were calculated. Sensitivity and subgroup analyses based on tumor type were performed to investigate the potential heterogeneity. Forty-nine studies were eligible for the meta-analysis, comprising 1927 patients. Pooled r for all studies was -0.35 (95% CI: -0.42 to -0.28) and exhibited a notable heterogeneity (I² = 78.4%; P < 0.01). In terms of the cancer type subgroup analysis, combined correlation coefficients of ADC/SUV ranged from -0.12 (lymphoma, n = 5) to -0.59 (pancreatic cancer, n = 2). We concluded that there is an average negative correlation between ADC and SUV in patients with cancer. Higher correlations were found for brain tumors, cervix carcinoma, and pancreas cancer. However, a larger, prospective study is warranted to validate these findings in different cancer types.
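
    Pooling correlation coefficients via Fisher's r-to-z, as in the two records above, can be sketched as below. This is a fixed-effect illustration with hypothetical study values; given the reported heterogeneity (I² = 78.4%), the actual meta-analysis would normally use a random-effects model.

```python
import numpy as np

def pool_correlations(r_values, n_values):
    """Fixed-effect pooling of correlations via Fisher's r-to-z with
    inverse-variance weights w_i = n_i - 3; returns pooled r and 95% CI."""
    z = np.arctanh(np.asarray(r_values, dtype=float))
    w = np.asarray(n_values, dtype=float) - 3.0
    z_bar = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    return np.tanh(z_bar), (np.tanh(z_bar - 1.96 * se), np.tanh(z_bar + 1.96 * se))

r_pooled, ci = pool_correlations([-0.45, -0.30, -0.25], [40, 55, 32])
print(f"pooled r = {r_pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```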

  20. American Orthopaedic Foot and Ankle Society ankle-hindfoot scale: A cross-cultural adaptation and validation study from Iran.

    PubMed

    Vosoughi, Amir Reza; Roustaei, Narges; Mahdaviazad, Hamideh

    2018-06-01

    The use of valid and reliable outcome rating scales is essential for evaluating the results of different treatments and interventions. The purposes of this study were to translate and culturally adapt the American Orthopaedic Foot and Ankle Society ankle-hindfoot scale (AOFAS-AHFS) into the Persian language and to evaluate its psychometric properties. A forward-backward translation and cultural adaptation method was used to develop the Persian version of the AOFAS-AHFS. From March to July 2016, one hundred consecutive patients with ankle and hindfoot injuries were included. Internal consistency and reproducibility were evaluated using Cronbach's alpha, Spearman's rank correlation coefficient and the intraclass correlation coefficient (ICC), respectively. Construct validity was assessed by comparing the outcome rating scale measurements with the Short Form-36 (SF-36); convergent and discriminant validity were evaluated using Spearman's rank correlation coefficient. The mean age (SD) of the patients was 41.95 ± 13.45 years. Cronbach's α coefficient, Spearman's rho and ICC values were 0.71, 0.89 and 0.90, respectively. Correlations between the total AOFAS-AHFS score and the SF-36 domains ranged between 0.17 and 0.55. A Spearman's rank correlation coefficient of 0.4 was exceeded by all items with the exception of stability. The Spearman's rank correlation between each item in the functional subscales and its own subscale was higher than the correlation between these items and the other subscales. The Persian version of the AOFAS-AHFS provides an additional reliable and valid instrument that can be used to assess a broad range of Persian-speaking patients with foot and ankle disorders. However, it seems that the original version of the AOFAS-AHFS needs some revisions. Copyright © 2017 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.

  1. Development of a rapid, simple assay of plasma total carotenoids

    PubMed Central

    2012-01-01

    Background Plasma total carotenoids can be used as an indicator of risk of chronic disease. Laboratory analysis of individual carotenoids by high performance liquid chromatography (HPLC) is time consuming, expensive, and not amenable to use beyond a research laboratory. The aim of this research is to establish a rapid, simple, and inexpensive spectrophotometric assay of plasma total carotenoids that has a very strong correlation with HPLC carotenoid profile analysis. Results Plasma total carotenoids from 29 volunteers ranged in concentration from 1.2 to 7.4 μM, as analyzed by HPLC. A linear correlation was found between the absorbance at 448 nm of an alcohol / heptane extract of the plasma and plasma total carotenoids analyzed by HPLC, with a Pearson correlation coefficient of 0.989. The average coefficient of variation for the spectrophotometric assay was 6.5% for the plasma samples. The limit of detection was about 0.3 μM and was linear up to about 34 μM without dilution. Correlations between the integrals of the absorption spectra in the range of carotenoid absorption and total plasma carotenoid concentration gave similar results to the absorbance correlation. Spectrophotometric assay results also agreed with the calculated expected absorbance based on published extinction coefficients for the individual carotenoids, with a Pearson correlation coefficient of 0.988. Conclusion The spectrophotometric assay of total carotenoids strongly correlated with HPLC analysis of carotenoids of the same plasma samples and expected absorbance values based on extinction coefficients. This rapid, simple, inexpensive assay, when coupled with the carotenoid health index, may be useful for nutrition intervention studies, population cohort studies, and public health interventions. PMID:23006902
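
    The calibration step described above amounts to a simple linear regression of HPLC total carotenoids on absorbance at 448 nm. The sketch below uses made-up paired values purely to illustrate how the slope, intercept, and Pearson r of such a calibration are computed; it does not reproduce the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements: absorbance at 448 nm of the extract and
# total carotenoids (uM) from HPLC for the same plasma samples.
absorbance = np.array([0.11, 0.18, 0.25, 0.33, 0.41, 0.52, 0.63])
hplc_um = np.array([1.3, 2.1, 2.9, 3.8, 4.7, 6.0, 7.2])

fit = stats.linregress(absorbance, hplc_um)
print(f"Pearson r = {fit.rvalue:.3f}; "
      f"total carotenoids ~ {fit.slope:.1f} * A448 + {fit.intercept:.2f} uM")
```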

  2. [Tobacco quality analysis of industrial classification of different years using near-infrared (NIR) spectrum].

    PubMed

    Wang, Yi; Xiang, Ma; Wen, Ya-Dong; Yu, Chun-Xia; Wang, Luo-Ping; Zhao, Long-Lian; Li, Jun-Hui

    2012-11-01

    In this study, tobacco quality analysis of the main industrial classifications of different years was carried out applying spectrum projection and correlation methods. The data were near-infrared (NIR) spectra from Hongta Tobacco (Group) Co., Ltd. 5730 tobacco leaf industrial classification samples from Yuxi in Yunnan province from 2007 to 2010 were collected using near infrared spectroscopy; the samples came from different parts and colors and all belong to the tobacco variety HONGDA. The conclusions showed that, when the samples from the same year were divided randomly into analysis and verification sets in a 2:1 ratio, the verification set corresponded with the analysis set under spectrum projection, because their correlation coefficients were above 0.98. The correlation coefficients between two different years obtained by spectrum projection were above 0.97. The highest correlation coefficient was the one between 2008 and 2009, and the lowest was the one between 2007 and 2010. At the same time, the study discussed a method to obtain quantitative similarity values of different industrial classification samples. The similarity and consistency values were instructive for combination and replacement in tobacco leaf blending.

  3. Increased correlation coefficient between the written test score and tutors' performance test scores after training of tutors for assessment of medical students during problem-based learning course in Malaysia.

    PubMed

    Jaiprakash, Heethal; Min, Aung Ko Ko; Ghosh, Sarmishtha

    2016-03-01

    This paper aims to determine whether the correlation between the written test score and tutors' performance test scores changed after tutor training in the assessment of medical students during a problem-based learning (PBL) course in Malaysia. This is a cross-sectional observational study, conducted among 264 medical students in two groups from November 2010 to November 2012. The first group's tutors did not receive tutor training, while the second group's tutors were trained in the PBL process. Each group was divided into high, middle and low achievers based on their end-of-semester exam scores. PBL scores were taken, which included written test scores and tutors' performance test scores. The Pearson correlation coefficient was calculated between the two kinds of scores in each group. The correlation coefficient between the written scores and tutors' scores in group 1 was 0.099 (p<0.001) and for group 2 was 0.305 (p<0.001). The higher correlation coefficient in the group whose tutors received PBL training reinforces the importance of tutor training before participation in a PBL course.

  4. p-capture reaction cycles in rotating massive stars and their impact on elemental abundances in globular cluster stars: A case study of O, Na and Al

    NASA Astrophysics Data System (ADS)

    Mahanta, Upakul; Goswami, Aruna; Duorah, Hiralal; Duorah, Kalpana

    2017-08-01

    Elemental abundance patterns of globular cluster stars can provide important clues for understanding cluster formation and early chemical evolution. The origin of the abundance patterns, however, still remains poorly understood. We have studied the impact of p-capture reaction cycles on the abundances of oxygen, sodium and aluminium, considering nuclear reaction cycles of carbon-nitrogen-oxygen-fluorine, neon-sodium and magnesium-aluminium in massive stars at stellar temperatures in the range 2×10⁷ to 10×10⁷ K and a typical density of 10² g cm⁻³. We have estimated abundances of oxygen, sodium and aluminium with respect to Fe, which are then assumed to be ejected from those stars because of rotation reaching a critical limit. These ejected elemental abundances are then compared with their counterparts observed in some metal-poor evolved stars, mainly giants and red giants, of the globular clusters M3, M4, M13 and NGC 6752. We observe an excellent agreement in [O/Fe] between the estimated and observed abundance values for globular clusters M3 and M4, with a correlation coefficient above 0.9, and a strong linear correlation for the remaining two clusters, with a correlation coefficient above 0.7. The estimated [Na/Fe] is found to have a correlation coefficient above 0.7, implying a strong correlation for all four globular clusters. As for [Al/Fe], it also shows a strong correlation between the estimated and observed abundances for globular clusters M13 and NGC 6752, since here too the correlation coefficient is above 0.7, whereas for globular cluster M4 a moderate correlation is found, with a correlation coefficient above 0.6. Possible sources of these discrepancies are discussed.

  5. Chromosome3D: reconstructing three-dimensional chromosomal structures from Hi-C interaction frequency data using distance geometry simulated annealing.

    PubMed

    Adhikari, Badri; Trieu, Tuan; Cheng, Jianlin

    2016-11-07

    Reconstructing three-dimensional structures of chromosomes is useful for visualizing their shapes in a cell and interpreting their function. In this work, we reconstruct chromosomal structures from Hi-C data by translating contact counts in Hi-C data into Euclidean distances between chromosomal regions and then satisfying these distances using a structure reconstruction method rigorously tested in the field of protein structure determination. We first evaluate the robustness of the overall reconstruction algorithm on noisy simulated data at various levels of noise by comparing with some of the state-of-the-art reconstruction methods. Then, using simulated data, we validate that Spearman's rank correlation coefficient between pairwise distances in the reconstructed chromosomal structures and the experimental chromosomal contact counts can be used to find optimum conversion rules for transforming interaction frequencies to wish distances. This strategy is then applied to real Hi-C data at chromosome level for optimal transformation of interaction frequencies to wish distances and for ranking and selecting structures. The chromosomal structures reconstructed from a real-world human Hi-C dataset by our method were validated by the known two-compartment feature of the human chromosome organization. We also show that our method is robust with respect to the change of the granularity of Hi-C data, and consistently produces similar structures at different chromosomal resolutions. Chromosome3D is a robust method of reconstructing chromosome three-dimensional models using distance restraints obtained from Hi-C interaction frequency data. It is available as a web application and as an open source tool at http://sysbio.rnet.missouri.edu/chromosome3d/ .
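
    The structure-selection criterion mentioned above, the Spearman correlation between pairwise distances in a reconstructed model and the Hi-C contact counts, can be sketched as below. This is an illustrative outline, not the Chromosome3D code; a good reconstruction should give a strongly negative value, since frequently contacting bins should sit close together.

```python
import numpy as np
from scipy.stats import spearmanr

def structure_contact_agreement(coords, contacts):
    """Spearman correlation between pairwise distances of a reconstructed 3D
    model (coords: n_bins x 3) and the Hi-C contact counts (n_bins x n_bins)."""
    i, j = np.triu_indices_from(contacts, k=1)
    observed = contacts[i, j] > 0                       # keep pairs with contacts
    d = np.linalg.norm(coords[i[observed]] - coords[j[observed]], axis=1)
    rho, _ = spearmanr(d, contacts[i, j][observed])
    return rho
```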

  6. An Exploratory Analysis of Factors Affecting Participation in Air Force Knowledge Now Communities of Practice

    DTIC Science & Technology

    2004-03-01

    reliability coefficients are presented in chapter four in the factor analysis section. Along with Cronbach’s Alpha coefficients, the Kaiser-Meyer-Olkin ...the pattern of correlation coefficients > 0.300 in the correlation matrix • Kaiser-Meyer-Olkin Measure of Sampling Adequacy (MSA) > 0.700 • Bartlett’s...exploratory factor analysis. The Kaiser-Meyer-Olkin measure of sampling adequacy yielded a value of .790, and Bartlett’s test of sphericity yielded a

  7. Trends of Heller myotomy hospitalizations for achalasia in the United States, 1993-2005: effect of surgery volume on perioperative outcomes.

    PubMed

    Wang, Y Richard; Dempsey, Daniel T; Friedenberg, Frank K; Richter, Joel E

    2008-10-01

    Achalasia is a rare chronic disorder of esophageal motor function. Single-center reports suggest that there has been greater use of laparoscopic Heller myotomy for achalasia in the United States since its introduction in 1992. We aimed to study the trends of Heller myotomy and the relationship between surgery volume and perioperative outcomes. The Healthcare Cost and Utilization Project Nationwide Inpatient Sample (NIS) is a 20% stratified sample of all hospitalizations in the United States. It was used to study the macro-trends of Heller myotomy hospitalizations during 1993-2005. We also used the NIS 2003-2005 micro-data to study the perioperative outcomes of Heller myotomy hospitalizations, using other achalasia and laparoscopic cholecystectomy hospitalizations as control groups. The generalized linear model with repeated observations from the same unit was used to adjust for multiple hospitalizations from the same hospital. The national estimate of Heller myotomy hospitalizations increased from 728 to 2,255 during 1993-2005, while its mean length of stay decreased from 9.9 to 4.3 days. Of the 1,117 Heller myotomy hospitalizations in the NIS 2003-2005, 10 (0.9%) had the diagnosis of esophageal perforation at discharge. Length of stay was negatively correlated with a hospital's number of Heller myotomy per year (correlation coefficient -0.171, P < 0.001). In multivariate log-linear regressions with a control group, a hospital's number of Heller myotomy per year was negatively associated with length of stay (coefficient -0.215 to -0.119, both P < 0.001) and total charges (coefficient -0.252 to -0.073, both P < 0.10). These findings were robust in alternative statistical models, specifications, and subgroup analyses. On a national level, the introduction of laparoscopic Heller myotomy for achalasia was associated with greater use of surgery and shorter length of stay. A larger volume of Heller myotomy in a hospital was associated with better perioperative outcomes in terms of shorter length of stay and lower total charges.

  8. Network-based de-noising improves prediction from microarray data.

    PubMed

    Kato, Tsuyoshi; Murata, Yukio; Miura, Koh; Asai, Kiyoshi; Horton, Paul B; Koji, Tsuda; Fujibuchi, Wataru

    2006-03-20

    Prediction of human cell response to anti-cancer drugs (compounds) from microarray data is a challenging problem, due to the noise properties of microarrays as well as the high variance of living cell responses to drugs. Hence there is a strong need for more practical and robust methods than standard methods for real-value prediction. We devised an extended version of the off-subspace noise-reduction (de-noising) method to incorporate heterogeneous network data such as sequence similarity or protein-protein interactions into a single framework. Using that method, we first de-noise the gene expression data for training and test data and also the drug-response data for training data. Then we predict the unknown responses of each drug from the de-noised input data. For ascertaining whether de-noising improves prediction or not, we carry out 12-fold cross-validation for assessment of the prediction performance. We use the Pearson's correlation coefficient between the true and predicted response values as the prediction performance. De-noising improves the prediction performance for 65% of drugs. Furthermore, we found that this noise reduction method is robust and effective even when a large amount of artificial noise is added to the input data. We found that our extended off-subspace noise-reduction method combining heterogeneous biological data is successful and quite useful to improve prediction of human cell cancer drug responses from microarray data.

  9. Stability-Indicating HPLC Determination of Gemcitabine in Pharmaceutical Formulations

    PubMed Central

    Singh, Rahul; Shakya, Ashok K.; Naik, Rajashri; Shalan, Naeem

    2015-01-01

    A simple, sensitive, inexpensive, and rapid stability indicating high performance liquid chromatographic method has been developed for determination of gemcitabine in injectable dosage forms using theophylline as internal standard. Chromatographic separation was achieved on a Phenomenex Luna C-18 column (250 mm × 4.6 mm; 5 μm) with a mobile phase consisting of 90% water and 10% acetonitrile (pH 7.00 ± 0.05). The signals of gemcitabine and theophylline were recorded at 275 nm. Calibration curves were linear in the concentration range of 0.5–50 μg/mL. The correlation coefficient was 0.999 or higher. The limit of detection and limit of quantitation were 0.1498 and 0.4541 μg/mL, respectively. The inter- and intraday precision were less than 2%. Accuracy of the method ranged from 100.2% to 100.4%. Stability studies indicate that the drug was stable to sunlight and UV light. The drug gives 6 different hydrolytic products under alkaline stress and 3 in acidic condition. Aqueous and oxidative stress conditions also degrade the drug. Degradation was higher in the alkaline condition compared to other stress conditions. The robustness of the methods was evaluated using design of experiments. Validation reveals that the proposed method is specific, accurate, precise, reliable, robust, reproducible, and suitable for the quantitative analysis. PMID:25838825

  10. Automatic segmentation of brain MRIs and mapping neuroanatomy across the human lifespan

    NASA Astrophysics Data System (ADS)

    Keihaninejad, Shiva; Heckemann, Rolf A.; Gousias, Ioannis S.; Rueckert, Daniel; Aljabar, Paul; Hajnal, Joseph V.; Hammers, Alexander

    2009-02-01

    A robust model for the automatic segmentation of human brain images into anatomically defined regions across the human lifespan would be highly desirable, but such structural segmentations of brain MRI are challenging due to age-related changes. We have developed a new method, based on established algorithms for automatic segmentation of young adults' brains. We used prior information from 30 anatomical atlases, which had been manually segmented into 83 anatomical structures. Target MRIs came from 80 subjects (~12 individuals/decade) aged 20 to 90 years, with equal numbers of men and women, and data from two different scanners (1.5 T, 3 T) drawn from the IXI database. Each of the adult atlases was registered to each target MR image. By using additional information from segmentation into tissue classes (GM, WM and CSF) to initialise the warping based on label consistency similarity, before feeding this into the previously established normalised mutual information non-rigid registration, the registration became robust enough to accommodate atrophy and ventricular enlargement with age. The final segmentation was obtained by combining the 30 propagated atlases using decision fusion. Kernel smoothing was used for modelling the structural volume changes with aging. Example linear correlation coefficients with age were, for lateral ventricular volume, r_male = 0.76 and r_female = 0.58, and, for hippocampal volume, r_male = -0.6 and r_female = -0.4 (all p < 0.01).

  11. Computer aided segmentation of kidneys using locally shape constrained deformable models on CT images

    NASA Astrophysics Data System (ADS)

    Erdt, Marius; Sakas, Georgios

    2010-03-01

    This work presents a novel approach for model-based segmentation of the kidney in images acquired by Computed Tomography (CT). The developed computer-aided segmentation system is expected to support computer-aided diagnosis and operation planning. We have developed a deformable-model approach with local shape constraints that prevent the model from deforming into neighboring structures while allowing the global shape to adapt freely to the data. These local constraints are derived from the anatomical structure of the kidney and the presence and appearance of neighboring organs. The adaptation process is guided by a rule-based deformation logic in order to improve the robustness of the segmentation in areas of diffuse organ boundaries. Our workflow consists of two steps: 1) user-guided positioning, and 2) automatic model adaptation using affine and free-form deformation in order to robustly extract the kidney. In cases that show pronounced pathologies, the system also offers real-time mesh editing tools for quick refinement of the segmentation result. Evaluation results based on 30 clinical cases using CT data sets show an average Dice coefficient of 93% compared to the ground truth. The results are therefore in most cases comparable to manual delineation. Computation times for the automatic adaptation step are below 6 seconds, which makes the proposed system suitable for application in clinical practice.
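
    The evaluation metric reported above, the Dice coefficient between an automatic segmentation and the ground truth, can be computed as in this minimal sketch (binary masks assumed; not the authors' implementation).

      # Sketch: Dice coefficient between two binary segmentation masks.
      import numpy as np

      def dice(seg, truth):
          seg, truth = seg.astype(bool), truth.astype(bool)
          inter = np.logical_and(seg, truth).sum()
          return 2.0 * inter / (seg.sum() + truth.sum())

      # Toy example: two overlapping square masks.
      a = np.zeros((10, 10), dtype=bool); a[2:7, 2:7] = True
      b = np.zeros((10, 10), dtype=bool); b[3:8, 3:8] = True
      print(dice(a, b))   # 0.64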

  12. MOCC: A Fast and Robust Correlation-Based Method for Interest Point Matching under Large Scale Changes

    NASA Astrophysics Data System (ADS)

    Zhao, Feng; Huang, Qingming; Wang, Hao; Gao, Wen

    2010-12-01

    Similarity measures based on correlation have been used extensively for matching tasks. However, traditional correlation-based image matching methods are sensitive to rotation and scale changes. This paper presents a fast correlation-based method for matching two images with large rotation and significant scale changes. Multiscale oriented corner correlation (MOCC) is used to evaluate the degree of similarity between the feature points. The method is rotation invariant and capable of matching image pairs with scale changes up to a factor of 7. Moreover, MOCC is much faster in comparison with the state-of-the-art matching methods. Experimental results on real images show the robustness and effectiveness of the proposed method.
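
    For background, the classical correlation-based similarity measure that methods such as MOCC build upon is the zero-mean normalized cross-correlation between two image patches, sketched below. This baseline is itself sensitive to rotation and scale, which is exactly the limitation the paper addresses; it is not the MOCC measure.

      # Sketch: zero-mean normalized cross-correlation of two equal-size patches.
      import numpy as np

      def ncc(p, q):
          p = p - p.mean()
          q = q - q.mean()
          denom = np.sqrt((p * p).sum() * (q * q).sum())
          return (p * q).sum() / denom if denom > 0 else 0.0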

  13. Comparison of the timings between abrupt climate changes in Greenland, Antarctica, China and Japan based on robust correlation using Lake Suigetsu as a template.

    NASA Astrophysics Data System (ADS)

    Nakagawa, T.

    2014-12-01

    High-resolution pollen-derived climate records from the Lake Suigetsu varved sediment core were compared with climate archives from other regions and revealed a particular spatio-temporal structure of monsoon climate change during so-called D-O events. Leads and lags of the climate change between different regions hold the key to understanding the climate system. However, robust assessment of the relative timing of climate change is often very challenging because correlation of climatic archives from different regions carries inevitable uncertainties. Greenland and the Cariaco Basin, for example, provide two of the most frequently cited palaeoclimatic proxy records representative of the high- and low-latitude Atlantic regions, yet robust correlation of the records from those regions is difficult because of uncertainties in layer counting, the lack of radiocarbon age control for ice cores, the marine reservoir age of the Cariaco sediments, and the absence of tephra layers shared by both sites. Similarly, speleothem and ice-core records are not robustly correlated with each other, owing to the dead carbon fraction in the speleothems and the lack of reliable correlation markers. The generally accepted hypothesis of synchronous climate change between China and Greenland is, therefore, essentially hypothetical. Lake Suigetsu provides a solution to this problem. The Lake Suigetsu chronology is expected to be coherent with the speleothems' U-Th age scale. Suigetsu's semi-continuous radiocarbon dataset, which constitutes a major component of the IntCal13 radiocarbon calibration model, also provides an opportunity to correlate Lake Suigetsu with the Greenland and Antarctic ice cores using cosmogenic isotopes as the correlation key. Results of the correlation and timing comparison, which cast new light on the mechanism of monsoon change, will be presented.

  14. Pixel-level multisensor image fusion based on matrix completion and robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Wang, Zhuozheng; Deller, J. R.; Fleet, Blair D.

    2016-01-01

    Acquired digital images are often corrupted by a lack of camera focus, faulty illumination, or missing data. An algorithm is presented for fusion of multiple corrupted images of a scene using the lifting wavelet transform. The method employs adaptive fusion arithmetic based on matrix completion and self-adaptive regional variance estimation. Characteristics of the wavelet coefficients are used to adaptively select fusion rules. Robust principal component analysis is applied to low-frequency image components, and regional variance estimation is applied to high-frequency components. Experiments reveal that the method is effective for multifocus, visible-light, and infrared image fusion. Compared with traditional algorithms, the new algorithm not only increases the amount of preserved information and clarity but also improves robustness.

  15. Fundamental Studies of Droplet Interactions in Dense Sprays

    DTIC Science & Technology

    1992-12-31

    Correlations for the drag coefficients, Nusselt numbers, and Sherwood numbers for hydrocarbon fuel droplets in dense sprays were obtained. The drag coefficient, lift coefficient, moment coefficient, Nusselt number, Sherwood number, and vaporization rates are different from those of an…

  16. Evaluation of candidate geomagnetic field models for IGRF-11

    NASA Astrophysics Data System (ADS)

    Finlay, C. C.; Maus, S.; Beggan, C. D.; Hamoudi, M.; Lowes, F. J.; Olsen, N.; Thébault, E.

    2010-10-01

    The eleventh generation of the International Geomagnetic Reference Field (IGRF) was agreed in December 2009 by a task force appointed by the International Association of Geomagnetism and Aeronomy (IAGA) Division V Working Group V-MOD. New spherical harmonic main field models for epochs 2005.0 (DGRF-2005) and 2010.0 (IGRF-2010), and predictive linear secular variation for the interval 2010.0-2015.0 (SV-2010-2015) were derived from weighted averages of candidate models submitted by teams led by DTU Space, Denmark (team A); NOAA/NGDC, U.S.A. (team B); BGS, U.K. (team C); IZMIRAN, Russia (team D); EOST, France (team E); IPGP, France (team F); GFZ, Germany (team G) and NASA-GSFC, U.S.A. (team H). Here, we report the evaluations of candidate models carried out by the IGRF-11 task force during October/November 2009 and describe the weightings used to derive the new IGRF-11 model. The evaluations include calculations of root mean square vector field differences between the candidates, comparisons of the power spectra, and degree correlations between the candidates and a mean model. Coefficient by coefficient analysis including determination of weighting factors used in a robust estimation of mean coefficients is also reported. Maps of differences in the vertical field intensity at Earth's surface between the candidates and weighted mean models are presented. Candidates with anomalous aspects are identified and efforts made to pinpoint both troublesome coefficients and geographical regions where large variations between candidates originate. A retrospective analysis of IGRF-10 main field candidates for epoch 2005.0 and predictive secular variation candidates for 2005.0-2010.0 using the new IGRF-11 models as a reference is also reported. The high quality and consistency of main field models derived using vector satellite data is demonstrated; based on internal consistency DGRF-2005 has a formal root mean square vector field error over Earth's surface of 1.0 nT. Difficulties nevertheless remain in accurately forecasting field evolution only five years into the future.

  17. Comparing Chemistry to Outcome: The Development of a Chemical Distance Metric, Coupled with Clustering and Hierarchal Visualization Applied to Macromolecular Crystallography

    PubMed Central

    Bruno, Andrew E.; Ruby, Amanda M.; Luft, Joseph R.; Grant, Thomas D.; Seetharaman, Jayaraman; Montelione, Gaetano T.; Hunt, John F.; Snell, Edward H.

    2014-01-01

    Many bioscience fields employ high-throughput methods to screen multiple biochemical conditions. The analysis of these becomes tedious without a degree of automation. Crystallization, a rate-limiting step in biological X-ray crystallography, is one of these fields. Screening of multiple potential crystallization conditions (cocktails) is the most effective method of probing a protein's phase diagram and guiding crystallization, but the interpretation of results can be time-consuming. To aid this empirical approach, a cocktail distance coefficient was developed to quantitatively compare macromolecule crystallization conditions and outcomes. These coefficients were evaluated against an existing similarity metric developed for crystallization, the C6 metric, using both virtual crystallization screens and a comparison of two related 1,536-cocktail high-throughput crystallization screens. Hierarchical clustering was employed to visualize one of these screens, and the crystallization results from an exopolyphosphatase-related protein from Bacteroides fragilis (BfR192) were overlaid on this clustering. This demonstrated a strong correlation between certain chemically related clusters and crystal lead conditions. While this analysis was not used to guide the initial crystallization optimization, it led to the re-evaluation of unexplained peaks in the electron density map of the protein and to the insertion and correct placement of sodium, potassium and phosphate atoms in the structure. With these in place, the resulting structure of the putative active site demonstrated features consistent with the active sites of other phosphatases involved in binding the phosphoryl moieties of nucleotide triphosphates. The new distance coefficient, CDcoeff, appears to be robust in this application and, coupled with hierarchical clustering and the overlay of crystallization outcome, reveals information of biological relevance. While tested with a single example, the potential applications related to crystallography appear promising, and the distance coefficient, clustering, and hierarchical visualization of results undoubtedly have applications in wider fields. PMID:24971458

  18. Interspike interval correlation in a stochastic exponential integrate-and-fire model with subthreshold and spike-triggered adaptation.

    PubMed

    Shiau, LieJune; Schwalger, Tilo; Lindner, Benjamin

    2015-06-01

    We study the spike statistics of an adaptive exponential integrate-and-fire neuron stimulated by white Gaussian current noise. We derive analytical approximations for the coefficient of variation and the serial correlation coefficient of the interspike interval assuming that the neuron operates in the mean-driven tonic firing regime and that the stochastic input is weak. Our result for the serial correlation coefficient has the form of a geometric sequence and is confirmed by the comparison to numerical simulations. The theory predicts various patterns of interval correlations (positive or negative at lag one, monotonically decreasing or oscillating) depending on the strength of the spike-triggered and subthreshold components of the adaptation current. In particular, for pure subthreshold adaptation we find strong positive ISI correlations that are usually ascribed to positive correlations in the input current. Our results i) provide an alternative explanation for interspike-interval correlations observed in vivo, ii) may be useful in fitting point neuron models to experimental data, and iii) may be instrumental in exploring the role of adaptation currents for signal detection and signal transmission in single neurons.
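
    The two interval statistics analyzed above, the coefficient of variation and the serial correlation coefficient at lag k, can be estimated from a measured interspike-interval sequence with the standard estimators sketched below (variable names and the toy data are illustrative).

      # Sketch: CV and lag-k serial correlation coefficient of an ISI sequence.
      import numpy as np

      def cv(isi):
          return isi.std() / isi.mean()

      def serial_corr(isi, k=1):
          return np.corrcoef(isi[:-k], isi[k:])[0, 1]

      # Toy ISI sequence with a weak lag-1 dependence.
      rng = np.random.default_rng(1)
      isi = 1.0 + 0.1 * rng.standard_normal(10000)
      isi[1:] -= 0.05 * (isi[:-1] - 1.0)
      print(cv(isi), serial_corr(isi, 1))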

  19. Sectoral risk research about input-output structure of the United States

    NASA Astrophysics Data System (ADS)

    Zhang, Mao

    2018-02-01

    Research on economic risk at the sectoral level is rare, even though it is significantly important for risk early warning. This paper employs a status coefficient to measure the symmetry of an economic subnetwork, which is negatively correlated with sectoral risk. We then conduct empirical research in both the cross-sectional and time-series dimensions. In the cross-sectional dimension, we study the correlations between the sectoral status coefficient and sectoral volatility, earnings rate, and Sharpe ratio in the year 2015. From the time-series perspective, we first investigate how the correlation between the sectoral status coefficient and annual total output changes from 1997 to 2015. We then divide the 71 sectors in America into agriculture, manufacturing, services and government, compare the trend terms of the average sectoral status coefficients of the four industries, and discuss the causes behind them. We also find a clear anomaly in the housing sector. Finally, this paper puts forward some suggestions for the federal government.

  20. Correlation Between the Field Line and Particle Diffusion Coefficients in the Stochastic Fields of a Tokamak

    NASA Astrophysics Data System (ADS)

    Calvin, Mark; Punjabi, Alkesh

    1996-11-01

    We use the method of quasi-magnetic surfaces to calculate the correlation between the field line and particle diffusion coefficients. The magnetic topology of a tokamak is perturbed by a spectrum of neighboring resonant resistive modes. The Hamiltonian equations of motion for the field line are integrated numerically. Poincaré plots of the quasi-magnetic surfaces are generated initially and after the field line has traversed a considerable distance. From the areas of the quasi-magnetic surfaces and the field line distance, we estimate the field line diffusion coefficient. We start plasma particles on the initial quasi-surface, and calculate the particle diffusion coefficient from our Monte Carlo method (Punjabi A., Boozer A., Lam M., Kim H. and Burke K., J. Plasma Phys. 44, 405 (1990)). We then estimate the correlation between the particle and field diffusion as the strength of the resistive modes is varied.

  1. Urdu translation of the Hamilton Rating Scale for Depression: Results of a validation study

    PubMed Central

    Hashmi, Ali M.; Naz, Shahana; Asif, Aftab; Khawaja, Imran S.

    2016-01-01

    Objective: To develop a standardized validated version of the Hamilton Rating Scale for Depression (HAM-D) in Urdu. Methods: After translation of the HAM-D into the Urdu language following standard guidelines, the final Urdu version (HAM-D-U) was administered to 160 depressed outpatients. Inter-item correlation was assessed by calculating Cronbach's alpha. Correlation between HAM-D-U scores at baseline and after a 2-week interval was evaluated for test-retest reliability. Moreover, the scores of two clinicians on the HAM-D-U were compared for inter-rater reliability. For establishing concurrent validity, scores on the HAM-D-U and BDI-U were compared using the Spearman correlation coefficient. The study was conducted at Mayo Hospital, Lahore, from May to December 2014. Results: The Cronbach's alpha for the HAM-D-U was 0.71. Composite scores for the HAM-D-U at baseline and after a 2-week interval were highly correlated with each other (Spearman correlation coefficient 0.83, p-value < 0.01), indicating good test-retest reliability. Composite scores for the HAM-D-U and BDI-U were positively correlated with each other (Spearman correlation coefficient 0.85, p < 0.01), indicating good concurrent validity. Scores of the two clinicians for the HAM-D-U were also positively correlated (Spearman correlation coefficient 0.82, p-value < 0.01), indicating good inter-rater reliability. Conclusion: The HAM-D-U is a valid and reliable instrument for the assessment of depression. It shows good inter-rater and test-retest reliability. The HAM-D-U can be used as a tool for either clinical management or research. PMID:28083049

  2. Urdu translation of the Hamilton Rating Scale for Depression: Results of a validation study.

    PubMed

    Hashmi, Ali M; Naz, Shahana; Asif, Aftab; Khawaja, Imran S

    2016-01-01

    To develop a standardized validated version of the Hamilton Rating Scale for Depression (HAM-D) in Urdu. After translation of the HAM-D into the Urdu language following standard guidelines, the final Urdu version (HAM-D-U) was administered to 160 depressed outpatients. Inter-item correlation was assessed by calculating Cronbach's alpha. Correlation between HAM-D-U scores at baseline and after a 2-week interval was evaluated for test-retest reliability. Moreover, the scores of two clinicians on the HAM-D-U were compared for inter-rater reliability. For establishing concurrent validity, scores on the HAM-D-U and BDI-U were compared using the Spearman correlation coefficient. The study was conducted at Mayo Hospital, Lahore, from May to December 2014. The Cronbach's alpha for the HAM-D-U was 0.71. Composite scores for the HAM-D-U at baseline and after a 2-week interval were highly correlated with each other (Spearman correlation coefficient 0.83, p-value < 0.01), indicating good test-retest reliability. Composite scores for the HAM-D-U and BDI-U were positively correlated with each other (Spearman correlation coefficient 0.85, p < 0.01), indicating good concurrent validity. Scores of the two clinicians for the HAM-D-U were also positively correlated (Spearman correlation coefficient 0.82, p-value < 0.01), indicating good inter-rater reliability. The HAM-D-U is a valid and reliable instrument for the assessment of depression. It shows good inter-rater and test-retest reliability. The HAM-D-U can be used as a tool for either clinical management or research.

  3. Equity trees and graphs via information theory

    NASA Astrophysics Data System (ADS)

    Harré, M.; Bossomaier, T.

    2010-01-01

    We investigate the similarities and differences between two measures of the relationship between equities traded in financial markets. Our measures are the correlation coefficients and the mutual information. In the context of financial markets correlation coefficients are well established whereas mutual information has not previously been as well studied despite its theoretically appealing properties. We show that asset trees which are derived from either the correlation coefficients or the mutual information have a mixture of both similarities and differences at the individual equity level and at the macroscopic level. We then extend our consideration from trees to graphs using the "genus 0" condition recently introduced in order to study the networks of equities.
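
    A minimal way to compare the two relationship measures discussed above, Pearson correlation and mutual information, for a pair of return series is sketched below; the histogram-based MI estimate, bin count, and synthetic data are illustrative assumptions rather than the authors' procedure.

      # Sketch: Pearson correlation vs. histogram-based mutual information
      # for two synthetic return series.
      import numpy as np
      from scipy.stats import pearsonr
      from sklearn.metrics import mutual_info_score

      def mi_hist(x, y, bins=16):
          cx = np.digitize(x, np.histogram_bin_edges(x, bins))
          cy = np.digitize(y, np.histogram_bin_edges(y, bins))
          return mutual_info_score(cx, cy)   # in nats

      rng = np.random.default_rng(2)
      r1 = rng.standard_normal(2000)
      r2 = 0.6 * r1 + 0.8 * rng.standard_normal(2000)   # correlated "returns"
      print(pearsonr(r1, r2)[0], mi_hist(r1, r2))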

  4. Experimental determination of turbulence in a GH2-GOX rocket combustion chamber

    NASA Technical Reports Server (NTRS)

    Tou, P.; Russell, R.; Ohara, J.

    1974-01-01

    The intensity of turbulence and the Lagrangian correlation coefficient for a gaseous rocket combustion chamber have been determined from the experimental measurements of the tracer gas diffusion. A combination of Taylor's turbulent diffusion theory and Spalding's numerical method for solving the conservation equations of fluid mechanics was used to calculate these quantities. Taylor's theory was extended to consider the inhomogeneity of the turbulence field in the axial direction of the combustion chamber. An exponential function was used to represent the Lagrangian correlation coefficient. The results indicate that the maximum value of the intensity of turbulence is about 15% and the Lagrangian correlation coefficient drops to about 0.12 in one inch of the chamber length.

  5. Investigation of two-phase heat transfer coefficients of argon-freon cryogenic mixed refrigerants

    NASA Astrophysics Data System (ADS)

    Baek, Seungwhan; Lee, Cheonkyu; Jeong, Sangkwon

    2014-11-01

    Mixed refrigerant Joule-Thomson refrigerators are widely used in various kinds of cryogenic systems. Although estimation of the heat transfer coefficient for a multi-phase, multi-component fluid in the cryogenic temperature range is required in the heat exchanger design of mixed refrigerant Joule-Thomson refrigerators, it has rarely been discussed so far. In this paper, condensation and evaporation heat transfer coefficients of an argon-freon mixed refrigerant are measured in a microchannel heat exchanger. A Printed Circuit Heat Exchanger (PCHE) with a 340 μm hydraulic diameter was developed as a compact microchannel heat exchanger and utilized in the experiment. Several two-phase heat transfer coefficient correlations are examined to discuss the experimental measurement results. The results of this paper show that cryogenic two-phase mixed refrigerant heat transfer coefficients can be estimated by conventional two-phase heat transfer coefficient correlations.

  6. Commander and User Perceptions of the Army’s Intransit Visibility (ITV) Architecture

    DTIC Science & Technology

    2007-03-01

    …covariance matrix; (c) Bartlett's test of sphericity; and (d) the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy. The inter-item correlation matrix … and all diagonal terms had a value of 1 while off-diagonal terms were 0. The KMO measure of sampling adequacy reflects the homogeneity amongst the variables and serves as an index for comparing the magnitudes of correlation coefficients to partial correlation coefficients. KMO values at…

  7. Health Service Quality Scale: Brazilian Portuguese translation, reliability and validity.

    PubMed

    Rocha, Luiz Roberto Martins; Veiga, Daniela Francescato; e Oliveira, Paulo Rocha; Song, Elaine Horibe; Ferreira, Lydia Masako

    2013-01-17

    The Health Service Quality Scale is a multidimensional hierarchical scale that is based on an interdisciplinary approach. This instrument was specifically created for measuring health service quality based on marketing and health care concepts. The aim of this study was to translate and culturally adapt the Health Service Quality Scale into Brazilian Portuguese and to assess the validity and reliability of the Brazilian Portuguese version of the instrument. We conducted a cross-sectional, observational study with public health system patients in a Brazilian university hospital. Validity was assessed using Pearson's correlation coefficient to measure the strength of the association between the Brazilian Portuguese version of the instrument and the SERVQUAL scale. Internal consistency was evaluated using Cronbach's alpha coefficient; the intraclass correlation coefficient (ICC) and Pearson's correlation coefficient were used for test-retest reliability. One hundred and sixteen consecutive postoperative patients completed the questionnaire. Pearson's correlation coefficient for validity was 0.20. Cronbach's alpha for the first and second administrations of the final version of the instrument was 0.982 and 0.986, respectively. For test-retest reliability, Pearson's correlation coefficient was 0.89 and the ICC was 0.90. The culturally adapted, Brazilian Portuguese version of the Health Service Quality Scale is a valid and reliable instrument to measure health service quality.
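
    Two of the statistics used in this validation, Cronbach's alpha for internal consistency and Pearson's correlation for test-retest reliability, can be computed as in the sketch below (the data layout, rows = respondents and columns = items, and the simulated scores are assumptions; the ICC used in the study is omitted here).

      # Sketch: Cronbach's alpha and Pearson test-retest correlation.
      import numpy as np
      from scipy.stats import pearsonr

      def cronbach_alpha(items):
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_var / total_var)

      rng = np.random.default_rng(3)
      latent = rng.normal(size=(116, 1))                  # one underlying trait
      items = latent + 0.7 * rng.normal(size=(116, 10))   # 10 noisy items
      t1 = items.sum(axis=1)
      t2 = t1 + rng.normal(scale=1.0, size=116)           # simulated retest
      print(cronbach_alpha(items), pearsonr(t1, t2)[0])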

  8. Correlation time and diffusion coefficient imaging: application to a granular flow system.

    PubMed

    Caprihan, A; Seymour, J D

    2000-05-01

    A parametric method for spatially resolved measurements of velocity autocorrelation functions, R_u(τ) = ⟨u(t)u(t + τ)⟩, expressed as a sum of exponentials, is presented. The method is applied to a granular flow system of 2-mm oil-filled spheres rotated in a half-filled horizontal cylinder, which is an Ornstein-Uhlenbeck process with velocity autocorrelation function R_u(τ) = ⟨u²⟩ exp(−|τ|/τ_c), where τ_c is the correlation time and D = ⟨u²⟩ τ_c is the diffusion coefficient. The pulsed-field-gradient NMR method consists of applying three different gradient pulse sequences of varying motion sensitivity to distinguish the range of correlation times present for particle motion. Time-dependent apparent diffusion coefficients are measured for these three sequences, and τ_c and D are then calculated from the apparent diffusion coefficient images. For the cylinder rotation rate of 2.3 rad/s, the axial diffusion coefficient at the top center of the free surface was 5.5 × 10⁻⁶ m²/s, the correlation time was 3 ms, and the velocity fluctuation, or granular temperature, ⟨u²⟩ was 1.8 × 10⁻³ m²/s². This method is also applicable to the study of transport in systems involving turbulence and porous media flows. Copyright 2000 Academic Press.
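
    As a quick arithmetic check on the reported values, the Ornstein-Uhlenbeck relation D = ⟨u²⟩τ_c quoted above reproduces the measured diffusion coefficient to within rounding.

      # Check: D = <u^2> * tau_c for the reported values.
      u2 = 1.8e-3        # granular temperature, m^2/s^2
      tau_c = 3.0e-3     # correlation time, s
      print(u2 * tau_c)  # 5.4e-6 m^2/s, vs. the reported 5.5e-6 m^2/s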

  9. Robust alignment of chromatograms by statistically analyzing the shifts matrix generated by moving window fast Fourier transform cross-correlation.

    PubMed

    Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian

    2015-03-01

    Retention time shift is one of the most challenging problems in the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure. The refined shift of each scan point can be obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shifts robustly if a proper window size has been selected. The window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results for a gas chromatography mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
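
    The core operation of the alignment method described above, estimating the retention-time shift between two chromatogram segments by FFT cross-correlation, is sketched below for a single window; the full method's moving-window scan and mode-based refinement of the shifts matrix are not reproduced here.

      # Sketch: estimate the shift between two signals via FFT cross-correlation.
      import numpy as np

      def fft_shift_estimate(ref, seg):
          # Circular cross-correlation; the peak position gives the lag of seg
          # relative to ref (adequate for shifts much smaller than the window).
          n = len(ref)
          cc = np.fft.ifft(np.fft.fft(seg) * np.conj(np.fft.fft(ref))).real
          lag = int(np.argmax(cc))
          return lag if lag <= n // 2 else lag - n

      t = np.linspace(0.0, 1.0, 512)
      ref = np.exp(-((t - 0.5) / 0.02) ** 2)   # Gaussian "peak"
      seg = np.roll(ref, 7)                    # same peak, shifted by 7 points
      print(fft_shift_estimate(ref, seg))      # 7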

  10. Measuring monotony in two-dimensional samples

    NASA Astrophysics Data System (ADS)

    Kachapova, Farida; Kachapov, Ilias

    2010-04-01

    This note introduces a monotony coefficient as a new measure of the monotone dependence in a two-dimensional sample. Some properties of this measure are derived. In particular, it is shown that the absolute value of the monotony coefficient for a two-dimensional sample is between |r| and 1, where r is the Pearson's correlation coefficient for the sample; that the monotony coefficient equals 1 for any monotone increasing sample and equals -1 for any monotone decreasing sample. This article contains a few examples demonstrating that the monotony coefficient is a more accurate measure of the degree of monotone dependence for a non-linear relationship than the Pearson's, Spearman's and Kendall's correlation coefficients. The monotony coefficient is a tool that can be applied to samples in order to find dependencies between random variables; it is especially useful in finding couples of dependent variables in a big dataset of many variables. Undergraduate students in mathematics and science would benefit from learning and applying this measure of monotone dependence.
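
    To illustrate the point made above about monotone but nonlinear dependence, the three classical coefficients can be compared on a monotone increasing, strongly nonlinear sample; the monotony coefficient itself is not defined in this abstract and is therefore not implemented here.

      # Sketch: Pearson, Spearman, and Kendall coefficients on a monotone
      # increasing but strongly nonlinear sample.
      import numpy as np
      from scipy.stats import pearsonr, spearmanr, kendalltau

      x = np.linspace(0.0, 1.0, 200)
      y = np.exp(8.0 * x)            # monotone increasing, far from linear
      print(pearsonr(x, y)[0])       # noticeably below 1
      print(spearmanr(x, y)[0])      # 1.0 (rank-based)
      print(kendalltau(x, y)[0])     # 1.0 (rank-based)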

  11. What to Do With "Moderate" Reliability and Validity Coefficients?

    PubMed

    Post, Marcel W

    2016-07-01

    Clinimetric studies may use criteria for test-retest reliability and convergent validity such that correlation coefficients as low as .40 are considered supportive of reliability and validity. It can be argued that moderate (.40-.60) correlations should not be interpreted in this way and that reliability coefficients <.70 should be considered as indicative of unreliability. Convergent validity coefficients in the .40 to .60 or .40 to .70 range should be considered as indications of validity problems, or as inconclusive at best. Studies on reliability and convergent validity should be designed in such a way that it is realistic to expect high reliability and validity coefficients. Multitrait-multimethod approaches are preferred for studying construct (convergent-divergent) validity. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  12. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    PubMed

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, has become increasingly popular in recent years. An attractive feature of multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require knowledge of the within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al. proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (i.e., when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show that a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of the individual pooled estimates themselves, the standard variance estimator and the robust variance estimator give results similar to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (e.g., m ≥ 50). When the sample size is relatively small, we recommend the use of the robust method under the working independence assumption. We illustrate the proposed method through 2 meta-analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Confidence Intervals for Squared Semipartial Correlation Coefficients: The Effect of Nonnormality

    ERIC Educational Resources Information Center

    Algina, James; Keselman, H. J.; Penfield, Randall D.

    2010-01-01

    The increase in the squared multiple correlation coefficient ([delta]R[superscript 2]) associated with a variable in a regression equation is a commonly used measure of importance in regression analysis. Algina, Keselman, and Penfield found that intervals based on asymptotic principles were typically very inaccurate, even though the sample size…

  14. Accuracy of Range Restriction Correction with Multiple Imputation in Small and Moderate Samples: A Simulation Study

    ERIC Educational Resources Information Center

    Pfaffel, Andreas; Spiel, Christiane

    2016-01-01

    Approaches to correcting correlation coefficients for range restriction have been developed under the framework of large sample theory. The accuracy of missing data techniques for correcting correlation coefficients for range restriction has thus far only been investigated with relatively large samples. However, researchers and evaluators are…

  15. Interrater Reliability Estimators Commonly Used in Scoring Language Assessments: A Monte Carlo Investigation of Estimator Accuracy

    ERIC Educational Resources Information Center

    Morgan, Grant B.; Zhu, Min; Johnson, Robert L.; Hodge, Kari J.

    2014-01-01

    Common estimators of interrater reliability include Pearson product-moment correlation coefficients, Spearman rank-order correlations, and the generalizability coefficient. The purpose of this study was to examine the accuracy of estimators of interrater reliability when varying the true reliability, number of scale categories, and number of…

  16. Correlation Revelation: The Search for Meaning in Pearson's Coefficient

    ERIC Educational Resources Information Center

    Huhn, Craig

    2016-01-01

    When the author was first charged with getting a group of students to understand the correlation coefficient, he did not anticipate the topic would challenge his own understanding, let alone cause him to eventually question the very nature of mathematics itself. On the surface, the idea seemed straightforward, one that millions of students across…

  17. Intraclass Correlation Coefficients in Hierarchical Designs: Evaluation Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2011-01-01

    Interval estimation of intraclass correlation coefficients in hierarchical designs is discussed within a latent variable modeling framework. A method accomplishing this aim is outlined, which is applicable in two-level studies where participants (or generally lower-order units) are clustered within higher-order units. The procedure can also be…

  18. Relationships of the phase velocity with the microarchitectural parameters in bovine trabecular bone in vitro: Application of a stratified model

    NASA Astrophysics Data System (ADS)

    Lee, Kang Il

    2012-08-01

    The present study aims to provide insight into the relationships of the phase velocity with the microarchitectural parameters in bovine trabecular bone in vitro. The frequency-dependent phase velocity was measured in 22 bovine femoral trabecular bone samples by using a pair of transducers with a diameter of 25.4 mm and a center frequency of 0.5 MHz. The phase velocity exhibited positive correlation coefficients of 0.48 and 0.32 with the ratio of bone volume to total volume and the trabecular thickness, respectively, but a negative correlation coefficient of -0.62 with the trabecular separation. The best univariate predictor of the phase velocity was the trabecular separation, yielding an adjusted squared correlation coefficient of 0.36. The multivariate regression models yielded adjusted squared correlation coefficients of 0.21-0.36. The theoretical phase velocity predicted by using a stratified model for wave propagation in periodically stratified media consisting of alternating parallel solid-fluid layers showed reasonable agreements with the experimental measurements.

  19. Symbolic joint entropy reveals the coupling of various brain regions

    NASA Astrophysics Data System (ADS)

    Ma, Xiaofei; Huang, Xiaolin; Du, Sidan; Liu, Hongxing; Ning, Xinbao

    2018-01-01

    The convergence and divergence of oscillatory behavior in different brain regions are very important for information processing. Measures of coupling or correlation are very useful for studying differences in brain activity. In this study, EEG signals were collected from ten subjects under two conditions, i.e., an eyes-closed state and an idle eyes-open state. We propose a nonlinear algorithm, symbolic joint entropy, to compare the coupling strength among the frontal, temporal, parietal and occipital lobes and between the two states. Instead of decomposing the EEG into different frequency bands (theta, alpha, beta, gamma, etc.), the algorithm investigates the coupling across the entire spectrum of brain wave activity above 4 Hz. The coupling coefficients in the two states are compared for different time delay steps, and group statistics are presented as well. We find that the coupling coefficient of the eyes-open state with delay is consistently lower than that of the eyes-closed state across the group, except for one subject, whereas the results without delay are not consistent. The differences between the two brain states at non-zero delay can better reveal the intrinsic inter-region coupling. We also use the well-known Hénon map data to validate the algorithm proposed in this paper. The result shows that the method is robust and has great potential for other physiologic time series.
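
    As a generic illustration of a symbolic joint-entropy style coupling measure (the authors' symbolization scheme and normalization are not given in this abstract, so the median-threshold symbols and the simple delay used below are assumptions), a pair of signals can be symbolized and their joint Shannon entropy computed as follows.

      # Sketch: joint Shannon entropy of a symbolized signal pair with a delay.
      import numpy as np

      def symbolic_joint_entropy(x, y, delay=0):
          sx = (x > np.median(x)).astype(int)   # median-threshold symbols
          sy = (y > np.median(y)).astype(int)
          if delay > 0:
              sx, sy = sx[:-delay], sy[delay:]
          joint = 2 * sx + sy                   # four possible symbol pairs
          p = np.bincount(joint, minlength=4) / len(joint)
          p = p[p > 0]
          return -(p * np.log2(p)).sum()        # bits; lower = stronger coupling

      rng = np.random.default_rng(4)
      a = rng.standard_normal(5000)
      b = 0.7 * a + 0.3 * rng.standard_normal(5000)   # coupled channel
      c = rng.standard_normal(5000)                   # independent channel
      print(symbolic_joint_entropy(a, b), symbolic_joint_entropy(a, c))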

  20. Global industrial impact coefficient based on random walk process and inter-country input-output table

    NASA Astrophysics Data System (ADS)

    Xing, Lizhi; Dong, Xianlei; Guan, Jun

    2017-04-01

    The input-output table is a comprehensive and detailed description of a national economic system, containing supply and demand information among industrial sectors. Complex network theory, which provides methods for measuring the structure of complex systems, can describe the internal structure of the research object through structural indicators of the social and economic system, revealing the complex relationship between the inner hierarchy and external economic function. This paper builds GIVCN-WIOT models based on the World Input-Output Database in order to depict the topological structure of the Global Value Chain (GVC), and assumes that the competitive advantage of a nation is equal to the overall performance of its domestic sectors' impact on the GVC. From the perspective of econophysics, the Global Industrial Impact Coefficient (GIIC) is proposed to measure national competitiveness in gaining information superiority and intermediate interests. Analysis of the GIVCN-WIOT models yields several insights, including the following: (1) sectors with higher Random Walk Centrality contribute more to transmitting value streams within the global economic system; (2) the Half-Value Ratio can be used to measure the robustness of an open macroeconomy in the process of globalization; and (3) the positive correlation between GIIC and GDP indicates that one country's global industrial impact could reveal its international competitive advantage.
