Sample records for multiscale permutation entropy

  1. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2017-07-01

    Microseismic monitoring is an effective means of providing early warning of rock or coal dynamic disasters, and its first step is microseismic event detection; however, low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper combines multi-scale permutation entropy with a support vector machine to detect low-SNR microseismic events. First, a signal feature extraction method based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the permutation entropy of the signal. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by computing the multi-scale permutation entropy of the collected vibration signals and constructing a feature vector set. Finally, a comparative analysis of microseismic events and noise signals in the experiment shows that the differing characteristics of the two can be fully expressed by multi-scale permutation entropy. The detection model, which combines these features with the support vector machine and offers high classification accuracy and fast computation, can meet the requirements of online, real-time extraction of microseismic events.
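
    The pipeline described above (per-scale permutation entropies as a feature vector, classified by a support vector machine) can be sketched in a few lines. The following is a minimal illustration, not the authors' code: the least-squares SVM of the paper is approximated by scikit-learn's standard SVC, and the variable names (`signals`, `labels`) are hypothetical.

    ```python
    import math
    from itertools import permutations

    import numpy as np
    from sklearn.svm import SVC

    def permutation_entropy(x, m=3, delay=1):
        """Normalized Bandt-Pompe permutation entropy of a 1-D signal."""
        counts = {p: 0 for p in permutations(range(m))}
        for i in range(len(x) - (m - 1) * delay):
            pattern = tuple(int(k) for k in np.argsort(x[i:i + m * delay:delay]))
            counts[pattern] += 1
        p = np.array([c for c in counts.values() if c > 0], dtype=float)
        p /= p.sum()
        return -np.sum(p * np.log(p)) / math.log(math.factorial(m))

    def coarse_grain(x, scale):
        """Non-overlapping means of length `scale` (the usual coarse-graining)."""
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def mpe_features(x, m=3, scales=range(1, 11)):
        """Feature vector: permutation entropy of the coarse-grained signal at each scale."""
        x = np.asarray(x, dtype=float)
        return np.array([permutation_entropy(coarse_grain(x, s), m) for s in scales])

    # Hypothetical usage: `signals` is a list of vibration windows and `labels`
    # marks microseismic events (1) versus noise (0).
    # X = np.vstack([mpe_features(sig) for sig in signals])
    # clf = SVC(kernel="rbf").fit(X, labels)
    ```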

  2. Multi-scale symbolic transfer entropy analysis of EEG

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-10-01

    From both global and local perspectives, we symbolize two kinds of EEG and analyze their dynamic and asymmetric information using multi-scale transfer entropy. A multi-scale procedure with scale factors from 1 to 199 in steps of 2 is applied to the EEG of healthy people and epileptic patients, and the sequences are then symbolized either by permutation with an embedding dimension of 3 or by a global approach. The forward and reverse symbol sequences are taken as the inputs of transfer entropy. The scale-factor intervals over which the two kinds of EEG show satisfactory entropy distinctions are (37, 57) for the permutation symbolization and (65, 85) for the global one. The transfer entropies of the healthy and epileptic subjects differ most under permutation symbolization at a scale factor of 67, with values of 0.1137 and 0.1028, while the corresponding values for the global symbolization are 0.0641 and 0.0601 at a scale factor of 165. The results show that permutation symbolization, which incorporates local information, yields better discrimination and is more effective in our multi-scale transfer entropy analysis of EEG.
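
    As a rough illustration of the symbolization and transfer entropy steps described above, the sketch below maps a series to ordinal patterns (embedding dimension 3) and then estimates transfer entropy between two symbol sequences by plug-in counting. It is a simplified assumption-laden sketch, not the authors' implementation; the coarse-graining and the global symbolization scheme of the paper are omitted.

    ```python
    import math
    from collections import Counter

    import numpy as np

    def ordinal_symbols(x, m=3, delay=1):
        """Symbolize a series by its ordinal (permutation) patterns of order m."""
        return [tuple(int(k) for k in np.argsort(x[i:i + m * delay:delay]))
                for i in range(len(x) - (m - 1) * delay)]

    def transfer_entropy(sym_x, sym_y):
        """Plug-in transfer entropy from symbol sequence X to symbol sequence Y:
        sum of p(y_{t+1}, y_t, x_t) * log[ p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t) ]."""
        n = min(len(sym_x), len(sym_y)) - 1
        triples = Counter((sym_y[t + 1], sym_y[t], sym_x[t]) for t in range(n))
        pairs_yx = Counter((sym_y[t], sym_x[t]) for t in range(n))
        pairs_yy = Counter((sym_y[t + 1], sym_y[t]) for t in range(n))
        singles_y = Counter(sym_y[t] for t in range(n))
        te = 0.0
        for (y1, y0, x0), c in triples.items():
            p_cond_joint = c / pairs_yx[(y0, x0)]
            p_cond_marg = pairs_yy[(y1, y0)] / singles_y[y0]
            te += (c / n) * math.log(p_cond_joint / p_cond_marg)
        return te

    # Hypothetical usage on two (optionally coarse-grained) EEG channels a and b:
    # te_ab = transfer_entropy(ordinal_symbols(a), ordinal_symbols(b))
    ```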

  3. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring

    PubMed Central

    Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Objective: Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Methods: Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among the MSPE measures. Results: CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of the EEG recordings and statistically distinguish the awake, unconsciousness and recovery of consciousness (RoC) states. Compared with single-scale SPE and RPE, the MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed at the loss of consciousness. Conclusions: MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis owing to their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales. PMID:27723803

  4. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    PubMed

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of their ability to track the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among the MSPE measures. CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of the EEG recordings and statistically distinguish the awake, unconsciousness and recovery of consciousness (RoC) states. Compared with single-scale SPE and RPE, the MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed at the loss of consciousness. MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis owing to their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.
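
    The two decomposition procedures compared in these records, coarse-graining (CG) and moving-average (MA) analysis, differ mainly in how many samples survive at each scale. A minimal sketch of both (my own illustration, not the authors' code):

    ```python
    import numpy as np

    def coarse_grain(x, scale):
        """CG procedure: non-overlapping means; the series shrinks to len(x)//scale points."""
        x = np.asarray(x, dtype=float)
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def moving_average(x, scale):
        """MA procedure: sliding mean; the series keeps len(x) - scale + 1 points."""
        return np.convolve(np.asarray(x, dtype=float), np.ones(scale) / scale, mode="valid")
    ```

    The SPE, RPE or TPE of the record would then be computed on the decomposed series at each scale; the much shorter CG series at large scales is one plausible reason why CG-based measures degrade there.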

  5. Weighted multiscale Rényi permutation entropy of nonlinear time series

    NASA Astrophysics Data System (ADS)

    Chen, Shijian; Shang, Pengjian; Wu, Yue

    2018-04-01

    In this paper, based on Rényi permutation entropy (RPE), which has recently been suggested as a relative measure of complexity in nonlinear systems, we propose multiscale Rényi permutation entropy (MRPE) and weighted multiscale Rényi permutation entropy (WMRPE) to quantify the complexity of nonlinear time series over multiple time scales. First, we apply MRPE and WMRPE to synthetic data and compare the modified methods with RPE, and the influence of parameter changes is discussed. We also explain why it is necessary to consider not only multiple scales but also weighting, which takes the amplitude into account. MRPE and WMRPE are then applied to the closing prices of financial stock markets from different areas. By observing the WMRPE curves and analyzing common statistics, the stock markets are divided into four groups: (1) DJI, S&P500, and HSI; (2) NASDAQ and FTSE100; (3) DAX40 and CAC40; and (4) ShangZheng and ShenCheng. The results show that the standard deviations of the weighted methods are smaller, indicating that WMRPE yields more robust results. Moreover, WMRPE can reveal rich dynamical properties of complex systems and help demonstrate their intrinsic mechanisms.
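
    A hedged sketch of a weighted Rényi permutation entropy is given below: ordinal patterns are weighted by the variance of their embedding vectors (a common weighting choice) and a Rényi entropy of order alpha is taken over the weighted pattern distribution. The exact definition and normalization used in the paper may differ.

    ```python
    import math

    import numpy as np

    def weighted_renyi_pe(x, m=3, delay=1, alpha=2.0):
        """Variance-weighted ordinal-pattern distribution, then a Rényi entropy of
        order alpha (alpha != 1), normalized by log(m!)."""
        x = np.asarray(x, dtype=float)
        weights = {}
        for i in range(len(x) - (m - 1) * delay):
            window = x[i:i + m * delay:delay]
            pattern = tuple(int(k) for k in np.argsort(window))
            weights[pattern] = weights.get(pattern, 0.0) + np.var(window)
        p = np.array(list(weights.values()))
        p = p[p > 0] / p.sum()
        renyi = math.log(np.sum(p ** alpha)) / (1.0 - alpha)
        return renyi / math.log(math.factorial(m))

    def coarse_grain(x, scale):
        n = len(x) // scale
        return np.asarray(x, dtype=float)[:n * scale].reshape(n, scale).mean(axis=1)

    # Hypothetical usage on a vector of closing prices `close`:
    # returns = np.diff(np.log(close))
    # wmrpe_curve = [weighted_renyi_pe(coarse_grain(returns, s)) for s in range(1, 21)]
    ```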

  6. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis

    PubMed Central

    Yasir, Muhammad Naveed; Koh, Bong-Hwan

    2018-01-01

    This paper presents local mean decomposition (LMD) integrated with multi-scale permutation entropy (MPE), referred to as LMD-MPE, for rolling element bearing (REB) fault diagnosis from measured vibration signals. First, LMD decomposes the vibration or acceleration measurements into separate product functions composed of both amplitude and frequency modulation. MPE then computes the permutation entropy of the product functions to extract nonlinear features for assessing and classifying the condition of healthy and damaged REB systems. Comparative experimental results of conventional LMD-based multi-scale entropy and MPE are presented to verify the effectiveness of the proposed technique. The study found that the integrated LMD-MPE approach provides reliable, damage-sensitive features when analyzing the bearing condition. The results on REB experimental datasets show that the proposed approach yields more robust outcomes than existing methods. PMID:29690526

  7. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis.

    PubMed

    Yasir, Muhammad Naveed; Koh, Bong-Hwan

    2018-04-21

    This paper presents local mean decomposition (LMD) integrated with multi-scale permutation entropy (MPE), referred to as LMD-MPE, for rolling element bearing (REB) fault diagnosis from measured vibration signals. First, LMD decomposes the vibration or acceleration measurements into separate product functions composed of both amplitude and frequency modulation. MPE then computes the permutation entropy of the product functions to extract nonlinear features for assessing and classifying the condition of healthy and damaged REB systems. Comparative experimental results of conventional LMD-based multi-scale entropy and MPE are presented to verify the effectiveness of the proposed technique. The study found that the integrated LMD-MPE approach provides reliable, damage-sensitive features when analyzing the bearing condition. The results on REB experimental datasets show that the proposed approach yields more robust outcomes than existing methods.

  8. Analysis of crude oil markets with improved multiscale weighted permutation entropy

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun; Liu, Cheng

    2018-03-01

    Entropy measures have recently been used extensively to study complexity in nonlinear systems. Weighted permutation entropy (WPE) overcomes the neglect of amplitude information in standard permutation entropy (PE) and shows a distinctive ability to extract complexity information from data with abrupt changes in magnitude. The improved (sometimes called composite) multi-scale (MS) method has the advantage of reducing errors and improving accuracy when used to evaluate the multiscale entropy of time series that are not sufficiently long. In this paper, we combine the merits of WPE and the improved MS procedure to propose the improved multiscale weighted permutation entropy (IMWPE) method for investigating the complexity of a time series. The method is validated on artificial data, white noise and 1/f noise, and on real market data for Brent and Daqing crude oil. The complexity properties of the crude oil markets are then explored for the return series, for volatility series with multiple exponents, and for the EEMD-produced intrinsic mode functions (IMFs) that represent different frequency components of the return series. Moreover, the instantaneous amplitude and frequency of Brent and Daqing crude oil are analyzed by applying the Hilbert transform to each IMF.
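
    The "improved" (composite) multiscale step mentioned above averages an entropy estimate over every possible coarse-graining offset at a given scale, which reduces the estimation error on short series. A minimal, generic sketch follows (my illustration; the paper's IMWPE pairs this step with weighted permutation entropy).

    ```python
    import numpy as np

    def composite_multiscale(x, scale, entropy_fn):
        """Composite (improved) multiscale value at `scale`: average entropy_fn over
        the `scale` coarse-grained series starting at offsets 0 .. scale-1."""
        x = np.asarray(x, dtype=float)
        values = []
        for offset in range(scale):
            seg = x[offset:]
            n = len(seg) // scale
            values.append(entropy_fn(seg[:n * scale].reshape(n, scale).mean(axis=1)))
        return float(np.mean(values))

    # Hypothetical usage with any single-scale estimator, e.g. the weighted_renyi_pe
    # sketch shown under record 5 above:
    # imwpe_curve = [composite_multiscale(returns, s, weighted_renyi_pe) for s in range(1, 21)]
    ```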

  9. Multiscale permutation entropy analysis of electrocardiogram

    NASA Astrophysics Data System (ADS)

    Liu, Tiebing; Yao, Wenpo; Wu, Min; Shi, Zhaorong; Wang, Jun; Ning, Xinbao

    2017-04-01

    To carry out a comprehensive nonlinear analysis of ECG, multiscale permutation entropy (MPE) was applied to ECG feature extraction. Three kinds of ECG from the PhysioNet database are used in this paper: congestive heart failure (CHF) patients, healthy young subjects and healthy elderly subjects. We set the embedding dimension to 4, adjust the scale factor from 2 to 100 with a step size of 2, and compare MPE with multiscale entropy (MSE). As the scale factor increases, the MPE complexity of the three ECG signals first decreases and then increases. When the scale factor is between 10 and 32, the complexities of the three ECG signals differ most: the entropy of the elderly subjects is on average 0.146 lower than that of the CHF patients and 0.025 higher than that of the healthy young subjects, in line with normal physiological characteristics. The results show that MPE can be applied effectively to nonlinear ECG analysis and can effectively distinguish different ECG signals.

  10. Generalized composite multiscale permutation entropy and Laplacian score based rolling bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Zheng, Jinde; Pan, Haiyang; Yang, Shubao; Cheng, Junsheng

    2018-01-01

    Multiscale permutation entropy (MPE) is a recently proposed nonlinear dynamic method for measuring the randomness and detecting nonlinear dynamic changes in time series, and it can be used effectively to extract nonlinear dynamic fault features from the vibration signals of rolling bearings. To address the drawback of the coarse-graining process in MPE, an improved MPE method called generalized composite multiscale permutation entropy (GCMPE) is proposed in this paper. The influence of the parameters on GCMPE and its comparison with MPE are studied by analyzing simulated data. GCMPE is applied to fault feature extraction from the vibration signals of rolling bearings, and then, based on GCMPE, the Laplacian score for feature selection and a particle swarm optimization based support vector machine, a new fault diagnosis method for rolling bearings is put forward. Finally, the proposed method is applied to experimental rolling bearing data. The analysis results show that the proposed method can effectively realize fault diagnosis of rolling bearings and has a higher fault recognition rate than existing methods.

  11. Refined composite multiscale weighted-permutation entropy of financial time series

    NASA Astrophysics Data System (ADS)

    Zhang, Yongping; Shang, Pengjian

    2018-04-01

    For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimated values fluctuate strongly under slight variations of the data locations and show a significant distinction only for time series of different lengths. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). Comparing the RCMWPE results with those of other methods on both synthetic data and financial time series shows that RCMWPE not only retains the advantages of MWPE but is also less sensitive to the data locations, more stable, and much less dependent on the length of the time series. Moreover, we present and discuss the results of the RCMWPE method on daily price return series from Asian and European stock markets. There are significant differences between the Asian and European markets, and the entropy values of the Hang Seng Index (HSI) are close to, but higher than, those of the European markets. The reliability of the proposed RCMWPE method is supported by simulations on generated and real data. It can be applied to a variety of fields to quantify the complexity of systems over multiple scales more accurately.
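
    The "refined composite" step differs from the plain composite step in that the ordinal-pattern distributions of all coarse-graining offsets are pooled first and a single entropy is taken afterwards, which avoids undefined or noisy per-offset estimates. A hedged sketch with variance-weighted patterns follows; the paper's exact weighting and normalization may differ.

    ```python
    import math

    import numpy as np

    def _weighted_ordinal_counts(x, m):
        """Ordinal-pattern histogram weighted by the variance of each embedding vector."""
        counts = {}
        for i in range(len(x) - m + 1):
            window = x[i:i + m]
            pattern = tuple(int(k) for k in np.argsort(window))
            counts[pattern] = counts.get(pattern, 0.0) + np.var(window)
        return counts

    def refined_composite_mwpe(x, scale, m=3):
        """Pool the weighted pattern distributions of all `scale` coarse-graining
        offsets, then compute one normalized Shannon entropy."""
        x = np.asarray(x, dtype=float)
        pooled = {}
        for offset in range(scale):
            seg = x[offset:]
            n = len(seg) // scale
            cg = seg[:n * scale].reshape(n, scale).mean(axis=1)
            for pattern, w in _weighted_ordinal_counts(cg, m).items():
                pooled[pattern] = pooled.get(pattern, 0.0) + w
        p = np.array(list(pooled.values()))
        p = p[p > 0] / p.sum()
        return -np.sum(p * np.log(p)) / math.log(math.factorial(m))
    ```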

  12. Classifying epileptic EEG signals with delay permutation entropy and Multi-Scale K-means.

    PubMed

    Zhu, Guohun; Li, Yan; Wen, Peng Paul; Wang, Shuaifang

    2015-01-01

    Most epileptic EEG classification algorithms are supervised and require large training datasets, which hinders their use in real-time applications. This chapter proposes an unsupervised Multi-Scale K-means (MSK-means) algorithm to distinguish epileptic EEG signals and identify epileptic zones. The random initialization of the K-means algorithm can lead to wrong clusters. Based on the characteristics of EEGs, the MSK-means algorithm initializes the coarse-scale centroid of a cluster with a suitable scale factor. In this chapter, the MSK-means algorithm is proved theoretically to be more efficient than the K-means algorithm. In addition, three classifiers, K-means, MSK-means and the support vector machine (SVM), are used to identify seizures and localize the epileptogenic zone using delay permutation entropy features. The experimental results demonstrate that identifying seizures with the MSK-means algorithm and delay permutation entropy achieves 4.7% higher accuracy than K-means and 0.7% higher accuracy than the SVM.

  13. Characterising Dynamic Instability in High Water-Cut Oil-Water Flows Using High-Resolution Microwave Sensor Signals

    NASA Astrophysics Data System (ADS)

    Liu, Weixin; Jin, Ningde; Han, Yunfeng; Ma, Jing

    2018-06-01

    In the present study, a multi-scale entropy algorithm was used to characterise the complex flow phenomena of turbulent droplets in high water-cut oil-water two-phase flow. First, we compared multi-scale weighted permutation entropy (MWPE), multi-scale approximate entropy (MAE), multi-scale sample entropy (MSE) and multi-scale complexity measure (MCM) for typical nonlinear systems. The results show that MWPE exhibits satisfactory variability with scale as well as anti-noise ability. Accordingly, we conducted an experiment on vertical upward oil-water two-phase flow with high water-cut and collected signals from a high-resolution microwave resonant sensor, from which two indexes, the entropy rate and the mean value of MWPE, were extracted. The effects of total flow rate and water-cut on these two indexes were also analysed. Our research shows that MWPE is an effective method for uncovering the dynamic instability of oil-water two-phase flow with high water-cut.

  14. A practical comparison of algorithms for the measurement of multiscale entropy in neural time series data.

    PubMed

    Kuntzelman, Karl; Jack Rhodes, L; Harrington, Lillian N; Miskovic, Vladimir

    2018-06-01

    There is a broad family of statistical methods for capturing time series regularity, with increasingly widespread adoption by the neuroscientific community. A common feature of these methods is that they permit investigators to quantify the entropy of brain signals - an index of unpredictability/complexity. Despite the proliferation of algorithms for computing entropy from neural time series data there is scant evidence concerning their relative stability and efficiency. Here we evaluated several different algorithmic implementations (sample, fuzzy, dispersion and permutation) of multiscale entropy in terms of their stability across sessions, internal consistency and computational speed, accuracy and precision using a combination of electroencephalogram (EEG) and synthetic 1/ƒ noise signals. Overall, we report fair to excellent internal consistency and longitudinal stability over a one-week period for the majority of entropy estimates, with several caveats. Computational timing estimates suggest distinct advantages for dispersion and permutation entropy over other entropy estimates. Considered alongside the psychometric evidence, we suggest several ways in which researchers can maximize computational resources (without sacrificing reliability), especially when working with high-density M/EEG data or multivoxel BOLD time series signals.

  15. Multiscale permutation entropy analysis of EEG recordings during sevoflurane anesthesia

    NASA Astrophysics Data System (ADS)

    Li, Duan; Li, Xiaoli; Liang, Zhenhu; Voss, Logan J.; Sleigh, Jamie W.

    2010-08-01

    Electroencephalogram (EEG) monitoring of the effect of anesthetic drugs on the central nervous system has long been used in anesthesia research. Several methods based on nonlinear dynamics, such as permutation entropy (PE), have been proposed to analyze EEG series during anesthesia. However, these measures are still single-scale based and may not completely describe the dynamical characteristics of complex EEG series. In this paper, a novel measure combining multiscale PE information, called CMSPE (composite multi-scale permutation entropy), was proposed for quantifying the anesthetic drug effect on EEG recordings during sevoflurane anesthesia. Three sets of simulated EEG series during awake, light and deep anesthesia were used to select the parameters for the multiscale PE analysis: embedding dimension m, lag τ and scales to be integrated into the CMSPE index. Then, the CMSPE index and raw single-scale PE index were applied to EEG recordings from 18 patients who received sevoflurane anesthesia. Pharmacokinetic/pharmacodynamic (PKPD) modeling was used to relate the measured EEG indices and the anesthetic drug concentration. Prediction probability (Pk) statistics and correlation analysis with the response entropy (RE) index, derived from the spectral entropy (M-entropy module; GE Healthcare, Helsinki, Finland), were investigated to evaluate the effectiveness of the new proposed measure. It was found that raw single-scale PE was blind to subtle transitions between light and deep anesthesia, while the CMSPE index tracked these changes accurately. Around the time of loss of consciousness, CMSPE responded significantly more rapidly than the raw PE, with the absolute slopes of linearly fitted response versus time plots of 0.12 (0.09-0.15) and 0.10 (0.06-0.13), respectively. The prediction probability Pk of 0.86 (0.85-0.88) and 0.85 (0.80-0.86) for CMSPE and raw PE indicated that the CMSPE index correlated well with the underlying anesthetic effect. The correlation coefficient for the comparison between the CMSPE index and RE index of 0.84 (0.80-0.88) was significantly higher than the raw PE index of 0.75 (0.66-0.84). The results show that the CMSPE outperforms the raw single-scale PE in reflecting the sevoflurane drug effect on the central nervous system.

  16. A fault diagnosis scheme for planetary gearboxes using adaptive multi-scale morphology filter and modified hierarchical permutation entropy

    NASA Astrophysics Data System (ADS)

    Li, Yongbo; Li, Guoyan; Yang, Yuantao; Liang, Xihui; Xu, Minqiang

    2018-05-01

    The fault diagnosis of planetary gearboxes is crucial to reduce the maintenance costs and economic losses. This paper proposes a novel fault diagnosis method based on adaptive multi-scale morphological filter (AMMF) and modified hierarchical permutation entropy (MHPE) to identify the different health conditions of planetary gearboxes. In this method, AMMF is firstly adopted to remove the fault-unrelated components and enhance the fault characteristics. Second, MHPE is utilized to extract the fault features from the denoised vibration signals. Third, Laplacian score (LS) approach is employed to refine the fault features. In the end, the obtained features are fed into the binary tree support vector machine (BT-SVM) to accomplish the fault pattern identification. The proposed method is numerically and experimentally demonstrated to be able to recognize the different fault categories of planetary gearboxes.

  17. Scale-specific effects: A report on multiscale analysis of acupunctured EEG in entropy and power

    NASA Astrophysics Data System (ADS)

    Song, Zhenxi; Deng, Bin; Wei, Xile; Cai, Lihui; Yu, Haitao; Wang, Jiang; Wang, Ruofan; Chen, Yingyuan

    2018-02-01

    Investigating acupuncture effects contributes to improving clinical application and to understanding neuronal dynamics under external stimulation. In this report, we recorded electroencephalography (EEG) signals evoked by acupuncture at the ST36 acupoint with three stimulus frequencies of 50, 100 and 200 times per minute, and selected non-acupuncture EEGs as the control group. Multiscale analyses were introduced to investigate the possible acupuncture effects on complexity and power at multiple scales. Using multiscale weighted-permutation entropy, we found significant acupuncture-induced increases in the complexity of the EEG signals. A comparison of the three stimulation manipulations showed that 100 times/min generated the most obvious effects and affected the most cortical regions. By estimating the average power spectral density, we found that acupuncture decreased power. The joint distribution of entropy and power indicated an inverse correlation, and this relationship was weakened by acupuncture, especially under the 100 times/min manipulation. The above findings are more evident and stable at large scales than at small scales, which suggests that multiscale analysis allows significant effects to be evaluated at specific scales and makes it possible to probe the inherent characteristics underlying physiological signals.

  18. Refined two-index entropy and multiscale analysis for complex system

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2016-10-01

    As a fundamental concept in describing complex systems, the entropy measure has been proposed in various forms, such as Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy and permutation entropy. This paper proposes a new two-index entropy Sq,δ, and we find that it is applicable to measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For more complex systems, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy Sq,δ is weaker. By combining the refined two-index entropy Sq,δ with the scaling exponent h(δ), this paper analyzes the complexity of simulated series and effectively classifies several financial markets from various regions of the world.

  19. Multiscale permutation entropy analysis of laser beam wandering in isotropic turbulence.

    PubMed

    Olivares, Felipe; Zunino, Luciano; Gulich, Damián; Pérez, Darío G; Rosso, Osvaldo A

    2017-10-01

    We have experimentally quantified the temporal structural diversity from the coordinate fluctuations of a laser beam propagating through isotropic optical turbulence. The main focus here is on the characterization of the long-range correlations in the wandering of a thin Gaussian laser beam over a screen after propagating through a turbulent medium. To fulfill this goal, a laboratory-controlled experiment was conducted in which coordinate fluctuations of the laser beam were recorded at a sufficiently high sampling rate for a wide range of turbulent conditions. Horizontal and vertical displacements of the laser beam centroid were subsequently analyzed by implementing the symbolic technique based on ordinal patterns to estimate the well-known permutation entropy. We show that the permutation entropy estimations at multiple time scales evidence an interplay between different dynamical behaviors. More specifically, a crossover between two different scaling regimes is observed. We confirm a transition from an integrated stochastic process contaminated with electronic noise to a fractional Brownian motion with a Hurst exponent H=5/6 as the sampling time increases. Besides, we are able to quantify, from the estimated entropy, the amount of electronic noise as a function of the turbulence strength. We have also demonstrated that these experimental observations are in very good agreement with numerical simulations of noisy fractional Brownian motions with a well-defined crossover between two different scaling regimes.

  20. Detection of frequency-mode-shift during thermoacoustic combustion oscillations in a staged aircraft engine model combustor

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hiroaki; Gotoda, Hiroshi; Tachibana, Shigeru; Yoshida, Seiji

    2017-12-01

    We conduct an experimental study using time series analysis based on symbolic dynamics to detect a precursor of frequency-mode-shift during thermoacoustic combustion oscillations in a staged aircraft engine model combustor. With increasing amount of the main fuel, a significant shift in the dominant frequency-mode occurs in noisy periodic dynamics, leading to a notable increase in oscillation amplitudes. The sustainment of noisy periodic dynamics during thermoacoustic combustion oscillations is clearly shown by the multiscale complexity-entropy causality plane in terms of statistical complexity. A modified version of the permutation entropy allows us to detect a precursor of the frequency-mode-shift before the amplification of pressure fluctuations.

  1. MEMD-enhanced multivariate fuzzy entropy for the evaluation of complexity in biomedical signals.

    PubMed

    Azami, Hamed; Smith, Keith; Escudero, Javier

    2016-08-01

    Multivariate multiscale entropy (mvMSE) has been proposed as a combination of the coarse-graining process and multivariate sample entropy (mvSE) to quantify the irregularity of multivariate signals. However, both the coarse-graining process and mvSE may not be reliable for short signals. Although the coarse-graining process can be replaced with multivariate empirical mode decomposition (MEMD), the relative instability of mvSE for short signals remains a problem. Here, we address this issue by proposing the multivariate fuzzy entropy (mvFE) with a new fuzzy membership function. The results using white Gaussian noise show that the mvFE leads to more reliable and stable results, especially for short signals, in comparison with mvSE. Accordingly, we propose MEMD-enhanced mvFE to quantify the complexity of signals. The characteristics of brain regions influenced by partial epilepsy are investigated by focal and non-focal electroencephalogram (EEG) time series. In this sense, the proposed MEMD-enhanced mvFE and mvSE are employed to discriminate focal EEG signals from non-focal ones. The results demonstrate the MEMD-enhanced mvFE values have a smaller coefficient of variation in comparison with those obtained by the MEMD-enhanced mvSE, even for long signals. The results also show that the MEMD-enhanced mvFE has better performance to quantify focal and non-focal signals compared with multivariate multiscale permutation entropy.

  2. Permutation entropy of fractional Brownian motion and fractional Gaussian noise

    NASA Astrophysics Data System (ADS)

    Zunino, L.; Pérez, D. G.; Martín, M. T.; Garavaglia, M.; Plastino, A.; Rosso, O. A.

    2008-06-01

    We have worked out theoretical curves for the permutation entropy of the fractional Brownian motion and fractional Gaussian noise by using the Bandt and Shiha [C. Bandt, F. Shiha, J. Time Ser. Anal. 28 (2007) 646] theoretical predictions for their corresponding relative frequencies. Comparisons with numerical simulations show an excellent agreement. Furthermore, the entropy-gap in the transition between these processes, observed previously via numerical results, has been here theoretically validated. Also, we have analyzed the behaviour of the permutation entropy of the fractional Gaussian noise for different time delays.

  3. Generalized permutation entropy analysis based on the two-index entropic form Sq,δ

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian

    2015-05-01

    Permutation entropy (PE) is a novel measure for quantifying the complexity of nonlinear time series. In this paper, we propose a generalized permutation entropy (PEq,δ) based on the recently postulated entropic form Sq,δ, which was proposed as a unification of the well-known Sq of nonextensive statistical mechanics and Sδ, a possibly appropriate candidate for the black-hole entropy. We find that PEq,δ with appropriate parameters can amplify minor changes and trends in complexity compared with PE. Experiments with this generalized permutation entropy method are performed on both synthetic and stock data, showing its power. The results show that PEq,δ is an exponential function of q whose power k(δ) is constant once δ is determined. Some discussion of k(δ) is provided. In addition, we find some interesting power-law results.

  4. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Kaixuan; Wang, Jun

    2017-02-01

    In this paper, the recently introduced permutation entropy and sample entropy are further developed to fractional cases: weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional-order generalization of information entropy is utilized in both complexity approaches to detect the statistical characteristics of fractional-order information in complex systems. An effectiveness analysis of the proposed methods on synthetic and real-world data reveals that tuning the fractional order allows a higher sensitivity and a more accurate characterization of the signal evolution, which is useful for describing the dynamics of complex systems. Moreover, the nonlinear complexity behaviors of the return series of the Potts financial model are compared numerically with those of actual stock markets, and the empirical results confirm the feasibility of the proposed model.

  5. The coupling analysis between stock market indices based on permutation measures

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian; Xia, Jianan; Yeh, Chien-Hung

    2016-04-01

    Many information-theoretic methods have been proposed for analyzing the coupling dependence between time series, and it is important to quantify the correlation between financial sequences since the financial market is a complex evolving dynamic system. Recently, we developed a new permutation-based entropy, called cross-permutation entropy (CPE), to detect the coupling structures between two synchronous time series. In this paper, we extend the CPE method to weighted cross-permutation entropy (WCPE) to address some of CPE's limitations, mainly its inability to differentiate between distinct patterns of a certain motif and the sensitivity of patterns close to the noise floor. It shows more stable and reliable results than CPE when applied to spiky data and AR(1) processes. In addition, we adapt the CPE method to infer the complexity of short time series by freely changing the time delay, and test it on Gaussian random series and random walks. The modified method shows an advantage in reducing the deviation of the entropy estimation compared with the conventional one. Finally, the weighted cross-permutation entropy of eight important stock indices from world financial markets is investigated, and some useful and interesting empirical results are obtained.

  6. Confidence intervals and hypothesis testing for the Permutation Entropy with an application to epilepsy

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; O. Redelico, Francisco

    2018-04-01

    In nonlinear dynamics, and to a lesser extent in other fields, a widely used measure of complexity is the permutation entropy. However, there is still no known method to determine the accuracy of this measure, and there has been little research on the statistical properties of this quantity when it is used to characterize time series. The literature describes some resampling methods for quantities used in nonlinear dynamics, such as the largest Lyapunov exponent, but these seem to fail. In this contribution, we propose a parametric bootstrap methodology that uses a symbolic representation of the time series to obtain the distribution of the permutation entropy estimator. We perform several time series simulations of well-known stochastic processes, the 1/f^α noise family, and show in each case that the proposed accuracy measure is as efficient as the one obtained by the frequentist approach of repeating the experiment. The complexity of brain electrical activity, measured by the permutation entropy, has been used extensively in epilepsy research to detect dynamical changes in the electroencephalogram (EEG) signal, with no consideration of the variability of this complexity measure. The parametric bootstrap methodology is applied to compare normal and pre-ictal EEG signals.
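
    One simple way to realize a parametric bootstrap for the permutation entropy, broadly in the spirit of the symbolic approach described above, is to resample ordinal symbols from their estimated multinomial distribution and read off a percentile interval. This is an illustrative sketch under that assumption, not the authors' procedure.

    ```python
    import math

    import numpy as np

    def ordinal_probs(x, m=3):
        """Estimated ordinal-pattern probabilities of a series."""
        counts = {}
        for i in range(len(x) - m + 1):
            pattern = tuple(int(k) for k in np.argsort(x[i:i + m]))
            counts[pattern] = counts.get(pattern, 0) + 1
        p = np.array(list(counts.values()), dtype=float)
        return p / p.sum()

    def pe_from_probs(p, m=3):
        p = p[p > 0]
        return -np.sum(p * np.log(p)) / math.log(math.factorial(m))

    def pe_bootstrap_ci(x, m=3, n_boot=1000, alpha=0.05, seed=0):
        """Point estimate and (1 - alpha) percentile confidence interval for the PE,
        obtained by multinomial resampling of the estimated symbol distribution."""
        x = np.asarray(x, dtype=float)
        rng = np.random.default_rng(seed)
        p_hat = ordinal_probs(x, m)
        n_symbols = len(x) - m + 1
        boot = [pe_from_probs(rng.multinomial(n_symbols, p_hat) / n_symbols, m)
                for _ in range(n_boot)]
        low, high = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
        return pe_from_probs(p_hat, m), (low, high)
    ```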

  7. Multiscale Shannon entropy and its application in the stock market

    NASA Astrophysics Data System (ADS)

    Gu, Rongbao

    2017-10-01

    In this paper, we perform a multiscale entropy analysis on the Dow Jones Industrial Average Index using the Shannon entropy. The stock index exhibits multiscale entropy characteristics caused by noise in the market. The entropy is shown to have significant predictive ability for the stock index over both the long term and the short term, and the empirical results verify that noise does exist in the market and can affect the stock price. This has important implications for market participants such as noise traders.

  8. Permutation entropy and statistical complexity analysis of turbulence in laboratory plasmas and the solar wind.

    PubMed

    Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T

    2015-02-01

    The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
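
    The two coordinates used above, the normalized Bandt-Pompe entropy H and the Jensen-Shannon statistical complexity C, can be computed from a single ordinal-pattern distribution. The sketch below follows the standard definitions as I understand them (including the usual normalization constant for the Jensen-Shannon disequilibrium) and is not taken from the paper.

    ```python
    import math
    from itertools import permutations

    import numpy as np

    def ordinal_distribution(x, m=5, delay=1):
        """Full ordinal-pattern distribution (unobserved patterns kept with probability 0)."""
        counts = {p: 0 for p in permutations(range(m))}
        for i in range(len(x) - (m - 1) * delay):
            counts[tuple(int(k) for k in np.argsort(x[i:i + m * delay:delay]))] += 1
        p = np.array(list(counts.values()), dtype=float)
        return p / p.sum()

    def shannon(p):
        nz = p[p > 0]
        return -np.sum(nz * np.log(nz))

    def ch_plane(x, m=5):
        """Coordinates (H, C) on the complexity-entropy causality plane."""
        p = ordinal_distribution(np.asarray(x, dtype=float), m)
        n = len(p)                                   # n = m!
        h = shannon(p) / math.log(n)                 # normalized permutation entropy
        uniform = np.full(n, 1.0 / n)
        js = shannon((p + uniform) / 2) - shannon(p) / 2 - shannon(uniform) / 2
        q0 = -2.0 / (((n + 1) / n) * math.log(n + 1) - 2 * math.log(2 * n) + math.log(n))
        return h, q0 * js * h                        # C = Q_J[P, P_e] * H[P]

    # Hypothetical usage on a magnetic fluctuation series `b` sampled at a fixed rate:
    # H, C = ch_plane(b, m=5)
    ```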

  9. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.

  10. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS

    PubMed Central

    Kuai, Moshen; Cheng, Gang; Li, Yong

    2018-01-01

    Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed, high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. In this paper, a method is proposed for diagnosing planetary gear faults based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-fuzzy Inference System (ANFIS). The original signal is decomposed into six intrinsic mode functions (IMFs) and a residual component by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of the IMFs is reflected by their permutation entropies, which quantify the fault features. The permutation entropies of the IMF components are used as the input of the ANFIS, and its parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate of the test samples used for the ANFIS is 90%, and the recognition rate for the gear with one missing tooth is relatively high. The recognition rates for different faulty gears based on this method also achieve good results. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis. PMID:29510569

  11. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS.

    PubMed

    Kuai, Moshen; Cheng, Gang; Pang, Yusong; Li, Yong

    2018-03-05

    Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed, high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. In this paper, a method is proposed for diagnosing planetary gear faults based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-fuzzy Inference System (ANFIS). The original signal is decomposed into six intrinsic mode functions (IMFs) and a residual component by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of the IMFs is reflected by their permutation entropies, which quantify the fault features. The permutation entropies of the IMF components are used as the input of the ANFIS, and its parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate of the test samples used for the ANFIS is 90%, and the recognition rate for the gear with one missing tooth is relatively high. The recognition rates for different faulty gears based on this method also achieve good results. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis.

  12. Permutation Entropy and Signal Energy Increase the Accuracy of Neuropathic Change Detection in Needle EMG

    PubMed Central

    2018-01-01

    Background and Objective. Needle electromyography can be used to detect changes in the number and morphology of motor unit potentials in patients with axonal neuropathy. General mathematical methods of pattern recognition and signal analysis were applied to recognize neuropathic changes. This study validates the possibility of extending and refining turns-amplitude analysis using permutation entropy and signal energy. Methods. In this study, we examined needle electromyography in 40 neuropathic individuals and 40 controls. The number of turns, the amplitude between turns, the signal energy and the permutation entropy were used as features for support vector machine classification. Results. The obtained results showed that combinations of all of the above-mentioned features classified better than combinations of fewer features. Peak-ratio analysis had the lowest accuracy among the tested feature combinations. Conclusion. Combining permutation entropy with signal energy, the number of turns and the mean amplitude in SVM classification can refine the diagnosis of polyneuropathies examined by needle electromyography. PMID:29606959
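
    For orientation, the feature set described above (turns, amplitude between turns, signal energy, permutation entropy) can be approximated as follows. The turn detector is a simplified stand-in for clinical turns-amplitude analysis, and the 100-unit threshold and variable names are assumptions; the permutation entropy feature can be computed with any of the PE sketches earlier in this listing.

    ```python
    import numpy as np

    def find_turns(x, threshold=100.0):
        """Simplified turn detection: local extrema whose amplitude differs from the
        previously accepted turn by at least `threshold` (same units as the recording)."""
        x = np.asarray(x, dtype=float)
        extrema = [i for i in range(1, len(x) - 1)
                   if (x[i] - x[i - 1]) * (x[i + 1] - x[i]) < 0]
        turns = []
        for i in extrema:
            if not turns or abs(x[i] - x[turns[-1]]) >= threshold:
                turns.append(i)
        return np.array(turns, dtype=int)

    def turns_amplitude_energy(x, threshold=100.0):
        """Three of the four SVM features: number of turns, mean amplitude between
        consecutive turns, and signal energy."""
        x = np.asarray(x, dtype=float)
        t = find_turns(x, threshold)
        n_turns = len(t)
        mean_amp = float(np.mean(np.abs(np.diff(x[t])))) if n_turns > 1 else 0.0
        energy = float(np.sum(x ** 2))
        return n_turns, mean_amp, energy
    ```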

  13. Parameters Selection for Bivariate Multiscale Entropy Analysis of Postural Fluctuations in Fallers and Non-Fallers Older Adults.

    PubMed

    Ramdani, Sofiane; Bonnet, Vincent; Tallon, Guillaume; Lagarde, Julien; Bernard, Pierre Louis; Blain, Hubert

    2016-08-01

    Entropy measures are often used to quantify the regularity of postural sway time series. Recent methodological developments have provided both multivariate and multiscale approaches allowing the extraction of complexity features from physiological signals; see "Dynamical complexity of human responses: A multivariate data-adaptive framework," in Bulletin of Polish Academy of Science and Technology, vol. 60, p. 433, 2012. The resulting entropy measures are good candidates for the analysis of bivariate postural sway signals exhibiting nonstationarity and multiscale properties. These methods are dependent on several input parameters such as embedding parameters. Using two data sets collected from institutionalized frail older adults, we numerically investigate the behavior of a recent multivariate and multiscale entropy estimator; see "Multivariate multiscale entropy: A tool for complexity analysis of multichannel data," Physical Review E, vol. 84, p. 061918, 2011. We propose criteria for the selection of the input parameters. Using these optimal parameters, we statistically compare the multivariate and multiscale entropy values of postural sway data of non-faller subjects to those of fallers. These two groups are discriminated by the resulting measures over multiple time scales. We also demonstrate that the typical parameter settings proposed in the literature lead to entropy measures that do not distinguish the two groups. This last result confirms the importance of the selection of appropriate input parameters.

  14. Quantifying complexity of financial short-term time series by composite multiscale entropy measure

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun

    2015-05-01

    It is significant to study the complexity of financial time series since the financial market is a complex, evolving dynamic system. Multiscale entropy is a prevailing method for quantifying the complexity of a time series. Because its entropy estimation is less reliable for short time series at large time scales, a modified method, the composite multiscale entropy (CMSE), is applied to the financial market. To assess its effectiveness, its application to synthetic white noise and 1/f noise of different data lengths is first reproduced in the present paper. It is then used, for the first time, in a reliability test with two Chinese stock indices. When applied to short return series, the CMSE method reduces the deviation of the entropy estimation and gives more stable and reliable results than the conventional MSE algorithm. Finally, the composite multiscale entropy of six important stock indices from world financial markets is investigated, and some useful and interesting empirical results are obtained.

  15. Estimation of absolute solvent and solvation shell entropies via permutation reduction

    NASA Astrophysics Data System (ADS)

    Reinhard, Friedemann; Grubmüller, Helmut

    2007-01-01

    Despite its prominent contribution to the free energy of solvated macromolecules such as proteins or DNA, and although principally contained within molecular dynamics simulations, the entropy of the solvation shell is inaccessible to straightforward application of established entropy estimation methods. The complication is twofold. First, the configurational space density of such systems is too complex for a sufficiently accurate fit. Second, and in contrast to the internal macromolecular dynamics, the configurational space volume explored by the diffusive motion of the solvent molecules is too large to be exhaustively sampled by current simulation techniques. Here, we develop a method to overcome the second problem and to significantly alleviate the first one. We propose to exploit the permutation symmetry of the solvent by transforming the trajectory in a way that renders established estimation methods applicable, such as the quasiharmonic approximation or principal component analysis. Our permutation-reduced approach involves a combinatorial problem, which is solved through its equivalence with the linear assignment problem, for which O(N^3) methods exist. From test simulations of dense Lennard-Jones gases, enhanced convergence and improved entropy estimates are obtained. Moreover, our approach renders diffusive systems accessible to improved fit functions.
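
    The permutation-reduction idea, relabeling identical solvent molecules in every frame so that each one stays close to a fixed reference, reduces to a linear assignment problem, for which SciPy provides a solver. The sketch below is a minimal illustration of that step under assumed array shapes, not the authors' implementation or cost function.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist

    def permutation_reduce(frames, reference):
        """Relabel N identical solvent molecules in each frame by solving the linear
        assignment problem against a reference configuration.
        frames: array of shape (n_frames, N, 3); reference: array of shape (N, 3)."""
        frames = np.asarray(frames, dtype=float)
        reduced = np.empty_like(frames)
        for k, frame in enumerate(frames):
            cost = cdist(reference, frame)            # pairwise distances, shape (N, N)
            _, cols = linear_sum_assignment(cost)     # Hungarian-type O(N^3) solver
            reduced[k] = frame[cols]                  # slot i now holds the molecule matched to reference i
        return reduced

    # Established estimators (e.g. a quasiharmonic fit or PCA on the covariance of
    # the relabeled coordinates) can then be applied to `reduced`, which is the idea
    # described in the abstract.
    ```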

  16. Multi-Scale Low-Entropy Method for Optimizing the Processing Parameters during Automated Fiber Placement.

    PubMed

    Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong

    2017-09-03

    The automated fiber placement (AFP) process involves a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method for optimizing the processing parameters of an AFP process, in which multi-scale effects, energy consumption, energy utilization efficiency and the mechanical properties of the micro-system are taken into account together. Taking a carbon fiber/epoxy prepreg as an example, the mechanical properties at the macro-meso scale are obtained by the Finite Element Method (FEM). A multi-scale energy transfer model is then established to pass the macroscopic results to the microscopic system as its boundary conditions, allowing the different scales to communicate. Furthermore, microscopic characteristics, mainly the micro-scale adsorption energy, diffusion coefficient, and entropy-enthalpy values, are calculated under different processing parameters using molecular dynamics. A low-entropy region is then obtained from the interrelation among the entropy-enthalpy values, the microscopic mechanical properties (interface adsorbability and matrix fluidity) and the processing parameters, so as to guarantee better fluidity, stronger adsorption, lower energy consumption and higher energy quality simultaneously. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively and further improve the mechanical properties of the laminates.

  17. Multiscale Shannon's Entropy Modeling of Orientation and Distance in Steel Fiber Micro-Tomography Data.

    PubMed

    Chiverton, John P; Ige, Olubisi; Barnett, Stephanie J; Parry, Tony

    2017-11-01

    This paper is concerned with the modeling and analysis of the orientation of, and distance between, steel fibers in X-ray micro-tomography data. The advantage of combining both orientation and separation in one model is that it provides a detailed and easily comparable description of how the steel fibers are arranged. The developed models are designed to summarize the randomness of the orientation distribution of the steel fibers both locally and across an entire volume based on multiscale entropy. Theoretical modeling, simulation and application to real imaging data are shown here. The theoretical modeling of multiscale entropy for orientation includes a proof giving the final form of the multiscale entropy taken over a linear range of scales. A series of image processing operations is also included to overcome inter-slice connectivity issues and help derive statistical descriptions of the orientation distributions of the steel fibers. The results demonstrate that multiscale entropy provides unique insights into both simulated and real imaging data of steel fiber reinforced concrete.

  18. Multiscale sample entropy and cross-sample entropy based on symbolic representation and similarity of stock markets

    NASA Astrophysics Data System (ADS)

    Wu, Yue; Shang, Pengjian; Li, Yilong

    2018-03-01

    A modified multiscale sample entropy measure based on symbolic representation and similarity (MSEBSS) is proposed in this paper to study the complexity of stock markets. The modified algorithm reduces the probability of producing undefined entropies and is confirmed to be robust to strong noise. Considering its validity and accuracy, MSEBSS is more reliable than multiscale entropy (MSE) for time series contaminated with substantial noise, such as financial time series. We apply MSEBSS to financial markets, and the results show that American stock markets have the lowest complexity compared with European and Asian markets. There are exceptions to the general pattern that stock markets show decreasing complexity over increasing time scales, indicating a periodicity at certain scales. Based on MSEBSS, we introduce the modified multiscale cross-sample entropy measure based on symbolic representation and similarity (MCSEBSS) to consider the degree of asynchrony between distinct time series. Stock markets from the same area have higher synchrony than those from different areas. For stock markets with relatively high synchrony, the entropy values decrease with increasing scale factor, while for stock markets with high asynchrony, the entropy values do not always decrease with increasing scale factor and sometimes tend to increase. Both MSEBSS and MCSEBSS are therefore able to distinguish stock markets from different areas, and they are more helpful when used together for studying other features of financial time series.

  19. Information-Theoretical Quantifier of Brain Rhythm Based on Data-Driven Multiscale Representation

    PubMed Central

    2015-01-01

    This paper presents a data-driven multiscale entropy measure to reveal the scale-dependent information content of electroencephalogram (EEG) recordings. This work is motivated by previous observations on the nonlinear and nonstationary nature of EEG over multiple time scales. Here, a new framework of entropy measures that considers changing dynamics over multiple oscillatory scales is presented. First, to deal with nonstationarity over multiple scales, the EEG recording is decomposed by applying empirical mode decomposition (EMD), which is known to be effective for extracting the constituent narrowband components without a predetermined basis. Calculating the Renyi entropy of the probability distributions of the intrinsic mode functions extracted by EMD then leads to a data-driven multiscale Renyi entropy. To validate the performance of the proposed entropy measure, actual EEG recordings from rats (n = 9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Simulation and experimental results demonstrate that the multiscale Renyi entropy provides better discrimination of injury levels and improved correlation with the neurological deficit evaluation 72 hours after cardiac arrest, thus suggesting an effective diagnostic and prognostic tool. PMID:26380297
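
    The entropy step described above, computing a Rényi entropy of the amplitude distribution of each EMD-derived intrinsic mode function, can be sketched as follows. The IMFs are taken as given (any EMD routine could supply them), and the histogram bin count and Rényi order are assumptions.

    ```python
    import numpy as np

    def renyi_entropy(p, alpha=2.0):
        """Rényi entropy of order alpha of a discrete distribution p."""
        p = p[p > 0]
        return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

    def multiscale_renyi_from_imfs(imfs, alpha=2.0, bins=64):
        """One Rényi entropy per intrinsic mode function, estimated from an amplitude
        histogram; `imfs` is an (n_imfs, n_samples) array from an EMD decomposition."""
        values = []
        for imf in np.atleast_2d(np.asarray(imfs, dtype=float)):
            hist, _ = np.histogram(imf, bins=bins)
            p = hist / hist.sum()
            values.append(renyi_entropy(p, alpha))
        return np.array(values)
    ```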

  20. Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

    PubMed Central

    Li, Zhaohui; Li, Xiaoli

    2013-01-01

    Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains, which quantifies the fraction of ordinal information in one neuron that is also present in another. The performance of this method is evaluated with spike trains generated by an Izhikevich neuronal model. Results show that the NPTE method can effectively estimate the causal interaction between two neurons without being influenced by data length. Considering both the precision of the estimated time delay and the robustness of the estimated information flow against neuronal firing rate, the NPTE method is superior to other information-theoretic methods, including normalized transfer entropy, symbolic transfer entropy and permutation conditional mutual information. To test the performance of NPTE on analyzing simulated biophysically realistic synapses, a cortical network based on the Izhikevich neuronal model is employed. It is found that the NPTE method is able to characterize mutual interactions and exactly identify spurious causality in a network of three neurons. We conclude that the proposed method can obtain a more reliable comparison of interactions between different pairs of neurons and is a promising tool to uncover more details of neural coding. PMID:23940662

  1. Information Measures for Multisensor Systems

    DTIC Science & Technology

    2013-12-11

    permuted to generate spectra that were non-physical but preserved the entropy of the source spectra. Another 1000 spectra were constructed to mimic co... Research Laboratory (NRL) has yielded probabilistic models for spectral data that enable the computation of information measures such as entropy and... Keywords: chemical sensing; information theory; spectral data; information entropy; information divergence; mass spectrometry; infrared spectroscopy; multisensor

  2. Refined composite multivariate generalized multiscale fuzzy entropy: A tool for complexity analysis of multichannel signals

    NASA Astrophysics Data System (ADS)

    Azami, Hamed; Escudero, Javier

    2017-01-01

    Multiscale entropy (MSE) is an appealing tool to characterize the complexity of time series over multiple temporal scales. Recent developments in the field have tried to extend the MSE technique in different ways. Building on these trends, we propose the so-called refined composite multivariate multiscale fuzzy entropy (RCmvMFE) whose coarse-graining step uses variance (RCmvMFEσ2) or mean (RCmvMFEμ). We investigate the behavior of these multivariate methods on multichannel white Gaussian and 1/f noise signals, and two publicly available biomedical recordings. Our simulations demonstrate that RCmvMFEσ2 and RCmvMFEμ lead to more stable results and are less sensitive to the signals' length in comparison with the other existing multivariate multiscale entropy-based methods. The classification results also show that using both the variance and mean in the coarse-graining step offers complexity profiles with complementary information for biomedical signal analysis. We also made freely available all the Matlab codes used in this paper.
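
    The distinguishing step of these variants is the coarse-graining statistic. Below is a minimal sketch of coarse-graining a univariate series with either the mean or the variance of non-overlapping windows; the actual RCmvMFE method additionally averages fuzzy entropies over shifted coarse-grained series and handles multiple channels, which is omitted here.

```python
import numpy as np

def coarse_grain(x, scale, stat="mean"):
    """Coarse-grain x at a given scale using the mean or the variance of
    non-overlapping windows (the first or second moment, respectively)."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    windows = x[:n * scale].reshape(n, scale)
    if stat == "mean":
        return windows.mean(axis=1)
    if stat == "var":
        return windows.var(axis=1)
    raise ValueError("stat must be 'mean' or 'var'")

# usage: for white noise, mean coarse-graining shrinks the spread while
# variance coarse-graining tracks the (constant) local variance
rng = np.random.default_rng(1)
x = rng.standard_normal(3000)
for s in (1, 2, 5, 10):
    print(s, coarse_grain(x, s, "mean").std(), coarse_grain(x, s, "var").mean())
```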

  3. Revisiting the European sovereign bonds with a permutation-information-theory approach

    NASA Astrophysics Data System (ADS)

    Fernández Bariviera, Aurelio; Zunino, Luciano; Guercio, María Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-12-01

    In this paper we study the evolution of the informational efficiency in its weak form for seventeen European sovereign bond time series. We aim to assess the impact of two specific economic situations on the hypothetical random behavior of these time series: the establishment of a common currency and a wide and deep financial crisis. In order to evaluate the informational efficiency we use permutation quantifiers derived from information theory. Specifically, time series are ranked according to two metrics that measure the intrinsic structure of their correlations: permutation entropy and permutation statistical complexity. These measures provide the rectangular coordinates of the complexity-entropy causality plane; the planar location of the time series in this representation space reveals the degree of informational efficiency. According to our results, the currency union contributed to homogenizing the stochastic characteristics of the time series and synchronized their random behavior. Additionally, the 2008 financial crisis uncovered differences within the apparently homogeneous European sovereign markets and revealed country-specific characteristics that were partially hidden during the monetary union heyday.
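
    For readers unfamiliar with these quantifiers, the sketch below computes a normalized permutation entropy and the associated Jensen-Shannon statistical complexity from an ordinal-pattern distribution; the normalization constant follows the form commonly used for the complexity-entropy causality plane, and the embedding parameters are illustrative rather than those used in the study.

```python
import numpy as np
from itertools import permutations
from math import log

def ordinal_distribution(x, d=4, tau=1):
    """Probability distribution of ordinal patterns of order d and delay tau."""
    counts = dict.fromkeys(permutations(range(d)), 0)
    for i in range(len(x) - (d - 1) * tau):
        counts[tuple(np.argsort(x[i:i + d * tau:tau]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def complexity_entropy(x, d=4, tau=1):
    p = ordinal_distribution(x, d, tau)
    n = len(p)
    h = shannon(p) / log(n)                      # normalized permutation entropy
    pe = np.full(n, 1.0 / n)                     # uniform reference distribution
    js = shannon((p + pe) / 2) - shannon(p) / 2 - shannon(pe) / 2
    q0 = -2.0 / (((n + 1) / n) * log(n + 1) - 2 * log(2 * n) + log(n))
    return h, q0 * js * h                        # (entropy, statistical complexity)

rng = np.random.default_rng(2)
print(complexity_entropy(rng.standard_normal(5000)))   # noise: H near 1, C near 0
```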

  4. Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures.

    PubMed

    Costa, Madalena D; Peng, Chung-Kang; Goldberger, Ary L

    2008-06-01

    Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and non-equilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools--multiscale entropy and multiscale time irreversibility--are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs.

  5. Multiscale Analysis of Heart Rate Dynamics: Entropy and Time Irreversibility Measures

    PubMed Central

    Peng, Chung-Kang; Goldberger, Ary L.

    2016-01-01

    Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and nonequilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools—multiscale entropy and multiscale time irreversibility—are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs. PMID:18172763

  6. Permutation Entropy Applied to Movement Behaviors of Drosophila Melanogaster

    NASA Astrophysics Data System (ADS)

    Liu, Yuedan; Chon, Tae-Soo; Baek, Hunki; Do, Younghae; Choi, Jin Hee; Chung, Yun Doo

    Movement of different strains of Drosophila melanogaster was continuously observed by using computer interfacing techniques and was analyzed by permutation entropy (PE) after exposure to the toxic chemicals toluene (0.1 mg/m3) and formaldehyde (0.01 mg/m3). The PE values based on one-dimensional time series of (vertical) position data varied according to internal constraints (i.e. strains) and accordingly increased in response to external constraints (i.e. chemicals), reflecting the diversity in movement patterns in both normal and intoxicated states. Cross-correlation functions revealed temporal associations between the PE values and between the component movement patterns across different chemicals and strains through the period of intoxication. The entropy based on the order of position data could be a useful means of measuring the complexity of behavioral changes and of monitoring the impact of environmental stressors.

  7. EEG entropy measures in anesthesia

    PubMed Central

    Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli

    2015-01-01

    Highlights: ► Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression. ► Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states. ► Approximate entropy and sample entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing anesthesia drugs' effect is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP), in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State entropy (SE), three wavelet entropy (WE) measures [Shannon WE (SWE), Tsallis WE (TWE), and Renyi WE (RWE)], Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures [Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)]. Two EEG data sets from sevoflurane-induced and isoflurane-induced anesthesia, respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. The multifractal detrended fluctuation analysis (MDFA) was compared as a non-entropy measure. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. Three PE measures outperformed the other entropy indices, with less baseline variability, higher coefficient of determination (R2) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, these entropy measures showed an advantage in computation efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, it is suggested that the RPE index was a superior measure. Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA. PMID:25741277
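
    The three permutation entropy variants compared in this record differ only in the entropy functional applied to the same ordinal-pattern distribution. A minimal sketch follows; the embedding order and the Tsallis/Renyi parameter q are illustrative choices, not the values used in the study.

```python
import numpy as np
from itertools import permutations

def ordinal_probs(x, d=3, tau=1):
    """Relative frequencies of ordinal patterns of order d and delay tau."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        counts[tuple(np.argsort(x[i:i + d * tau:tau]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p[p > 0] / sum(counts.values())

def shannon_pe(p):
    return -np.sum(p * np.log(p))

def renyi_pe(p, q=2.0):
    return np.log(np.sum(p ** q)) / (1.0 - q)

def tsallis_pe(p, q=2.0):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

rng = np.random.default_rng(3)
eeg_like = rng.standard_normal(4000)          # stand-in for an EEG epoch
p = ordinal_probs(eeg_like, d=3)
print(shannon_pe(p), renyi_pe(p, 2.0), tsallis_pe(p, 2.0))
```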

  8. Measuring Complexity and Predictability of Time Series with Flexible Multiscale Entropy for Sensor Networks

    PubMed Central

    Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue

    2017-01-01

    Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, which is the fundamental component of MSE, measures the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes of entropy values even if the time series undergoes only small changes. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function measuring the similarity of two subsequences with full-range values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of measuring the complexity of time series, especially when the time series is short, compared to MSE and composite multiscale entropy (CMSE). The proposed FMSE method is capable of improving the performance of time series analysis based topology and traffic congestion control techniques. PMID:28383496
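
    The limitation described here (sample entropy's binary 0/1 template match) and a full-range alternative can be illustrated with a small sketch. The Gaussian membership function below is one common soft similarity choice and is not necessarily the exact function used in FMSE; the parameters are illustrative.

```python
import numpy as np

def _templates(x, m):
    """All consecutive templates of length m from x."""
    return np.array([x[i:i + m] for i in range(len(x) - m)])

def sample_entropy(x, m=2, r=0.2, soft=False):
    """Sample entropy with either a hard threshold (classic SampEn) or a
    soft, full-range similarity (fuzzy-style), both using Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    r = r * x.std()

    def phi(mm):
        t = _templates(x, mm)
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)  # pairwise Chebyshev
        if soft:
            sim = np.exp(-(d ** 2) / (2 * r ** 2))       # similarity in (0, 1]
        else:
            sim = (d <= r).astype(float)                  # similarity is 0 or 1
        np.fill_diagonal(sim, 0.0)                        # exclude self-matches
        return sim.sum()

    return -np.log(phi(m + 1) / phi(m))

rng = np.random.default_rng(4)
x = rng.standard_normal(600)
print(sample_entropy(x, soft=False), sample_entropy(x, soft=True))
```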

  9. Measuring Complexity and Predictability of Time Series with Flexible Multiscale Entropy for Sensor Networks.

    PubMed

    Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue

    2017-04-06

    Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, which is the fundamental component of MSE, measures the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes of entropy values even if the time series undergoes only small changes. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function measuring the similarity of two subsequences with full-range values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of measuring the complexity of time series, especially when the time series is short, compared to MSE and composite multiscale entropy (CMSE). The proposed FMSE method is capable of improving the performance of time series analysis based topology and traffic congestion control techniques.

  10. Refined generalized multiscale entropy analysis for physiological signals

    NASA Astrophysics Data System (ADS)

    Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian

    2018-01-01

    Multiscale entropy analysis has become a prevalent complexity measurement and has been successfully applied in various fields. However, it only takes into account the information of mean values (first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which considers higher moments to coarse-grain a time series, was therefore proposed, and MSEσ2 has been implemented. However, MSEσ2 may sometimes yield an imprecise or undefined estimation of entropy, and the statistical reliability of the sample entropy estimation is reduced as the scale factor increases. For this purpose, we developed the refined model, RMSEσ2, to improve MSEσ2. Simulations on both white noise and 1/f noise show that RMSEσ2 provides higher entropy reliability and reduces the occurrence of undefined entropy, and is especially suitable for short time series. Besides, we discuss how outliers, data loss and other signal processing issues affect RMSEσ2 analysis. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure and patients with atrial fibrillation, respectively, compared to several popular complexity metrics. The results demonstrate that the complexity measured by RMSEσ2 (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.

  11. [Portable Epileptic Seizure Monitoring Intelligent System Based on Android System].

    PubMed

    Liang, Zhenhu; Wu, Shufeng; Yang, Chunlin; Jiang, Zhenzhou; Yu, Tao; Lu, Chengbiao; Li, Xiaoli

    2016-02-01

    Clinical electroencephalogram (EEG) monitoring systems based on personal computers cannot meet the requirements of portability and home usage. Epilepsy patients have to be monitored in hospital for an extended period of time, which imposes a heavy burden on hospitals. In the present study, we designed a portable 16-lead networked monitoring system based on an Android smart phone. The system integrates several technologies, including active electrodes, WiFi wireless transmission, the multi-scale permutation entropy (MPE) algorithm and the back-propagation (BP) neural network algorithm. Moreover, the Android mobile application software handles the processing and analysis of EEG data, the display of EEG waveforms and the alarm for epileptic seizures. The system has been tested on mobile phones running the Android 2.3 operating system or higher, and the results showed that the software ran accurately and stably in the detection of epileptic seizures. In conclusion, this paper provides a portable and reliable solution for epileptic seizure monitoring in clinical and home applications.
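
    The detection chain described here (multi-scale permutation entropy features feeding a back-propagation neural network) can be sketched roughly as below, assuming NumPy and scikit-learn are available. The feature extractor is a straightforward MPE implementation, the MLP is a stand-in for the BP network described in the paper, and all parameter values and the synthetic epochs are illustrative only.

```python
import numpy as np
from itertools import permutations
from sklearn.neural_network import MLPClassifier

def permutation_entropy(x, d=3, tau=1):
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        counts[tuple(np.argsort(x[i:i + d * tau:tau]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    p = p[p > 0] / sum(counts.values())
    return -np.sum(p * np.log(p))

def mpe_features(x, scales=(1, 2, 3, 4, 5), d=3):
    """One permutation entropy value per coarse-graining scale."""
    x = np.asarray(x, dtype=float)
    feats = []
    for s in scales:
        cg = x[:len(x) // s * s].reshape(-1, s).mean(axis=1)
        feats.append(permutation_entropy(cg, d=d))
    return np.array(feats)

# toy usage with synthetic "background" vs "seizure-like" (more rhythmic) epochs
rng = np.random.default_rng(5)
background = [rng.standard_normal(2000) for _ in range(20)]
rhythmic = [np.sin(np.linspace(0, 60 * np.pi, 2000)) + 0.3 * rng.standard_normal(2000)
            for _ in range(20)]
X = np.array([mpe_features(e) for e in background + rhythmic])
y = np.array([0] * 20 + [1] * 20)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print(clf.score(X, y))
```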

  12. Complexity multiscale asynchrony measure and behavior for interacting financial dynamics

    NASA Astrophysics Data System (ADS)

    Yang, Ge; Wang, Jun; Niu, Hongli

    2016-08-01

    A stochastic financial price process is proposed and investigated using the finite-range multitype contact dynamical system, in an attempt to study the nonlinear behaviors of real asset markets. The virus-spreading process in a finite-range multitype system is used to imitate the interacting behaviors of diverse investment attitudes in a financial market, and empirical research on the descriptive statistics and autocorrelation behaviors of return time series is performed for different values of the propagation rates. Then multiscale entropy analysis is adopted to study several differently shuffled return series, including the original return series, the corresponding reversal series, the randomly shuffled series, the volatility shuffled series and the Zipf-type shuffled series. Furthermore, we propose and compare the multiscale cross-sample entropy and its modified algorithm, called composite multiscale cross-sample entropy. We apply them to study the asynchrony of pairs of time series at different time scales.

  13. Multi-Scale Low-Entropy Method for Optimizing the Processing Parameters during Automated Fiber Placement

    PubMed Central

    Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong

    2017-01-01

    The automated fiber placement (AFP) process involves a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method aimed at optimizing processing parameters in an AFP process, where the multi-scale effect, energy consumption, energy utilization efficiency and mechanical properties of the micro-system can be taken into account synthetically. Taking a carbon fiber/epoxy prepreg as an example, mechanical properties at the macro–meso scale are obtained by the Finite Element Method (FEM). A multi-scale energy transfer model is then established to input the macroscopic results into the microscopic system as its boundary condition, which allows communication between different scales. Furthermore, microscopic characteristics, mainly the micro-scale adsorption energy, diffusion coefficient and entropy–enthalpy values, are calculated under different processing parameters based on the molecular dynamics method. A low-entropy region is then obtained in terms of the interrelation among entropy–enthalpy values, microscopic mechanical properties (interface adsorbability and matrix fluidity) and processing parameters to guarantee better fluidity, stronger adsorption, lower energy consumption and higher energy quality collaboratively. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively and further improve the mechanical properties of laminates. PMID:28869520

  14. Credit market Jitters in the course of the financial crisis: A permutation entropy approach in measuring informational efficiency in financial assets

    NASA Astrophysics Data System (ADS)

    Siokis, Fotios M.

    2018-06-01

    We explore the evolution of informational efficiency for specific instruments of the U.S. money, bond and stock exchange markets, prior to and after the outbreak of the Great Recession. We utilize the permutation entropy and the complexity-entropy causality plane to rank the time series and measure the degree of informational efficiency. We find that after the credit crunch and the collapse of Lehman Brothers the efficiency level of specific money market instruments' yields falls considerably. This is evidence of reduced uncertainty in predicting the related yields throughout the financial disarray. A similar trend is depicted in the indices of the stock exchange markets, but efficiency remains at much higher levels. On the other hand, bond market instruments maintained their efficiency levels even after the outbreak of the crisis, which could be interpreted as greater randomness and less predictability of their yields.

  15. Permutation auto-mutual information of electroencephalogram in anesthesia

    NASA Astrophysics Data System (ADS)

    Liang, Zhenhu; Wang, Yinghua; Ouyang, Gaoxiang; Voss, Logan J.; Sleigh, Jamie W.; Li, Xiaoli

    2013-04-01

    Objective. The dynamic change of brain activity in anesthesia is an interesting topic for clinical doctors and drug designers. To explore the dynamical features of brain activity in anesthesia, a permutation auto-mutual information (PAMI) method is proposed to measure the information coupling of electroencephalogram (EEG) time series obtained in anesthesia. Approach. The PAMI is developed and applied to EEG data collected from 19 patients under sevoflurane anesthesia. The results are compared with the traditional auto-mutual information (AMI), SynchFastSlow (SFS, derived from the BIS index), permutation entropy (PE), composite PE (CPE), response entropy (RE) and state entropy (SE). Performance of all indices is assessed by pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability. Main results. The PK/PD modeling and prediction probability analysis show that the PAMI index correlates closely with the anesthetic effect. The coefficient of determination R2 between PAMI values and the sevoflurane effect site concentrations, and the prediction probability Pk, are higher in comparison with other indices. The information coupling in EEG series can be applied to indicate the effect of the anesthetic drug sevoflurane on brain activity, as the other indices can. The PAMI of the EEG signals is suggested as a new index to track drug concentration change. Significance. The PAMI is a useful index for analyzing the EEG dynamics during general anesthesia.
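
    A rough sketch of the underlying idea, mutual information between the ordinal-pattern symbol sequence and a time-shifted copy of itself, is given below; details such as the exact normalization used for the PAMI index in the paper are omitted, and the embedding parameters and test signal are illustrative.

```python
import numpy as np

def ordinal_symbols(x, d=3, tau=1):
    """Map a series to a sequence of ordinal-pattern symbols (small integers)."""
    n = len(x) - (d - 1) * tau
    syms = np.empty(n, dtype=int)
    for i in range(n):
        order = np.argsort(x[i:i + d * tau:tau])
        syms[i] = int(sum(o * d ** k for k, o in enumerate(order)))  # encode the pattern
    return syms

def entropy_of(labels):
    counts = np.bincount(labels)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def perm_auto_mutual_information(x, lag, d=3, tau=1):
    """Mutual information between the ordinal symbol sequence and its lagged copy."""
    s = ordinal_symbols(np.asarray(x, dtype=float), d, tau)
    a, b = s[:-lag], s[lag:]
    joint = {}
    for pair in zip(a, b):
        joint[pair] = joint.get(pair, 0) + 1
    pj = np.array(list(joint.values()), dtype=float) / len(a)
    return entropy_of(a) + entropy_of(b) + np.sum(pj * np.log(pj))  # H(A)+H(B)-H(A,B)

rng = np.random.default_rng(6)
x = np.sin(np.linspace(0, 30 * np.pi, 3000)) + 0.2 * rng.standard_normal(3000)
print(perm_auto_mutual_information(x, lag=10))   # higher for structured signals
```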

  16. Exact Test of Independence Using Mutual Information

    DTIC Science & Technology

    2014-05-23

    1000 × 0.05 = 50. Importantly, the permutation test, which does not preserve Markov order, resulted in 489 Type I errors! Using... Published in Entropy 2014, 16 (7), 2839-2849; doi:10.3390

  17. Multivariate multiscale entropy of financial markets

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun

    2017-11-01

    In the current process of quantifying the dynamical properties of complex phenomena in the financial market system, multivariate financial time series have attracted wide attention. In this work, considering the shortcomings and limitations of univariate multiscale entropy in analyzing multivariate time series, the multivariate multiscale sample entropy (MMSE), which can evaluate the complexity in multiple data channels over different timescales, is applied to quantify the complexity of financial markets. Its effectiveness and advantages are demonstrated with numerical simulations on two well-known synthetic noise signals. For the first time, the complexity of four generated trivariate return series for each stock trading hour in the Chinese stock markets is quantified thanks to the interdisciplinary application of this method. We find that the complexity of the trivariate return series in each hour shows a significant decreasing trend as stock trading time progresses. Further, the shuffled multivariate return series and the absolute multivariate return series are also analyzed. As another new attempt, the complexity of global stock markets (Asia, Europe and America) is quantified by analyzing the multivariate returns from them. Finally, we utilize the multivariate multiscale entropy to assess the relative complexity of normalized multivariate return volatility series with different degrees.

  18. Bootstrapping on Undirected Binary Networks Via Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Fushing, Hsieh; Chen, Chen; Liu, Shan-Yu; Koehl, Patrice

    2014-09-01

    We propose a new method inspired by statistical mechanics for extracting geometric information from undirected binary networks and generating random networks that conform to this geometry. In this method an undirected binary network is perceived as a thermodynamic system with a collection of permuted adjacency matrices as its states. The task of extracting information from the network is then reformulated as a discrete combinatorial optimization problem of searching for its ground state. To solve this problem, we apply multiple ensembles of temperature regulated Markov chains to establish an ultrametric geometry on the network. This geometry is equipped with a tree hierarchy that captures the multiscale community structure of the network. We translate this geometry into a Parisi adjacency matrix, which has a relatively low energy level and is in the vicinity of the ground state. The Parisi adjacency matrix is then further optimized by making block permutations subject to the ultrametric geometry. The optimal matrix corresponds to the macrostate of the original network. An ensemble of random networks is then generated such that each of these networks conforms to this macrostate; the corresponding algorithm also provides an estimate of the size of this ensemble. By repeating this procedure at different scales of the ultrametric geometry of the network, it is possible to compute its evolution entropy, i.e. to estimate the evolution of its complexity as we move from a coarse to a fine description of its geometric structure. We demonstrate the performance of this method on simulated as well as real data networks.

  19. Perceptual suppression revealed by adaptive multi-scale entropy analysis of local field potential in monkey visual cortex.

    PubMed

    Hu, Meng; Liang, Hualou

    2013-04-01

    Generalized flash suppression (GFS), in which a salient visual stimulus can be rendered invisible despite continuous retinal input, provides a rare opportunity to directly study the neural mechanism of visual perception. Previous work based on linear methods, such as spectral analysis, on local field potential (LFP) during GFS has shown that the LFP power at distinctive frequency bands is differentially modulated by perceptual suppression. Yet, the linear method alone may be insufficient for the full assessment of neural dynamics due to the fundamentally nonlinear nature of neural signals. In this study, we set forth to analyze the LFP data collected from multiple visual areas in V1, V2 and V4 of macaque monkeys while performing the GFS task using a nonlinear method - adaptive multi-scale entropy (AME) - to reveal the neural dynamics of perceptual suppression. In addition, we propose a new cross-entropy measure at multiple scales, namely adaptive multi-scale cross-entropy (AMCE), to assess the nonlinear functional connectivity between two cortical areas. We show that: (1) multi-scale entropy exhibits percept-related changes in all three areas, with higher entropy observed during perceptual suppression; (2) the magnitude of the perception-related entropy changes increases systematically over successive hierarchical stages (i.e. from lower areas V1 to V2, up to higher area V4); and (3) cross-entropy between any two cortical areas reveals a higher degree of asynchrony or dissimilarity during perceptual suppression, indicating a decreased functional connectivity between cortical areas. These results, taken together, suggest that perceptual suppression is related to a reduced functional connectivity and increased uncertainty of neural responses, and the modulation of perceptual suppression is more effective at higher visual cortical areas. AME is demonstrated to be a useful technique in revealing the underlying dynamics of nonlinear/nonstationary neural signals.

  20. Mood states modulate complexity in heartbeat dynamics: A multiscale entropy analysis

    NASA Astrophysics Data System (ADS)

    Valenza, G.; Nardelli, M.; Bertschy, G.; Lanata, A.; Scilingo, E. P.

    2014-07-01

    This paper demonstrates that the complex dynamics of heartbeats are modulated by different pathological mental states. Multiscale entropy analysis was performed on R-R interval series gathered from the electrocardiogram of eight bipolar patients who exhibited mood states among depression, hypomania, and euthymia, i.e., good affective balance. Three different methodologies for the choice of the sample entropy radius value were also compared. We show that the complexity level can be used as a marker of mental states, being able to discriminate among the three pathological mood states, suggesting the use of heartbeat complexity as a more objective clinical biomarker for mental disorders.

  1. Modified cross sample entropy and surrogate data analysis method for financial time series

    NASA Astrophysics Data System (ADS)

    Yin, Yi; Shang, Pengjian

    2015-09-01

    To research multiscale behaviors from the perspective of entropy, we propose a modified cross-sample entropy (MCSE) and combine it with surrogate data analysis in order to compute the entropy differences between the original dynamics and surrogate series (MCSDiff). MCSDiff is applied to simulated signals to show its accuracy and then employed on US and Chinese stock markets. We illustrate the presence of multiscale behavior in the MCSDiff results and reveal that there is synchrony contained in the original financial time series and that they have some intrinsic relations, which are destroyed by surrogate data analysis. Furthermore, the multifractal behaviors of the cross-correlations between these financial time series are investigated by the multifractal detrended cross-correlation analysis (MF-DCCA) method, since multifractal analysis is a multiscale analysis. We explore the multifractal properties of the cross-correlations between these US and Chinese markets and show the distinctiveness of NQCI and HSI among the markets in their own regions. It can be concluded that the weaker cross-correlation between the US markets gives evidence of a better inner mechanism in the US stock markets than in the Chinese stock markets. Studying the multiscale features and properties of financial time series can provide valuable information for understanding the inner mechanism of financial markets.

  2. Resistance Training Exercise Program for Intervention to Enhance Gait Function in Elderly Chronically Ill Patients: Multivariate Multiscale Entropy for Center of Pressure Signal Analysis

    PubMed Central

    Jiang, Bernard C.

    2014-01-01

    Falls are unpredictable accidents, and the resulting injuries can be serious in the elderly, particularly those with chronic diseases. Regular exercise is recommended to prevent and treat hypertension and other chronic diseases by reducing clinical blood pressure. The “complexity index” (CI), based on the multiscale entropy (MSE) algorithm, has been applied in recent studies to show a person's adaptability to intrinsic and external perturbations and is a widely used measure of postural sway or stability. The multivariate multiscale entropy (MMSE) is an advanced algorithm used to calculate the complexity index (CI) values of center of pressure (COP) data. In this study, we applied MSE and MMSE to analyze the gait function of 24 elderly, chronically ill patients (44% female; 56% male; mean age, 67.56 ± 10.70 years) with either cardiovascular disease, diabetes mellitus, or osteoporosis. After a 12-week training program, postural stability measurements showed significant improvements. Our results showed beneficial effects of resistance training, which can be used to improve postural stability in the elderly, and indicated that the MMSE algorithm for calculating the CI of COP data was superior to the MSE algorithm in identifying the sense of balance in the elderly. PMID:25295070

  3. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers.

    PubMed

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  4. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  5. Permutation entropy analysis of financial time series based on Hill's diversity number

    NASA Astrophysics Data System (ADS)

    Zhang, Yali; Shang, Pengjian

    2017-12-01

    In this paper the permutation entropy based on Hill's diversity number (Nn,r) is introduced as a new way to assess the complexity of a complex dynamical system such as the stock market. We test the performance of this method with simulated data. Results show that Nn,r with appropriate parameters is more sensitive to changes in the system and describes the trends of complex systems clearly. In addition, we study stock closing price series from different data sets consisting of six indices (three US stock indices and three Chinese stock indices) during different periods; Nn,r can quantify the changes in complexity of the stock market data. Moreover, we obtain richer information from Nn,r, as well as some properties concerning the differences between the US and Chinese stock indices.
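
    A sketch of the underlying idea, applying the Hill diversity number of order r to the ordinal-pattern distribution, appears below; the exact definition and parameterization of Nn,r in the paper may differ from this generic form, and the test series are illustrative.

```python
import numpy as np
from itertools import permutations

def ordinal_probs(x, d=3, tau=1):
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        counts[tuple(np.argsort(x[i:i + d * tau:tau]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p[p > 0] / p.sum()

def hill_diversity(p, r=2.0):
    """Hill number of order r of a distribution; r -> 1 recovers exp(Shannon entropy)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(r, 1.0):
        return np.exp(-np.sum(p * np.log(p)))
    return np.sum(p ** r) ** (1.0 / (1.0 - r))

rng = np.random.default_rng(7)
noise = rng.standard_normal(5000)                 # stand-in for a volatile index
walk = np.cumsum(rng.standard_normal(5000))       # stand-in for a smoother index
for name, series in [("noise", noise), ("random walk", walk)]:
    print(name, hill_diversity(ordinal_probs(series), r=2.0))
```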

  6. Multiscale entropy-based methods for heart rate variability complexity analysis

    NASA Astrophysics Data System (ADS)

    Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio

    2015-03-01

    Physiologic complexity is an important concept for characterizing time series from biological systems, which, associated with multiscale analysis, can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The complexity metrics are derived from nonadditive entropy supported by the generation of surrogate data, i.e. SDiffqmax, qmax and qzero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the areas under the curves were computed for three physiological situations. Heart rate variability (HRV) time series in normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiffqmax is the most accurate for lower scales, whereas qmax and qzero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in a wider context.

  7. Double symbolic joint entropy in nonlinear dynamic complexity analysis

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-07-01

    Symbolizations, the basis of symbolic dynamic analysis, are classified as global static and local dynamic approaches, which are combined by joint entropy in our work for nonlinear dynamic complexity analysis. Two global static methods, the symbolic transformations of Wessel N. symbolic entropy and base-scale entropy, and two local ones, namely the symbolizations of permutation and differential entropy, constitute four double symbolic joint entropies that achieve accurate complexity detection in chaotic model series (the logistic and Henon maps). In nonlinear dynamical analysis of different kinds of heart rate variability, the heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by the double symbolic joint entropy, among which the combination of base-scale and differential symbolizations gives the best complexity analysis. Test results prove that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
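
    As an illustration only, the sketch below combines one global static symbolization (quartile binning of amplitudes) with one local dynamic symbolization (the sign of successive differences) and computes their joint Shannon entropy; the actual base-scale, Wessel, permutation and differential symbolizations used in the paper differ in their details, and the test signal is a stand-in.

```python
import numpy as np

def global_symbols(x, n_bins=4):
    """Global static symbolization: assign each sample to an amplitude quartile."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def local_symbols(x):
    """Local dynamic symbolization: sign of the first difference (0, 1 or 2)."""
    return np.sign(np.diff(x)).astype(int) + 1

def joint_entropy(a, b):
    """Shannon entropy of the joint distribution of two symbol sequences."""
    n = min(len(a), len(b))
    pairs = {}
    for key in zip(a[:n], b[:n]):
        pairs[key] = pairs.get(key, 0) + 1
    p = np.array(list(pairs.values()), dtype=float) / n
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(8)
x = rng.standard_normal(5000)                    # stand-in for an RR-interval series
print(joint_entropy(global_symbols(x), local_symbols(x)))
```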

  8. Multiscale Persistent Functions for Biomolecular Structure Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Kelin; Li, Zhiming; Mu, Lin

    Here in this paper, we introduce multiscale persistent functions for biomolecular structure characterization. The essential idea is to combine our multiscale rigidity functions (MRFs) with persistent homology analysis, so as to construct a series of multiscale persistent functions, particularly multiscale persistent entropies, for structure characterization. To clarify the fundamental idea of our method, the multiscale persistent entropy (MPE) model is discussed in great detail. Mathematically, unlike the previous persistent entropy (Chintakunta et al. in Pattern Recognit 48(2):391–401, 2015; Merelli et al. in Entropy 17(10):6872–6892, 2015; Rucco et al. in: Proceedings of ECCS 2014, Springer, pp 117–128, 2016), a special resolution parameter is incorporated into our model. Various scales can be achieved by tuning its value. Physically, our MPE can be used in conformational entropy evaluation. More specifically, it is found that our method incorporates in it a natural classification scheme. This is achieved through a density filtration of an MRF built from angular distributions. To further validate our model, a systematical comparison with the traditional entropy evaluation model is done. Additionally, it is found that our model is able to preserve the intrinsic topological features of biomolecular data much better than traditional approaches, particularly for resolutions in the intermediate range. Moreover, by comparing with traditional entropies from various grid sizes, bond angle-based methods and a persistent homology-based support vector machine method (Cang et al. in Mol Based Math Biol 3:140–162, 2015), we find that our MPE method gives the best results in terms of average true positive rate in a classic protein structure classification test. More interestingly, all-alpha and all-beta protein classes can be clearly separated from each other with zero error only in our model. Finally, a special protein structure index (PSI) is proposed, for the first time, to describe the “regularity” of protein structures. Basically, a protein structure is deemed as regular if it has a consistent and orderly configuration. Our PSI model is tested on a database of 110 proteins; we find that structures with larger portions of loops and intrinsically disorder regions are always associated with larger PSI, meaning an irregular configuration, while proteins with larger portions of secondary structures, i.e., alpha-helix or beta-sheet, have smaller PSI. Essentially, PSI can be used to describe the “regularity” information in any systems.

  9. Inferring the Presence of Reverse Proxies Through Timing Analysis

    DTIC Science & Technology

    2015-06-01

    Figure 3.2: The three different instances of timing measurement configurations. Figure 3.3: Permutation of a web request iteration... Their data showed that they could detect at least 6 bits of entropy between unlike devices and that it was enough to determine that they are in fact... depending on the permutation being executed so that every iteration was conducted under the same distance

  10. Refined multiscale fuzzy entropy based on standard deviation for biomedical signal analysis.

    PubMed

    Azami, Hamed; Fernández, Alberto; Escudero, Javier

    2017-11-01

    Multiscale entropy (MSE) has been a prevalent algorithm to quantify the complexity of biomedical time series. Recent developments in the field have tried to alleviate the problem of undefined MSE values for short signals. Moreover, there has been a recent interest in using other statistical moments than the mean, i.e., variance, in the coarse-graining step of the MSE. Building on these trends, here we introduce the so-called refined composite multiscale fuzzy entropy based on the standard deviation (RCMFEσ) and mean (RCMFEμ) to quantify the dynamical properties of spread and mean, respectively, over multiple time scales. We demonstrate the dependency of the RCMFEσ and RCMFEμ, in comparison with other multiscale approaches, on several straightforward signal processing concepts using a set of synthetic signals. The results evidenced that the RCMFEσ and RCMFEμ values are more stable and reliable than the classical multiscale entropy ones. We also inspect the ability of using the standard deviation as well as the mean in the coarse-graining process using magnetoencephalograms in Alzheimer's disease and publicly available electroencephalograms recorded from focal and non-focal areas in epilepsy. Our results indicated that when the RCMFEμ cannot distinguish different types of dynamics of a particular time series at some scale factors, the RCMFEσ may do so, and vice versa. The results showed that RCMFEσ-based features lead to higher classification accuracies in comparison with the RCMFEμ-based ones. We also made freely available all the Matlab codes used in this study at http://dx.doi.org/10.7488/ds/1477.

  11. Multiscale entropy analysis of human gait dynamics

    NASA Astrophysics Data System (ADS)

    Costa, M.; Peng, C.-K.; L. Goldberger, Ary; Hausdorff, Jeffrey M.

    2003-12-01

    We compare the complexity of human gait time series from healthy subjects under different conditions. Using the recently developed multiscale entropy algorithm, which provides a way to measure complexity over a range of scales, we observe that normal spontaneous walking has the highest complexity when compared to slow and fast walking and also to walking paced by a metronome. These findings have implications for modeling locomotor control and for quantifying gait dynamics in physiologic and pathologic states.

  12. The complexity of gene expression dynamics revealed by permutation entropy

    PubMed Central

    2010-01-01

    Background: High complexity is considered a hallmark of living systems. Here we investigate the complexity of temporal gene expression patterns using the concept of Permutation Entropy (PE) first introduced in dynamical systems theory. The analysis of gene expression data has so far focused primarily on the identification of differentially expressed genes, or on the elucidation of pathway and regulatory relationships. We aim to study gene expression time series data from the viewpoint of complexity. Results: Applying the PE complexity metric to abiotic stress response time series data in Arabidopsis thaliana, genes involved in stress response and signaling were found to be associated with the highest complexity not only under stress, but surprisingly, also under reference, non-stress conditions. Genes with house-keeping functions exhibited lower PE complexity. Compared to reference conditions, the PE of temporal gene expression patterns generally increased upon stress exposure. High-complexity genes were found to have longer upstream intergenic regions and more cis-regulatory motifs in their promoter regions indicative of a more complex regulatory apparatus needed to orchestrate their expression, and to be associated with higher correlation network connectivity degree. Arabidopsis genes also present in other plant species were observed to exhibit decreased PE complexity compared to Arabidopsis specific genes. Conclusions: We show that Permutation Entropy is a simple yet robust and powerful approach to identify temporal gene expression profiles of varying complexity that is equally applicable to other types of molecular profile data. PMID:21176199

  13. Effects of propofol, sevoflurane, remifentanil, and (S)-ketamine in subanesthetic concentrations on visceral and somatosensory pain-evoked potentials.

    PubMed

    Untergehrer, Gisela; Jordan, Denis; Eyl, Sebastian; Schneider, Gerhard

    2013-02-01

    Although electroencephalographic parameters and auditory evoked potentials (AEP) reflect the hypnotic component of anesthesia, there is currently no specific and mechanism-based monitoring tool for anesthesia-induced blockade of nociceptive inputs. The aim of this study was to assess visceral pain-evoked potentials (VPEP) and contact heat-evoked potentials (CHEP) as electroencephalographic indicators of drug-induced changes of visceral and somatosensory pain. Additionally, AEP and electroencephalographic permutation entropy were used to evaluate sedative components of the applied drugs. In a study enrolling 60 volunteers, VPEP, CHEP (amplitude N2-P1), and AEP (latency Nb, amplitude Pa-Nb) were recorded without drug application and at two subanesthetic concentration levels of propofol, sevoflurane, remifentanil, or (s)-ketamine. Drug-induced changes of evoked potentials were analyzed. VPEP were generated by electric stimuli using bipolar electrodes positioned in the distal esophagus. For CHEP, heat pulses were given to the medial aspect of the right forearm using a CHEP stimulator. In addition to AEP, electroencephalographic permutation entropy was used to indicate level of sedation. With increasing concentrations of propofol, sevoflurane, remifentanil, and (s)-ketamine, VPEP and CHEP N2-P1 amplitudes decreased. AEP and electroencephalographic permutation entropy showed neither clinically relevant nor statistically significant suppression of cortical activity during drug application. Decreasing VPEP and CHEP amplitudes under subanesthetic concentrations of propofol, sevoflurane, remifentanil, and (s)-ketamine indicate suppressive drug effects. These effects seem to be specific for analgesia.

  14. Comparing Postural Stability Entropy Analyses to Differentiate Fallers and Non-Fallers

    PubMed Central

    Fino, Peter C.; Mojdehi, Ahmad R.; Adjerid, Khaled; Habibi, Mohammad; Lockhart, Thurmon E.; Ross, Shane D.

    2015-01-01

    The health and financial cost of falls has spurred research to differentiate the characteristics of fallers and non-fallers. Postural stability has received much of the attention with recent studies exploring various measures of entropy. This study compared the discriminatory ability of several entropy methods at differentiating two paradigms in the center-of-pressure (COP) of elderly individuals: 1.) eyes open (EO) versus eyes closed (EC) and 2.) fallers (F) versus non-fallers (NF). Methods were compared using the area under the curve (AUC) of the receiver-operating characteristic (ROC) curves developed from logistic regression models. Overall, multiscale entropy (MSE) and composite multiscale entropy (CompMSE) performed the best with AUCs of 0.71 for EO/EC and 0.77 for F/NF. When methods were combined together to maximize the AUC, the entropy classifier had an AUC of 0.91 for the F/NF comparison. These results suggest researchers and clinicians attempting to create clinical tests to identify fallers should consider a combination of every entropy method when creating a classifying test. Additionally, MSE and CompMSE classifiers using polar coordinate data outperformed rectangular coordinate data, encouraging more research into the most appropriate time series for postural stability entropy analysis. PMID:26464267

  15. Comparing Postural Stability Entropy Analyses to Differentiate Fallers and Non-fallers.

    PubMed

    Fino, Peter C; Mojdehi, Ahmad R; Adjerid, Khaled; Habibi, Mohammad; Lockhart, Thurmon E; Ross, Shane D

    2016-05-01

    The health and financial cost of falls has spurred research to differentiate the characteristics of fallers and non-fallers. Postural stability has received much of the attention with recent studies exploring various measures of entropy. This study compared the discriminatory ability of several entropy methods at differentiating two paradigms in the center-of-pressure of elderly individuals: (1) eyes open (EO) vs. eyes closed (EC) and (2) fallers (F) vs. non-fallers (NF). Methods were compared using the area under the curve (AUC) of the receiver-operating characteristic curves developed from logistic regression models. Overall, multiscale entropy (MSE) and composite multiscale entropy (CompMSE) performed the best with AUCs of 0.71 for EO/EC and 0.77 for F/NF. When methods were combined together to maximize the AUC, the entropy classifier had an AUC of 0.91 for the F/NF comparison. These results suggest researchers and clinicians attempting to create clinical tests to identify fallers should consider a combination of every entropy method when creating a classifying test. Additionally, MSE and CompMSE classifiers using polar coordinate data outperformed rectangular coordinate data, encouraging more research into the most appropriate time series for postural stability entropy analysis.

  16. Relative entropy as a universal metric for multiscale errors

    NASA Astrophysics Data System (ADS)

    Chaimovich, Aviel; Shell, M. Scott

    2010-06-01

    We show that the relative entropy, Srel, suggests a fundamental indicator of the success of multiscale studies, in which coarse-grained (CG) models are linked to first-principles (FP) ones. We demonstrate that Srel inherently measures fluctuations in the differences between CG and FP potential energy landscapes, and develop a theory that tightly and generally links it to errors associated with coarse graining. We consider two simple case studies substantiating these results, and suggest that Srel has important ramifications for evaluating and designing coarse-grained models.
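
    In discrete form, a relative entropy between the configurational distributions sampled by the first-principles and coarse-grained models reduces to a Kullback-Leibler divergence. The following is a minimal sketch over histogrammed samples of a single coordinate, ignoring the mapping and multidimensional terms that appear in the full coarse-graining formalism; the pseudocount and toy distributions are illustrative.

```python
import numpy as np

def relative_entropy(samples_fp, samples_cg, bins=50):
    """Discrete KL divergence D(P_FP || P_CG) estimated from histograms over a
    common set of bins; a small pseudocount avoids division by zero."""
    lo = min(samples_fp.min(), samples_cg.min())
    hi = max(samples_fp.max(), samples_cg.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(samples_fp, bins=edges)
    q, _ = np.histogram(samples_cg, bins=edges)
    p = (p + 1e-12) / (p.sum() + bins * 1e-12)
    q = (q + 1e-12) / (q.sum() + bins * 1e-12)
    return np.sum(p * np.log(p / q))

rng = np.random.default_rng(9)
fp = rng.normal(0.0, 1.0, 100_000)     # stand-in for first-principles samples
cg = rng.normal(0.1, 1.2, 100_000)     # stand-in for coarse-grained samples
print(relative_entropy(fp, cg))        # larger values signal a worse CG model
```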

  17. Relative entropy as a universal metric for multiscale errors.

    PubMed

    Chaimovich, Aviel; Shell, M Scott

    2010-06-01

    We show that the relative entropy, Srel, suggests a fundamental indicator of the success of multiscale studies, in which coarse-grained (CG) models are linked to first-principles (FP) ones. We demonstrate that Srel inherently measures fluctuations in the differences between CG and FP potential energy landscapes, and develop a theory that tightly and generally links it to errors associated with coarse graining. We consider two simple case studies substantiating these results, and suggest that Srel has important ramifications for evaluating and designing coarse-grained models.

  18. Correlations of multiscale entropy in the FX market

    NASA Astrophysics Data System (ADS)

    Stosic, Darko; Stosic, Dusan; Ludermir, Teresa; Stosic, Tatijana

    2016-09-01

    The regularity of price fluctuations in exchange rates plays a crucial role in FX market dynamics. Distinct variations in regularity arise from economic, social and political events, such as interday trading and financial crisis. This paper applies a multiscale time-dependent entropy method on thirty-three exchange rates to analyze price fluctuations in the FX. Correlation matrices of entropy values, termed entropic correlations, are in turn used to describe global behavior of the market. Empirical results suggest a weakly correlated market with pronounced collective behavior at bi-weekly trends. Correlations arise from cycles of low and high regularity in long-term trends. Eigenvalues of the correlation matrix also indicate a dominant European market, followed by shifting American, Asian, African, and Pacific influences. As a result, we find that entropy is a powerful tool for extracting important information from the FX market.

  19. A scale-entropy diffusion equation to describe the multi-scale features of turbulent flames near a wall

    NASA Astrophysics Data System (ADS)

    Queiros-Conde, D.; Foucher, F.; Mounaïm-Rousselle, C.; Kassem, H.; Feidt, M.

    2008-12-01

    Multi-scale features of turbulent flames near a wall display two kinds of scale-dependent fractal behavior. In scale-space, a unique fractal dimension cannot be defined, and the fractal dimension of the front is scale-dependent. Moreover, when the front approaches the wall, this dependency changes: the fractal dimension also depends on the wall distance. Our aim here is to propose a general geometrical framework that provides the possibility to integrate these two cases, in order to describe the multi-scale structure of turbulent flames interacting with a wall. Based on the scale-entropy quantity, which is simply linked to the roughness of the front, we thus introduce a general scale-entropy diffusion equation. We define the notion of “scale-evolutivity”, which characterises the deviation of a multi-scale system from pure fractal behaviour. The specific case of a constant “scale-evolutivity” over the scale range is studied. In this case, called “parabolic scaling”, the fractal dimension is a linear function of the logarithm of scale. The case of a constant scale-evolutivity in wall-distance space implies that the fractal dimension depends linearly on the logarithm of the wall distance. We then verified experimentally that parabolic scaling represents a good approximation of the real multi-scale features of turbulent flames near a wall.

  20. Radiomics Evaluation of Histological Heterogeneity Using Multiscale Textures Derived From 3D Wavelet Transformation of Multispectral Images.

    PubMed

    Chaddad, Ahmad; Daniel, Paul; Niazi, Tamim

    2018-01-01

    Colorectal cancer (CRC) is markedly heterogeneous and develops progressively toward malignancy through several stages which include stroma (ST), benign hyperplasia (BH), intraepithelial neoplasia (IN) or precursor cancerous lesion, and carcinoma (CA). Identification of the malignancy stage of CRC pathology tissues (PT) allows the most appropriate therapeutic intervention. This study investigates multiscale texture features extracted from CRC pathology sections using a 3D wavelet transform (3D-WT) filter. Multiscale features were extracted from digital whole slide images of 39 patients that were segmented in a pre-processing step using an active contour model. The capacity for multiscale texture to compare and classify between PTs was investigated using the ANOVA significance test and random forest classifier models, respectively. Twelve significant features derived from the multiscale texture (i.e., variance, entropy, and energy) were found to discriminate between CRC grades at a significance value of p < 0.01 after correction. Combining multiscale texture features led to a better predictive capacity compared to prediction models based on individual scale features, with an average (±SD) classification accuracy of 93.33 (±3.52)%, sensitivity of 88.33 (±4.12)%, and specificity of 96.89 (±3.88)%. Entropy was found to be the best classifier feature across all the PT grades, with area under the curve (AUC) values of 91.17%, 94.21%, 97.70%, and 100% for ST, BH, IN, and CA, respectively. Our results suggest that multiscale texture features based on 3D-WT are sensitive enough to discriminate between CRC grades, with the entropy feature the best predictor of pathology grade.

  1. Regional Value Analysis at Threat Evaluation

    DTIC Science & Technology

    2014-06-01

    targets based on information entropy and fuzzy optimization theory. in Industrial Engineering and Engineering Management (IEEM), 2011 IEEE...Assignment by Virtual Permutation and Tabu Search Heuristics. Systems, Man, and Cybernetics, Part C: Applications and Reviews, IEEE Transactions on, 2010

  2. Permutation entropy with vector embedding delays

    NASA Astrophysics Data System (ADS)

    Little, Douglas J.; Kane, Deb M.

    2017-12-01

    Permutation entropy (PE) is a statistic used widely for the detection of structure within a time series. Embedding delay times at which the PE is reduced are characteristic timescales for which such structure exists. Here, a generalized scheme is investigated where embedding delays are represented by vectors rather than scalars, permitting PE to be calculated over a (D-1)-dimensional space, where D is the embedding dimension. This scheme is applied to numerically generated noise, sine wave and logistic map series, and experimental data sets taken from a vertical-cavity surface emitting laser exhibiting temporally localized pulse structures within the round-trip time of the laser cavity. Results are visualized as PE maps as a function of embedding delay, with low PE values indicating combinations of embedding delays where correlation structure is present. It is demonstrated that vector embedding delays enable identification of structure that is ambiguous or masked, when the embedding delay is constrained to scalar form.
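
    For reference, the quantity being generalized in this record is the ordinary permutation entropy computed with a single scalar embedding delay. A minimal Python sketch of that baseline follows; the vector-delay scheme of the paper would replace the single delay parameter with one delay per coordinate of the embedding vector. The function name, normalization and defaults are illustrative choices, not taken from the paper.

        import numpy as np
        from math import factorial

        def permutation_entropy(x, dim=3, delay=1, normalize=True):
            """Shannon entropy of ordinal patterns of length `dim` sampled with lag `delay`."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (dim - 1) * delay
            if n <= 0:
                raise ValueError("time series too short for this dim/delay")
            # Each row of `emb` is one delay vector x[t], x[t+delay], ..., x[t+(dim-1)*delay].
            emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
            # The ordinal pattern of a vector is the permutation that sorts it.
            patterns = np.argsort(emb, axis=1, kind="stable")
            _, counts = np.unique(patterns, axis=0, return_counts=True)
            p = counts / counts.sum()
            h = -np.sum(p * np.log(p))
            return h / np.log(factorial(dim)) if normalize else h

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            print(permutation_entropy(rng.normal(size=5000)))            # near 1 for white noise
            print(permutation_entropy(np.sin(0.05 * np.arange(5000))))   # well below 1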

  3. Spatiotemporal Permutation Entropy as a Measure for Complexity of Cardiac Arrhythmia

    NASA Astrophysics Data System (ADS)

    Schlemmer, Alexander; Berg, Sebastian; Lilienkamp, Thomas; Luther, Stefan; Parlitz, Ulrich

    2018-05-01

    Permutation entropy (PE) is a robust quantity for measuring the complexity of time series. In the cardiac community it is predominantly used in the context of electrocardiogram (ECG) signal analysis for diagnoses and predictions, with a major application found in heart rate variability parameters. In this article we combine spatial and temporal PE to form a spatiotemporal PE that captures both the complexity of spatial structures and temporal complexity at the same time. We demonstrate that the spatiotemporal PE (STPE) quantifies complexity using two datasets from simulated cardiac arrhythmia and compare it to phase singularity analysis and spatial PE (SPE). These datasets simulate ventricular fibrillation (VF) on a two-dimensional and a three-dimensional medium using the Fenton-Karma model. We show that SPE and STPE are robust against noise and demonstrate their usefulness for extracting complexity features at different spatial scales.

  4. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    DTIC Science & Technology

    2014-07-01

    to the query graph, or subgraph permutations with the same mismatch cost (often the case for homogeneous and/or symmetrical data/query). To avoid...decisions are generated in a bottom-up manner using the metric of entropy at the cluster level (Figure 9c). Using the definition of belief messages...for a cluster and a set of data nodes in this cluster , we compute the entropy for forward and backward messages as (,) = −∑ (

  5. Amplitude-aware permutation entropy: Illustration in spike detection and signal segmentation.

    PubMed

    Azami, Hamed; Escudero, Javier

    2016-05-01

    Signal segmentation and spike detection are two important biomedical signal processing applications. Often, non-stationary signals must be segmented into piece-wise stationary epochs or spikes need to be found among a background of noise before being further analyzed. Permutation entropy (PE) has been proposed to evaluate the irregularity of a time series. PE is conceptually simple, structurally robust to artifacts, and computationally fast. It has been extensively used in many applications, but it has two key shortcomings. First, when a signal is symbolized using the Bandt-Pompe procedure, only the order of the amplitude values is considered and information regarding the amplitudes is discarded. Second, in the PE, the effect of equal amplitude values in each embedded vector is not addressed. To address these issues, we propose a new entropy measure based on PE: the amplitude-aware permutation entropy (AAPE). AAPE is sensitive to changes in the amplitude, in addition to the frequency, of the signals thanks to being more flexible than the classical PE in the quantification of the signal motifs. To demonstrate how the AAPE method can enhance the quality of the signal segmentation and spike detection, a set of synthetic and realistic synthetic neuronal signals, electroencephalograms and neuronal data are processed. We compare the performance of AAPE in these problems against state-of-the-art approaches and evaluate the significance of the differences with a repeated ANOVA with post hoc Tukey's test. In signal segmentation, the accuracy of the AAPE-based method is higher than that of conventional segmentation methods. AAPE also leads to more robust results in the presence of noise. The spike detection results show that AAPE can detect spikes well, even when presented with single-sample spikes, unlike PE. For multi-sample spikes, the changes in AAPE are larger than in PE. We introduce a new entropy metric, AAPE, that enables us to consider amplitude information in the formulation of PE. The AAPE algorithm can be used in almost every irregularity-based application in various signal and image processing fields. We have also made the Matlab code of the AAPE freely available. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
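
    The modification described above replaces the unit count of each ordinal pattern with an amplitude-dependent weight. The sketch below illustrates that idea under the assumption that the weight mixes the mean absolute amplitude and the mean absolute successive difference of each embedded vector through a coefficient A in [0, 1]; the parameter names and details are ours and may differ from the published algorithm or the authors' Matlab code.

        import numpy as np
        from math import factorial

        def aape(x, dim=3, delay=1, A=0.5):
            """Illustrative amplitude-aware permutation entropy (normalized to [0, 1])."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (dim - 1) * delay
            emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
            weights = {}
            for v in emb:
                pattern = tuple(np.argsort(v, kind="stable"))
                # Each pattern is weighted by a blend of mean |amplitude| and mean |difference|.
                w = (A / dim) * np.sum(np.abs(v)) \
                    + ((1 - A) / (dim - 1)) * np.sum(np.abs(np.diff(v)))
                weights[pattern] = weights.get(pattern, 0.0) + w
            p = np.array(list(weights.values()))
            p = p / p.sum()
            return -np.sum(p * np.log(p)) / np.log(factorial(dim))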

  6. Extending Differential Fault Analysis to Dynamic S-Box Advanced Encryption Standard Implementations

    DTIC Science & Technology

    2014-09-18

    entropy . At the same time, researchers strive to enhance AES and mitigate these growing threats. This paper researches the extension of existing...the algorithm or use side channels to reduce entropy , such as Differential Fault Analysis (DFA). At the same time, continuing research strives to...the state matrix. The S-box is an 8-bit 16x16 table built from an affine transformation on multiplicative inverses which guarantees full permutation (S

  7. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize their patterns through the dissimilarity matrix based on modified cross-sample entropy, then three-dimensional perceptual maps of the results are provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, that is, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparisons. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to have similar irregularity than others, and that the difference in the stock index, which is caused by the country or region and the different financial policies, can reflect the irregularity in the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively, that is, Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.

  8. Complexity of intracranial pressure correlates with outcome after traumatic brain injury

    PubMed Central

    Lu, Cheng-Wei; Czosnyka, Marek; Shieh, Jiann-Shing; Smielewska, Anna; Pickard, John D.

    2012-01-01

    This study applied multiscale entropy analysis to investigate the correlation between the complexity of intracranial pressure waveform and outcome after traumatic brain injury. Intracranial pressure and arterial blood pressure waveforms were low-pass filtered to remove the respiratory and pulse components and then processed using a multiscale entropy algorithm to produce a complexity index. We identified significant differences across groups classified by the Glasgow Outcome Scale in intracranial pressure, pressure-reactivity index and complexity index of intracranial pressure (P < 0.0001; P = 0.001; P < 0.0001, respectively). Outcome was dichotomized as survival/death and also as favourable/unfavourable. The complexity index of intracranial pressure achieved the strongest statistical significance (F = 28.7; P < 0.0001 and F = 17.21; P < 0.0001, respectively) and was identified as a significant independent predictor of mortality and favourable outcome in a multivariable logistic regression model (P < 0.0001). The results of this study suggest that complexity of intracranial pressure assessed by multiscale entropy was significantly associated with outcome in patients with brain injury. PMID:22734128

  9. A permutation information theory tour through different interest rate maturities: the Libor case.

    PubMed

    Bariviera, Aurelio Fernández; Guercio, María Belén; Martinez, Lisana B; Rosso, Osvaldo A

    2015-12-13

    This paper analyses Libor interest rates for seven different maturities and referred to operations in British pounds, euros, Swiss francs and Japanese yen, during the period 2001-2015. The analysis is performed by means of two quantifiers derived from information theory: the permutation Shannon entropy and the permutation Fisher information measure. An anomalous behaviour in the Libor is detected in all currencies except euros during the years 2006-2012. The stochastic switch is more severe in one, two and three months maturities. Given the special mechanism of Libor setting, we conjecture that the behaviour could have been produced by the manipulation that was uncovered by financial authorities. We argue that our methodology is pertinent as a market overseeing instrument. © 2015 The Author(s).

  10. Time irreversibility of financial time series based on higher moments and multiscale Kullback-Leibler divergence

    NASA Astrophysics Data System (ADS)

    Li, Jinyang; Shang, Pengjian

    2018-07-01

    Irreversibility is an important property of time series. In this paper, we propose the higher moments and multiscale Kullback-Leibler divergence to analyze time series irreversibility. The higher moments Kullback-Leibler divergence (HMKLD) can amplify irreversibility and make the irreversibility variation more obvious. Therefore, many time series whose irreversibility is hard to detect are also able to show the variations. We employ simulated data and financial stock data to test and verify this method, and find that the HMKLD of stock data grows in the form of fluctuations. As for the multiscale Kullback-Leibler divergence (MKLD), the dynamics involved are very complex, so that exploring the behaviour of the simulated and stock systems is difficult. In the conventional multiscale entropy method the coarse-graining process is non-overlapping; here, however, we apply a different coarse-graining process and obtain a surprising discovery. The result shows that when the scales are 4 and 5, their entropy is nearly similar, which demonstrates that MKLD is effective in displaying the characteristics of time series irreversibility.
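
    The contrast drawn above between the conventional non-overlapping coarse-graining and an overlapping alternative is easy to state in code. The generic sketch below shows only the two decomposition steps; it is not tied to the HMKLD or MKLD computation of the paper.

        import numpy as np

        def coarse_grain(x, scale):
            """Non-overlapping coarse-graining: mean of consecutive, disjoint windows of length `scale`."""
            x = np.asarray(x, dtype=float)
            n = (len(x) // scale) * scale
            return x[:n].reshape(-1, scale).mean(axis=1)

        def moving_average_grain(x, scale):
            """Overlapping coarse-graining: moving average of window `scale`, length N - scale + 1."""
            x = np.asarray(x, dtype=float)
            return np.convolve(x, np.ones(scale) / scale, mode="valid")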

  11. Measuring the uncertainty of coupling

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.

  12. Cipher image damage and decisions in real time

    NASA Astrophysics Data System (ADS)

    Silva-García, Victor Manuel; Flores-Carapia, Rolando; Rentería-Márquez, Carlos; Luna-Benoso, Benjamín; Jiménez-Vázquez, Cesar Antonio; González-Ramírez, Marlon David

    2015-01-01

    This paper proposes a method for constructing permutations on m position arrangements. Our objective is to encrypt color images using the advanced encryption standard (AES) with variable permutations, meaning a different permutation for each 128-bit block in the first round after the x-or operation is applied. Furthermore, this research offers the possibility of recovering the original image when the encrypted figure has suffered a failure, whether from an attack or not. This is achieved by permuting the original image pixel positions before they are encrypted with AES variable permutations, which means building a pseudorandom permutation on arrays of 250,000 positions or more. To this end, an algorithm that defines a bijective function between the nonnegative integer and permutation sets is built. From this algorithm, the way to build permutations on the 0,1,…,m-1 array, knowing m-1 constants, is presented. The transcendental numbers are used to select these m-1 constants in a pseudorandom way. The quality of the proposed encryption is evaluated according to the following criteria: the correlation coefficient, the entropy, and the discrete Fourier transform. A goodness-of-fit test for each basic color image is proposed to measure the degree of randomness of the bits of the encrypted figure. On the other hand, cipher images are obtained in a loss-less encryption way, i.e., no JPEG file formats are used.
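
    The abstract mentions an algorithm defining a bijection between nonnegative integers and permutations. One standard construction of such a bijection is the factorial number system (Lehmer code), sketched below; it is offered as background and is not claimed to be the authors' exact construction or their use of transcendental numbers for selecting constants.

        def int_to_permutation(k, m):
            """Map an integer k in [0, m!-1] to the k-th lexicographic permutation of range(m)."""
            digits = []
            for radix in range(1, m + 1):   # factorial-base digits, least significant first
                digits.append(k % radix)
                k //= radix
            items = list(range(m))
            return [items.pop(d) for d in reversed(digits)]

        # Example: the permutation of 10 positions with index 999,999 (0-based).
        print(int_to_permutation(999_999, 10))   # [2, 7, 8, 3, 9, 1, 5, 4, 6, 0]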

  13. Value Focused Thinking Applications to Supervised Pattern Classification With Extensions to Hyperspectral Anomaly Detection Algorithms

    DTIC Science & Technology

    2015-03-26

    performing. All reasonable permutations of factors will be used to develop a multitude of unique combinations. These combinations are considered different...are seen below (Duda et al., 2001). Entropy impurity: i(N) = −Σ_j P(ω_j) log2 P(ω_j) (9) Gini impurity: i(N) = Σ_{i≠j} P(ω_i)P(ω_j) = (1/2)[1 − Σ_j P(ω_j)²]...proportion of one class to another approaches 0.5, the impurity measure reaches its maximum, which for Entropy is 1.0, while it is 0.5 for Gini and

  14. Exploring total cardiac variability in healthy and pathophysiological subjects using improved refined multiscale entropy.

    PubMed

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2017-02-01

    Multiscale entropy (MSE) and refined multiscale entropy (RMSE) techniques are being widely used to evaluate the complexity of a time series across multiple time scales 't'. Both these techniques, at certain time scales (sometimes for the entire range of time scales, in the case of RMSE), assign higher entropy to the HRV time series of certain pathologies than to those of healthy subjects, and to their corresponding randomized surrogate time series. This incorrect assessment of signal complexity may be due to the fact that these techniques suffer from the following limitations: (1) the threshold value 'r' is updated as a function of the long-term standard deviation and is hence unable to explore the short-term variability as well as the substantial variability inherited in beat-to-beat fluctuations of long-term HRV time series. (2) In RMSE, entropy values assigned to different filtered scaled time series are the result of changes in variance, but do not completely reflect the real structural organization inherited in the original time series. In the present work, we propose an improved RMSE (I-RMSE) technique by introducing a new procedure to set the threshold value, taking into account the period-to-period variability inherited in a signal, and evaluate it on simulated and real HRV databases. The proposed I-RMSE assigns higher entropy to the age-matched healthy subjects than to patients suffering from atrial fibrillation, congestive heart failure, sudden cardiac death and diabetes mellitus, for the entire range of time scales. The results strongly support the reduction in complexity of HRV time series in the female group, the old-aged, patients suffering from severe cardiovascular and non-cardiovascular diseases, and in their corresponding surrogate time series.

  15. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement

    PubMed Central

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-01-01

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time phase difference of image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L0 gradient minimization model, and the non-redundant information is processed by difference calculation and expanding non-redundant layers and the redundant layer by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptive-weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average gain in entropy of up to 0.42 dB for an up-scaling of 2 and a significant promotion gain in enhancement measure evaluation for an up-scaling of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual and accuracy improvements. PMID:29414893

  16. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement.

    PubMed

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-02-07

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time phase difference of image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L₀ gradient minimization model, and the non-redundant information is processed by difference calculation and expanding non-redundant layers and the redundant layer by the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptive-weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average gain in entropy of up to 0.42 dB for an up-scaling of 2 and a significant promotion gain in enhancement measure evaluation for an up-scaling of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual and accuracy improvements.

  17. Multiscale entropy analysis of heart rate variability in heart failure, hypertensive, and sinoaortic-denervated rats: classical and refined approaches.

    PubMed

    Silva, Luiz Eduardo Virgilio; Lataro, Renata Maria; Castania, Jaci Airton; da Silva, Carlos Alberto Aguiar; Valencia, Jose Fernando; Murta, Luiz Otavio; Salgado, Helio Cesar; Fazan, Rubens; Porta, Alberto

    2016-07-01

    The analysis of heart rate variability (HRV) by nonlinear methods has been gaining increasing interest due to their ability to quantify the complexity of cardiovascular regulation. In this study, multiscale entropy (MSE) and refined MSE (RMSE) were applied to track the complexity of HRV as a function of time scale in three pathological conscious animal models: rats with heart failure (HF), spontaneously hypertensive rats (SHR), and rats with sinoaortic denervation (SAD). Results showed that HF did not change HRV complexity, although there was a tendency to decrease the entropy in HF animals. On the other hand, SHR group was characterized by reduced complexity at long time scales, whereas SAD animals exhibited a smaller short- and long-term irregularity. We propose that short time scales (1 to 4), accounting for fast oscillations, are more related to vagal and respiratory control, whereas long time scales (5 to 20), accounting for slow oscillations, are more related to sympathetic control. The increased sympathetic modulation is probably the main reason for the lower entropy observed at high scales for both SHR and SAD groups, acting as a negative factor for the cardiovascular complexity. This study highlights the contribution of the multiscale complexity analysis of HRV for understanding the physiological mechanisms involved in cardiovascular regulation. Copyright © 2016 the American Physiological Society.

  18. Antipsychotics reverse abnormal EEG complexity in drug-naïve schizophrenia: A multiscale entropy analysis

    PubMed Central

    Takahashi, Tetsuya; Cho, Raymond Y.; Mizuno, Tomoyuki; Kikuchi, Mitsuru; Murata, Tetsuhito; Takahashi, Koichi; Wada, Yuji

    2010-01-01

    Multiscale entropy (MSE) analysis is a novel entropy-based approach for measuring dynamical complexity in physiological systems over a range of temporal scales. To evaluate this analytic approach as an aid to elucidating the pathophysiologic mechanisms in schizophrenia, we examined MSE in EEG activity in drug-naïve schizophrenia subjects pre- and post-treatment with antipsychotics in comparison with traditional EEG analysis. We recorded eyes-closed resting state EEG from frontal, temporal, parietal and occipital regions in 22 drug-naïve schizophrenia and 24 age-matched healthy control subjects. Fifteen patients were re-evaluated within 2–8 weeks after the initiation of antipsychotic treatment. For each participant, MSE was calculated on one continuous 60 second epoch for each experimental session. Schizophrenia subjects showed significantly higher complexity at higher time scales (lower frequencies) than that of healthy controls in fronto-centro-temporal, but not in parieto-occipital regions. Post-treatment, this higher complexity decreased to healthy control subject levels selectively in fronto-central regions, while the increased complexity in temporal sites remained higher. Comparative power analysis identified spectral slowing in frontal regions in pre-treatment schizophrenia subjects, consistent with previous findings, whereas no antipsychotic treatment effect was observed. In summary, multiscale entropy measures identified abnormal dynamical EEG signal complexity in anterior brain areas in schizophrenia that normalized selectively in fronto-central areas with antipsychotic treatment. These findings show that entropy-based analytic methods may serve as a novel approach for characterizing and understanding abnormal cortical dynamics in schizophrenia, and elucidating the therapeutic mechanisms of antipsychotics. PMID:20149880

  19. Using multi-scale entropy and principal component analysis to monitor gears degradation via the motor current signature analysis

    NASA Astrophysics Data System (ADS)

    Aouabdi, Salim; Taibi, Mahmoud; Bouras, Slimane; Boutasseta, Nadir

    2017-06-01

    This paper describes an approach for identifying localized gear tooth defects, such as pitting, using phase currents measured from an induction machine driving the gearbox. A new anomaly-detection tool based on the multi-scale entropy (MSE) algorithm with SampEn is introduced, which allows correlations in signals to be identified over multiple time scales. The approach combines motor current signature analysis (MCSA) with principal component analysis (PCA) and the comparison of observed values with those predicted from a model built using nominally healthy data. The simulation results show that the proposed method is able to detect gear tooth pitting in current signals.

  20. A hybrid fault diagnosis approach based on mixed-domain state features for rotating machinery.

    PubMed

    Xue, Xiaoming; Zhou, Jianzhong

    2017-01-01

    To make further improvements in diagnosis accuracy and efficiency, a hybrid fault diagnosis approach based on mixed-domain state features, which systematically blends statistical analysis and artificial intelligence techniques, is proposed in this work for rolling element bearings. To simplify the fault diagnosis problem, the execution of the proposed method is divided into three steps, i.e., fault preliminary detection, fault type recognition and fault degree identification. In the first step, a preliminary judgment about the health status of the equipment can be made by the statistical analysis method based on permutation entropy theory. If a fault exists, the following two processes based on the artificial intelligence approach are performed to further recognize the fault type and then identify the fault degree. For the two subsequent steps, mixed-domain state features containing time-domain, frequency-domain and multi-scale features are extracted to represent the fault peculiarity under different working conditions. As a powerful time-frequency analysis method, the fast EEMD method was employed to obtain the multi-scale features. Furthermore, due to the information redundancy and the submergence of the original feature space, a novel manifold learning method (modified LGPCA) is introduced to realize low-dimensional representations of the high-dimensional feature space. Finally, two cases with 12 working conditions each have been employed to evaluate the performance of the proposed method, where vibration signals were measured from an experimental bench of rolling element bearings. The analysis results showed the effectiveness and superiority of the proposed method, whose diagnostic approach is more suitable for practical application. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Development of multiscale complexity and multifractality of fetal heart rate variability.

    PubMed

    Gierałtowski, Jan; Hoyer, Dirk; Tetschke, Florian; Nowack, Samuel; Schneider, Uwe; Zebrowski, Jan

    2013-11-01

    During fetal development a complex system grows and coordination over multiple time scales is formed towards an integrated behavior of the organism. Since essential cardiovascular and associated coordination is mediated by the autonomic nervous system (ANS) and the ANS activity is reflected in recordable heart rate patterns, multiscale heart rate analysis is a tool predestined for the diagnosis of prenatal maturation. Analysis over multiple time scales requires sufficiently long data sets, while the recordings of fetal heart rate, as well as the behavioral states studied, are themselves short. Care must be taken that the analysis methods used are appropriate for short data lengths. We investigated multiscale entropy and multifractal scaling exponents from 30 minute recordings of 27 normal fetuses, aged between 23 and 38 weeks of gestational age (WGA), during the quiet state. In multiscale entropy, we found complexity lower than that of non-correlated white noise over all 20 coarse-graining time scales investigated. The significant maturation-age-related complexity increase was most strongly expressed at scale 2, both using sample entropy and generalized mutual information as complexity estimates. Multiscale multifractal analysis (MMA), in which the Hurst surface h(q,s) is calculated, where q is the multifractal parameter and s is the scale, was applied to the fetal heart rate data. MMA is a method derived from detrended fluctuation analysis (DFA). We modified the base algorithm of MMA to be applicable to short time series analysis using overlapping data windows and a reduction of the scale range. We looked for such q and s for which the Hurst exponent h(q,s) is most correlated with gestational age. We used this value of the Hurst exponent to predict the gestational age based only on fetal heart rate variability properties. Comparison with the true age of the fetus gave satisfying results (error 2.17±3.29 weeks; p<0.001; R²=0.52). In addition, we found that the normally used DFA scale range is non-optimal for fetal age evaluation. We conclude that 30 min recordings are appropriate and sufficient for assessing fetal age by multiscale entropy and multiscale multifractal analysis. The predominant prognostic role of scale 2 heart beats for MSE and scale 39 heart beats (at q=-0.7) for MMA can be explored neither by single-scale complexity measures nor by standard detrended fluctuation analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Phase Transitions in Definite Total Spin States of Two-Component Fermi Gases.

    PubMed

    Yurovsky, Vladimir A

    2017-05-19

    Second-order phase transitions have no latent heat and are characterized by a change in symmetry. In addition to the conventional symmetric and antisymmetric states under permutations of bosons and fermions, mathematical group-representation theory allows for non-Abelian permutation symmetry. Such symmetry can be hidden in states with defined total spins of spinor gases, which can be formed in optical cavities. The present work shows that the symmetry reveals itself in spin-independent or coordinate-independent properties of these gases, namely as non-Abelian entropy in thermodynamic properties. In weakly interacting Fermi gases, two phases appear associated with fermionic and non-Abelian symmetry under permutations of particle states, respectively. The second-order transitions between the phases are characterized by discontinuities in specific heat. Unlike other phase transitions, the present ones are not caused by interactions and can appear even in ideal gases. Similar effects in Bose gases and strong interactions are discussed.

  3. Rolling bearing fault detection and diagnosis based on composite multiscale fuzzy entropy and ensemble support vector machines

    NASA Astrophysics Data System (ADS)

    Zheng, Jinde; Pan, Haiyang; Cheng, Junsheng

    2017-02-01

    To timely detect the incipient failure of rolling bearings and find the accurate fault location, a novel rolling bearing fault diagnosis method is proposed based on the composite multiscale fuzzy entropy (CMFE) and ensemble support vector machines (ESVMs). Fuzzy entropy (FuzzyEn), as an improvement of sample entropy (SampEn), is a new nonlinear method for measuring the complexity of time series. Since FuzzyEn (or SampEn) at a single scale cannot reflect the complexity effectively, multiscale fuzzy entropy (MFE) is developed by defining the FuzzyEns of coarse-grained time series, which represent the system dynamics at different scales. However, the MFE values will be affected by the data length, especially when the data are not long enough. By combining the information of multiple coarse-grained time series at the same scale, the CMFE algorithm is proposed in this paper to enhance MFE, as well as FuzzyEn. Compared with MFE, with increasing scale factor, CMFE obtains much more stable and consistent values for a short-term time series. In this paper CMFE is employed to measure the complexity of vibration signals of rolling bearings and is applied to extract the nonlinear features hidden in the vibration signals. The physical meaning of CMFE and its suitability for rolling bearing fault diagnosis are also explored. Based on these, to achieve automatic fault diagnosis, the ensemble-SVM-based multi-classifier is constructed for the intelligent classification of fault features. Finally, the proposed fault diagnosis method for rolling bearings is applied to experimental data analysis and the results indicate that the proposed method can effectively distinguish different fault categories and severities of rolling bearings.
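
    The composite step that distinguishes CMFE from plain MFE is simple to state: at scale τ, all τ coarse-grained series obtained by shifting the starting sample are formed, and the entropy estimates computed on them are combined instead of relying on a single coarse-grained series. A generic sketch of that step follows, with the entropy estimator left as a pluggable function; it illustrates the composite idea only, and the exact way per-series statistics are combined differs between composite and refined-composite variants and is not the authors' implementation.

        import numpy as np

        def composite_coarse_grain(x, scale):
            """All `scale` coarse-grained series obtained by shifting the start sample by 0..scale-1."""
            x = np.asarray(x, dtype=float)
            series = []
            for offset in range(scale):
                seg = x[offset:]
                n = (len(seg) // scale) * scale
                series.append(seg[:n].reshape(-1, scale).mean(axis=1))
            return series

        def composite_multiscale(x, entropy_fn, max_scale=20):
            """Composite multiscale curve: mean of `entropy_fn` over the shifted coarse-grained series."""
            return np.array([np.mean([entropy_fn(cg) for cg in composite_coarse_grain(x, s)])
                             for s in range(1, max_scale + 1)])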

  4. A Hybrid Generalized Hidden Markov Model-Based Condition Monitoring Approach for Rolling Bearings

    PubMed Central

    Liu, Jie; Hu, Youmin; Wu, Bo; Wang, Yan; Xie, Fengyun

    2017-01-01

    The operating condition of rolling bearings affects productivity and quality in the rotating machine process. Developing an effective rolling bearing condition monitoring approach is critical to accurately identify the operating condition. In this paper, a hybrid generalized hidden Markov model-based condition monitoring approach for rolling bearings is proposed, where interval valued features are used to efficiently recognize and classify machine states in the machine process. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition (VMD). Parameters of the VMD, in the form of generalized intervals, provide a concise representation for aleatory and epistemic uncertainty and improve the robustness of identification. The multi-scale permutation entropy method is applied to extract state features from the decomposed signals in different operating conditions. Traditional principal component analysis is adopted to reduce feature size and computational cost. With the extracted features’ information, the generalized hidden Markov model, based on generalized interval probability, is used to recognize and classify the fault types and fault severity levels. Finally, the experiment results show that the proposed method is effective at recognizing and classifying the fault types and fault severity levels of rolling bearings. This monitoring method is also efficient enough to quantify the two uncertainty components. PMID:28524088

  5. Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals

    PubMed Central

    Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.

    2016-01-01

    Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise corrupted EEG signals have lower information content, and, therefore, reduced complexity compared with their noise free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116

  6. Repetitive transient extraction for machinery fault diagnosis using multiscale fractional order entropy infogram

    NASA Astrophysics Data System (ADS)

    Xu, Xuefang; Qiao, Zijian; Lei, Yaguo

    2018-03-01

    The presence of repetitive transients in vibration signals is a typical symptom of local faults of rotating machinery. The infogram was developed to extract repetitive transients from vibration signals based on Shannon entropy. Unfortunately, the Shannon entropy is maximized for random processes and unable to quantify the repetitive transients buried in heavy random noise. In addition, vibration signals always contain multiple intrinsic oscillatory modes due to interaction and coupling effects between machine components. Under this circumstance, high values of Shannon entropy appear in several frequency bands, or a high value of Shannon entropy does not appear in the optimal frequency band, and the infogram becomes difficult to interpret. Thus, it also becomes difficult to select the optimal frequency band for extracting the repetitive transients from among all frequency bands. To solve these problems, the multiscale fractional order entropy (MSFE) infogram is proposed in this paper. With the help of the MSFE infogram, the complexity and nonlinear signatures of the vibration signals can be evaluated by quantifying spectral entropy over a range of scales in the fractional domain. Moreover, the similarity tolerance of the MSFE infogram is helpful for assessing the regularity of signals. A simulation and two experiments concerning a locomotive bearing and a wind turbine gear are used to validate the MSFE infogram. The results demonstrate that the MSFE infogram is more robust to heavy noise than the infogram, and that high values appear only in the optimal frequency band for repetitive transient extraction.

  7. Monitoring the informational efficiency of European corporate bond markets with dynamical permutation min-entropy

    NASA Astrophysics Data System (ADS)

    Zunino, Luciano; Bariviera, Aurelio F.; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2016-08-01

    In this paper the permutation min-entropy has been implemented to unveil the presence of temporal structures in the daily values of European corporate bond indices from April 2001 to August 2015. More precisely, the informational efficiency evolution of the prices of fifteen sectorial indices has been carefully studied by estimating this information-theory-derived symbolic tool over a sliding time window. Such a dynamical analysis makes it possible to obtain relevant conclusions about the effect that the 2008 credit crisis has had on the different European corporate bond sectors. It is found that the informational efficiency of some sectors, namely banks, financial services, insurance, and basic resources, has been strongly reduced due to the financial crisis, whereas another set of sectors, comprising chemicals, automobiles, media, energy, construction, industrial goods & services, technology, and telecommunications, has only suffered a transitory loss of efficiency. Last but not least, the food & beverage, healthcare, and utilities sectors show behavior close to a random walk over practically the whole period of analysis, confirming a remarkable immunity to the 2008 financial crisis.
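
    The quantity tracked over the sliding window above is the min-entropy of the ordinal-pattern distribution, i.e. minus the logarithm of the frequency of the most probable pattern. A hedged sketch follows; the embedding dimension, window length, step and normalization are illustrative choices and are not taken from the paper.

        import numpy as np
        from math import factorial

        def permutation_min_entropy(x, dim=4, delay=1):
            """Normalized min-entropy: -ln(max pattern probability) / ln(dim!)."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (dim - 1) * delay
            emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
            _, counts = np.unique(np.argsort(emb, axis=1), axis=0, return_counts=True)
            p_max = counts.max() / counts.sum()
            return -np.log(p_max) / np.log(factorial(dim))

        def sliding_min_entropy(x, window=250, step=5, **kwargs):
            """Evaluate the permutation min-entropy over a sliding time window."""
            return np.array([permutation_min_entropy(x[t : t + window], **kwargs)
                             for t in range(0, len(x) - window + 1, step)])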

  8. Studying the dynamics of interbeat interval time series of healthy and congestive heart failure subjects using scale based symbolic entropy analysis

    PubMed Central

    Awan, Imtiaz; Aziz, Wajid; Habib, Nazneen; Alowibdi, Jalal S.; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali

    2018-01-01

    Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and how these dynamics are affected by aging and disease. Entropy based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and the underlying stimuli that are responsible for anomalous behavior. The single scale based traditional entropy measures yielded contradictory results about the dynamics of real world time series data of healthy and pathological subjects. Recently the multiscale entropy (MSE) algorithm was introduced for precise description of the complexity of biological signals, and it has been used in numerous fields since its inception. The original MSE quantified the complexity of coarse-grained time series using sample entropy. The original MSE may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scaling factor τ; however, MSE works well for long signals. To overcome this drawback of the original MSE, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which, instead of sample entropy, the symbolic entropy measure NCSE is used as the entropy estimate. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results of the study indicate that MNCSE values are more stable and reliable than original MSE values. The results show that MNCSE based features lead to higher classification accuracies in comparison with MSE based features. PMID:29771977

  9. Studying the dynamics of interbeat interval time series of healthy and congestive heart failure subjects using scale based symbolic entropy analysis.

    PubMed

    Awan, Imtiaz; Aziz, Wajid; Shah, Imran Hussain; Habib, Nazneen; Alowibdi, Jalal S; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali

    2018-01-01

    Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and how these dynamics are affected by aging and disease. Entropy based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and the underlying stimuli that are responsible for anomalous behavior. The single scale based traditional entropy measures yielded contradictory results about the dynamics of real world time series data of healthy and pathological subjects. Recently the multiscale entropy (MSE) algorithm was introduced for precise description of the complexity of biological signals, and it has been used in numerous fields since its inception. The original MSE quantified the complexity of coarse-grained time series using sample entropy. The original MSE may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scaling factor τ; however, MSE works well for long signals. To overcome this drawback of the original MSE, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which, instead of sample entropy, the symbolic entropy measure NCSE is used as the entropy estimate. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results of the study indicate that MNCSE values are more stable and reliable than original MSE values. The results show that MNCSE based features lead to higher classification accuracies in comparison with MSE based features.
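
    Both versions of this record describe the original MSE recipe: coarse-grain the series at each scale, then apply sample entropy with a tolerance fixed from the original series. A compact reference sketch follows; m = 2 and r = 0.15 of the standard deviation are the values most often quoted for heartbeat data and are conventions rather than requirements.

        import numpy as np

        def sample_entropy(x, m=2, r=0.2):
            """SampEn(m, r) = -ln(A / B), with r given as an absolute tolerance."""
            x = np.asarray(x, dtype=float)
            n = len(x)

            def count_matches(length):
                templates = np.array([x[i:i + length] for i in range(n - length)])
                total = 0
                for i in range(len(templates) - 1):
                    d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                    total += np.sum(d <= r)
                return total

            b, a = count_matches(m), count_matches(m + 1)
            return -np.log(a / b) if a > 0 and b > 0 else np.inf

        def multiscale_entropy(x, max_scale=20, m=2, r_factor=0.15):
            """Original MSE: non-overlapping coarse-graining followed by SampEn at every scale."""
            x = np.asarray(x, dtype=float)
            r = r_factor * np.std(x)               # tolerance fixed from the original series
            values = []
            for scale in range(1, max_scale + 1):
                n = (len(x) // scale) * scale
                cg = x[:n].reshape(-1, scale).mean(axis=1)
                values.append(sample_entropy(cg, m=m, r=r))
            return np.array(values)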

  10. Permutation entropy of finite-length white-noise time series.

    PubMed

    Little, Douglas J; Kane, Deb M

    2016-08-01

    Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ² distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can then be used to decide whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
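
    A practical use of this distributional result is hypothesis testing: given the ordinal-pattern histogram of an observed series, ask whether it is compatible with white noise. The sketch below uses the standard likelihood-ratio (G) statistic, which under the uniform (white-noise) null is asymptotically χ² with D!-1 degrees of freedom; it mirrors the spirit of the paper's procedure but does not reproduce its exact expressions for the PE mean and variance.

        import numpy as np
        from math import factorial
        from scipy.stats import chi2

        def white_noise_pvalue(x, dim=3, delay=1):
            """P-value for the null hypothesis that ordinal patterns are uniformly distributed."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (dim - 1) * delay
            emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
            _, counts = np.unique(np.argsort(emb, axis=1), axis=0, return_counts=True)
            k = factorial(dim)
            expected = n / k
            g = 2.0 * np.sum(counts * np.log(counts / expected))   # G-test statistic
            return chi2.sf(g, df=k - 1)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            print(white_noise_pvalue(rng.normal(size=2000)))           # typically well above 0.05
            print(white_noise_pvalue(np.sin(0.2 * np.arange(2000))))   # effectively zero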

  11. Quadrantal multi-scale distribution entropy analysis of heartbeat interval series based on a modified Poincaré plot

    NASA Astrophysics Data System (ADS)

    Huo, Chengyu; Huang, Xiaolin; Zhuang, Jianjun; Hou, Fengzhen; Ni, Huangjing; Ning, Xinbao

    2013-09-01

    The Poincaré plot is one of the most important approaches in human cardiac rhythm analysis. However, further investigations are still needed to concentrate on techniques that can characterize the dispersion of the points displayed by a Poincaré plot. Based on a modified Poincaré plot, we provide a novel measurement named distribution entropy (DE) and propose a quadrantal multi-scale distribution entropy analysis (QMDE) for the quantitative descriptions of the scatter distribution patterns in various regions and temporal scales. We apply this method to the heartbeat interval series derived from healthy subjects and congestive heart failure (CHF) sufferers, respectively, and find that the discriminations between them are most significant in the first quadrant, which implies significant impacts on vagal regulation brought about by CHF. We also investigate the day-night differences of young healthy people, and it is shown that the results present a clearly circadian rhythm, especially in the first quadrant. In addition, the multi-scale analysis indicates that the results of healthy subjects and CHF sufferers fluctuate in different trends with variation of the scale factor. The same phenomenon also appears in circadian rhythm investigations of young healthy subjects, which implies that the cardiac dynamic system is affected differently in various temporal scales by physiological or pathological factors.

  12. Analysis of HD 73045 light curve data

    NASA Astrophysics Data System (ADS)

    Das, Mrinal Kanti; Bhatraju, Naveen Kumar; Joshi, Santosh

    2018-04-01

    In this work we analyzed the Kepler light curve data of HD 73045. The raw data have been smoothed using standard filters. The power spectrum has been obtained by using a fast Fourier transform routine. It shows the presence of more than one period. In order to take care of any non-stationary behavior, we carried out a wavelet analysis to obtain the wavelet power spectrum. In addition, to identify the scale invariant structure, the data have been analyzed using a multifractal detrended fluctuation analysis. Further, to characterize the diversity of embedded patterns in the HD 73045 flux time series, we computed various entropy-based complexity measures, e.g. sample entropy, spectral entropy and permutation entropy. The presence of periodic structure in the time series was further analyzed using the visibility network and horizontal visibility network models of the time series. The degree distributions in the two network models confirm such structures.

  13. Refined Composite Multiscale Dispersion Entropy and its Application to Biomedical Signals.

    PubMed

    Azami, Hamed; Rostaghi, Mostafa; Abasolo, Daniel; Escudero, Javier

    2017-12-01

    We propose a novel complexity measure to overcome the deficiencies of the widespread and powerful multiscale entropy (MSE), including that MSE values may be undefined for short signals and that MSE is slow for real-time applications. We introduce multiscale dispersion entropy (MDE) as a very fast and powerful method to quantify the complexity of signals. MDE is based on our recently developed dispersion entropy (DisEn), which has a computational cost of O(N), compared with O(N²) for the sample entropy used in MSE. We also propose the refined composite MDE (RCMDE) to improve the stability of MDE. We evaluate MDE, RCMDE, and refined composite MSE (RCMSE) on synthetic signals and three biomedical datasets. The MDE, RCMDE, and RCMSE methods show similar results, although MDE and RCMDE are faster, lead to more stable results, and discriminate different types of physiological signals better than MSE and RCMSE. For noisy short and long time series, MDE and RCMDE are noticeably more stable than MSE and RCMSE, respectively. For short signals, MDE and RCMDE, unlike MSE and RCMSE, do not lead to undefined values. The proposed MDE and RCMDE are significantly faster than MSE and RCMSE, especially for long signals, and lead to larger differences between physiological conditions known to alter the complexity of the physiological recordings. MDE and RCMDE are expected to be useful for the analysis of physiological signals thanks to their ability to distinguish different types of dynamics. The MATLAB codes used in this paper are freely available at http://dx.doi.org/10.7488/ds/1982.
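
    As commonly described, dispersion entropy maps the signal into c discrete classes through the normal cumulative distribution function and then computes the Shannon entropy of the m-length dispersion patterns, which is what keeps its cost linear in the series length. The sketch below follows that common description; the class count, defaults and mapping details may differ from the authors' implementation, and the multiscale and refined-composite layers are omitted.

        import numpy as np
        from scipy.stats import norm

        def dispersion_entropy(x, m=3, c=6, delay=1, normalize=True):
            """Illustrative dispersion entropy: NCDF mapping into c classes, then pattern entropy."""
            x = np.asarray(x, dtype=float)
            # Map the standardized signal into classes 1..c through the normal CDF.
            y = norm.cdf((x - x.mean()) / x.std())
            z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
            n = len(z) - (m - 1) * delay
            patterns = np.column_stack([z[i * delay : i * delay + n] for i in range(m)])
            _, counts = np.unique(patterns, axis=0, return_counts=True)
            p = counts / counts.sum()
            h = -np.sum(p * np.log(p))
            return h / np.log(float(c) ** m) if normalize else h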

  14. Multiscale Symbolic Phase Transfer Entropy in Financial Time Series Classification

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    We address the challenge of classifying financial time series via a newly proposed multiscale symbolic phase transfer entropy (MSPTE). Using the MSPTE method, we succeed in quantifying the strength and direction of information flow between financial systems and in classifying financial time series, which are the stock indices from Europe, America and China during the period from 2006 to 2016 and the stocks of banking, aviation industry and pharmacy during the period from 2007 to 2016, simultaneously. The MSPTE analysis shows that the value of symbolic phase transfer entropy (SPTE) among stocks decreases with increasing scale factor. It is demonstrated that the MSPTE method can effectively divide stocks into groups by area and industry. In addition, it can be concluded that the MSPTE analysis quantifies the similarity among the stock markets. The symbolic phase transfer entropy (SPTE) between two stocks from the same area is far less than the SPTE between stocks from different areas. The results also indicate that four stocks from America and Europe have a relatively high degree of similarity and that the stocks of the banking and pharmaceutical industries have higher similarity for CA. It is worth mentioning that the pharmaceutical industry has a weaker particular market mechanism than the banking and aviation industries.

  15. A fault diagnosis scheme for rolling bearing based on local mean decomposition and improved multiscale fuzzy entropy

    NASA Astrophysics Data System (ADS)

    Li, Yongbo; Xu, Minqiang; Wang, Rixin; Huang, Wenhu

    2016-01-01

    This paper presents a new rolling bearing fault diagnosis method based on local mean decomposition (LMD), improved multiscale fuzzy entropy (IMFE), Laplacian score (LS) and improved support vector machine based binary tree (ISVM-BT). When a fault occurs in rolling bearings, the measured vibration signal is a multi-component amplitude-modulated and frequency-modulated (AM-FM) signal. LMD, a new self-adaptive time-frequency analysis method, can decompose any complicated signal into a series of product functions (PFs), each of which is exactly a mono-component AM-FM signal. Hence, LMD is introduced to preprocess the vibration signal. Furthermore, IMFE, which is designed to avoid the inaccurate estimation of fuzzy entropy, can be utilized to quantify the complexity and self-similarity of a time series over a range of scales based on fuzzy entropy. Besides, the LS approach is introduced to refine the fault features by sorting the scale factors. Subsequently, the obtained features are fed into the multi-fault classifier ISVM-BT to automatically fulfill the fault pattern identification. The experimental results validate the effectiveness of the methodology and demonstrate that the proposed algorithm can be applied to recognize the different categories and severities of rolling bearing faults.

  16. Characterization of Early Partial Seizure Onset: Frequency, Complexity and Entropy

    PubMed Central

    Jouny, Christophe C.; Bergey, Gregory K.

    2011-01-01

    Objective A clear classification of partial seizures onset features is not yet established. Complexity and entropy have been very widely used to describe dynamical systems, but a systematic evaluation of these measures to characterize partial seizures has never been performed. Methods Eighteen different measures including power in frequency bands up to 300Hz, Gabor atom density (GAD), Higuchi fractal dimension (HFD), Lempel-Ziv complexity, Shannon entropy, sample entropy, and permutation entropy, were selected to test sensitivity to partial seizure onset. Intracranial recordings from forty-five patients with mesial temporal, neocortical temporal and neocortical extratemporal seizure foci were included (331 partial seizures). Results GAD, Lempel-Ziv complexity, HFD, high frequency activity, and sample entropy were the most reliable measures to assess early seizure onset. Conclusions Increases in complexity and occurrence of high-frequency components appear to be commonly associated with early stages of partial seizure evolution from all regions. The type of measure (frequency-based, complexity or entropy) does not predict the efficiency of the method to detect seizure onset. Significance Differences between measures such as GAD and HFD highlight the multimodal nature of partial seizure onsets. Improved methods for early seizure detection may be achieved from a better understanding of these underlying dynamics. PMID:21872526

  17. Bubble Entropy: An Entropy Almost Free of Parameters.

    PubMed

    Manis, George; Aktaruzzaman, Md; Sassi, Roberto

    2017-11-01

    Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and count instead the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least, widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
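
    A sketch of the swap-counting idea is shown below: each embedded vector is scored by the number of bubble-sort swaps needed to order it, and the Renyi entropy of order 2 of the swap-count distribution is compared between dimensions m and m+1. The normalization constant reflects one reading of the definition and should be treated as illustrative rather than as a reference implementation.

```python
# Illustrative sketch of bubble entropy (swap-count distribution entropy).
import numpy as np
from collections import Counter

def bubble_swap_count(v):
    """Number of swaps bubble sort performs to order v (its inversion count)."""
    v = list(v)
    swaps = 0
    for i in range(len(v) - 1):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def renyi2_of_swaps(x, m):
    counts = Counter(bubble_swap_count(x[i:i + m]) for i in range(len(x) - m + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.log(np.sum(p ** 2))  # Renyi entropy of order 2

def bubble_entropy(x, m=10):
    # Normalization here is an assumed reading of the published definition.
    return (renyi2_of_swaps(x, m + 1) - renyi2_of_swaps(x, m)) / np.log((m + 1) / (m - 1))

rng = np.random.default_rng(3)
print(bubble_entropy(rng.standard_normal(3000)))
```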

  18. Entropy for the Complexity of Physiological Signal Dynamics.

    PubMed

    Zhang, Xiaohua Douglas

    2017-01-01

    Recently, the rapid development of large data storage technologies, mobile network technology, and portable medical devices has made it possible to measure, record, store, and analyze biological dynamics. Portable noninvasive medical devices are crucial for capturing individual characteristics of biological dynamics. Wearable noninvasive medical devices and the analysis/management of the related digital medical data will revolutionize the management and treatment of diseases, subsequently resulting in the establishment of a new healthcare system. One of the key features that can be extracted from the data obtained by wearable noninvasive medical devices is the complexity of physiological signals, which can be represented by the entropy of the biological dynamics contained in the physiological signals measured by these continuous monitoring devices. Thus, in this chapter I present the major concepts of entropy that are commonly used to measure the complexity of biological dynamics: Shannon entropy, Kolmogorov entropy, Renyi entropy, approximate entropy, sample entropy, and multiscale entropy. I also demonstrate an example of using entropy for the complexity of glucose dynamics.

  19. Application of a multiscale maximum entropy image restoration algorithm to HXMT observations

    NASA Astrophysics Data System (ADS)

    Guan, Ju; Song, Li-Ming; Huo, Zhuo-Xi

    2016-08-01

    This paper introduces a multiscale maximum entropy (MSME) algorithm for image restoration of the Hard X-ray Modulation Telescope (HXMT), a collimated scanning X-ray satellite mainly devoted to a sensitive all-sky survey and pointed observations in the 1-250 keV range. The novelty of the MSME method is to use wavelet decomposition and multiresolution support to control noise amplification at different scales. Our work is focused on the application and modification of this method to restore diffuse sources detected by HXMT scanning observations. An improved method, the ensemble multiscale maximum entropy (EMSME) algorithm, is proposed to alleviate the mode-mixing problem existing in MSME. Simulations have been performed on the detection of the diffuse source Cen A by HXMT in all-sky survey mode. The results show that the MSME method is adapted to the deconvolution task of HXMT for diffuse source detection and that the improved method can suppress noise and improve the correlation and signal-to-noise ratio, thus proving itself a better algorithm for image restoration. Through one all-sky survey, HXMT could reach the capacity of detecting a diffuse source with a maximum differential flux of 0.5 mCrab. Supported by the Strategic Priority Research Program on Space Science, Chinese Academy of Sciences (XDA04010300) and the National Natural Science Foundation of China (11403014).

  20. Exploring the effects of climatic variables on monthly precipitation variation using a continuous wavelet-based multiscale entropy approach.

    PubMed

    Roushangar, Kiyoumars; Alizadeh, Farhad; Adamowski, Jan

    2018-08-01

    Understanding precipitation on a regional basis is an important component of water resources planning and management. The present study outlines a methodology based on continuous wavelet transform (CWT) and multiscale entropy (CWME), combined with self-organizing map (SOM) and k-means clustering techniques, to measure and analyze the complexity of precipitation. Historical monthly precipitation data from 1960 to 2010 at 31 rain gauges across Iran were preprocessed by CWT. The multi-resolution CWT approach segregated the major features of the original precipitation series by unfolding the structure of the time series, which was often ambiguous. The entropy concept was then applied to the components obtained from CWT to measure dispersion, uncertainty, disorder, and diversification of the subcomponents. Based on different validity indices, k-means clustering captured homogeneous areas more accurately, and additional analysis was performed based on the outcome of this approach. The 31 rain gauges in this study were clustered into 6 groups, each one having a unique CWME pattern across different time scales. The results of clustering showed that hydrologic similarity (multiscale variation of precipitation) was not based on geographic contiguity. According to the pattern of entropy across the scales, each cluster was assigned an entropy signature that provided an estimation of the entropy pattern of precipitation data in each cluster. Based on the pattern of mean CWME for each cluster, a characteristic signature was assigned, which provided an estimation of the CWME of a cluster across scales of 1-2, 3-8, and 9-13 months relative to other stations. The validity of the homogeneous clusters demonstrated the usefulness of the proposed approach to regionalize precipitation. Further analysis based on wavelet coherence (WTC) was performed by selecting central rain gauges in each cluster and analyzing them against temperature, wind, the Multivariate ENSO Index (MEI), and the East Atlantic (EA) and North Atlantic Oscillation (NAO) indices. The results revealed that all climatic features except NAO influenced precipitation in Iran during the 1960-2010 period. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. A Novel Bearing Multi-Fault Diagnosis Approach Based on Weighted Permutation Entropy and an Improved SVM Ensemble Classifier.

    PubMed

    Zhou, Shenghan; Qian, Silin; Chang, Wenbing; Xiao, Yiyong; Cheng, Yang

    2018-06-14

    Timely and accurate state detection and fault diagnosis of rolling element bearings are very critical to ensuring the reliability of rotating machinery. This paper proposes a novel method of rolling bearing fault diagnosis based on a combination of ensemble empirical mode decomposition (EEMD), weighted permutation entropy (WPE) and an improved support vector machine (SVM) ensemble classifier. A hybrid voting (HV) strategy that combines SVM-based classifiers and cloud similarity measurement (CSM) was employed to improve the classification accuracy. First, the WPE value of the bearing vibration signal was calculated to detect the fault. Secondly, if a bearing fault occurred, the vibration signal was decomposed into a set of intrinsic mode functions (IMFs) by EEMD. The WPE values of the first several IMFs were calculated to form the fault feature vectors. Then, the SVM ensemble classifier was composed of binary SVM and the HV strategy to identify the bearing multi-fault types. Finally, the proposed model was fully evaluated by experiments and comparative studies. The results demonstrate that the proposed method can effectively detect bearing faults and maintain a high accuracy rate of fault recognition when a small number of training samples are available.
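
    The weighting idea can be sketched as follows: each ordinal pattern contributes in proportion to the variance of its embedding vector, so patterns riding on high-amplitude (fault-like) activity count more. The EEMD decomposition and SVM ensemble stages of the full method are not reproduced; parameters are illustrative.

```python
# Illustrative sketch of weighted permutation entropy (WPE).
import numpy as np
from math import factorial

def weighted_permutation_entropy(x, m=4, delay=1):
    weights = {}
    total = 0.0
    for i in range(len(x) - (m - 1) * delay):
        v = x[i:i + m * delay:delay]
        pattern = tuple(np.argsort(v))
        w = v.var()                      # amplitude-aware weight for this pattern
        weights[pattern] = weights.get(pattern, 0.0) + w
        total += w
    p = np.array(list(weights.values())) / total
    p = p[p > 0]
    return -(p * np.log2(p)).sum() / np.log2(factorial(m))

rng = np.random.default_rng(4)
signal = rng.standard_normal(4000)
signal[2000:2100] += 5 * np.sin(np.linspace(0, 20 * np.pi, 100))  # localized fault-like burst
print(weighted_permutation_entropy(signal))
```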

  2. Folding pathway of a multidomain protein depends on its topology of domain connectivity

    PubMed Central

    Inanami, Takashi; Terada, Tomoki P.; Sasai, Masaki

    2014-01-01

    How do the folding mechanisms of multidomain proteins depend on protein topology? We addressed this question by developing an Ising-like structure-based model and applying it for the analysis of free-energy landscapes and folding kinetics of an example protein, Escherichia coli dihydrofolate reductase (DHFR). DHFR has two domains, one comprising discontinuous N- and C-terminal parts and the other comprising a continuous middle part of the chain. The simulated folding pathway of DHFR is a sequential process during which the continuous domain folds first, followed by the discontinuous domain, thereby avoiding the rapid decrease in conformation entropy caused by the association of the N- and C-terminal parts during the early phase of folding. Our simulated results consistently explain the observed experimental data on folding kinetics and predict an off-pathway structural fluctuation at equilibrium. For a circular permutant for which the topological complexity of wild-type DHFR is resolved, the balance between energy and entropy is modulated, resulting in the coexistence of the two folding pathways. This coexistence of pathways should account for the experimentally observed complex folding behavior of the circular permutant. PMID:25267632

  3. Multicomponent, Multiphase Thermodynamics of Swelling Porous Media With Electroquasistatics. 1. Macroscale Field Equations

    DTIC Science & Technology

    2001-08-08

    entropy inequality with independent variables consistent with several natural systems and apply the resulting constitutive theory near equilibrium...1973. [3] L. S. Bennethum and J. H. Cushman. Multiscale, hybrid mixture theory for swelling systems - I: Balance laws. International Journal of...Engineering Science, 34(2):125-145, 1996. [4] L. S. Bennethum and J. H. Cushman. Multiscale, hybrid mixture theory for swelling systems - II: Constitutive

  4. Temperature variability analysis using wavelets and multiscale entropy in patients with systemic inflammatory response syndrome, sepsis, and septic shock.

    PubMed

    Papaioannou, Vasilios E; Chouvarda, Ioanna G; Maglaveras, Nikos K; Pneumatikos, Ioannis A

    2012-12-12

    Even though temperature is a continuous quantitative variable, its measurement has been considered a snapshot of a process, indicating whether a patient is febrile or afebrile. Recently, other diagnostic techniques have been proposed for the association between different properties of the temperature curve with severity of illness in the Intensive Care Unit (ICU), based on complexity analysis of continuously monitored body temperature. In this study, we tried to assess temperature complexity in patients with systemic inflammation during a suspected ICU-acquired infection, by using wavelets transformation and multiscale entropy of temperature signals, in a cohort of mixed critically ill patients. Twenty-two patients were enrolled in the study. In five, systemic inflammatory response syndrome (SIRS, group 1) developed, 10 had sepsis (group 2), and seven had septic shock (group 3). All temperature curves were studied during the first 24 hours of an inflammatory state. A wavelet transformation was applied, decomposing the signal in different frequency components (scales) that have been found to reflect neurogenic and metabolic inputs on temperature oscillations. Wavelet energy and entropy per different scales associated with complexity in specific frequency bands and multiscale entropy of the whole signal were calculated. Moreover, a clustering technique and a linear discriminant analysis (LDA) were applied for permitting pattern recognition in data sets and assessing diagnostic accuracy of different wavelet features among the three classes of patients. Statistically significant differences were found in wavelet entropy between patients with SIRS and groups 2 and 3, and in specific ultradian bands between SIRS and group 3, with decreased entropy in sepsis. Cluster analysis using wavelet features in specific bands revealed concrete clusters closely related with the groups in focus. LDA after wrapper-based feature selection was able to classify with an accuracy of more than 80% SIRS from the two sepsis groups, based on multiparametric patterns of entropy values in the very low frequencies and indicating reduced metabolic inputs on local thermoregulation, probably associated with extensive vasodilatation. We suggest that complexity analysis of temperature signals can assess inherent thermoregulatory dynamics during systemic inflammation and has increased discriminating value in patients with infectious versus noninfectious conditions, probably associated with severity of illness.
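
    For orientation, the standard multiscale (sample) entropy recipe referred to above can be sketched as below: coarse-grain the signal at each scale factor and compute the sample entropy of the coarse-grained series. The wavelet-based features of this study are separate and not reproduced; the tolerance and embedding choices are illustrative.

```python
# Illustrative sketch of multiscale (sample) entropy.
import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count_matches(dim):
        # Use N - m templates for both dimensions, as in the standard definition.
        templ = np.array([x[i:i + dim] for i in range(len(x) - m)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        iu = np.triu_indices(len(templ), k=1)
        return np.sum(d[iu] <= r)
    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 5), m=2):
    out = []
    for tau in scales:
        n = len(x) // tau
        cg = x[:n * tau].reshape(n, tau).mean(axis=1)   # coarse-graining
        out.append(sample_entropy(cg, m=m))
    return out

rng = np.random.default_rng(5)
print(multiscale_entropy(rng.standard_normal(1000)))
```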

  5. Temperature variability analysis using wavelets and multiscale entropy in patients with systemic inflammatory response syndrome, sepsis, and septic shock

    PubMed Central

    2012-01-01

    Background Even though temperature is a continuous quantitative variable, its measurement has been considered a snapshot of a process, indicating whether a patient is febrile or afebrile. Recently, other diagnostic techniques have been proposed for the association between different properties of the temperature curve with severity of illness in the Intensive Care Unit (ICU), based on complexity analysis of continuously monitored body temperature. In this study, we tried to assess temperature complexity in patients with systemic inflammation during a suspected ICU-acquired infection, by using wavelets transformation and multiscale entropy of temperature signals, in a cohort of mixed critically ill patients. Methods Twenty-two patients were enrolled in the study. In five, systemic inflammatory response syndrome (SIRS, group 1) developed, 10 had sepsis (group 2), and seven had septic shock (group 3). All temperature curves were studied during the first 24 hours of an inflammatory state. A wavelet transformation was applied, decomposing the signal in different frequency components (scales) that have been found to reflect neurogenic and metabolic inputs on temperature oscillations. Wavelet energy and entropy per different scales associated with complexity in specific frequency bands and multiscale entropy of the whole signal were calculated. Moreover, a clustering technique and a linear discriminant analysis (LDA) were applied for permitting pattern recognition in data sets and assessing diagnostic accuracy of different wavelet features among the three classes of patients. Results Statistically significant differences were found in wavelet entropy between patients with SIRS and groups 2 and 3, and in specific ultradian bands between SIRS and group 3, with decreased entropy in sepsis. Cluster analysis using wavelet features in specific bands revealed concrete clusters closely related with the groups in focus. LDA after wrapper-based feature selection was able to classify with an accuracy of more than 80% SIRS from the two sepsis groups, based on multiparametric patterns of entropy values in the very low frequencies and indicating reduced metabolic inputs on local thermoregulation, probably associated with extensive vasodilatation. Conclusions We suggest that complexity analysis of temperature signals can assess inherent thermoregulatory dynamics during systemic inflammation and has increased discriminating value in patients with infectious versus noninfectious conditions, probably associated with severity of illness. PMID:22424316

  6. Complexity-entropy causality plane: A useful approach for distinguishing songs

    NASA Astrophysics Data System (ADS)

    Ribeiro, Haroldo V.; Zunino, Luciano; Mendes, Renio S.; Lenzi, Ervin K.

    2012-04-01

    Nowadays we are often faced with huge databases resulting from the rapid growth of data storage technologies. This is particularly true when dealing with music databases. In this context, it is essential to have techniques and tools able to discriminate properties from these massive sets. In this work, we report on a statistical analysis of more than ten thousand songs aiming to obtain a complexity hierarchy. Our approach is based on the estimation of the permutation entropy combined with an intensive complexity measure, building up the complexity-entropy causality plane. The results obtained indicate that this representation space is very promising to discriminate songs as well as to allow a relative quantitative comparison among songs. Additionally, we believe that the here-reported method may be applied in practical situations since it is simple, robust and has a fast numerical implementation.
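
    A minimal sketch of the complexity-entropy causality plane coordinates is given below: the normalized permutation entropy H and the statistical complexity C = Q_J * H, where Q_J is the Jensen-Shannon divergence between the ordinal distribution and the uniform one, normalized by its maximum (computed here numerically rather than from the closed-form constant). Parameter choices are illustrative.

```python
# Illustrative sketch of the complexity-entropy causality plane (H, C).
import numpy as np
from math import factorial
from collections import Counter

def ordinal_distribution(x, m=6, delay=1):
    n_pat = factorial(m)
    counts = Counter(tuple(np.argsort(x[i:i + m * delay:delay]))
                     for i in range(len(x) - (m - 1) * delay))
    p = np.zeros(n_pat)                   # unobserved patterns keep probability zero
    for k, c in enumerate(counts.values()):
        p[k] = c
    return p / p.sum()

def shannon(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def jsd(p, q):
    return shannon((p + q) / 2) - shannon(p) / 2 - shannon(q) / 2

def complexity_entropy_plane(x, m=6):
    p = ordinal_distribution(x, m)
    uniform = np.full(len(p), 1.0 / len(p))
    h = shannon(p) / np.log(len(p))                     # normalized permutation entropy
    delta = np.zeros(len(p))
    delta[0] = 1.0
    q_j = jsd(p, uniform) / jsd(delta, uniform)         # normalized disequilibrium
    return h, q_j * h

rng = np.random.default_rng(6)
print(complexity_entropy_plane(rng.standard_normal(20000)))  # noise: H near 1, C near 0
```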

  7. Complexity and multifractal behaviors of multiscale-continuum percolation financial system for Chinese stock markets

    NASA Astrophysics Data System (ADS)

    Zeng, Yayun; Wang, Jun; Xu, Kaixuan

    2017-04-01

    A new financial agent-based time series model is developed and investigated by multiscale-continuum percolation system, which can be viewed as an extended version of continuum percolation system. In this financial model, for different parameters of proportion and density, two Poisson point processes (where the radii of points represent the ability of receiving or transmitting information among investors) are applied to model a random stock price process, in an attempt to investigate the fluctuation dynamics of the financial market. To validate its effectiveness and rationality, we compare the statistical behaviors and the multifractal behaviors of the simulated data derived from the proposed model with those of the real stock markets. Further, the multiscale sample entropy analysis is employed to study the complexity of the returns, and the cross-sample entropy analysis is applied to measure the degree of asynchrony of return autocorrelation time series. The empirical results indicate that the proposed financial model can simulate and reproduce some significant characteristics of the real stock markets to a certain extent.

  8. EEG artifacts reduction by multivariate empirical mode decomposition and multiscale entropy for monitoring depth of anaesthesia during surgery.

    PubMed

    Liu, Quan; Chen, Yi-Feng; Fan, Shou-Zen; Abbod, Maysam F; Shieh, Jiann-Shing

    2017-08-01

    Electroencephalography (EEG) has been widely utilized to measure the depth of anaesthesia (DOA) during operation. However, the EEG signals are usually contaminated by artifacts which affect the accuracy of the measured DOA. In this study, an effective and useful filtering algorithm based on multivariate empirical mode decomposition and multiscale entropy (MSE) is proposed to measure DOA. The mean entropy of the MSE is used as an index to find artifact-free intrinsic mode functions. The effect of different levels of artifacts on the performance of the proposed filtering is analysed using simulated data. Furthermore, 21 patients' EEG signals are collected and analysed using sample entropy to calculate the complexity for monitoring DOA. The correlation coefficients of entropy and the bispectral index (BIS) are 0.14 ± 0.30 and 0.63 ± 0.09 before and after filtering, respectively. An artificial neural network (ANN) model is used for range mapping in order to correlate the measurements with BIS. The ANN method yields a strong correlation coefficient (0.75 ± 0.08). The results in this paper verify that entropy values and BIS have a strong correlation for the purpose of DOA monitoring and that the proposed filtering method can effectively filter artifacts from EEG signals. The proposed method performs better than the commonly used wavelet denoising method. This study provides a fully adaptive and automated filter for EEG to measure DOA more accurately and thus reduce risks related to the maintenance of anaesthetic agents.

  9. Complexity analysis of brain activity in attention-deficit/hyperactivity disorder: A multiscale entropy analysis.

    PubMed

    Chenxi, Li; Chen, Yanni; Li, Youjun; Wang, Jue; Liu, Tian

    2016-06-01

    The multiscale entropy (MSE) is a novel method for quantifying the intrinsic dynamical complexity of physiological systems over several scales. To evaluate this method as a promising way to explore the neural mechanisms in ADHD, we calculated the MSE of EEG activity during the designed task. EEG data were collected from 13 outpatient boys with a confirmed diagnosis of ADHD and 13 age- and gender-matched normal control children while they performed the multi-source interference task (MSIT). We estimated the MSE by calculating the sample entropy values of the delta, theta, alpha and beta frequency bands over twenty time scales using a coarse-graining procedure. The results showed increased complexity of EEG data in the delta and theta frequency bands and decreased complexity in the alpha frequency band in ADHD children. The findings of this study revealed aberrant neural connectivity in children with ADHD during the interference task. The results indicate that the MSE method may provide a new index to identify and understand the neural mechanisms of ADHD. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Symmetric encryption algorithms using chaotic and non-chaotic generators: A review

    PubMed Central

    Radwan, Ahmed G.; AbdElHaleem, Sherif H.; Abd-El-Hafiz, Salwa K.

    2015-01-01

    This paper summarizes the symmetric image encryption results of 27 different algorithms, which include substitution-only, permutation-only or both phases. The cores of these algorithms are based on several discrete chaotic maps (Arnold’s cat map and a combination of three generalized maps), one continuous chaotic system (Lorenz) and two non-chaotic generators (fractals and chess-based algorithms). Each algorithm has been analyzed by the correlation coefficients between pixels (horizontal, vertical and diagonal), differential attack measures, Mean Square Error (MSE), entropy, sensitivity analyses and the 15 standard tests of the National Institute of Standards and Technology (NIST) SP-800-22 statistical suite. The analyzed algorithms include a set of new image encryption algorithms based on non-chaotic generators, using substitution only (fractals), permutation only (chess-based), or both. Moreover, two different permutation scenarios are presented where the permutation phase either has or does not have a relationship with the input image through an ON/OFF switch. Different encryption-key lengths and complexities are provided, from short to long keys, to resist brute-force attacks. In addition, the sensitivities of these different techniques to a one-bit change in the input parameters of the substitution key as well as the permutation key are assessed. Finally, a comparative discussion of this work versus much recent research with respect to the generators used, type of encryption, and analyses is presented to highlight the strengths and added contribution of this paper. PMID:26966561

  11. Development of isothermal-isobaric replica-permutation method for molecular dynamics and Monte Carlo simulations and its application to reveal temperature and pressure dependence of folded, misfolded, and unfolded states of chignolin

    NASA Astrophysics Data System (ADS)

    Yamauchi, Masataka; Okumura, Hisashi

    2017-11-01

    We developed a two-dimensional replica-permutation molecular dynamics method in the isothermal-isobaric ensemble. The replica-permutation method is a better alternative to the replica-exchange method and was originally developed in the canonical ensemble. This method employs the Suwa-Todo algorithm, instead of the Metropolis algorithm, to perform permutations of temperatures and pressures among more than two replicas so that the rejection ratio can be minimized. We showed that the isothermal-isobaric replica-permutation method achieves better sampling efficiency than the isothermal-isobaric replica-exchange and infinite swapping methods. We applied this method to a β-hairpin mini protein, chignolin. In this simulation, we observed not only the folded state but also the misfolded state. We calculated the temperature and pressure dependence of the fractions of the folded, misfolded, and unfolded states. Differences in partial molar enthalpy, internal energy, entropy, partial molar volume, and heat capacity were also determined and agreed well with experimental data. We observed a new phenomenon in which misfolded chignolin becomes more stable under high-pressure conditions. We also revealed the mechanism of this stability as follows: the TYR2 and TRP9 side chains cover the hydrogen bonds that form the β-hairpin structure, so the hydrogen bonds are protected from the water molecules that approach the protein as the pressure increases.

  12. Compression based entropy estimation of heart rate variability on multiple time scales.

    PubMed

    Baumert, Mathias; Voss, Andreas; Javorka, Michal

    2013-01-01

    Heart rate fluctuates beat by beat in a complex manner. The aim of this study was to develop a framework for entropy assessment of heart rate fluctuations on multiple time scales. We employed the Lempel-Ziv algorithm for lossless data compression to investigate the compressibility of RR interval time series on different time scales, using a coarse-graining procedure. We estimated the entropy of RR interval time series of 20 young and 20 old subjects and also investigated the compressibility of randomly shuffled surrogate RR time series. The original RR time series displayed significantly smaller compression entropy values than randomized RR interval data. The RR interval time series of older subjects showed significantly different entropy characteristics over multiple time scales than those of younger subjects. In conclusion, data compression may be a useful approach for multiscale entropy assessment of heart rate variability.
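
    A rough sketch of a compression-based, multiscale entropy proxy is shown below, using zlib (an LZ-family compressor) on a quantized, coarse-grained series; the study's exact Lempel-Ziv implementation and parameters may differ.

```python
# Illustrative sketch of compression-based entropy over multiple time scales.
import zlib
import numpy as np

def compression_entropy(x, n_levels=8):
    """Compressed size per sample of the quantized series (a complexity proxy)."""
    x = np.asarray(x, dtype=float)
    # Quantize into n_levels equiprobable bins via ranks, then serialize to bytes.
    ranks = np.argsort(np.argsort(x))
    symbols = (ranks * n_levels // len(x)).astype(np.uint8).tobytes()
    return len(zlib.compress(symbols, level=9)) / len(x)

def multiscale_compression_entropy(rr, scales=(1, 2, 4, 8)):
    out = []
    for tau in scales:
        n = len(rr) // tau
        cg = rr[:n * tau].reshape(n, tau).mean(axis=1)   # coarse-graining
        out.append(compression_entropy(cg))
    return out

rng = np.random.default_rng(7)
noise = rng.standard_normal(5000)
rr = 0.8 + 0.05 * np.convolve(noise, np.ones(20) / 20, mode='same')  # correlated surrogate RR (s)
shuffled = rng.permutation(rr)
print(multiscale_compression_entropy(rr))
print(multiscale_compression_entropy(shuffled))   # shuffled data compress less well (larger values)
```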

  13. Quantum Tomography via Compressed Sensing: Error Bounds, Sample Complexity and Efficient Estimators

    DTIC Science & Technology

    2012-09-27

    particular, we require no entangling gates or ancillary systems for the procedure. In contrast with [19], our method is not restricted to processes that are...of states, such as those recently developed for use with permutation-invariant states [60], matrix product states [61] or multi-scale entangled states...process tomography: first prepare the Jamiołkowski state ρE (by adjoining an ancilla, preparing the maximally entangled state |ψ0〉, and applying E); then

  14. Discrimination of emotional states from scalp- and intracranial EEG using multiscale Rényi entropy.

    PubMed

    Tonoyan, Yelena; Chanwimalueang, Theerasak; Mandic, Danilo P; Van Hulle, Marc M

    2017-01-01

    A data-adaptive, multiscale version of Rényi's quadratic entropy (RQE) is introduced for emotional state discrimination from EEG recordings. The algorithm is applied to scalp EEG recordings of 30 participants watching 4 emotionally-charged video clips taken from a validated public database. Krippendorff's inter-rater statistic reveals that multiscale RQE of the mid-frontal scalp electrodes best discriminates between five emotional states. Multiscale RQE is also applied to joint scalp EEG, amygdala- and occipital pole intracranial recordings of an implanted patient watching a neutral and an emotionally charged video clip. Unlike for the neutral video clip, the RQEs of the mid-frontal scalp electrodes and the amygdala-implanted electrodes are observed to coincide in the time range where the crux of the emotionally-charged video clip is revealed. In addition, also during this time range, phase synchrony between the amygdala and mid-frontal recordings is maximal, as well as our 30 participants' inter-rater agreement on the same video clip. A source reconstruction exercise using intracranial recordings supports our assertion that amygdala could contribute to mid-frontal scalp EEG. On the contrary, no such contribution was observed for the occipital pole's intracranial recordings. Our results suggest that emotional states discriminated from mid-frontal scalp EEG are likely to be mirrored by differences in amygdala activations in particular when recorded in response to emotionally-charged scenes.

  15. Discrimination of emotional states from scalp- and intracranial EEG using multiscale Rényi entropy

    PubMed Central

    Chanwimalueang, Theerasak; Mandic, Danilo P.; Van Hulle, Marc M.

    2017-01-01

    A data-adaptive, multiscale version of Rényi’s quadratic entropy (RQE) is introduced for emotional state discrimination from EEG recordings. The algorithm is applied to scalp EEG recordings of 30 participants watching 4 emotionally-charged video clips taken from a validated public database. Krippendorff’s inter-rater statistic reveals that multiscale RQE of the mid-frontal scalp electrodes best discriminates between five emotional states. Multiscale RQE is also applied to joint scalp EEG, amygdala- and occipital pole intracranial recordings of an implanted patient watching a neutral and an emotionally charged video clip. Unlike for the neutral video clip, the RQEs of the mid-frontal scalp electrodes and the amygdala-implanted electrodes are observed to coincide in the time range where the crux of the emotionally-charged video clip is revealed. In addition, also during this time range, phase synchrony between the amygdala and mid-frontal recordings is maximal, as well as our 30 participants’ inter-rater agreement on the same video clip. A source reconstruction exercise using intracranial recordings supports our assertion that amygdala could contribute to mid-frontal scalp EEG. On the contrary, no such contribution was observed for the occipital pole’s intracranial recordings. Our results suggest that emotional states discriminated from mid-frontal scalp EEG are likely to be mirrored by differences in amygdala activations in particular when recorded in response to emotionally-charged scenes. PMID:29099846

  16. Shearlet-based measures of entropy and complexity for two-dimensional patterns

    NASA Astrophysics Data System (ADS)

    Brazhe, Alexey

    2018-06-01

    New spatial entropy and complexity measures for two-dimensional patterns are proposed. The approach is based on the notion of disequilibrium and is built on statistics of directional multiscale coefficients of the fast finite shearlet transform. Shannon entropy and Jensen-Shannon divergence measures are employed. Both local and global spatial complexity and entropy estimates can be obtained, thus allowing for spatial mapping of complexity in inhomogeneous patterns. The algorithm is validated in numerical experiments with a gradually decaying periodic pattern and Ising surfaces near critical state. It is concluded that the proposed algorithm can be instrumental in describing a wide range of two-dimensional imaging data, textures, or surfaces, where an understanding of the level of order or randomness is desired.

  17. How long the singular value decomposed entropy predicts the stock market? - Evidence from the Dow Jones Industrial Average Index

    NASA Astrophysics Data System (ADS)

    Gu, Rongbao; Shao, Yanmin

    2016-07-01

    In this paper, a new concept of multi-scale singular value decomposition entropy based on DCCA cross-correlation analysis is proposed and its predictive power for the Dow Jones Industrial Average Index is studied. Using Granger causality analysis with different time scales, it is found that the singular value decomposition entropy has predictive power for the Dow Jones Industrial Average Index for periods of less than one month, but not for more than one month. This shows how long the singular value decomposition entropy predicts the stock market, extending Caraiani's result obtained in Caraiani (2014). On the other hand, the result also reveals an essential characteristic of the stock market as a chaotic dynamic system.
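
    For illustration, a plain (single-scale) singular value decomposition entropy of a delay-embedded trajectory matrix can be sketched as follows; the paper's multi-scale, DCCA-based construction adds further steps that are not reproduced here.

```python
# Illustrative sketch of SVD entropy of a delay-embedding matrix.
import numpy as np

def svd_entropy(x, m=10, delay=1, normalize=True):
    # Delay-embedding (trajectory) matrix: rows are length-m windows.
    rows = len(x) - (m - 1) * delay
    traj = np.array([x[i:i + m * delay:delay] for i in range(rows)])
    s = np.linalg.svd(traj, compute_uv=False)
    p = s / s.sum()                       # normalized singular spectrum
    h = -(p[p > 0] * np.log2(p[p > 0])).sum()
    return h / np.log2(m) if normalize else h

rng = np.random.default_rng(8)
returns = rng.standard_normal(2000)                    # surrogate index returns
print(svd_entropy(returns))                            # close to 1 for noise-like returns
print(svd_entropy(np.sin(0.1 * np.arange(2000))))      # low for a near-periodic series
```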

  18. Entropy of hydrological systems under small samples: Uncertainty and variability

    NASA Astrophysics Data System (ADS)

    Liu, Dengfeng; Wang, Dong; Wang, Yuankun; Wu, Jichun; Singh, Vijay P.; Zeng, Xiankui; Wang, Lachun; Chen, Yuanfang; Chen, Xi; Zhang, Liyuan; Gu, Shenghua

    2016-01-01

    Entropy theory has been increasingly applied in hydrology in both descriptive and inferential ways. However, little attention has been given to the small-sample condition widespread in hydrological practice, where hydrological measurements are either limited or even nonexistent. Accordingly, entropy estimated under this condition may incur considerable bias. In this study, the small-sample condition is considered and two innovative entropy estimators, the Chao-Shen (CS) estimator and the James-Stein-type shrinkage (JSS) estimator, are introduced. Simulation tests are conducted with distributions common in hydrology, which identify the JSS estimator as the best performing. Then, multi-scale moving entropy-based hydrological analyses (MM-EHA) are applied to reveal the changing patterns of uncertainty in streamflow data collected from the Yangtze River and the Yellow River, China. For further investigation into the intrinsic properties of entropy applied in hydrological uncertainty analyses, correlations of entropy and other statistics at different time scales are also calculated, which show connections between the concepts of uncertainty and variability.
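
    A sketch of a James-Stein-type shrinkage entropy estimate in this spirit is given below: observed cell frequencies are shrunk toward the uniform distribution with a data-driven intensity and then plugged into the Shannon formula. The shrinkage formula follows the Hausser-Strimmer form as I recall it and may differ in detail from the study's implementation; the example data are purely synthetic.

```python
# Illustrative sketch of a shrinkage ("JSS"-style) entropy estimator for small samples.
import numpy as np

def shrinkage_entropy(counts):
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    k = len(counts)
    p_ml = counts / n                       # maximum-likelihood cell frequencies
    target = np.full(k, 1.0 / k)            # shrinkage target: uniform distribution
    # Data-driven shrinkage intensity (assumed Hausser-Strimmer form), clipped to [0, 1].
    lam = (1.0 - np.sum(p_ml ** 2)) / ((n - 1.0) * np.sum((target - p_ml) ** 2))
    lam = np.clip(lam, 0.0, 1.0)
    p_shrink = lam * target + (1.0 - lam) * p_ml
    p_shrink = p_shrink[p_shrink > 0]
    return -(p_shrink * np.log(p_shrink)).sum()

# Small-sample example: 30 draws from a 12-bin discretized (synthetic) streamflow record.
rng = np.random.default_rng(9)
sample = rng.choice(12, size=30, p=np.r_[np.full(4, 0.15), np.full(8, 0.05)])
counts = np.bincount(sample, minlength=12)
p_ml = counts / counts.sum()
naive = -(p_ml[p_ml > 0] * np.log(p_ml[p_ml > 0])).sum()
print(naive, shrinkage_entropy(counts))     # the shrinkage estimate is typically less downward-biased
```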

  19. Dispersion entropy for the analysis of resting-state MEG regularity in Alzheimer's disease.

    PubMed

    Azami, Hamed; Rostaghi, Mostafa; Fernandez, Alberto; Escudero, Javier

    2016-08-01

    Alzheimer's disease (AD) is a progressive degenerative brain disorder affecting memory, thinking, behaviour and emotion. It is the most common form of dementia and a major social problem in western societies. The analysis of brain activity may help to diagnose this disease. Changes measured by entropy methods have been reported to be useful for characterizing AD in research studies. We have recently proposed dispersion entropy (DisEn) as a very fast and powerful tool to quantify the irregularity of time series. The aim of this paper is to evaluate the ability of DisEn, in comparison with fuzzy entropy (FuzEn), sample entropy (SampEn), and permutation entropy (PerEn), to discriminate 36 AD patients from 26 elderly control subjects using resting-state magnetoencephalogram (MEG) signals. The results obtained by DisEn, FuzEn, and SampEn, unlike PerEn, show that the AD patients' signals are more regular than the controls' time series. The p-values obtained by the DisEn-, FuzEn-, SampEn-, and PerEn-based methods demonstrate the superiority of DisEn over FuzEn, SampEn, and PerEn. Moreover, the computation time for the newly proposed DisEn-based method is noticeably less than for the FuzEn-, SampEn-, and PerEn-based approaches.
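
    A minimal dispersion entropy along the lines described above can be sketched as follows: map the signal through a normal CDF, quantize into c classes, count the dispersion patterns of length m, and take the normalized Shannon entropy of their distribution; the parameter values are illustrative.

```python
# Illustrative sketch of dispersion entropy (DisEn).
import numpy as np
from collections import Counter
from scipy.stats import norm

def dispersion_entropy(x, m=3, c=6, delay=1):
    x = np.asarray(x, dtype=float)
    y = norm.cdf(x, loc=x.mean(), scale=x.std())           # map samples to (0, 1)
    z = np.clip(np.round(c * y + 0.5), 1, c).astype(int)   # quantize into classes 1..c
    patterns = Counter(tuple(z[i:i + m * delay:delay])
                       for i in range(len(z) - (m - 1) * delay))
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return -(p * np.log(p)).sum() / np.log(c ** m)          # normalized DisEn

rng = np.random.default_rng(10)
print(dispersion_entropy(rng.standard_normal(5000)))        # irregular signal: close to 1
print(dispersion_entropy(np.sin(0.02 * np.arange(5000))))   # regular signal: much lower
```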

  20. Comparison of background EEG activity of different groups of patients with idiopathic epilepsy using Shannon spectral entropy and cluster-based permutation statistical testing

    PubMed Central

    Artieda, Julio; Iriarte, Jorge

    2017-01-01

    Idiopathic epilepsy is characterized by generalized seizures with no apparent cause. One of its main problems is the lack of biomarkers to monitor the evolution of patients. The only tools they can use are limited to inspecting the amount of seizures during previous periods of time and assessing the existence of interictal discharges. As a result, there is a need for improving the tools to assist the diagnosis and follow up of these patients. The goal of the present study is to compare and find a way to differentiate between two groups of patients suffering from idiopathic epilepsy, one group that could be followed-up by means of specific electroencephalographic (EEG) signatures (intercritical activity present), and another one that could not due to the absence of these markers. To do that, we analyzed the background EEG activity of each in the absence of seizures and epileptic intercritical activity. We used the Shannon spectral entropy (SSE) as a metric to discriminate between the two groups and performed permutation-based statistical tests to detect the set of frequencies that show significant differences. By constraining the spectral entropy estimation to the [6.25–12.89) Hz range, we detect statistical differences (at below 0.05 alpha-level) between both types of epileptic patients at all available recording channels. Interestingly, entropy values follow a trend that is inversely related to the elapsed time from the last seizure. Indeed, this trend shows asymptotical convergence to the SSE values measured in a group of healthy subjects, which present SSE values lower than any of the two groups of patients. All these results suggest that the SSE, measured in a specific range of frequencies, could serve to follow up the evolution of patients suffering from idiopathic epilepsy. Future studies remain to be conducted in order to assess the predictive value of this approach for the anticipation of seizures. PMID:28922360
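
    For reference, a band-limited Shannon spectral entropy of the kind used here might be computed as in the sketch below; the band limits mirror the 6.25-12.89 Hz range mentioned above, while the sampling rate and Welch settings are assumed example values.

```python
# Illustrative sketch of band-limited Shannon spectral entropy (SSE).
import numpy as np
from scipy.signal import welch

def spectral_entropy(x, fs, band=(6.25, 12.89), normalize=True):
    f, pxx = welch(x, fs=fs, nperseg=fs * 2)             # 2-second Welch windows
    mask = (f >= band[0]) & (f < band[1])
    p = pxx[mask] / pxx[mask].sum()                      # normalized power in the band
    h = -(p * np.log2(p)).sum()
    return h / np.log2(mask.sum()) if normalize else h

fs = 256                                                  # assumed EEG sampling rate (Hz)
rng = np.random.default_rng(11)
t = np.arange(30 * fs) / fs
eeg_like = rng.standard_normal(len(t)) + 2 * np.sin(2 * np.pi * 10 * t)  # alpha-like peak
print(spectral_entropy(eeg_like, fs))
print(spectral_entropy(rng.standard_normal(len(t)), fs))  # flatter spectrum: higher SSE
```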

  1. Investigation on thermo-acoustic instability dynamic characteristics of hydrocarbon fuel flowing in scramjet cooling channel based on wavelet entropy method

    NASA Astrophysics Data System (ADS)

    Zan, Hao; Li, Haowei; Jiang, Yuguang; Wu, Meng; Zhou, Weixing; Bao, Wen

    2018-06-01

    As part of our efforts to further improve regenerative cooling technology in scramjets, experiments on the thermo-acoustic instability dynamics of hydrocarbon fuel flow were conducted in horizontal circular tubes under different conditions. The experimental results indicate that there is a developing process from thermo-acoustic stability to instability. To gain a deeper understanding of this developing process, the method of Multi-scale Shannon Wavelet Entropy (MSWE), based on a Wavelet Transform Correlation Filter (WTCF) and Multi-Scale Shannon Entropy (MSE), is adopted in this paper. The results demonstrate that the development of thermo-acoustic instability from noise and weak signals is well detected by the MSWE method and that the differences among stability, the developing process and instability can be identified. These properties render the method particularly powerful for providing early warning of thermo-acoustic instability of hydrocarbon fuel flowing in scramjet cooling channels. The mass flow rate and the inlet pressure both influence the developing process of the thermo-acoustic instability. This investigation of thermo-acoustic instability dynamics at supercritical pressure based on the wavelet entropy method offers guidance on the control of the scramjet fuel supply, which can secure stable fuel flow in the regenerative cooling system.

  2. A multiple-fan active control wind tunnel for outdoor wind speed and direction simulation

    NASA Astrophysics Data System (ADS)

    Wang, Jia-Ying; Meng, Qing-Hao; Luo, Bing; Zeng, Ming

    2018-03-01

    This article presents a new type of active controlled multiple-fan wind tunnel. The wind tunnel consists of swivel plates and arrays of direct current fans, and the rotation speed of each fan and the shaft angle of each swivel plate can be controlled independently for simulating different kinds of outdoor wind fields. To measure the similarity between the simulated wind field and the outdoor wind field, wind speed and direction time series of two kinds of wind fields are recorded by nine two-dimensional ultrasonic anemometers, and then statistical properties of the wind signals in different time scales are analyzed based on the empirical mode decomposition. In addition, the complexity of wind speed and direction time series is also investigated using multiscale entropy and multivariate multiscale entropy. Results suggest that the simulated wind field in the multiple-fan wind tunnel has a high degree of similarity with the outdoor wind field.

  3. A fault diagnosis scheme for planetary gearboxes using modified multi-scale symbolic dynamic entropy and mRMR feature selection

    NASA Astrophysics Data System (ADS)

    Li, Yongbo; Yang, Yuantao; Li, Guoyan; Xu, Minqiang; Huang, Wenhu

    2017-07-01

    Health condition identification of planetary gearboxes is crucial to reduce the downtime and maximize productivity. This paper aims to develop a novel fault diagnosis method based on modified multi-scale symbolic dynamic entropy (MMSDE) and minimum redundancy maximum relevance (mRMR) to identify the different health conditions of planetary gearbox. MMSDE is proposed to quantify the regularity of time series, which can assess the dynamical characteristics over a range of scales. MMSDE has obvious advantages in the detection of dynamical changes and computation efficiency. Then, the mRMR approach is introduced to refine the fault features. Lastly, the obtained new features are fed into the least square support vector machine (LSSVM) to complete the fault pattern identification. The proposed method is numerically and experimentally demonstrated to be able to recognize the different fault types of planetary gearboxes.

  4. Intercomparison of Multiscale Modeling Approaches in Simulating Subsurface Flow and Transport

    NASA Astrophysics Data System (ADS)

    Yang, X.; Mehmani, Y.; Barajas-Solano, D. A.; Song, H. S.; Balhoff, M.; Tartakovsky, A. M.; Scheibe, T. D.

    2016-12-01

    Hybrid multiscale simulations that couple models across scales are critical to advance predictions of the larger system behavior using understanding of fundamental processes. In the current study, three hybrid multiscale methods are intercompared: the multiscale loose-coupling method, the multiscale finite volume (MsFV) method and the multiscale mortar method. The loose-coupling method enables a parallel workflow structure based on the Swift scripting environment that manages the complex process of executing coupled micro- and macro-scale models without being intrusive to the at-scale simulators. The MsFV method applies microscale and macroscale models over overlapping subdomains of the modeling domain and enforces continuity of concentration and transport fluxes between models via restriction and prolongation operators. The mortar method is a non-overlapping domain decomposition approach capable of coupling all permutations of pore- and continuum-scale models with each other. In doing so, Lagrange multipliers are used at interfaces shared between the subdomains so as to establish continuity of species/fluid mass flux. Subdomain computations can be performed either concurrently or non-concurrently depending on the algorithm used. All of the above methods have been proven to be accurate and efficient in studying flow and transport in porous media. However, there have not been any field-scale applications or benchmarking of the various hybrid multiscale approaches. To address this challenge, we apply all three hybrid multiscale methods to simulate water flow and transport in a conceptualized 2D modeling domain of the hyporheic zone, where strong interactions between groundwater and surface water exist across multiple scales. In all three multiscale methods, fine-scale simulations are applied to a thin layer of riverbed alluvial sediments while the macroscopic simulations are used for the larger subsurface aquifer domain. Different numerical coupling methods are then applied between scales and inter-compared. Comparisons are drawn in terms of velocity distributions, solute transport behavior, algorithm-induced numerical error and computing cost. The intercomparison work provides support for confidence in a variety of hybrid multiscale methods and motivates further development and applications.

  5. Quantum Tomography via Compressed Sensing: Error Bounds, Sample Complexity and Efficient Estimators (Open Access, Publisher’s Version)

    DTIC Science & Technology

    2012-09-27

    we require no entangling gates or ancillary systems for the procedure. In contrast with [19], our method is not restricted to processes that are...states, such as those recently developed for use with permutation-invariant states [60], matrix product states [61] or multi-scale entangled states [62...by adjoining an ancilla, preparing the maximally entangled state |ψ0〉, and applying E); then do compressed quantum state tomography on ρE ; see

  6. Age-related Multiscale Changes in Brain Signal Variability in Pre-task versus Post-task Resting-state EEG.

    PubMed

    Wang, Hongye; McIntosh, Anthony R; Kovacevic, Natasa; Karachalios, Maria; Protzner, Andrea B

    2016-07-01

    Recent empirical work suggests that, during healthy aging, the variability of network dynamics changes during task performance. Such variability appears to reflect the spontaneous formation and dissolution of different functional networks. We sought to extend these observations into resting-state dynamics. We recorded EEG in young, middle-aged, and older adults during a "rest-task-rest" design and investigated if aging modifies the interaction between resting-state activity and external stimulus-induced activity. Using multiscale entropy as our measure of variability, we found that, with increasing age, resting-state dynamics shifts from distributed to more local neural processing, especially at posterior sources. In the young group, resting-state dynamics also changed from pre- to post-task, where fine-scale entropy increased in task-positive regions and coarse-scale entropy increased in the posterior cingulate, a key region associated with the default mode network. Lastly, pre- and post-task resting-state dynamics were linked to performance on the intervening task for all age groups, but this relationship became weaker with increasing age. Our results suggest that age-related changes in resting-state dynamics occur across different spatial and temporal scales and have consequences for information processing capacity.

  7. BGK-MD, Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haack, Jeffrey; Shohet, Gil

    2016-12-02

    The software implements a heterogeneous multiscale method (HMM), which involves solving a classical molecular dynamics (MD) problem and then computes the entropy production in order to compute the relaxation times towards equilibrium for use in a Bhatnagar-Gross-Krook (BGK) solver.

  8. Path length entropy analysis of diastolic heart sounds.

    PubMed

    Griffel, Benjamin; Zia, Mohammad K; Fridman, Vladamir; Saponieri, Cesare; Semmlow, John L

    2013-09-01

    Early detection of coronary artery disease (CAD) using the acoustic approach, a noninvasive and cost-effective method, would greatly improve the outcome of CAD patients. To detect CAD, we analyze diastolic sounds for possible CAD murmurs. We observed diastolic sounds to exhibit 1/f structure and developed a new method, path length entropy (PLE) and a scaled version (SPLE), to characterize this structure to improve CAD detection. We compare SPLE results to Hurst exponent, Sample entropy and Multiscale entropy for distinguishing between normal and CAD patients. SPLE achieved a sensitivity-specificity of 80%-81%, the best of the tested methods. However, PLE and SPLE are not sufficient to prove nonlinearity, and evaluation using surrogate data suggests that our cardiovascular sound recordings do not contain significant nonlinear properties. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Path Length Entropy Analysis of Diastolic Heart Sounds

    PubMed Central

    Griffel, B.; Zia, M. K.; Fridman, V.; Saponieri, C.; Semmlow, J. L.

    2013-01-01

    Early detection of coronary artery disease (CAD) using the acoustic approach, a noninvasive and cost-effective method, would greatly improve the outcome of CAD patients. To detect CAD, we analyze diastolic sounds for possible CAD murmurs. We observed diastolic sounds to exhibit 1/f structure and developed a new method, path length entropy (PLE) and a scaled version (SPLE), to characterize this structure to improve CAD detection. We compare SPLE results to Hurst exponent, Sample entropy and Multi-scale entropy for distinguishing between normal and CAD patients. SPLE achieved a sensitivity-specificity of 80%–81%, the best of the tested methods. However, PLE and SPLE are not sufficient to prove nonlinearity, and evaluation using surrogate data suggests that our cardiovascular sound recordings do not contain significant nonlinear properties. PMID:23930808

  10. Complexity analysis of the turbulent environmental fluid flow time series

    NASA Astrophysics Data System (ADS)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
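
    A rough sketch of a normalized Lempel-Ziv complexity of the median-binarized series is given below; it uses an LZ78-style phrase count, which is simpler than, and not identical to, the lower and upper Kolmogorov complexity variants (KLL, KLU) computed in the study, and the example data are synthetic.

```python
# Illustrative sketch of a normalized Lempel-Ziv complexity of a binarized series.
import numpy as np

def lz_complexity(symbols):
    """Count distinct phrases in an incremental (LZ78-style) parsing."""
    phrases, phrase = set(), ""
    for ch in symbols:
        phrase += ch
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

def normalized_lz(x):
    s = "".join('1' if v > np.median(x) else '0' for v in x)  # binarize at the median
    n = len(s)
    return lz_complexity(s) / (n / np.log2(n))                # phrase count per n / log2(n)

rng = np.random.default_rng(12)
months = np.arange(780)                                        # 65 years of synthetic monthly flow
flow_like = np.abs(np.sin(2 * np.pi * months / 12)) + 0.3 * rng.random(780)
print(normalized_lz(flow_like))            # annual-cycle-dominated series: lower complexity
print(normalized_lz(rng.random(780)))      # random series: clearly larger value
```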

  11. EEG entropy measures indicate decrease of cortical information processing in Disorders of Consciousness.

    PubMed

    Thul, Alexander; Lechinger, Julia; Donis, Johann; Michitsch, Gabriele; Pichler, Gerald; Kochs, Eberhard F; Jordan, Denis; Ilg, Rüdiger; Schabus, Manuel

    2016-02-01

    Clinical assessments that rely on behavioral responses to differentiate Disorders of Consciousness are at times inapt because of some patients' motor disabilities. To objectify patients' conditions of reduced consciousness, the present study evaluated the use of electroencephalography to measure residual brain activity. We analyzed entropy values of 18 scalp EEG channels of 15 severely brain-damaged patients with clinically diagnosed Minimally-Conscious-State (MCS) or Unresponsive-Wakefulness-Syndrome (UWS) and compared the results to a sample of 24 control subjects. Permutation entropy (PeEn) and symbolic transfer entropy (STEn), reflecting information processes in the EEG, were calculated for all subjects. Participants were tested on a modified active own-name paradigm to identify correlates of active instruction following. PeEn showed reduced local information content in the EEG of patients, which was most pronounced in UWS. STEn analysis revealed altered directed information flow in the EEG of patients, indicating impaired feed-backward connectivity. Responses to auditory stimulation yielded differences in entropy measures, indicating reduced information processing in MCS and UWS. Local EEG information content and information flow are affected in Disorders of Consciousness. This suggests local cortical information capacity and feedback information transfer as neural correlates of consciousness. The utilized EEG entropy analyses were able to differentiate patient groups with different Disorders of Consciousness. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  12. A secure image encryption method based on dynamic harmony search (DHS) combined with chaotic map

    NASA Astrophysics Data System (ADS)

    Mirzaei Talarposhti, Khadijeh; Khaki Jamei, Mehrzad

    2016-06-01

    In recent years, there has been increasing interest in the security of digital images. This study focuses on gray-scale image encryption using dynamic harmony search (DHS). In this research, a chaotic map is first used to create cipher images, and then the maximum entropy and minimum correlation coefficient are obtained by applying a harmony search algorithm to them. This process is divided into two steps. In the first step, the diffusion of a plain image is performed using DHS to maximize the entropy as a fitness function. In the second step, a horizontal and vertical permutation is applied to the best cipher image obtained in the previous step; additionally, DHS is used to minimize the correlation coefficient as a fitness function in this step. The simulation results show that, by using the proposed method, a maximum entropy and a minimum correlation coefficient of approximately 7.9998 and 0.0001, respectively, are obtained.

  13. Shortening a loop can increase protein native state entropy.

    PubMed

    Gavrilov, Yulian; Dagan, Shlomi; Levy, Yaakov

    2015-12-01

    Protein loops are essential structural elements that influence not only function but also protein stability and folding rates. It was recently reported that shortening a loop in the AcP protein may increase its native state conformational entropy. This effect on the entropy of the folded state can be much larger than the lower entropic penalty of ordering a shorter loop upon folding, and can therefore result in a more pronounced stabilization than predicted by the polymer model for loop closure entropy. In this study, which aims at generalizing the effect of loop-length shortening on native state dynamics, we use all-atom molecular dynamics simulations to study how gradually shortening a very long or solvent-exposed loop region in four different proteins can affect their stability. For two proteins, AcP and Ubc7, we show an increase in native state entropy in addition to the known effect of the loop length on the unfolded state entropy. However, for two permutants of the SH3 domain, shortening a loop results only in the expected change in the entropy of the unfolded state, which nicely reproduces the observed experimental stabilization. Here, we show that an increase in the native state entropy following loop shortening is not unique to the AcP protein, nor is it a general rule that applies to all proteins following the truncation of any loop. The combined effect of loop-length modification on the folded and unfolded states may thus result in a greater overall effect on protein stability. © 2015 Wiley Periodicals, Inc.

  14. Scale-dependent intrinsic entropies of complex time series.

    PubMed

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease. © 2016 The Author(s).

  15. Classification of Partial Discharge Signals by Combining Adaptive Local Iterative Filtering and Entropy Features

    PubMed Central

    Morison, Gordon; Boreham, Philip

    2018-01-01

    Electromagnetic Interference (EMI) is a technique for capturing Partial Discharge (PD) signals in High-Voltage (HV) power plant apparatus. EMI signals can be non-stationary which makes their analysis difficult, particularly for pattern recognition applications. This paper elaborates upon a previously developed software condition-monitoring model for improved EMI events classification based on time-frequency signal decomposition and entropy features. The idea of the proposed method is to map multiple discharge source signals captured by EMI and labelled by experts, including PD, from the time domain to a feature space, which aids in the interpretation of subsequent fault information. Here, instead of using only one permutation entropy measure, a more robust measure, called Dispersion Entropy (DE), is added to the feature vector. Multi-Class Support Vector Machine (MCSVM) methods are utilized for classification of the different discharge sources. Results show an improved classification accuracy compared to previously proposed methods. This leads to the successful development of an expert-knowledge-based intelligent system. Since this method is demonstrated to be successful with real field data, it brings the benefit of possible real-world application for EMI condition monitoring. PMID:29385030
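
    The following is a minimal sketch of dispersion entropy (DE), the feature added to the vector in the study above, based on the commonly cited recipe of mapping samples to c classes through the normal CDF, embedding them into patterns of length m, and normalizing the Shannon entropy of the pattern distribution by log(c^m). The parameter values (c = 6, m = 3, delay = 1) are assumptions, not necessarily those used by the authors.

    ```python
    import numpy as np
    from scipy.stats import norm

    def dispersion_entropy(x, c=6, m=3, delay=1):
        """Normalized dispersion entropy of a 1-D signal."""
        x = np.asarray(x, dtype=float)
        # Map samples to classes 1..c via the normal CDF of the signal
        y = norm.cdf(x, loc=x.mean(), scale=x.std())
        z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
        # Embed the class series into dispersion patterns of length m
        n = len(z) - (m - 1) * delay
        patterns = np.array([z[i:i + m * delay:delay] for i in range(n)])
        _, counts = np.unique(patterns, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p)) / np.log(c ** m)

    rng = np.random.default_rng(5)
    print(dispersion_entropy(rng.standard_normal(4000)))   # close to 1 for white noise
    ```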

  16. Simultaneous Multi-Scale Diffusion Estimation and Tractography Guided by Entropy Spectrum Pathways

    PubMed Central

    Galinsky, Vitaly L.; Frank, Lawrence R.

    2015-01-01

    We have developed a method for the simultaneous estimation of local diffusion and the global fiber tracts based upon the information entropy flow that computes the maximum entropy trajectories between locations and depends upon the global structure of the multi-dimensional and multi-modal diffusion field. Computation of the entropy spectrum pathways requires only solving a simple eigenvector problem for the probability distribution for which efficient numerical routines exist, and a straightforward integration of the probability conservation through ray tracing of the convective modes guided by a global structure of the entropy spectrum coupled with a small scale local diffusion. The intervoxel diffusion is sampled by multi b-shell multi q-angle DWI data expanded in spherical waves. This novel approach to fiber tracking incorporates global information about multiple fiber crossings in every individual voxel and ranks it in the most scientifically rigorous way. This method has potential significance for a wide range of applications, including studies of brain connectivity. PMID:25532167

  17. New Insights into the Fractional Order Diffusion Equation Using Entropy and Kurtosis.

    PubMed

    Ingo, Carson; Magin, Richard L; Parrish, Todd B

    2014-11-01

    Fractional order derivative operators offer a concise description to model multi-scale, heterogeneous and non-local systems. Specifically, in magnetic resonance imaging, there has been recent work to apply fractional order derivatives to model the non-Gaussian diffusion signal, which is ubiquitous in the movement of water protons within biological tissue. To provide a new perspective for establishing the utility of fractional order models, we apply entropy for the case of anomalous diffusion governed by a fractional order diffusion equation generalized in space and in time. This fractional order representation, in the form of the Mittag-Leffler function, gives an entropy minimum for the integer case of Gaussian diffusion and greater values of spectral entropy for non-integer values of the space and time derivatives. Furthermore, we consider kurtosis, defined as the normalized fourth moment, as another probabilistic description of the fractional time derivative. Finally, we demonstrate the implementation of anomalous diffusion, entropy and kurtosis measurements in diffusion weighted magnetic resonance imaging in the brain of a chronic ischemic stroke patient.

  18. Improved wavelet packet classification algorithm for vibrational intrusions in distributed fiber-optic monitoring systems

    NASA Astrophysics Data System (ADS)

    Wang, Bingjie; Pi, Shaohua; Sun, Qi; Jia, Bo

    2015-05-01

    An improved classification algorithm that considers multiscale wavelet packet Shannon entropy is proposed. Decomposition coefficients at all levels are obtained to build the initial Shannon entropy feature vector. After subtracting the Shannon entropy map of the background signal, components of the strongest discriminating power in the initial feature vector are picked out to rebuild the Shannon entropy feature vector, which is transferred to a radial basis function (RBF) neural network for classification. Four types of man-made vibrational intrusion signals are recorded based on a modified Sagnac interferometer. The performance of the improved classification algorithm has been evaluated by the classification experiments via RBF neural network under different diffusion coefficients. An 85% classification accuracy rate is achieved, which is higher than that of the other common algorithms. The classification results show that this improved classification algorithm can be used to classify vibrational intrusion signals in an automatic real-time monitoring system.
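
    A hedged sketch of the feature-extraction step described above, Shannon entropy of wavelet packet coefficients assembled into a feature vector, is shown below. It assumes the PyWavelets package, uses an illustrative wavelet ('db4') and depth, extracts only the deepest decomposition level for brevity, and omits the background subtraction, feature selection and RBF classifier.

    ```python
    import numpy as np
    import pywt

    def wavelet_packet_entropy_features(x, wavelet="db4", maxlevel=3):
        """Shannon entropy of the normalized coefficient energies of each
        wavelet packet node at the deepest level."""
        wp = pywt.WaveletPacket(data=np.asarray(x, dtype=float),
                                wavelet=wavelet, maxlevel=maxlevel)
        features = []
        for node in wp.get_level(maxlevel, order="natural"):
            energy = node.data ** 2
            p = energy / (energy.sum() + 1e-12)
            p = p[p > 0]
            features.append(-np.sum(p * np.log(p)))   # entropy of one node
        return np.array(features)

    rng = np.random.default_rng(4)
    t = np.arange(2048) / 1000.0
    signal = np.sin(2 * np.pi * 50 * t) + rng.standard_normal(len(t))
    print(wavelet_packet_entropy_features(signal).shape)   # 2**maxlevel = 8 features
    ```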

  19. The effect of orthostatic stress on multiscale entropy of heart rate and blood pressure.

    PubMed

    Turianikova, Zuzana; Javorka, Kamil; Baumert, Mathias; Calkovska, Andrea; Javorka, Michal

    2011-09-01

    Cardiovascular control acts over multiple time scales, which introduces a significant amount of complexity to heart rate and blood pressure time series. Multiscale entropy (MSE) analysis has been developed to quantify the complexity of a time series over multiple time scales. In previous studies, MSE analyses identified impaired cardiovascular control and increased cardiovascular risk in various pathological conditions. Despite the increasing acceptance of the MSE technique in clinical research, information underpinning the involvement of the autonomic nervous system in the MSE of heart rate and blood pressure is lacking. The objective of this study is to investigate the effect of orthostatic challenge on the MSE of heart rate and blood pressure variability (HRV, BPV) and the correlation between MSE (complexity measures) and traditional linear (time and frequency domain) measures. MSE analysis of HRV and BPV was performed in 28 healthy young subjects on 1000 consecutive heart beats in the supine and standing positions. Sample entropy values were assessed on scales of 1-10. We found that MSE of heart rate and blood pressure signals is sensitive to changes in autonomic balance caused by postural change from the supine to the standing position. The effect of orthostatic challenge on heart rate and blood pressure complexity depended on the time scale under investigation. Entropy values did not correlate with the mean values of heart rate and blood pressure and showed only weak correlations with linear HRV and BPV measures. In conclusion, the MSE analysis of heart rate and blood pressure provides a sensitive tool to detect changes in autonomic balance as induced by postural change.
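
    For reference, here is a minimal sketch of the multiscale entropy (MSE) procedure described above: the series is coarse-grained at each scale factor and sample entropy is computed on the coarse-grained series. The parameter choices (m = 2, r = 0.15 times the standard deviation of the original series, scales 1-10) are common defaults assumed here, not necessarily those of the study.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=None):
        """Approximate sample entropy of a 1-D series (simplified sketch)."""
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.15 * np.std(x)
        n = len(x)

        def count_matches(mm):
            templ = np.array([x[i:i + mm] for i in range(n - mm)])
            count = 0
            for i in range(len(templ)):
                d = np.max(np.abs(templ - templ[i]), axis=1)  # Chebyshev distance
                count += int(np.sum(d <= r)) - 1              # exclude self-match
            return count

        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def multiscale_entropy(x, max_scale=10, m=2, r=None):
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.15 * np.std(x)   # r fixed from the original series, as is common
        mse = []
        for s in range(1, max_scale + 1):
            n = len(x) // s
            coarse = x[:n * s].reshape(n, s).mean(axis=1)   # coarse-grain at scale s
            mse.append(sample_entropy(coarse, m=m, r=r))
        return mse

    rng = np.random.default_rng(2)
    print(multiscale_entropy(rng.standard_normal(1000), max_scale=5))
    ```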

  20. Multiscale synchrony behaviors of paired financial time series by 3D multi-continuum percolation

    NASA Astrophysics Data System (ADS)

    Wang, M.; Wang, J.; Wang, B. T.

    2018-02-01

    Multiscale synchrony behaviors and nonlinear dynamics of paired financial time series are investigated, in an attempt to study the cross correlation relationships between two stock markets. A random stock price model is developed by a new system called three-dimensional (3D) multi-continuum percolation system, which is utilized to imitate the formation mechanism of price dynamics and explain the nonlinear behaviors found in financial time series. We assume that the price fluctuations are caused by the spread of investment information. The cluster of 3D multi-continuum percolation represents the cluster of investors who share the same investment attitude. In this paper, we focus on the paired return series, the paired volatility series, and the paired intrinsic mode functions which are decomposed by empirical mode decomposition. A new cross recurrence quantification analysis is put forward, combined with multiscale cross-sample entropy, to investigate the multiscale synchrony of these paired series from the proposed model. The corresponding research is also carried out for two Chinese stock markets for comparison.

  1. Multiscale Entropy of Electroencephalogram as a Potential Predictor for the Prognosis of Neonatal Seizures.

    PubMed

    Lu, Wen-Yu; Chen, Jyun-Yu; Chang, Chi-Feng; Weng, Wen-Chin; Lee, Wang-Tso; Shieh, Jiann-Shing

    2015-01-01

    A growing number of animal studies support the harmful effects of prolonged or frequent neonatal seizures on the developing brain, including an increased risk of later epilepsy. Various nonlinear analytic measures have been applied to investigate the change of brain complexity with age. This study focuses on clarifying the relationship between later epilepsy and changes of electroencephalogram (EEG) complexity in neonatal seizures. EEG signals from 19 channels covering the whole brain were acquired from 32 neonates below 2 months old. The neonates were classified into 3 groups: 9 normal controls, 9 with neonatal seizures without later epilepsy, and 14 with neonatal seizures and later epilepsy. Sample entropy (SamEn), multiscale entropy (MSE) and the complexity index (CI) were analyzed. Although there was no significant change in SamEn, the CI values were significantly decreased over channels C3, C4, and Cz in patients with neonatal seizures and later epilepsy compared with the control group. More multifocal epileptiform discharges in the EEG, more abnormal neuroimaging findings, and a higher incidence of future developmental delay were noted in the group with later epilepsy. Decreased MSE and CI values in patients with neonatal seizures and later epilepsy may reflect the mixed effects of acute insults, underlying brain immaturity, and prolonged seizure-related injuries. The analysis of MSE and CI can therefore provide a quantifiable and accurate way to decrypt the mystery of neonatal seizures, and could be a promising predictor.

  2. Analysis of financial time series using multiscale entropy based on skewness and kurtosis

    NASA Astrophysics Data System (ADS)

    Xu, Meng; Shang, Pengjian

    2018-01-01

    There is great interest in studying the dynamic characteristics of financial time series of daily stock closing prices in different regions. Multi-scale entropy (MSE) is effective mainly in quantifying the complexity of time series on different time scales. This paper applies a new method for assessing financial stability from the perspective of MSE based on skewness and kurtosis. To better understand which coarse-graining method is superior for different kinds of stock indexes, we take into account the developmental characteristics of the stock markets of three continents: Asia, North America and Europe. We study the volatility of different financial time series in addition to analyzing the similarities and differences of the coarse-grained time series from the perspective of skewness and kurtosis. A corresponding relationship between the entropy value of stock sequences and the degree of stability of financial markets was observed. Three stocks with particular characteristics among the eight stock sequences are discussed, and their behavior matches the results obtained by applying the MSE method. A comparative study is conducted over synthetic and real-world data. Results show that the modified method is more sensitive to changes in the dynamics and carries more valuable information, and that the discrimination based on skewness and kurtosis is both clearer and more stable.

  3. On the efficiency of sovereign bond markets

    NASA Astrophysics Data System (ADS)

    Zunino, Luciano; Fernández Bariviera, Aurelio; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2012-09-01

    The existence of memory in financial time series has been extensively studied for several stock markets around the world by means of different approaches. However, fixed income markets, i.e. those where corporate and sovereign bonds are traded, have been much less studied. We believe that, given the relevance of these markets, not only from the investors', but also from the issuers' point of view (government and firms), it is necessary to fill this gap in the literature. In this paper, we study the sovereign market efficiency of thirty bond indices of both developed and emerging countries, using an innovative statistical tool in the financial literature: the complexity-entropy causality plane. This representation space allows us to establish an efficiency ranking of different markets and distinguish different bond market dynamics. We conclude that the classification derived from the complexity-entropy causality plane is consistent with the qualifications assigned by major rating companies to the sovereign instruments. Additionally, we find a correlation between permutation entropy, economic development and market size that could be of interest for policy makers and investors.
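
    The complexity-entropy causality plane mentioned above pairs the normalized permutation entropy H with the Jensen-Shannon statistical complexity C. The sketch below is one standard way to compute that pair for a scalar series; the embedding dimension and the logistic-map example are illustrative assumptions, and the bond-index data handling of the study is not reproduced.

    ```python
    import numpy as np
    from math import factorial

    def ordinal_distribution(x, m=3, tau=1):
        """Probability vector over all m! ordinal patterns (unseen patterns get 0)."""
        x = np.asarray(x, dtype=float)
        n = len(x) - (m - 1) * tau
        pats = np.array([np.argsort(x[i:i + m * tau:tau]) for i in range(n)])
        _, counts = np.unique(pats, axis=0, return_counts=True)
        p = np.zeros(factorial(m))
        p[:len(counts)] = counts / counts.sum()
        return p

    def shannon(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def complexity_entropy(x, m=3, tau=1):
        """Return (H, C): the coordinates of the complexity-entropy causality plane."""
        p = ordinal_distribution(x, m, tau)
        N = len(p)
        pe = np.full(N, 1.0 / N)                      # uniform reference distribution
        H = shannon(p) / np.log(N)
        js = shannon((p + pe) / 2) - shannon(p) / 2 - shannon(pe) / 2
        q_max = -0.5 * ((N + 1) / N * np.log(N + 1) - 2 * np.log(2 * N) + np.log(N))
        return H, (js / q_max) * H

    # Chaotic logistic map vs. white noise: the map sits higher on the C axis
    x = np.empty(10000)
    x[0] = 0.1
    for i in range(1, len(x)):
        x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
    noise = np.random.default_rng(7).standard_normal(10000)
    print(complexity_entropy(x, m=4), complexity_entropy(noise, m=4))
    ```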

  4. Classification enhancement for post-stroke dementia using fuzzy neighborhood preserving analysis with QR-decomposition.

    PubMed

    Al-Qazzaz, Noor Kamal; Ali, Sawal; Ahmad, Siti Anom; Escudero, Javier

    2017-07-01

    The aim of the present study was to discriminate the electroencephalogram (EEG) of 5 patients with vascular dementia (VaD), 15 patients with stroke-related mild cognitive impairment (MCI), and 15 control normal subjects during a working memory (WM) task. We used independent component analysis (ICA) and wavelet transform (WT) as a hybrid preprocessing approach for EEG artifact removal. Three different features were extracted from the cleaned EEG signals: spectral entropy (SpecEn), permutation entropy (PerEn) and Tsallis entropy (TsEn). Two classification schemes were applied - support vector machine (SVM) and k-nearest neighbors (kNN) - with fuzzy neighborhood preserving analysis with QR-decomposition (FNPAQR) as a dimensionality reduction technique. The FNPAQR dimensionality reduction technique increased the SVM classification accuracy from 82.22% to 90.37% and from 82.6% to 86.67% for kNN. These results suggest that FNPAQR consistently improves the discrimination of VaD, MCI patients and control normal subjects and it could be a useful feature selection to help the identification of patients with VaD and MCI.

  5. Analyzing the complexity of cone production in longleaf pine by multiscale entropy

    Treesearch

    Xiongwen Chen; Qinfeng Guo; Dale G. Brockway

    2016-01-01

    The longleaf pine (Pinus palustris Mill.) forests are important ecosystems in the southeastern USA because of their ecological and economic value. Since European settlement, longleaf pine ecosystems have dramatically declined in extent, to the degree that they are now listed as endangered ecosystems. Its sporadic seed production, which...

  6. Integration of High-Resolution Laser Displacement Sensors and 3D Printing for Structural Health Monitoring

    PubMed Central

    Chang, Shu-Wei; Kuo, Shih-Yu; Huang, Ting-Hsuan

    2017-01-01

    This paper presents a novel experimental design for complex structural health monitoring (SHM) studies achieved by integrating 3D printing technologies, high-resolution laser displacement sensors, and multiscale entropy SHM theory. A seven-story structure with a variety of composite bracing systems was constructed using a dual-material 3D printer. A wireless Bluetooth vibration speaker was used to excite the ground floor of the structure, and high-resolution laser displacement sensors (1-μm resolution) were used to monitor the displacement history on different floors. Our results showed that the multiscale entropy SHM method could detect damage on the 3D-printed structures. The results of this study demonstrate that integrating 3D printing technologies and high-resolution laser displacement sensors enables the design of cheap, fast processing, complex, small-scale civil structures for future SHM studies. The novel experimental design proposed in this study provides a suitable platform for investigating the validity and sensitivity of SHM in different composite structures and damage conditions for real life applications in the future. PMID:29271937

  7. Integration of High-Resolution Laser Displacement Sensors and 3D Printing for Structural Health Monitoring.

    PubMed

    Chang, Shu-Wei; Lin, Tzu-Kang; Kuo, Shih-Yu; Huang, Ting-Hsuan

    2017-12-22

    This paper presents a novel experimental design for complex structural health monitoring (SHM) studies achieved by integrating 3D printing technologies, high-resolution laser displacement sensors, and multiscale entropy SHM theory. A seven-story structure with a variety of composite bracing systems was constructed using a dual-material 3D printer. A wireless Bluetooth vibration speaker was used to excite the ground floor of the structure, and high-resolution laser displacement sensors (1-μm resolution) were used to monitor the displacement history on different floors. Our results showed that the multiscale entropy SHM method could detect damage on the 3D-printed structures. The results of this study demonstrate that integrating 3D printing technologies and high-resolution laser displacement sensors enables the design of cheap, fast processing, complex, small-scale civil structures for future SHM studies. The novel experimental design proposed in this study provides a suitable platform for investigating the validity and sensitivity of SHM in different composite structures and damage conditions for real life applications in the future.

  8. Paroxysmal atrial fibrillation recognition based on multi-scale Rényi entropy of ECG.

    PubMed

    Xin, Yi; Zhao, Yizhang; Mu, Yuanhui; Li, Qin; Shi, Caicheng

    2017-07-20

    Atrial fibrillation (AF) is a common type of arrhythmia, which has a high morbidity and can lead to serious complications. The ability to detect and in turn prevent AF is extremely significant to the patient and clinician. Using the ECG to detect AF and developing a robust and effective algorithm is the primary objective of this study. Some studies show that after AF occurs, the regulatory mechanisms of the vagus and sympathetic nerves change, and the R-R intervals become markedly unequal. After studying the physiological mechanism of AF, we calculate the Rényi entropy of the wavelet coefficients of heart rate variability (HRV) in order to measure the complexity of paroxysmal atrial fibrillation (PAF) signals and to extract their multi-scale features. The data used in this study are obtained from the MIT-BIH PAF Prediction Challenge Database, and the accuracy in classifying PAF patients versus normal subjects is 92.48%. The results of this experiment prove that AF can be detected using this method, which in turn can provide support for clinical diagnosis.
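
    A hedged sketch of the core computation described above, Rényi entropy of the energy distribution of wavelet-decomposed RR intervals, follows. It assumes the PyWavelets package; the wavelet ('db4'), the decomposition level and the Rényi order q are illustrative choices, and the classification stage is omitted.

    ```python
    import numpy as np
    import pywt

    def renyi_wavelet_entropy(rr, q=2.0, wavelet="db4", level=4):
        """Rényi entropy of the relative energies of the wavelet coefficient bands."""
        coeffs = pywt.wavedec(np.asarray(rr, dtype=float), wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        p = energies / energies.sum()
        if q == 1.0:                      # Rényi entropy tends to Shannon as q -> 1
            p = p[p > 0]
            return -np.sum(p * np.log(p))
        return np.log(np.sum(p ** q)) / (1.0 - q)

    rng = np.random.default_rng(3)
    rr = 0.8 + 0.05 * rng.standard_normal(1024)   # synthetic RR-interval series (s)
    print(renyi_wavelet_entropy(rr, q=2.0))
    ```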

  9. Dynamical glucometry: Use of multiscale entropy analysis in diabetes

    NASA Astrophysics Data System (ADS)

    Costa, Madalena D.; Henriques, Teresa; Munshi, Medha N.; Segal, Alissa R.; Goldberger, Ary L.

    2014-09-01

    Diabetes mellitus (DM) is one of the world's most prevalent medical conditions. Contemporary management focuses on lowering mean blood glucose values toward a normal range, but largely ignores the dynamics of glucose fluctuations. We probed analyte time series obtained from continuous glucose monitor (CGM) sensors. We show that the fluctuations in CGM values sampled every 5 min are not uncorrelated noise. Next, using multiscale entropy analysis, we quantified the complexity of the temporal structure of the CGM time series from a group of elderly subjects with type 2 DM and age-matched controls. We further probed the structure of these CGM time series using detrended fluctuation analysis. Our findings indicate that the dynamics of glucose fluctuations from control subjects are more complex than those of subjects with type 2 DM over time scales ranging from about 5 min to 5 h. These findings support consideration of a new framework, dynamical glucometry, to guide mechanistic research and to help assess and compare therapeutic interventions, which should enhance complexity of glucose fluctuations and not just lower mean and variance of blood glucose levels.

  10. Efficiency and credit ratings: a permutation-information-theory analysis

    NASA Astrophysics Data System (ADS)

    Fernandez Bariviera, Aurelio; Zunino, Luciano; Belén Guercio, M.; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-08-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty-nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity-entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification.

  11. Design of an image encryption scheme based on a multiple chaotic map

    NASA Astrophysics Data System (ADS)

    Tong, Xiao-Jun

    2013-07-01

    In order to solve the problems that chaos degenerates under limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy, whose chaotic characteristics are proved by the Devaney definition. In order to produce a large key space, a Cat map named the block Cat map is also designed for the permutation process based on multiple-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by a different chaotic map. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis depending on key and plaintext are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with the AES, DES and Logistic encryption methods, we conclude that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and offers higher speed and higher security.

  12. Spatiotemporal Dependency of Age-Related Changes in Brain Signal Variability

    PubMed Central

    McIntosh, A. R.; Vakorin, V.; Kovacevic, N.; Wang, H.; Diaconescu, A.; Protzner, A. B.

    2014-01-01

    Recent theoretical and empirical work has focused on the variability of network dynamics in maturation. Such variability seems to reflect the spontaneous formation and dissolution of different functional networks. We sought to extend these observations into healthy aging. Two different data sets, one EEG (total n = 48, ages 18–72) and one magnetoencephalography (n = 31, ages 20–75) were analyzed for such spatiotemporal dependency using multiscale entropy (MSE) from regional brain sources. In both data sets, the changes in MSE were timescale dependent, with higher entropy at fine scales and lower at more coarse scales with greater age. The signals were parsed further into local entropy, related to information processed within a regional source, and distributed entropy (information shared between two sources, i.e., functional connectivity). Local entropy increased for most regions, whereas the dominant change in distributed entropy was age-related reductions across hemispheres. These data further the understanding of changes in brain signal variability across the lifespan, suggesting an inverted U-shaped curve, but with an important qualifier. Unlike earlier in maturation, where the changes are more widespread, changes in adulthood show strong spatiotemporal dependence. PMID:23395850

  13. Generation of skeletal mechanism by means of projected entropy participation indices

    NASA Astrophysics Data System (ADS)

    Paolucci, Samuel; Valorani, Mauro; Ciottoli, Pietro Paolo; Galassi, Riccardo Malpica

    2017-11-01

    When the dynamics of reactive systems develop very-slow and very-fast time scales separated by a range of active time scales, with gaps in the fast/active and slow/active time scales, then it is possible to achieve multi-scale adaptive model reduction along with the integration of the ODEs using the G-Scheme. The scheme assumes that the dynamics is decomposed into active, slow, fast, and invariant subspaces. We derive expressions that establish a direct link between time scales and entropy production by using estimates provided by the G-Scheme. To calculate the contribution to entropy production, we resort to a standard model of a constant pressure, adiabatic, batch reactor, where the mixture temperature of the reactants is initially set above the auto-ignition temperature. Numerical experiments show that the contribution to entropy production of the fast subspace is of the same magnitude as the error threshold chosen for the identification of the decomposition of the tangent space, and the contribution of the slow subspace is generally much smaller than that of the active subspace. The information on entropy production associated with reactions within each subspace is used to define an entropy participation index that is subsequently utilized for model reduction.

  14. Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map

    PubMed Central

    2014-01-01

    We present a novel image encryption algorithm using Chebyshev polynomial based on permutation and substitution and Duffing map based on substitution. Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity test, and speed test. The study demonstrates that the proposed image encryption algorithm offers a key space larger than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970

  15. Characterization of complexities in combustion instability in a lean premixed gas-turbine model combustor.

    PubMed

    Gotoda, Hiroshi; Amano, Masahito; Miyano, Takaya; Ikawa, Takuya; Maki, Koshiro; Tachibana, Shigeru

    2012-12-01

    We characterize complexities in combustion instability in a lean premixed gas-turbine model combustor by nonlinear time series analysis to evaluate permutation entropy, fractal dimensions, and short-term predictability. The dynamic behavior in combustion instability near lean blowout exhibits a self-affine structure and is ascribed to fractional Brownian motion. It undergoes chaos by the onset of combustion oscillations with slow amplitude modulation. Our results indicate that nonlinear time series analysis is capable of characterizing complexities in combustion instability close to lean blowout.

  16. Distribution entropy analysis of epileptic EEG signals.

    PubMed

    Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun

    2015-01-01

    It is an open-ended challenge to accurately detect the epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between the normal and epileptic EEG signals by advanced nonlinear entropy methods, such as the approximate entropy, sample entropy, fuzzy entropy, and permutation entropy, etc. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly-accessible Bonn database which consisted of normal, interictal, and ictal EEG signals was used in this study. Three different measurement protocols were set for better understanding the performance of DistEn, which are: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapped 5 second segments; and iii) calculate it by averaging the DistEn values for all the possible non-overlapped segments of 1 second length, respectively. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Besides, the results obtained under the third protocol, which only used very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the interictal class in comparison with the normal class, whereas both analyses using relatively long EEG signals failed in tracking this difference between them, which may be due to a nonstationarity effect on the entropy algorithm. The capability of discriminating between the normal and interictal EEG signals is of great clinical relevance since it may provide helpful tools for the detection of a seizure onset. Therefore, our study suggests that the DistEn analysis of EEG signals is very promising for clinical and even portable EEG monitoring.

  17. Forbidden patterns in financial time series

    NASA Astrophysics Data System (ADS)

    Zanin, Massimiliano

    2008-03-01

    The existence of forbidden patterns, i.e., certain missing sequences in a given time series, is a recently proposed instrument of potential application in the study of time series. Forbidden patterns are related to the permutation entropy, which has the basic properties of classic chaos indicators, such as the Lyapunov exponent or the Kolmogorov entropy, thus allowing one to separate deterministic (usually chaotic) from random series; however, it requires fewer values of the series to be calculated, and it is suitable for use with small datasets. In this paper, the appearance of forbidden patterns is studied in different economic indicators such as stock indices (Dow Jones Industrial Average and Nasdaq Composite), NYSE stocks (IBM and Boeing), and others (ten year Bond interest rate), to find evidence of deterministic behavior in their evolution. Moreover, the rate of appearance of the forbidden patterns is calculated, and some considerations about the underlying dynamics are suggested.
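
    As an illustration of the idea above, the sketch below counts the ordinal patterns of length m that never occur in a series (the forbidden patterns). The embedding dimension, series length and the logistic-map example are assumptions chosen for demonstration; deterministic maps typically leave many forbidden patterns, whereas sufficiently long white noise leaves none.

    ```python
    import numpy as np
    from itertools import permutations

    def forbidden_patterns(x, m=4, tau=1):
        """Return the set of ordinal patterns of length m absent from the series."""
        x = np.asarray(x, dtype=float)
        n = len(x) - (m - 1) * tau
        observed = {tuple(np.argsort(x[i:i + m * tau:tau])) for i in range(n)}
        return set(permutations(range(m))) - observed

    # Chaotic logistic map (many forbidden patterns) vs. white noise (none expected)
    x = np.empty(5000)
    x[0] = 0.4
    for i in range(1, len(x)):
        x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
    noise = np.random.default_rng(6).standard_normal(5000)
    print(len(forbidden_patterns(x)), len(forbidden_patterns(noise)))
    ```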

  18. Infrared small target detection based on multiscale center-surround contrast measure

    NASA Astrophysics Data System (ADS)

    Fu, Hao; Long, Yunli; Zhu, Ran; An, Wei

    2018-04-01

    Infrared (IR) small target detection plays a critical role in the Infrared Search And Track (IRST) system. Although it has been studied for years, difficulties remain in cluttered environments. Following the principle by which humans discriminate small targets in a natural scene, namely a signature of discontinuity between the object and its neighboring regions, we develop an efficient method for infrared small target detection called the multiscale center-surround contrast measure (MCSCM). First, an entropy-based window selection technique is used to determine the maximum neighboring window size. Then, we construct a novel multiscale center-surround contrast measure to calculate the saliency map. Compared with the original image, the MCSCM map has less background clutter and noise residue. Subsequently, a simple threshold is used to segment the target. Experimental results show that our method achieves better performance.

  19. Comparing vector-based and Bayesian memory models using large-scale datasets: User-generated hashtag and tag prediction on Twitter and Stack Overflow.

    PubMed

    Stanley, Clayton; Byrne, Michael D

    2016-12-01

    The growth of social media and user-created content on online sites provides unique opportunities to study models of human declarative memory. By framing the task of choosing a hashtag for a tweet and tagging a post on Stack Overflow as a declarative memory retrieval problem, 2 cognitively plausible declarative memory models were applied to millions of posts and tweets and evaluated on how accurately they predict a user's chosen tags. An ACT-R based Bayesian model and a random permutation vector-based model were tested on the large data sets. The results show that past user behavior of tag use is a strong predictor of future behavior. Furthermore, past behavior was successfully incorporated into the random permutation model that previously used only context. Also, ACT-R's attentional weight term was linked to an entropy-weighting natural language processing method used to attenuate high-frequency words (e.g., articles and prepositions). Word order was not found to be a strong predictor of tag use, and the random permutation model performed comparably to the Bayesian model without including word order. This shows that the strength of the random permutation model is not in the ability to represent word order, but rather in the way in which context information is successfully compressed. The results of the large-scale exploration show how the architecture of the 2 memory models can be modified to significantly improve accuracy, and may suggest task-independent general modifications that can help improve model fit to human data in a much wider range of domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Permutation entropy analysis of heart rate variability for the assessment of cardiovascular autonomic neuropathy in type 1 diabetes mellitus.

    PubMed

    Carricarte Naranjo, Claudia; Sanchez-Rodriguez, Lazaro M; Brown Martínez, Marta; Estévez Báez, Mario; Machado García, Andrés

    2017-07-01

    Heart rate variability (HRV) analysis is a relevant tool for the diagnosis of cardiovascular autonomic neuropathy (CAN). To our knowledge, no previous investigation of CAN has assessed the complexity of HRV from an ordinal perspective. Therefore, the aim of this work is to explore the potential of permutation entropy (PE) analysis of HRV complexity for the assessment of CAN. For this purpose, we performed a short-term PE analysis of HRV in healthy subjects and type 1 diabetes mellitus patients, including patients with CAN. Standard HRV indicators were also calculated in the control group. A discriminant analysis was used to select the combination of variables with the best discriminative power between the control and CAN patient groups, as well as for classifying cases. We found that, for some specific temporal scales, PE indicators were significantly lower in CAN patients than in controls. In such cases, some ordinal patterns had high probabilities of occurrence, while others were hardly found. We posit that this behavior occurs due to a decrease of HRV complexity in the diseased system. Discriminant functions based on PE measures or on the probabilities of occurrence of ordinal patterns provided average classification accuracies of 75% and 96%, respectively. Correlations of PE and HRV measures were shown to depend only on the temporal scale, regardless of pattern length. PE analysis at some specific temporal scales seems to provide additional information to that obtained with traditional HRV methods. We conclude that PE analysis of HRV is a promising method for the assessment of CAN. Copyright © 2017 Elsevier Ltd. All rights reserved.
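
    Since the analysis above evaluates permutation entropy at several temporal scales, a minimal sketch of that combination (coarse-graining followed by ordinal-pattern entropy) is given here. The embedding dimension, the scale range and the synthetic RR series are illustrative assumptions and do not reproduce the study's short-term protocol.

    ```python
    import numpy as np
    from math import factorial

    def permutation_entropy(x, m=3, tau=1):
        """Normalized ordinal-pattern entropy of a 1-D series."""
        n = len(x) - (m - 1) * tau
        pats = np.array([np.argsort(x[i:i + m * tau:tau]) for i in range(n)])
        _, counts = np.unique(pats, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p)) / np.log(factorial(m))

    def multiscale_permutation_entropy(x, max_scale=10, m=3):
        x = np.asarray(x, dtype=float)
        out = []
        for s in range(1, max_scale + 1):
            n = len(x) // s
            coarse = x[:n * s].reshape(n, s).mean(axis=1)   # coarse-grain at scale s
            out.append(permutation_entropy(coarse, m=m))
        return out

    rng = np.random.default_rng(8)
    rr = 0.85 + 0.04 * rng.standard_normal(3000)            # synthetic RR intervals (s)
    print(np.round(multiscale_permutation_entropy(rr, max_scale=5), 3))
    ```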

  1. Complexity of cardiovascular rhythms during head-up tilt test by entropy of patterns.

    PubMed

    Wejer, Dorota; Graff, Beata; Makowiec, Danuta; Budrejko, Szymon; Struzik, Zbigniew R

    2017-05-01

    The head-up tilt (HUT) test, which provokes transient dynamical alterations in the regulation of the cardiovascular system, provides insights into the complex organization of this system. Based on signals of heart period intervals (RR-intervals) and/or systolic blood pressure (SBP), differences in cardiovascular regulation between vasovagal patients (VVS) and a healthy control group (CG) are investigated. Short-term relations among signal data, represented symbolically by three-beat patterns, make it possible to qualify and quantify the complexity of cardiovascular regulation by Shannon entropy. Four types of patterns, permutation, ordinal, deterministic and dynamical, are used, and different resolutions of signal values are applied in the symbolization in order to verify how the entropy of patterns depends on the way in which signal values are preprocessed. At rest, in the physiologically important signal resolution ranges, and independently of the type of pattern used in the estimates, the complexity of SBP signals in VVS differs from the complexity found in CG. Entropy in VVS is higher than in CG, which could be interpreted as a substantial presence of noisy components in the SBP of VVS. After tilting, this relation reverses: entropy in CG becomes significantly higher than in VVS for SBP signals. In the case of RR-intervals and large resolutions, the complexity after the tilt is reduced compared with the complexity of RR-intervals at rest for both groups; however, in VVS patients this reduction is significantly stronger than in CG. Our observations of opposite switches in entropy between CG and VVS might support the hypothesis that the baroreflex in VVS affects the heart rate more strongly because of inefficient regulation (possibly impaired local vascular tone alterations) of the blood pressure.

  2. Multiscale entropy analysis of biological signals: a fundamental bi-scaling law

    PubMed Central

    Gao, Jianbo; Hu, Jing; Liu, Feiyan; Cao, Yinhe

    2015-01-01

    Since its introduction in the early 2000s, multiscale entropy (MSE) has found many applications in biosignal analysis, and has been extended to multivariate MSE. So far, however, no analytic results for MSE or multivariate MSE have been reported. This has severely limited our basic understanding of MSE. For example, it has not been studied whether MSE estimated using default parameter values and a short data set is meaningful or not. Nor is it known whether MSE has any relation with other complexity measures, such as the Hurst parameter, which characterizes the correlation structure of the data. To overcome this limitation, and more importantly, to guide more fruitful applications of MSE in various areas of life sciences, we derive a fundamental bi-scaling law for fractal time series, one for the scale in phase space, the other for the block size used for smoothing. We illustrate the usefulness of the approach by examining two types of physiological data. One is heart rate variability (HRV) data, for the purpose of distinguishing healthy subjects from patients with congestive heart failure, a life-threatening condition. The other is electroencephalogram (EEG) data, for the purpose of distinguishing epileptic seizure EEG from normal healthy EEG. PMID:26082711

  3. Neurophysiological basis of creativity in healthy elderly people: a multiscale entropy approach.

    PubMed

    Ueno, Kanji; Takahashi, Tetsuya; Takahashi, Koichi; Mizukami, Kimiko; Tanaka, Yuji; Wada, Yuji

    2015-03-01

    Creativity, which presumably involves various connections within and across different neural networks, reportedly underpins the mental well-being of older adults. Multiscale entropy (MSE) can characterize the complexity inherent in EEG dynamics with multiple temporal scales. It can therefore provide useful insight into neural networks. Given that background, we sought to clarify the neurophysiological bases of creativity in healthy elderly subjects by assessing EEG complexity with MSE, with emphasis on assessment of neural networks. We recorded resting state EEG of 20 healthy elderly subjects. MSE was calculated for each subject for continuous 20-s epochs. Their relevance to individual creativity was examined concurrently with intellectual function. Higher individual creativity was linked closely to increased EEG complexity across higher temporal scales, but no significant relation was found with intellectual function (IQ score). Considering the general "loss of complexity" theory of aging, our finding of increased EEG complexity in elderly people with heightened creativity supports the idea that creativity is associated with activated neural networks. Results reported here underscore the potential usefulness of MSE analysis for characterizing the neurophysiological bases of elderly people with heightened creativity. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  4. A Novel Bit-level Image Encryption Method Based on Chaotic Map and Dynamic Grouping

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Ji; Shen, Yan

    2012-10-01

    In this paper, a novel bit-level image encryption method based on dynamic grouping is proposed. In the proposed method, the plain-image is divided into several groups randomly, and then a bit-level permutation-diffusion process is carried out. The keystream generated by the logistic map is related to the plain-image, which confuses the relationship between the plain-image and the cipher-image. The computer simulation results of statistical analysis, information entropy analysis and sensitivity analysis show that the proposed encryption method is secure and reliable enough to be used for communication applications.

  5. Noise and poise: Enhancement of postural complexity in the elderly with a stochastic-resonance based therapy

    NASA Astrophysics Data System (ADS)

    Costa, M.; Priplata, A. A.; Lipsitz, L. A.; Wu, Z.; Huang, N. E.; Goldberger, A. L.; Peng, C.-K.

    2007-03-01

    Pathologic states are associated with a loss of dynamical complexity. Therefore, therapeutic interventions that increase physiologic complexity may enhance health status. Using multiscale entropy analysis, we show that the postural sway dynamics of healthy young and healthy elderly subjects are more complex than that of elderly subjects with a history of falls. Application of subsensory noise to the feet has been demonstrated to improve postural stability in the elderly. We next show that this therapy significantly increases the multiscale complexity of sway fluctuations in healthy elderly subjects. Quantification of changes in dynamical complexity of biologic variability may be the basis of a new approach to assessing risk and to predicting the efficacy of clinical interventions, including noise-based therapies.

  6. A multi-scale Q1/P0 approach to Lagrangian shock hydrodynamics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shashkov, Mikhail; Love, Edward; Scovazzi, Guglielmo

    A new multi-scale, stabilized method for Q1/P0 finite element computations of Lagrangian shock hydrodynamics is presented. Instabilities (of hourglass type) are controlled by a stabilizing operator derived using the variational multi-scale analysis paradigm. The resulting stabilizing term takes the form of a pressure correction. With respect to currently implemented hourglass control approaches, the novelty of the method resides in its residual-based character. The stabilizing residual has a definite physical meaning, since it embeds a discrete form of the Clausius-Duhem inequality. Effectively, the proposed stabilization samples and acts to counter the production of entropy due to numerical instabilities. The proposed technique is applicable to materials with no shear strength, for which there exists a caloric equation of state. The stabilization operator is incorporated into a mid-point, predictor/multi-corrector time integration algorithm, which conserves mass, momentum and total energy. Encouraging numerical results in the context of compressible gas dynamics confirm the potential of the method.

  7. Electromyographic Permutation Entropy Quantifies Diaphragmatic Denervation and Reinnervation

    PubMed Central

    Kretschmer, Alexander; Lehmeyer, Veronika; Kellermann, Kristine; Schaller, Stephan J.; Blobner, Manfred; Kochs, Eberhard F.; Fink, Heidrun

    2014-01-01

    Spontaneous reinnervation after diaphragmatic paralysis due to trauma, surgery, tumors and spinal cord injuries is frequently observed. A possible explanation could be collateral reinnervation, since the diaphragm is commonly double-innervated by the (accessory) phrenic nerve. Permutation entropy (PeEn), a complexity measure for time series, may reflect a functional state of neuromuscular transmission by quantifying the complexity of interactions across neural and muscular networks. In an established rat model, electromyographic signals of the diaphragm after phrenicotomy were analyzed using PeEn quantifying denervation and reinnervation. Thirty-three anesthetized rats were unilaterally phrenicotomized. After 1, 3, 9, 27 and 81 days, diaphragmatic electromyographic PeEn was analyzed in vivo from sternal, mid-costal and crural areas of both hemidiaphragms. After euthanasia of the animals, both hemidiaphragms were dissected for fiber type evaluation. The electromyographic incidence of an accessory phrenic nerve was 76%. At day 1 after phrenicotomy, PeEn (normalized values) was significantly diminished in the sternal (median: 0.69; interquartile range: 0.66–0.75) and mid-costal area (0.68; 0.66–0.72) compared to the non-denervated side (0.84; 0.78–0.90) at threshold p<0.05. In the crural area, innervated by the accessory phrenic nerve, PeEn remained unchanged (0.79; 0.72–0.86). During reinnervation over 81 days, PeEn normalized in the mid-costal area (0.84; 0.77–0.86), whereas it remained reduced in the sternal area (0.77; 0.70–0.81). Fiber type grouping, a histological sign for reinnervation, was found in the mid-costal area in 20% after 27 days and in 80% after 81 days. Collateral reinnervation can restore diaphragm activity after phrenicotomy. Electromyographic PeEn represents a new, distinctive assessment characterizing intramuscular function following denervation and reinnervation. PMID:25532023

  8. Spectral Dynamics of Resting State fMRI Within the Ventral Tegmental Area and Dorsal Raphe Nuclei in Medication-Free Major Depressive Disorder in Young Adults.

    PubMed

    Wohlschläger, Afra; Karne, Harish; Jordan, Denis; Lowe, Mark J; Jones, Stephen E; Anand, Amit

    2018-01-01

    Background: The dorsal raphe nucleus (DRN) and ventral tegmental area (VTA) are major brainstem monoamine nuclei consisting of serotonin and dopamine neurons, respectively. Animal studies show that firing patterns in both nuclei are altered when animals exhibit depression-like behaviors. Functional MRI studies in humans have shown reduced VTA activation and DRN connectivity in depression. This study aims, for the first time, at investigating the functional integrity of local neuronal firing concurrently in the VTA and DRN in vivo in humans using spectral analysis of resting-state low-frequency-fluctuation fMRI. Method: A total of 97 medication-free subjects, 67 young patients (ages 18-30) with major depressive disorder and 30 closely matched healthy controls, were included in the study to detect aberrant dynamics in the DRN and VTA. For the investigation of altered localized dynamics, we conducted power spectral analysis and, beyond this, spectral cross-correlation between the two groups. Complementary to this, the spectral dependence of permutation entropy, an information-theoretical measure, was compared between groups. Results: Patients displayed significant spectral slowing in the VTA vs. controls (p = 0.035, corrected). In the DRN, spectral slowing was less pronounced, but the amount of slowing correlated significantly with 17-item Hamilton Depression Rating scores of depression severity (p = 0.038). Signal complexity as assessed via permutation entropy showed spectral alterations in line with the results on spectral slowing. Conclusion: Our results indicate that altered functional dynamics of the VTA and DRN in depression can be detected from the regional fMRI signal. On this basis, the impact of antidepressant treatment and treatment response can be assessed using these markers in future studies.

  9. OmpF, a nucleotide-sensing nanoprobe, computational evaluation of single channel activities

    NASA Astrophysics Data System (ADS)

    Abdolvahab, R. H.; Mobasheri, H.; Nikouee, A.; Ejtehadi, M. R.

    2016-09-01

    The results of high-throughput practical single-channel experiments should be formulated and validated by signal analysis approaches to increase the recognition precision of translocating molecules. For this purpose, the activities of the single nano-pore forming protein, OmpF, in the presence of nucleotides were recorded in real time by the voltage clamp technique and used as a means for nucleotide recognition. The results were analyzed based on the permutation entropy of the current Time Series (TS), fractality, autocorrelation, structure function, spectral density, and peak fraction to recognize each nucleotide, based on its signature effect on the conductance, gating frequency and voltage sensitivity of the channel at different concentrations and membrane potentials. The amplitude and frequency of ion current fluctuation increased in the presence of Adenine more than Cytosine and Thymine at milli-molar (0.5 mM) concentrations. The variance of the current TS at various applied voltages showed a non-monotonic trend whose initial increasing slope in the presence of Thymine changed to a decreasing one in the second phase and was different from that of Adenine and Cytosine; e.g., by increasing the voltage from 40 to 140 mV at the 0.5 mM concentration of Adenine or Cytosine, the variance decreased by one third, while for the case of Thymine it was doubled. Moreover, according to the structure function of the TS, the fractality of the current TS differed as a function of varying membrane potentials (pd) and nucleotide concentrations. Accordingly, the calculated permutation entropy of the TS validated the biophysical approach defined for the recognition of different nucleotides at various concentrations, pd's and polarities. Thus, the promising outcomes of the combined experimental and theoretical methodologies presented here can be implemented as a complementary means in pore-based nucleotide recognition approaches.

  10. Complexity-Entropy Causality Plane as a Complexity Measure for Two-Dimensional Patterns

    PubMed Central

    Ribeiro, Haroldo V.; Zunino, Luciano; Lenzi, Ervin K.; Santoro, Perseu A.; Mendes, Renio S.

    2012-01-01

    Complexity measures are essential to understand complex systems and there are numerous definitions to analyze one-dimensional data. However, extensions of these approaches to two or higher-dimensional data, such as images, are much less common. Here, we reduce this gap by applying the ideas of the permutation entropy combined with a relative entropic index. We build up a numerical procedure that can be easily implemented to evaluate the complexity of two or higher-dimensional patterns. We work out this method in different scenarios where numerical experiments and empirical data were taken into account. Specifically, we have applied the method to fractal landscapes generated numerically where we compare our measures with the Hurst exponent; liquid crystal textures where nematic-isotropic-nematic phase transitions were properly identified; 12 characteristic textures of liquid crystals where the different values show that the method can distinguish different phases; and Ising surfaces where our method identified the critical temperature and also proved to be stable. PMID:22916097

  11. Analysis of spontaneous EEG activity in Alzheimer's disease using cross-sample entropy and graph theory.

    PubMed

    Gomez, Carlos; Poza, Jesus; Gomez-Pilar, Javier; Bachiller, Alejandro; Juan-Cruz, Celia; Tola-Arribas, Miguel A; Carreres, Alicia; Cano, Monica; Hornero, Roberto

    2016-08-01

    The aim of this pilot study was to analyze spontaneous electroencephalography (EEG) activity in Alzheimer's disease (AD) by means of Cross-Sample Entropy (Cross-SampEn) and two local measures derived from graph theory: clustering coefficient (CC) and characteristic path length (PL). Five minutes of EEG activity were recorded from 37 patients with dementia due to AD and 29 elderly controls. Our results showed that Cross-SampEn values were lower in the AD group than in the control one for all the interactions among EEG channels. This finding indicates that EEG activity in AD is characterized by a lower statistical dissimilarity among channels. Significant differences were found mainly for fronto-central interactions (p < 0.01, permutation test). Additionally, the application of graph theory measures revealed diverse neural network changes, i.e. lower CC and higher PL values in the AD group, leading to a less efficient brain organization. This study suggests the usefulness of our approach to provide further insights into the underlying brain dynamics associated with AD.
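
    A minimal sketch of cross-sample entropy (Cross-SampEn) between two channels, the dissimilarity measure used above, is shown below. Template vectors of one series are matched against those of the other, and -ln(A/B) is returned, where A and B are the match counts for template lengths m+1 and m. The parameters m = 2 and r = 0.2 times the pooled standard deviation are common defaults assumed here; the graph-theory measures are not included.

    ```python
    import numpy as np

    def cross_sample_entropy(u, v, m=2, r=None):
        """Cross-SampEn: lower values indicate more similar dynamics between u and v."""
        u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
        if r is None:
            r = 0.2 * np.std(np.concatenate([u, v]))

        def matches(mm):
            tu = np.array([u[i:i + mm] for i in range(len(u) - mm)])
            tv = np.array([v[j:j + mm] for j in range(len(v) - mm)])
            total = 0
            for i in range(len(tu)):
                d = np.max(np.abs(tv - tu[i]), axis=1)   # Chebyshev distance
                total += int(np.sum(d <= r))
            return total

        b, a = matches(m), matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    rng = np.random.default_rng(0)
    t = np.arange(1500) / 250.0
    ch1 = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(len(t))
    ch2 = np.sin(2 * np.pi * t + 0.3) + 0.1 * rng.standard_normal(len(t))  # shared rhythm
    ch3 = rng.standard_normal(len(t))                                      # unrelated channel
    print(cross_sample_entropy(ch1, ch2), cross_sample_entropy(ch1, ch3))
    ```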

  12. Gender-specific heart rate dynamics in severe intrauterine growth-restricted fetuses.

    PubMed

    Gonçalves, Hernâni; Bernardes, João; Ayres-de-Campos, Diogo

    2013-06-01

    Management of intrauterine growth restriction (IUGR) remains a major issue in perinatology. The objective of this paper was the assessment of gender-specific fetal heart rate (FHR) dynamics as a diagnostic tool in severe IUGR. FHR was analyzed in the antepartum period in 15 severe IUGR fetuses and 18 controls, matched for gestational age, in relation to fetal gender. Linear and entropy methods were applied, including mean FHR (mFHR), low (LF), high (HF) and movement frequency (MF) components, and approximate, sample and multiscale entropy. Sensitivities and specificities were estimated using Fisher linear discriminant analysis and the leave-one-out method. Overall, IUGR fetuses presented significantly lower mFHR and entropy compared with controls. However, gender-specific analysis showed that significantly lower mFHR was only evident in IUGR males and lower entropy in IUGR females. In addition, lower LF/(MF+HF) was evident in IUGR females compared with controls, but not in males. Rather high sensitivities and specificities were achieved in the detection of FHR recordings related to IUGR male fetuses when gender-specific analysis was performed at gestational ages of less than 34 weeks. Severe IUGR fetuses present gender-specific linear and entropy FHR changes compared with controls, characterized by significantly lower entropy and sympathetic-vagal balance in females than in males. These findings need to be considered in order to achieve better diagnostic results. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Multiscale entropy analysis of electroencephalography during sleep in patients with Parkinson disease.

    PubMed

    Chung, Chen-Chih; Kang, Jiunn-Horng; Yuan, Rey-Yue; Wu, Dean; Chen, Chih-Chung; Chi, Nai-Fang; Chen, Po-Chih; Hu, Chaur-Jong

    2013-07-01

    Sleep disorders are frequently seen in patients with Parkinson disease (PD), including rapid eye movement (REM) behavior disorder and periodic limb movement disorder. However, knowledge about changes in non-REM sleep in patients with PD is limited. This study explored the characteristics of electroencephalography (EEG) during sleep in patients with PD and non-PD controls. We further conducted multiscale entropy (MSE) analysis to evaluate and compare the complexity of sleep EEG for the 2 groups. There were 9 patients with PD (Hoehn-Yahr stage 1 or 2) and 11 non-PD controls. All participants underwent standard whole-night polysomnography (PSG), which included 23 channels, 6 of which were for EEG. The raw data of the EEG were extracted and subjected to MSE analysis. Patients with PD had a longer sleep onset time and a higher spontaneous EEG arousal index. Sleep stage-specific increased MSE was observed in patients with PD during non-REM sleep. The difference was more marked and significant at higher time scale factors (TSFs). In conclusion, increased biosignal complexity, as revealed by MSE analysis, was found in patients with PD during non-REM sleep at high TSFs. This finding might reflect a compensatory mechanism for early defects in neuronal network control machinery in PD.

  14. Multiscale Transient and Steady-State Study of the Influence of Microstructure Degradation and Chromium Oxide Poisoning on Solid Oxide Fuel Cell Cathode Performance

    NASA Astrophysics Data System (ADS)

    Li, Guanchen; von Spakovsky, Michael R.; Shen, Fengyu; Lu, Kathy

    2018-01-01

    Oxygen reduction in a solid oxide fuel cell cathode involves a nonequilibrium process of coupled mass and heat diffusion and electrochemical and chemical reactions. These phenomena occur at multiple temporal and spatial scales, making the modeling, especially in the transient regime, very difficult. Nonetheless, multiscale models are needed to improve the understanding of oxygen reduction and guide cathode design. Of particular importance for long-term operation are microstructure degradation and chromium oxide poisoning, both of which degrade cathode performance. Existing methods are phenomenological or empirical in nature, and their application is limited to the continuum realm, where quantum effects are not captured. In contrast, steepest-entropy-ascent quantum thermodynamics can be used to model nonequilibrium processes (even those far from equilibrium) at all scales. The nonequilibrium relaxation is characterized by entropy generation, which can unify coupled phenomena into one framework to model transient and steady behavior. The results reveal the effects on performance of the different timescales of the varied phenomena involved and their coupling. Results are included here for the effects of chromium oxide concentrations on cathode output, as is a parametric study of the effects of interconnect-three-phase-boundary length, oxygen mean free path, and adsorption site effectiveness. A qualitative comparison with experimental results is made.

  15. Age-related variation in EEG complexity to photic stimulation: A multiscale entropy analysis

    PubMed Central

    Takahashi, Tetsuya; Cho, Raymond Y.; Murata, Tetsuhito; Mizuno, Tomoyuki; Kikuchi, Mitsuru; Mizukami, Kimiko; Kosaka, Hirotaka; Takahashi, Koichi; Wada, Yuji

    2010-01-01

    Objective: This study was intended to examine variations in electroencephalographic (EEG) complexity in response to photic stimulation (PS) during aging to test the hypothesis that the aging process reduces physiologic complexity and functional responsiveness. Methods: Multiscale entropy (MSE), an estimate of time-series signal complexity associated with long-range temporal correlation, is used as a recently proposed method for quantifying EEG complexity with multiple coarse-grained sequences. We recorded EEG in 13 healthy elderly subjects and 12 healthy young subjects during pre-PS and post-PS conditions and estimated their respective MSE values. Results: For the pre-PS condition, no significant complexity difference was found between the groups. However, a significant MSE change (complexity increase) was found post-PS only in young subjects, thereby revealing a power-law scaling property, which indicates long-range temporal correlation. Conclusions: Enhancement of long-range temporal correlation in young subjects after PS might reflect a cortical response to stimuli, which was absent in elderly subjects. These results are consistent with the general “loss of complexity/diminished functional response to stimuli” theory of aging. Significance: Our findings demonstrate that application of MSE analysis to EEG is a powerful approach for studying age-related changes in brain function. PMID:19231279
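
    The coarse-graining-plus-sample-entropy procedure underlying MSE, used in this and several of the following records, can be summarized in a few lines. The sketch below is a generic illustration with commonly used defaults (m = 2, r = 0.15 times the SD of the original series); it is not the exact implementation used in the study above.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) of a 1-D series; r defaults to 0.15 * SD of the series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * np.std(x)
    def matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        total = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            total += np.sum(dist <= r) - 1  # exclude the self-match
        return total
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale` (Costa-style coarse-graining)."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], dtype=float).reshape(n, scale).mean(axis=1)

def multiscale_entropy(x, max_scale=20, m=2, r_factor=0.15):
    r = r_factor * np.std(x)  # tolerance fixed from the original series
    return [sample_entropy(coarse_grain(x, s), m, r) for s in range(1, max_scale + 1)]
```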

  16. Reduced Data Dualscale Entropy Analysis of HRV Signals for Improved Congestive Heart Failure Detection

    NASA Astrophysics Data System (ADS)

    Kuntamalla, Srinivas; Lekkala, Ram Gopal Reddy

    2014-10-01

    Heart rate variability (HRV) is an important dynamic variable of the cardiovascular system, which operates on multiple time scales. In this study, multiscale entropy (MSE) analysis is applied to HRV signals taken from Physiobank to discriminate Congestive Heart Failure (CHF) patients from healthy young and elderly subjects. The discrimination power of the MSE method decreases as the amount of data is reduced, and the lowest amount of data at which there is a clear discrimination between CHF and normal subjects is found to be 4000 samples. Further, this method failed to discriminate CHF from healthy elderly subjects. In view of this, the Reduced Data Dualscale Entropy Analysis method is proposed to reduce the data size required (as low as 500 samples) for clearly discriminating the CHF patients from young and elderly subjects with only two scales. Further, an easy-to-interpret index is derived using this new approach for the diagnosis of CHF. This index shows 100% accuracy and correlates well with the pathophysiology of heart failure.

  17. Differential Changes with Age in Multiscale Entropy of Electromyography Signals from Leg Muscles during Treadmill Walking

    PubMed Central

    Kang, Hyun Gu; Dingwell, Jonathan B.

    2016-01-01

    Age-related gait changes may be due to the loss of complexity in the neuromuscular system. This theory is disputed due to inconsistent results from single-scale analyses. Also, behavioral adaptations may confound these changes. We examined whether EMG dynamics during gait is less complex in older adults over a range of timescales using the multiscale entropy method, and whether slower walking attenuates this effect. Surface EMG was measured from the left vastus lateralis (VL), biceps femoris (BF), gastrocnemius (GA), and tibialis anterior (TA) in 17 young and 18 older adults as they walked on a treadmill for 5 minutes at 0.8x-1.2x of preferred speed. Sample entropy (SE) and the complexity index (CI) of the EMG signals were calculated after successive coarse-graining to extract dynamics at timescales of 27 to 270 Hz, with m = 2 and r = 0.15 SD. SE and CI were lower across the timescales in older adults in VL and BF, but higher in GA (all p<0.001); these results held for VL and GA even after accounting for longer EMG burst durations in older adults. CI was higher during slower walking speed in VL and BF (p<0.001). Results were mostly similar for m = 3 and r = 0.01–0.35. Smaller r was more sensitive to age-related differences. The decrease in complexity with aging in the timescales studied was limited to proximal muscles, particularly VL. The increase in GA may be driven by other factors. Walking slower may reflect a behavioral adaptation that allows the nervous system to function with greater complexity. PMID:27570974

  18. Fuzzy entropy thresholding and multi-scale morphological approach for microscopic image enhancement

    NASA Astrophysics Data System (ADS)

    Zhou, Jiancan; Li, Yuexiang; Shen, Linlin

    2017-07-01

    Microscopic images provide lots of useful information for modern diagnosis and biological research. However, due to the unstable lighting conditions during image capture, two main problems, i.e., high noise levels and low image contrast, occur in the generated cell images. In this paper, a simple but efficient enhancement framework is proposed to address these problems. The framework removes image noise using a hybrid method based on the wavelet transform and fuzzy entropy, and enhances the image contrast with an adaptive morphological approach. Experiments on a real cell dataset were conducted to assess the performance of the proposed framework. The experimental results demonstrate that the proposed enhancement framework increases the cell tracking accuracy to an average of 74.49%, outperforming the benchmark algorithm (46.18%).

  19. Monitoring the Depth of Anesthesia Using a New Adaptive Neurofuzzy System.

    PubMed

    Shalbaf, Ahmad; Saffar, Mohsen; Sleigh, Jamie W; Shalbaf, Reza

    2018-05-01

    Accurate and noninvasive monitoring of the depth of anesthesia (DoA) is highly desirable. Since anesthetic drugs act mainly on the central nervous system, the analysis of brain activity using the electroencephalogram (EEG) is very useful. This paper proposes a novel automated method for assessing the DoA using EEG. First, 11 features including spectral, fractal, and entropy measures are extracted from the EEG signal and then, by applying an algorithm based on an exhaustive search of all subsets of features, a combination of the best features (Beta-index, sample entropy, Shannon permutation entropy, and detrended fluctuation analysis) is selected. Accordingly, we feed these extracted features to a new neurofuzzy classification algorithm, the adaptive neurofuzzy inference system with linguistic hedges (ANFIS-LH). This structure can successfully model systems with nonlinear relationships between input and output, and also classify overlapped classes accurately. ANFIS-LH, which is based on modified classical fuzzy rules, reduces the effects of insignificant features in the input space, which cause overlapping, and modifies the output layer structure. The presented method classifies EEG data into awake, light, general, and deep states during anesthesia with sevoflurane in 17 patients. Its accuracy is 92% compared to a commercial monitoring system (response entropy index). Moreover, this method reaches a classification accuracy of 93% in categorizing the EEG signal into awake and general anesthesia states on another database of propofol and volatile anesthesia in 50 patients. To sum up, this method is potentially applicable to a new real-time monitoring system to help the anesthesiologist with continuous assessment of DoA quickly and accurately.

  20. Physiologic variability at the verge of systemic inflammation: multiscale entropy of heart rate variability is affected by very low doses of endotoxin.

    PubMed

    Herlitz, Georg N; Arlow, Renee L; Cheung, Nora H; Coyle, Susette M; Griffel, Benjamin; Macor, Marie A; Lowry, Stephen F; Calvano, Steve E; Gale, Stephen C

    2015-02-01

    Human injury or infection induces systemic inflammation with characteristic neuroendocrine responses. Fluctuations in autonomic function during inflammation are reflected by beat-to-beat variation in heart rate, termed heart rate variability (HRV). In the present study, we determine threshold doses of endotoxin needed to induce observable changes in markers of systemic inflammation, investigate whether metrics of HRV exhibit a differing threshold dose from other inflammatory markers, and investigate the size of data sets required for meaningful use of multiscale entropy (MSE) analysis of HRV. Healthy human volunteers (n = 25) were randomized to receive placebo (normal saline) or endotoxin/lipopolysaccharide (LPS): 0.1, 0.25, 0.5, 1.0, or 2.0 ng/kg administered intravenously. Vital signs were recorded every 30 min for 6 h and then at 9, 12, and 24 h after LPS. Blood samples were drawn at specific time points for cytokine measurements. Heart rate variability analysis was performed using electrocardiogram epochs of 5 min. Multiscale entropy for HRV was calculated for all dose groups to scale factor 40. The lowest significant threshold dose was noted in core temperature at 0.25 ng/kg. Endogenous tumor necrosis factor α and interleukin 6 were significantly responsive at the next dosage level (0.5 ng/kg) along with elevations in circulating leukocytes and heart rate. Responses were exaggerated at higher doses (1 and 2 ng/kg). Time domain and frequency domain HRV metrics similarly suggested a threshold dose, differing from placebo at 1.0 and 2.0 ng/kg, below which no clear pattern in response was evident. By applying repeated-measures analysis of variance across scale factors, a significant decrease in MSE was seen at 1.0 and 2.0 ng/kg by 2 h after exposure to LPS. Although not statistically significant below 1.0 ng/kg, MSE unexpectedly decreased across all groups in an orderly dose-response pattern not seen in the other outcomes. By using repeated-measures analysis of variance across scale factors, MSE can detect autonomic change after LPS challenge in a group of 25 subjects using electrocardiogram epochs of only 5 min and entropy analysis to scale factor of only 40, potentially facilitating MSE's wider use as a research tool or bedside monitor. Traditional markers of inflammation generally exhibit threshold dose behavior. In contrast, MSE's apparent continuous dose-response pattern, although not statistically verifiable in this study, suggests a potential subclinical harbinger of infectious or other insult. The possible derangement of autonomic complexity prior to or independent of the cytokine surge cannot be ruled out. Future investigation should focus on confirmation of overt inflammation following observed decreases in MSE in a clinical setting.

  1. An adaptive technique for multiscale approximate entropy (MAEbin) threshold (r) selection: application to heart rate variability (HRV) and systolic blood pressure variability (SBPV) under postural stress.

    PubMed

    Singh, Amritpal; Saini, Barjinder Singh; Singh, Dilbag

    2016-06-01

    Multiscale approximate entropy (MAE) is used to quantify the complexity of a time series as a function of time scale τ. Approximate entropy (ApEn) tolerance threshold selection 'r' is based on either: (1) arbitrary selection in the recommended range (0.1-0.25) times the standard deviation of the time series; (2) finding the maximum ApEn (ApEnmax), i.e., the point where self-matches start to prevail over other matches, and choosing the corresponding 'r' (rmax) as the threshold; or (3) computing rchon by empirically finding the relation between rmax, the SD1/SD2 ratio and N using curve fitting, where SD1 and SD2 are the short-term and long-term variability of a time series, respectively. None of these methods is a gold standard for the selection of 'r'. In our previous study [1], an adaptive procedure for the selection of 'r' was proposed for approximate entropy (ApEn). In this paper, this is extended to multiple time scales using MAEbin and multiscale cross-MAEbin (XMAEbin). We applied this to simulations, i.e. 50 realizations (n = 50) of random number series, fractional Brownian motion (fBm) and MIX (P) [1] series with a data length of N = 300, and to short-term recordings of HRV and SBPV performed under postural stress from supine to standing. MAEbin and XMAEbin analysis was performed on laboratory recorded data of 50 healthy young subjects experiencing postural stress from supine to upright. The study showed that (i) ApEnbin of HRV is higher than that of SBPV in the supine position but lower than SBPV in the upright position; (ii) ApEnbin of HRV decreases from supine, i.e. 1.7324 ± 0.112 (mean ± SD), to upright, 1.4916 ± 0.108, due to vagal inhibition; (iii) ApEnbin of SBPV increases from supine, i.e. 1.5535 ± 0.098, to upright, i.e. 1.6241 ± 0.101, due to sympathetic activation; (iv) individual and cross complexities of RRi and systolic blood pressure (SBP) series depend on the time scale under consideration; (v) XMAEbin calculated using ApEnmax is correlated with cross-MAE calculated using ApEn (0.1-0.26, in steps of 0.02) at each time scale in the supine and upright positions, and it is concluded that ApEn0.26 has the highest correlation at most scales; (vi) the choice of 'r' is critical in interpreting interactions between RRi and SBP and in ascertaining the true complexity of the individual RRi and SBP series.
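
    As a rough illustration of the 'rmax' idea mentioned above (choosing the tolerance that maximizes ApEn), the sketch below scans a grid of candidate tolerances expressed as fractions of the series SD. The ApEn implementation and the grid are generic assumptions; the MAEbin procedure itself is more involved.

```python
import numpy as np

def approximate_entropy(x, m=2, r=0.2):
    """ApEn(m, r) with r given in the same units as x."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    def phi(length):
        templates = np.array([x[i:i + length] for i in range(N - length + 1)])
        counts = []
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            counts.append(np.mean(dist <= r))  # includes the self-match, as in ApEn
        return np.mean(np.log(counts))
    return phi(m) - phi(m + 1)

def select_r_max(x, m=2, r_grid=None):
    """Scan candidate tolerances (fractions of SD) and return the ApEn-maximizing one."""
    if r_grid is None:
        r_grid = np.arange(0.05, 0.61, 0.02)
    sd = np.std(x)
    apens = [approximate_entropy(x, m, f * sd) for f in r_grid]
    best = int(np.argmax(apens))
    return r_grid[best], apens[best]
```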

  2. Heart rate complexity in sinoaortic-denervated mice.

    PubMed

    Silva, Luiz Eduardo V; Rodrigues, Fernanda Luciano; de Oliveira, Mauro; Salgado, Hélio Cesar; Fazan, Rubens

    2015-02-01

    What is the central question of this study? New measurements for cardiovascular complexity, such as detrended fluctuation analysis (DFA) and multiscale entropy (MSE), have been shown to predict cardiovascular outcomes. Given that cardiovascular diseases are accompanied by autonomic imbalance and decreased baroreflex sensitivity, the central question is: do baroreceptors contribute to cardiovascular complexity? What is the main finding and its importance? Sinoaortic denervation altered both DFA scaling exponents and MSE, indicating that both short- and long-term mechanisms of complexity are altered in sinoaortic denervated mice, resulting in a loss of physiological complexity. These results suggest that the baroreflex is a key element in the complex structures involved in heart rate variability regulation. Recently, heart rate (HR) oscillations have been recognized as complex behaviours derived from non-linear processes. Physiological complexity theory is based on the idea that healthy systems present high complexity, i.e. non-linear, fractal variability at multiple scales, with long-range correlations. The loss of complexity in heart rate variability (HRV) has been shown to predict adverse cardiovascular outcomes. Based on the idea that most cardiovascular diseases are accompanied by autonomic imbalance and a decrease in baroreflex sensitivity, we hypothesize that the baroreflex plays an important role in complex cardiovascular behaviour. Mice that had been subjected to sinoaortic denervation (SAD) were implanted with catheters in the femoral artery and jugular vein 5 days prior to the experiment. After recording the baseline arterial pressure (AP), pulse interval time series were generated from the intervals between consecutive values of diastolic pressure. The complexity of the HRV was determined using detrended fluctuation analysis and multiscale entropy. The detrended fluctuation analysis α1 scaling exponent (a short-term index) was remarkably decreased in the SAD mice (0.79 ± 0.06 versus 1.13 ± 0.04 for the control mice), whereas SAD slightly increased the α2 scaling exponent (a long-term index; 1.12 ± 0.03 versus 1.04 ± 0.02 for control mice). In the SAD mice, the total multiscale entropy was decreased (13.2 ± 1.3) compared with the control mice (18.9 ± 1.4). In conclusion, fractal and regularity structures of HRV are altered in SAD mice, affecting both short- and long-term mechanisms of complexity, suggesting that the baroreceptors play a considerable role in the complex structure of HRV. © 2014 The Authors. Experimental Physiology © 2014 The Physiological Society.

  3. Complexity of heart rate fluctuations in near-term sheep and human fetuses during sleep.

    PubMed

    Frank, Birgit; Frasch, Martin G; Schneider, Uwe; Roedel, Marcus; Schwab, Matthias; Hoyer, Dirk

    2006-10-01

    We investigated how the complexity of fetal heart rate fluctuations (fHRF) is related to the sleep states in sheep and human fetuses. The complexity as a function of time scale for fetal heart rate data for 7 sheep and 27 human fetuses was estimated in rapid eye movement (REM) and non-REM sleep by means of permutation entropy and the associated Kullback-Leibler entropy. We found that in humans, fHRF complexity is higher in non-REM than REM sleep, whereas in sheep this relationship is reversed. To show this relation, choice of the appropriate time scale is crucial. In sheep fetuses, we found differences in the complexity of fHRF between REM and non-REM sleep only for larger time scales (above 2.5 s), whereas in human fetuses the complexity was clearly different between REM and non-REM sleep over the whole range of time scales. This may be due to inherent time scales of complexity, which reflect species-specific functions of the autonomic nervous system. Such differences have to be considered when animal data are translated to the human situation.
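
    Multiscale permutation entropy of the kind used in this record (coarse-graining followed by ordinal-pattern entropy) can be sketched as follows; the embedding dimension of 3 and the maximum scale are illustrative assumptions rather than the study's settings.

```python
import numpy as np
from math import factorial
from collections import Counter

def permutation_entropy(x, order=3, normalize=True):
    """Shannon entropy of the ordinal patterns of embedding dimension `order`."""
    patterns = Counter(
        tuple(np.argsort(x[i:i + order], kind="stable"))
        for i in range(len(x) - order + 1)
    )
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    H = -np.sum(p * np.log(p))
    return H / np.log(factorial(order)) if normalize else H

def multiscale_permutation_entropy(x, max_scale=20, order=3):
    """Coarse-grain the series at each scale, then compute its permutation entropy."""
    x = np.asarray(x, dtype=float)
    values = []
    for scale in range(1, max_scale + 1):
        n = len(x) // scale
        coarse = x[:n * scale].reshape(n, scale).mean(axis=1)
        values.append(permutation_entropy(coarse, order))
    return values
```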

  4. Retinex enhancement of infrared images.

    PubMed

    Li, Ying; He, Renjie; Xu, Guizhi; Hou, Changzhi; Sun, Yunyan; Guo, Lei; Rao, Liyun; Yan, Weili

    2008-01-01

    With the ability to image the temperature distribution of the body, infrared imaging is promising for the diagnosis and prognosis of diseases. However, the poor quality of raw infrared images has prevented wider application, and one of the essential problems is the low-contrast appearance of the imaged object. In this paper, image enhancement based on the Retinex theory is studied, a process that automatically restores visual realism to images. The algorithms, including the Frankle-McCann algorithm, the McCann99 algorithm, the single-scale Retinex algorithm, the multi-scale Retinex algorithm and the multi-scale Retinex algorithm with color restoration (MSRCR), are applied to the enhancement of infrared images. Entropy measurements along with visual inspection were compared, and the results show that the Retinex-based algorithms can enhance infrared images. Of the algorithms compared, MSRCR demonstrated the best performance.
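
    For orientation, the single- and multi-scale Retinex algorithms compared in this record follow the same basic recipe: subtract the log of a Gaussian-blurred illumination estimate from the log of the image. The sketch below assumes a grayscale image and illustrative surround scales, and omits the color-restoration step of MSRCR.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(image, sigma):
    """log(image) minus log of a Gaussian-blurred illumination estimate."""
    img = image.astype(float) + 1.0           # offset avoids log(0)
    surround = gaussian_filter(img, sigma)    # smooth surround ~ illumination
    return np.log(img) - np.log(surround)

def multi_scale_retinex(image, sigmas=(15, 80, 250)):
    """Equal-weight average of single-scale Retinex outputs at several surround scales."""
    return np.mean([single_scale_retinex(image, s) for s in sigmas], axis=0)
```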

  5. The role of sympathetic and vagal cardiac control on complexity of heart rate dynamics.

    PubMed

    Silva, Luiz Eduardo Virgilio; Silva, Carlos Alberto Aguiar; Salgado, Helio Cesar; Fazan, Rubens

    2017-03-01

    Analysis of heart rate variability (HRV) by nonlinear approaches has been gaining interest due to their ability to extract additional information from heart rate (HR) dynamics that are not detectable by traditional approaches. Nevertheless, the physiological interpretation of nonlinear approaches remains unclear. Therefore, we propose long-term (60 min) protocols involving selective blockade of cardiac autonomic receptors to investigate the contribution of sympathetic and parasympathetic function upon nonlinear dynamics of HRV. Conscious male Wistar rats had their electrocardiogram (ECG) recorded under three distinct conditions: basal, selective (atenolol or atropine), or combined (atenolol plus atropine) pharmacological blockade of autonomic muscarinic or β1-adrenergic receptors. Time series of RR interval were assessed by multiscale entropy (MSE) and detrended fluctuation analysis (DFA). Entropy over short (1 to 5, MSE1-5) and long (6 to 30, MSE6-30) time scales was computed, as well as DFA scaling exponents at short (αshort, 5 ≤ n ≤ 15), mid (αmid, 30 ≤ n ≤ 200), and long (αlong, 200 ≤ n ≤ 1,700) window sizes. The results show that MSE1-5 is reduced under atropine blockade and MSE6-30 is reduced under atropine, atenolol, or combined blockade. In addition, while atropine expressed its maximal effect at scale six, the effect of atenolol on MSE increased with scale. For DFA, αshort decreased during atenolol blockade, while the αmid increased under atropine blockade. Double blockade decreased αshort and increased αlong. Results with surrogate data show that the dynamics during combined blockade is not random. In summary, sympathetic and vagal control differently affect entropy (MSE) and fractal properties (DFA) of HRV. These findings are important to guide future studies. NEW & NOTEWORTHY Although multiscale entropy (MSE) and detrended fluctuation analysis (DFA) are recognizably useful prognostic/diagnostic methods, their physiological interpretation remains unclear. The present study clarifies the effect of the cardiac autonomic control on MSE and DFA, assessed during long periods (1 h). These findings are important to help the interpretation of future studies. Copyright © 2017 the American Physiological Society.
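
    The DFA scaling exponents quoted above (αshort, αmid, αlong) differ only in the range of window sizes over which the fluctuation function is fitted. A minimal sketch, assuming simple linear detrending and non-overlapping windows:

```python
import numpy as np

def dfa_exponent(x, window_sizes):
    """Slope of log F(n) versus log n, with linear detrending in each window."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated series
    fluctuations = []
    for n in window_sizes:
        rms = []
        for start in range(0, (len(profile) // n) * n, n):
            segment = profile[start:start + n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, segment, 1), t)   # local linear trend
            rms.append(np.sqrt(np.mean((segment - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return slope

# Illustrative ranges only: alpha_short over windows of 5-15 beats,
# alpha_long over windows of 200-1,700 beats.
# alpha_short = dfa_exponent(rr_intervals, range(5, 16))
```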

  6. Measuring center of pressure signals to quantify human balance using multivariate multiscale entropy by designing a force platform.

    PubMed

    Huang, Cheng-Wei; Sue, Pei-Der; Abbod, Maysam F; Jiang, Bernard C; Shieh, Jiann-Shing

    2013-08-08

    To assess the improvement of human body balance, a low cost and portable measuring device of center of pressure (COP), known as center of pressure and complexity monitoring system (CPCMS), has been developed for data logging and analysis. In order to prove that the system can estimate the different magnitude of different sways in comparison with the commercial Advanced Mechanical Technology Incorporation (AMTI) system, four sway tests have been developed (i.e., eyes open, eyes closed, eyes open with water pad, and eyes closed with water pad) to produce different sway displacements. Firstly, static and dynamic tests were conducted to investigate the feasibility of the system. Then, correlation tests of the CPCMS and AMTI systems have been compared with four sway tests. The results are within the acceptable range. Furthermore, multivariate empirical mode decomposition (MEMD) and enhanced multivariate multiscale entropy (MMSE) analysis methods have been used to analyze COP data reported by the CPCMS and compare it with the AMTI system. The improvements of the CPCMS are 35% to 70% (open eyes test) and 60% to 70% (eyes closed test) with and without water pad. The AMTI system has shown an improvement of 40% to 80% (open eyes test) and 65% to 75% (closed eyes test). The results indicate that the CPCMS system can achieve similar results to the commercial product so it can determine the balance.

  7. A Novel Application of Multiscale Entropy in Electroencephalography to Predict the Efficacy of Acetylcholinesterase Inhibitor in Alzheimer's Disease

    PubMed Central

    Tsai, Ping-Huang; Liu, Fang-Chun; Tsao, Jenho; Wang, Yung-Hung; Lo, Men-Tzung

    2015-01-01

    Alzheimer's disease (AD) is the most common form of dementia. According to one hypothesis, AD is caused by the reduced synthesis of the neurotransmitter acetylcholine. Therefore, acetylcholinesterase (AChE) inhibitors are considered to be an effective therapy. For clinicians, however, AChE inhibitors are not a predictable treatment for individual patients. We aimed to disclose the difference by biosignal processing. In this study, we used multiscale entropy (MSE) analysis, which can disclose the information embedded in different time scales, in electroencephalography (EEG), in an attempt to predict the efficacy of AChE inhibitors. Seventeen newly diagnosed AD patients were enrolled, with an initial Mini-Mental State Examination (MMSE) score of 18.8 ± 4.5. After 12 months of AChE inhibitor therapy, 7 patients were responsive and 10 patients were nonresponsive. The major difference between these two groups is Slope 2 (the slope of the MSE curve over scales 6 to 20). The area under the receiver operating characteristic (ROC) curve for Slope 2 is 0.871 (95% CI = 0.69–1). The sensitivity is 85.7% and the specificity is 60%, with a cut-off value for Slope 2 of −0.024. Therefore, MSE analysis of EEG signals, especially Slope 2, provides a potential tool for predicting the efficacy of AChE inhibitors prior to therapy. PMID:26120358
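
    The 'Slope 2' feature used in this record is, in essence, a straight-line fit to the MSE curve over scales 6 to 20. A minimal sketch, assuming the per-scale entropies have already been computed elsewhere:

```python
import numpy as np

def mse_slope(mse_values, first_scale=6, last_scale=20):
    """Linear-fit slope of an MSE curve over a given scale range (scales are 1-indexed)."""
    scales = np.arange(first_scale, last_scale + 1)
    values = np.asarray(mse_values, dtype=float)[first_scale - 1:last_scale]
    slope, _intercept = np.polyfit(scales, values, 1)
    return slope
```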

  8. Measuring Center of Pressure Signals to Quantify Human Balance Using Multivariate Multiscale Entropy by Designing a Force Platform

    PubMed Central

    Huang, Cheng-Wei; Sue, Pei-Der; Abbod, Maysam F.; Jiang, Bernard C.; Shieh, Jiann-Shing

    2013-01-01

    To assess the improvement of human body balance, a low cost and portable measuring device of center of pressure (COP), known as center of pressure and complexity monitoring system (CPCMS), has been developed for data logging and analysis. In order to prove that the system can estimate the different magnitude of different sways in comparison with the commercial Advanced Mechanical Technology Incorporation (AMTI) system, four sway tests have been developed (i.e., eyes open, eyes closed, eyes open with water pad, and eyes closed with water pad) to produce different sway displacements. Firstly, static and dynamic tests were conducted to investigate the feasibility of the system. Then, correlation tests of the CPCMS and AMTI systems have been compared with four sway tests. The results are within the acceptable range. Furthermore, multivariate empirical mode decomposition (MEMD) and enhanced multivariate multiscale entropy (MMSE) analysis methods have been used to analyze COP data reported by the CPCMS and compare it with the AMTI system. The improvements of the CPCMS are 35% to 70% (open eyes test) and 60% to 70% (eyes closed test) with and without water pad. The AMTI system has shown an improvement of 40% to 80% (open eyes test) and 65% to 75% (closed eyes test). The results indicate that the CPCMS system can achieve similar results to the commercial product so it can determine the balance. PMID:23966184

  9. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.

  10. Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition

    NASA Astrophysics Data System (ADS)

    Kim, Jonghwa; André, Elisabeth

    This paper investigates the potential of physiological signals as a reliable channel for automatic recognition of a user's emotional state. For emotion recognition, little attention has been paid so far to physiological signals compared to audio-visual emotion channels such as facial expression or speech. All essential stages of an automatic recognition system using biosignals are discussed, from recording a physiological dataset to feature-based multiclass classification. Four-channel biosensors are used to measure the electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to identify the best emotion-relevant features and to correlate them with emotional states. The best extracted features are specified in detail and their effectiveness is proven by emotion recognition results.

  11. An EEMD-ICA Approach to Enhancing Artifact Rejection for Noisy Multivariate Neural Data.

    PubMed

    Zeng, Ke; Chen, Dan; Ouyang, Gaoxiang; Wang, Lizhe; Liu, Xianzeng; Li, Xiaoli

    2016-06-01

    As neural data are generally noisy, artifact rejection is crucial for data preprocessing. It has long been a grand research challenge to find an approach that is able to 1) remove the artifacts and 2) avoid loss or disruption of the structural information at the same time, so that the risk of introducing bias into data interpretation is minimized. In this study, an approach (namely EEMD-ICA) was proposed to first decompose multivariate neural data that are possibly noisy into intrinsic mode functions (IMFs) using ensemble empirical mode decomposition (EEMD). Independent component analysis (ICA) was then applied to the IMFs to separate the artifactual components. The approach was tested against the classical ICA and the automatic wavelet ICA (AWICA) methods, which are dominant methods for artifact rejection. In order to evaluate the effectiveness of the proposed approach in handling neural data with potentially intensive noise, experiments on artifact removal were performed using semi-simulated data mixed with a variety of noises. Experimental results indicate that the proposed approach consistently outperforms the counterparts in terms of both normalized mean square error (NMSE) and Structural SIMilarity (SSIM). The superiority becomes even greater with the decrease of SNR in all cases, e.g., the SSIM of EEMD-ICA can almost double that of AWICA and triple that of ICA. To further examine the potential of the approach in sophisticated applications, the approach and its counterparts were used to preprocess a real-life epileptic EEG with absence seizures. Experiments were carried out with the focus on characterizing the dynamics of the data after artifact rejection, i.e., distinguishing seizure-free, pre-seizure and seizure states. Using multi-scale permutation entropy to extract features and linear discriminant analysis for classification, EEMD-ICA performed best in classifying the states (87.4%, about 4.1% and 8.7% higher than AWICA and ICA respectively), which was closest to the results of the manually selected dataset (89.7%).

  12. Logarithmic profile mapping multi-scale Retinex for restoration of low illumination images

    NASA Astrophysics Data System (ADS)

    Shi, Haiyan; Kwok, Ngaiming; Wu, Hongkun; Li, Ruowei; Liu, Shilong; Lin, Ching-Feng; Wong, Chin Yeow

    2018-04-01

    Images are valuable information sources for many scientific and engineering applications. However, images captured in poor illumination conditions have a large portion of dark regions that can heavily degrade image quality. In order to improve the quality of such images, a restoration algorithm is developed here that transforms the low input brightness to a higher value using a modified Multi-Scale Retinex approach. The algorithm is further improved by an entropy-based weighting of the input and the processed results to refine the necessary amplification in regions of low brightness. Moreover, fine details in the image are preserved by applying the Retinex principles to extract and then re-insert object edges to obtain an enhanced image. Results from experiments using low and normal illumination images have shown satisfactory performance with regard to the improvement in information content and the mitigation of viewing artifacts.

  13. Multiscale Information Transfer in Functional Corticomuscular Coupling Estimation Following Stroke: A Pilot Study

    PubMed Central

    Chen, Xiaoling; Xie, Ping; Zhang, Yuanyuan; Chen, Yuling; Yang, Fangmei; Zhang, Litai; Li, Xiaoli

    2018-01-01

    Recently, functional corticomuscular coupling (FCMC) between the cortex and the contralateral muscle has been used to evaluate motor function after stroke. The motor-control system is a closed-loop system regulated by complex self-regulating and interactive mechanisms that operate at multiple spatial and temporal scales. Multiscale analysis can represent this inherent complexity. However, previous FCMC studies in stroke patients mainly focused on coupling strength at a single time scale, without considering the changes of the inherently directional and multiscale properties of sensorimotor systems. In this paper, a multiscale causal model, named multiscale transfer entropy, was used to quantify the functional connection between the electroencephalogram over the scalp and the electromyogram from the flexor digitorum superficialis (FDS), recorded simultaneously during a steady-state grip task in eight stroke patients and eight healthy controls. Our results showed that healthy controls exhibited higher coupling when the scale reached up to about 12, and the FCMC in the descending direction was stronger at certain scales (1, 7, 12, and 14) than that in the ascending direction. Further analysis showed that these multi-time-scale characteristics were mainly concentrated in the beta1 band at scale 11 and the beta2 band at scales 9, 11, 13, and 15. Compared to controls, the multiscale properties of the FCMC in stroke were changed: the strengths in both directions were reduced, and the gaps between the descending and ascending directions disappeared over all scales. Further analysis in specific bands showed that the reduced FCMC was mainly concentrated in alpha2 at higher scales, and in beta1 and beta2 across almost all scales. This multiscale study confirms that the FCMC between the brain and muscles exhibits complex and directional characteristics, and that these characteristics of the functional connection in stroke are disrupted by the structural lesion in the brain, which might impair coordination, feedback, and information transmission in efferent control and afferent feedback. The study demonstrates for the first time the multiscale and directional characteristics of the FCMC in stroke patients, and provides a preliminary observation for application in clinical assessment following stroke. PMID:29765351
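
    For context, transfer entropy quantifies how much the past of one signal improves prediction of another beyond the target's own past. The sketch below is a heavily simplified, discretized estimator with a history of one sample; the quantile binning and single-lag history are assumptions, and the study's multiscale EEG-EMG pipeline is considerably more elaborate.

```python
import numpy as np
from collections import Counter

def discretize(x, n_bins=4):
    """Map a continuous series onto quantile bins 0..n_bins-1."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def transfer_entropy(source, target, n_bins=4):
    """TE(source -> target) in bits, with a one-sample history for each series."""
    xs = discretize(np.asarray(source, dtype=float), n_bins)
    ys = discretize(np.asarray(target, dtype=float), n_bins)
    joint3 = Counter(zip(ys[1:], ys[:-1], xs[:-1]))   # (y_{t+1}, y_t, x_t)
    joint2 = Counter(zip(ys[:-1], xs[:-1]))           # (y_t, x_t)
    pair_y = Counter(zip(ys[1:], ys[:-1]))            # (y_{t+1}, y_t)
    marg_y = Counter(ys[:-1])                         # y_t
    n = len(ys) - 1
    te = 0.0
    for (y1, y0, x0), c in joint3.items():
        p_joint = c / n
        p_cond_full = c / joint2[(y0, x0)]            # p(y_{t+1} | y_t, x_t)
        p_cond_self = pair_y[(y1, y0)] / marg_y[y0]   # p(y_{t+1} | y_t)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# A multiscale variant would coarse-grain both series at each scale before calling this.
```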

  14. Possible signatures of dissipation from time-series analysis techniques using a turbulent laboratory magnetohydrodynamic plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaffner, D. A.; Brown, M. R.; Rock, A. B.

    The frequency spectrum of magnetic fluctuations as measured on the Swarthmore Spheromak Experiment is broadband and exhibits a nearly Kolmogorov 5/3 scaling. It features a steepening region which is indicative of dissipation of magnetic fluctuation energy similar to that observed in fluid and magnetohydrodynamic turbulence systems. Two non-spectrum based time-series analysis techniques are implemented on this data set in order to seek other possible signatures of turbulent dissipation beyond just the steepening of fluctuation spectra. Presented here are results for the flatness, permutation entropy, and statistical complexity, each of which exhibits a particular character at spectral steepening scales which can then be compared to the behavior of the frequency spectrum.

  15. An entropy-based nonparametric test for the validation of surrogate endpoints.

    PubMed

    Miao, Xiaopeng; Wang, Yong-Cheng; Gangopadhyay, Ashis

    2012-06-30

    We present a nonparametric test to validate surrogate endpoints based on measure of divergence and random permutation. This test is a proposal to directly verify the Prentice statistical definition of surrogacy. The test does not impose distributional assumptions on the endpoints, and it is robust to model misspecification. Our simulation study shows that the proposed nonparametric test outperforms the practical test of the Prentice criterion in terms of both robustness of size and power. We also evaluate the performance of three leading methods that attempt to quantify the effect of surrogate endpoints. The proposed method is applied to validate magnetic resonance imaging lesions as the surrogate endpoint for clinical relapses in a multiple sclerosis trial. Copyright © 2012 John Wiley & Sons, Ltd.
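
    The random-permutation machinery behind such a test is generic. The sketch below illustrates it with a simple difference-in-means statistic; the divergence-based statistic actually proposed in the record above would replace that statistic.

```python
import numpy as np

def permutation_test(a, b, n_permutations=10000, seed=0):
    """Two-sided permutation p-value for a difference in group means."""
    rng = np.random.default_rng(seed)
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    exceed = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        stat = abs(pooled[:len(a)].mean() - pooled[len(a):].mean())
        if stat >= observed:
            exceed += 1
    return (exceed + 1) / (n_permutations + 1)   # add-one correction avoids p = 0
```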

  16. High order entropy conservative central schemes for wide ranges of compressible gas dynamics and MHD flows

    NASA Astrophysics Data System (ADS)

    Sjögreen, Björn; Yee, H. C.

    2018-07-01

    The Sjogreen and Yee [31] high order entropy conservative numerical method for compressible gas dynamics is extended to include discontinuities and also extended to the equations of ideal magnetohydrodynamics (MHD). The basic idea is based on Tadmor's [40] original work for inviscid perfect gas flows. For the MHD, four formulations are considered: (a) the conservative MHD, (b) the Godunov [14] non-conservative form, (c) the Janhunen [19] MHD with magnetic field source terms, and (d) an MHD with source terms by Brackbill and Barnes [5]. Three forms of the high order entropy numerical fluxes for the MHD in the finite difference framework are constructed. They are based on the extension of the low order form of Chandrashekar and Klingenberg [9], and on two forms with modifications of the Winters and Gassner [49] numerical fluxes. For flows containing discontinuities and multiscale turbulence fluctuations, the high order entropy conservative numerical fluxes are developed as the new base scheme within the Yee and Sjogreen [31] and Kotov et al. [21,22] high order nonlinear filter approach. The added nonlinear filter step on the high order centered entropy conservative spatial base scheme is only utilized in isolated computational regions, while maintaining high accuracy almost everywhere for long time integration of unsteady flows and DNS and LES of turbulence computations. Representative test cases for both smooth flows and problems containing discontinuities for gas dynamics and the ideal MHD are included. The results illustrate the improved stability obtained by using the high order entropy conservative numerical flux as the base scheme instead of the pure high order central scheme.

  17. Kinetics and thermodynamics of living copolymerization processes

    PubMed Central

    2016-01-01

    Theoretical advances are reported on the kinetics and thermodynamics of free and template-directed living copolymerizations. Until recently, the kinetic theory of these processes had only been established in the fully irreversible regime, in which only the attachment rates are considered. However, the entropy production is infinite in this regime and the approach to thermodynamic equilibrium cannot be investigated. For this purpose, the detachment rates should also be included. In spite of this complication, the kinetics can be exactly solved in the regimes of steady growth and depolymerization. In this way, analytical expressions are obtained for the mean growth velocity, the statistical properties of the copolymer sequences, as well as the thermodynamic entropy production. The results apply to DNA replication, transcription and translation, allowing us to understand important aspects of molecular evolution. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698043

  18. Kinetics and thermodynamics of living copolymerization processes.

    PubMed

    Gaspard, Pierre

    2016-11-13

    Theoretical advances are reported on the kinetics and thermodynamics of free and template-directed living copolymerizations. Until recently, the kinetic theory of these processes had only been established in the fully irreversible regime, in which only the attachment rates are considered. However, the entropy production is infinite in this regime and the approach to thermodynamic equilibrium cannot be investigated. For this purpose, the detachment rates should also be included. In spite of this complication, the kinetics can be exactly solved in the regimes of steady growth and depolymerization. In this way, analytical expressions are obtained for the mean growth velocity, the statistical properties of the copolymer sequences, as well as the thermodynamic entropy production. The results apply to DNA replication, transcription and translation, allowing us to understand important aspects of molecular evolution. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).

  19. Kinetics and thermodynamics of living copolymerization processes

    NASA Astrophysics Data System (ADS)

    Gaspard, Pierre

    2016-11-01

    Theoretical advances are reported on the kinetics and thermodynamics of free and template-directed living copolymerizations. Until recently, the kinetic theory of these processes had only been established in the fully irreversible regime, in which only the attachment rates are considered. However, the entropy production is infinite in this regime and the approach to thermodynamic equilibrium cannot be investigated. For this purpose, the detachment rates should also be included. In spite of this complication, the kinetics can be exactly solved in the regimes of steady growth and depolymerization. In this way, analytical expressions are obtained for the mean growth velocity, the statistical properties of the copolymer sequences, as well as the thermodynamic entropy production. The results apply to DNA replication, transcription and translation, allowing us to understand important aspects of molecular evolution. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  20. Gas-water two-phase flow characterization with Electrical Resistance Tomography and Multivariate Multiscale Entropy analysis.

    PubMed

    Tan, Chao; Zhao, Jia; Dong, Feng

    2015-03-01

    Flow behavior characterization is important to understand gas-liquid two-phase flow mechanics and to further establish its description model. An Electrical Resistance Tomography (ERT) system provides information regarding flow conditions in the different directions where the sensing electrodes are implemented. We extracted the multivariate sample entropy (MSampEn) by treating ERT data as a multivariate time series. The dynamic experimental results indicate that MSampEn is sensitive to complexity changes of flow patterns including bubbly flow, stratified flow, plug flow and slug flow. MSampEn can characterize the flow behavior in different directions of two-phase flow, and reveal the transition between flow patterns when the flow velocity changes. The proposed method is effective for analyzing two-phase flow pattern transitions by incorporating information from different scales and different spatial directions. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Linear and nonlinear features of fetal heart rate on the assessment of fetal development in the course of pregnancy and the impact of fetal gender.

    PubMed

    Spyridou, K; Chouvarda, I; Hadjileontiadis, L; Maglaveras, N

    2018-01-30

    This work aims to investigate the impact of gestational age and fetal gender on fetal heart rate (FHR) tracings. Different linear and nonlinear parameters indicating correlation or complexity were used to study the influence of fetal age and gender on FHR tracings. The signals were recorded from 99 normal pregnant women in a singleton pregnancy at gestational ages from 28 to 40 weeks, before the onset of labor. There were 56 female fetuses and 43 male. Analysis of FHR shows that the means as well as measures of irregularity of FHR, such as approximate entropy and algorithmic complexity, decrease as gestation progresses. There were also indications that mutual information and multiscale entropy were lower in male fetuses in early pregnancy. Fetal age and gender seem to influence FHR tracings. Taking this into consideration would improve the interpretation of FHR monitoring.

  2. Fusion of infrared and visible images based on BEMD and NSDFB

    NASA Astrophysics Data System (ADS)

    Zhu, Pan; Huang, Zhanhua; Lei, Hai

    2016-07-01

    This paper presents a new fusion method for visible-infrared images based on the adaptive multi-scale decomposition of bidimensional empirical mode decomposition (BEMD) and the flexible directional expansion of nonsubsampled directional filter banks (NSDFB). Compared with conventional multi-scale fusion methods, BEMD is non-parametric and completely data-driven, which makes it more suitable for decomposing and fusing non-linear signals. NSDFB can provide directional filtering on the decomposition levels to capture more of the geometrical structure of the source images effectively. In our fusion framework, the entropies of the two source images are first calculated, and the residue of the image whose entropy is larger is extracted to make it highly relevant to the other source image. Then, the residue and the other source image are decomposed into low-frequency sub-bands and a sequence of high-frequency directional sub-bands at different scales using BEMD and NSDFB. In this fusion scheme, two relevant fusion rules are used for the low-frequency sub-bands and the high-frequency directional sub-bands, respectively. Finally, the fused image is obtained by applying the corresponding inverse transform. Experimental results indicate that the proposed fusion algorithm achieves state-of-the-art performance for visible-infrared image fusion in both objective assessment and subjective visual quality, even for source images obtained under different conditions. Furthermore, the fused results have high contrast, remarkable target information and rich detail information that are better suited to human visual characteristics or machine perception.

  3. Full spinal cord regeneration after total transection is not possible due to entropy change.

    PubMed

    Zielinski, P; Sokal, P

    2016-09-01

    Transected spinal cord regeneration is a main challenge of regenerative medicine. The mainstream of research is focused on the promotion of spinal axon growth, which is strongly inhibited in mammals. Assuming that the inhibition of axonal growth can ever be overcome, the complexity of neural reconnections may be a second serious obstacle to overcome. Peripheral nerve axon regeneration seems to form a random pattern of target reconnections. The hypothesis is that, due to the laws of entropy or irreversible information loss, full spinal cord restoration after transection is not possible. The hypothesis is discussed based on several assumptions. Simplifying the discussion, the spinal cord is represented by 2 million pyramidal axons. After the transection, each of these axons has to grow and reconnect with exactly matching targets below the transection, in the same number. Axons are guided by neurotrophic factors and afterwards reconnected through neuroplasticity mechanisms. Assuming random reconnections, there are 2,000,000! possible permutations, and therefore the chance of an ideally random but correct reconnection of pyramidal axons with adequate targets is 1/2,000,000!. Apart from pyramidal axons, there are other axons, such as extrapyramidal, sensory and associative ones. Empirical data and analysis of neurotrophic factors and organogenesis mechanisms may seem to slightly contradict the hypothesis, but strictly adhering to the second law of thermodynamics and entropy laws, the full restoration of the transected cord may never be possible. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Decreased resting-state brain activity complexity in schizophrenia characterized by both increased regularity and randomness.

    PubMed

    Yang, Albert C; Hong, Chen-Jee; Liou, Yin-Jay; Huang, Kai-Lin; Huang, Chu-Chung; Liu, Mu-En; Lo, Men-Tzung; Huang, Norden E; Peng, Chung-Kang; Lin, Ching-Po; Tsai, Shih-Jen

    2015-06-01

    Schizophrenia is characterized by heterogeneous pathophysiology. Using multiscale entropy (MSE) analysis, which enables capturing complex dynamics of time series, we characterized MSE patterns of blood-oxygen-level-dependent (BOLD) signals across different time scales and determined whether BOLD activity in patients with schizophrenia exhibits increased complexity (increased entropy in all time scales), decreased complexity toward regularity (decreased entropy in all time scales), or decreased complexity toward uncorrelated randomness (high entropy in short time scales followed by decayed entropy as the time scale increases). We recruited 105 patients with schizophrenia with an age of onset between 18 and 35 years and 210 age- and sex-matched healthy volunteers. Results showed that MSE of BOLD signals in patients with schizophrenia exhibited two routes of decreased BOLD complexity toward either regular or random patterns. Reduced BOLD complexity toward regular patterns was observed in the cerebellum and temporal, middle, and superior frontal regions, and reduced BOLD complexity toward randomness was observed extensively in the inferior frontal, occipital, and postcentral cortices as well as in the insula and middle cingulum. Furthermore, we determined that the two types of complexity change were associated differently with psychopathology; specifically, the regular type of BOLD complexity change was associated with positive symptoms of schizophrenia, whereas the randomness type of BOLD complexity was associated with negative symptoms of the illness. These results collectively suggested that resting-state dynamics in schizophrenia exhibit two routes of pathologic change toward regular or random patterns, which contribute to the differences in syndrome domains of psychosis in patients with schizophrenia. © 2015 Wiley Periodicals, Inc.

  5. Using Renyi entropy to detect early cardiac autonomic neuropathy.

    PubMed

    Cornforth, David J; Tarvainen, Mika P; Jelinek, Herbert F

    2013-01-01

    Cardiac Autonomic Neuropathy (CAN) is a disease that involves nerve damage leading to abnormal control of heart rate. CAN affects the correct operation of the heart and in turn leads to associated arrhythmias and heart attack. An open question is to what extent this condition is detectable by the measurement of Heart Rate Variability (HRV). An even more desirable option is to detect CAN in its early, preclinical stage, to improve treatment and outcomes. In previous work we have shown a difference in the Renyi spectrum between participants identified with well-defined CAN and controls. In this work we applied the multi-scale Renyi entropy to the identification of early CAN in diabetes patients. Results suggest that Renyi entropy derived from a 20-minute Lead-II ECG recording forms a useful contribution to the detection of CAN even in the early stages of the disease. The positive α parameters (1 ≤ α ≤ 5) associated with the Renyi distribution indicated a significant difference (p < 0.00004) between controls and early CAN as well as definite CAN. This is a significant achievement given the simple nature of the information collected, and it raises the prospect of a simple screening test and improved patient outcomes.
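
    The Renyi entropy spectrum referred to above generalizes Shannon entropy through an order parameter alpha. A minimal sketch, assuming a histogram-based probability estimate of the RR-interval series (the study's exact estimator may differ):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha for a probability vector p (Shannon as alpha -> 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def renyi_spectrum(rr_intervals, alphas=range(-5, 6), n_bins=50):
    """Renyi entropy evaluated over a range of orders for an RR-interval histogram."""
    counts, _ = np.histogram(rr_intervals, bins=n_bins)
    p = counts / counts.sum()
    return {a: renyi_entropy(p, a) for a in alphas}
```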

  6. Time-delay signature of chaos in 1550 nm VCSELs with variable-polarization FBG feedback.

    PubMed

    Li, Yan; Wu, Zheng-Mao; Zhong, Zhu-Qiang; Yang, Xian-Jie; Mao, Song; Xia, Guang-Qiong

    2014-08-11

    Based on the framework of the spin-flip model (SFM), the output characteristics of a 1550 nm vertical-cavity surface-emitting laser (VCSEL) subject to variable-polarization fiber Bragg grating (FBG) feedback (VPFBGF) have been investigated. With the aid of the self-correlation function (SF) and the permutation entropy (PE) function, the time-delay signature (TDS) of chaos in the VPFBGF-VCSEL is evaluated, and the influences of the operation parameters on the TDS of chaos are then analyzed. The results show that the TDS of chaos can be suppressed efficiently by selecting a suitable coupling coefficient and feedback rate for the FBG, and is weaker than that of chaos generated by traditional variable-polarization mirror feedback VCSELs (VPMF-VCSELs) or polarization-preserved FBG feedback VCSELs (PPFBGF-VCSELs).
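
    One common way to quantify a time-delay signature from the self-correlation function is to locate the dominant secondary autocorrelation peak inside a search window around the expected feedback delay. A minimal sketch, with the search window as an assumption:

```python
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation function for non-negative lags."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    return acf / acf[0]

def time_delay_signature(x, min_lag, max_lag):
    """Lag of the largest |ACF| value inside a search window, and its height."""
    acf = autocorrelation(x)
    lags = np.arange(min_lag, max_lag)
    best = lags[int(np.argmax(np.abs(acf[min_lag:max_lag])))]
    return best, acf[best]
```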

  7. Tree tensor network approach to simulating Shor's algorithm

    NASA Astrophysics Data System (ADS)

    Dumitrescu, Eugene

    2017-12-01

    Constructively simulating quantum systems furthers our understanding of qualitative and quantitative features which may be analytically intractable. In this paper, we directly simulate and explore the entanglement structure present in the paradigmatic example for exponential quantum speedups: Shor's algorithm. To perform our simulation, we construct a dynamic tree tensor network which manifestly captures two salient circuit features for modular exponentiation. These are the natural two-register bipartition and the invariance of entanglement with respect to permutations of the top-register qubits. Our construction helps identify the entanglement entropy properties, which we summarize by a scaling relation. Further, the tree network is efficiently projected onto a matrix product state from which we execute the quantum Fourier transform. Future simulation of quantum information states with tensor networks exploiting circuit symmetries is discussed.

  8. Folding behavior of ribosomal protein S6 studied by modified Gō-like model

    NASA Astrophysics Data System (ADS)

    Wu, L.; Zhang, J.; Wang, J.; Li, W. F.; Wang, W.

    2007-03-01

    Recent experimental and theoretical studies suggest that, although topology is the determinant factor in protein folding, especially for small single-domain proteins, energetic factors also play an important role in the folding process. The ribosomal protein S6 has been subjected to intensive studies. A radical change of the transition state in its circular permutants has been observed, which is believed to be caused by a biased distribution of contact energies. Since the simplistic topology-only Gō-like model is not able to reproduce such an observation, we modify the model by introducing variable contact energies between residues based on their physicochemical properties. The modified Gō-like model can successfully reproduce the Φ-value distributions, folding nucleus, and folding pathways of both the wild-type and circular permutants of S6. Furthermore, by comparing the results of the modified and the simplistic models, we find that the hydrophobic effect constitutes the major force that balances the loop entropies. This may indicate that nature maintains the folding cooperativity of this protein by carefully arranging the location of hydrophobic residues in the sequence. Our study reveals a strategy or mechanism used by nature to get out of the dilemma when the native structure, possibly required by biological function, conflicts with folding cooperativity. Finally, the possible relationship between such a design of nature and amyloidosis is also discussed.

  9. Relationship of multiscale entropy to task difficulty and sway velocity in healthy young adults.

    PubMed

    Lubetzky, Anat V; Price, Robert; Ciol, Marcia A; Kelly, Valerie E; McCoy, Sarah W

    2015-01-01

    Multiscale entropy (MSE) is a nonlinear measure of postural control that quantifies how complex the postural sway is by assigning a complexity index to the center of pressure (COP) oscillations. While complexity has been shown to be task dependent, the relationship between sway complexity and level of task challenge is currently unclear. This study tested whether MSE can detect short-term changes in postural control in response to increased standing balance task difficulty in healthy young adults and compared this response to that of a traditional measure of postural steadiness, root mean square of velocity (VRMS). COP data from 20 s of quiet stance were analyzed when 30 healthy young adults stood on the following surfaces: on floor and foam with eyes open and closed and on the compliant side of a Both Sides Up (BOSU) ball with eyes open. Complexity index (CompI) was derived from MSE curves. Repeated measures analysis of variance across standing conditions showed a statistically significant effect of condition (p < 0.001) in both the anterior-posterior and medio-lateral directions for both CompI and VRMS. In the medio-lateral direction there was a gradual increase in CompI and VRMS with increased standing challenge. In the anterior-posterior direction, VRMS showed a gradual increase whereas CompI showed significant differences between the BOSU and all other conditions. CompI was moderately and significantly correlated with VRMS. Both nonlinear and traditional measures of postural control were sensitive to the task and increased with increasing difficulty of standing balance tasks in healthy young adults.
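
    For readers unfamiliar with the pipeline, a minimal sketch of coarse-grained multiscale entropy for a COP signal is given below; summing SampEn over scales yields a complexity index of the kind reported. The template length m = 2, tolerance r = 0.15·SD, and maximum scale are conventional defaults, not the study's exact settings.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r_frac=0.15):
        """Sample entropy SampEn(m, r), with tolerance r given as a fraction of the series SD."""
        x = np.asarray(x, dtype=float)
        r = r_frac * x.std()
        def match_count(mm):
            tpl = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            total = 0
            for i in range(len(tpl) - 1):
                dist = np.max(np.abs(tpl[i + 1:] - tpl[i]), axis=1)   # Chebyshev distance
                total += np.sum(dist <= r)
            return total
        b, a = match_count(m), match_count(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.nan

    def multiscale_entropy(x, max_scale=6, m=2, r_frac=0.15):
        """SampEn of the coarse-grained series at scales 1..max_scale."""
        x = np.asarray(x, dtype=float)
        out = []
        for s in range(1, max_scale + 1):
            n = len(x) // s
            out.append(sample_entropy(x[:n * s].reshape(n, s).mean(axis=1), m, r_frac))
        return np.array(out)

    # A complexity index can then be taken as the sum (area) under the MSE curve:
    # comp_index = np.nansum(multiscale_entropy(cop_displacement))
    ```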

  10. Neurophysiological Basis of Multi-Scale Entropy of Brain Complexity and Its Relationship With Functional Connectivity.

    PubMed

    Wang, Danny J J; Jann, Kay; Fan, Chang; Qiao, Yang; Zang, Yu-Feng; Lu, Hanbing; Yang, Yihong

    2018-01-01

    Recently, non-linear statistical measures such as multi-scale entropy (MSE) have been introduced as indices of the complexity of electrophysiology and fMRI time-series across multiple time scales. In this work, we investigated the neurophysiological underpinnings of complexity (MSE) of electrophysiology and fMRI signals and their relations to functional connectivity (FC). MSE and FC analyses were performed on simulated data from a neural-mass-model-based brain network model built with the Brain Dynamics Toolbox, on animal models with concurrent recording of fMRI and electrophysiology in conjunction with pharmacological manipulations, and on resting-state fMRI data from the Human Connectome Project. Our results show that the complexity of regional electrophysiology and fMRI signals is positively correlated with network FC. The associations between MSE and FC are dependent on the temporal scales or frequencies, with higher associations between MSE and FC at lower temporal frequencies. Our results from theoretical modeling, animal experiment and human fMRI indicate that (1) regional neural complexity and network FC may be two related aspects of the brain's information processing: the more complex a region's neural activity, the higher its FC with other brain regions; (2) MSE at high and low frequencies may represent local and distributed information processing across brain regions. Based on literature and our data, we propose that the complexity of regional neural signals may serve as an index of the brain's capacity for information processing: increased complexity may indicate greater transition or exploration between different states of brain networks, and thereby a greater propensity for information processing.

  11. Chaotic characteristics enhanced by impeller of perturbed six-bent-bladed turbine in stirred tank

    NASA Astrophysics Data System (ADS)

    Luan, Deyu; Zhang, Shengfeng; Lu, Jianping; Zhang, Xiaoguang

    The fundamental way of improving mixing efficiency is to induce chaotic flow in a stirred vessel. The impeller form plays an important role in changing the structure of the flow field and realizing chaotic mixing. Based on velocity time series acquired from particle image velocimetry (PIV) experiments and processed in Matlab, the macro-instability (MI), largest Lyapunov exponent (LLE), and Kolmogorov entropy in a water-filled stirred tank are investigated for the impeller of a perturbed six-bent-bladed turbine (6PBT). The results show that the MI characteristics are obvious and two peak values of the MI frequency are observed at the speed N = 60 rpm. As the speed increases beyond 100 rpm, the peaks in the MI frequency disappear and a multi-scale wavelet structure characterizing the chaotic flow field appears. Moreover, at N = 60 rpm the LLE is less than 0 and the Kolmogorov entropy is 0, which means that the flow field is in a periodic state. As the speed is increased above 100 rpm, the LLE and Kolmogorov entropy both become greater than 0, which indicates that the flow field enters chaotic mixing. When the speed reaches about 210 rpm, both the LLE and the Kolmogorov entropy achieve their optimum values, resulting in excellent chaos with the highest mixing efficiency. It is therefore feasible to use the MI frequency, the LLE, and the Kolmogorov entropy to analyze flow field characteristics in a stirred tank. These results improve the understanding of the chaotic mixing mechanism and provide a theoretical reference for the development of new impeller types.

  12. Search for HRV-parameters that detect a sympathetic shift in heart failure patients on β-blocker treatment.

    PubMed

    Zhang, Yanru; de Peuter, Olav R; Kamphuisen, Pieter W; Karemaker, John M

    2013-01-01

    A sympathetic shift in heart rate variability (HRV) from high to lower frequencies may be an early signal of deterioration in a monitored patient. Most chronic heart failure (CHF) patients receive β-blockers. This tends to obscure HRV observation by increasing the fast variations. We tested which HRV parameters would still detect the change into a sympathetic state. β-blocker (Carvedilol®) treated CHF patients underwent a protocol of 10 min supine rest, followed by 10 min active standing. CHF patients (NYHA Class II-IV) n = 15, 10m/5f, mean age 58.4 years (47-72); healthy controls n = 29, 18m/11f, mean age 62.9 years (49-78). Interbeat intervals (IBI) were extracted from the finger blood pressure wave (Nexfin®). Both linear and non-linear HRV analyses were applied that (1) might be able to differentiate patients from healthy controls under resting conditions and (2) detect the change into a sympathetic state in the present short recordings. Linear: mean-IBI, SD-IBI, root mean square of successive differences (rMSSD), pIBI-50 (the proportion of intervals that differs by more than 50 ms from the previous), LF, HF, and LF/HF ratio. Non-linear: Sample entropy (SampEn), Multiscale entropy (MSE), and derived: Multiscale variance (MSV) and Multiscale rMSSD (MSD). In the supine resting situation patients differed from controls by having higher HF and, consequently, lower LF/HF. In addition their longer range (τ = 6-10) MSE was lower as well. The sympathetic shift was, in controls, detected by mean-IBI, rMSSD, pIBI-50, and LF/HF, all going down; in CHF by mean-IBI, rMSSD, pIBI-50, and MSD (τ = 6-10) going down. MSD6-10 introduced here works as a band-pass filter favoring frequencies from 0.02 to 0.1 Hz. In β-blocker treated CHF patients, traditional time domain analysis (mean-IBI, rMSSD, pIBI-50) and MSD6-10 provide the most useful information to detect a condition change.
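
    The linear time-domain measures used here are simple to reproduce; the sketch below computes them from an interbeat-interval series in milliseconds (the spectral and multiscale measures are omitted, and the function name is an arbitrary choice for the example).

    ```python
    import numpy as np

    def hrv_time_domain(ibi_ms):
        """Basic time-domain HRV measures from interbeat intervals (IBIs) in milliseconds."""
        ibi = np.asarray(ibi_ms, dtype=float)
        diff = np.diff(ibi)
        return {
            "mean_IBI": ibi.mean(),
            "SD_IBI": ibi.std(ddof=1),
            "rMSSD": np.sqrt(np.mean(diff ** 2)),
            "pIBI50": np.mean(np.abs(diff) > 50.0) * 100.0,   # % of successive differences > 50 ms
        }
    ```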

  13. Decreased Complexity in Alzheimer's Disease: Resting-State fMRI Evidence of Brain Entropy Mapping.

    PubMed

    Wang, Bin; Niu, Yan; Miao, Liwen; Cao, Rui; Yan, Pengfei; Guo, Hao; Li, Dandan; Guo, Yuxiang; Yan, Tianyi; Wu, Jinglong; Xiang, Jie; Zhang, Hui

    2017-01-01

    Alzheimer's disease (AD) is a frequently observed, irreversible brain function disorder among elderly individuals. Resting-state functional magnetic resonance imaging (rs-fMRI) has been introduced as an alternative approach to assessing brain functional abnormalities in AD patients. However, alterations in the brain rs-fMRI signal complexities in mild cognitive impairment (MCI) and AD patients remain unclear. Here, we described the novel application of permutation entropy (PE) to investigate the abnormal complexity of rs-fMRI signals in MCI and AD patients. The rs-fMRI signals of 30 normal controls (NCs), 33 early MCI (EMCI), 32 late MCI (LMCI), and 29 AD patients were obtained from the Alzheimer's disease Neuroimaging Initiative (ADNI) database. After preprocessing, whole-brain entropy maps of the four groups were extracted and subjected to Gaussian smoothing. We performed a one-way analysis of variance (ANOVA) on the brain entropy maps of the four groups. The results after adjusting for age and sex differences together revealed that the patients with AD exhibited lower complexity than did the MCI and NC controls. We found five clusters that exhibited significant differences and were distributed primarily in the occipital, frontal, and temporal lobes. The average PE of the five clusters exhibited a decreasing trend from MCI to AD. The AD group exhibited the least complexity. Additionally, the average PE of the five clusters was significantly positively correlated with the Mini-Mental State Examination (MMSE) scores and significantly negatively correlated with Functional Assessment Questionnaire (FAQ) scores and global Clinical Dementia Rating (CDR) scores in the patient groups. Significant correlations were also found between the PE and regional homogeneity (ReHo) in the patient groups. These results indicated that declines in PE might be related to changes in regional functional homogeneity in AD. These findings suggested that complexity analyses using PE in rs-fMRI signals can provide important information about the fMRI characteristics of cognitive impairments in MCI and AD.

  14. Decreased Complexity in Alzheimer's Disease: Resting-State fMRI Evidence of Brain Entropy Mapping

    PubMed Central

    Wang, Bin; Niu, Yan; Miao, Liwen; Cao, Rui; Yan, Pengfei; Guo, Hao; Li, Dandan; Guo, Yuxiang; Yan, Tianyi; Wu, Jinglong; Xiang, Jie; Zhang, Hui

    2017-01-01

    Alzheimer's disease (AD) is a frequently observed, irreversible brain function disorder among elderly individuals. Resting-state functional magnetic resonance imaging (rs-fMRI) has been introduced as an alternative approach to assessing brain functional abnormalities in AD patients. However, alterations in the brain rs-fMRI signal complexities in mild cognitive impairment (MCI) and AD patients remain unclear. Here, we described the novel application of permutation entropy (PE) to investigate the abnormal complexity of rs-fMRI signals in MCI and AD patients. The rs-fMRI signals of 30 normal controls (NCs), 33 early MCI (EMCI), 32 late MCI (LMCI), and 29 AD patients were obtained from the Alzheimer's disease Neuroimaging Initiative (ADNI) database. After preprocessing, whole-brain entropy maps of the four groups were extracted and subjected to Gaussian smoothing. We performed a one-way analysis of variance (ANOVA) on the brain entropy maps of the four groups. The results after adjusting for age and sex differences together revealed that the patients with AD exhibited lower complexity than did the MCI and NC controls. We found five clusters that exhibited significant differences and were distributed primarily in the occipital, frontal, and temporal lobes. The average PE of the five clusters exhibited a decreasing trend from MCI to AD. The AD group exhibited the least complexity. Additionally, the average PE of the five clusters was significantly positively correlated with the Mini-Mental State Examination (MMSE) scores and significantly negatively correlated with Functional Assessment Questionnaire (FAQ) scores and global Clinical Dementia Rating (CDR) scores in the patient groups. Significant correlations were also found between the PE and regional homogeneity (ReHo) in the patient groups. These results indicated that declines in PE might be related to changes in regional functional homogeneity in AD. These findings suggested that complexity analyses using PE in rs-fMRI signals can provide important information about the fMRI characteristics of cognitive impairments in MCI and AD. PMID:29209199

  15. Comparative analysis of electric field influence on the quantum wells with different boundary conditions.: I. Energy spectrum, quantum information entropy and polarization.

    PubMed

    Olendski, Oleg

    2015-04-01

    Analytical solutions of the Schrödinger equation for the one-dimensional quantum well with all possible permutations of the Dirichlet and Neumann boundary conditions (BCs) in perpendicular to the interfaces uniform electric field [Formula: see text] are used for the comparative investigation of their interaction and its influence on the properties of the system. Limiting cases of the weak and strong voltages allow an easy mathematical treatment and its clear physical explanation; in particular, for the small [Formula: see text], the perturbation theory derives for all geometries a linear dependence of the polarization on the field with the BC-dependent proportionality coefficient being positive (negative) for the ground (excited) states. Simple two-level approximation elementary explains the negative polarizations as a result of the field-induced destructive interference of the unperturbed modes and shows that in this case the admixture of only the neighboring states plays a dominant role. Different magnitudes of the polarization for different BCs in this regime are explained physically and confirmed numerically. Hellmann-Feynman theorem reveals a fundamental relation between the polarization and the speed of the energy change with the field. It is proved that zero-voltage position entropies [Formula: see text] are BC independent and for all states but the ground Neumann level (which has [Formula: see text]) are equal to [Formula: see text] while the momentum entropies [Formula: see text] depend on the edge requirements and the level. Varying electric field changes position and momentum entropies in the opposite directions such that the entropic uncertainty relation is satisfied. Other physical quantities such as the BC-dependent zero-energy and zero-polarization fields are also studied both numerically and analytically. Applications to different branches of physics, such as ocean fluid dynamics and atmospheric and metallic waveguide electrodynamics, are discussed.

  16. Correntropy-based partial directed coherence for testing multivariate Granger causality in nonlinear processes

    NASA Astrophysics Data System (ADS)

    Kannan, Rohit; Tangirala, Arun K.

    2014-06-01

    Identification of directional influences in multivariate systems is of prime importance in several applications of engineering and sciences such as plant topology reconstruction, fault detection and diagnosis, and neurosciences. A spectrum of related directionality measures, ranging from linear measures such as partial directed coherence (PDC) to nonlinear measures such as transfer entropy, has emerged over the past two decades. The PDC-based technique is simple and effective, but being a linear directionality measure has limited applicability. On the other hand, transfer entropy, despite being a robust nonlinear measure, is computationally intensive and practically implementable only for bivariate processes. The objective of this work is to develop a nonlinear directionality measure, termed KPDC, that possesses the simplicity of PDC but is still applicable to nonlinear processes. The technique is founded on a nonlinear measure called correntropy, a recently proposed generalized correlation measure. The proposed method is equivalent to constructing PDC in a kernel space where the PDC is estimated using a vector autoregressive model built on correntropy. A consistent estimator of the KPDC is developed and important theoretical results are established. A permutation scheme combined with the sequential Bonferroni procedure is proposed for testing the hypothesis of absence of causality. It is demonstrated through several case studies that the proposed methodology effectively detects Granger causality in nonlinear processes.
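
    As a hedged illustration of the testing scheme described (a permutation null combined with a sequential Bonferroni, i.e. Holm, correction), the sketch below assumes some directionality statistic stat_fn(x, y), for example a KPDC estimate supplied by the user; it is not the authors' implementation, and shuffling the putative source series is only one of several surrogate schemes.

    ```python
    import numpy as np

    def permutation_pvalue(stat_fn, x, y, n_perm=1000, seed=None):
        """One-sided permutation p-value for a directionality statistic stat_fn(x, y)."""
        rng = np.random.default_rng(seed)
        observed = stat_fn(x, y)
        null = np.empty(n_perm)
        for k in range(n_perm):
            null[k] = stat_fn(rng.permutation(x), y)   # destroy temporal coupling from the source
        return (np.sum(null >= observed) + 1) / (n_perm + 1)

    def holm_bonferroni(pvalues, alpha=0.05):
        """Sequential (Holm) Bonferroni correction; returns a boolean rejection mask."""
        p = np.asarray(pvalues)
        order = np.argsort(p)
        m = len(p)
        reject = np.zeros(m, dtype=bool)
        for rank, idx in enumerate(order):
            if p[idx] <= alpha / (m - rank):
                reject[idx] = True
            else:
                break
        return reject
    ```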

  17. Phase transitions between lower and higher level management learning in times of crisis: an experimental study based on synergetics.

    PubMed

    Liening, Andreas; Strunk, Guido; Mittelstadt, Ewald

    2013-10-01

    Much has been written about the differences between single- and double-loop learning, or more general between lower level and higher level learning. Especially in times of a fundamental crisis, a transition between lower and higher level learning would be an appropriate reaction to a challenge coming entirely out of the dark. However, so far there is no quantitative method to monitor such a transition. Therefore we introduce theory and methods of synergetics and present results from an experimental study based on the simulation of a crisis within a business simulation game. Hypothesized critical fluctuations - as a marker for so-called phase transitions - have been assessed with permutation entropy. Results show evidence for a phase transition during the crisis, which can be interpreted as a transition between lower and higher level learning.

  18. Towards Characterization, Modeling, and Uncertainty Quantification in Multi-scale Mechanics of Organic-rich Shales

    NASA Astrophysics Data System (ADS)

    Abedi, S.; Mashhadian, M.; Noshadravan, A.

    2015-12-01

    Increasing the efficiency and sustainability of hydrocarbon recovery from organic-rich shales requires a fundamental understanding of the chemomechanical properties of organic-rich shales. This understanding is manifested in the form of physics-based predictive models capable of capturing the highly heterogeneous and multi-scale structure of organic-rich shale materials. In this work we present a framework of experimental characterization, micromechanical modeling, and uncertainty quantification that spans from the nanoscale to the macroscale. Application of experiments such as coupled grid nano-indentation and energy dispersive x-ray spectroscopy, together with micromechanical modeling attributing the role of organic maturity to the texture of the material, allows us to identify unique clay mechanical properties among different samples that are independent of the maturity of shale formations and total organic content. The results can then be used to inform the physically based multiscale model for organic-rich shales consisting of three levels, which spans from the scale of the elementary building blocks (e.g. clay minerals in clay-dominated formations) of organic-rich shales to the scale of the macroscopic inorganic/organic hard/soft inclusion composite. Although this approach is powerful in capturing the effective properties of organic-rich shale in an average sense, it does not account for the uncertainty in compositional and mechanical model parameters. Thus, we take this model one step further by systematically incorporating the main sources of uncertainty in modeling the multiscale behavior of organic-rich shales. In particular we account for the uncertainty in main model parameters at different scales such as porosity, elastic properties, and mineralogy mass percent. To that end, we use the Maximum Entropy Principle and random matrix theory to construct probabilistic descriptions of model inputs based on available information. Monte Carlo simulation is then carried out to propagate the uncertainty and consequently construct probabilistic descriptions of properties at multiple length scales. The combination of experimental characterization and stochastic multi-scale modeling presented in this work improves the robustness of the prediction of essential subsurface parameters at the engineering scale.

  19. Novel permutation measures for image encryption algorithms

    NASA Astrophysics Data System (ADS)

    Abd-El-Hafiz, Salwa K.; AbdElHaleem, Sherif H.; Radwan, Ahmed G.

    2016-10-01

    This paper proposes two measures for the evaluation of permutation techniques used in image encryption. First, a general mathematical framework for describing the permutation phase used in image encryption is presented. Using this framework, six different permutation techniques, based on chaotic and non-chaotic generators, are described. The two new measures are, then, introduced to evaluate the effectiveness of permutation techniques. These measures are (1) Percentage of Adjacent Pixels Count (PAPC) and (2) Distance Between Adjacent Pixels (DBAP). The proposed measures are used to evaluate and compare the six permutation techniques in different scenarios. The permutation techniques are applied on several standard images and the resulting scrambled images are analyzed. Moreover, the new measures are used to compare the permutation algorithms on different matrix sizes irrespective of the actual parameters used in each algorithm. The analysis results show that the proposed measures are good indicators of the effectiveness of the permutation technique.

  20. Blocks in cycles and k-commuting permutations.

    PubMed

    Moreno, Rutilo; Rivera, Luis Manuel

    2016-01-01

    We introduce and study k -commuting permutations. One of our main results is a characterization of permutations that k -commute with a given permutation. Using this characterization, we obtain formulas for the number of permutations that k -commute with a permutation [Formula: see text], for some cycle types of [Formula: see text]. Our enumerative results are related with integer sequences in "The On-line Encyclopedia of Integer Sequences", and in some cases provide new interpretations for such sequences.

  1. Fast algorithms for transforming back and forth between a signed permutation and its equivalent simple permutation.

    PubMed

    Gog, Simon; Bader, Martin

    2008-10-01

    The problem of sorting signed permutations by reversals is a well-studied problem in computational biology. The first polynomial time algorithm was presented by Hannenhalli and Pevzner in 1995. The algorithm was improved several times, and nowadays the most efficient algorithm has a subquadratic running time. Simple permutations played an important role in the development of these algorithms. Although the latest result of Tannier et al. does not require simple permutations, the preliminary version of their algorithm as well as the first polynomial time algorithm of Hannenhalli and Pevzner use the structure of simple permutations. More precisely, the latter algorithms require a precomputation that transforms a permutation into an equivalent simple permutation. To the best of our knowledge, all published algorithms for this transformation have at least a quadratic running time. For further investigations on genome rearrangement problems, the existence of a fast algorithm for the transformation could be crucial. Another important task is the back transformation, i.e. if we have a sorting on the simple permutation, transform it into a sorting on the original permutation. Again, the naive approach results in an algorithm with quadratic running time. In this paper, we present a linear time algorithm for transforming a permutation into an equivalent simple permutation, and an O(n log n) algorithm for the back transformation of the sorting sequence.

  2. A Random Variable Related to the Inversion Vector of a Partial Random Permutation

    ERIC Educational Resources Information Center

    Laghate, Kavita; Deshpande, M. N.

    2005-01-01

    In this article, we define the inversion vector of a permutation of the integers 1, 2,..., n. We set up a particular kind of permutation, called a partial random permutation. The sum of the elements of the inversion vector of such a permutation is a random variable of interest.
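
    A small sketch to make the definition concrete (conventions for inversion vectors differ slightly across texts; this is one common reading, not necessarily the article's exact definition):

    ```python
    def inversion_vector(perm):
        """Inversion vector of a permutation: entry j counts the elements larger than
        perm[j] that appear before position j."""
        return [sum(1 for i in range(j) if perm[i] > perm[j]) for j in range(len(perm))]

    # Example: inversion_vector([3, 1, 4, 2]) -> [0, 1, 0, 2]; the sum (3) equals the
    # total number of inversions of the permutation, the random variable of interest.
    ```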

  3. Decreased complexity of glucose dynamics in diabetes: evidence from multiscale entropy analysis of continuous glucose monitoring system data.

    PubMed

    Chen, Jin-Long; Chen, Pin-Fan; Wang, Hung-Ming

    2014-07-15

    Parameters of glucose dynamics recorded by the continuous glucose monitoring system (CGMS) could help in the control of glycemic fluctuations, which is important in diabetes management. Multiscale entropy (MSE) analysis has recently been developed to measure the complexity of physical and physiological time sequences. A reduced MSE complexity index indicates the increased repetition patterns of the time sequence, and, thus, a decreased complexity in this system. No study has investigated the MSE analysis of glucose dynamics in diabetes. This study was designed to compare the complexity of glucose dynamics between the diabetic patients (n = 17) and the control subjects (n = 13), who were matched for sex, age, and body mass index via MSE analysis using the CGMS data. Compared with the control subjects, the diabetic patients revealed a significant increase (P < 0.001) in the mean (diabetic patients 166.0 ± 10.4 vs. control subjects 93.3 ± 1.5 mg/dl), the standard deviation (51.7 ± 4.3 vs. 11.1 ± 0.5 mg/dl), and the mean amplitude of glycemic excursions (127.0 ± 9.2 vs. 27.7 ± 1.3 mg/dl) of the glucose levels; and a significant decrease (P < 0.001) in the MSE complexity index (5.09 ± 0.23 vs. 7.38 ± 0.28). In conclusion, the complexity of glucose dynamics is decreased in diabetes. This finding implies the reactivity of glucoregulation is impaired in the diabetic patients. Such impairment presenting as an increased regularity of glycemic fluctuating pattern could be detected by MSE analysis. Thus, the MSE complexity index could potentially be used as a biomarker in the monitoring of diabetes.

  4. Application of multi-scale wavelet entropy and multi-resolution Volterra models for climatic downscaling

    NASA Astrophysics Data System (ADS)

    Sehgal, V.; Lakhanpal, A.; Maheswaran, R.; Khosa, R.; Sridhar, Venkataramana

    2018-01-01

    This study proposes a wavelet-based multi-resolution modeling approach for statistical downscaling of GCM variables to mean monthly precipitation for five locations in the Krishna Basin, India. The NCEP climatic dataset is used to train the proposed models (Jan. '69 to Dec. '94), which are then applied to the corresponding CanCM4 GCM variables to simulate precipitation for the validation (Jan. '95-Dec. '05) and forecast (Jan. '06-Dec. '35) periods. The observed precipitation data are obtained from the India Meteorological Department (IMD) gridded precipitation product at 0.25 degree spatial resolution. This paper proposes a novel Multi-Scale Wavelet Entropy (MWE) based approach for clustering climatic variables into suitable clusters using the k-means methodology. Principal Component Analysis (PCA) is used to obtain the representative Principal Components (PC) explaining 90-95% of the variance for each cluster. A multi-resolution non-linear approach combining Discrete Wavelet Transform (DWT) and Second Order Volterra (SoV) is used to model the representative PCs to obtain the downscaled precipitation for each downscaling location (W-P-SoV model). The results establish that wavelet-based multi-resolution SoV models perform significantly better than the traditional Multiple Linear Regression (MLR) and Artificial Neural Network (ANN) based frameworks. It is observed that the proposed MWE-based clustering and subsequent PCA help reduce the dimensionality of the input climatic variables, while capturing more variability compared to stand-alone k-means (no MWE). The proposed models perform better in estimating the number of precipitation events during the non-monsoon periods, whereas the models with clustering but without MWE over-estimate the rainfall during the dry season.
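
    For orientation only, below is a minimal sketch of one common construction of multi-scale wavelet entropy: the series is coarse-grained at several scales, decomposed with a discrete wavelet transform, and the Shannon entropy of the relative subband energies is computed. The wavelet family, decomposition level, and scale set are illustrative assumptions, and the paper's MWE definition may differ in detail.

    ```python
    import numpy as np
    import pywt

    def wavelet_entropy(x, wavelet="db4", level=4):
        """Shannon entropy of the relative wavelet energy across decomposition subbands."""
        coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        p = energies / energies.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def multiscale_wavelet_entropy(x, scales=(1, 2, 4, 8), wavelet="db4", level=4):
        """Coarse-grain the series at several scales and compute wavelet entropy at each."""
        x = np.asarray(x, dtype=float)
        out = []
        for s in scales:
            n = len(x) // s
            coarse = x[:n * s].reshape(n, s).mean(axis=1)
            out.append(wavelet_entropy(coarse, wavelet=wavelet, level=level))
        return np.array(out)
    ```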

  5. A transposase strategy for creating libraries of circularly permuted proteins.

    PubMed

    Mehta, Manan M; Liu, Shirley; Silberg, Jonathan J

    2012-05-01

    A simple approach for creating libraries of circularly permuted proteins is described that is called PERMutation Using Transposase Engineering (PERMUTE). In PERMUTE, the transposase MuA is used to randomly insert a minitransposon that can function as a protein expression vector into a plasmid that contains the open reading frame (ORF) being permuted. A library of vectors that express different permuted variants of the ORF-encoded protein is created by: (i) using bacteria to select for target vectors that acquire an integrated minitransposon; (ii) excising the ensemble of ORFs that contain an integrated minitransposon from the selected vectors; and (iii) circularizing the ensemble of ORFs containing integrated minitransposons using intramolecular ligation. Construction of a Thermotoga neapolitana adenylate kinase (AK) library using PERMUTE revealed that this approach produces vectors that express circularly permuted proteins with distinct sequence diversity from existing methods. In addition, selection of this library for variants that complement the growth of Escherichia coli with a temperature-sensitive AK identified functional proteins with novel architectures, suggesting that PERMUTE will be useful for the directed evolution of proteins with new functions.

  6. A transposase strategy for creating libraries of circularly permuted proteins

    PubMed Central

    Mehta, Manan M.; Liu, Shirley; Silberg, Jonathan J.

    2012-01-01

    A simple approach for creating libraries of circularly permuted proteins is described that is called PERMutation Using Transposase Engineering (PERMUTE). In PERMUTE, the transposase MuA is used to randomly insert a minitransposon that can function as a protein expression vector into a plasmid that contains the open reading frame (ORF) being permuted. A library of vectors that express different permuted variants of the ORF-encoded protein is created by: (i) using bacteria to select for target vectors that acquire an integrated minitransposon; (ii) excising the ensemble of ORFs that contain an integrated minitransposon from the selected vectors; and (iii) circularizing the ensemble of ORFs containing integrated minitransposons using intramolecular ligation. Construction of a Thermotoga neapolitana adenylate kinase (AK) library using PERMUTE revealed that this approach produces vectors that express circularly permuted proteins with distinct sequence diversity from existing methods. In addition, selection of this library for variants that complement the growth of Escherichia coli with a temperature-sensitive AK identified functional proteins with novel architectures, suggesting that PERMUTE will be useful for the directed evolution of proteins with new functions. PMID:22319214

  7. Multiscale Medical Image Fusion in Wavelet Domain

    PubMed Central

    Khare, Ashish

    2013-01-01

    Wavelet transforms have emerged as a powerful tool in image fusion. However, the study and analysis of medical image fusion is still a challenging area of research. Therefore, in this paper, we propose a multiscale fusion of multimodal medical images in wavelet domain. Fusion of medical images has been performed at multiple scales varying from minimum to maximum level using maximum selection rule which provides more flexibility and choice to select the relevant fused images. The experimental analysis of the proposed method has been performed with several sets of medical images. Fusion results have been evaluated subjectively and objectively with existing state-of-the-art fusion methods which include several pyramid- and wavelet-transform-based fusion methods and principal component analysis (PCA) fusion method. The comparative analysis of the fusion results has been performed with edge strength (Q), mutual information (MI), entropy (E), standard deviation (SD), blind structural similarity index metric (BSSIM), spatial frequency (SF), and average gradient (AG) metrics. The combined subjective and objective evaluations of the proposed fusion method at multiple scales showed the effectiveness and goodness of the proposed approach. PMID:24453868
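
    A condensed sketch of maximum-selection fusion in the wavelet domain, assuming two co-registered grayscale images and the PyWavelets package; the wavelet family and decomposition level are placeholders, and the paper's multi-scale selection scheme may differ from this simple variant.

    ```python
    import numpy as np
    import pywt

    def fuse_wavelet_max(img_a, img_b, wavelet="db2", level=3):
        """Fuse two registered grayscale images by keeping, at every subband position,
        the coefficient with the larger magnitude (maximum selection rule)."""
        ca = pywt.wavedec2(img_a.astype(float), wavelet, level=level)
        cb = pywt.wavedec2(img_b.astype(float), wavelet, level=level)
        fused = [np.where(np.abs(ca[0]) >= np.abs(cb[0]), ca[0], cb[0])]   # approximation band
        for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):             # detail bands per level
            fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                               for a, b in zip((ha, va, da), (hb, vb, db))))
        return pywt.waverec2(fused, wavelet)
    ```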

  8. Magnetospheric Multiscale Mission Observations of Magnetic Flux Ropes in the Earth's Plasma Sheet

    NASA Astrophysics Data System (ADS)

    Slavin, J. A.; Akhavan-Tafti, M.; Poh, G.; Le, G.; Russell, C. T.; Nakamura, R.; Baumjohann, W.; Torbert, R. B.; Gershman, D. J.; Pollock, C. J.; Giles, B. L.; Moore, T. E.; Burch, J. L.

    2017-12-01

    A major discovery by the Cluster mission and the previous generation of science missions is the presence of earthward and tailward moving magnetic flux ropes in the Earth's plasma sheet. However, the lack of high-time resolution plasma measurements severely limited progress concerning the formation and evolution of these reconnection-generated structures. We use high-time resolution magnetic and electric field and plasma measurements from the Magnetospheric Multiscale mission's first tail season to investigate: 1) the distribution of flux rope diameters relative to the local ion and electron inertial lengths; 2) the internal force balance sustaining these structures; 3) the magnetic connectivity of the flux ropes to the Earth and/or the interplanetary medium; 4) the specific entropy of earthward moving flux ropes and the possible effect of "buoyancy" on how deep they penetrate into the inner magnetosphere; and 5) evidence for coalescence of adjacent flux ropes and/or the division of existing flux ropes through the formation of secondary X-lines. The results of these initial analyses will be discussed in terms of their implications for reconnection-driven magnetospheric dynamics and substorms.

  9. Encoding Sequential Information in Semantic Space Models: Comparing Holographic Reduced Representation and Random Permutation

    PubMed Central

    Recchia, Gabriel; Sahlgren, Magnus; Kanerva, Pentti; Jones, Michael N.

    2015-01-01

    Circular convolution and random permutation have each been proposed as neurally plausible binding operators capable of encoding sequential information in semantic memory. We perform several controlled comparisons of circular convolution and random permutation as means of encoding paired associates as well as encoding sequential information. Random permutations outperformed convolution with respect to the number of paired associates that can be reliably stored in a single memory trace. Performance was equal on semantic tasks when using a small corpus, but random permutations were ultimately capable of achieving superior performance due to their higher scalability to large corpora. Finally, “noisy” permutations in which units are mapped to other units arbitrarily (no one-to-one mapping) perform nearly as well as true permutations. These findings increase the neurological plausibility of random permutations and highlight their utility in vector space models of semantics. PMID:25954306
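
    To make the two binding operators concrete, here is a minimal sketch; the dimensionality, the Gaussian index vectors, and the single fixed permutation are illustrative assumptions, and the models compared in the paper involve full corpus training on top of these operators.

    ```python
    import numpy as np

    def circular_convolution(a, b):
        """Holographic reduced representation binding via FFT-based circular convolution."""
        return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    def permutation_binding(a, b, perm):
        """Bind by permuting one vector before superposition (order-sensitive encoding)."""
        return a + b[perm]

    rng = np.random.default_rng(0)
    d = 1024
    dog, bite = rng.normal(0, 1 / np.sqrt(d), size=(2, d))   # random index vectors
    perm = rng.permutation(d)

    pair_hrr = circular_convolution(dog, bite)    # HRR encoding of the pair
    pair_perm = permutation_binding(dog, bite, perm)   # random-permutation encoding
    ```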

  10. Finite state model and compatibility theory - New analysis tools for permutation networks

    NASA Technical Reports Server (NTRS)

    Huang, S.-T.; Tripathi, S. K.

    1986-01-01

    A simple model to describe the fundamental operation theory of shuffle-exchange-type permutation networks, the finite permutation machine (FPM), is described, and theorems which transform the control matrix result to a continuous compatible vector result are developed. It is found that only 2n-1 shuffle exchange passes are necessary, and that 3n-3 passes are sufficient, to realize all permutations, reducing the sufficient number of passes by two from previous results. The flexibility of the approach is demonstrated by the description of a stack permutation machine (SPM) which can realize all permutations, and by showing that the FPM corresponding to the Benes (1965) network belongs to the SPM. The FPM corresponding to the network with two cascaded reverse-exchange networks is found to realize all permutations, and a simple mechanism to verify several equivalence relationships of various permutation networks is discussed.

  11. Sorting permutations by prefix and suffix rearrangements.

    PubMed

    Lintzmayer, Carla Negri; Fertin, Guillaume; Dias, Zanoni

    2017-02-01

    Some interesting combinatorial problems have been motivated by genome rearrangements, which are mutations that affect large portions of a genome. When we represent genomes as permutations, the goal is to transform a given permutation into the identity permutation with the minimum number of rearrangements. When they affect segments from the beginning (respectively end) of the permutation, they are called prefix (respectively suffix) rearrangements. This paper presents results for rearrangement problems that involve prefix and suffix versions of reversals and transpositions considering unsigned and signed permutations. We give 2-approximation and ([Formula: see text])-approximation algorithms for these problems, where [Formula: see text] is a constant divided by the number of breakpoints (pairs of consecutive elements that should not be consecutive in the identity permutation) in the input permutation. We also give bounds for the diameters concerning these problems and provide ways of improving the practical results of our algorithms.
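
    As a generic illustration of sorting by prefix reversals only (not the paper's approximation algorithms for the prefix/suffix variants), the classic greedy "pancake" strategy below uses at most two prefix reversals per element:

    ```python
    def sort_by_prefix_reversals(perm):
        """Greedy pancake sort: repeatedly bring the largest unsorted element to the front
        with one prefix reversal, then flip it into place with a second."""
        perm = list(perm)
        ops = []   # lengths of the prefixes reversed, in order
        for size in range(len(perm), 1, -1):
            i = perm.index(max(perm[:size]))
            if i != size - 1:
                if i != 0:
                    perm[:i + 1] = reversed(perm[:i + 1])
                    ops.append(i + 1)
                perm[:size] = reversed(perm[:size])
                ops.append(size)
        return perm, ops

    # Example: sort_by_prefix_reversals([3, 1, 4, 2]) returns the identity permutation
    # together with the sequence of prefix lengths that were reversed.
    ```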

  12. Low-Complexity Discriminative Feature Selection From EEG Before and After Short-Term Memory Task.

    PubMed

    Behzadfar, Neda; Firoozabadi, S Mohammad P; Badie, Kambiz

    2016-10-01

    A reliable and unobtrusive quantification of changes in cortical activity during a short-term memory task can be used to evaluate the efficacy of interfaces and to provide real-time user-state information. In this article, we investigate changes in electroencephalogram signals in short-term memory with respect to the baseline activity. The electroencephalogram signals have been analyzed using 9 linear and nonlinear/dynamic measures. We applied the Wilcoxon statistical test and the Davies-Bouldin criterion to select optimal discriminative features. The results show that, among the features, the permutation entropy significantly increased in the frontal lobe and the occipital second lower alpha band activity decreased during the memory task. These 2 features reflect the same mental task; however, their correlation with the memory task varies across intervals. In conclusion, it is suggested that combining the 2 features would improve the performance of memory-based neurofeedback systems. © EEG and Clinical Neuroscience Society (ECNS) 2016.

  13. Polarization-resolved time-delay signatures of chaos induced by FBG-feedback in VCSEL.

    PubMed

    Zhong, Zhu-Qiang; Li, Song-Sui; Chan, Sze-Chun; Xia, Guang-Qiong; Wu, Zheng-Mao

    2015-06-15

    Polarization-resolved chaotic emission intensities from a vertical-cavity surface-emitting laser (VCSEL) subject to feedback from a fiber Bragg grating (FBG) are numerically investigated. Time-delay (TD) signatures of the feedback are examined through various means, including self-correlations of the intensity time series of the individual polarizations, cross-correlation of the intensity time series between the two polarizations, and permutation entropies calculated for the individual polarizations. The results show that the TD signatures can be clearly suppressed by selecting suitable operation parameters such as the feedback strength, FBG bandwidth, and Bragg frequency. Also, in the operational parameter space, numerical maps of TD signatures and effective bandwidths are obtained, which show regions of chaotic signals with both wide bandwidths and weak TD signatures. Finally, compared with a VCSEL subject to feedback from a mirror, the VCSEL subject to feedback from the FBG generally shows better concealment of the TD signatures with similar, or even wider, bandwidths.

  14. A Reversible Logical Circuit Synthesis Algorithm Based on Decomposition of Cycle Representations of Permutations

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Li, Zhiqiang; Zhang, Gaoman; Pan, Suhan; Zhang, Wei

    2018-05-01

    A reversible function is isomorphic to a permutation, and an arbitrary permutation can be represented by a series of cycles. A new synthesis algorithm for 3-qubit reversible circuits is presented. It consists of two parts: the first uses the Number of the reversible function's Different Bits (NDBs) to decide whether a NOT gate should be added to decrease the Hamming distance between the input and output vectors; the second exploits properties of the cycle representation of permutations, decomposing the cycles so that the permutation moves progressively closer to, and finally becomes, the identity permutation. The decomposition is realized using fully controlled Toffoli gates with positive and negative controls.

  15. Consciousness Indexing and Outcome Prediction with Resting-State EEG in Severe Disorders of Consciousness.

    PubMed

    Stefan, Sabina; Schorr, Barbara; Lopez-Rolon, Alex; Kolassa, Iris-Tatjana; Shock, Jonathan P; Rosenfelder, Martin; Heck, Suzette; Bender, Andreas

    2018-04-17

    We applied the following methods to resting-state EEG data from patients with disorders of consciousness (DOC) for consciousness indexing and outcome prediction: microstates, entropy (i.e. approximate, permutation), power in alpha and delta frequency bands, and connectivity (i.e. weighted symbolic mutual information, symbolic transfer entropy, complex network analysis). Patients with unresponsive wakefulness syndrome (UWS) and patients in a minimally conscious state (MCS) were classified into these two categories by fitting and testing a generalised linear model. We aimed subsequently to develop an automated system for outcome prediction in severe DOC by selecting an optimal subset of features using sequential floating forward selection (SFFS). The two outcome categories were defined as UWS or dead, and MCS or emerged from MCS. Percentage of time spent in microstate D in the alpha frequency band performed best at distinguishing MCS from UWS patients. The average clustering coefficient obtained from thresholding beta coherence performed best at predicting outcome. The optimal subset of features selected with SFFS consisted of the frequency of microstate A in the 2-20 Hz frequency band, path length obtained from thresholding alpha coherence, and average path length obtained from thresholding alpha coherence. Combining these features seemed to afford high prediction power. Python and MATLAB toolboxes for the above calculations are freely available under the GNU public license for non-commercial use ( https://qeeg.wordpress.com ).

  16. Effective Iterated Greedy Algorithm for Flow-Shop Scheduling Problems with Time lags

    NASA Astrophysics Data System (ADS)

    ZHAO, Ning; YE, Song; LI, Kaidian; CHEN, Siyu

    2017-05-01

    The flow shop scheduling problem with time lags is a practical scheduling problem and has attracted many studies. The permutation problem (PFSP with time lags) has received considerable attention, but the non-permutation problem (non-PFSP with time lags) seems to be neglected. With the aim of minimizing the makespan while satisfying time-lag constraints, efficient algorithms for the PFSP and non-PFSP problems are proposed: an iterated greedy algorithm for the permutation case (IGTLP) and an iterated greedy algorithm for the non-permutation case (IGTLNP). The proposed algorithms are verified using well-known simple and complex instances of permutation and non-permutation problems with various time-lag ranges. The permutation results indicate that the proposed IGTLP can reach a near-optimal solution within roughly 11% of the computational time of the traditional GA approach. The non-permutation results indicate that the proposed IG can reach nearly the same solution within less than 1% of the computational time of the traditional GA approach. The proposed research treats the PFSP and non-PFSP together with minimal and maximal time-lag considerations, which provides an interesting viewpoint for industrial implementation.

  17. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies

    PubMed Central

    Wayne, Peter M.; Gow, Brian J.; Costa, Madalena D.; Peng, C.-K.; Lipsitz, Lewis A.; Hausdorff, Jeffrey M.; Davis, Roger B.; Walsh, Jacquelyn N.; Lough, Matthew; Novak, Vera; Yeh, Gloria Y.; Ahn, Andrew C.; Macklin, Eric A.; Manor, Brad

    2014-01-01

    Background Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures vary considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale “complexity” of postural sway fluctuations. Objectives To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults. Methods A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed-Up–and-Go tests characterized physical function. Results At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function. Conclusion Multiscale entropy offers a complementary approach to traditional COP measures for characterizing sway during quiet standing, and may be more sensitive to the effects of TC in healthy adults. Trial Registration ClinicalTrials.gov NCT01340365 PMID:25494333

  18. Complexity-Based Measures Inform Effects of Tai Chi Training on Standing Postural Control: Cross-Sectional and Randomized Trial Studies.

    PubMed

    Wayne, Peter M; Gow, Brian J; Costa, Madalena D; Peng, C-K; Lipsitz, Lewis A; Hausdorff, Jeffrey M; Davis, Roger B; Walsh, Jacquelyn N; Lough, Matthew; Novak, Vera; Yeh, Gloria Y; Ahn, Andrew C; Macklin, Eric A; Manor, Brad

    2014-01-01

    Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures vary considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale "complexity" of postural sway fluctuations. To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults. A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after the training during standing on a force-plate with eyes-open (EO) and eyes-closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed-Up-and-Go tests characterized physical function. At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function. Multiscale entropy offers a complementary approach to traditional COP measures for characterizing sway during quiet standing, and may be more sensitive to the effects of TC in healthy adults. ClinicalTrials.gov NCT01340365.

  19. Decryption of pure-position permutation algorithms.

    PubMed

    Zhao, Xiao-Yu; Chen, Gang; Zhang, Dan; Wang, Xiao-Hong; Dong, Guang-Chang

    2004-07-01

    Pure position permutation image encryption algorithms, which are commonly used for image encryption and are investigated in this work, are unfortunately fragile under known-plaintext attack. In view of this weakness, we put forward an effective decryption algorithm for all pure-position permutation algorithms. First, a summary of pure position permutation image encryption algorithms is given by introducing the concept of ergodic matrices. Then, using probability theory and algebraic principles, the decryption probability of pure-position permutation algorithms is verified theoretically; and by defining the operation system of fuzzy ergodic matrices, we improve a specific decryption algorithm. Finally, some simulation results are shown.
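
    The weakness being exploited is easy to see in a toy setting: with a single known plaintext-ciphertext pair whose pixel values are all distinct, the position permutation can be read off directly. The sketch below illustrates only that idea, under the artificial unique-value assumption; the paper's probabilistic analysis addresses the realistic case of repeated pixel values.

    ```python
    import numpy as np

    def recover_permutation(plain, cipher):
        """Map each ciphertext position to the plaintext position holding the same value
        (toy known-plaintext attack; assumes all pixel values are distinct)."""
        pos = {v: i for i, v in enumerate(plain.ravel())}
        return np.array([pos[v] for v in cipher.ravel()])

    def decrypt(cipher, perm, shape):
        """Undo a pure position permutation once perm (cipher index -> plain index) is known."""
        plain = np.empty(cipher.size, dtype=cipher.dtype)
        plain[perm] = cipher.ravel()
        return plain.reshape(shape)
    ```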

  20. Entropy measures detect increased movement variability in resistance training when elite rugby players use the ball.

    PubMed

    Moras, Gerard; Fernández-Valdés, Bruno; Vázquez-Guerrero, Jairo; Tous-Fajardo, Julio; Exel, Juliana; Sampaio, Jaime

    2018-05-24

    This study described the variability in acceleration during a resistance training task performed on horizontal inertial flywheels without (NOBALL) or with (BALL) the constraint of catching and throwing a rugby ball. Twelve elite rugby players (mean±SD: age 25.6±3.0 years, height 1.82±0.07 m, weight 94.0±9.9 kg) performed the resistance training task in both conditions (NOBALL and BALL). Players completed five minutes of a standardized warm-up, followed by two series of six repetitions in each condition: in the first three repetitions the intensity was progressively increased, while the last three were performed at maximal voluntary effort. Thereafter, the participants performed two series of eight repetitions in each condition over two days and in random order, with a minimum of 10 min between series. The structure of variability was analysed using non-linear measures of entropy. Mean changes (%; ±90% CL) of 4.64; ±3.1 g for mean acceleration and 39.48; ±36.63 a.u. for sample entropy indicated likely and very likely increases in the BALL condition. Multiscale entropy also showed higher unpredictability of acceleration under the BALL condition, especially at higher time scales. The application of match-specific constraints in resistance training for rugby players elicits different amounts of variability of body acceleration across multiple physiological time scales. Understanding the non-linear processes inherent to the manipulation of resistance training variables with constraints, and the associated motor adaptations, may help coaches and trainers to enhance the effectiveness of physical training and, ultimately, better understand and maximize sports performance. Copyright © 2018 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  1. Improved Prediction of Falls in Community-Dwelling Older Adults Through Phase-Dependent Entropy of Daily-Life Walking

    PubMed Central

    Ihlen, Espen A. F.; van Schooten, Kimberley S.; Bruijn, Sjoerd M.; van Dieën, Jaap H.; Vereijken, Beatrix; Helbostad, Jorunn L.; Pijnappels, Mirjam

    2018-01-01

    Age and age-related diseases have been suggested to decrease entropy of human gait kinematics, which is thought to make older adults more susceptible to falls. In this study we introduce a new entropy measure, called phase-dependent generalized multiscale entropy (PGME), and test whether this measure improves fall-risk prediction in community-dwelling older adults. PGME can assess phase-dependent changes in the stability of gait dynamics that result from kinematic changes in events such as heel strike and toe-off. PGME was assessed for trunk acceleration of 30 s walking epochs in a re-analysis of 1 week of daily-life activity data from the FARAO study, originally described by van Schooten et al. (2016). The re-analyzed data set contained inertial sensor data from 52 single- and 46 multiple-time prospective fallers in a 6 months follow-up period, and an equal number of non-falling controls matched by age, weight, height, gender, and the use of walking aids. The predictive ability of PGME for falls was assessed using a partial least squares regression. PGME had a superior predictive ability of falls among single-time prospective fallers when compared to the other gait features. The single-time fallers had a higher PGME (p < 0.0001) of their trunk acceleration at 60% of their step cycle when compared with non-fallers. No significant differences were found between PGME of multiple-time fallers and non-fallers, but PGME was found to improve the prediction model of multiple-time fallers when combined with other gait features. These findings suggest that taking into account phase-dependent changes in the stability of the gait dynamics has additional value for predicting falls in older people, especially for single-time prospective fallers. PMID:29556188

  2. Improved Prediction of Falls in Community-Dwelling Older Adults Through Phase-Dependent Entropy of Daily-Life Walking.

    PubMed

    Ihlen, Espen A F; van Schooten, Kimberley S; Bruijn, Sjoerd M; van Dieën, Jaap H; Vereijken, Beatrix; Helbostad, Jorunn L; Pijnappels, Mirjam

    2018-01-01

    Age and age-related diseases have been suggested to decrease entropy of human gait kinematics, which is thought to make older adults more susceptible to falls. In this study we introduce a new entropy measure, called phase-dependent generalized multiscale entropy (PGME), and test whether this measure improves fall-risk prediction in community-dwelling older adults. PGME can assess phase-dependent changes in the stability of gait dynamics that result from kinematic changes in events such as heel strike and toe-off. PGME was assessed for trunk acceleration of 30 s walking epochs in a re-analysis of 1 week of daily-life activity data from the FARAO study, originally described by van Schooten et al. (2016). The re-analyzed data set contained inertial sensor data from 52 single- and 46 multiple-time prospective fallers in a 6 months follow-up period, and an equal number of non-falling controls matched by age, weight, height, gender, and the use of walking aids. The predictive ability of PGME for falls was assessed using a partial least squares regression. PGME had a superior predictive ability of falls among single-time prospective fallers when compared to the other gait features. The single-time fallers had a higher PGME ( p < 0.0001) of their trunk acceleration at 60% of their step cycle when compared with non-fallers. No significant differences were found between PGME of multiple-time fallers and non-fallers, but PGME was found to improve the prediction model of multiple-time fallers when combined with other gait features. These findings suggest that taking into account phase-dependent changes in the stability of the gait dynamics has additional value for predicting falls in older people, especially for single-time prospective fallers.

  3. Weight distributions for turbo codes using random and nonrandom permutations

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Divsalar, D.

    1995-01-01

    This article takes a preliminary look at the weight distributions achievable for turbo codes using random, nonrandom, and semirandom permutations. Due to the recursiveness of the encoders, it is important to distinguish between self-terminating and non-self-terminating input sequences. The non-self-terminating sequences have little effect on decoder performance, because they accumulate high encoded weight until they are artificially terminated at the end of the block. From probabilistic arguments based on selecting the permutations randomly, it is concluded that the self-terminating weight-2 data sequences are the most important consideration in the design of constituent codes; higher-weight self-terminating sequences have successively decreasing importance. Also, increasing the number of codes and, correspondingly, the number of permutations makes it more and more likely that the bad input sequences will be broken up by one or more of the permuters. It is possible to design nonrandom permutations that ensure that the minimum distance due to weight-2 input sequences grows roughly as the square root of (2N), where N is the block length. However, these nonrandom permutations amplify the bad effects of higher-weight inputs, and as a result they are inferior in performance to randomly selected permutations. But there are 'semirandom' permutations that perform nearly as well as the designed nonrandom permutations with respect to weight-2 input sequences and are not as susceptible to being foiled by higher-weight inputs.
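
    The "semirandom" permutations mentioned above are commonly built by rejection: indices that are close at the interleaver input must be mapped far apart at the output. The sketch below, with an assumed spread parameter s, illustrates that construction in general terms; it is not the specific design from the article.

        # Illustrative S-random-style ("semirandom") interleaver: input positions that are
        # within distance s of each other are mapped to output values more than s apart.
        import random

        def s_random_permutation(n, s, max_tries=100):
            for _ in range(max_tries):            # restart if the greedy fill gets stuck
                remaining = list(range(n))
                random.shuffle(remaining)
                perm, stuck = [], False
                for _ in range(n):
                    for idx, cand in enumerate(remaining):
                        if all(abs(cand - p) > s for p in perm[-s:]):
                            perm.append(remaining.pop(idx))
                            break
                    else:
                        stuck = True
                        break
                if not stuck:
                    return perm
            raise RuntimeError("no S-random permutation found; try a smaller s")

        print(s_random_permutation(n=64, s=4)[:10])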

  4. PERMutation Using Transposase Engineering (PERMUTE): A Simple Approach for Constructing Circularly Permuted Protein Libraries.

    PubMed

    Jones, Alicia M; Atkinson, Joshua T; Silberg, Jonathan J

    2017-01-01

    Rearrangements that alter the order of a protein's sequence are used in the lab to study protein folding, improve activity, and build molecular switches. One of the simplest ways to rearrange a protein sequence is through random circular permutation, where native protein termini are linked together and new termini are created elsewhere through random backbone fission. Transposase mutagenesis has emerged as a simple way to generate libraries encoding different circularly permuted variants of proteins. With this approach, a synthetic transposon (called a permuteposon) is randomly inserted throughout a circularized gene to generate vectors that express different permuted variants of a protein. In this chapter, we outline the protocol for constructing combinatorial libraries of circularly permuted proteins using transposase mutagenesis, and we describe the different permuteposons that have been developed to facilitate library construction.

  5. Ensembles of physical states and random quantum circuits on graphs

    NASA Astrophysics Data System (ADS)

    Hamma, Alioscia; Santra, Siddhartha; Zanardi, Paolo

    2012-11-01

    In this paper we continue and extend the investigations of the ensembles of random physical states introduced in Hamma [Phys. Rev. Lett. 109, 040502 (2012)]. These ensembles are constructed by finite-length random quantum circuits (RQC) acting on the (hyper)edges of an underlying (hyper)graph structure. The latter encodes the locality structure associated with finite-time quantum evolutions generated by physical, i.e., local, Hamiltonians. Our goal is to analyze physical properties of typical states in these ensembles; in particular, here we focus on proxies of quantum entanglement such as purity and α-Renyi entropies. The problem is formulated in terms of matrix elements of superoperators which depend on the graph structure, the choice of probability measure over the local unitaries, and the circuit length. In the α=2 case these superoperators act on a restricted multiqubit space generated by permutation operators associated to the subsets of vertices of the graph. For permutationally invariant interactions the dynamics can be further restricted to an exponentially smaller subspace. We consider different families of RQCs and study their typical entanglement properties for finite time as well as their asymptotic behavior. We find that the area law holds on average and that the volume law is a typical property of physical states (that is, it holds on average and the fluctuations around the average vanish for large systems). The area law arises when the evolution time is O(1) with respect to the size L of the system, while the volume law typically arises when the evolution time scales like O(L).

  6. Modulation of frustration in folding by sequence permutation.

    PubMed

    Nobrega, R Paul; Arora, Karunesh; Kathuria, Sagar V; Graceffa, Rita; Barrea, Raul A; Guo, Liang; Chakravarthy, Srinivas; Bilsel, Osman; Irving, Thomas C; Brooks, Charles L; Matthews, C Robert

    2014-07-22

    Folding of globular proteins can be envisioned as the contraction of a random coil unfolded state toward the native state on an energy surface rough with local minima trapping frustrated species. These substructures impede productive folding and can serve as nucleation sites for aggregation reactions. However, little is known about the relationship between frustration and its underlying sequence determinants. Chemotaxis response regulator Y (CheY), a 129-amino acid bacterial protein, has been shown previously to populate an off-pathway kinetic trap in the microsecond time range. The frustration has been ascribed to premature docking of the N- and C-terminal subdomains or, alternatively, to the formation of an unproductive local-in-sequence cluster of branched aliphatic side chains, isoleucine, leucine, and valine (ILV). The roles of the subdomains and ILV clusters in frustration were tested by altering the sequence connectivity using circular permutations. Surprisingly, the stability and buried surface area of the intermediate could be increased or decreased depending on the location of the termini. Comparison with the results of small-angle X-ray-scattering experiments and simulations points to the accelerated formation of a more compact, on-pathway species for the more stable intermediate. The effect of chain connectivity in modulating the structures and stabilities of the early kinetic traps in CheY is better understood in terms of the ILV cluster model. However, the subdomain model captures the requirement for an intact N-terminal domain to access the native conformation. Chain entropy and aliphatic-rich sequences play crucial roles in biasing the early events leading to frustration in the folding of CheY.

  7. Opposition-Based Memetic Algorithm and Hybrid Approach for Sorting Permutations by Reversals.

    PubMed

    Soncco-Álvarez, José Luis; Muñoz, Daniel M; Ayala-Rincón, Mauricio

    2018-02-21

    Sorting unsigned permutations by reversals is a difficult problem; indeed, it was proved to be NP-hard by Caprara (1997). Because of its high complexity, many approximation algorithms to compute the minimal reversal distance were proposed, until reaching the nowadays best-known theoretical ratio of 1.375. In this article, two memetic algorithms to compute the reversal distance are proposed. The first one uses the technique of opposition-based learning, leading to an opposition-based memetic algorithm (OBMA); the second one improves the previous algorithm by applying the heuristic of two-breakpoint elimination, leading to a hybrid approach. Several experiments were performed with one hundred randomly generated permutations, single benchmark permutations, and biological permutations. Results of the experiments showed that the proposed OBMA and Hybrid-OBMA algorithms achieve the best results for practical cases, that is, for permutations of length up to 120. Also, Hybrid-OBMA was shown to improve the results of OBMA for permutations of length greater than or equal to 60. The applicability of our proposed algorithms was checked by processing permutations based on biological data, in which case OBMA gave the best average results for all instances.

  8. Visual recognition of permuted words

    NASA Astrophysics Data System (ADS)

    Rashid, Sheikh Faisal; Shafait, Faisal; Breuel, Thomas M.

    2010-02-01

    In the current study we examine how letter permutation affects visual recognition of words in two orthographically dissimilar languages, Urdu and German. We present the hypothesis that recognition or reading of permuted and non-permuted words are two distinct mental-level processes, and that people use different strategies in handling permuted words as compared to normal words. A comparison of the reading behavior of people in these languages is also presented. We present our study in the context of dual-route theories of reading, and we observe that dual-route theory is consistent with our hypothesis of a distinction in the underlying cognitive behavior for reading permuted and non-permuted words. We conducted three experiments with lexical decision tasks to analyze how reading is degraded or affected by letter permutation. We performed analysis of variance (ANOVA), a distribution-free rank test, and a t-test to determine the significant differences in response time latencies for the two classes of data. Results showed that the recognition accuracy for permuted words decreased by 31% in the case of Urdu and 11% in the case of German. We also found a considerable difference in reading behavior between cursive and alphabetic languages, and we observed that reading of Urdu is comparatively slower than reading of German due to the characteristics of its cursive script.

  9. Four applications of permutation methods to testing a single-mediator model.

    PubMed

    Taylor, Aaron B; MacKinnon, David P

    2012-09-01

    Four applications of permutation tests to the single-mediator model are described and evaluated in this study. Permutation tests work by rearranging data in many possible ways in order to estimate the sampling distribution for the test statistic. The four applications to mediation evaluated here are the permutation test of ab, the permutation joint significance test, and the noniterative and iterative permutation confidence intervals for ab. A Monte Carlo simulation study was used to compare these four tests with the four best available tests for mediation found in previous research: the joint significance test, the distribution of the product test, and the percentile and bias-corrected bootstrap tests. We compared the different methods on Type I error, power, and confidence interval coverage. The noniterative permutation confidence interval for ab was the best performer among the new methods. It successfully controlled Type I error, had power nearly as good as the most powerful existing methods, and had better coverage than any existing method. The iterative permutation confidence interval for ab had lower power than do some existing methods, but it performed better than any other method in terms of coverage. The permutation confidence interval methods are recommended when estimating a confidence interval is a primary concern. SPSS and SAS macros that estimate these confidence intervals are provided.
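
    One of the four applications, a permutation test of the indirect effect ab, can be illustrated as follows. The null is imposed here simply by permuting the mediator, and the data are synthetic; the article's exact resampling schemes and the confidence-interval variants are not reproduced.

        # Sketch of a permutation test for the indirect effect a*b in a single-mediator model
        # (X -> M -> Y). The null is imposed by permuting the mediator, which breaks both paths;
        # the article's exact permutation schemes may differ. All data are synthetic.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        x = rng.normal(size=n)
        m = 0.4 * x + rng.normal(size=n)                     # mediator
        y = 0.3 * m + rng.normal(size=n)                     # outcome

        def ab(x, m, y):
            a = np.polyfit(x, m, 1)[0]                       # slope of M on X
            design = np.column_stack([np.ones(len(x)), m, x])
            b = np.linalg.lstsq(design, y, rcond=None)[0][1] # slope of Y on M, adjusting for X
            return a * b

        observed = ab(x, m, y)
        null = np.array([ab(x, rng.permutation(m), y) for _ in range(2000)])
        p_value = np.mean(np.abs(null) >= abs(observed))
        print(f"ab = {observed:.3f}, permutation p = {p_value:.4f}")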

  10. Circular Permutation of a Chaperonin Protein: Biophysics and Application to Nanotechnology

    NASA Technical Reports Server (NTRS)

    Paavola, Chad; Chan, Suzanne; Li, Yi-Fen; McMillan, R. Andrew; Trent, Jonathan

    2004-01-01

    We have designed five circular permutants of a chaperonin protein derived from the hyperthermophilic organism Sulfolobus shibatae. These permuted proteins were expressed in E. coli and are well-folded. Furthermore, all the permutants assemble into 18-mer double rings of the same form as the wild-type protein. We characterized the thermodynamics of folding for each permutant by both guanidine denaturation and differential scanning calorimetry. We also examined the assembly of chaperonin rings into higher order structures that may be used as nanoscale templates. The results show that circular permutation can be used to tune the thermodynamic properties of a protein template as well as facilitating the fusion of peptides, binding proteins or enzymes onto nanostructured templates.

  11. Surface self-organization: From wear to self-healing in biological and technical surfaces

    NASA Astrophysics Data System (ADS)

    Nosonovsky, Michael; Bhushan, Bharat

    2010-04-01

    Wear occurs at most solid surfaces that come in contact with other solid surfaces. While biological surfaces and tissues usually have the ability to self-heal, engineered self-healing materials have only started to emerge recently. These materials are currently created using a trial-and-error approach and phenomenological models, so there is a need for a general first-principles theory of self-healing. We discuss the conditions under which self-healing occurs and provide a general theoretical framework and criteria for self-healing using the concept of multiscale organization of entropy and non-equilibrium thermodynamics. The example of epicuticular wax regeneration on plant leaves is discussed as a case study.

  12. The structure of a thermophilic kinase shapes fitness upon random circular permutation

    PubMed Central

    Jones, Alicia M.; Mehta, Manan M.; Thomas, Emily E.; Atkinson, Joshua T.; Segall-Shapiro, Thomas H.; Liu, Shirley; Silberg, Jonathan J.

    2016-01-01

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement where native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein’s functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AK with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and they reveal a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection. PMID:26976658

  13. The Structure of a Thermophilic Kinase Shapes Fitness upon Random Circular Permutation.

    PubMed

    Jones, Alicia M; Mehta, Manan M; Thomas, Emily E; Atkinson, Joshua T; Segall-Shapiro, Thomas H; Liu, Shirley; Silberg, Jonathan J

    2016-05-20

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement in which native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein's functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AKs with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and it reveals a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection.

  14. Teaching Tip: When a Matrix and Its Inverse Are Stochastic

    ERIC Educational Resources Information Center

    Ding, J.; Rhee, N. H.

    2013-01-01

    A stochastic matrix is a square matrix with nonnegative entries and row sums 1. The simplest example is a permutation matrix, whose rows permute the rows of an identity matrix. A permutation matrix and its inverse are both stochastic. We prove the converse, that is, if a matrix and its inverse are both stochastic, then it is a permutation matrix.
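
    A quick numerical illustration of the statement (not a proof) is given below: a permutation matrix and its inverse are both stochastic, while a generic stochastic matrix typically has an inverse with negative entries and therefore fails to be stochastic.

        # Numerical illustration: a permutation matrix and its inverse (its transpose) are both
        # row-stochastic, while a non-permutation stochastic matrix has a non-stochastic inverse.
        import numpy as np

        P = np.eye(4)[[2, 0, 3, 1]]                  # rows of the identity, permuted
        assert np.allclose(P.sum(axis=1), 1)
        assert np.allclose(np.linalg.inv(P), P.T)    # the inverse is again a permutation matrix

        A = np.array([[0.50, 0.50],                  # stochastic, but not a permutation matrix
                      [0.25, 0.75]])
        print(np.linalg.inv(A))                      # has negative entries, hence not stochastic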

  15. Permutation-based inference for the AUC: A unified approach for continuous and discontinuous data.

    PubMed

    Pauly, Markus; Asendorf, Thomas; Konietschke, Frank

    2016-11-01

    We investigate rank-based studentized permutation methods for the nonparametric Behrens-Fisher problem, that is, inference methods for the area under the ROC curve. We prove that the studentized permutation distribution of the Brunner-Munzel rank statistic is asymptotically standard normal, even under the alternative, thus incidentally providing the hitherto missing theoretical foundation for the Neubert and Brunner studentized permutation test. In particular, we not only show its consistency, but also that confidence intervals for the underlying treatment effects can be computed by inverting this permutation test. In addition, we derive permutation-based range-preserving confidence intervals. Extensive simulation studies show that the permutation-based confidence intervals appear to maintain the preassigned coverage probability quite accurately (even for rather small sample sizes). For a convenient application of the proposed methods, a freely available software package for the statistical software R has been developed. A real data example illustrates the application. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
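
    The flavor of test analyzed above can be sketched by recomputing the Brunner-Munzel studentized statistic over random relabelings of the pooled sample. This is only an illustration of the idea; the authors' R package, the range-preserving confidence intervals, and the inversion of the test are not reproduced here.

        # Sketch of a studentized permutation test built on the Brunner-Munzel rank statistic:
        # recompute the statistic over random relabelings of the pooled sample and compare.
        import numpy as np
        from scipy.stats import brunnermunzel

        rng = np.random.default_rng(2)
        x = rng.normal(0.0, 1.0, size=20)
        y = rng.normal(0.5, 2.0, size=25)            # unequal variances: a Behrens-Fisher setting

        t_obs = brunnermunzel(x, y).statistic
        pooled = np.concatenate([x, y])
        t_perm = []
        for _ in range(2000):
            rng.shuffle(pooled)
            t_perm.append(brunnermunzel(pooled[:x.size], pooled[x.size:]).statistic)

        p_value = np.mean(np.abs(t_perm) >= abs(t_obs))
        print("studentized permutation p-value:", p_value)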

  16. Efficient Blockwise Permutation Tests Preserving Exchangeability

    PubMed Central

    Zhou, Chunxiao; Zwilling, Chris E.; Calhoun, Vince D.; Wang, Michelle Y.

    2014-01-01

    In this paper, we present a new blockwise permutation test approach based on the moments of the test statistic. The method is of importance to neuroimaging studies. In order to preserve the exchangeability condition required in permutation tests, we divide the entire set of data into certain exchangeability blocks. In addition, computationally efficient moments-based permutation tests are performed by approximating the permutation distribution of the test statistic with the Pearson distribution series. This involves the calculation of the first four moments of the permutation distribution within each block and then over the entire set of data. The accuracy and efficiency of the proposed method are demonstrated through simulated experiment on the magnetic resonance imaging (MRI) brain data, specifically the multi-site voxel-based morphometry analysis from structural MRI (sMRI). PMID:25289113

  17. Assessing multiscale complexity of short heart rate variability series through a model-based linear approach

    NASA Astrophysics Data System (ADS)

    Porta, Alberto; Bari, Vlasta; Ranuzzi, Giovanni; De Maria, Beatrice; Baselli, Giuseppe

    2017-09-01

    We propose a multiscale complexity (MSC) method that assesses irregularity in assigned frequency bands and is appropriate for analyzing short time series. It is grounded on the identification of the coefficients of an autoregressive model, on the computation of the mean position of the poles generating the components of the power spectral density in an assigned frequency band, and on the assessment of its distance from the unit circle in the complex plane. The MSC method was tested on simulations and applied to short heart period (HP) variability series recorded during graded head-up tilt in 17 subjects (age from 21 to 54 years, median = 28 years, 7 females) and during paced breathing protocols in 19 subjects (age from 27 to 35 years, median = 31 years, 11 females) to assess the contribution of time scales typical of the cardiac autonomic control, namely in the low frequency (LF, from 0.04 to 0.15 Hz) and high frequency (HF, from 0.15 to 0.5 Hz) bands, to the complexity of the cardiac regulation. The proposed MSC technique was compared to a traditional model-free multiscale method grounded on information theory, i.e., multiscale entropy (MSE). The approach suggests that the reduction of HP variability complexity observed during graded head-up tilt is due to a regularization of the HP fluctuations in the LF band via a possible intervention of sympathetic control, and that the decrement of HP variability complexity observed during slow breathing is the result of the regularization of the HP variations in both LF and HF bands, thus implying the action of physiological mechanisms working at time scales even different from that of respiration. MSE did not distinguish the experimental conditions at time scales larger than 1. Over short time series, MSC allows a more insightful association between cardiac control complexity and the physiological mechanisms modulating cardiac rhythm compared to a more traditional tool such as MSE.
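
    The ingredients named above (an autoregressive fit, the poles of the model, and their distance from the unit circle within an assigned band) can be sketched roughly as below. The model order, band edges, sampling assumption, and averaging rule are illustrative choices, not the authors' exact MSC index.

        # Rough sketch of a pole-based band "complexity" index: fit an AR model, find its poles,
        # and average the radius of the poles whose frequencies fall in the assigned band.
        # Model order, band edges, and the averaging rule are illustrative, not the exact MSC index.
        import numpy as np
        from statsmodels.regression.linear_model import yule_walker

        fs = 1.0                                      # assume the series is resampled at 1 Hz (illustrative)
        rng = np.random.default_rng(3)
        t = np.arange(300)
        series = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.normal(size=t.size)   # synthetic HP-like series

        rho, sigma = yule_walker(series, order=10, method="mle")
        poles = np.roots(np.concatenate(([1.0], -rho)))                # roots of the AR polynomial

        freqs = np.abs(np.angle(poles)) * fs / (2 * np.pi)             # pole frequencies in Hz
        in_lf = (freqs >= 0.04) & (freqs <= 0.15)                      # LF band from the abstract
        lf_radius = np.abs(poles[in_lf]).mean() if in_lf.any() else np.nan
        print("mean LF pole radius (closer to 1 means more regular):", lf_radius)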

  18. An AUC-based permutation variable importance measure for random forests

    PubMed Central

    2013-01-01

    Background The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. Results We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. Conclusions The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html. PMID:23560875

  19. An AUC-based permutation variable importance measure for random forests.

    PubMed

    Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure

    2013-04-05

    The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.
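
    The article's AUC-based permutation VIM is implemented in the R package party for conditional inference forests; a loosely analogous sketch with scikit-learn's permutation importance and an AUC scorer is shown below for comparison with the accuracy-based VIM on an unbalanced synthetic problem.

        # Loosely analogous sketch: permutation variable importance scored with the AUC rather
        # than accuracy, via scikit-learn. This is not the party implementation from the article.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.inspection import permutation_importance

        X, y = make_classification(n_samples=600, n_features=10, n_informative=3,
                                   weights=[0.9, 0.1], random_state=0)   # unbalanced classes
        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

        auc_vim = permutation_importance(rf, X, y, scoring="roc_auc", n_repeats=20, random_state=0)
        acc_vim = permutation_importance(rf, X, y, scoring="accuracy", n_repeats=20, random_state=0)
        for j in np.argsort(auc_vim.importances_mean)[::-1][:5]:
            print(f"feature {j}: AUC-based VIM {auc_vim.importances_mean[j]:.4f}, "
                  f"accuracy-based VIM {acc_vim.importances_mean[j]:.4f}")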

  20. Nonlinear dynamical systems effects of homeopathic remedies on multiscale entropy and correlation dimension of slow wave sleep EEG in young adults with histories of coffee-induced insomnia.

    PubMed

    Bell, Iris R; Howerter, Amy; Jackson, Nicholas; Aickin, Mikel; Bootzin, Richard R; Brooks, Audrey J

    2012-07-01

    Investigators of homeopathy have proposed that nonlinear dynamical systems (NDS) and complex systems science offer conceptual and analytic tools for evaluating homeopathic remedy effects. Previous animal studies demonstrate that homeopathic medicines alter delta electroencephalographic (EEG) slow wave sleep. The present study extended findings of remedy-related sleep stage alterations in human subjects by testing the feasibility of using two different NDS analytic approaches to assess remedy effects on human slow wave sleep EEG. Subjects (N=54) were young adult male and female college students with a history of coffee-related insomnia who participated in a larger 4-week study of the polysomnographic effects of homeopathic medicines on home-based all-night sleep recordings. Subjects took one bedtime dose of a homeopathic remedy (Coffea cruda or Nux vomica 30c). We computed multiscale entropy (MSE) and the correlation dimension (Mekler-D2) for stages 3 and 4 slow wave sleep EEG sampled in artifact-free 2-min segments during the first two rapid-eye-movement (REM) cycles for remedy and post-remedy nights, controlling for placebo and post-placebo night effects. MSE results indicate significant, remedy-specific directional effects, especially later in the night (REM cycle 2) (CC: remedy night increases and post-remedy night decreases in MSE at multiple sites for both stages 3 and 4 in both REM cycles; NV: remedy night decreases and post-remedy night increases, mainly in stage 3 REM cycle 2 MSE). D2 analyses yielded more sporadic and inconsistent findings. Homeopathic medicines Coffea cruda and Nux vomica in 30c potencies alter short-term nonlinear dynamic parameters of slow wave sleep EEG in healthy young adults. MSE may provide a more sensitive NDS analytic method than D2 for evaluating homeopathic remedy effects on human sleep EEG patterns. Copyright © 2012 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  1. Nonlinear Dynamical Systems Effects of Homeopathic Remedies on Multiscale Entropy and Correlation Dimension of Slow Wave Sleep EEG in Young Adults with Histories of Coffee-Induced Insomnia

    PubMed Central

    Bell, Iris R.; Howerter, Amy; Jackson, Nicholas; Aickin, Mikel; Bootzin, Richard R.; Brooks, Audrey J.

    2012-01-01

    Background Investigators of homeopathy have proposed that nonlinear dynamical systems (NDS) and complex systems science offer conceptual and analytic tools for evaluating homeopathic remedy effects. Previous animal studies demonstrate that homeopathic medicines alter delta electroencephalographic (EEG) slow wave sleep. The present study extended findings of remedy-related sleep stage alterations in human subjects by testing the feasibility of using two different NDS analytic approaches to assess remedy effects on human slow wave sleep EEG. Methods Subjects (N=54) were young adult male and female college students with a history of coffee-related insomnia who participated in a larger 4-week study of the polysomnographic effects of homeopathic medicines on home-based all-night sleep recordings. Subjects took one bedtime dose of a homeopathic remedy (Coffea cruda or Nux vomica 30c). We computed multiscale entropy (MSE) and the correlation dimension (Mekler-D2) for stage 3 and 4 slow wave sleep EEG sampled in artifact-free 2-minute segments during the first two rapid-eye-movement (REM) cycles for remedy and post-remedy nights, controlling for placebo and post-placebo night effects. Results MSE results indicate significant, remedy-specific directional effects, especially later in the night (REM cycle 2) (CC: remedy night increases and post-remedy night decreases in MSE at multiple sites for both stages 3 and 4 in both REM cycles; NV: remedy night decreases and post-remedy night increases, mainly in stage 3 REM cycle 2 MSE). D2 analyses yielded more sporadic and inconsistent findings. Conclusions Homeopathic medicines Coffea cruda and Nux vomica in 30c potencies alter short-term nonlinear dynamic parameters of slow wave sleep EEG in healthy young adults. MSE may provide a more sensitive NDS analytic method than D2 for evaluating homeopathic remedy effects on human sleep EEG patterns. PMID:22818237

  2. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S [Santa Fe, NM; Cabantous, Stephanie [Los Alamos, NM

    2008-06-24

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  3. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S; Cabantous, Stephanie

    2013-02-12

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  4. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S [Santa Fe, NM; Cabantous, Stephanie [Los Alamos, NM

    2011-06-14

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  5. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S.; Cabantous, Stephanie

    2013-04-16

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  6. Analysis of Meniscus Fluctuation in a Continuous Casting Slab Mold

    NASA Astrophysics Data System (ADS)

    Zhang, Kaitian; Liu, Jianhua; Cui, Heng; Xiao, Chao

    2018-06-01

    A water model of a slab mold was established to analyze the microscopic and macroscopic fluctuation of the meniscus. The fast Fourier transform and wavelet entropy were adopted to analyze the wave amplitude, frequency, and components of the fluctuation. The flow patterns under the meniscus were measured using particle image velocimetry, and the mechanisms of meniscus fluctuation were then discussed. The results showed that wavelet entropy has multi-scale and statistical properties and is suitable for studying the details of meniscus fluctuation in both the time and frequency domains. A basic wave, whose frequency exceeds 1 Hz in the absence of mold oscillation, was demonstrated in this work. In fact, three basic waves were found: a long-wave with low frequency, a middle-wave with middle frequency, and a short-wave with high frequency. In addition, the upper roll flow in the mold had a significant effect on meniscus fluctuation. When the flow impingement position was far from the meniscus, the long-wave dominated the fluctuation and the stability of the meniscus was enhanced. However, when the flow velocity was increased, the short-wave dominated the meniscus fluctuation and the meniscus stability decreased.

  7. Coarse-grained models using local-density potentials optimized with the relative entropy: Application to implicit solvation

    NASA Astrophysics Data System (ADS)

    Sanyal, Tanmoy; Shell, M. Scott

    2016-07-01

    Bottom-up multiscale techniques are frequently used to develop coarse-grained (CG) models for simulations at extended length and time scales but are often limited by a compromise between computational efficiency and accuracy. The conventional approach to CG nonbonded interactions uses pair potentials which, while computationally efficient, can neglect the inherently multibody contributions of the local environment of a site to its energy, due to degrees of freedom that were coarse-grained out. This effect often causes the CG potential to depend strongly on the overall system density, composition, or other properties, which limits its transferability to states other than the one at which it was parameterized. Here, we propose to incorporate multibody effects into CG potentials through additional nonbonded terms, beyond pair interactions, that depend in a mean-field manner on local densities of different atomic species. This approach is analogous to embedded atom and bond-order models that seek to capture multibody electronic effects in metallic systems. We show that the relative entropy coarse-graining framework offers a systematic route to parameterizing such local density potentials. We then characterize this approach in the development of implicit solvation strategies for interactions between model hydrophobes in an aqueous environment.

  8. Sample entropy analysis for the estimating depth of anaesthesia through human EEG signal at different levels of unconsciousness during surgeries.

    PubMed

    Liu, Quan; Ma, Li; Fan, Shou-Zen; Abbod, Maysam F; Shieh, Jiann-Shing

    2018-01-01

    Estimating the depth of anaesthesia (DoA) in operations has always been a challenging issue due to the underlying complexity of the brain mechanisms. Electroencephalogram (EEG) signals are undoubtedly the most widely used signals for measuring DoA. In this paper, a novel EEG-based index is proposed to evaluate DoA for 24 patients receiving general anaesthesia with different levels of unconsciousness. Sample Entropy (SampEn) algorithm was utilised in order to acquire the chaotic features of the signals. After calculating the SampEn from the EEG signals, Random Forest was utilised for developing learning regression models with Bispectral index (BIS) as the target. Correlation coefficient, mean absolute error, and area under the curve (AUC) were used to verify the perioperative performance of the proposed method. Validation comparisons with typical nonstationary signal analysis methods (i.e., recurrence analysis and permutation entropy) and regression methods (i.e., neural network and support vector machine) were conducted. To further verify the accuracy and validity of the proposed methodology, the data is divided into four unconsciousness-level groups on the basis of BIS levels. Subsequently, analysis of variance (ANOVA) was applied to the corresponding index (i.e., regression output). Results indicate that the correlation coefficient improved to 0.72 ± 0.09 after filtering and to 0.90 ± 0.05 after regression from the initial values of 0.51 ± 0.17. Similarly, the final mean absolute error dramatically declined to 5.22 ± 2.12. In addition, the ultimate AUC increased to 0.98 ± 0.02, and the ANOVA analysis indicates that each of the four groups of different anaesthetic levels demonstrated significant difference from the nearest levels. Furthermore, the Random Forest output was extensively linear in relation to BIS, thus with better DoA prediction accuracy. In conclusion, the proposed method provides a concrete basis for monitoring patients' anaesthetic level during surgeries.
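
    The two main ingredients above, a sample entropy feature and a random-forest regression toward a BIS-like target, can be sketched as follows. The embedding dimension, tolerance, synthetic epochs, and target are placeholder choices, not the study's recordings or preprocessing.

        # Compact sample entropy (SampEn) plus a random-forest regression toward a BIS-like target.
        # Embedding dimension m, tolerance r, the synthetic epochs and the target are placeholders.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        def sample_entropy(x, m=2, r=0.2):
            x = np.asarray(x, dtype=float)
            tol = r * x.std()
            def matches(length):
                t = np.array([x[i:i + length] for i in range(len(x) - length)])
                d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)     # Chebyshev distances
                return (np.sum(d <= tol) - len(t)) / 2.0                # exclude self-matches
            b, a = matches(m), matches(m + 1)
            return -np.log(a / b) if a > 0 and b > 0 else np.inf

        rng = np.random.default_rng(4)
        epochs = [np.sin(0.1 * np.arange(500)) * w + rng.normal(size=500)
                  for w in rng.uniform(0.0, 3.0, size=60)]              # synthetic EEG-like epochs
        X = np.array([[sample_entropy(e)] for e in epochs])             # one feature per epoch
        bis = 40 + 25 * (X[:, 0] - X[:, 0].min())                       # synthetic BIS-like target
        model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, bis)
        print("R^2 on the training epochs:", model.score(X, bis))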

  9. Probing and exploiting the chaotic dynamics of a hydrodynamic photochemical oscillator to implement all the basic binary logic functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayashi, Kenta (Department of Chemistry, Biology, and Biotechnology, University of Perugia, 06123 Perugia); Gotoda, Hiroshi

    2016-05-15

    The convective motions within a solution of a photochromic spiro-oxazine irradiated by UV only on the bottom part of its volume give rise to aperiodic spectrophotometric dynamics. In this paper, we study three nonlinear properties of the aperiodic time series: permutation entropy, short-term predictability and long-term unpredictability, and the degree distribution of the visibility graph networks. After ascertaining the extracted chaotic features, we show how the aperiodic time series can be exploited to implement all the fundamental two-input binary logic functions (AND, OR, NAND, NOR, XOR, and XNOR) and some basic arithmetic operations (half-adder, full-adder, half-subtractor). This is possible due to the wide range of states a nonlinear system accesses in the course of its evolution. Therefore, the solution of the convective photochemical oscillator results in hardware for chaos-computing, an alternative to conventional complementary metal-oxide semiconductor-based integrated circuits.

  10. Modelling the structure of Zr-rich Pb(Zr1-xTix)O3, x = 0.4 by a multiphase approach.

    PubMed

    Bogdanov, Alexander; Mysovsky, Andrey; Pickard, Chris J; Kimmel, Anna V

    2016-10-12

    Solid solution perovskite Pb(Zr1-xTix)O3 (PZT) is an industrially important material. Despite the long history of experimental and theoretical studies, the structure of this material is still under intensive discussion. In this work, we have applied structure searching coupled with density functional theory methods to provide a multiphase description of this material at x = 0.4. We demonstrate that the permutational freedom of B-site cations leads to the stabilisation of a variety of local phases reflecting a relatively flat energy landscape of PZT. Using a set of predicted local phases we reproduce the experimental pair distribution function (PDF) profile with high accuracy. We introduce a complex multiphase picture of the structure of PZT and show that additional monoclinic and rhombohedral phases account for a better description of the experimental PDF profile. We propose that such a multiphase picture reflects the entropy reached in the sample during the preparation process.

  11. Fourier-Mellin moment-based intertwining map for image encryption

    NASA Astrophysics Data System (ADS)

    Kaur, Manjit; Kumar, Vijay

    2018-03-01

    In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and an intertwining logistic map is proposed. The Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity to the input image. A Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of the intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on the input image using the secret keys. The performance of the proposed image encryption technique has been evaluated on five well-known benchmark images and also compared with seven well-known existing encryption techniques. The experimental results reveal that the proposed technique outperforms the others in terms of entropy, correlation analysis, unified average changing intensity, and number of pixels change rate. The simulation results reveal that the proposed technique provides a high level of security and robustness against various types of attacks.

  12. A new methodology for automated diagnosis of mild cognitive impairment (MCI) using magnetoencephalography (MEG).

    PubMed

    Amezquita-Sanchez, Juan P; Adeli, Anahita; Adeli, Hojjat

    2016-05-15

    Mild cognitive impairment (MCI) is a cognitive disorder characterized by memory impairment greater than expected for age. A new methodology is presented to identify MCI patients during a working memory task using MEG signals. The methodology consists of four steps: In step 1, the complete ensemble empirical mode decomposition (CEEMD) is used to decompose the MEG signal into a set of adaptive sub-bands according to its frequency content. In step 2, a nonlinear dynamics measure based on permutation entropy (PE) analysis is employed to analyze the sub-bands and detect features to be used for MCI detection. In step 3, analysis of variance (ANOVA) is used for feature selection. In step 4, the enhanced probabilistic neural network (EPNN) classifier is applied to the selected features to distinguish between MCI and healthy patients. The usefulness and effectiveness of the proposed methodology are validated using MEG data obtained experimentally from 18 MCI and 19 control patients. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. An improvement of the measurement of time series irreversibility with visibility graph approach

    NASA Astrophysics Data System (ADS)

    Wu, Zhenyu; Shang, Pengjian; Xiong, Hui

    2018-07-01

    We propose a method to improve the measurement of real-valued time series irreversibility which combines two tools: the directed horizontal visibility graph and the Kullback-Leibler divergence. The degree of time irreversibility is estimated by the Kullback-Leibler divergence between the in- and out-degree distributions of the associated visibility graph. In our work, we reframe the in- and out-degree distributions by encoding them with the different embedded dimensions used in calculating permutation entropy (PE). With this improved method, we can not only estimate time series irreversibility efficiently, but also detect time series irreversibility across multiple dimensions. We verify the validity of our method and then estimate the amount of time irreversibility of series generated by chaotic maps as well as global stock markets over the period 2005-2015. The results show that the amount of time irreversibility reaches its peak at embedded dimension d = 3 for both the chaotic-map experiments and the financial markets.
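
    A bare-bones version of the two tools named above (the directed horizontal visibility graph and the Kullback-Leibler divergence between its in- and out-degree distributions) is sketched below on a logistic-map series; the paper's refinement using several embedded dimensions is not reproduced.

        # Bare-bones time-irreversibility estimate: directed horizontal visibility graph,
        # then the Kullback-Leibler divergence between its out- and in-degree distributions.
        import numpy as np
        from collections import Counter

        def hvg_degrees(x):
            n = len(x)
            k_out, k_in = np.zeros(n, dtype=int), np.zeros(n, dtype=int)
            for i in range(n):
                highest_between = -np.inf
                for j in range(i + 1, n):
                    if highest_between < min(x[i], x[j]):     # horizontal visibility criterion
                        k_out[i] += 1
                        k_in[j] += 1
                    highest_between = max(highest_between, x[j])
                    if highest_between >= x[i]:               # i cannot see past this point
                        break
            return k_out, k_in

        def kl_divergence(p_counts, q_counts):
            ks = sorted(set(p_counts) | set(q_counts))
            p = np.array([p_counts.get(k, 0) for k in ks], float) + 1e-12
            q = np.array([q_counts.get(k, 0) for k in ks], float) + 1e-12
            p, q = p / p.sum(), q / q.sum()
            return float(np.sum(p * np.log(p / q)))

        x = np.empty(2000)
        x[0] = 0.4
        for t in range(1999):
            x[t + 1] = 4.0 * x[t] * (1.0 - x[t])              # logistic map, a standard irreversible series
        k_out, k_in = hvg_degrees(x)
        print("KLD(out || in):", kl_divergence(Counter(k_out), Counter(k_in)))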

  14. Image encryption using a synchronous permutation-diffusion technique

    NASA Astrophysics Data System (ADS)

    Enayatifar, Rasul; Abdullah, Abdul Hanan; Isnin, Ismail Fauzi; Altameem, Ayman; Lee, Malrey

    2017-03-01

    In the past decade, interest in digital image security has increased among scientists. A synchronous permutation and diffusion technique is designed to protect gray-level image content while it is sent over the internet. To implement the proposed method, the two-dimensional plain-image is converted to one dimension. Afterward, to reduce the sending process time, the permutation and diffusion steps for each pixel are performed at the same time. The permutation step uses a chaotic map and deoxyribonucleic acid (DNA) to permute a pixel, while diffusion employs a DNA sequence and a DNA operator to encrypt the pixel. Experimental results and extensive security analyses demonstrate the feasibility and validity of the proposed image encryption method.
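
    A toy version of a synchronous permutation-diffusion pass, driven here by a logistic map, is sketched below. The DNA encoding and DNA operators of the article are omitted, and the key handling is purely illustrative.

        # Toy synchronous permutation-diffusion pass over a flattened gray-level image, driven by
        # a logistic map. The DNA coding of the article is omitted; keys are purely illustrative.
        import numpy as np

        def encrypt(img, x0=0.3456, mu=3.99):
            flat = img.flatten().astype(np.uint8)
            n = flat.size
            x, chaos = x0, np.empty(n)
            for i in range(n):                        # one logistic-map iterate per pixel
                x = mu * x * (1.0 - x)
                chaos[i] = x
            perm = np.argsort(chaos)                  # chaotic sequence defines the permutation
            keystream = np.floor(chaos * 256).astype(np.uint8)
            cipher = np.empty(n, dtype=np.uint8)
            prev = np.uint8(0)
            for i in range(n):                        # permutation and diffusion in the same pass
                prev = flat[perm[i]] ^ keystream[i] ^ prev
                cipher[i] = prev
            return cipher.reshape(img.shape), perm

        img = np.random.default_rng(6).integers(0, 256, size=(8, 8), dtype=np.uint8)
        cipher, perm = encrypt(img)
        print(cipher)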

  15. Nonlinear Dynamic Characteristics of Oil-in-Water Emulsions

    NASA Astrophysics Data System (ADS)

    Yin, Zhaoqi; Han, Yunfeng; Ren, Yingyu; Yang, Qiuyi; Jin, Ningde

    2016-08-01

    In this article, the nonlinear dynamic characteristics of oil-in-water emulsions under the addition of surfactant were experimentally investigated. First, based on a vertical upward oil-water two-phase flow experiment in a 20 mm inner diameter (ID) test pipe, dynamic response signals of oil-in-water emulsions were recorded using a vertical multiple electrode array (VMEA) sensor. Afterwards, the recurrence plot (RP) algorithm and the multi-scale weighted complexity entropy causality plane (MS-WCECP) were employed to analyse the nonlinear characteristics of the signals. The results show that the certainty decreases and the randomness increases with increasing surfactant concentration. This article provides a novel method for revealing the nonlinear dynamic characteristics, complexity, and randomness of oil-in-water emulsions from experimental measurement signals.

  16. A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection

    PubMed Central

    Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B

    2015-01-01

    Summary We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO-penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO. In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050
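
    The core idea, choosing the penalty from the penalties at which the LASSO selects nothing when the response is permuted, can be sketched roughly as below; the quantile used and the handling of the regularization path are simplifications, not the published procedure.

        # Rough sketch of permutation selection of the LASSO penalty: for each permuted response,
        # record the smallest penalty at which the LASSO selects nothing (max |X'y_perm| / n),
        # then take a quantile of those values. The quantile here is an illustrative choice.
        import numpy as np
        from sklearn.linear_model import Lasso
        from sklearn.preprocessing import scale

        rng = np.random.default_rng(9)
        n, p = 100, 50
        X = scale(rng.normal(size=(n, p)))
        beta = np.zeros(p)
        beta[:3] = 1.0                                   # three truly active variables
        y = X @ beta + rng.normal(size=n)
        y = y - y.mean()

        lam_null = [np.max(np.abs(X.T @ rng.permutation(y))) / n for _ in range(200)]
        lam_perm = float(np.quantile(lam_null, 0.75))

        fit = Lasso(alpha=lam_perm).fit(X, y)
        print("penalty:", round(lam_perm, 4), "selected variables:", np.flatnonzero(fit.coef_))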

  17. Empirical and Theoretical Aspects of Generation and Transfer of Information in a Neuromagnetic Source Network

    PubMed Central

    Vakorin, Vasily A.; Mišić, Bratislav; Krakovska, Olga; McIntosh, Anthony Randal

    2011-01-01

    Variability in source dynamics across the sources in an activated network may be indicative of how information is processed within the network. Information-theoretic tools allow one not only to characterize local brain dynamics but also to describe interactions between distributed brain activity. This study follows such a framework and explores the relations between signal variability and asymmetry in mutual interdependencies in a data-driven pipeline of non-linear analysis of neuromagnetic sources reconstructed from human magnetoencephalographic (MEG) data collected during a face recognition task. Asymmetry in non-linear interdependencies in the network was analyzed using transfer entropy, which quantifies predictive information transfer between the sources. Variability of the source activity was estimated using multi-scale entropy, which quantifies the rate at which information is generated. The empirical results are supported by an analysis of synthetic data based on the dynamics of coupled systems with time delay in the coupling. We found that the amount of information transferred from one source to another was correlated with the difference in variability between the dynamics of these two sources, with the directionality of net information transfer depending on the time scale at which the sample entropy was computed. The results based on synthetic data suggest that both the time delay and the strength of coupling can contribute to the relations between variability of brain signals and information transfer between them. Our findings support previous attempts to characterize the functional organization of the activated brain based on a combination of non-linear dynamics and temporal features of brain connectivity, such as time delay. PMID:22131968

  18. Multi-scale radiomic analysis of sub-cortical regions in MRI related to autism, gender and age

    NASA Astrophysics Data System (ADS)

    Chaddad, Ahmad; Desrosiers, Christian; Toews, Matthew

    2017-03-01

    We propose using multi-scale image textures to investigate links between neuroanatomical regions and clinical variables in MRI. Texture features are derived at multiple scales of resolution based on the Laplacian-of-Gaussian (LoG) filter. Three quantifier functions (Average, Standard Deviation and Entropy) are used to summarize texture statistics within standard, automatically segmented neuroanatomical regions. Significance tests are performed to identify regional texture differences between ASD and TDC groups and between male and female groups, as well as correlations with age (corrected p < 0.05). The open-access brain imaging data exchange (ABIDE) brain MRI dataset is used to evaluate texture features derived from 31 brain regions from 1112 subjects, including 573 typically developing control (TDC, 99 females, 474 males) and 539 Autism spectrum disorder (ASD, 65 female and 474 male) subjects. Statistically significant texture differences between the ASD and TDC groups are identified asymmetrically in the right hippocampus, left choroid-plexus and corpus callosum (CC), and symmetrically in the cerebellar white matter. Sex-related texture differences in TDC subjects are found primarily in the left amygdala, left cerebellar white matter, and brain stem. Correlations between age and texture in TDC subjects are found in the thalamus-proper, caudate and pallidum, most exhibiting bilateral symmetry.
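
    The multi-scale texture features described above can be sketched with standard tools: filter the volume with a Laplacian-of-Gaussian at several scales and summarize each region mask with the three quantifiers. The random volume, region mask, and scale values below are stand-ins for an MRI scan and a segmented sub-cortical region.

        # Sketch of multi-scale LoG texture features: filter the volume at several scales and
        # summarize a region mask with the three quantifiers (mean, standard deviation, entropy).
        # The random volume, mask, and scales are stand-ins, not the ABIDE data.
        import numpy as np
        from scipy.ndimage import gaussian_laplace

        def region_texture(volume, mask, sigmas=(0.5, 1.0, 1.5, 2.5, 3.5)):
            feats = {}
            for s in sigmas:
                vals = gaussian_laplace(volume, sigma=s)[mask]
                hist, _ = np.histogram(vals, bins=64)
                p = hist[hist > 0] / hist.sum()
                feats[s] = {"mean": float(vals.mean()),
                            "sd": float(vals.std()),
                            "entropy": float(-np.sum(p * np.log2(p)))}
            return feats

        rng = np.random.default_rng(7)
        volume = rng.normal(size=(32, 32, 32))            # stand-in for a T1-weighted volume
        mask = np.zeros(volume.shape, dtype=bool)
        mask[10:20, 10:20, 10:20] = True                  # stand-in for a segmented region
        for sigma, f in region_texture(volume, mask).items():
            print(sigma, f)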

  19. Multiscale weighted colored graphs for protein flexibility and rigidity analysis

    NASA Astrophysics Data System (ADS)

    Bramer, David; Wei, Guo-Wei

    2018-02-01

    Protein structural fluctuation, measured by Debye-Waller factors or B-factors, is known to correlate to protein flexibility and function. A variety of methods has been developed for protein Debye-Waller factor prediction and related applications to domain separation, docking pose ranking, entropy calculation, hinge detection, stability analysis, etc. Nevertheless, none of the current methodologies are able to deliver an accuracy of 0.7 in terms of the Pearson correlation coefficients averaged over a large set of proteins. In this work, we introduce a paradigm-shifting geometric graph model, multiscale weighted colored graph (MWCG), to provide a new generation of computational algorithms to significantly change the current status of protein structural fluctuation analysis. Our MWCG model divides a protein graph into multiple subgraphs based on interaction types between graph nodes and represents the protein rigidity by generalized centralities of subgraphs. MWCGs not only predict the B-factors of protein residues but also accurately analyze the flexibility of all atoms in a protein. The MWCG model is validated over a number of protein test sets and compared with many standard methods. An extensive numerical study indicates that the proposed MWCG offers an accuracy of over 0.8 and thus provides perhaps the first reliable method for estimating protein flexibility and B-factors. It also simultaneously predicts all-atom flexibility in a molecule.

  20. Multiscale intensity homogeneity transformation method and its application to computer-aided detection of pulmonary embolism in computed tomographic pulmonary angiography (CTPA)

    NASA Astrophysics Data System (ADS)

    Guo, Yanhui; Zhou, Chuan; Chan, Heang-Ping; Wei, Jun; Chughtai, Aamer; Sundaram, Baskaran; Hadjiiski, Lubomir M.; Patel, Smita; Kazerooni, Ella A.

    2013-04-01

    A 3D multiscale intensity homogeneity transformation (MIHT) method was developed to reduce false positives (FPs) in our previously developed CAD system for pulmonary embolism (PE) detection. In MIHT, the voxel intensity of a PE candidate region was transformed to an intensity homogeneity value (IHV) with respect to the local median intensity. The IHVs were calculated in multiscales (MIHVs) to measure the intensity homogeneity, taking into account vessels of different sizes and different degrees of occlusion. Seven new features including the entropy, gradient, and moments that characterized the intensity distributions of the candidate regions were derived from the MIHVs and combined with the previously designed features that described the shape and intensity of PE candidates for the training of a linear classifier to reduce the FPs. 59 CTPA PE cases were collected from our patient files (UM set) with IRB approval and 69 cases from the PIOPED II data set with access permission. 595 and 800 PEs were identified as reference standard by experienced thoracic radiologists in the UM and PIOPED set, respectively. FROC analysis was used for performance evaluation. Compared with our previous CAD system, at a test sensitivity of 80%, the new method reduced the FP rate from 18.9 to 14.1/scan for the PIOPED set when the classifier was trained with the UM set and from 22.6 to 16.0/scan vice versa. The improvement was statistically significant (p<0.05) by JAFROC analysis. This study demonstrated that the MIHT method is effective in reducing FPs and improving the performance of the CAD system.

  1. Overlap Cycles for Permutations: Necessary and Sufficient Conditions

    DTIC Science & Technology

    2013-09-19

    for Weak Orders, to appear in SIAM Journal of Discrete Math. [9] G. Hurlbert and G. Isaak, Equivalence class universal cycles for permutations, Discrete Math. 149 (1996), pp. 123–129. [10] J. R. Johnson, Universal cycles for permutations, Discrete Math. 309 (2009), pp. 5264–5270. [11] E. A. Ragland

  2. Multi-response permutation procedure as an alternative to the analysis of variance: an SPSS implementation.

    PubMed

    Cai, Li

    2006-02-01

    A permutation test typically requires fewer assumptions than does a comparable parametric counterpart. The multi-response permutation procedure (MRPP) is a class of multivariate permutation tests of group difference useful for the analysis of experimental data. However, psychologists seldom make use of the MRPP in data analysis, in part because the MRPP is not implemented in popular statistical packages that psychologists use. A set of SPSS macros implementing the MRPP test is provided in this article. The use of the macros is illustrated by analyzing example data sets.
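    For readers without SPSS, the following minimal Python sketch shows the core of an MRPP-style test: a weighted mean of within-group pairwise distances compared against its permutation distribution over group relabelings. The weighting scheme (n_i/N) and the Euclidean distance are common choices but are assumptions here; the SPSS macros in the article may differ in detail.

```python
import numpy as np
from itertools import combinations

def mrpp(data, labels, n_perm=999, seed=0):
    """Minimal multi-response permutation procedure (MRPP) sketch.

    delta = weighted mean of average within-group pairwise Euclidean distances
    (weights n_i / N); smaller delta means tighter groups. The p-value is the
    fraction of label permutations with delta <= the observed delta.
    """
    rng = np.random.default_rng(seed)
    data, labels = np.asarray(data, float), np.asarray(labels)
    dist = np.linalg.norm(data[:, None, :] - data[None, :, :], axis=-1)

    def delta(lab):
        d = 0.0
        for g in np.unique(lab):
            idx = np.flatnonzero(lab == g)
            pairs = list(combinations(idx, 2))
            within = np.mean([dist[i, j] for i, j in pairs]) if pairs else 0.0
            d += len(idx) / len(lab) * within
        return d

    observed = delta(labels)
    perm_deltas = [delta(rng.permutation(labels)) for _ in range(n_perm)]
    p = (1 + sum(pd <= observed for pd in perm_deltas)) / (n_perm + 1)
    return observed, p

X = np.vstack([np.random.default_rng(1).normal(0.0, 1, (10, 2)),
               np.random.default_rng(2).normal(1.5, 1, (10, 2))])
y = np.array([0] * 10 + [1] * 10)
print(mrpp(X, y))
```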

  3. Using R to Simulate Permutation Distributions for Some Elementary Experimental Designs

    ERIC Educational Resources Information Center

    Eudey, T. Lynn; Kerr, Joshua D.; Trumbo, Bruce E.

    2010-01-01

    Null distributions of permutation tests for two-sample, paired, and block designs are simulated using the R statistical programming language. For each design and type of data, permutation tests are compared with standard normal-theory and nonparametric tests. These examples (often using real data) provide for classroom discussion use of metrics…
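    The article works in R; purely to illustrate how such a null distribution is simulated, here is a Python sketch for the paired design, where exchangeability under the null hypothesis is expressed by random sign flips of the within-pair differences. The data are made up.

```python
import numpy as np

def paired_permutation_pvalue(diffs, n_perm=10000, seed=0):
    """Simulated null distribution for a paired design via random sign flips.

    Under H0 the sign of each within-pair difference is exchangeable, so the
    mean difference is recomputed after flipping signs at random.
    """
    rng = np.random.default_rng(seed)
    diffs = np.asarray(diffs, float)
    observed = diffs.mean()
    signs = rng.choice([-1.0, 1.0], size=(n_perm, diffs.size))
    null = (signs * diffs).mean(axis=1)
    return (1 + np.sum(np.abs(null) >= abs(observed))) / (n_perm + 1)

# Hypothetical paired data (e.g., before/after measurements).
before = np.array([12.1, 10.4, 11.8, 13.0, 9.9, 12.5])
after = np.array([13.0, 11.1, 12.0, 14.2, 10.5, 13.1])
print(paired_permutation_pvalue(after - before))
```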

  4. Circular permutation of a WW domain: Folding still occurs after excising the turn of the folding-nucleating hairpin

    PubMed Central

    Kier, Brandon L.; Anderson, Jordan M.; Andersen, Niels H.

    2014-01-01

    A hyperstable Pin1 WW domain has been circularly permuted via excision of the fold-nucleating turn; it still folds to form the native three-strand sheet and hydrophobic core features. Multiprobe folding dynamics studies of the normal and circularly permuted sequences, as well as their constituent hairpin fragments and comparable-length β-strand-loop-β-strand models, indicate 2-state folding for all topologies. N-terminal hairpin formation is the fold nucleating event for the wild-type sequence; the slower folding circular permutant has a more distributed folding transition state. PMID:24350581

  5. Epidemic outbreaks and its control using a fractional order model with seasonality and stochastic infection

    NASA Astrophysics Data System (ADS)

    He, Shaobo; Banerjee, Santo

    2018-07-01

    A fractional-order SIR epidemic model is proposed under the influence of both parametric seasonality and external noise. The integer-order SIR epidemic model is originally stable; by introducing seasonality and a noise force to the model, the behavior of the system changes. It is shown that the system has rich dynamical behaviors for different system parameters, fractional derivative orders and degrees of seasonality and noise. The complexity of the stochastic model is investigated by using multi-scale fuzzy entropy. Finally, a hard-limiter-controlled system is designed, and simulation results show that the ratio of infected individuals can converge to a small enough target ρ, which means the epidemic outbreak can be brought under control by the implementation of effective medical and health measures.

  6. Correlated and uncorrelated heart rate fluctuations during relaxing visualization

    NASA Astrophysics Data System (ADS)

    Papasimakis, N.; Pallikari, F.

    2010-05-01

    The heart rate variability (HRV) of healthy subjects practicing relaxing visualization is studied by use of three multiscale analysis techniques: detrended fluctuation analysis (DFA), the entropy in natural time (ENT) and the average wavelet coefficient (AWC). The scaling exponent of normal interbeat interval increments exhibits characteristics of long-range correlations. During relaxing visualization the HRV dynamics change in the sense that two new features emerge independently of each other: a respiration-induced periodicity that often dominates the HRV at short scales (<40 interbeat intervals) and a decrease of the scaling exponent at longer scales (40-512 interbeat intervals). In certain cases, the scaling exponent during relaxing visualization indicates the breakdown of long-range correlations. These characteristics have been previously seen in the HRV dynamics during non-REM sleep.
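    Of the three techniques mentioned, DFA is the most self-contained to sketch. The Python outline below (first-order detrending and a handful of window sizes, both assumptions for illustration) estimates the scaling exponent from a log-log fit of the fluctuation function; values near 0.5 indicate uncorrelated increments, values near 1 indicate strong long-range correlations.

```python
import numpy as np

def dfa(x, scales=(4, 8, 16, 32, 64)):
    """Minimal detrended fluctuation analysis (first-order detrending).

    Returns the scaling exponent alpha from a log-log fit of the RMS
    fluctuation F(n) against window size n.
    """
    x = np.asarray(x, float)
    profile = np.cumsum(x - x.mean())
    fluct = []
    for n in scales:
        n_win = len(profile) // n
        segs = profile[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        rms = []
        for seg in segs:
            coeffs = np.polyfit(t, seg, 1)          # local linear trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        fluct.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rr = np.random.default_rng(0).normal(0.8, 0.05, 2048)  # surrogate RR intervals (s)
print(round(dfa(rr), 2))   # close to 0.5 for uncorrelated increments
```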

  7. Traditional Chinese medicine: potential approaches from modern dynamical complexity theories.

    PubMed

    Ma, Yan; Zhou, Kehua; Fan, Jing; Sun, Shuchen

    2016-03-01

    Despite the widespread use of traditional Chinese medicine (TCM) in clinical settings, proving its effectiveness via scientific trials is still a challenge. TCM views the human body as a complex dynamical system, and focuses on the balance of the human body, both internally and with its external environment. Such fundamental concepts require investigations using system-level quantification approaches, which are beyond conventional reductionism. Only methods that quantify dynamical complexity can bring new insights into the evaluation of TCM. In a previous article, we briefly introduced the potential value of Multiscale Entropy (MSE) analysis in TCM. This article aims to explain the existing challenges in TCM quantification, to introduce the consistency of dynamical complexity theories and TCM theories, and to inspire future system-level research on health and disease.

  8. Physical Connectivity Mapping by Circular Permutation of Human Telomerase RNA Reveals New Regions Critical for Activity and Processivity.

    PubMed

    Mefford, Melissa A; Zappulla, David C

    2016-01-15

    Telomerase is a specialized ribonucleoprotein complex that extends the 3' ends of chromosomes to counteract telomere shortening. However, increased telomerase activity is associated with ∼90% of human cancers. The telomerase enzyme minimally requires an RNA (hTR) and a specialized reverse transcriptase protein (TERT) for activity in vitro. Understanding the structure-function relationships within hTR has important implications for human disease. For the first time, we have tested the physical-connectivity requirements in the 451-nucleotide hTR RNA using circular permutations, which reposition the 5' and 3' ends. Our extensive in vitro analysis identified three classes of hTR circular permutants with altered function. First, circularly permuting 3' of the template causes specific defects in repeat-addition processivity, revealing that the template recognition element found in ciliates is conserved in human telomerase RNA. Second, seven circular permutations residing within the catalytically important core and CR4/5 domains completely abolish telomerase activity, unveiling mechanistically critical portions of these domains. Third, several circular permutations between the core and CR4/5 significantly increase telomerase activity. Our extensive circular permutation results provide insights into the architecture and coordination of human telomerase RNA and highlight where the RNA could be targeted for the development of antiaging and anticancer therapeutics. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  9. Physical Connectivity Mapping by Circular Permutation of Human Telomerase RNA Reveals New Regions Critical for Activity and Processivity

    PubMed Central

    Mefford, Melissa A.

    2015-01-01

    Telomerase is a specialized ribonucleoprotein complex that extends the 3′ ends of chromosomes to counteract telomere shortening. However, increased telomerase activity is associated with ∼90% of human cancers. The telomerase enzyme minimally requires an RNA (hTR) and a specialized reverse transcriptase protein (TERT) for activity in vitro. Understanding the structure-function relationships within hTR has important implications for human disease. For the first time, we have tested the physical-connectivity requirements in the 451-nucleotide hTR RNA using circular permutations, which reposition the 5′ and 3′ ends. Our extensive in vitro analysis identified three classes of hTR circular permutants with altered function. First, circularly permuting 3′ of the template causes specific defects in repeat-addition processivity, revealing that the template recognition element found in ciliates is conserved in human telomerase RNA. Second, seven circular permutations residing within the catalytically important core and CR4/5 domains completely abolish telomerase activity, unveiling mechanistically critical portions of these domains. Third, several circular permutations between the core and CR4/5 significantly increase telomerase activity. Our extensive circular permutation results provide insights into the architecture and coordination of human telomerase RNA and highlight where the RNA could be targeted for the development of antiaging and anticancer therapeutics. PMID:26503788
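    Conceptually, a circular permutation joins the original 5' and 3' ends and opens new ends elsewhere, which for a linear sequence representation amounts to a rotation. A tiny Python sketch with a made-up sequence (hTR itself is 451 nt):

```python
def circular_permutants(seq, new_starts):
    """Generate circular permutants of a sequence.

    The original 5'/3' ends are conceptually joined and new ends are opened at
    each 0-based position in `new_starts`, so permutant i is seq[i:] + seq[:i].
    """
    return {i: seq[i:] + seq[:i] for i in new_starts}

# Toy example; the positions are arbitrary illustrations.
rna = "GGCUAGUUCAGGAUCCGAGC"
for start, perm in circular_permutants(rna, [5, 12]).items():
    print(start, perm)
```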

  10. Set-Based Discrete Particle Swarm Optimization Based on Decomposition for Permutation-Based Multiobjective Combinatorial Optimization Problems.

    PubMed

    Yu, Xue; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Yuan, Huaqiang; Kwong, Sam; Zhang, Jun

    2018-07-01

    This paper studies a specific class of multiobjective combinatorial optimization problems (MOCOPs), namely the permutation-based MOCOPs. Many commonly seen MOCOPs, e.g., multiobjective traveling salesman problem (MOTSP), multiobjective project scheduling problem (MOPSP), belong to this problem class and they can be very different. However, as the permutation-based MOCOPs share the inherent similarity that the structure of their search space is usually in the shape of a permutation tree, this paper proposes a generic multiobjective set-based particle swarm optimization methodology based on decomposition, termed MS-PSO/D. In order to coordinate with the property of permutation-based MOCOPs, MS-PSO/D utilizes an element-based representation and a constructive approach. Through this, feasible solutions under constraints can be generated step by step following the permutation-tree-shaped structure. And problem-related heuristic information is introduced in the constructive approach for efficiency. In order to address the multiobjective optimization issues, the decomposition strategy is employed, in which the problem is converted into multiple single-objective subproblems according to a set of weight vectors. Besides, a flexible mechanism for diversity control is provided in MS-PSO/D. Extensive experiments have been conducted to study MS-PSO/D on two permutation-based MOCOPs, namely the MOTSP and the MOPSP. Experimental results validate that the proposed methodology is promising.

  11. Coarse-grained models using local-density potentials optimized with the relative entropy: Application to implicit solvation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanyal, Tanmoy; Shell, M. Scott, E-mail: shell@engineering.ucsb.edu

    Bottom-up multiscale techniques are frequently used to develop coarse-grained (CG) models for simulations at extended length and time scales but are often limited by a compromise between computational efficiency and accuracy. The conventional approach to CG nonbonded interactions uses pair potentials which, while computationally efficient, can neglect the inherently multibody contributions of the local environment of a site to its energy, due to degrees of freedom that were coarse-grained out. This effect often causes the CG potential to depend strongly on the overall system density, composition, or other properties, which limits its transferability to states other than the one at which it was parameterized. Here, we propose to incorporate multibody effects into CG potentials through additional nonbonded terms, beyond pair interactions, that depend in a mean-field manner on local densities of different atomic species. This approach is analogous to embedded atom and bond-order models that seek to capture multibody electronic effects in metallic systems. We show that the relative entropy coarse-graining framework offers a systematic route to parameterizing such local density potentials. We then characterize this approach in the development of implicit solvation strategies for interactions between model hydrophobes in an aqueous environment.

  12. Bayesian maximum entropy integration of ozone observations and model predictions: an application for attainment demonstration in North Carolina.

    PubMed

    de Nazelle, Audrey; Arunachalam, Saravanan; Serre, Marc L

    2010-08-01

    States in the USA are required to demonstrate future compliance of criteria air pollutant standards by using both air quality monitors and model outputs. In the case of ozone, the demonstration tests aim at relying heavily on measured values, due to their perceived objectivity and enforceable quality. Weight given to numerical models is diminished by integrating them in the calculations only in a relative sense. For unmonitored locations, the EPA has suggested the use of a spatial interpolation technique to assign current values. We demonstrate that this approach may lead to erroneous assignments of nonattainment and may make it difficult for States to establish future compliance. We propose a method that combines different sources of information to map air pollution, using the Bayesian Maximum Entropy (BME) Framework. The approach gives precedence to measured values and integrates modeled data as a function of model performance. We demonstrate this approach in North Carolina, using the State's ozone monitoring network in combination with outputs from the Multiscale Air Quality Simulation Platform (MAQSIP) modeling system. We show that the BME data integration approach, compared to a spatial interpolation of measured data, improves the accuracy and the precision of ozone estimations across the state.

  13. Fast vessel segmentation in retinal images using multi-scale enhancement and second-order local entropy

    NASA Astrophysics Data System (ADS)

    Yu, H.; Barriga, S.; Agurto, C.; Zamora, G.; Bauman, W.; Soliz, P.

    2012-03-01

    Retinal vasculature is one of the most important anatomical structures in digital retinal photographs. Accurate segmentation of retinal blood vessels is an essential task in automated analysis of retinopathy. This paper presents a new and effective vessel segmentation algorithm that features computational simplicity and fast implementation. This method uses morphological pre-processing to decrease the disturbance of bright structures and lesions before vessel extraction. Next, a vessel probability map is generated by computing the eigenvalues of the second derivatives of the Gaussian-filtered image at multiple scales. Then, second-order local entropy thresholding is applied to segment the vessel map. Lastly, a rule-based decision step, which measures the geometric shape difference between vessels and lesions, is applied to reduce false positives. The algorithm is evaluated on the low-resolution DRIVE and STARE databases and the publicly available high-resolution image database from Friedrich-Alexander University Erlangen-Nuremberg, Germany. The proposed method achieved performance comparable to state-of-the-art unsupervised vessel segmentation methods, with faster speed, on the DRIVE and STARE databases. For the high-resolution fundus image database, the proposed algorithm outperforms an existing approach in both performance and speed. The efficiency and robustness make the blood vessel segmentation method described here suitable for broad application in automated analysis of retinal images.

  14. On the estimation of brain signal entropy from sparse neuroimaging data

    PubMed Central

    Grandy, Thomas H.; Garrett, Douglas D.; Schmiedek, Florian; Werkle-Bergner, Markus

    2016-01-01

    Multi-scale entropy (MSE) has been recently established as a promising tool for the analysis of the moment-to-moment variability of neural signals. Appealingly, MSE provides a measure of the predictability of neural operations across the multiple time scales on which the brain operates. An important limitation in the application of the MSE to some classes of neural signals is MSE’s apparent reliance on long time series. However, this sparse-data limitation in MSE computation could potentially be overcome via MSE estimation across shorter time series that are not necessarily acquired continuously (e.g., in fMRI block-designs). In the present study, using simulated, EEG, and fMRI data, we examined the dependence of the accuracy and precision of MSE estimates on the number of data points per segment and the total number of data segments. As hypothesized, MSE estimation across discontinuous segments was comparably accurate and precise, despite segment length. A key advance of our approach is that it allows the calculation of MSE scales not previously accessible from the native segment lengths. Consequently, our results may permit a far broader range of applications of MSE when gauging moment-to-moment dynamics in sparse and/or discontinuous neurophysiological data typical of many modern cognitive neuroscience study designs. PMID:27020961
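    A minimal Python sketch of the idea of estimating entropy from discontinuous data: template matches are counted within each short segment only, so no template straddles a discontinuity, and the counts are pooled before forming the sample-entropy ratio. The tolerance r = 0.2 SD and m = 2 are conventional choices, and the simple counting scheme here is an assumption for illustration, not the exact estimator of the study.

```python
import numpy as np

def sampen_across_segments(segments, m=2, r_factor=0.2):
    """Sample entropy estimated by pooling matches over short segments.

    Matches of template length m and m+1 are counted within each segment and
    then pooled across segments, so SampEn = -log(A_total / B_total).
    """
    pooled = np.concatenate(segments)
    r = r_factor * pooled.std()

    def count_matches(seg, length):
        templ = np.array([seg[i:i + length] for i in range(len(seg) - length)])
        count = 0
        for i in range(len(templ) - 1):
            dist = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    B = sum(count_matches(np.asarray(s, float), m) for s in segments)
    A = sum(count_matches(np.asarray(s, float), m + 1) for s in segments)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
blocks = [rng.normal(size=250) for _ in range(8)]   # e.g., eight short acquisition blocks
print(round(sampen_across_segments(blocks), 2))
```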

  15. Altering the orientation of a fused protein to the RNA-binding ribosomal protein L7Ae and its derivatives through circular permutation.

    PubMed

    Ohuchi, Shoji J; Sagawa, Fumihiko; Sakamoto, Taiichi; Inoue, Tan

    2015-10-23

    RNA-protein complexes (RNPs) are useful for constructing functional nano-objects because a variety of functional proteins can be displayed on a designed RNA scaffold. Here, we report circular permutations of an RNA-binding protein L7Ae based on the three-dimensional structure information to alter the orientation of the displayed proteins on the RNA scaffold. An electrophoretic mobility shift assay and atomic force microscopy (AFM) analysis revealed that most of the designed circular permutants formed an RNP nano-object. Moreover, the alteration of the enhanced green fluorescent protein (EGFP) orientation was confirmed with AFM by employing EGFP on the L7Ae permutant on the RNA. The results demonstrate that targeted fine-tuning of the stereo-specific fixation of a protein on a protein-binding RNA is feasible by using the circular permutation technique. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Linear models: permutation methods

    USGS Publications Warehouse

    Cade, B.S.; Everitt, B.S.; Howell, D.C.

    2005-01-01

    Permutation tests (see Permutation Based Inference) for the linear model have applications in behavioral studies when traditional parametric assumptions about the error term in a linear model are not tenable. Improved validity of Type I error rates can be achieved with properly constructed permutation tests. Perhaps more importantly, increased statistical power, improved robustness to effects of outliers, and detection of alternative distributional differences can be achieved by coupling permutation inference with alternative linear model estimators. For example, it is well-known that estimates of the mean in a linear model are extremely sensitive to even a single outlying value of the dependent variable compared to estimates of the median [7, 19]. Traditionally, linear modeling focused on estimating changes in the center of distributions (means or medians). However, quantile regression allows distributional changes to be estimated in all or any selected part of a distribution of responses, providing a more complete statistical picture that has relevance to many biological questions [6]...

  17. Altering the orientation of a fused protein to the RNA-binding ribosomal protein L7Ae and its derivatives through circular permutation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohuchi, Shoji J.; Sagawa, Fumihiko; Sakamoto, Taiichi

    RNA-protein complexes (RNPs) are useful for constructing functional nano-objects because a variety of functional proteins can be displayed on a designed RNA scaffold. Here, we report circular permutations of an RNA-binding protein L7Ae based on the three-dimensional structure information to alter the orientation of the displayed proteins on the RNA scaffold. An electrophoretic mobility shift assay and atomic force microscopy (AFM) analysis revealed that most of the designed circular permutants formed an RNP nano-object. Moreover, the alteration of the enhanced green fluorescent protein (EGFP) orientation was confirmed with AFM by employing EGFP on the L7Ae permutant on the RNA. The results demonstrate that targeted fine-tuning of the stereo-specific fixation of a protein on a protein-binding RNA is feasible by using the circular permutation technique.

  18. Balance Regularity Among Former High School Football Players With or Without a History of Concussion.

    PubMed

    Schmidt, Julianne D; Terry, Douglas P; Ko, Jihyun; Newell, Karl M; Miller, L Stephen

    2018-02-01

      Subclinical postural-control changes may persist beyond the point when athletes are considered clinically recovered postconcussion.   To compare postural-control performance between former high school football players with or without a history of concussion using linear and nonlinear metrics.   Case-control study.   Clinical research laboratory.   A total of 11 former high school football players (age range, 45-60 years) with 2 or more concussions and 11 age- and height-matched former high school football players without a history of concussion. No participant had college or professional football experience.   Participants completed the Sensory Organization Test. We compared postural control (linear: equilibrium scores; nonlinear: sample and multiscale entropy) between groups using a 2 × 3 analysis of variance across conditions 4 to 6 (4: eyes open, sway-referenced platform; 5: eyes closed, sway-referenced platform; 6: eyes open, sway-referenced surround and platform).   We observed a group-by-condition interaction effect for medial-lateral sample entropy (F(2,40) = 3.26, P = .049, partial η² = 0.140). Participants with a history of concussion presented with more regular medial-lateral sample entropy values (0.90 ± 0.41) for condition 5 than participants without a history of concussion (1.30 ± 0.35; mean difference = -0.40; 95% confidence interval [CI] = -0.74, -0.06; t(20) = -2.48, P = .02), but conditions 4 (mean difference = -0.11; 95% CI: -0.37, 0.15; t(20) = -0.86, P = .40) and 6 (mean difference = -0.25; 95% CI: -0.55, 0.06; t(20) = -1.66, P = .11) did not differ between groups.   Postconcussion deficits, detected using nonlinear metrics, may persist long after injury resolution. Subclinical concussion deficits may persist for years beyond clinical concussion recovery.

  19. Nonextensive Entropy Approach to Space Plasma Fluctuations and Turbulence

    NASA Astrophysics Data System (ADS)

    Leubner, M. P.; Vörös, Z.; Baumjohann, W.

    Spatial intermittency in fully developed turbulence is an established feature of astrophysical plasma fluctuations and is, in particular, apparent in the interplanetary medium from in situ observations. In this situation, the classical Boltzmann-Gibbs extensive thermo-statistics, applicable when microscopic interactions and memory are short ranged and the environment is a continuous and differentiable manifold, fails. Upon generalization of the entropy function to nonextensivity, accounting for long-range interactions and thus for correlations in the system, it is demonstrated that the corresponding probability distribution functions (PDFs) are members of a family of specific power-law distributions. In particular, the resulting theoretical bi-κ functional reproduces accurately the observed global leptokurtic, non-Gaussian shape of the increment PDFs of characteristic solar wind variables on all scales, where nonlocality in turbulence is controlled via a multiscale coupling parameter. Gradual decoupling is obtained by enhancing the spatial separation scale corresponding to increasing κ-values for slow solar wind conditions, where a Gaussian is approached in the limit of large scales. In contrast, the scaling properties in the high-speed solar wind are predominantly governed by the mean energy or variance of the distribution, appearing as a second parameter in the theory. The PDFs of solar wind scalar field differences are computed from WIND and ACE data for different time-lags and bulk speeds and analyzed within the nonextensive theory, where a particular nonlinear dependence of the coupling parameter and variance on scale also arises for the best-fitting theoretical PDFs. Consequently, nonlocality in fluctuations, related to both turbulence and its large-scale driving, should be related to long-range interactions in the context of nonextensive entropy generalization, providing fundamentally the physical background of the observed scale dependence of fluctuations in intermittent space plasmas.
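    For reference, the nonextensive entropy referred to here is usually written in the Tsallis form, which recovers the Boltzmann-Gibbs-Shannon entropy in the limit q -> 1; maximizing it under suitable constraints yields the family of power-law (kappa-like) distributions discussed in the abstract.

```latex
S_q \;=\; k\,\frac{1-\sum_i p_i^{\,q}}{q-1},
\qquad
\lim_{q\to 1} S_q \;=\; -k\sum_i p_i \ln p_i .
```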

  20. Permutation inference for the general linear model

    PubMed Central

    Winkler, Anderson M.; Ridgway, Gerard R.; Webster, Matthew A.; Smith, Stephen M.; Nichols, Thomas E.

    2014-01-01

    Permutation methods can provide exact control of false positives and allow the use of non-standard statistics, making only weak assumptions about the data. With the availability of fast and inexpensive computing, their main limitation would be some lack of flexibility to work with arbitrary experimental designs. In this paper we report on results on approximate permutation methods that are more flexible with respect to the experimental design and nuisance variables, and conduct detailed simulations to identify the best method for settings that are typical for imaging research scenarios. We present a generic framework for permutation inference for complex general linear models (glms) when the errors are exchangeable and/or have a symmetric distribution, and show that, even in the presence of nuisance effects, these permutation inferences are powerful while providing excellent control of false positives in a wide range of common and relevant imaging research scenarios. We also demonstrate how the inference on glm parameters, originally intended for independent data, can be used in certain special but useful cases in which independence is violated. Detailed examples of common neuroimaging applications are provided, as well as a complete algorithm – the “randomise” algorithm – for permutation inference with the glm. PMID:24530839
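    One widely used scheme for permuting in the presence of nuisance regressors, discussed in this literature, is the Freedman-Lane procedure: residuals of the nuisance-only model are permuted, the nuisance fit is added back, and the full model is refit. The Python sketch below shows the idea for a single regressor of interest; it is an illustration with simulated data, not the "randomise" implementation.

```python
import numpy as np

def freedman_lane_pvalue(y, x, z, n_perm=1000, seed=0):
    """Permutation p-value for regressor x with nuisance z (Freedman-Lane style)."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, float)
    Z = np.column_stack([np.ones_like(y), z])     # intercept + nuisance
    X = np.column_stack([Z, x])                   # full design

    def tstat(response):
        beta, *_ = np.linalg.lstsq(X, response, rcond=None)
        resid = response - X @ beta
        dof = len(response) - X.shape[1]
        sigma2 = resid @ resid / dof
        cov = sigma2 * np.linalg.inv(X.T @ X)
        return beta[-1] / np.sqrt(cov[-1, -1])

    gamma, *_ = np.linalg.lstsq(Z, y, rcond=None)
    fitted_z, resid_z = Z @ gamma, y - Z @ gamma
    t_obs = tstat(y)
    # Permute reduced-model residuals, add back the nuisance fit, refit.
    t_null = [tstat(fitted_z + rng.permutation(resid_z)) for _ in range(n_perm)]
    return (1 + np.sum(np.abs(t_null) >= abs(t_obs))) / (n_perm + 1)

rng = np.random.default_rng(1)
n = 60
z = rng.normal(size=n)                 # nuisance covariate
x = rng.normal(size=n)                 # regressor of interest
y = 0.5 * z + 0.4 * x + rng.normal(size=n)
print(freedman_lane_pvalue(y, x, z))
```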

  1. Modulation of a protein free-energy landscape by circular permutation.

    PubMed

    Radou, Gaël; Enciso, Marta; Krivov, Sergei; Paci, Emanuele

    2013-11-07

    Circular permutations usually retain the native structure and function of a protein while inevitably perturbing its folding dynamics. By using simulations with a structure-based model and a rigorous methodology to determine free-energy surfaces from trajectories, we evaluate the effect of a circular permutation on the free-energy landscape of the protein T4 lysozyme. We observe changes which, although subtle, largely affect the cooperativity between the two subdomains. Such a change in cooperativity has been previously experimentally observed and recently also characterized using single molecule optical tweezers and the Crooks relation. The free-energy landscapes show that both the wild type and circular permutant have an on-pathway intermediate, previously experimentally characterized, in which one of the subdomains is completely formed. The landscapes, however, differ in the position of the rate-limiting step for folding, which occurs before the intermediate in the wild type and after in the circular permutant. This shift of transition state explains the observed change in the cooperativity. The underlying free-energy landscape thus provides a microscopic description of the folding dynamics and the connection between circular permutation and the loss of cooperativity experimentally observed.

  2. Multiscale gyrokinetics for rotating tokamak plasmas: fluctuations, transport and energy flows.

    PubMed

    Abel, I G; Plunk, G G; Wang, E; Barnes, M; Cowley, S C; Dorland, W; Schekochihin, A A

    2013-11-01

    This paper presents a complete theoretical framework for studying turbulence and transport in rapidly rotating tokamak plasmas. The fundamental scale separations present in plasma turbulence are codified as an asymptotic expansion in the ratio ε = ρi/α of the gyroradius to the equilibrium scale length. Proceeding order by order in this expansion, a set of coupled multiscale equations is developed. They describe an instantaneous equilibrium, the fluctuations driven by gradients in the equilibrium quantities, and the transport-timescale evolution of mean profiles of these quantities driven by the interplay between the equilibrium and the fluctuations. The equilibrium distribution functions are local Maxwellians with each flux surface rotating toroidally as a rigid body. The magnetic equilibrium is obtained from the generalized Grad-Shafranov equation for a rotating plasma, determining the magnetic flux function from the mean pressure and velocity profiles of the plasma. The slow (resistive-timescale) evolution of the magnetic field is given by an evolution equation for the safety factor q. Large-scale deviations of the distribution function from a Maxwellian are given by neoclassical theory. The fluctuations are determined by the 'high-flow' gyrokinetic equation, from which we derive the governing principle for gyrokinetic turbulence in tokamaks: the conservation and local (in space) cascade of the free energy of the fluctuations (i.e. there is no turbulence spreading). Transport equations for the evolution of the mean density, temperature and flow velocity profiles are derived. These transport equations show how the neoclassical and fluctuating corrections to the equilibrium Maxwellian act back upon the mean profiles through fluxes and heating. The energy and entropy conservation laws for the mean profiles are derived from the transport equations. Total energy, thermal, kinetic and magnetic, is conserved and there is no net turbulent heating. Entropy is produced by the action of fluxes flattening gradients, Ohmic heating and the equilibration of interspecies temperature differences. This equilibration is found to include both turbulent and collisional contributions. Finally, this framework is condensed, in the low-Mach-number limit, to a more concise set of equations suitable for numerical implementation.

  3. Nonlinearities of heart rate variability in animal models of impaired cardiac control: contribution of different time scales.

    PubMed

    Silva, Luiz Eduardo Virgilio; Lataro, Renata Maria; Castania, Jaci Airton; Silva, Carlos Alberto Aguiar; Salgado, Helio Cesar; Fazan, Rubens; Porta, Alberto

    2017-08-01

    Heart rate variability (HRV) has been extensively explored by traditional linear approaches (e.g., spectral analysis); however, several studies have pointed to the presence of nonlinear features in HRV, suggesting that linear tools might fail to account for the complexity of the HRV dynamics. Even though the prevalent notion is that HRV is nonlinear, the actual presence of nonlinear features is rarely verified. In this study, the presence of nonlinear dynamics was checked as a function of time scales in three experimental models of rats with different impairment of the cardiac control: namely, rats with heart failure (HF), spontaneously hypertensive rats (SHRs), and sinoaortic denervated (SAD) rats. Multiscale entropy (MSE) and refined MSE (RMSE) were chosen as the discriminating statistic for the surrogate test utilized to detect nonlinearity. Nonlinear dynamics is less present in HF animals at both short and long time scales compared with controls. A similar finding was found in SHR only at short time scales. SAD increased the presence of nonlinear dynamics exclusively at short time scales. Those findings suggest that a working baroreflex contributes to linearize HRV and to reduce the likelihood to observe nonlinear components of the cardiac control at short time scales. In addition, an increased sympathetic modulation seems to be a source of nonlinear dynamics at long time scales. Testing nonlinear dynamics as a function of the time scales can provide a characterization of the cardiac control complementary to more traditional markers in time, frequency, and information domains. NEW & NOTEWORTHY Although heart rate variability (HRV) dynamics is widely assumed to be nonlinear, nonlinearity tests are rarely used to check this hypothesis. By adopting multiscale entropy (MSE) and refined MSE (RMSE) as the discriminating statistic for the nonlinearity test, we show that nonlinear dynamics varies with time scale and the type of cardiac dysfunction. Moreover, as complexity metrics and nonlinearities provide complementary information, we strongly recommend using the test for nonlinearity as an additional index to characterize HRV. Copyright © 2017 the American Physiological Society.
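    The surrogate-testing logic used here can be sketched compactly: generate surrogates that preserve the linear (spectral) structure of the series, compute a discriminating statistic on the original and on each surrogate, and reject linearity if the original value is extreme. The Python sketch below uses phase-randomized Fourier surrogates and time-reversal asymmetry as a simple stand-in statistic; the study itself uses MSE and RMSE as the discriminating statistics.

```python
import numpy as np

def ft_surrogate(x, rng):
    """Phase-randomized surrogate: keeps the power spectrum, destroys nonlinearity."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(spec))
    phases[0] = 0.0
    if len(x) % 2 == 0:
        phases[-1] = 0.0               # Nyquist component must stay real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(x))

def nonlinearity_test(x, stat, n_surr=99, seed=0):
    """Two-sided rank test of stat(x) against its surrogate distribution."""
    rng = np.random.default_rng(seed)
    obs = stat(x)
    null = [stat(ft_surrogate(x, rng)) for _ in range(n_surr)]
    return (1 + sum(abs(s) >= abs(obs) for s in null)) / (n_surr + 1)

# Time-reversal asymmetry as a simple discriminating statistic.
trev = lambda x: np.mean((x[1:] - x[:-1]) ** 3) / np.mean((x[1:] - x[:-1]) ** 2) ** 1.5

noise = np.random.default_rng(2).normal(size=1024)
x = np.convolve(noise, np.ones(5) / 5, mode="same")   # linear (Gaussian) series
print(nonlinearity_test(x, trev))                      # expected: not significant
```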

  4. EEG complexity as a biomarker for autism spectrum disorder risk

    PubMed Central

    2011-01-01

    Background Complex neurodevelopmental disorders may be characterized by subtle brain function signatures early in life before behavioral symptoms are apparent. Such endophenotypes may be measurable biomarkers for later cognitive impairments. The nonlinear complexity of electroencephalography (EEG) signals is believed to contain information about the architecture of the neural networks in the brain on many scales. Early detection of abnormalities in EEG signals may be an early biomarker for developmental cognitive disorders. The goal of this paper is to demonstrate that the modified multiscale entropy (mMSE) computed on the basis of resting state EEG data can be used as a biomarker of normal brain development and distinguish typically developing children from a group of infants at high risk for autism spectrum disorder (ASD), defined on the basis of an older sibling with ASD. Methods Using mMSE as a feature vector, a multiclass support vector machine algorithm was used to classify typically developing and high-risk groups. Classification was computed separately within each age group from 6 to 24 months. Results Multiscale entropy appears to go through a different developmental trajectory in infants at high risk for autism (HRA) than it does in typically developing controls. Differences appear to be greatest at ages 9 to 12 months. Using several machine learning algorithms with mMSE as a feature vector, infants were classified with over 80% accuracy into control and HRA groups at age 9 months. Classification accuracy for boys was close to 100% at age 9 months and remains high (70% to 90%) at ages 12 and 18 months. For girls, classification accuracy was highest at age 6 months, but declines thereafter. Conclusions This proof-of-principle study suggests that mMSE computed from resting state EEG signals may be a useful biomarker for early detection of risk for ASD and abnormalities in cognitive development in infants. To our knowledge, this is the first demonstration of an information theoretic analysis of EEG data for biomarkers in infants at risk for a complex neurodevelopmental disorder. PMID:21342500

  5. Correlations between the Signal Complexity of Cerebral and Cardiac Electrical Activity: A Multiscale Entropy Analysis

    PubMed Central

    Lin, Pei-Feng; Lo, Men-Tzung; Tsao, Jenho; Chang, Yi-Chung; Lin, Chen; Ho, Yi-Lwun

    2014-01-01

    The heart begins to beat before the brain is formed. Whether conventional hierarchical central commands sent by the brain to the heart alone explain all the interplay between these two organs should be reconsidered. Here, we demonstrate correlations between the signal complexity of brain and cardiac activity. Eighty-seven geriatric outpatients with healthy hearts and varied cognitive abilities each provided a 24-hour electrocardiography (ECG) and a 19-channel eye-closed routine electroencephalography (EEG). Multiscale entropy (MSE) analysis was applied to three epochs (resting-awake state, photic stimulation of fast frequencies (fast-PS), and photic stimulation of slow frequencies (slow-PS)) of EEG in the 1–58 Hz frequency range, and three RR interval (RRI) time series (awake-state, sleep and that concomitant with the EEG) for each subject. The low-to-high frequency power (LF/HF) ratio of RRI was calculated to represent sympatho-vagal balance. With statistics after Bonferroni corrections, we found that: (a) the summed MSE value on coarse scales of the awake RRI (scales 11–20, RRI-MSE-coarse) were inversely correlated with the summed MSE value on coarse scales of the resting-awake EEG (scales 6–20, EEG-MSE-coarse) at Fp2, C4, T6 and T4; (b) the awake RRI-MSE-coarse was inversely correlated with the fast-PS EEG-MSE-coarse at O1, O2 and C4; (c) the sleep RRI-MSE-coarse was inversely correlated with the slow-PS EEG-MSE-coarse at Fp2; (d) the RRI-MSE-coarse and LF/HF ratio of the awake RRI were correlated positively to each other; (e) the EEG-MSE-coarse at F8 was proportional to the cognitive test score; (f) the results conform to the cholinergic hypothesis which states that cognitive impairment causes reduction in vagal cardiac modulation; (g) fast-PS significantly lowered the EEG-MSE-coarse globally. Whether these heart-brain correlations could be fully explained by the central autonomic network is unknown and needs further exploration. PMID:24498375

  6. Toward a general theory of conical intersections in systems of identical nuclei

    NASA Astrophysics Data System (ADS)

    Keating, Sean P.; Mead, C. Alden

    1987-02-01

    It has been shown previously that the Herzberg-Longuet-Higgins sign change produced in Born-Oppenheimer electronic wave functions when the nuclei traverse a closed path around a conical intersection has implications for the symmetry of wave functions under permutations of identical nuclei. For systems of three or four identical nuclei, there are special features present which have facilitated the detailed analysis. The present paper reports progress toward a general theory for systems of n nuclei. For n=3 or 4, the two key functions which locate conical intersections and define compensating phase factors can conveniently be defined so as to transform under permutations according to a two-dimensional irreducible representation of the permutation group. Since such representations do not exist for n>4, we have chosen to develop a formalism in terms of lab-fixed electronic basis functions, and we show how to define the two key functions in principle. The functions so defined both turn out to be totally symmetric under permutations. We show how they can be used to define compensating phase factors so that all modified electronic wave functions are either totally symmetric or totally antisymmetric under permutations. A detailed analysis is made of cyclic permutations in the neighborhood of Dnh symmetry, which can be extended by continuity arguments to more general configurations, and criteria are obtained for sign changes. There is a qualitative discussion of the treatment of more general permutations.

  7. Hidden cross-correlation patterns in stock markets based on permutation cross-sample entropy and PCA

    NASA Astrophysics Data System (ADS)

    Lin, Aijing; Shang, Pengjian; Zhong, Bo

    2014-12-01

    In this article, we investigate the hidden cross-correlation structures in Chinese stock markets and US stock markets by performing permutation cross-sample entropy (PCSE) combined with a PCA approach. It is suggested that PCSE can provide a more faithful and more interpretable description of the dynamic mechanism between time series than the cross-correlation matrix. We show that this new technique can be adapted to observe stock markets, especially during financial crises. In order to identify and compare the interactions and structures of stock markets during financial crises, as well as in normal periods, all the samples are divided into four sub-periods. The results imply that the cross-correlations within the Chinese group are stronger than those within the US group in most sub-periods. In particular, it is likely that the US stock markets are more integrated with each other during the global financial crisis than during the Asian financial crisis. However, our results illustrate that Chinese stock markets are not immune from the global financial crisis, although they are less integrated with other markets when compared with US stock markets.

  8. Permutation parity machines for neural cryptography.

    PubMed

    Reyes, Oscar Mauricio; Zimmermann, Karl-Heinz

    2010-06-01

    Recently, synchronization was proved for permutation parity machines, multilayer feed-forward neural networks proposed as a binary variant of the tree parity machines. This ability was already used in the case of tree parity machines to introduce a key-exchange protocol. In this paper, a protocol based on permutation parity machines is proposed and its performance against common attacks (simple, geometric, majority and genetic) is studied.

  9. Inference for Distributions over the Permutation Group

    DTIC Science & Technology

    2008-05-01

    world problems, such as voting, ranking, and data association. Representing uncertainty over permutations is challenging, since there are n! possibilities... the Kronecker (or Tensor) Product Representation. In general, the Kronecker product representation is reducible, and so it can be decomposed into a direct...

  10. Students' Errors in Solving the Permutation and Combination Problems Based on Problem Solving Steps of Polya

    ERIC Educational Resources Information Center

    Sukoriyanto; Nusantara, Toto; Subanji; Chandra, Tjang Daniel

    2016-01-01

    This article was written based on the results of a study evaluating students' errors in solving permutation and combination problems in terms of Polya's problem-solving steps. Twenty-five students were asked to do four problems related to permutation and combination. The research results showed that the students still made mistakes in…

  11. Permutation parity machines for neural cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyes, Oscar Mauricio; Escuela de Ingenieria Electrica, Electronica y Telecomunicaciones, Universidad Industrial de Santander, Bucaramanga; Zimmermann, Karl-Heinz

    2010-06-15

    Recently, synchronization was proved for permutation parity machines, multilayer feed-forward neural networks proposed as a binary variant of the tree parity machines. This ability was already used in the case of tree parity machines to introduce a key-exchange protocol. In this paper, a protocol based on permutation parity machines is proposed and its performance against common attacks (simple, geometric, majority and genetic) is studied.

  12. Sorting signed permutations by short operations.

    PubMed

    Galvão, Gustavo Rodrigues; Lee, Orlando; Dias, Zanoni

    2015-01-01

    During evolution, global mutations may alter the order and the orientation of the genes in a genome. Such mutations are referred to as rearrangement events, or simply operations. In unichromosomal genomes, the most common operations are reversals, which are responsible for reversing the order and orientation of a sequence of genes, and transpositions, which are responsible for switching the location of two contiguous portions of a genome. The problem of computing the minimum sequence of operations that transforms one genome into another - which is equivalent to the problem of sorting a permutation into the identity permutation - is a well-studied problem that finds application in comparative genomics. There are a number of works concerning this problem in the literature, but they generally do not take into account the length of the operations (i.e. the number of genes affected by the operations). Since it has been observed that short operations are prevalent in the evolution of some species, algorithms that efficiently solve this problem in the special case of short operations are of interest. In this paper, we investigate the problem of sorting a signed permutation by short operations. More precisely, we study four flavors of this problem: (i) the problem of sorting a signed permutation by reversals of length at most 2; (ii) the problem of sorting a signed permutation by reversals of length at most 3; (iii) the problem of sorting a signed permutation by reversals and transpositions of length at most 2; and (iv) the problem of sorting a signed permutation by reversals and transpositions of length at most 3. We present polynomial-time solutions for problems (i) and (iii), a 5-approximation for problem (ii), and a 3-approximation for problem (iv). Moreover, we show that the expected approximation ratio of the 5-approximation algorithm is not greater than 3 for random signed permutations with more than 12 elements. Finally, we present experimental results that show that the approximation ratios of the approximation algorithms cannot be smaller than 3. In particular, this means that the approximation ratio of the 3-approximation algorithm is tight.
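    To make the operations concrete: a length-1 reversal flips the sign of a single element, and a length-2 reversal swaps two adjacent elements while flipping both signs. The Python sketch below sorts a signed permutation to the identity with such short reversals using a naive bubble-sort strategy; it only illustrates the operations and makes no claim to the optimality or approximation guarantees of the paper's algorithms.

```python
def sort_by_short_reversals(perm):
    """Sort a signed permutation to the identity using reversals of length <= 2.

    Greedy illustration only: bubble neighbouring elements with 2-reversals
    (which swap and flip both signs), then clear remaining negative signs
    with 1-reversals.
    """
    p = list(perm)
    ops = []
    n = len(p)
    for _ in range(n):
        for i in range(n - 1):
            if abs(p[i]) > abs(p[i + 1]):
                p[i], p[i + 1] = -p[i + 1], -p[i]   # reversal of length 2
                ops.append(("rev2", i))
    for i in range(n):
        if p[i] < 0:
            p[i] = -p[i]                            # reversal of length 1
            ops.append(("rev1", i))
    assert p == sorted(abs(x) for x in perm)
    return ops

print(sort_by_short_reversals([+3, -1, -2]))
```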

  13. Sample entropy analysis for the estimating depth of anaesthesia through human EEG signal at different levels of unconsciousness during surgeries

    PubMed Central

    Fan, Shou-Zen; Abbod, Maysam F.

    2018-01-01

    Estimating the depth of anaesthesia (DoA) in operations has always been a challenging issue due to the underlying complexity of the brain mechanisms. Electroencephalogram (EEG) signals are undoubtedly the most widely used signals for measuring DoA. In this paper, a novel EEG-based index is proposed to evaluate DoA for 24 patients receiving general anaesthesia with different levels of unconsciousness. Sample Entropy (SampEn) algorithm was utilised in order to acquire the chaotic features of the signals. After calculating the SampEn from the EEG signals, Random Forest was utilised for developing learning regression models with Bispectral index (BIS) as the target. Correlation coefficient, mean absolute error, and area under the curve (AUC) were used to verify the perioperative performance of the proposed method. Validation comparisons with typical nonstationary signal analysis methods (i.e., recurrence analysis and permutation entropy) and regression methods (i.e., neural network and support vector machine) were conducted. To further verify the accuracy and validity of the proposed methodology, the data is divided into four unconsciousness-level groups on the basis of BIS levels. Subsequently, analysis of variance (ANOVA) was applied to the corresponding index (i.e., regression output). Results indicate that the correlation coefficient improved to 0.72 ± 0.09 after filtering and to 0.90 ± 0.05 after regression from the initial values of 0.51 ± 0.17. Similarly, the final mean absolute error dramatically declined to 5.22 ± 2.12. In addition, the ultimate AUC increased to 0.98 ± 0.02, and the ANOVA analysis indicates that each of the four groups of different anaesthetic levels demonstrated significant difference from the nearest levels. Furthermore, the Random Forest output was extensively linear in relation to BIS, thus with better DoA prediction accuracy. In conclusion, the proposed method provides a concrete basis for monitoring patients’ anaesthetic level during surgeries. PMID:29844970
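    One of the comparison methods named here, permutation entropy, is easy to sketch: each short window is mapped to the ordinal pattern that sorts it, and the Shannon entropy of the pattern distribution is computed (here normalized by log m!). The embedding dimension m = 3 and lag 1 are common defaults, assumed for illustration.

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, m=3, tau=1, normalize=True):
    """Ordinal-pattern (permutation) entropy of a 1D signal."""
    x = np.asarray(x, float)
    patterns = {}
    for i in range(len(x) - (m - 1) * tau):
        window = x[i:i + m * tau:tau]
        key = tuple(np.argsort(window))            # ordinal pattern of the window
        patterns[key] = patterns.get(key, 0) + 1
    probs = np.array(list(patterns.values()), float)
    probs /= probs.sum()
    h = -np.sum(probs * np.log(probs))
    return h / log(factorial(m)) if normalize else h

eeg_like = np.random.default_rng(3).normal(size=4000)
print(round(permutation_entropy(eeg_like, m=3), 3))   # close to 1 for white noise
```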

  14. Physiologic variability at the verge of systemic inflammation: multi-scale entropy of heart rate variability is affected by very low doses of endotoxin

    PubMed Central

    Herlitz, Georg N.; Sanders, Renee L.; Cheung, Nora H.; Coyle, Susette M.; Griffel, Benjamin; Macor, Marie A.; Lowry, Stephen F.; Calvano, Steve E.; Gale, Stephen C.

    2014-01-01

    Introduction Human injury or infection induces systemic inflammation with characteristic neuro-endocrine responses. Fluctuations in autonomic function during inflammation are reflected by beat-to-beat variation in heart rate, termed heart rate variability (HRV). In the present study, we determine threshold doses of endotoxin needed to induce observable changes in markers of systemic inflammation, we investigate whether metrics of HRV exhibit a differing threshold dose from other inflammatory markers, and we investigate the size of data sets required for meaningful use of multi-scale entropy (MSE) analysis of HRV. Methods Healthy human volunteers (n=25) were randomized to receive placebo (normal saline) or endotoxin/lipopolysaccharide (LPS): 0.1, 0.25, 0.5, 1.0, or 2.0 ng/kg administered intravenously. Vital signs were recorded every 30 minutes for 6 hours and then at 9, 12, and 24 hours after LPS. Blood samples were drawn at specific time points for cytokine measurements. HRV analysis was performed using EKG epochs of 5 minutes. MSE for HRV was calculated for all dose groups to scale factor 40. Results The lowest significant threshold dose was noted in core temperature at 0.25ng/kg. Endogenous TNF-α and IL-6 were significantly responsive at the next dosage level (0.5ng/kg) along with elevations in circulating leukocytes and heart rate. Responses were exaggerated at higher doses (1 and 2 ng/kg). Time domain and frequency domain HRV metrics similarly suggested a threshold dose, differing from placebo at 1.0 and 2.0 ng/kg, below which no clear pattern in response was evident. By applying repeated-measures ANOVA across scale factors, a significant decrease in MSE was seen at 1.0 and 2.0 ng/kg by 2 hours post exposure to LPS. While not statistically significant below 1.0 ng/kg, MSE unexpectedly decreased across all groups in an orderly dose-response pattern not seen in the other outcomes. Conclusions By using repeated-measures ANOVA across scale factors, MSE can detect autonomic change after LPS challenge in a group of 25 subjects using EKG epochs of only 5 minutes and entropy analysis to scale factor of only 40, potentially facilitating MSE’s wider use as a research tool or bedside monitor. Traditional markers of inflammation generally exhibit threshold dose behavior. In contrast, MSE’s apparent continuous dose-response pattern, while not statistically verifiable in this study, suggests a potential subclinical harbinger of infectious or other insult. The possible derangement of autonomic complexity prior to or independent of the cytokine surge cannot be ruled out. Future investigation should focus on confirmation of overt inflammation following observed decreases in MSE in a clinical setting. PMID:25526373
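    The multiscale step itself is simple to sketch: the series is averaged over non-overlapping windows of increasing length and an entropy estimator is applied at each scale factor. The Python outline below shows the coarse-graining and the scale loop; the placeholder statistic in the example is not an entropy and would be replaced by a sample-entropy routine in practice. The length arithmetic also shows why short epochs constrain the usable scale factors: roughly 300 beats in a 5-minute epoch leaves only 7 coarse-grained points at scale 40.

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale` (standard MSE coarse-graining)."""
    x = np.asarray(x, float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def mse_curve(x, entropy_fn, max_scale=40):
    """Apply an entropy estimator to the coarse-grained series at scales 1..max_scale."""
    return [entropy_fn(coarse_grain(x, s)) for s in range(1, max_scale + 1)]

# Roughly 300 RR intervals in a 5-minute epoch: few points survive at scale 40.
rr = np.random.default_rng(4).normal(0.85, 0.04, 300)
print([len(coarse_grain(rr, s)) for s in (1, 10, 20, 40)])
# np.std below is only a placeholder to show the call pattern; it is not an
# entropy. Substitute a sample-entropy routine for real MSE curves.
print(np.round(mse_curve(rr, entropy_fn=np.std, max_scale=5), 4))
```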

  15. Stage call: Cardiovascular reactivity to audition stress in musicians

    PubMed Central

    Chanwimalueang, Theerasak; Aufegger, Lisa; Adjei, Tricia; Wasley, David; Cruder, Cinzia; Mandic, Danilo P.

    2017-01-01

    Auditioning is at the very center of educational and professional life in music and is associated with significant psychophysical demands. Knowledge of how these demands affect cardiovascular responses to psychosocial pressure is essential for developing strategies to both manage stress and understand optimal performance states. To this end, we recorded the electrocardiograms (ECGs) of 16 musicians (11 violinists and 5 flutists) before and during performances in both low- and high-stress conditions: with no audience and in front of an audition panel, respectively. The analysis consisted of the detection of R-peaks in the ECGs to extract heart rate variability (HRV) from the notoriously noisy real-world ECGs. Our data analysis approach spanned both standard (temporal and spectral) and advanced (structural complexity) techniques. The complexity science approaches—namely, multiscale sample entropy and multiscale fuzzy entropy—indicated a statistically significant decrease in structural complexity in HRV from the low- to the high-stress condition and an increase in structural complexity from the pre-performance to performance period, thus confirming the complexity loss theory and a loss in degrees of freedom due to stress. Results from the spectral analyses also suggest that the stress responses in the female participants were more parasympathetically driven than those of the male participants. In conclusion, our findings suggest that interventions to manage stress are best targeted at the sensitive pre-performance period, before an audition begins. PMID:28437466

  16. Determining distinct circuit in complete graphs using permutation

    NASA Astrophysics Data System (ADS)

    Karim, Sharmila; Ibrahim, Haslinda; Darus, Maizon Mohd

    2017-11-01

    The Half Butterfly Method (HBM) was introduced to construct the distinct circuits in complete graphs using the concept of isomorphism. The HBM has been applied in the field of combinatorics, for example in listing permutations of n elements. However, determining distinct circuits using the HBM becomes tedious for n > 4. Thus, in this paper, we present a method for generating distinct circuits using permutation.
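    For comparison with permutation-based generation in general (this is not the Half Butterfly Method itself), distinct Hamiltonian circuits of the complete graph K_n can be enumerated by fixing the starting vertex and discarding the reversed orientation of each circuit, giving (n-1)!/2 circuits:

```python
from itertools import permutations

def distinct_hamiltonian_circuits(n):
    """Distinct Hamiltonian circuits of K_n via permutations.

    The first vertex is fixed to remove rotational duplicates, and each circuit
    is kept only in its lexicographically smaller orientation, which yields
    (n-1)!/2 circuits for n >= 3.
    """
    circuits = []
    for perm in permutations(range(1, n)):
        if perm <= perm[::-1]:         # discard the reversed duplicate
            circuits.append((0,) + perm)
    return circuits

for c in distinct_hamiltonian_circuits(4):
    print(c)
print(len(distinct_hamiltonian_circuits(5)))   # (5-1)!/2 = 12
```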

  17. A Versatile Platform for Nanotechnology Based on Circular Permutation of a Chaperonin Protein

    NASA Technical Reports Server (NTRS)

    Paavola, Chad; McMillan, Andrew; Trent, Jonathan; Chan, Suzanne; Mazzarella, Kellen; Li, Yi-Fen

    2004-01-01

    A number of protein complexes have been developed as nanoscale templates. These templates can be functionalized using the peptide sequences that bind inorganic materials. However, it is difficult to integrate peptides into a specific position within a protein template. Integrating intact proteins with desirable binding or catalytic activities is an even greater challenge. We present a general method for modifying protein templates using circular permutation so that additional peptide sequence can be added in a wide variety of specific locations. Circular permutation is a reordering of the polypeptide chain such that the original termini are joined and new termini are created elsewhere in the protein. New sequence can be joined to the protein termini without perturbing the protein structure and with minimal limitation on the size and conformation of the added sequence. We have used this approach to modify a chaperonin protein template, placing termini at five different locations distributed across the surface of the protein complex. These permutants are competent to form the double-ring structures typical of chaperonin proteins. The permuted double-rings also form the same assemblies as the unmodified protein. We fused a fluorescent protein to two representative permutants and demonstrated that it assumes its active structure and does not interfere with assembly of chaperonin double-rings.

  18. Nonlinear complexity behaviors of agent-based 3D Potts financial dynamics with random environments

    NASA Astrophysics Data System (ADS)

    Xing, Yani; Wang, Jun

    2018-02-01

    A new microscopic 3D Potts interaction financial price model is established in this work to investigate the nonlinear complexity behaviors of stock markets. The 3D Potts model, which extends the 2D Potts model to three dimensions, is a cubic lattice model describing the interaction behavior among agents. In order to explore the complexity of real financial markets and the 3D Potts financial model, a new random coarse-grained Lempel-Ziv complexity is proposed for certain series, such as the price returns, the price volatilities, and the random time d-returns. Then the composite multiscale entropy (CMSE) method is applied to the intrinsic mode functions (IMFs) and the corresponding shuffled data to study the complexity behaviors. The empirical results indicate that the 3D financial model is feasible.
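    The paper's random coarse-grained Lempel-Ziv complexity is not reproduced here, but its basic ingredient can be sketched: binarize the series (for instance around its median) and count the number of new phrases met in a left-to-right dictionary parse (an LZ78-style count). The binarization rule and the use of simulated returns are assumptions for illustration.

```python
import numpy as np

def lz_phrase_count(symbols):
    """LZ78-style complexity: number of distinct phrases met scanning left to right."""
    phrases, phrase = set(), ""
    for s in symbols:
        phrase += s
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

# Binarize simulated returns around the median before counting phrases.
ret = np.random.default_rng(6).normal(size=2000)
bits = "".join("1" if r > np.median(ret) else "0" for r in ret)
print(lz_phrase_count(bits))
```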

  19. Volatility behavior of visibility graph EMD financial time series from Ising interacting system

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Wang, Jun; Fang, Wen

    2015-08-01

    A financial market dynamics model is developed and investigated using a stochastic Ising system, the Ising model being the most popular ferromagnetic model in statistical physics. Applying two graph-based analyses and the multiscale entropy method, we investigate and compare the statistical volatility behavior of return time series and the corresponding IMF series derived from the empirical mode decomposition (EMD) method. Real stock market indices are also studied comparatively against the simulation data of the proposed model. Further, we find that the degree distribution of the visibility graph for the simulation series has power-law tails, and the assortative network exhibits the mixing pattern property. All these features are in agreement with the real market data, and the research confirms that the financial model established by the Ising system is reasonable.
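    A natural visibility graph connects two samples whenever every intermediate sample lies strictly below the straight line joining them; network properties such as the degree distribution are then computed on this graph. A brute-force Python sketch (cubic worst case, fine for short series):

```python
import numpy as np

def visibility_graph_degrees(x):
    """Degree sequence of the natural visibility graph of a series.

    Points i and j are connected if every intermediate sample lies strictly
    below the straight line joining (i, x_i) and (j, x_j).
    """
    x = np.asarray(x, float)
    n = len(x)
    degree = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            t = np.arange(i + 1, j)
            line = x[j] + (x[i] - x[j]) * (j - t) / (j - i)
            if np.all(x[t] < line):       # empty range => adjacent points connect
                degree[i] += 1
                degree[j] += 1
    return degree

series = np.random.default_rng(5).normal(size=200)   # stand-in return series
print(visibility_graph_degrees(series)[:10])
```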

  20. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality that may not hold in small samples. Creation of a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
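
    The permutation approach to meta-regression can be sketched as follows; this is a simplified, hypothetical fixed-effect version (the study itself used backwards stepwise models), in which the covariate values are shuffled across trials to build a null distribution for the regression slope.

    ```python
    import numpy as np

    def weighted_slope(y, x, w):
        """Slope of a weighted least-squares regression of effect sizes y on covariate x."""
        X = np.column_stack([np.ones_like(x), x])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return beta[1]

    def permutation_meta_regression(effects, variances, covariate, n_perm=5000, seed=0):
        """Permutation p-value for one covariate: shuffle the covariate across
        trials, refit the weighted regression, and compare slopes."""
        rng = np.random.default_rng(seed)
        y = np.asarray(effects, dtype=float)
        x = np.asarray(covariate, dtype=float)
        w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
        observed = weighted_slope(y, x, w)
        null = np.array([weighted_slope(y, rng.permutation(x), w) for _ in range(n_perm)])
        p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
        return observed, p_value
    ```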

  1. Rank score and permutation testing alternatives for regression quantile estimates

    USGS Publications Warehouse

    Cade, B.S.; Richards, J.D.; Mielke, P.W.

    2006-01-01

    Performance of quantile rank score tests used for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1) was evaluated by simulation for models with p = 2 and 6 predictors, moderate collinearity among predictors, homogeneous and heterogeneous errors, small to moderate samples (n = 20–300), and central to upper quantiles (0.50–0.99). Test statistics evaluated were the conventional quantile rank score T statistic, distributed as a χ² random variable with q degrees of freedom (where q parameters are constrained by H0), and an F statistic with its sampling distribution approximated by permutation. The permutation F-test maintained better Type I errors than the T-test for homogeneous error models with smaller n and more extreme quantiles τ. An F distributional approximation of the F statistic provided some improvements in Type I errors over the T-test for models with > 2 parameters, smaller n, and more extreme quantiles, but not as much improvement as the permutation approximation. Both rank score tests required weighting to maintain correct Type I errors when heterogeneity under the alternative model increased to 5 standard deviations across the domain of X. A double permutation procedure was developed to provide valid Type I errors for the permutation F-test when null models were forced through the origin. Power was similar for conditions where both T- and F-tests maintained correct Type I errors, but the F-test provided some power at smaller n and extreme quantiles when the T-test had no power because of excessively conservative Type I errors. When the double permutation scheme was required for the permutation F-test to maintain valid Type I errors, power was less than for the T-test with decreasing sample size and increasing quantiles. Confidence intervals on parameters and tolerance intervals for future predictions were constructed based on test inversion for an example application relating trout densities to stream channel width:depth.

  2. The right thalamus may play an important role in anesthesia-awakening regulation in frogs

    PubMed Central

    Fan, Yanzhu; Yue, Xizi; Xue, Fei; Brauth, Steven E.; Tang, Yezhong

    2018-01-01

    Background Previous studies have shown that the mammalian thalamus is a key structure for anesthesia-induced unconsciousness and anesthesia-awakening regulation. However, both the dynamic characteristics and the probable lateralization of thalamic functioning during anesthesia-awakening regulation are not fully understood, and little is known of the evolutionary basis of the role of the thalamus in anesthesia-awakening regulation. Methods An amphibian species, the South African clawed frog (Xenopus laevis), was used in the present study. The frogs were immersed in tricaine methanesulfonate (MS-222) for general anesthesia. Electroencephalogram (EEG) signals were recorded continuously from both sides of the telencephalon, diencephalon (thalamus) and mesencephalon during the pre-anesthesia stage, administration stage, recovery stage and post-anesthesia stage. EEG data were analyzed, including calculation of approximate entropy (ApEn) and permutation entropy (PE). Results Both ApEn and PE values differed significantly between anesthesia stages, with the highest values occurring during the awakening period and the lowest values during the anesthesia period. There was a significant correlation between the stage durations and the ApEn or PE values during the anesthesia-awakening cycle, primarily for the right diencephalon (right thalamus). ApEn and PE values for females were significantly higher than those for males. Discussion ApEn and PE measurements are suitable for estimating depth of anesthesia and the complexity of amphibian brain activity. The right thalamus appears physiologically positioned to play an important role in anesthesia-awakening regulation in frogs, indicating an early evolutionary origin of the role of the thalamus in arousal and consciousness in land vertebrates. Sex differences exist in the neural regulation of general anesthesia in frogs. PMID:29576980
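
    Permutation entropy as used in this kind of EEG analysis reduces to counting ordinal patterns. The sketch below is a generic textbook-style implementation with illustrative defaults (embedding dimension and delay are free parameters), not the authors' code.

    ```python
    import numpy as np
    from math import factorial

    def permutation_entropy(signal, m=3, tau=1, normalize=True):
        """Permutation entropy of a 1-D signal: the Shannon entropy of the
        distribution of ordinal patterns of length m sampled with delay tau."""
        x = np.asarray(signal, dtype=float)
        n_patterns = len(x) - (m - 1) * tau
        counts = {}
        for i in range(n_patterns):
            pattern = tuple(np.argsort(x[i:i + m * tau:tau]))
            counts[pattern] = counts.get(pattern, 0) + 1
        p = np.array(list(counts.values()), dtype=float) / n_patterns
        h = -np.sum(p * np.log(p))
        return h / np.log(factorial(m)) if normalize else h
    ```

    With normalization the value lies in [0, 1]; lower values during deep anesthesia and higher values during wakefulness are the pattern reported in such studies.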

  3. Segment swapping aided the evolution of enzyme function: The case of uroporphyrinogen III synthase.

    PubMed

    Szilágyi, András; Györffy, Dániel; Závodszky, Péter

    2017-01-01

    In an earlier study, we showed that two-domain segment-swapped proteins can evolve by domain swapping and fusion, resulting in a protein with two linkers connecting its domains. We proposed that a potential evolutionary advantage of this topology may be the restriction of interdomain motions, which may facilitate domain closure by a hinge-like movement, crucial for the function of many enzymes. Here, we test this hypothesis computationally on uroporphyrinogen III synthase, a two-domain segment-swapped enzyme essential in porphyrin metabolism. To compare the interdomain flexibility between the wild-type, segment-swapped enzyme (having two interdomain linkers) and circular permutants of the same enzyme having only one interdomain linker, we performed geometric and molecular dynamics simulations for these species in their ligand-free and ligand-bound forms. We find that in the ligand-free form, interdomain motions in the wild-type enzyme are significantly more restricted than they would be with only one interdomain linker, while the flexibility difference is negligible in the ligand-bound form. We also estimated the entropy costs of ligand binding associated with the interdomain motions, and find that the change in domain connectivity due to segment swapping results in a reduction of this entropy cost, corresponding to ∼20% of the total ligand binding free energy. In addition, the restriction of interdomain motions may also help the functional domain-closure motion required for catalysis. This suggests that the evolution of the segment-swapped topology facilitated the evolution of enzyme function for this protein by influencing its dynamic properties. Proteins 2016; 85:46-53. © 2016 Wiley Periodicals, Inc.

  4. An analog scrambler for speech based on sequential permutations in time and frequency

    NASA Astrophysics Data System (ADS)

    Cox, R. V.; Jayant, N. S.; McDermott, B. J.

    Permutation of speech segments is an operation that is frequently used in the design of scramblers for analog speech privacy. In this paper, a sequential procedure for segment permutation is considered. This procedure can be extended to two-dimensional permutation of time segments and frequency bands. Subjective testing shows that this combination gives a residual intelligibility for spoken digits of 20 percent with a delay of 256 ms (a lower bound for this test would be 10 percent). The complexity of implementing such a system is considered, and the issues of synchronization and channel equalization are addressed. Computer simulation results for the system using both real and simulated channels are examined.

  5. A 1.375-approximation algorithm for sorting by transpositions.

    PubMed

    Elias, Isaac; Hartman, Tzvika

    2006-01-01

    Sorting permutations by transpositions is an important problem in genome rearrangements. A transposition is a rearrangement operation in which a segment is cut out of the permutation and pasted in a different location. The complexity of this problem is still open and it has been a 10-year-old open problem to improve the best known 1.5-approximation algorithm. In this paper, we provide a 1.375-approximation algorithm for sorting by transpositions. The algorithm is based on a new upper bound on the diameter of 3-permutations. In addition, we present some new results regarding the transposition diameter: we improve the lower bound for the transposition diameter of the symmetric group and determine the exact transposition diameter of simple permutations.

  6. Permutational distribution of the log-rank statistic under random censorship with applications to carcinogenicity assays.

    PubMed

    Heimann, G; Neuhaus, G

    1998-03-01

    In the random censorship model, the log-rank test is often used for comparing a control group with different dose groups. If the number of tumors is small, so-called exact methods are often applied for computing critical values from a permutational distribution. Two of these exact methods are discussed and shown to be incorrect. The correct permutational distribution is derived and studied with respect to its behavior under unequal censoring in the light of recent results proving that the permutational version and the unconditional version of the log-rank test are asymptotically equivalent even under unequal censoring. The log-rank test is studied by simulations of a realistic scenario from a bioassay with small numbers of tumors.

  7. Object-Based Land Use Classification of Agricultural Land by Coupling Multi-Temporal Spectral Characteristics and Phenological Events in Germany

    NASA Astrophysics Data System (ADS)

    Knoefel, Patrick; Loew, Fabian; Conrad, Christopher

    2015-04-01

    Crop maps based on classification of remotely sensed data are of increasing importance in agricultural management. This calls for more detailed knowledge about the reliability of such spatial information. However, classification of agricultural land use is often limited by high spectral similarities of the studied crop types. Moreover, spatially and temporally varying agro-ecological conditions can introduce confusion in crop mapping. Classification errors in crop maps may in turn influence model outputs, such as agricultural production monitoring. One major goal of the PhenoS project ("Phenological structuring to determine optimal acquisition dates for Sentinel-2 data for field crop classification") is the detection of optimal phenological time windows for land cover classification purposes. Since many crop species are spectrally highly similar, accurate classification requires the right selection of satellite images for a certain classification task. In the course of one growing season, phenological phases exist during which crops are separable with higher accuracies. For this purpose, coupling of multi-temporal spectral characteristics and phenological events is promising. The focus of this study is set on the separation of spectrally similar cereal crops like winter wheat, barley, and rye at two test sites in Germany called "Harz/Central German Lowland" and "Demmin". This study uses object-based random forest (RF) classification to investigate the impact of image acquisition frequency and timing on crop classification uncertainty by permuting all possible combinations of available RapidEye time series recorded on the test sites between 2010 and 2014. The permutations were applied to different segmentation parameters. Then, classification uncertainty was assessed and analysed based on the probabilistic soft output from the RF algorithm on a per-field basis. From this soft output, entropy was calculated as a spatial measure of classification uncertainty. The results indicate that uncertainty estimates provide a valuable addition to traditional accuracy assessments and help the user to localize errors in crop maps.

  8. A Computationally Efficient Hypothesis Testing Method for Epistasis Analysis using Multifactor Dimensionality Reduction

    PubMed Central

    Pattin, Kristine A.; White, Bill C.; Barney, Nate; Gui, Jiang; Nelson, Heather H.; Kelsey, Karl R.; Andrew, Angeline S.; Karagas, Margaret R.; Moore, Jason H.

    2008-01-01

    Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free data mining method for detecting, characterizing, and interpreting epistasis in the absence of significant main effects in genetic and epidemiologic studies of complex traits such as disease susceptibility. The goal of MDR is to change the representation of the data using a constructive induction algorithm to make nonadditive interactions easier to detect using any classification method such as naïve Bayes or logistic regression. Traditionally, MDR-constructed variables have been evaluated with a naïve Bayes classifier combined with 10-fold cross-validation to obtain an estimate of predictive accuracy or generalizability of epistasis models, and permutation testing has been used to statistically evaluate the significance of models obtained through MDR. The advantage of permutation testing is that it controls for false positives due to multiple testing. The disadvantage is that permutation testing is computationally expensive. This is an important issue that arises in the context of detecting epistasis on a genome-wide scale. The goal of the present study was to develop and evaluate several alternatives to large-scale permutation testing for assessing the statistical significance of MDR models. Using data simulated from 70 different epistasis models, we compared the power and type I error rate of MDR using a 1000-fold permutation test with hypothesis testing using an extreme value distribution (EVD). We find that this new hypothesis testing method provides a reasonable alternative to the computationally expensive 1000-fold permutation test and is 50 times faster. We then demonstrate this new method by applying it to a genetic epidemiology study of bladder cancer susceptibility that was previously analyzed using MDR and assessed using a 1000-fold permutation test. PMID:18671250
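
    The 1000-fold permutation test that the extreme value distribution is meant to replace is conceptually simple. The sketch below shows the generic label-permutation procedure for any cross-validated accuracy statistic; the accuracy_fn interface and names are illustrative assumptions, not MDR's API.

    ```python
    import numpy as np

    def permutation_test_accuracy(X, y, accuracy_fn, n_perm=1000, seed=0):
        """Generic label-permutation test: accuracy_fn(X, y) returns a
        cross-validated accuracy; shuffling y destroys any genotype-phenotype
        association, yielding the null distribution of the statistic."""
        rng = np.random.default_rng(seed)
        observed = accuracy_fn(X, y)
        null = np.array([accuracy_fn(X, rng.permutation(y)) for _ in range(n_perm)])
        # Add-one correction keeps the estimated p-value strictly positive.
        p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
        return observed, p_value
    ```

    The cost is n_perm full re-runs of the cross-validated classifier, which is exactly why faster analytic alternatives such as an EVD fit are attractive at genome-wide scale.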

  9. Estrogen pathway polymorphisms in relation to primary open angle glaucoma: An analysis accounting for gender from the United States

    PubMed Central

    Loomis, Stephanie J.; Weinreb, Robert N.; Kang, Jae H.; Yaspan, Brian L.; Bailey, Jessica Cooke; Gaasterland, Douglas; Gaasterland, Terry; Lee, Richard K.; Scott, William K.; Lichter, Paul R.; Budenz, Donald L.; Liu, Yutao; Realini, Tony; Friedman, David S.; McCarty, Catherine A.; Moroi, Sayoko E.; Olson, Lana; Schuman, Joel S.; Singh, Kuldev; Vollrath, Douglas; Wollstein, Gadi; Zack, Donald J.; Brilliant, Murray; Sit, Arthur J.; Christen, William G.; Fingert, John; Kraft, Peter; Zhang, Kang; Allingham, R. Rand; Pericak-Vance, Margaret A.; Richards, Julia E.; Hauser, Michael A.; Haines, Jonathan L.; Wiggs, Janey L.

    2013-01-01

    Purpose Circulating estrogen levels are relevant to glaucoma phenotypic traits. We assessed the association between an estrogen metabolism single nucleotide polymorphism (SNP) panel and primary open angle glaucoma (POAG), accounting for gender. Methods We included 3,108 POAG cases and 3,430 controls of both genders from the Glaucoma Genes and Environment (GLAUGEN) study and the National Eye Institute Glaucoma Human Genetics Collaboration (NEIGHBOR) consortium, genotyped on the Illumina 660W-Quad platform. We assessed the relation between the SNP panels representative of estrogen metabolism and POAG using pathway- and gene-based approaches with the Pathway Analysis by Randomization Incorporating Structure (PARIS) software. PARIS executes a permutation algorithm to assess statistical significance relative to pathways and genes of comparable genetic architecture. These analyses were performed using the meta-analyzed results from the GLAUGEN and NEIGHBOR data sets. We evaluated POAG overall as well as two subtypes of POAG defined by intraocular pressure (IOP) ≥22 mmHg (high-pressure glaucoma [HPG]) or IOP <22 mmHg (normal pressure glaucoma [NPG]) at diagnosis. We conducted these analyses for each gender separately and then jointly in men and women. Results Among women, the estrogen SNP pathway was associated with POAG overall (permuted p=0.006) and HPG (permuted p<0.001) but not NPG (permuted p=0.09). Interestingly, there was no relation between the estrogen SNP pathway and POAG when men were considered alone (permuted p>0.99). Among women, gene-based analyses revealed that the catechol-O-methyltransferase gene showed strong associations with HPG (permuted gene p≤0.001) and NPG (permuted gene p=0.01). Conclusions The estrogen SNP pathway was associated with POAG among women. PMID:23869166

  10. Error-free holographic frames encryption with CA pixel-permutation encoding algorithm

    NASA Astrophysics Data System (ADS)

    Li, Xiaowei; Xiao, Dan; Wang, Qiong-Hua

    2018-01-01

    The security of video data is essential in network transmission, and cryptography is the technique that makes video data secure and unreadable to unauthorized users. In this paper, we propose a holographic frames encryption technique based on a cellular automata (CA) pixel-permutation encoding algorithm. The concise pixel-permutation algorithm is used to address the drawbacks of traditional CA encoding methods. The effectiveness of the proposed video encoding method is demonstrated by simulation examples.

  11. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    PubMed

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense: 10⁴ up to 10⁸ or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scans with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm, using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, and performed 2×10⁵ permutations for a 2D QTL problem in 15 hours, using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.

  12. Photographs and Committees: Activities That Help Students Discover Permutations and Combinations.

    ERIC Educational Resources Information Center

    Szydlik, Jennifer Earles

    2000-01-01

    Presents problem situations that support students when discovering the multiplication principle, permutations, combinations, Pascal's triangle, and relationships among those objects in a concrete context. (ASK)

  13. A permutation characterization of Sturm global attractors of Hamiltonian type

    NASA Astrophysics Data System (ADS)

    Fiedler, Bernold; Rocha, Carlos; Wolfrum, Matthias

    We consider Neumann boundary value problems of the form u_t = u_xx + f on the interval 0 ⩽ x ⩽ π for dissipative nonlinearities f = f(u). A permutation characterization for the global attractors of the semiflows generated by these equations is well known, even in the much more general case f = f(x,u,u_x). We present a permutation characterization for the global attractors in the restrictive class of nonlinearities f = f(u). In this class the stationary solutions of the parabolic equation satisfy the second order ODE v'' + f(v) = 0 and we obtain the permutation characterization from a characterization of the set of 2π-periodic orbits of this planar Hamiltonian system. Our results are based on a diligent discussion of this mere pendulum equation.

  14. Multi-Scale Approach for Predicting Fish Species Distributions across Coral Reef Seascapes

    PubMed Central

    Pittman, Simon J.; Brown, Kerry A.

    2011-01-01

    Two of the major limitations to effective management of coral reef ecosystems are a lack of information on the spatial distribution of marine species and a paucity of data on the interacting environmental variables that drive distributional patterns. Advances in marine remote sensing, together with the novel integration of landscape ecology and advanced niche modelling techniques, provide an unprecedented opportunity to reliably model and map marine species distributions across many kilometres of coral reef ecosystems. We developed a multi-scale approach using three-dimensional seafloor morphology and across-shelf location to predict spatial distributions for five common Caribbean fish species. Seascape topography was quantified from high resolution bathymetry at five spatial scales (5–300 m radii) surrounding fish survey sites. Model performance and map accuracy were assessed for two high performing machine-learning algorithms: Boosted Regression Trees (BRT) and Maximum Entropy Species Distribution Modelling (MaxEnt). The three most important predictors were geographical location across the shelf, followed by a measure of topographic complexity. Predictor contribution differed among species, yet rarely changed across spatial scales. BRT provided ‘outstanding’ model predictions (AUC > 0.9) for three of five fish species. MaxEnt provided ‘outstanding’ model predictions for two of five species, with the remaining three models considered ‘excellent’ (AUC = 0.8–0.9). In contrast, MaxEnt spatial predictions were markedly more accurate (92% map accuracy) than BRT (68% map accuracy). We demonstrate that reliable spatial predictions for a range of key fish species can be achieved by modelling the interaction between the geographical location across the shelf and the topographic heterogeneity of seafloor structure. This multi-scale, analytic approach is an important new cost-effective tool to accurately delineate essential fish habitat and support conservation prioritization in marine protected area design, zoning in marine spatial planning, and ecosystem-based fisheries management. PMID:21637787

  15. Multi-scale approach for predicting fish species distributions across coral reef seascapes.

    PubMed

    Pittman, Simon J; Brown, Kerry A

    2011-01-01

    Two of the major limitations to effective management of coral reef ecosystems are a lack of information on the spatial distribution of marine species and a paucity of data on the interacting environmental variables that drive distributional patterns. Advances in marine remote sensing, together with the novel integration of landscape ecology and advanced niche modelling techniques, provide an unprecedented opportunity to reliably model and map marine species distributions across many kilometres of coral reef ecosystems. We developed a multi-scale approach using three-dimensional seafloor morphology and across-shelf location to predict spatial distributions for five common Caribbean fish species. Seascape topography was quantified from high resolution bathymetry at five spatial scales (5-300 m radii) surrounding fish survey sites. Model performance and map accuracy were assessed for two high performing machine-learning algorithms: Boosted Regression Trees (BRT) and Maximum Entropy Species Distribution Modelling (MaxEnt). The three most important predictors were geographical location across the shelf, followed by a measure of topographic complexity. Predictor contribution differed among species, yet rarely changed across spatial scales. BRT provided 'outstanding' model predictions (AUC > 0.9) for three of five fish species. MaxEnt provided 'outstanding' model predictions for two of five species, with the remaining three models considered 'excellent' (AUC = 0.8-0.9). In contrast, MaxEnt spatial predictions were markedly more accurate (92% map accuracy) than BRT (68% map accuracy). We demonstrate that reliable spatial predictions for a range of key fish species can be achieved by modelling the interaction between the geographical location across the shelf and the topographic heterogeneity of seafloor structure. This multi-scale, analytic approach is an important new cost-effective tool to accurately delineate essential fish habitat and support conservation prioritization in marine protected area design, zoning in marine spatial planning, and ecosystem-based fisheries management.

  16. PBOOST: a GPU-based tool for parallel permutation tests in genome-wide association studies.

    PubMed

    Yang, Guangyuan; Jiang, Wei; Yang, Qiang; Yu, Weichuan

    2015-05-01

    The importance of testing associations allowing for interactions has been demonstrated by Marchini et al. (2005). A fast method detecting associations allowing for interactions has been proposed by Wan et al. (2010a). The method is based on a likelihood ratio test with the assumption that the statistic follows the χ² distribution. Many single nucleotide polymorphism (SNP) pairs with significant associations allowing for interactions have been detected using their method. However, the χ² test requires the expected values in each cell of the contingency table to be at least five. This assumption is violated in some identified SNP pairs. In this case, the likelihood ratio test may not be applicable any more. A permutation test is an ideal approach to checking the P-values calculated in the likelihood ratio test because of its non-parametric nature. The P-values of SNP pairs having significant associations with disease are always extremely small. Thus, we need a huge number of permutations to achieve correspondingly high resolution for the P-values. In order to investigate whether the P-values from likelihood ratio tests are reliable, a fast permutation tool to accomplish a large number of permutations is desirable. We developed a permutation tool named PBOOST. It is GPU-based with highly reliable P-value estimation. By using simulation data, we found that the P-values from likelihood ratio tests will have relative error of >100% when 50% of the cells in the contingency table have expected count less than five or when there is a zero expected count in any of the contingency table cells. In terms of speed, PBOOST completed 10⁷ permutations for a single SNP pair from the Wellcome Trust Case Control Consortium (WTCCC) genome data (Wellcome Trust Case Control Consortium, 2007) within 1 min on a single Nvidia Tesla M2090 device, while it took 60 min on a single CPU Intel Xeon E5-2650 to finish the same task. More importantly, when simultaneously testing 256 SNP pairs for 10⁷ permutations, our tool took only 5 min, while the CPU program took 10 h. By permuting on a GPU cluster consisting of 40 nodes, we completed 10¹² permutations for all 280 SNP pairs reported with P-values smaller than 1.6 × 10⁻¹² in the WTCCC datasets in 1 week. The source code and sample data are available at http://bioinformatics.ust.hk/PBOOST.zip. gyang@ust.hk; eeyu@ust.hk Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
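
    When sparse contingency-table cells make the χ² approximation of the likelihood-ratio test doubtful, a direct permutation p-value can be computed as sketched below. This is a generic CPU-side illustration of the idea with assumed variable names, not PBOOST's GPU implementation.

    ```python
    import numpy as np

    def g_statistic(table):
        """Likelihood-ratio (G) statistic for independence in a contingency table."""
        t = np.asarray(table, dtype=float)
        expected = np.outer(t.sum(axis=1), t.sum(axis=0)) / t.sum()
        mask = t > 0
        return 2.0 * np.sum(t[mask] * np.log(t[mask] / expected[mask]))

    def permutation_pvalue(genotype_code, phenotype, n_perm=10_000, seed=0):
        """Permute case/control labels and recompute the statistic. genotype_code is
        a discrete code (e.g. the joint genotypes of a SNP pair), phenotype is 0/1."""
        rng = np.random.default_rng(seed)
        levels = np.unique(genotype_code)

        def build_table(pheno):
            return np.array([[np.sum((genotype_code == g) & (pheno == c)) for c in (0, 1)]
                             for g in levels])

        observed = g_statistic(build_table(phenotype))
        exceed = sum(g_statistic(build_table(rng.permutation(phenotype))) >= observed
                     for _ in range(n_perm))
        return (exceed + 1) / (n_perm + 1)
    ```

    Achieving p-value resolution near 10⁻¹² requires on the order of 10¹² such shuffles, which is why a GPU implementation is needed in practice.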

  17. Engineering calculations for solving the orbital allotment problem

    NASA Technical Reports Server (NTRS)

    Reilly, C.; Walton, E. K.; Mount-Campbell, C.; Caldecott, R.; Aebker, E.; Mata, F.

    1988-01-01

    Four approaches for calculating downlink interferences for shaped-beam antennas are described. An investigation of alternative mixed-integer programming models for satellite synthesis is summarized. Plans for coordinating the various programs developed under this grant are outlined. Two procedures for ordering satellites to initialize the k-permutation algorithm are proposed. Results are presented for the k-permutation algorithms. Feasible solutions are found for 5 of the 6 problems considered. Finally, it is demonstrated that the k-permutation algorithm can be used to solve arc allotment problems.

  18. Dynamic information routing in complex networks

    PubMed Central

    Kirst, Christoph; Timme, Marc; Battaglia, Demian

    2016-01-01

    Flexible information routing fundamentally underlies the function of many biological and artificial networks. Yet, how such systems may specifically communicate and dynamically route information is not well understood. Here we identify a generic mechanism to route information on top of collective dynamical reference states in complex networks. Switching between collective dynamics induces flexible reorganization of information sharing and routing patterns, as quantified by delayed mutual information and transfer entropy measures between activities of a network's units. We demonstrate the power of this mechanism specifically for oscillatory dynamics and analyse how individual unit properties, the network topology and external inputs co-act to systematically organize information routing. For multi-scale, modular architectures, we resolve routing patterns at all levels. Interestingly, local interventions within one sub-network may remotely determine nonlocal network-wide communication. These results help understanding and designing information routing patterns across systems where collective dynamics co-occurs with a communication function. PMID:27067257

  19. RESOLVE: A new algorithm for aperture synthesis imaging of extended emission in radio astronomy

    NASA Astrophysics Data System (ADS)

    Junklewitz, H.; Bell, M. R.; Selig, M.; Enßlin, T. A.

    2016-02-01

    We present resolve, a new algorithm for radio aperture synthesis imaging of extended and diffuse emission in total intensity. The algorithm is derived using Bayesian statistical inference techniques, estimating the surface brightness in the sky assuming a priori log-normal statistics. resolve estimates the measured sky brightness in total intensity, and the spatial correlation structure in the sky, which is used to guide the algorithm to an optimal reconstruction of extended and diffuse sources. During this process, the algorithm succeeds in deconvolving the effects of the radio interferometric point spread function. Additionally, resolve provides a map with an uncertainty estimate of the reconstructed surface brightness. Furthermore, with resolve we introduce a new, optimal visibility weighting scheme that can be viewed as an extension to robust weighting. In tests using simulated observations, the algorithm shows improved performance against two standard imaging approaches for extended sources, Multiscale-CLEAN and the Maximum Entropy Method.

  20. Research on fusion algorithm of polarization image in tetrolet domain

    NASA Astrophysics Data System (ADS)

    Zhang, Dexiang; Yuan, BaoHong; Zhang, Jingjing

    2015-12-01

    Tetrolets are Haar-type wavelets whose supports are tetrominoes, shapes made by connecting four equal-sized squares. A fusion method for polarization images based on the tetrolet transform is proposed. First, the magnitude-of-polarization image and the angle-of-polarization image are decomposed into low-frequency and high-frequency coefficients at multiple scales and directions using the tetrolet transform. The low-frequency coefficients are fused by averaging. For the directional high-frequency coefficients, the better coefficients are selected for fusion with a region spectrum entropy algorithm, according to the edge distribution differences among the high-frequency sub-band images. Finally, the fused image is obtained by applying the inverse tetrolet transform to the fused coefficients. Experimental results show that the proposed method can detect image features more effectively and that the fused image has a better subjective visual effect.

  1. Kullback-Leibler divergence measure of intermittency: Application to turbulence

    NASA Astrophysics Data System (ADS)

    Granero-Belinchón, Carlos; Roux, Stéphane G.; Garnier, Nicolas B.

    2018-01-01

    For generic systems exhibiting power law behaviors, and hence multiscale dependencies, we propose a simple tool to analyze multifractality and intermittency, after noticing that these concepts are directly related to the deformation of a probability density function from Gaussian at large scales to non-Gaussian at smaller scales. Our framework is based on information theory and uses Shannon entropy and Kullback-Leibler divergence. We provide an extensive application to three-dimensional fully developed turbulence, seen here as a paradigmatic complex system where intermittency was historically defined and the concepts of scale invariance and multifractality were extensively studied and benchmarked. We compute our quantity on experimental Eulerian velocity measurements, as well as on synthetic processes and phenomenological models of fluid turbulence. Our approach is very general and does not require any underlying model of the system, although it can probe the relevance of such a model.
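
    A bare-bones version of this entropy-based intermittency measurement can be sketched by comparing the histogram of increments at a given scale with a Gaussian of the same mean and variance. The code below is an illustrative simplification of that idea with assumed binning choices, not the authors' estimator.

    ```python
    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        """Discrete Kullback-Leibler divergence D(p || q) between two histograms."""
        p = p / p.sum()
        q = q / q.sum()
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / np.maximum(q[mask], eps))))

    def increment_kl_from_gaussian(signal, scale, bins=64):
        """Departure of the scale-dependent increment distribution from Gaussianity."""
        x = np.asarray(signal, dtype=float)
        incr = x[scale:] - x[:-scale]
        edges = np.linspace(incr.mean() - 5 * incr.std(),
                            incr.mean() + 5 * incr.std(), bins + 1)
        hist, _ = np.histogram(incr, bins=edges)
        centers = 0.5 * (edges[:-1] + edges[1:])
        gauss = np.exp(-0.5 * ((centers - incr.mean()) / incr.std()) ** 2)
        return kl_divergence(hist.astype(float), gauss)
    ```

    Evaluated over a range of scales, such a quantity stays near zero at large scales and grows toward small scales for an intermittent signal, which is the qualitative behavior the abstract describes.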

  2. Permutation invariant polynomial neural network approach to fitting potential energy surfaces. II. Four-atom systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jun; Jiang, Bin; Guo, Hua, E-mail: hguo@unm.edu

    2013-11-28

    A rigorous, general, and simple method to fit global and permutation invariant potential energy surfaces (PESs) using neural networks (NNs) is discussed. This so-called permutation invariant polynomial neural network (PIP-NN) method imposes permutation symmetry by using in its input a set of symmetry functions based on PIPs. For systems with more than three atoms, it is shown that the number of symmetry functions in the input vector needs to be larger than the number of internal coordinates in order to include both the primary and secondary invariant polynomials. This PIP-NN method is successfully demonstrated in three atom-triatomic reactive systems, resulting in full-dimensional global PESs with average errors on the order of meV. These PESs are used in full-dimensional quantum dynamical calculations.

  3. The Complexity of Standing Postural Sway Associates with Future Falls in Community-Dwelling Older Adults: The MOBILIZE Boston Study.

    PubMed

    Zhou, Junhong; Habtemariam, Daniel; Iloputaife, Ikechukwu; Lipsitz, Lewis A; Manor, Brad

    2017-06-07

    Standing postural control is complex, meaning that it is dependent upon numerous inputs interacting across multiple temporal-spatial scales. Diminished physiologic complexity of postural sway has been linked to reduced ability to adapt to stressors. We hypothesized that older adults with lower postural sway complexity would experience more falls in the future. 738 adults aged ≥70 years completed the Short Physical Performance Battery (SPPB) test and assessments of single- and dual-task standing postural control. Postural sway complexity was quantified using multiscale entropy. Falls were subsequently tracked for 48 months. Negative binomial regression demonstrated that older adults with lower postural sway complexity in both single- and dual-task conditions had a higher future fall rate (incident rate ratio (IRR) = 0.98, p = 0.02, 95% Confidence Limits (CL) = 0.96-0.99). Notably, participants in the lowest quintile of complexity during dual-task standing suffered 48% more falls during the four-year follow-up as compared to those in the highest quintile (IRR = 1.48, p = 0.01, 95% CL = 1.09-1.99). Conversely, traditional postural sway metrics and SPPB performance did not associate with future falls. As compared to traditional metrics, the degree of multi-scale complexity contained within standing postural sway, particularly during dual-task conditions, appears to be a better predictor of future falls in older adults.
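
    Multiscale entropy as used in this kind of sway analysis is, in essence, a coarse-graining loop around sample entropy. The following sketch follows the common Costa-style recipe with illustrative parameter choices; it is a generic implementation, not the study's actual pipeline.

    ```python
    import numpy as np

    def sample_entropy(x, m, r):
        """SampEn(m, r): negative log of the ratio of (m+1)-point to m-point
        template matches within absolute tolerance r (Chebyshev distance)."""
        x = np.asarray(x, dtype=float)
        n = len(x)

        def match_count(dim):
            templates = np.array([x[i:i + dim] for i in range(n - dim)])
            count = 0
            for i in range(len(templates) - 1):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(dist <= r)
            return count

        b, a = match_count(m), match_count(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def multiscale_entropy(signal, scales=range(1, 6), m=2, r_factor=0.15):
        """Coarse-grain the signal at each scale (non-overlapping means), then SampEn."""
        x = np.asarray(signal, dtype=float)
        r = r_factor * np.std(x)        # tolerance fixed from the original series
        values = []
        for s in scales:
            n_win = len(x) // s
            coarse = x[:n_win * s].reshape(n_win, s).mean(axis=1)
            values.append(sample_entropy(coarse, m=m, r=r))
        return np.array(values)
    ```

    The complexity index is often summarized as the sum (or area) of the entropy values across scales; lower index, in this study's framing, corresponds to a less adaptable postural control system.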

  4. Multiple comparisons permutation test for image based data mining in radiotherapy.

    PubMed

    Chen, Chun; Witte, Marnix; Heemsbergen, Wilma; van Herk, Marcel

    2013-12-23

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy.
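
    The max-statistic idea can be illustrated with a small sketch: a voxel-wise statistic is computed, outcome labels are permuted, and each voxel's adjusted p-value is read off the distribution of the per-permutation maximum. The code below is a generic illustration using a mean-difference statistic; the variable names and the choice of statistic are assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def tmax_permutation_test(dose_with, dose_without, n_perm=1000, seed=0):
        """dose_with, dose_without: arrays of shape (n_patients, n_voxels) holding
        dose distributions for patients with and without the outcome of interest."""
        rng = np.random.default_rng(seed)
        data = np.vstack([dose_with, dose_without])
        labels = np.array([1] * len(dose_with) + [0] * len(dose_without))

        def voxel_stat(lab):
            # Per-voxel mean dose difference between the two outcome groups.
            return data[lab == 1].mean(axis=0) - data[lab == 0].mean(axis=0)

        observed = voxel_stat(labels)
        tmax = np.array([np.abs(voxel_stat(rng.permutation(labels))).max()
                         for _ in range(n_perm)])
        # Adjusted p-value per voxel: how often the permutation maximum exceeds it.
        p_adjusted = (tmax[None, :] >= np.abs(observed)[:, None]).mean(axis=1)
        return observed, p_adjusted
    ```

    Because each voxel is compared against the distribution of the image-wide maximum, the resulting p-values are corrected for the multiple comparisons across voxels by construction.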

  5. Permutational symmetries for coincidence rates in multimode multiphotonic interferometry

    NASA Astrophysics Data System (ADS)

    Khalid, Abdullah; Spivak, Dylan; Sanders, Barry C.; de Guise, Hubert

    2018-06-01

    We obtain coincidence rates for passive optical interferometry by exploiting the permutational symmetries of partially distinguishable input photons, and our approach elucidates qualitative features of multiphoton coincidence landscapes. We treat the interferometer input as a product state of any number of photons in each input mode with photons distinguished by their arrival time. Detectors at the output of the interferometer count photons from each output mode over a long integration time. We generalize and prove the claim of Tillmann et al. [Phys. Rev. X 5, 041015 (2015), 10.1103/PhysRevX.5.041015] that coincidence rates can be elegantly expressed in terms of immanants. Immanants are functions of matrices that exhibit permutational symmetries and the immanants appearing in our coincidence-rate expressions share permutational symmetries with the input state. Our results are obtained by employing representation theory of the symmetric group to analyze systems of an arbitrary number of photons in arbitrarily sized interferometers.

  6. Complex-enhanced chaotic signals with time-delay signature suppression based on vertical-cavity surface-emitting lasers subject to chaotic optical injection

    NASA Astrophysics Data System (ADS)

    Chen, Jianjun; Duan, Yingni; Zhong, Zhuqiang

    2018-06-01

    A chaotic system is constructed on the basis of vertical-cavity surface-emitting lasers (VCSELs), in which a slave VCSEL is subject to chaotic optical injection (COI) from a master VCSEL with external feedback. The complex degree (CD) and time-delay signature (TDS) of chaotic signals generated by this chaotic system are investigated numerically via permutation entropy (PE) and self-correlation function (SF) methods, respectively. The results show that, compared with the master VCSEL subject to optical feedback, complex-enhanced chaotic signals with TDS suppression can be achieved for the S-VCSEL subject to COI. Meanwhile, the influences of several controllable parameters on the evolution maps of the CD of chaotic signals are carefully considered. It is shown that the CD of chaotic signals for the S-VCSEL is always higher than that for the M-VCSEL due to the COI effect. The TDS of chaotic signals can be significantly suppressed by choosing reasonable parameters in this system. Furthermore, TDS suppression and high CD chaos can be obtained simultaneously in specific parameter ranges. The results confirm that this chaotic system may effectively improve the security of a chaos-based communication scheme.

  7. Complex-enhanced chaotic signals with time-delay signature suppression based on vertical-cavity surface-emitting lasers subject to chaotic optical injection

    NASA Astrophysics Data System (ADS)

    Chen, Jianjun; Duan, Yingni; Zhong, Zhuqiang

    2018-03-01

    A chaotic system is constructed on the basis of vertical-cavity surface-emitting lasers (VCSELs), in which a slave VCSEL is subject to chaotic optical injection (COI) from a master VCSEL with external feedback. The complex degree (CD) and time-delay signature (TDS) of chaotic signals generated by this chaotic system are investigated numerically via permutation entropy (PE) and self-correlation function (SF) methods, respectively. The results show that, compared with the master VCSEL subject to optical feedback, complex-enhanced chaotic signals with TDS suppression can be achieved for the S-VCSEL subject to COI. Meanwhile, the influences of several controllable parameters on the evolution maps of the CD of chaotic signals are carefully considered. It is shown that the CD of chaotic signals for the S-VCSEL is always higher than that for the M-VCSEL due to the COI effect. The TDS of chaotic signals can be significantly suppressed by choosing reasonable parameters in this system. Furthermore, TDS suppression and high CD chaos can be obtained simultaneously in specific parameter ranges. The results confirm that this chaotic system may effectively improve the security of a chaos-based communication scheme.

  8. A novel algorithm for thermal image encryption.

    PubMed

    Hussain, Iqtadar; Anees, Amir; Algarni, Abdulmohsen

    2018-04-16

    Thermal images play a vital role at nuclear plants, power stations, forensic labs, biological research facilities, and petroleum extraction sites, so their security is very important. Image data has some unique features, such as intensity, contrast, homogeneity, entropy and correlation among pixels, which make image encryption trickier than other encryption tasks; with conventional image encryption schemes it is normally hard to handle these features. Therefore, cryptographers have paid attention to some attractive properties of chaotic maps, such as randomness and sensitivity, to build novel cryptosystems. That is why recently proposed image encryption techniques depend increasingly on the application of chaotic maps. This paper proposes an image encryption algorithm based on the Chebyshev chaotic map and substitution boxes drawn from the S8 symmetric group of permutations. Primarily, parameters of the chaotic Chebyshev map are chosen as a secret key to scramble the plain image. Then, the image is encrypted by the method generated from the substitution boxes and the Chebyshev map. By this process, we obtain a ciphertext image that is thoroughly confused and diffused. The outcomes of renowned experiments, key sensitivity tests and statistical analysis confirm that the proposed algorithm offers a safe and efficient approach for real-time image encryption.
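
    The confusion (pixel-permutation) stage of such a scheme can be sketched in a few lines: a key-dependent Chebyshev orbit is sorted and its ranking is used to scramble pixel positions. This is only a hedged illustration of the permutation step with made-up key values; it is not the proposed cipher, which additionally uses S8-based substitution boxes for diffusion.

    ```python
    import numpy as np

    def chebyshev_orbit(x0, k, length):
        """Chebyshev chaotic map x_{n+1} = cos(k * arccos(x_n)) on [-1, 1]."""
        seq = np.empty(length)
        x = x0
        for i in range(length):
            x = np.cos(k * np.arccos(x))
            seq[i] = x
        return seq

    def permute_pixels(image, x0=0.41, k=4.0):
        """Confusion stage only: scramble pixel positions with the ranking (argsort)
        of a key-driven chaotic sequence; (x0, k) act as the secret key."""
        flat = image.reshape(-1)
        order = np.argsort(chebyshev_orbit(x0, k, flat.size))
        return flat[order].reshape(image.shape), order

    def unpermute_pixels(cipher, order):
        """Invert the permutation given the same key-derived ordering."""
        flat = np.empty_like(cipher).reshape(-1)
        flat[order] = cipher.reshape(-1)
        return flat.reshape(cipher.shape)
    ```

    A permutation alone leaves the pixel histogram unchanged, which is precisely why a substitution stage is required on top of it in a full cipher.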

  9. A new feedback image encryption scheme based on perturbation with dynamical compound chaotic sequence cipher generator

    NASA Astrophysics Data System (ADS)

    Tong, Xiaojun; Cui, Minggen; Wang, Zhu

    2009-07-01

    The design of a new compound two-dimensional chaotic function is presented by exploiting two one-dimensional chaotic functions that switch randomly, and the design is used as a chaotic sequence generator, which is proved chaotic by Devaney's definition. The properties of the compound chaotic functions are also proved rigorously. In order to improve robustness against differential cryptanalysis and to produce an avalanche effect, a new feedback image encryption scheme is proposed using the new compound chaos by randomly selecting one of the two one-dimensional chaotic functions, and a new pixel permutation-and-substitution method is designed in detail with random control of array rows and columns based on the compound chaos. The results from entropy analysis, difference analysis, statistical analysis, sequence randomness analysis, and cipher sensitivity analysis with respect to key and plaintext show that the compound chaotic sequence cipher can resist cryptanalytic, statistical and brute-force attacks; in particular, it accelerates encryption speed and achieves a higher level of security. Through the dynamical compound chaos and perturbation technology, the paper addresses the problem of the low computational precision of one-dimensional chaotic functions.

  10. Higher order explicit symmetric integrators for inseparable forms of coordinates and momenta

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Wu, Xin; Huang, Guoqing; Liu, Fuyao

    2016-06-01

    Pihajoki proposed the extended phase-space second-order explicit symmetric leapfrog methods for inseparable Hamiltonian systems. On the basis of this work, we survey a critical problem on how to mix the variables in the extended phase space. Numerical tests show that sequent permutations of coordinates and momenta can make the leapfrog-like methods yield the most accurate results and the optimal long-term stabilized error behaviour. We also present a novel method to construct many fourth-order extended phase-space explicit symmetric integration schemes. Each scheme represents the symmetric production of six usual second-order leapfrogs without any permutations. This construction consists of four segments: the permuted coordinates, triple product of the usual second-order leapfrog without permutations, the permuted momenta and the triple product of the usual second-order leapfrog without permutations. Similarly, extended phase-space sixth, eighth and other higher order explicit symmetric algorithms are available. We used several inseparable Hamiltonian examples, such as the post-Newtonian approach of non-spinning compact binaries, to show that one of the proposed fourth-order methods is more efficient than the existing methods; examples include the fourth-order explicit symplectic integrators of Chin and the fourth-order explicit and implicit mixed symplectic integrators of Zhong et al. Given a moderate choice for the related mixing and projection maps, the extended phase-space explicit symplectic-like methods are well suited for various inseparable Hamiltonian problems. Samples of these problems involve the algorithmic regularization of gravitational systems with velocity-dependent perturbations in the Solar system and post-Newtonian Hamiltonian formulations of spinning compact objects.

  11. Non-parametric combination and related permutation tests for neuroimaging.

    PubMed

    Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E

    2016-04-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, including non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
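
    The synchronized-permutation logic behind non-parametric combination can be sketched compactly: every partial test is evaluated on the same shuffles, each shuffle gets a vector of partial p-values, and Tippett's rule takes their minimum. The sketch below is a generic illustration (the stat_fns interface is an assumption), not the toolbox implementation.

    ```python
    import numpy as np

    def npc_tippett(stat_fns, data, labels, n_perm=1000, seed=0):
        """Non-parametric combination with Tippett's (min-p) combining function.
        stat_fns: list of functions mapping (data, labels) to a scalar statistic,
        larger values meaning stronger evidence against the null."""
        rng = np.random.default_rng(seed)
        shuffles = [labels] + [rng.permutation(labels) for _ in range(n_perm)]
        # Row 0 holds the observed statistics; one row per synchronized shuffle.
        stats = np.array([[f(data, lab) for f in stat_fns] for lab in shuffles])
        # Partial p-value of each row within each test's permutation distribution.
        partial_p = np.array([[np.mean(stats[:, j] >= stats[b, j])
                               for j in range(stats.shape[1])]
                              for b in range(stats.shape[0])])
        combined = partial_p.min(axis=1)          # Tippett: smaller is stronger
        return float(np.mean(combined <= combined[0]))
    ```

    Because all partial tests share the same shuffles, their dependence structure is preserved automatically, which is what allows the combined p-value to remain valid without modeling the correlation between modalities.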

  12. Sylow p-groups of polynomial permutations on the integers mod p^n

    PubMed Central

    Frisch, Sophie; Krenn, Daniel

    2013-01-01

    We enumerate and describe the Sylow p-groups of the groups of polynomial permutations of the integers mod p^n for n ≥ 1 and of the pro-finite group which is the projective limit of these groups. PMID:26869732

  13. Storage and computationally efficient permutations of factorized covariance and square-root information arrays

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector stored Upper triangular Diagonal factorized covariance and vector stored upper triangular Square Root Information arrays is presented. The method involves cyclic permutation of the rows and columns of the arrays and retriangularization with fast (slow) Givens rotations (reflections). Minimal computation is performed, and a one dimensional scratch array is required. To make the method efficient for large arrays on a virtual memory machine, computations are arranged so as to avoid expensive paging faults. This method is potentially important for processing large volumes of radio metric data in the Deep Space Network.

  14. Note on new KLT relations

    NASA Astrophysics Data System (ADS)

    Feng, Bo; He, Song; Huang, Rijun; Jia, Yin

    2010-10-01

    In this short note, we present two results about KLT relations discussed in several recent papers. Our first result is the re-derivation of the Mason-Skinner MHV amplitude by applying the S_{n-3} permutation symmetric KLT relations directly to the MHV amplitude. Our second result is the equivalence proof of the newly discovered S_{n-2} permutation symmetric KLT relations and the well-known S_{n-3} permutation symmetric KLT relations. Although both formulas have been shown to be correct by BCFW recursion relations, our result is the first direct check using the regularized definition of the new formula.

  15. Combating HER2-overexpressing breast cancer through induction of calreticulin exposure by Tras-Permut CrossMab

    PubMed Central

    Zhang, Fan; Zhang, Jie; Liu, Moyan; Zhao, Lichao; LingHu, RuiXia; Feng, Fan; Gao, Xudong; Jiao, Shunchang; Zhao, Lei; Hu, Yi; Yang, Junlan

    2015-01-01

    Although trastuzumab has succeeded in breast cancer treatment, acquired resistance is one of the prime obstacles for breast cancer therapies. There is an urgent need to develop novel HER2 antibodies against trastuzumab resistance. Here, we first rationally designed avidity-improved trastuzumab and pertuzumab variants and explored the correlation between the binding avidity improvement and their antitumor activities. After characterization of a pertuzumab variant L56TY with potent antitumor activities, a bispecific immunoglobulin G-like CrossMab (Tras-Permut CrossMab) was generated from trastuzumab and the binding avidity-improved pertuzumab variant L56TY. Although the antitumor efficacy of trastuzumab was not enhanced by improving its binding avidity, binding avidity improvement could significantly increase the anti-proliferative and antibody-dependent cellular cytotoxicity (ADCC) activities of pertuzumab. Further studies showed that Tras-Permut CrossMab exhibited exceptionally high efficiency in inhibiting the progression of trastuzumab-resistant breast cancer. Notably, we found that calreticulin (CRT) exposure induced by Tras-Permut CrossMab was essential for induction of tumor-specific T cell immunity against tumor recurrence. These data indicated that simultaneous blockade of HER2 protein by Tras-Permut CrossMab could trigger CRT exposure and subsequently induce potent tumor-specific T cell immunity, suggesting it could be a promising therapeutic strategy against trastuzumab resistance. PMID:25949918

  16. The constructal law of design and evolution in nature

    PubMed Central

    Bejan, Adrian; Lorente, Sylvie

    2010-01-01

    Constructal theory is the view that (i) the generation of images of design (pattern, rhythm) in nature is a phenomenon of physics and (ii) this phenomenon is covered by a principle (the constructal law): ‘for a finite-size flow system to persist in time (to live) it must evolve such that it provides greater and greater access to the currents that flow through it’. This law is about the necessity of design to occur, and about the time direction of the phenomenon: the tape of the design evolution ‘movie’ runs such that existing configurations are replaced by globally easier flowing configurations. The constructal law has two useful sides: the prediction of natural phenomena and the strategic engineering of novel architectures, based on the constructal law, i.e. not by mimicking nature. We show that the emergence of scaling laws in inanimate (geophysical) flow systems is the same phenomenon as the emergence of allometric laws in animate (biological) flow systems. Examples are lung design, animal locomotion, vegetation, river basins, turbulent flow structure, self-lubrication and natural multi-scale porous media. This article outlines the place of the constructal law as a self-standing law in physics, which covers all the ad hoc (and contradictory) statements of optimality such as minimum entropy generation, maximum entropy generation, minimum flow resistance, maximum flow resistance, minimum time, minimum weight, uniform maximum stresses and characteristic organ sizes. Nature is configured to flow and move as a conglomerate of ‘engine and brake’ designs. PMID:20368252

  17. The constructal law of design and evolution in nature.

    PubMed

    Bejan, Adrian; Lorente, Sylvie

    2010-05-12

    Constructal theory is the view that (i) the generation of images of design (pattern, rhythm) in nature is a phenomenon of physics and (ii) this phenomenon is covered by a principle (the constructal law): 'for a finite-size flow system to persist in time (to live) it must evolve such that it provides greater and greater access to the currents that flow through it'. This law is about the necessity of design to occur, and about the time direction of the phenomenon: the tape of the design evolution 'movie' runs such that existing configurations are replaced by globally easier flowing configurations. The constructal law has two useful sides: the prediction of natural phenomena and the strategic engineering of novel architectures, based on the constructal law, i.e. not by mimicking nature. We show that the emergence of scaling laws in inanimate (geophysical) flow systems is the same phenomenon as the emergence of allometric laws in animate (biological) flow systems. Examples are lung design, animal locomotion, vegetation, river basins, turbulent flow structure, self-lubrication and natural multi-scale porous media. This article outlines the place of the constructal law as a self-standing law in physics, which covers all the ad hoc (and contradictory) statements of optimality such as minimum entropy generation, maximum entropy generation, minimum flow resistance, maximum flow resistance, minimum time, minimum weight, uniform maximum stresses and characteristic organ sizes. Nature is configured to flow and move as a conglomerate of 'engine and brake' designs.

  18. Stochastic dynamics and non-equilibrium thermodynamics of a bistable chemical system: the Schlögl model revisited.

    PubMed

    Vellela, Melissa; Qian, Hong

    2009-10-06

    Schlögl's model is the canonical example of a chemical reaction system that exhibits bistability. Because the biological examples of bistability and switching behaviour are increasingly numerous, this paper presents an integrated deterministic, stochastic and thermodynamic analysis of the model. After a brief review of the deterministic and stochastic modelling frameworks, the concepts of chemical and mathematical detailed balances are discussed and non-equilibrium conditions are shown to be necessary for bistability. Thermodynamic quantities such as the flux, chemical potential and entropy production rate are defined and compared across the two models. In the bistable region, the stochastic model exhibits an exchange of the global stability between the two stable states under changes in the pump parameters and volume size. The stochastic entropy production rate shows a sharp transition that mirrors this exchange. A new hybrid model that includes continuous diffusion and discrete jumps is suggested to deal with the multiscale dynamics of the bistable system. Accurate approximations of the exponentially small eigenvalue associated with the time scale of this switching and the full time-dependent solution are calculated using Matlab. A breakdown of previously known asymptotic approximations on small volume scales is observed through comparison with these and Monte Carlo results. Finally, in the appendix section is an illustration of how the diffusion approximation of the chemical master equation can fail to represent correctly the mesoscopically interesting steady-state behaviour of the system.
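
    The deterministic part of the analysis above reduces to a single cubic rate equation whose positive real roots give the steady states. The sketch below illustrates that step in Python under commonly cited parameter values (these values and the Euler integration are illustrative assumptions, not taken from the paper; the stochastic, master-equation and eigenvalue analyses are not reproduced).

```python
import numpy as np

# Deterministic rate equation of the Schlogl model
#   A + 2X <-> 3X,  B <-> X
#   dx/dt = k1*a*x**2 - k2*x**3 + k3*b - k4*x
# Parameter values below are illustrative, not taken from the paper.
k1, k2, k3, k4 = 3.0, 0.6, 0.25, 2.95
a, b = 1.0, 2.0   # pump parameters (held constant)

# Steady states are the real, non-negative roots of the cubic
coeffs = [-k2, k1 * a, -k4, k3 * b]          # -k2*x^3 + k1*a*x^2 - k4*x + k3*b
roots = np.roots(coeffs)
steady = sorted(r.real for r in roots if abs(r.imag) < 1e-9 and r.real >= 0)
print("steady states:", steady)              # three roots => bistable regime

# Simple Euler integration from two initial conditions to see which
# stable branch each trajectory settles on.
def simulate(x0, dt=1e-3, steps=20000):
    x = x0
    for _ in range(steps):
        x += dt * (k1 * a * x**2 - k2 * x**3 + k3 * b - k4 * x)
    return x

print("from x0=0.5 ->", simulate(0.5))
print("from x0=4.0 ->", simulate(4.0))
```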

  19. Emotion recognition based on physiological changes in music listening.

    PubMed

    Kim, Jonghwa; André, Elisabeth

    2008-12-01

    Little attention has been paid so far to physiological signals for emotion recognition compared to audiovisual emotion channels such as facial expression or speech. This paper investigates the potential of physiological signals as reliable channels for emotion recognition. All essential stages of an automatic recognition system are discussed, from the recording of a physiological dataset to a feature-based multiclass classification. In order to collect a physiological dataset from multiple subjects over many weeks, we used a musical induction method which spontaneously leads subjects to real emotional states, without any deliberate lab setting. Four-channel biosensors were used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by classification results. Classification of four musical emotions (positive/high arousal, negative/high arousal, negative/low arousal, positive/low arousal) is performed by using an extended linear discriminant analysis (pLDA). Furthermore, by exploiting a dichotomic property of the 2D emotion model, we develop a novel scheme of emotion-specific multilevel dichotomous classification (EMDC) and compare its performance with direct multiclass classification using the pLDA. Improved recognition accuracy of 95% and 70% for subject-dependent and subject-independent classification, respectively, is achieved by using the EMDC scheme.

  20. Non-extensive statistical analysis of magnetic field during the March 2012 ICME event using a multi-spacecraft approach

    NASA Astrophysics Data System (ADS)

    Pavlos, G. P.; Malandraki, O. E.; Pavlos, E. G.; Iliopoulos, A. C.; Karakatsanis, L. P.

    2016-12-01

    In this study we present some new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near Earth at L1 solar wind environment, as well as its effect in Earth's magnetosphere. The results are referred to Tsallis non-extensive statistics and in particular to the estimation of Tsallis q-triplet, (qstat, qsen, qrel) of magnetic field time series of the ICME observed at the Earth resulting from the solar eruptive activity on March 7, 2012 at the Sun. For this, we used a multi-spacecraft approach based on data experiments from ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis different time periods were considered, sorted as "quiet", "shock" and "aftershock", while different space domains such as the Interplanetary space (near Earth at L1 and upstream of the Earth's bowshock), the Earth's magnetosheath and magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the magnetic field dynamics both in time and space domains during the shock event, in terms of rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. So far, Tsallis non-extensive statistical theory and Tsallis extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal strong universality character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014a,b; Karakatsanis et al. 2013). Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long spatio-temporal correlations, fractional acceleration and Non-Equilibrium Stationary States (NESS) or non-equilibrium self-organization process and non-equilibrium phase transition and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results reveal clearly strong self-organization and development of macroscopic ordering of plasma system related to the strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasmas region during the CME event.
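
    For orientation, the q-entropy at the heart of this framework is S_q = (1 - Σ_i p_i^q)/(q - 1), which recovers the Boltzmann-Gibbs entropy as q → 1. The sketch below estimates S_q from a histogram of a synthetic heavy-tailed signal; it is a minimal illustration of the functional only, not the q-triplet (qstat, qsen, qrel) estimation performed in the study.

```python
import numpy as np

def tsallis_entropy(x, q, bins=64):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1) of a 1-D series,
    estimated from a normalized histogram (k_B = 1). q -> 1 recovers the
    Boltzmann-Gibbs / Shannon entropy."""
    p, _ = np.histogram(x, bins=bins, density=False)
    p = p[p > 0] / p.sum()
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Illustrative use on a synthetic heavy-tailed signal (stand-in for |B|)
rng = np.random.default_rng(0)
signal = rng.standard_t(df=3, size=10000)
for q in (0.5, 1.0, 1.5, 2.0):
    print(f"q = {q:.1f}  S_q = {tsallis_entropy(signal, q):.3f}")
```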

  1. Automatic NEPHIS Coding of Descriptive Titles for Permuted Index Generation.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    1982-01-01

    Describes a system for the automatic coding of most descriptive titles which generates Nested Phrase Indexing System (NEPHIS) input strings of sufficient quality for permuted index production. A series of examples and an 11-item reference list accompany the text. (JL)

  2. Non‐parametric combination and related permutation tests for neuroimaging

    PubMed Central

    Webster, Matthew A.; Brooks, Jonathan C.; Tracey, Irene; Smith, Stephen M.; Nichols, Thomas E.

    2016-01-01

    Abstract In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well‐known definition of union‐intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume‐based representations of the brain, including non‐imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non‐parametric combination (NPC) methodology, such that instead of a two‐phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one‐way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. Hum Brain Mapp 37:1486‐1511, 2016. © 2016 Wiley Periodicals, Inc. PMID:26848101
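
    A minimal sketch of the Tippett combining idea under synchronized permutations is given below. It uses asymptotic Welch t-test p-values as the partial tests for brevity, which is a simplification of the fully rank-based NPC algorithm described in the paper; the data and group sizes are invented for illustration.

```python
import numpy as np
from scipy import stats

def tippett_combined_pvalue(y1, y2, groups, n_perm=5000, seed=0):
    """Joint two-group test over two 'modalities' y1, y2 using Tippett's
    combining function T = min(p1, p2). Group labels are permuted
    synchronously for both modalities; the combined p-value is the fraction
    of permutations whose T is at least as extreme (small) as observed.
    Partial p-values come from Welch t-tests for brevity, a simplification
    of the fully non-parametric NPC recipe."""
    rng = np.random.default_rng(seed)
    groups = np.asarray(groups, bool)

    def tippett_stat(g):
        p1 = stats.ttest_ind(y1[g], y1[~g], equal_var=False).pvalue
        p2 = stats.ttest_ind(y2[g], y2[~g], equal_var=False).pvalue
        return min(p1, p2)

    t_obs = tippett_stat(groups)
    count = 0
    for _ in range(n_perm):
        count += tippett_stat(rng.permutation(groups)) <= t_obs
    return (count + 1) / (n_perm + 1)

# Illustrative data: modality 1 carries a group effect, modality 2 does not.
rng = np.random.default_rng(1)
g = np.repeat([True, False], 20)
y1 = rng.normal(0, 1, 40) + 0.8 * g
y2 = rng.normal(0, 1, 40)
print("combined p =", tippett_combined_pvalue(y1, y2, g))
```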

  3. Creation of a Ligand-Dependent Enzyme by Fusing Circularly Permuted Antibody Variable Region Domains.

    PubMed

    Iwai, Hiroto; Kojima-Misaizu, Miki; Dong, Jinhua; Ueda, Hiroshi

    2016-04-20

    Allosteric control of enzyme activity with exogenous substances has been hard to achieve, especially using antibody domains that potentially allow control by any antigens of choice. Here, in order to attain this goal, we developed a novel antibody variable region format introduced with circular permutations, called Clampbody. The two variable-region domains of the anti-bone Gla protein (BGP) antibody were each circularly permutated to have novel termini at the loops near their domain interface. Through their attachment to the N- and C-termini of a circularly permutated TEM-1 β-lactamase (cpBLA), we created a molecular switch that responds to the antigen peptide. The fusion protein specifically recognized the antigen, and in the presence of some detergent or denaturant, its catalytic activity was enhanced up to 4.7-fold in an antigen-dependent manner, due to increased resistance to these reagents. Hence, Clampbody will be a powerful tool for the allosteric regulation of enzyme and other protein activities and especially useful to design robust biosensors.

  4. Multiple comparisons permutation test for image based data mining in radiotherapy

    PubMed Central

    2013-01-01

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy. PMID:24365155
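
    A generic max-statistic permutation test of this kind can be sketched as follows (a single-step max-T adjustment over voxels). The group sizes, image shapes and number of permutations below are illustrative assumptions, not the study's settings.

```python
import numpy as np
from scipy import stats

def tmax_permutation_test(dose_a, dose_b, n_perm=2000, seed=0):
    """dose_a, dose_b: arrays of shape (n_patients, n_voxels) holding the
    (registered) dose images of two outcome groups. Returns the voxelwise
    t-map and family-wise-error adjusted p-values based on the permutation
    distribution of Tmax = max_v |t_v| (single-step max-T adjustment)."""
    rng = np.random.default_rng(seed)
    data = np.vstack([dose_a, dose_b])
    labels = np.array([True] * len(dose_a) + [False] * len(dose_b))

    def t_map(lab):
        return stats.ttest_ind(data[lab], data[~lab], axis=0,
                               equal_var=False).statistic

    t_obs = t_map(labels)
    tmax_null = np.empty(n_perm)
    for i in range(n_perm):
        tmax_null[i] = np.max(np.abs(t_map(rng.permutation(labels))))

    # Adjusted p-value per voxel: how often the permuted Tmax exceeds |t_v|
    p_adj = (1 + (tmax_null[:, None] >= np.abs(t_obs)[None, :]).sum(0)) / (n_perm + 1)
    return t_obs, p_adj

# Toy example: two groups of "images" with an effect in the first 10 voxels
rng = np.random.default_rng(2)
A = rng.normal(0, 1, (15, 100)); A[:, :10] += 1.2
B = rng.normal(0, 1, (15, 100))
t, p = tmax_permutation_test(A, B)
print("significant voxels:", np.where(p < 0.05)[0])
```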

  5. Quantum one-way permutation over the finite field of two elements

    NASA Astrophysics Data System (ADS)

    de Castro, Alexandre

    2017-06-01

    In quantum cryptography, a one-way permutation is a bounded unitary operator U: H → H on a Hilbert space H that is easy to compute on every input, but hard to invert given the image of a random input. Levin (Probl Inf Transm 39(1):92-103, 2003) has conjectured that the unitary transformation g(a,x) = (a, f(x) + ax), where f is any length-preserving function and a, x ∈ GF(2^‖x‖), is an information-theoretically secure operator within a polynomial factor. Here, we show that Levin's one-way permutation is provably secure because its output values are four maximally entangled two-qubit states, and the probability of factoring them approaches zero faster than the multiplicative inverse of any positive polynomial poly(x) over the Boolean ring of all subsets of x. Our results demonstrate through well-known theorems that existence of classical one-way functions implies existence of a universal quantum one-way permutation that cannot be inverted in subexponential time in the worst case.

  6. Discrete Bat Algorithm for Optimal Problem of Permutation Flow Shop Scheduling

    PubMed Central

    Luo, Qifang; Zhou, Yongquan; Xie, Jian; Ma, Mingzhi; Li, Liangliang

    2014-01-01

    A discrete bat algorithm (DBA) is proposed for the optimal permutation flow shop scheduling problem (PFSP). Firstly, the discrete bat algorithm is constructed based on the idea of the basic bat algorithm, which divides the whole scheduling problem into many sub-scheduling problems; the NEH heuristic is then introduced to solve each sub-scheduling problem. Secondly, some subsequences are operated on with a certain probability in the pulse emission and loudness phases. An intensive virtual population neighborhood search is integrated into the discrete bat algorithm to further improve the performance. Finally, the experimental results show the suitability and efficiency of the present discrete bat algorithm for the optimal permutation flow shop scheduling problem. PMID:25243220
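
    The NEH heuristic mentioned above is simple to state: order jobs by decreasing total processing time and insert each one at the position that minimizes the partial makespan. The sketch below shows that building block on a toy instance (processing times invented for illustration); the bat-algorithm layers (pulse emission, loudness, neighborhood search) are not reproduced.

```python
import numpy as np

def makespan(seq, p):
    """Makespan of a job sequence. p[j][m] = processing time of job j on
    machine m; machines are visited in the same order by every job."""
    n_m = p.shape[1]
    c = np.zeros(n_m)
    for j in seq:
        c[0] += p[j, 0]
        for m in range(1, n_m):
            c[m] = max(c[m], c[m - 1]) + p[j, m]
    return c[-1]

def neh(p):
    """NEH heuristic: order jobs by decreasing total processing time, then
    insert each job at the position that minimizes the partial makespan."""
    order = np.argsort(-p.sum(axis=1))
    seq = [order[0]]
    for j in order[1:]:
        candidates = [seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)]
        seq = min(candidates, key=lambda s: makespan(s, p))
    return seq, makespan(seq, p)

# Small illustrative instance: 5 jobs x 3 machines
times = np.array([[5, 9, 8],
                  [9, 3, 10],
                  [9, 4, 5],
                  [4, 8, 8],
                  [3, 5, 6]])
print(neh(times))
```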

  7. Discrete bat algorithm for optimal problem of permutation flow shop scheduling.

    PubMed

    Luo, Qifang; Zhou, Yongquan; Xie, Jian; Ma, Mingzhi; Li, Liangliang

    2014-01-01

    A discrete bat algorithm (DBA) is proposed for the optimal permutation flow shop scheduling problem (PFSP). Firstly, the discrete bat algorithm is constructed based on the idea of the basic bat algorithm, which divides the whole scheduling problem into many sub-scheduling problems; the NEH heuristic is then introduced to solve each sub-scheduling problem. Secondly, some subsequences are operated on with a certain probability in the pulse emission and loudness phases. An intensive virtual population neighborhood search is integrated into the discrete bat algorithm to further improve the performance. Finally, the experimental results show the suitability and efficiency of the present discrete bat algorithm for the optimal permutation flow shop scheduling problem.

  8. Levels of Conceptual Development in Melodic Permutation Concepts Based on Piaget's Theory

    ERIC Educational Resources Information Center

    Larn, Ronald L.

    1973-01-01

    Article considered different ways in which subjects at different age levels solved a musical task involving melodic permutation. The differences in responses to the musical task between age groups were judged to be compatible with Piaget's theory of cognitive development. (Author/RK)

  9. In Response to Rowland on "Realism and Debateability in Policy Advocacy."

    ERIC Educational Resources Information Center

    Herbeck, Dale A.; Katsulas, John P.

    1986-01-01

    Argues that Robert Rowland has overstated the case against the permutation process for assessing counterplan competitiveness. Claims that the permutation standard is a viable method for ascertaining counterplan competitiveness. Examines Rowland's alternative and argues that it is an unsatisfactory method for determining counterplan…

  10. EPEPT: A web service for enhanced P-value estimation in permutation tests

    PubMed Central

    2011-01-01

    Background In computational biology, permutation tests have become a widely used tool to assess the statistical significance of an event under investigation. However, the common way of computing the P-value, which expresses the statistical significance, requires a very large number of permutations when small (and thus interesting) P-values are to be accurately estimated. This is computationally expensive and often infeasible. Recently, we proposed an alternative estimator, which requires far fewer permutations compared to the standard empirical approach while still reliably estimating small P-values [1]. Results The proposed P-value estimator has been enriched with additional functionalities and is made available to the general community through a public website and web service, called EPEPT. This means that the EPEPT routines can be accessed not only via a website, but also programmatically using any programming language that can interact with the web. Examples of web service clients in multiple programming languages can be downloaded. Additionally, EPEPT accepts data of various common experiment types used in computational biology. For these experiment types EPEPT first computes the permutation values and then performs the P-value estimation. Finally, the source code of EPEPT can be downloaded. Conclusions Different types of users, such as biologists, bioinformaticians and software engineers, can use the method in an appropriate and simple way. Availability http://informatics.systemsbiology.net/EPEPT/ PMID:22024252
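
    The general idea behind such tail-aware estimators is to model the upper tail of the permutation null with a generalized Pareto distribution rather than relying on raw counts. The sketch below contrasts the standard empirical estimate with a GPD tail fit; the threshold choice and the exact estimator wrapped by EPEPT may differ, so treat this as an assumption-laden illustration rather than a re-implementation of the service.

```python
import numpy as np
from scipy import stats

def tail_approx_pvalue(observed, null_stats, n_exceed=250):
    """Estimate P(T >= observed) from a finite permutation null by fitting a
    generalized Pareto distribution (GPD) to the largest n_exceed null
    statistics, instead of simple counting. Illustrative sketch only."""
    null_stats = np.sort(np.asarray(null_stats))
    n = len(null_stats)
    k = min(n_exceed, n - 1)
    thresh = null_stats[n - k - 1]
    excess = null_stats[n - k:] - thresh
    # Standard empirical estimate (with +1 correction) for comparison
    p_emp = (np.sum(null_stats >= observed) + 1) / (n + 1)
    if observed <= thresh:
        return p_emp, p_emp
    shape, loc, scale = stats.genpareto.fit(excess, floc=0.0)
    p_gpd = (k / n) * stats.genpareto.sf(observed - thresh, shape, loc=0.0, scale=scale)
    return p_emp, p_gpd

# Toy null from 10,000 permuted statistics; the observed value sits far out.
rng = np.random.default_rng(3)
null = rng.normal(0, 1, 10000)
print(tail_approx_pvalue(4.2, null))
```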

  11. Non-Extensive Statistical Analysis of Magnetic Field and SEPs during the March 2012 ICME event, using a multi-spacecraft approach

    NASA Astrophysics Data System (ADS)

    Pavlos, George; Malandraki, Olga; Pavlos, Evgenios; Iliopoulos, Aggelos; Karakatsanis, Leonidas

    2017-04-01

    As the solar plasma lives far from equilibrium it is an excellent laboratory for testing non-equilibrium statistical mechanics. In this study, we present the highlights of Tsallis non-extensive statistical mechanics as concerns their applications at solar plasma dynamics, especially at solar wind phenomena and magnetosphere. In this study we present some new and significant results concerning the dynamics of interplanetary coronal mass ejections (ICMEs) observed in the near Earth at L1 solar wind environment, as well as its effect in Earth's magnetosphere. The results are referred to Tsallis non-extensive statistics and in particular to the estimation of Tsallis q-triplet, (qstat, qsen, qrel) of SEPs time series observed at the interplanetary space and magnetic field time series of the ICME observed at the Earth resulting from the solar eruptive activity on March 7, 2012 at the Sun. For the magnetic field, we used a multi-spacecraft approach based on data experiments from ACE, CLUSTER 4, THEMIS-E and THEMIS-C spacecraft. For the data analysis different time periods were considered, sorted as "quiet", "shock" and "aftershock", while different space domains such as the Interplanetary space (near Earth at L1 and upstream of the Earth's bowshock), the Earth's magnetosheath and magnetotail, were also taken into account. Our results reveal significant differences in statistical and dynamical features, indicating important variations of the SEPs profile in time, and magnetic field dynamics both in time and space domains during the shock event, in terms of rate of entropy production, relaxation dynamics and non-equilibrium meta-stable stationary states. So far, Tsallis non-extensive statistical theory and Tsallis extension of the Boltzmann-Gibbs entropy principle to the q-entropy principle (Tsallis, 1988, 2009) reveal strong universality character concerning non-equilibrium dynamics (Pavlos et al. 2012a,b, 2014, 2015, 2016; Karakatsanis et al. 2013). Tsallis q-entropy principle can explain the emergence of a series of new and significant physical characteristics in distributed systems as well as in space plasmas. Such characteristics are: non-Gaussian statistics and anomalous diffusion processes, strange and fractional dynamics, multifractal, percolating and intermittent turbulence structures, multiscale and long spatio-temporal correlations, fractional acceleration and Non-Equilibrium Stationary States (NESS) or non-equilibrium self-organization process and non-equilibrium phase transition and topological phase transition processes according to Zelenyi and Milovanov (2004). In this direction, our results reveal clearly strong self-organization and development of macroscopic ordering of plasma system related to the strengthening of non-extensivity, multifractality and intermittency everywhere in the space plasmas region during the CME event. Acknowledgements: This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 637324.

  12. Land use/land cover mapping using multi-scale texture processing of high resolution data

    NASA Astrophysics Data System (ADS)

    Wong, S. N.; Sarker, M. L. R.

    2014-02-01

    Land use/land cover (LULC) maps are useful for many purposes, and for a long time remote sensing techniques have been used for LULC mapping using different types of data and image processing techniques. In this research, high resolution satellite data from IKONOS was used to perform land use/land cover mapping in Johor Bahru city and adjacent areas (Malaysia). Spatial image processing was carried out using the six texture algorithms (mean, variance, contrast, homogeneity, entropy, and GLDV angular second moment) with five different window sizes (from 3×3 to 11×11). Three different classifiers, i.e. Maximum Likelihood Classifier (MLC), Artificial Neural Network (ANN) and Support Vector Machine (SVM), were used to classify the texture parameters of different spectral bands individually and all bands together using the same training and validation samples. Results indicated that texture parameters of all bands together generally showed a better performance (overall accuracy = 90.10%) for LULC mapping; however, a single spectral band could only achieve an overall accuracy of 72.67%. This research also found an improvement of the overall accuracy (OA) using a single-texture multi-scale approach (OA = 89.10%) and a single-scale multi-texture approach (OA = 90.10%) compared with all original bands (OA = 84.02%) because of the complementary information from different bands and different texture algorithms. On the other hand, all three classifiers showed high accuracy when using different texture approaches, but SVM generally showed higher accuracy (90.10%) compared to MLC (89.10%) and ANN (89.67%), especially for the complex classes such as urban and road.
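
    Of the texture measures listed, the occurrence-based ones (mean, variance, entropy) can be computed directly in a sliding window, as sketched below for a single band; the GLCM-based contrast, homogeneity and GLDV ASM require an additional co-occurrence step that is omitted here. The window size and grey-level quantization are illustrative assumptions.

```python
import numpy as np

def window_texture(band, win=7, levels=32):
    """Per-pixel mean, variance and occurrence entropy computed in a
    win x win neighbourhood of a single-band image (values scaled to
    `levels` grey levels). Border pixels are skipped for brevity."""
    q = np.floor((band - band.min()) / (np.ptp(band) + 1e-12) * (levels - 1)).astype(int)
    h = win // 2
    rows, cols = band.shape
    mean = np.zeros_like(band, dtype=float)
    var = np.zeros_like(band, dtype=float)
    ent = np.zeros_like(band, dtype=float)
    for r in range(h, rows - h):
        for c in range(h, cols - h):
            patch = q[r - h:r + h + 1, c - h:c + h + 1].ravel()
            mean[r, c] = patch.mean()
            var[r, c] = patch.var()
            p = np.bincount(patch, minlength=levels) / patch.size
            p = p[p > 0]
            ent[r, c] = -np.sum(p * np.log2(p))
    return mean, var, ent

# Stack the texture maps (possibly at several window sizes) with the
# spectral bands and feed them to a classifier such as an SVM.
img = np.random.default_rng(4).random((64, 64))
m, v, e = window_texture(img, win=7)
print(m.shape, v.mean(), e.mean())
```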

  13. Introduction to Permutation and Resampling-Based Hypothesis Tests

    ERIC Educational Resources Information Center

    LaFleur, Bonnie J.; Greevy, Robert A.

    2009-01-01

    A resampling-based method of inference--permutation tests--is often used when distributional assumptions are questionable or unmet. Not only are these methods useful for obvious departures from parametric assumptions (e.g., normality) and small sample sizes, but they are also more robust than their parametric counterparts in the presence of…

  14. Explorations in Statistics: Permutation Methods

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2012-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eighth installment of "Explorations in Statistics" explores permutation methods, empiric procedures we can use to assess an experimental result--to test a null hypothesis--when we are reluctant to trust statistical…

  15. Relating brain signal variability to knowledge representation.

    PubMed

    Heisz, Jennifer J; Shedden, Judith M; McIntosh, Anthony R

    2012-11-15

    We assessed the hypothesis that brain signal variability is a reflection of functional network reconfiguration during memory processing. In the present experiments, we use multiscale entropy to capture the variability of human electroencephalogram (EEG) while manipulating the knowledge representation associated with faces stored in memory. Across two experiments, we observed increased variability as a function of greater knowledge representation. In Experiment 1, individuals with greater familiarity for a group of famous faces displayed more brain signal variability. In Experiment 2, brain signal variability increased with learning after multiple experimental exposures to previously unfamiliar faces. The results demonstrate that variability increases with face familiarity; cognitive processes during the perception of familiar stimuli may engage a broader network of regions, which manifests as higher complexity/variability in spatial and temporal domains. In addition, effects of repetition suppression on brain signal variability were observed, and the pattern of results is consistent with a selectivity model of neural adaptation. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.

  16. Multi-scale dynamics and relaxation of a tethered membrane in a solvent by Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Pandey, Ras; Anderson, Kelly; Farmer, Barry

    2006-03-01

    A tethered membrane modeled by a flexible sheet dissipates entropy as it wrinkles and crumples. Nodes of a coarse grained membrane are connected via multiple pathways for dynamical modes to propagate. We consider a sheet with nodes connected by fluctuating bonds on a cubic lattice. The empty lattice sites constitute an effective solvent medium via node-solvent interaction. Each node executes its stochastic motion with the Metropolis algorithm subject to bond fluctuations, excluded volume constraints, and interaction energy. Dynamics and conformation of the sheet are examined at a low and a high temperature with attractive and repulsive node-node interactions for the contrast in an attractive solvent medium. Variations of the mean square displacement of the center node of the sheet and that of its center of mass with the time steps are examined in detail, which show different power-law motion from short to long time regimes. Relaxation of the gyration radius and scaling of its asymptotic value with the molecular weight are examined.

  17. Complexity of cardiac signals for predicting changes in alpha-waves after stress in patients undergoing cardiac catheterization

    NASA Astrophysics Data System (ADS)

    Chiu, Hung-Chih; Lin, Yen-Hung; Lo, Men-Tzung; Tang, Sung-Chun; Wang, Tzung-Dau; Lu, Hung-Chun; Ho, Yi-Lwun; Ma, Hsi-Pin; Peng, Chung-Kang

    2015-08-01

    The hierarchical interaction between electrical signals of the brain and heart is not fully understood. We hypothesized that the complexity of cardiac electrical activity can be used to predict changes in encephalic electricity after stress. Most methods for analyzing the interaction between the heart rate variability (HRV) and electroencephalography (EEG) require a computation-intensive mathematical model. To overcome these limitations and increase the predictive accuracy of human relaxing states, we developed a method to test our hypothesis. In addition to routine linear analysis, multiscale entropy and detrended fluctuation analysis of the HRV were used to quantify nonstationary and nonlinear dynamic changes in the heart rate time series. Short-time Fourier transform was applied to quantify the power of EEG. The clinical, HRV, and EEG parameters of postcatheterization EEG alpha waves were analyzed using change-score analysis and generalized additive models. In conclusion, the complexity of cardiac electrical signals can be used to predict EEG changes after stress.

  18. Singular value decomposition based feature extraction technique for physiological signal analysis.

    PubMed

    Chang, Cheng-Ding; Wang, Chien-Chih; Jiang, Bernard C

    2012-06-01

    Multiscale entropy (MSE) is one of the popular techniques to calculate and describe the complexity of the physiological signal. Many studies use this approach to detect changes in the physiological conditions in the human body. However, MSE results are easily affected by noise and trends, leading to incorrect estimation of MSE values. In this paper, singular value decomposition (SVD) is adopted to replace MSE to extract the features of physiological signals, and adopt the support vector machine (SVM) to classify the different physiological states. A test data set based on the PhysioNet website was used, and the classification results showed that using SVD to extract features of the physiological signal could attain a classification accuracy rate of 89.157%, which is higher than that using the MSE value (71.084%). The results show the proposed analysis procedure is effective and appropriate for distinguishing different physiological states. This promising result could be used as a reference for doctors in diagnosis of congestive heart failure (CHF) disease.

  19. The Slug and Churn Turbulence Characteristics of Oil-Gas-Water Flows in a Vertical Small Pipe

    NASA Astrophysics Data System (ADS)

    Liu, Weixin; Han, Yunfeng; Wang, Dayang; Zhao, An; Jin, Ningde

    2017-08-01

    The intention of the present study was to investigate the slug and churn turbulence characteristics of a vertical upward oil-gas-water three-phase flow. We firstly carried out a vertical upward oil-gas-water three-phase flow experiment in a 20-mm inner diameter (ID) pipe to measure the fluctuating signals of a rotating electric field conductance sensor under different flow patterns. Afterwards, typical flow patterns were identified with the aid of the texture structures in a cross recurrence plot. Recurrence quantitative analysis and multi-scale cross entropy (MSCE) algorithms were applied to investigate the turbulence characteristics of slug and churn flows with the varying flow parameters. The results suggest that with cross nonlinear analysis, the underlying dynamic characteristics in the evolution from slug to churn flow can be well understood. The present study provides a novel perspective for the analysis of the spatial-temporal evolution instability and complexity in oil-gas-water three-phase flow.

  20. Complexity of cardiac signals for predicting changes in alpha-waves after stress in patients undergoing cardiac catheterization

    PubMed Central

    Chiu, Hung-Chih; Lin, Yen-Hung; Lo, Men-Tzung; Tang, Sung-Chun; Wang, Tzung-Dau; Lu, Hung-Chun; Ho, Yi-Lwun; Ma, Hsi-Pin; Peng, Chung-Kang

    2015-01-01

    The hierarchical interaction between electrical signals of the brain and heart is not fully understood. We hypothesized that the complexity of cardiac electrical activity can be used to predict changes in encephalic electricity after stress. Most methods for analyzing the interaction between the heart rate variability (HRV) and electroencephalography (EEG) require a computation-intensive mathematical model. To overcome these limitations and increase the predictive accuracy of human relaxing states, we developed a method to test our hypothesis. In addition to routine linear analysis, multiscale entropy and detrended fluctuation analysis of the HRV were used to quantify nonstationary and nonlinear dynamic changes in the heart rate time series. Short-time Fourier transform was applied to quantify the power of EEG. The clinical, HRV, and EEG parameters of postcatheterization EEG alpha waves were analyzed using change-score analysis and generalized additive models. In conclusion, the complexity of cardiac electrical signals can be used to predict EEG changes after stress. PMID:26286628

  1. Quantifying climatic controls on river network topology across scales

    NASA Astrophysics Data System (ADS)

    Ranjbar Moshfeghi, S.; Hooshyar, M.; Wang, D.; Singh, A.

    2017-12-01

    Branching structure of river networks is an important topologic and geomorphologic feature that depends on several factors (e.g. climate, tectonic). However, mechanisms that cause these drainage patterns in river networks are poorly understood. In this study, we investigate the effects of varying climatic forcing on river network topology and geomorphology. For this, we select 20 catchments across the United States with different long-term climatic conditions quantified by climate aridity index (AI), defined here as the ratio of mean annual potential evaporation (Ep) to precipitation (P), capturing variation in runoff and vegetation cover. The river networks of these catchments are extracted, using a curvature-based method, from high-resolution (1 m) digital elevation models and several metrics such as drainage density, branching angle, and width functions are computed. We also use a multiscale-entropy-based approach to quantify the topologic irregularity and structural richness of these river networks. Our results reveal systematic impacts of climate forcing on the structure of river networks.

  2. Heartbeat Complexity Modulation in Bipolar Disorder during Daytime and Nighttime.

    PubMed

    Nardelli, Mimma; Lanata, Antonio; Bertschy, Gilles; Scilingo, Enzo Pasquale; Valenza, Gaetano

    2017-12-20

    This study reports on the complexity modulation of heartbeat dynamics in patients affected by bipolar disorder. In particular, a multiscale entropy analysis was applied to the R-R interval series, that were derived from electrocardiographic (ECG) signals for a group of nineteen subjects comprised of eight patients and eleven healthy control subjects. They were monitored using a textile-based sensorized t-shirt during the day and overnight for a total of 47 diurnal and 27 nocturnal recordings. Patients showed three different mood states: depression, hypomania and euthymia. Results show a clear loss of complexity during depressive and hypomanic states as compared to euthymic and healthy control states. In addition, we observed that a more significant complexity modulation among healthy and pathological mood states occurs during the night. These findings suggest that bipolar disorder is associated with an enhanced sleep-related dysregulation of the Autonomic Nervous System (ANS) activity, and that heartbeat complex dynamics may serve as a viable marker of pathological conditions in mental health.

  3. A supervised learning approach for Crohn's disease detection using higher-order image statistics and a novel shape asymmetry measure.

    PubMed

    Mahapatra, Dwarikanath; Schueffler, Peter; Tielbeek, Jeroen A W; Buhmann, Joachim M; Vos, Franciscus M

    2013-10-01

    Increasing incidence of Crohn's disease (CD) in the Western world has made its accurate diagnosis an important medical challenge. The current reference standard for diagnosis, colonoscopy, is time-consuming and invasive while magnetic resonance imaging (MRI) has emerged as the preferred noninvasive procedure over colonoscopy. Current MRI approaches assess rate of contrast enhancement and bowel wall thickness, and rely on extensive manual segmentation for accurate analysis. We propose a supervised learning method for the identification and localization of regions in abdominal magnetic resonance images that have been affected by CD. Low-level features like intensity and texture are used with shape asymmetry information to distinguish between diseased and normal regions. Particular emphasis is laid on a novel entropy-based shape asymmetry method and higher-order statistics like skewness and kurtosis. Multi-scale feature extraction renders the method robust. Experiments on real patient data show that our features achieve a high level of accuracy and perform better than two competing methods.

  4. NASA Thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The NASA Thesaurus -- Volume 2, Access Vocabulary -- contains an alphabetical listing of all Thesaurus terms (postable and nonpostable) and permutations of all multiword and pseudo-multiword terms. Also included are Other Words (non-Thesaurus terms) consisting of abbreviations, chemical symbols, etc. The permutations and Other Words provide 'access' to the appropriate postable entries in the Thesaurus.

  5. A Permutation Test for Correlated Errors in Adjacent Questionnaire Items

    ERIC Educational Resources Information Center

    Hildreth, Laura A.; Genschel, Ulrike; Lorenz, Frederick O.; Lesser, Virginia M.

    2013-01-01

    Response patterns are of importance to survey researchers because of the insight they provide into the thought processes respondents use to answer survey questions. In this article we propose the use of structural equation modeling to examine response patterns and develop a permutation test to quantify the likelihood of observing a specific…

  6. The Parity Theorem Shuffle

    ERIC Educational Resources Information Center

    Smith, Michael D.

    2016-01-01

    The Parity Theorem states that any permutation can be written as a product of transpositions, but no permutation can be written as a product of both an even number and an odd number of transpositions. Most proofs of the Parity Theorem take several pages of mathematical formalism to complete. This article presents an alternative but equivalent…

  7. Heuristic Implementation of Dynamic Programming for Matrix Permutation Problems in Combinatorial Data Analysis

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie

    2008-01-01

    Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…

  8. permGPU: Using graphics processing units in RNA microarray association studies.

    PubMed

    Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros

    2010-06-16

    Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, that are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.

  9. Testing for the Presence of Correlation Changes in a Multivariate Time Series: A Permutation Based Approach.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva

    2018-01-15

    Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs either at par or better than the state-of-the art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
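
    A stripped-down version of the idea - running correlations in sliding windows, a single candidate change point chosen by minimizing within-phase variance, and a permutation null obtained by destroying the temporal order - is sketched below. It is not the kernel-based KCP implementation of the paper; the window length and the K = 1 restriction are simplifying assumptions.

```python
import numpy as np

def running_correlation(x, y, win=25):
    """Pearson correlation of x and y in consecutive sliding windows."""
    return np.array([np.corrcoef(x[i:i + win], y[i:i + win])[0, 1]
                     for i in range(len(x) - win + 1)])

def variance_drop(r):
    """Best reduction in within-phase variance when splitting the running
    correlation series r into two phases (K = 1 change point)."""
    total = np.var(r) * len(r)
    best = total
    for k in range(2, len(r) - 2):
        best = min(best, np.var(r[:k]) * k + np.var(r[k:]) * (len(r) - k))
    return total - best

def kcp_style_permutation_test(x, y, win=25, n_perm=200, seed=0):
    rng = np.random.default_rng(seed)
    obs = variance_drop(running_correlation(x, y, win))
    null = []
    for _ in range(n_perm):
        idx = rng.permutation(len(x))        # destroy temporal order jointly
        null.append(variance_drop(running_correlation(x[idx], y[idx], win)))
    return (np.sum(np.array(null) >= obs) + 1) / (n_perm + 1)

# Toy series whose correlation jumps from ~0 to ~0.8 halfway through.
rng = np.random.default_rng(5)
n = 300
x = rng.normal(size=n)
y = rng.normal(size=n)
y[n // 2:] = 0.8 * x[n // 2:] + 0.6 * rng.normal(size=n // 2)
print("p =", kcp_style_permutation_test(x, y))
```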

  10. A new EEG synchronization strength analysis method: S-estimator based normalized weighted-permutation mutual information.

    PubMed

    Cui, Dong; Pu, Weiting; Liu, Jing; Bian, Zhijie; Li, Qiuli; Wang, Lei; Gu, Guanghua

    2016-10-01

    Synchronization is an important mechanism for understanding information processing in normal or abnormal brains. In this paper, we propose a new method called normalized weighted-permutation mutual information (NWPMI) for double variable signal synchronization analysis and combine NWPMI with S-estimator measure to generate a new method named S-estimator based normalized weighted-permutation mutual information (SNWPMI) for analyzing multi-channel electroencephalographic (EEG) synchronization strength. The performances including the effects of time delay, embedding dimension, coupling coefficients, signal to noise ratios (SNRs) and data length of the NWPMI are evaluated by using Coupled Henon mapping model. The results show that the NWPMI is superior in describing the synchronization compared with the normalized permutation mutual information (NPMI). Furthermore, the proposed SNWPMI method is applied to analyze scalp EEG data from 26 amnestic mild cognitive impairment (aMCI) subjects and 20 age-matched controls with normal cognitive function, who both suffer from type 2 diabetes mellitus (T2DM). The proposed methods NWPMI and SNWPMI are suggested to be an effective index to estimate the synchronization strength. Copyright © 2016 Elsevier Ltd. All rights reserved.
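
    The building block underneath such measures is mutual information between ordinal (permutation) symbol sequences. The sketch below computes a plain, unweighted permutation mutual information between two channels; the weighting, normalization and S-estimator aggregation that define NWPMI and SNWPMI are not included, and the test signals are synthetic stand-ins for EEG.

```python
import numpy as np
from math import factorial
from itertools import permutations

def ordinal_symbols(x, dim=3, tau=1):
    """Map a series to ordinal-pattern symbols of embedding dimension `dim`."""
    lookup = {p: i for i, p in enumerate(permutations(range(dim)))}
    n = len(x) - (dim - 1) * tau
    return np.array([lookup[tuple(np.argsort(x[i:i + dim * tau:tau]))]
                     for i in range(n)])

def permutation_mutual_information(x, y, dim=3, tau=1):
    """Mutual information (bits) between the ordinal symbol sequences of two
    signals; a plain, unweighted variant of permutation MI."""
    sx, sy = ordinal_symbols(x, dim, tau), ordinal_symbols(y, dim, tau)
    n_sym = factorial(dim)
    joint = np.zeros((n_sym, n_sym))
    for a, b in zip(sx, sy):
        joint[a, b] += 1
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return np.sum(joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz]))

# Two coupled noisy sinusoids as stand-ins for a pair of EEG channels.
t = np.linspace(0, 10, 2000)
rng = np.random.default_rng(6)
x = np.sin(2 * np.pi * t) + 0.3 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * t + 0.4) + 0.3 * rng.normal(size=t.size)
print("PMI =", permutation_mutual_information(x, y))
```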

  11. Sorting signed permutations by inversions in O(nlogn) time.

    PubMed

    Swenson, Krister M; Rajan, Vaibhav; Lin, Yu; Moret, Bernard M E

    2010-03-01

    The study of genomic inversions (or reversals) has been a mainstay of computational genomics for nearly 20 years. After the initial breakthrough of Hannenhalli and Pevzner, who gave the first polynomial-time algorithm for sorting signed permutations by inversions, improved algorithms have been designed, culminating with an optimal linear-time algorithm for computing the inversion distance and a subquadratic algorithm for providing a shortest sequence of inversions--also known as sorting by inversions. Remaining open was the question of whether sorting by inversions could be done in O(nlogn) time. In this article, we present a qualified answer to this question, by providing two new sorting algorithms, a simple and fast randomized algorithm and a deterministic refinement. The deterministic algorithm runs in time O(nlogn + kn), where k is a data-dependent parameter. We provide the results of extensive experiments showing that both the average and the standard deviation for k are small constants, independent of the size of the permutation. We conclude (but do not prove) that almost all signed permutations can be sorted by inversions in O(nlogn) time.
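
    A compact way to see where the inversion distance comes from is the breakpoint lower bound: each reversal can remove at most two breakpoints, so d(π) ≥ b(π)/2. The sketch below counts breakpoints of a signed permutation and applies a reversal; it is a didactic illustration only, not the subquadratic or O(nlogn) algorithms discussed in the paper, and the example permutation is invented.

```python
def breakpoints(perm):
    """Number of breakpoints of a signed permutation, after framing it with
    0 and n+1. Adjacent pair (a, b) is a breakpoint unless b == a + 1."""
    n = len(perm)
    framed = [0] + list(perm) + [n + 1]
    return sum(1 for a, b in zip(framed, framed[1:]) if b != a + 1)

def reversal(perm, i, j):
    """Reverse the segment perm[i..j] (inclusive) and flip its signs."""
    seg = [-x for x in reversed(perm[i:j + 1])]
    return perm[:i] + seg + perm[j + 1:]

pi = [3, -5, 8, -6, 4, -7, 9, 2, 1, 10, -11]
print("breakpoints before:", breakpoints(pi))
# One reversal removes at most two breakpoints, which is the basis of the
# classical lower bound d(pi) >= breakpoints(pi) / 2.
print("after reversing positions 1..3:", reversal(pi, 1, 3),
      "->", breakpoints(reversal(pi, 1, 3)))
```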

  12. EXPLICIT SYMPLECTIC-LIKE INTEGRATORS WITH MIDPOINT PERMUTATIONS FOR SPINNING COMPACT BINARIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Junjie; Wu, Xin; Huang, Guoqing

    2017-01-01

    We refine the recently developed fourth-order extended phase space explicit symplectic-like methods for inseparable Hamiltonians using Yoshida’s triple product combined with a midpoint permuted map. The midpoint between the original variables and their corresponding extended variables at every integration step is readjusted as the initial values of the original variables and their corresponding extended ones at the next step integration. The triple-product construction is apparently superior to the composition of two triple products in computational efficiency. Above all, the new midpoint permutations are more effective in restraining the equality of the original variables and their corresponding extended ones at each integration step than the existing sequent permutations of momenta and coordinates. As a result, our new construction shares the benefit of implicit symplectic integrators in the conservation of the second post-Newtonian Hamiltonian of spinning compact binaries. Especially for the chaotic case, it can work well, but the existing sequent permuted algorithm cannot. When dissipative effects from the gravitational radiation reaction are included, the new symplectic-like method has a secular drift in the energy error of the dissipative system for the orbits that are regular in the absence of radiation, as an implicit symplectic integrator does. In spite of this, it is superior to the same-order implicit symplectic integrator in accuracy and efficiency. The new method is particularly useful in discussing the long-term evolution of inseparable Hamiltonian problems.

  13. A studentized permutation test for three-arm trials in the 'gold standard' design.

    PubMed

    Mütze, Tobias; Konietschke, Frank; Munk, Axel; Friede, Tim

    2017-03-15

    The 'gold standard' design for three-arm trials refers to trials with an active control and a placebo control in addition to the experimental treatment group. This trial design is recommended when being ethically justifiable and it allows the simultaneous comparison of experimental treatment, active control, and placebo. Parametric testing methods have been studied plentifully over the past years. However, these methods often tend to be liberal or conservative when distributional assumptions are not met particularly with small sample sizes. In this article, we introduce a studentized permutation test for testing non-inferiority and superiority of the experimental treatment compared with the active control in three-arm trials in the 'gold standard' design. The performance of the studentized permutation test for finite sample sizes is assessed in a Monte Carlo simulation study under various parameter constellations. Emphasis is put on whether the studentized permutation test meets the target significance level. For comparison purposes, commonly used Wald-type tests, which do not make any distributional assumptions, are included in the simulation study. The simulation study shows that the presented studentized permutation test for assessing non-inferiority in three-arm trials in the 'gold standard' design outperforms its competitors, for instance the test based on a quasi-Poisson model, for count data. The methods discussed in this paper are implemented in the R package ThreeArmedTrials which is available on the comprehensive R archive network (CRAN). Copyright © 2016 John Wiley & Sons, Ltd.

  14. Combat casualties undergoing lifesaving interventions have decreased heart-rate complexity at multiple time scales

    PubMed Central

    Cancio, Leopoldo C.; Batchinsky, Andriy I.; Baker, William L.; Necsoiu, Corina; Salinas, José; Goldberger, Ary L.; Costa, Madalena D.

    2013-01-01

    Purpose We found that heart-rate (HR) complexity metrics, such as sample entropy (SampEn), identified trauma patients receiving lifesaving interventions (LSIs). We now aimed: 1) to test a new multiscale entropy (MSE) index; 2) to compare it to single-scale measures including SampEn; and 3) to assess different parameter values for calculation of SampEn and MSE. Methods This was a study of combat casualties in an Emergency Department (ED) in Iraq. ECGs of 70 acutely injured adults were recorded. Twelve underwent LSIs and 58 did not. LSIs included endotracheal intubation (9); tube thoracostomy (9); and emergency transfusion (4). From each ECG, a segment of 800 consecutive beats was selected. Off-line, R waves were detected and R-to-R (RR) interval time series were generated. SampEn, MSE, and time-domain measures of HR variability (mean HR, standard deviation, pNN20, pNN50, rMSSD) were computed. Results Differences in mean HR (LSI=111±33, NonLSI=90±17) were not significant. Systolic arterial pressure was statistically but not clinically different (LSI=123±19, NonLSI=135±19). SampEn (LSI=0.90±0.42, NonLSI=1.19±0.35, p<0.05) and MSE index (LSI = 2.58±2.55, NonLSI=5.67±2.48, p<0.001) differed significantly. Conclusions Complexity of HR dynamics over a range of time scales was lower in high-risk than in low-risk combat casualties and outperformed traditional vital signs. PMID:24140167
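
    For reference, sample entropy and a coarse-graining multiscale extension can be sketched as below, with the MSE index taken here as the mean of SampEn over the chosen scales. The parameters (m = 2, r = 0.15 SD, scales 1-5) are common defaults rather than the study's exact settings, and r is recomputed from each coarse-grained series, a convention that varies between implementations.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.15):
    """SampEn(m, r) of a 1-D series; r is given as a fraction of the SD."""
    x = np.asarray(x, float)
    tol = r * x.std()

    def count_matches(dim):
        templ = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        c = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += np.sum(d <= tol)
        return c

    b, a = count_matches(m), count_matches(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

def coarse_grain(x, scale):
    n = len(x) // scale
    return np.asarray(x[:n * scale], float).reshape(n, scale).mean(axis=1)

def mse_index(x, scales=range(1, 6)):
    """Multiscale entropy curve and its mean over the chosen scales."""
    curve = [sample_entropy(coarse_grain(x, s)) for s in scales]
    return curve, float(np.mean(curve))

rr = np.random.default_rng(7).normal(0.8, 0.05, 800)   # synthetic RR series
curve, index = mse_index(rr)
print("SampEn per scale:", np.round(curve, 3), " MSE index:", round(index, 3))
```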

  15. Network complexity as a measure of information processing across resting-state networks: evidence from the Human Connectome Project

    PubMed Central

    McDonough, Ian M.; Nashiro, Kaoru

    2014-01-01

    An emerging field of research focused on fluctuations in brain signals has provided evidence that the complexity of those signals, as measured by entropy, conveys important information about network dynamics (e.g., local and distributed processing). While much research has focused on how neural complexity differs in populations with different age groups or clinical disorders, substantially less research has focused on the basic understanding of neural complexity in populations with young and healthy brain states. The present study used resting-state fMRI data from the Human Connectome Project (Van Essen et al., 2013) to test the extent that neural complexity in the BOLD signal, as measured by multiscale entropy (1) would differ from random noise, (2) would differ between four major resting-state networks previously associated with higher-order cognition, and (3) would be associated with the strength and extent of functional connectivity—a complementary method of estimating information processing. We found that complexity in the BOLD signal exhibited different patterns of complexity from white, pink, and red noise and that neural complexity was differentially expressed between resting-state networks, including the default mode, cingulo-opercular, left and right frontoparietal networks. Lastly, neural complexity across all networks was negatively associated with functional connectivity at fine scales, but was positively associated with functional connectivity at coarse scales. The present study is the first to characterize neural complexity in BOLD signals at a high temporal resolution and across different networks and might help clarify the inconsistencies between neural complexity and functional connectivity, thus informing the mechanisms underlying neural complexity. PMID:24959130

  16. A systems biology approach to studying Tai Chi, physiological complexity and healthy aging: design and rationale of a pragmatic randomized controlled trial.

    PubMed

    Wayne, Peter M; Manor, Brad; Novak, Vera; Costa, Madalena D; Hausdorff, Jeffrey M; Goldberger, Ary L; Ahn, Andrew C; Yeh, Gloria Y; Peng, C-K; Lough, Matthew; Davis, Roger B; Quilty, Mary T; Lipsitz, Lewis A

    2013-01-01

    Aging is typically associated with progressive multi-system impairment that leads to decreased physical and cognitive function and reduced adaptability to stress. Due to its capacity to characterize complex dynamics within and between physiological systems, the emerging field of complex systems biology and its array of quantitative tools show great promise for improving our understanding of aging, monitoring senescence, and providing biomarkers for evaluating novel interventions, including promising mind-body exercises, that treat age-related disease and promote healthy aging. An ongoing, two-arm randomized clinical trial is evaluating the potential of Tai Chi mind-body exercise to attenuate age-related loss of complexity. A total of 60 Tai Chi-naïve healthy older adults (aged 50-79) are being randomized to either six months of Tai Chi training (n=30), or to a waitlist control receiving unaltered usual medical care (n=30). Our primary outcomes are complexity-based measures of heart rate, standing postural sway and gait stride interval dynamics assessed at 3 and 6 months. Multiscale entropy and detrended fluctuation analysis are used as entropy- and fractal-based measures of complexity, respectively. Secondary outcomes include measures of physical and psychological function and tests of physiological adaptability also assessed at 3 and 6 months. Results of this study may lead to novel biomarkers that help us monitor and understand the physiological processes of aging and explore the potential benefits of Tai Chi and related mind-body exercises for healthy aging. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Time domain structures in a colliding magnetic flux rope experiment

    NASA Astrophysics Data System (ADS)

    Tang, Shawn Wenjie; Gekelman, Walter; Dehaas, Timothy; Vincena, Steve; Pribyl, Patrick

    2017-10-01

    Electron phase-space holes, regions of positive potential on the scale of the Debye length, have been observed in auroras as well as in laboratory experiments. These potential structures, also known as Time Domain Structures (TDS), are packets of intense electric field spikes that have significant components parallel to the local magnetic field. In an ongoing investigation at UCLA, TDS were observed on the surface of two magnetized flux ropes produced within the Large Plasma Device (LAPD). A barium oxide (BaO) cathode was used to produce an 18 m long magnetized plasma column and a lanthanum hexaboride (LaB6) source was used to create 11 m long kink unstable flux ropes. Using two probes capable of measuring the local electric and magnetic fields, correlation analysis was performed on tens of thousands of these structures and their propagation velocities, probability distribution function and spatial distribution were determined. The TDS became abundant as the flux ropes collided and appear to emanate from the reconnection region in between them. In addition, a preliminary analysis of the permutation entropy and statistical complexity of the data suggests that the TDS signals may be chaotic in nature. Work done at the Basic Plasma Science Facility (BaPSF) at UCLA which is supported by DOE and NSF.

  18. Analysis of EEG signals regularity in adults during video game play in 2D and 3D.

    PubMed

    Khairuddin, Hamizah R; Malik, Aamir S; Mumtaz, Wajid; Kamel, Nidal; Xia, Likun

    2013-01-01

    Video games have long been part of the entertainment industry. Nonetheless, it is not well known how video games can affect us with the advancement of 3D technology. The purpose of this study is to investigate the EEG signals regularity when playing video games in 2D and 3D modes. A total of 29 healthy subjects (24 male, 5 female) with mean age of 21.79 (1.63) years participated. Subjects were asked to play a car racing video game in three different modes (2D, 3D passive and 3D active). In 3D passive mode, subjects needed to wear a passive polarized glasses (cinema type) while for 3D active, an active shutter glasses was used. Scalp EEG data was recorded during game play using 19-channel EEG machine and linked ear was used as reference. After data were pre-processed, the signal irregularity for all conditions was computed. Two parameters were used to measure signal complexity for time series data: i) Hjorth-Complexity and ii) Composite Permutation Entropy Index (CPEI). Based on these two parameters, our results showed that the complexity level increased from eyes closed to eyes open condition; and further increased in the case of 3D as compared to 2D game play.
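
    The two regularity measures named above can be sketched from their definitions: Hjorth complexity from the variances of the signal and its first two differences, and permutation entropy from ordinal-pattern frequencies. The composite index (CPEI) aggregates permutation entropies differently, so the code below only shows the underlying building blocks on a surrogate signal.

```python
import numpy as np
from math import factorial
from itertools import permutations

def hjorth_parameters(x):
    """Hjorth activity, mobility and complexity of a 1-D signal."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def permutation_entropy(x, dim=3, tau=1, normalize=True):
    """Plain permutation entropy of order `dim` (Bandt-Pompe)."""
    patterns = {p: 0 for p in permutations(range(dim))}
    for i in range(len(x) - (dim - 1) * tau):
        patterns[tuple(np.argsort(x[i:i + dim * tau:tau]))] += 1
    counts = np.array([c for c in patterns.values() if c > 0], float)
    p = counts / counts.sum()
    h = -np.sum(p * np.log2(p))
    return h / np.log2(factorial(dim)) if normalize else h

rng = np.random.default_rng(8)
eeg_like = np.cumsum(rng.normal(size=2000))   # random-walk surrogate signal
print("Hjorth (activity, mobility, complexity):",
      tuple(round(v, 3) for v in hjorth_parameters(eeg_like)))
print("normalized PE:", round(permutation_entropy(eeg_like), 3))
```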

  19. Optimized mixed Markov models for motif identification

    PubMed Central

    Huang, Weichun; Umbach, David M; Ohler, Uwe; Li, Leping

    2006-01-01

    Background Identifying functional elements, such as transcriptional factor binding sites, is a fundamental step in reconstructing gene regulatory networks and remains a challenging issue, largely due to limited availability of training samples. Results We introduce a novel and flexible model, the Optimized Mixture Markov model (OMiMa), and related methods to allow adjustment of model complexity for different motifs. In comparison with other leading methods, OMiMa can incorporate more than the NNSplice's pairwise dependencies; OMiMa avoids model over-fitting better than the Permuted Variable Length Markov Model (PVLMM); and OMiMa requires smaller training samples than the Maximum Entropy Model (MEM). Testing on both simulated and actual data (regulatory cis-elements and splice sites), we found OMiMa's performance superior to the other leading methods in terms of prediction accuracy, required size of training data or computational time. Our OMiMa system, to our knowledge, is the only motif finding tool that incorporates automatic selection of the best model. OMiMa is freely available at [1]. Conclusion Our optimized mixture of Markov models represents an alternative to the existing methods for modeling dependent structures within a biological motif. Our model is conceptually simple and effective, and can improve prediction accuracy and/or computational speed over other leading methods. PMID:16749929

  20. Quantum image encryption based on restricted geometric and color transformations

    NASA Astrophysics Data System (ADS)

    Song, Xian-Hua; Wang, Shen; Abd El-Latif, Ahmed A.; Niu, Xia-Mu

    2014-08-01

    A novel encryption scheme for quantum images based on restricted geometric and color transformations is proposed. The new strategy comprises efficient permutation and diffusion properties for quantum image encryption. The core idea of the permutation stage is to scramble the codes of the pixel positions through restricted geometric transformations. Then, a new quantum diffusion operation is implemented on the permuted quantum image based on restricted color transformations. The encryption keys of the two stages are generated by two sensitive chaotic maps, which can ensure the security of the scheme. The final step, measurement, is based on a probabilistic model. Statistical analyses of the experimental results demonstrate significant improvements in favor of the proposed approach.

  1. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
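
    A permutation test of model performance along the lines described above can be sketched as follows: compare the cross-validated AUC obtained on the real outcome labels with the distribution of AUCs obtained after randomly permuting those labels. The snippet is a generic illustration on synthetic data using scikit-learn's L1-penalized (LASSO-like) logistic regression, not the authors' NTCP modeling code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 20))            # dosimetric/clinical predictors (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=120) > 0).astype(int)

lasso_logit = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)

def cv_auc(X, y):
    """Mean cross-validated AUC of the penalized model."""
    return cross_val_score(lasso_logit, X, y, cv=5, scoring="roc_auc").mean()

observed = cv_auc(X, y)

# Null distribution: performance when the outcome labels carry no signal
n_perm = 200
null_auc = np.array([cv_auc(X, rng.permutation(y)) for _ in range(n_perm)])
p_value = (np.sum(null_auc >= observed) + 1) / (n_perm + 1)
print(f"AUC = {observed:.3f}, permutation p = {p_value:.3f}")
```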

  2. NASA thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Access Vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries, and pseudo-multiword terms that are permutations of words that contain words within words. The Access Vocabulary contains 40,738 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.

  3. NASA Thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Access Vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries, and pseudo-multiword terms that are permutations of words that contain words within words. The Access Vocabulary contains 40,661 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.

  4. Instability of Hierarchical Cluster Analysis Due to Input Order of the Data: The PermuCLUSTER Solution

    ERIC Educational Resources Information Center

    van der Kloot, Willem A.; Spaans, Alexander M. J.; Heiser, Willem J.

    2005-01-01

    Hierarchical agglomerative cluster analysis (HACA) may yield different solutions under permutations of the input order of the data. This instability is caused by ties, either in the initial proximity matrix or arising during agglomeration. The authors recommend repeating the analysis on a large number of random permutations of the rows and columns…

  5. Optimal control of hybrid qubits: Implementing the quantum permutation algorithm

    NASA Astrophysics Data System (ADS)

    Rivera-Ruiz, C. M.; de Lima, E. F.; Fanchini, F. F.; Lopez-Richard, V.; Castelano, L. K.

    2018-03-01

    Optimal quantum control theory is employed to determine electric pulses capable of producing quantum gates with a fidelity higher than 0.9997, when noise is not taken into account. In particular, these quantum gates were chosen to perform the permutation algorithm in hybrid qubits in double quantum dots (DQDs). The permutation algorithm is an oracle-based quantum algorithm that solves the problem of the permutation parity faster than a classical algorithm without the necessity of entanglement between particles. The only requirement for achieving the speedup is the use of a one-particle quantum system with at least three levels. The high fidelity found in our results is closely related to the quantum speed limit, which is a measure of how fast a quantum state can be manipulated. Furthermore, we model charge noise by considering an average over the optimal field centered at different values of the reference detuning, which follows a Gaussian distribution. When the Gaussian spread is of the order of 5 μeV (10% of the correct value), the fidelity is still higher than 0.95. Our scheme can also be used for the practical realization of different quantum algorithms in DQDs.

  6. PsiQuaSP-A library for efficient computation of symmetric open quantum systems.

    PubMed

    Gegg, Michael; Richter, Marten

    2017-11-24

    In a recent publication we showed that permutation symmetry reduces the numerical complexity of Lindblad quantum master equations for identical multi-level systems from exponential to polynomial scaling. This is important for open system dynamics including realistic system-bath interactions and dephasing in, for instance, the Dicke model, multi-Λ system setups, etc. Here we present an object-oriented C++ library that allows one to set up and solve arbitrary quantum optical Lindblad master equations, especially those that are permutationally symmetric in the multi-level systems. PsiQuaSP (Permutation symmetry for identical Quantum Systems Package) uses the PETSc package for sparse linear algebra methods and differential equations as its basis. The aim of PsiQuaSP is to provide flexible, storage-efficient and scalable code while being as user-friendly as possible. It is easily applied to many quantum optical or quantum information systems with more than one multi-level system. We first review the basics of the permutation symmetry for multi-level systems in quantum master equations. The application of PsiQuaSP to quantum dynamical problems is illustrated with several typical, simple examples of open quantum optical systems.

  7. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Thomas; Efendiev, Yalchin; Tchelepi, Hamdi

    2016-05-24

    Our work in this project is aimed at making fundamental advances in multiscale methods for flow and transport in highly heterogeneous porous media. The main thrust of this research is to develop a systematic multiscale analysis and efficient coarse-scale models that can capture global effects and extend existing multiscale approaches to problems with additional physics and uncertainties. A key emphasis is on problems without an apparent scale separation. Multiscale solution methods are currently under active investigation for the simulation of subsurface flow in heterogeneous formations. These procedures capture the effects of fine-scale permeability variations through the calculation of specialized coarse-scale basis functions. Most of the multiscale techniques presented to date employ localization approximations in the calculation of these basis functions. For some highly correlated (e.g., channelized) formations, however, global effects are important and these may need to be incorporated into the multiscale basis functions. Other challenging issues facing multiscale simulations are the extension of existing multiscale techniques to problems with additional physics, such as compressibility, capillary effects, etc. In our project, we explore the improvement of multiscale methods through the incorporation of additional (single-phase flow) information and the development of a general multiscale framework for flows in the presence of uncertainties, compressible flow and heterogeneous transport, and geomechanics. We have considered (1) adaptive local-global multiscale methods, (2) multiscale methods for the transport equation, (3) operator-based multiscale methods and solvers, (4) multiscale methods in the presence of uncertainties and applications, (5) multiscale finite element methods for high contrast porous media and their generalizations, and (6) multiscale methods for geomechanics.

  8. Multiscale analysis and computation for flows in heterogeneous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Efendiev, Yalchin; Hou, T. Y.; Durlofsky, L. J.

    Our work in this project is aimed at making fundamental advances in multiscale methods for flow and transport in highly heterogeneous porous media. The main thrust of this research is to develop a systematic multiscale analysis and efficient coarse-scale models that can capture global effects and extend existing multiscale approaches to problems with additional physics and uncertainties. A key emphasis is on problems without an apparent scale separation. Multiscale solution methods are currently under active investigation for the simulation of subsurface flow in heterogeneous formations. These procedures capture the effects of fine-scale permeability variations through the calculation of specialized coarse-scale basis functions. Most of the multiscale techniques presented to date employ localization approximations in the calculation of these basis functions. For some highly correlated (e.g., channelized) formations, however, global effects are important and these may need to be incorporated into the multiscale basis functions. Other challenging issues facing multiscale simulations are the extension of existing multiscale techniques to problems with additional physics, such as compressibility, capillary effects, etc. In our project, we explore the improvement of multiscale methods through the incorporation of additional (single-phase flow) information and the development of a general multiscale framework for flows in the presence of uncertainties, compressible flow and heterogeneous transport, and geomechanics. We have considered (1) adaptive local-global multiscale methods, (2) multiscale methods for the transport equation, (3) operator-based multiscale methods and solvers, (4) multiscale methods in the presence of uncertainties and applications, (5) multiscale finite element methods for high contrast porous media and their generalizations, and (6) multiscale methods for geomechanics. Below, we present a brief overview of each of these contributions.

  9. Supervised classification of brain tissues through local multi-scale texture analysis by coupling DIR and FLAIR MR sequences

    NASA Astrophysics Data System (ADS)

    Poletti, Enea; Veronese, Elisa; Calabrese, Massimiliano; Bertoldo, Alessandra; Grisan, Enrico

    2012-02-01

    The automatic segmentation of brain tissues in magnetic resonance (MR) is usually performed on T1-weighted images, due to their high spatial resolution. The T1w sequence, however, has some major downsides when brain lesions are present: the altered appearance of diseased tissues causes errors in tissue classification. In order to overcome these drawbacks, we employed two different MR sequences: fluid attenuated inversion recovery (FLAIR) and double inversion recovery (DIR). The former highlights both gray matter (GM) and white matter (WM), the latter highlights GM alone. We propose here a supervised classification scheme that does not require any anatomical a priori information to identify the 3 classes, "GM", "WM", and "background". Features are extracted by means of a local multi-scale texture analysis, computed for each pixel of the DIR and FLAIR sequences. The 9 textures considered are average, standard deviation, kurtosis, entropy, contrast, correlation, energy, homogeneity, and skewness, evaluated on a neighborhood of 3x3, 5x5, and 7x7 pixels. Hence, the total number of features associated with a pixel is 56 (9 textures × 3 scales × 2 sequences + 2 original pixel values). The classifier employed is a Support Vector Machine with a Radial Basis Function kernel. From each of the 4 brain volumes evaluated, a DIR and a FLAIR slice have been selected and manually segmented by 2 expert neurologists, providing 1st and 2nd human reference observations which agree with an average accuracy of 99.03%. SVM performances have been assessed with a 4-fold cross-validation, yielding an average classification accuracy of 98.79%.
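
    A generic sketch of this kind of pixel-wise, multi-scale texture classification is given below. It is not the authors' implementation: for brevity it computes only per-pixel mean and standard deviation over 3x3, 5x5 and 7x7 neighbourhoods of two synthetic images standing in for co-registered DIR and FLAIR slices, and feeds them to an RBF-kernel SVM with dummy labels.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.svm import SVC

def window_stats(img, size):
    """Per-pixel mean and standard deviation over a size x size neighbourhood."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img ** 2, size)
    std = np.sqrt(np.clip(sq_mean - mean ** 2, 0, None))
    return mean, std

def pixel_features(dir_img, flair_img, scales=(3, 5, 7)):
    """Stack per-pixel texture features from two co-registered MR sequences."""
    feats = [dir_img, flair_img]                     # original intensities
    for img in (dir_img, flair_img):
        for s in scales:
            feats.extend(window_stats(img, s))
    return np.stack(feats, axis=-1).reshape(-1, len(feats))

# Synthetic stand-ins for co-registered DIR and FLAIR slices
rng = np.random.default_rng(1)
dir_img = rng.random((64, 64))
flair_img = rng.random((64, 64))
X = pixel_features(dir_img, flair_img)
y = rng.integers(0, 3, size=X.shape[0])              # GM / WM / background labels (dummy)

clf = SVC(kernel="rbf", gamma="scale").fit(X[::16], y[::16])   # subsample for speed
print(clf.predict(X[:5]))
```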

  10. Classification of Urban Feature from Unmanned Aerial Vehicle Images Using Gasvm Integration and Multi-Scale Segmentation

    NASA Astrophysics Data System (ADS)

    Modiri, M.; Salehabadi, A.; Mohebbi, M.; Hashemi, A. M.; Masumi, M.

    2015-12-01

    The use of UAVs in photogrammetry to obtain coverage images and achieve the main objectives of photogrammetric mapping has seen a boom. Images of the REGGIOLO region in the province of Reggio Emilia, Italy, taken by a UAV carrying a non-metric Canon Ixus camera at an average flying height of 139.42 meters, were used to classify urban features. Using the SURE software and the coverage images of the study area, a dense point cloud, a DSM and an orthophoto with a spatial resolution of 10 cm were produced. A DTM of the area was generated using an adaptive TIN filtering algorithm. An nDSM was derived as the difference between the DSM and the DTM and added as a separate feature to the image stack. For feature extraction, co-occurrence matrix measures (mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment and correlation) were computed for each RGB band of the orthophoto. The classes used for the urban classification were buildings, trees and tall vegetation, grass and short vegetation, paved roads, and impervious surfaces; the impervious-surface class includes pavement, cement, cars and roofs. Pixel-based classification and selection of the optimal classification features were carried out with a GASVM approach. To achieve higher classification accuracy, spectral, textural and shape information from the orthophoto was combined, and a multi-scale segmentation method was applied. The results suggest the suitability of the proposed method for classifying urban features from UAV images. The overall accuracy and kappa coefficient of the proposed method were 93.47% and 91.84%, respectively.

  11. Symmetry breaking, mixing, instability, and low-frequency variability in a minimal Lorenz-like system.

    PubMed

    Lucarini, Valerio; Fraedrich, Klaus

    2009-08-01

    Starting from the classical Saltzman two-dimensional convection equations, we derive via a severe spectral truncation a minimal 10-ODE system which includes the thermal effect of viscous dissipation. Neglecting this process leads to a dynamical system which includes a decoupled generalized Lorenz system. The consideration of this process breaks an important symmetry and couples the dynamics of fast and slow variables, with the ensuing modifications to the structural properties of the attractor and of the spectral features. When the relevant nondimensional number (Eckert number Ec) is different from zero, an additional time scale of O(Ec^(-1)) is introduced in the system, as shown with standard multiscale analysis and made clear by several pieces of numerical evidence. Moreover, the system is ergodic and hyperbolic, the slow variables feature long-term memory with 1/f^(3/2) power spectra, and the fast variables feature amplitude modulation. Increasing the strength of the thermal-viscous feedback has a stabilizing effect, as both the metric entropy and the Kaplan-Yorke attractor dimension decrease monotonically with Ec. The analyzed system features very rich dynamics: it overcomes some of the limitations of the Lorenz system and might have prototypical value in relevant processes in complex systems dynamics, such as the interaction between slow and fast variables, the presence of long-term memory, and the associated extreme value statistics. This analysis shows how neglecting the coupling of slow and fast variables only on the basis of scale analysis can be catastrophic. In fact, this leads to spurious invariances that affect essential dynamical properties (ergodicity, hyperbolicity) and that cause the model to lose its ability to describe intrinsically multiscale processes.

  12. MULTI-SCALE MODELING AND APPROXIMATION ASSISTED OPTIMIZATION OF BARE TUBE HEAT EXCHANGERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacellar, Daniel; Ling, Jiazhen; Aute, Vikrant

    2014-01-01

    Air-to-refrigerant heat exchangers are very common in air-conditioning, heat pump and refrigeration applications. In these heat exchangers, there is a great benefit in terms of size, weight, refrigerant charge and heat transfer coefficient, by moving from conventional channel sizes (~9 mm) to smaller channel sizes (<5 mm). This work investigates new designs for air-to-refrigerant heat exchangers with tube outer diameters ranging from 0.5 to 2.0 mm. The goal of this research is to develop and optimize the design of these heat exchangers and compare their performance with existing state of the art designs. The air-side performance of various tube bundle configurations is analyzed using a Parallel Parameterized CFD (PPCFD) technique. PPCFD allows for fast-parametric CFD analyses of various geometries with topology change. Approximation techniques drastically reduce the number of CFD evaluations required during optimization. The Maximum Entropy Design method is used for sampling and the Kriging method for metamodeling. Metamodels are developed for the air-side heat transfer coefficients and pressure drop as a function of tube-bundle dimensions and air velocity. The metamodels are then integrated with an air-to-refrigerant heat exchanger design code. This integration allows a multi-scale analysis of heat exchanger air-side performance, including air-to-refrigerant heat transfer and phase change. Overall optimization is carried out using a multi-objective genetic algorithm. The optimal designs found can exhibit 50 percent size reduction, 75 percent decrease in air side pressure drop and doubled air heat transfer coefficients compared to a high performance compact micro channel heat exchanger with the same capacity and flow rates.

  13. Controllability of symmetric spin networks

    NASA Astrophysics Data System (ADS)

    Albertini, Francesca; D'Alessandro, Domenico

    2018-05-01

    We consider a network of n spin 1/2 systems which are pairwise interacting via Ising interaction and are controlled by the same electro-magnetic control field. Such a system presents symmetries since the Hamiltonian is unchanged if we permute two spins. This prevents full (operator) controllability, in that not every unitary evolution can be obtained. We prove however that controllability is verified if we restrict ourselves to unitary evolutions which preserve the above permutation invariance. For low dimensional cases, n = 2 and n = 3, we provide an analysis of the Lie group of available evolutions and give explicit control laws to transfer between two arbitrary permutation invariant states. This class of states includes highly entangled states such as Greenberger-Horne-Zeilinger (GHZ) states and W states, which are of interest in quantum information.

  14. Storage and computationally efficient permutations of factorized covariance and square-root information matrices

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector-stored upper-triangular diagonal factorized covariance (UD) and vector stored upper-triangular square-root information filter (SRIF) arrays is presented. The method involves cyclical permutation of the rows and columns of the arrays and retriangularization with appropriate square-root-free fast Givens rotations or elementary slow Givens reflections. A minimal amount of computation is performed and only one scratch vector of size N is required, where N is the column dimension of the arrays. To make the method efficient for large SRIF arrays on a virtual memory machine, three additional scratch vectors each of size N are used to avoid expensive paging faults. The method discussed is compared with the methods and routines of Bierman's Estimation Subroutine Library (ESL).

  15. NASA thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The access vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries and pseudo-multiword terms that are permutations of words that contain words within words. The access vocabulary contains almost 42,000 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.

  16. Genomic Analysis of Complex Microbial Communities in Wounds

    DTIC Science & Technology

    2012-01-01

    thoroughly in the ecology literature. Permutation Multivariate Analysis of Variance (PerMANOVA). We used PerMANOVA to test the null hypothesis of no...difference between the bacterial communities found within a single wound compared to those from different patients (α = 0.05). PerMANOVA is a...permutation-based version of the multivariate analysis of variance (MANOVA). PerMANOVA uses the distances between samples to partition variance and

  17. Circular permutation of the starch-binding domain: inversion of ligand selectivity with increased affinity.

    PubMed

    Stephen, Preyesh; Tseng, Kai-Li; Liu, Yu-Nan; Lyu, Ping-Chiang

    2012-03-07

    Proteins containing starch-binding domains (SBDs) are used in a variety of scientific and technological applications. A circularly permutated SBD (CP90) with improved affinity and selectivity toward longer-chain carbohydrates was synthesized, suggesting that a new starch-binding protein may be developed for specific scientific and industrial applications. This journal is © The Royal Society of Chemistry 2012

  18. Sampling solution traces for the problem of sorting permutations by signed reversals

    PubMed Central

    2012-01-01

    Background Traditional algorithms to solve the problem of sorting by signed reversals output just one optimal solution while the space of all optimal solutions can be huge. A so-called trace represents a group of solutions which share the same set of reversals that must be applied to sort the original permutation following a partial ordering. By using traces, we therefore can represent the set of optimal solutions in a more compact way. Algorithms for enumerating the complete set of traces of solutions were developed. However, due to their exponential complexity, their practical use is limited to small permutations. A partial enumeration of traces is a sampling of the complete set of traces and can be an alternative for the study of distinct evolutionary scenarios of big permutations. Ideally, the sampling should be done uniformly from the space of all optimal solutions. This is however conjectured to be ♯P-complete. Results We propose and evaluate three algorithms for producing a sampling of the complete set of traces that instead can be shown in practice to preserve some of the characteristics of the space of all solutions. The first algorithm (RA) performs the construction of traces through a random selection of reversals on the list of optimal 1-sequences. The second algorithm (DFALT) consists in a slight modification of an algorithm that performs the complete enumeration of traces. Finally, the third algorithm (SWA) is based on a sliding window strategy to improve the enumeration of traces. All proposed algorithms were able to enumerate traces for permutations with up to 200 elements. Conclusions We analysed the distribution of the enumerated traces with respect to their height and average reversal length. Various works indicate that the reversal length can be an important aspect in genome rearrangements. The algorithms RA and SWA show a tendency to lose traces with high average reversal length. Such traces are however rare, and qualitatively our results show that, for testable-sized permutations, the algorithms DFALT and SWA produce distributions which approximate the reversal length distributions observed with a complete enumeration of the set of traces. PMID:22704580

  19. Permutation glass.

    PubMed

    Williams, Mobolaji

    2018-01-01

    The field of disordered systems in statistical physics provides many simple models in which the competing influences of thermal and nonthermal disorder lead to new phases and nontrivial thermal behavior of order parameters. In this paper, we add a model to the subject by considering a disordered system where the state space consists of various orderings of a list. As in spin glasses, the disorder of such "permutation glasses" arises from a parameter in the Hamiltonian being drawn from a distribution of possible values, thus allowing nominally "incorrect orderings" to have lower energies than "correct orderings" in the space of permutations. We analyze a Gaussian, uniform, and symmetric Bernoulli distribution of energy costs, and, by employing Jensen's inequality, derive a simple condition requiring the permutation glass to always transition to the correctly ordered state at a temperature lower than that of the nondisordered system, provided that this correctly ordered state is accessible. We in turn find that in order for the correctly ordered state to be accessible, the probability that an incorrectly ordered component is energetically favored must be less than the inverse of the number of components in the system. We show that all of these results are consistent with a replica symmetric ansatz of the system. We conclude by arguing that there is no distinct permutation glass phase for the simplest model considered here and by discussing how to extend the analysis to more complex Hamiltonians capable of novel phase behavior and replica symmetry breaking. Finally, we outline an apparent correspondence between the presented system and a discrete-energy-level fermion gas. In all, the investigation introduces a class of exactly soluble models into statistical mechanics and provides a fertile ground to investigate statistical models of disorder.

  20. Efficient computation of significance levels for multiple associations in large studies of correlated data, including genomewide association studies.

    PubMed

    Dudbridge, Frank; Koeleman, Bobby P C

    2004-09-01

    Large exploratory studies, including candidate-gene-association testing, genomewide linkage-disequilibrium scans, and array-expression experiments, are becoming increasingly common. A serious problem for such studies is that statistical power is compromised by the need to control the false-positive rate for a large family of tests. Because multiple true associations are anticipated, methods have been proposed that combine evidence from the most significant tests, as a more powerful alternative to individually adjusted tests. The practical application of these methods is currently limited by a reliance on permutation testing to account for the correlated nature of single-nucleotide polymorphism (SNP)-association data. On a genomewide scale, this is both very time-consuming and impractical for repeated explorations with standard marker panels. Here, we alleviate these problems by fitting analytic distributions to the empirical distribution of combined evidence. We fit extreme-value distributions for fixed lengths of combined evidence and a beta distribution for the most significant length. An initial phase of permutation sampling is required to fit these distributions, but it can be completed more quickly than a simple permutation test and need be done only once for each panel of tests, after which the fitted parameters give a reusable calibration of the panel. Our approach is also a more efficient alternative to a standard permutation test. We demonstrate the accuracy of our approach and compare its efficiency with that of permutation tests on genomewide SNP data released by the International HapMap Consortium. The estimation of analytic distributions for combined evidence will allow these powerful methods to be applied more widely in large exploratory studies.
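
    The general idea sketched above, running a modest calibration phase of permutations, fitting an analytic distribution to the resulting null statistics, and then reusing the fitted distribution, can be illustrated as follows. This is a simplified example on synthetic data, not the authors' procedure: it fits only a beta distribution to the minimum p-value across a correlated panel, whereas the paper also fits extreme-value distributions for fixed lengths of combined evidence.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic panel: 500 correlated "SNP" predictors, binary phenotype
n, m = 300, 500
base = rng.normal(size=(n, 5))
X = base @ rng.normal(size=(5, m)) + rng.normal(size=(n, m))
y = rng.integers(0, 2, size=n)

def min_pvalue(X, y):
    """Smallest single-test p-value across the panel (two-sample t-tests)."""
    t, p = stats.ttest_ind(X[y == 1], X[y == 0], axis=0)
    return p.min()

observed = min_pvalue(X, y)

# Calibration phase: a modest number of permutations of the phenotype
null_min = np.array([min_pvalue(X, rng.permutation(y)) for _ in range(200)])

# Fit a beta distribution to the null minima; the fit can be reused later
a, b, loc, scale = stats.beta.fit(null_min, floc=0, fscale=1)
p_adjusted = stats.beta.cdf(observed, a, b)
print(f"min p = {observed:.4g}, panel-adjusted p = {p_adjusted:.3f}")
```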

  1. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    PubMed

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its application to high-throughput time series data analysis, e.g., data from studies based on next-generation sequencing technology. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.

  2. How to think about indiscernible particles

    NASA Astrophysics Data System (ADS)

    Giglio, Daniel Joseph

    Permutation symmetries which arise in quantum mechanics pose an intriguing problem. It is not clear that particles which exhibit permutation symmetries (i.e. particles which are indiscernible, meaning that they can be swapped with each other without this yielding a new physical state) qualify as "objects" in any reasonable sense of the term. One solution to this puzzle, which I attribute to W.V. Quine, would have us eliminate such particles from our ontology altogether in order to circumvent the metaphysical vexations caused by permutation symmetries. In this essay I argue that Quine's solution is too rash, and in its place I suggest a novel solution based on altering some of the language of quantum mechanics. Before launching into the technical details of indiscernible particles, however, I begin this essay with some remarks on the methodology -- instrumentalism -- which motivates my arguments.

  3. An analytical coarse-graining method which preserves the free energy, structural correlations, and thermodynamic state of polymer melts from the atomistic to the mesoscale.

    PubMed

    McCarty, J; Clark, A J; Copperman, J; Guenza, M G

    2014-05-28

    Structural and thermodynamic consistency of coarse-graining models across multiple length scales is essential for the predictive role of multi-scale modeling and molecular dynamic simulations that use mesoscale descriptions. Our approach is a coarse-grained model based on integral equation theory, which can represent polymer chains at variable levels of chemical details. The model is analytical and depends on molecular and thermodynamic parameters of the system under study, as well as on the direct correlation function in the k → 0 limit, c0. A numerical solution to the PRISM integral equations is used to determine c0, by adjusting the value of the effective hard sphere diameter, dHS, to agree with the predicted equation of state. This single quantity parameterizes the coarse-grained potential, which is used to perform mesoscale simulations that are directly compared with atomistic-level simulations of the same system. We test our coarse-graining formalism by comparing structural correlations, isothermal compressibility, equation of state, Helmholtz and Gibbs free energies, and potential energy and entropy using both united atom and coarse-grained descriptions. We find quantitative agreement between the analytical formalism for the thermodynamic properties, and the results of Molecular Dynamics simulations, independent of the chosen level of representation. In the mesoscale description, the potential energy of the soft-particle interaction becomes a free energy in the coarse-grained coordinates which preserves the excess free energy from an ideal gas across all levels of description. The structural consistency between the united-atom and mesoscale descriptions means the relative entropy between descriptions has been minimized without any variational optimization parameters. The approach is general and applicable to any polymeric system in different thermodynamic conditions.

  4. Impacts of large dams on the complexity of suspended sediment dynamics in the Yangtze River

    NASA Astrophysics Data System (ADS)

    Wang, Yuankun; Rhoads, Bruce L.; Wang, Dong; Wu, Jichun; Zhang, Xiao

    2018-03-01

    The Yangtze River is one of the largest and most important rivers in the world. Over the past several decades, the natural sediment regime of the Yangtze River has been altered by the construction of dams. This paper uses multi-scale entropy analysis to ascertain the impacts of large dams on the complexity of high-frequency suspended sediment dynamics in the Yangtze River system, especially after impoundment of the Three Gorges Dam (TGD). In this study, the complexity of sediment dynamics is quantified by framing it within the context of entropy analysis of time series. Data on daily sediment loads for four stations located in the mainstem are analyzed for the past 60 years. The results indicate that dam construction has reduced the complexity of short-term (1-30 days) variation in sediment dynamics near the structures, but that complexity has actually increased farther downstream. This spatial pattern seems to reflect a filtering effect of the dams on the temporal pattern of sediment loads as well as decreased longitudinal connectivity of sediment transfer through the river system, resulting in downstream enhancement of the influence of local sediment inputs by tributaries on sediment dynamics. The TGD has had a substantial impact on the complexity of sediment series in the mainstem of the Yangtze River, especially after it became fully operational. This enhanced impact is attributed to the high trapping efficiency of this dam and its associated large reservoir. The sediment dynamics "signal" becomes more spatially variable after dam construction. This study demonstrates the spatial influence of dams on the high-frequency temporal complexity of sediment regimes and provides valuable information that can be used to guide environmental conservation of the Yangtze River.
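
    The kind of multi-scale entropy analysis referred to above can be illustrated with a minimal sketch: the series is coarse-grained by non-overlapping averaging at each scale factor and the sample entropy of each coarse-grained series is computed. This is a generic illustration on a synthetic autoregressive series, not the study's code or data.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)
    def similar_pairs(length):
        # use N - m templates for both lengths (Richman & Moorman convention)
        templates = np.array([x[i:i + length] for i in range(n - m)])
        dist = np.max(np.abs(templates[:, None] - templates[None, :]), axis=-1)
        return (dist <= r).sum() - len(templates)   # exclude self-matches
    b, a = similar_pairs(m), similar_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=10):
    """Sample entropy of non-overlapping coarse-grained versions of x."""
    values = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = np.asarray(x[:n * tau], dtype=float).reshape(n, tau).mean(axis=1)
        values.append(sample_entropy(coarse))
    return np.array(values)

# Example: an AR(1) series standing in for a daily sediment-load record
rng = np.random.default_rng(0)
series = np.zeros(1000)
for i in range(1, series.size):
    series[i] = 0.8 * series[i - 1] + rng.normal()
print(multiscale_entropy(series, max_scale=10))
```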

  5. Multiscale Modeling: A Review

    NASA Astrophysics Data System (ADS)

    Horstemeyer, M. F.

    This review of multiscale modeling covers a brief history of various multiscale methodologies related to solid materials and the associated experimental influences, the various influences of multiscale modeling on different disciplines, and some examples of multiscale modeling in the design of structural components. Although computational multiscale modeling methodologies were developed in the late twentieth century, the fundamental notions of multiscale modeling have been around since da Vinci studied different sizes of ropes. The recent rapid growth in multiscale modeling is the result of the confluence of parallel computing power, experimental capabilities to characterize structure-property relations down to the atomic level, and theories that admit multiple length scales. The ubiquitous research focused on multiscale modeling has reached across different disciplines (solid mechanics, fluid mechanics, materials science, physics, mathematics, biology, and chemistry), different regions of the world (most continents), and different length scales (from atoms to autos).

  6. Fermion systems in discrete space-time

    NASA Astrophysics Data System (ADS)

    Finster, Felix

    2007-05-01

    Fermion systems in discrete space-time are introduced as a model for physics on the Planck scale. We set up a variational principle which describes a non-local interaction of all fermions. This variational principle is symmetric under permutations of the discrete space-time points. We explain how for minimizers of the variational principle, the fermions spontaneously break this permutation symmetry and induce on space-time a discrete causal structure.

  7. Dynamic Testing and Automatic Repair of Reconfigurable Wiring Harnesses

    DTIC Science & Technology

    2006-11-27

    Switch: An M × N grid of switches configured to provide an M-input, N-output routing network. Permutation Network: A permutation network performs an...wiring reduces the effective advantage of their reduced switch count, particularly when considering that regular grids (crossbar switches being a...are connected to. The outline circuit shown in Fig. 20 shows how a suitable ‘discovery probe’ might be implemented. The circuit shows a UART

  8. Tolerance of a Knotted Near-Infrared Fluorescent Protein to Random Circular Permutation.

    PubMed

    Pandey, Naresh; Kuypers, Brianna E; Nassif, Barbara; Thomas, Emily E; Alnahhas, Razan N; Segatori, Laura; Silberg, Jonathan J

    2016-07-12

    Bacteriophytochrome photoreceptors (BphP) are knotted proteins that have been developed as near-infrared fluorescent protein (iRFP) reporters of gene expression. To explore how rearrangements in the peptides that interlace into the knot within the BphP photosensory core affect folding, we subjected iRFPs to random circular permutation using an improved transposase mutagenesis strategy and screened for variants that fluoresce. We identified 27 circularly permuted iRFPs that display biliverdin-dependent fluorescence in Escherichia coli. The variants with the brightest whole cell fluorescence initiated translation at residues near the domain linker and knot tails, although fluorescent variants that initiated translation within the PAS and GAF domains were discovered. Circularly permuted iRFPs retained sufficient cofactor affinity to fluoresce in tissue culture without the addition of biliverdin, and one variant displayed enhanced fluorescence when expressed in bacteria and tissue culture. This variant displayed a quantum yield similar to that of iRFPs but exhibited increased resistance to chemical denaturation, suggesting that the observed increase in the magnitude of the signal arose from more efficient protein maturation. These results show how the contact order of a knotted BphP can be altered without disrupting chromophore binding and fluorescence, an important step toward the creation of near-infrared biosensors with expanded chemical sensing functions for in vivo imaging.

  9. Tolerance of a knotted near infrared fluorescent protein to random circular permutation

    PubMed Central

    Pandey, Naresh; Kuypers, Brianna E.; Nassif, Barbara; Thomas, Emily E.; Alnahhas, Razan N.; Segatori, Laura; Silberg, Jonathan J.

    2016-01-01

    Bacteriophytochrome photoreceptors (BphP) are knotted proteins that have been developed as near-infrared fluorescent protein (iRFP) reporters of gene expression. To explore how rearrangements in the peptides that interlace into the knot within the BphP photosensory core affect folding, we subjected iRFP to random circular permutation using an improved transposase mutagenesis strategy and screened for variants that fluoresce. We identified twenty-seven circularly permuted iRFP that display biliverdin-dependent fluorescence in Escherichia coli. The variants with the brightest whole cell fluorescence initiated translation at residues near the domain linker and knot tails, although fluorescent variants were discovered that initiated translation within the PAS and GAF domains. Circularly permuted iRFP retained sufficient cofactor affinity to fluoresce in tissue culture without the addition of biliverdin, and one variant displayed enhanced fluorescence when expressed in bacteria and tissue culture. This variant displayed a quantum yield similar to that of iRFP, but exhibited increased resistance to chemical denaturation, suggesting that the observed signal increase arose from more efficient protein maturation. These results show how the contact order of a knotted BphP can be altered without disrupting chromophore binding and fluorescence, an important step towards the creation of near-infrared biosensors with expanded chemical-sensing functions for in vivo imaging. PMID:27304983

  10. A comparative analysis of alternative approaches for quantifying nonlinear dynamics in cardiovascular system.

    PubMed

    Chen, Yun; Yang, Hui

    2013-01-01

    Heart rate variability (HRV) analysis has emerged as an important research topic to evaluate autonomic cardiac function. However, traditional time- and frequency-domain analyses characterize and quantify only linear and stationary phenomena. In the present investigation, we made a comparative analysis of three alternative approaches (i.e., wavelet multifractal analysis, Lyapunov exponents and multiscale entropy analysis) for quantifying nonlinear dynamics in heart rate time series. Note that these extracted nonlinear features provide information about nonlinear scaling behaviors and the complexity of cardiac systems. To evaluate the performance, we used 24-hour HRV recordings from 54 healthy subjects and 29 heart failure patients, available in PhysioNet. The three nonlinear methods are evaluated not only individually but also in combination using three classification algorithms, i.e., linear discriminant analysis, quadratic discriminant analysis and k-nearest neighbors. Experimental results show that the three nonlinear methods capture nonlinear dynamics from different perspectives and the combined feature set achieves the best performance, i.e., sensitivity 97.7% and specificity 91.5%. Collectively, nonlinear HRV features show promise for identifying disorders in autonomic cardiovascular function.
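
    The evaluation strategy described above (extract nonlinear features, then compare classifiers on healthy versus heart-failure groups) can be sketched generically. The feature values below are synthetic stand-ins for the wavelet-multifractal, Lyapunov and multiscale-entropy measures; this is not the authors' code or data.

```python
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(4)
# Synthetic feature table: [multifractal width, largest Lyapunov exp., MSE area]
X_healthy = rng.normal([0.8, 0.10, 12.0], [0.2, 0.03, 2.0], size=(54, 3))
X_failure = rng.normal([0.5, 0.05, 8.0], [0.2, 0.03, 2.0], size=(29, 3))
X = np.vstack([X_healthy, X_failure])
y = np.array([0] * 54 + [1] * 29)            # 0 = healthy, 1 = heart failure

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("QDA", QuadraticDiscriminantAnalysis()),
                  ("kNN", KNeighborsClassifier(n_neighbors=5))]:
    y_hat = cross_val_predict(clf, X, y, cv=5)
    tn, fp, fn, tp = confusion_matrix(y, y_hat).ravel()
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    print(f"{name}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```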

  11. H0LiCOW VIII. A weak-lensing measurement of the external convergence in the field of the lensed quasar HE 0435-1223

    NASA Astrophysics Data System (ADS)

    Tihhonova, O.; Courbin, F.; Harvey, D.; Hilbert, S.; Rusu, C. E.; Fassnacht, C. D.; Bonvin, V.; Marshall, P. J.; Meylan, G.; Sluse, D.; Suyu, S. H.; Treu, T.; Wong, K. C.

    2018-07-01

    We present a weak gravitational lensing measurement of the external convergence along the line of sight to the quadruply lensed quasar HE 0435-1223. Using deep r-band images from Subaru Suprime Cam, we observe galaxies down to a 3σ limiting magnitude of ˜26 mag resulting in a source galaxy density of 14 galaxies per square arcminute after redshift-based cuts. Using an inpainting technique and multiscale entropy filtering algorithm, we find that the region in close proximity to the lens has an estimated external convergence of κ =-0.012^{+0.020}_{-0.013} and is hence marginally underdense. We also rule out the presence of any halo with a mass greater than Mvir = 1.6 × 10^14 h^-1 M⊙ (68 per cent confidence limit). Our results, consistent with previous studies of this lens, confirm that the intervening mass along the line of sight to HE 0435-1223 does not affect significantly the cosmological results inferred from the time-delay measurements of that specific object.

  12. Statistical field estimators for multiscale simulations.

    PubMed

    Eapen, Jacob; Li, Ju; Yip, Sidney

    2005-11-01

    We present a systematic approach for generating smooth and accurate fields from particle simulation data using the notions of statistical inference. As an extension to a parametric representation based on the maximum likelihood technique previously developed for velocity and temperature fields, a nonparametric estimator based on the principle of maximum entropy is proposed for particle density and stress fields. Both estimators are applied to represent molecular dynamics data on shear-driven flow in an enclosure which exhibits a high degree of nonlinear characteristics. We show that the present density estimator is a significant improvement over ad hoc bin averaging and is also free of systematic boundary artifacts that appear in the method of smoothing kernel estimates. Similarly, the velocity fields generated by the maximum likelihood estimator do not show any edge effects that can be erroneously interpreted as slip at the wall. For low Reynolds numbers, the velocity fields and streamlines generated by the present estimator are benchmarked against Newtonian continuum calculations. For shear velocities that are a significant fraction of the thermal speed, we observe a form of shear localization that is induced by the confining boundary.

  13. Complex systems and the technology of variability analysis

    PubMed Central

    Seely, Andrew JE; Macklem, Peter T

    2004-01-01

    Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients. PMID:15566580
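
    One of the scale-invariant measures listed above, detrended fluctuation analysis, lends itself to a compact sketch: integrate the mean-subtracted series, detrend it in windows of increasing size, and read the scaling exponent off the log-log slope of the fluctuation function. This is a minimal generic implementation, not tied to any particular clinical dataset.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Detrended fluctuation analysis: slope of log F(n) versus log n."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated (profile) series
    if scales is None:
        scales = np.unique(np.logspace(2, np.log10(len(x) // 4), 15).astype(int))
    fluct = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        # least-squares detrend each segment and collect mean squared residuals
        res = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)
            res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(res)))
    alpha = np.polyfit(np.log(scales), np.log(fluct), 1)[0]
    return alpha

# White noise should give alpha ~ 0.5; its cumulative sum ~ 1.5
rng = np.random.default_rng(5)
noise = rng.normal(size=5000)
print(dfa_exponent(noise), dfa_exponent(np.cumsum(noise)))
```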

  14. Development of reactive force fields using ab initio molecular dynamics simulation minimally biased to experimental data

    NASA Astrophysics Data System (ADS)

    Chen, Chen; Arntsen, Christopher; Voth, Gregory A.

    2017-10-01

    Incorporation of quantum mechanical electronic structure data is necessary to properly capture the physics of many chemical processes. Proton hopping in water, which involves rearrangement of chemical and hydrogen bonds, is one such example of an inherently quantum mechanical process. Standard ab initio molecular dynamics (AIMD) methods, however, do not yet accurately predict the structure of water and are therefore less than optimal for developing force fields. We have instead utilized a recently developed method which minimally biases AIMD simulations to match limited experimental data to develop novel multiscale reactive molecular dynamics (MS-RMD) force fields by using relative entropy minimization. In this paper, we present two new MS-RMD models using such a parameterization: one which employs water with harmonic internal vibrations and another which uses anharmonic water. We show that the newly developed MS-RMD models very closely reproduce the solvation structure of the hydrated excess proton in the target AIMD data. We also find that the use of anharmonic water increases proton hopping, thereby increasing the proton diffusion constant.

  15. H0LiCOW VIII. A weak lensing measurement of the external convergence in the field of the lensed quasar HE 0435-1223

    NASA Astrophysics Data System (ADS)

    Tihhonova, O.; Courbin, F.; Harvey, D.; Hilbert, S.; Rusu, C. E.; Fassnacht, C. D.; Bonvin, V.; Marshall, P. J.; Meylan, G.; Sluse, D.; Suyu, S. H.; Treu, T.; Wong, K. C.

    2018-04-01

    We present a weak gravitational lensing measurement of the external convergence along the line of sight to the quadruply lensed quasar HE 0435-1223. Using deep r-band images from Subaru Suprime-Cam, we observe galaxies down to a 3σ limiting magnitude of ˜26 mag, resulting in a source galaxy density of 14 galaxies per square arcminute after redshift-based cuts. Using an inpainting technique and a Multi-Scale Entropy filtering algorithm, we find that the region in close proximity to the lens has an estimated external convergence of κ =-0.012^{+0.020}_{-0.013} and is hence marginally under-dense. We also rule out the presence of any halo with a mass greater than Mvir = 1.6 × 10^14 h^-1 M⊙ (68% confidence limit). Our results, consistent with previous studies of this lens, confirm that the intervening mass along the line of sight to HE 0435-1223 does not affect significantly the cosmological results inferred from the time delay measurements of that specific object.

  16. General Rotorcraft Aeromechanical Stability Program (GRASP) - Theory Manual

    DTIC Science & Technology

    1990-10-01

    the A basis. Two symbols frequently encountered in vector operations that use index notation are the Kronecker delta eij and the Levi-Civita epsilon...Blade root cutout fijk Levi-Civita epsilon permutation symbol 0 pretwist angle 0' pretwist per unit length (d;) Oi Tait-Bryan angles K~i moment strains...the components of the identity tensor in a Cartesian coordinate system, while the Levi-Civita epsilon consists of components of the permutation

  17. Using permutations to detect dependence between time series

    NASA Astrophysics Data System (ADS)

    Cánovas, Jose S.; Guillamón, Antonio; Ruíz, María del Carmen

    2011-07-01

    In this paper, we propose an independence test between two time series which is based on permutations. The proposed test can be carried out by means of different common statistics such as Pearson’s chi-square or the likelihood ratio. We also point out why an exact test is necessary. Simulated and real data (return exchange rates between several currencies) reveal the capacity of this test to detect linear and nonlinear dependences.
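
    One plausible reading of the approach described above, given here only as an illustration and not as the authors' exact procedure, is to symbolize both series with ordinal (permutation) patterns and apply a chi-square test of independence to the joint pattern table; the paper also discusses when an exact test is needed instead.

```python
import numpy as np
from math import factorial
from itertools import permutations
from scipy.stats import chi2_contingency

def ordinal_symbols(x, d=3):
    """Map each length-d window of x to the index of its ordinal pattern."""
    lookup = {p: k for k, p in enumerate(permutations(range(d)))}
    return np.array([lookup[tuple(np.argsort(x[i:i + d]))]
                     for i in range(len(x) - d + 1)])

def ordinal_independence_test(x, y, d=3):
    """Chi-square test of independence on the joint ordinal-pattern table."""
    sx, sy = ordinal_symbols(np.asarray(x), d), ordinal_symbols(np.asarray(y), d)
    n_pat = factorial(d)
    table = np.zeros((n_pat, n_pat))
    for a, b in zip(sx, sy):
        table[a, b] += 1
    # drop empty rows/columns so expected frequencies stay positive
    table = table[table.sum(axis=1) > 0][:, table.sum(axis=0) > 0]
    stat, pval = chi2_contingency(table)[:2]
    return stat, pval

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
print(ordinal_independence_test(x, rng.normal(size=2000)))                    # independent
print(ordinal_independence_test(x, 0.7 * x + 0.3 * rng.normal(size=2000)))    # dependent
```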

  18. Testing of Error-Correcting Sparse Permutation Channel Codes

    NASA Technical Reports Server (NTRS)

    Shcheglov, Kirill, V.; Orlov, Sergei S.

    2008-01-01

    A computer program performs Monte Carlo direct numerical simulations for testing sparse permutation channel codes, which offer strong error-correction capabilities at high code rates and are considered especially suitable for storage of digital data in holographic and volume memories. A word in a code of this type is characterized by, among other things, a sparseness parameter (M) and a fixed number (K) of 1 or "on" bits in a channel block length of N.

  19. Scrambled Sobol Sequences via Permutation

    DTIC Science & Technology

    2009-01-01

    [Excerpt consists of figure residue and reference fragments: Figure 3, a class diagram of generator classes (LCG, LCG64, LFG, MLFG, PMLCG, Sobol), scrambler classes (Scrambler, PermutationScrambler, LinearScrambler) and factory classes (StaticFactory, DynamicFactory) with their <<uses>> relationships, followed by bibliography entries on algorithms for generating the scrambled Sobol sequence.]

  20. Optimization and experimental realization of the quantum permutation algorithm

    NASA Astrophysics Data System (ADS)

    Yalçınkaya, I.; Gedik, Z.

    2017-12-01

    The quantum permutation algorithm provides computational speed-up over classical algorithms for determining the parity of a given cyclic permutation. For its n-qubit implementations, the number of required quantum gates scales quadratically with n due to the quantum Fourier transforms included. We show here for the n-qubit case that the algorithm can be simplified so that it requires only O(n) quantum gates, which theoretically reduces the complexity of the implementation. To test our results experimentally, we utilize IBM's 5-qubit quantum processor to realize the algorithm by using the original and simplified recipes for the 2-qubit case. It turns out that the latter results in a significantly higher success probability, which allows us to verify the algorithm more precisely than the previous experimental realizations. We also verify the algorithm for the first time for the 3-qubit case with a considerable success probability by taking advantage of our simplified scheme.

  1. A faster 1.375-approximation algorithm for sorting by transpositions.

    PubMed

    Cunha, Luís Felipe I; Kowada, Luis Antonio B; Hausen, Rodrigo de A; de Figueiredo, Celina M H

    2015-11-01

    Sorting by Transpositions is an NP-hard problem for which several polynomial-time approximation algorithms have been developed. Hartman and Shamir (2006) developed a 1.5-approximation [Formula: see text] algorithm, whose running time was improved to O(n log n) by Feng and Zhu (2007) with a data structure they defined, the permutation tree. Elias and Hartman (2006) developed a 1.375-approximation O(n^2) algorithm, and Firoz et al. (2011) claimed an improvement to the running time, from O(n^2) to O(n log n), by using the permutation tree. We provide counter-examples to the correctness of Firoz et al.'s strategy, showing that it is not possible to reach a component by sufficient extensions using the method proposed by them. In addition, we propose a 1.375-approximation algorithm, modifying Elias and Hartman's approach with the use of permutation trees and achieving O(n log n) time.

  2. A Weak Quantum Blind Signature with Entanglement Permutation

    NASA Astrophysics Data System (ADS)

    Lou, Xiaoping; Chen, Zhigang; Guo, Ying

    2015-09-01

    Motivated by the permutation encryption algorithm, a weak quantum blind signature (QBS) scheme is proposed. It involves three participants, including the sender Alice, the signatory Bob and the trusted entity Charlie, in four phases, i.e., initializing phase, blinding phase, signing phase and verifying phase. In a small-scale quantum computation network, Alice blinds the message based on a quantum entanglement permutation encryption algorithm that embraces the chaotic position string. Bob signs the blinded message with private parameters shared beforehand while Charlie verifies the signature's validity and recovers the original message. Analysis shows that the proposed scheme achieves the secure blindness for the signer and traceability for the message owner with the aid of the authentic arbitrator who plays a crucial role when a dispute arises. In addition, the signature can neither be forged nor disavowed by the malicious attackers. It has a wide application to E-voting and E-payment system, etc.

  3. An extended continuous estimation of distribution algorithm for solving the permutation flow-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Shao, Zhongshi; Pi, Dechang; Shao, Weishi

    2017-11-01

    This article proposes an extended continuous estimation of distribution algorithm (ECEDA) to solve the permutation flow-shop scheduling problem (PFSP). In ECEDA, to make a continuous estimation of distribution algorithm (EDA) suitable for the PFSP, the largest order value rule is applied to convert continuous vectors to discrete job permutations. A probabilistic model based on a mixed Gaussian and Cauchy distribution is built to maintain the exploration ability of the EDA. Two effective local search methods, i.e. revolver-based variable neighbourhood search and Hénon chaotic-based local search, are designed and incorporated into the EDA to enhance the local exploitation. The parameters of the proposed ECEDA are calibrated by means of a design of experiments approach. Simulation results and comparisons based on some benchmark instances show the efficiency of the proposed algorithm for solving the PFSP.
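
    The largest order value rule mentioned above is essentially an argsort. A minimal sketch follows, assuming the convention that the position holding the largest value receives the first job (conventions vary between papers); the function name is ours.

        import numpy as np

        def lov_decode(x):
            # Largest order value rule: rank positions by descending value;
            # the position with the largest value is assigned job 0, and so on.
            order = np.argsort(-np.asarray(x, dtype=float))
            perm = np.empty(len(x), dtype=int)
            perm[order] = np.arange(len(x))
            return perm

        print(lov_decode([0.31, 0.87, 0.12, 0.55]))  # -> [2 0 3 1]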

  4. A New Color Image Encryption Scheme Using CML and a Fractional-Order Chaotic System

    PubMed Central

    Wu, Xiangjun; Li, Yang; Kurths, Jürgen

    2015-01-01

The chaos-based image cryptosystems have been widely investigated in recent years to provide real-time encryption and transmission. In this paper, a novel color image encryption algorithm using coupled-map lattices (CML) and a fractional-order chaotic system is proposed to enhance the security and robustness of encryption algorithms with a permutation-diffusion structure. To make the encryption procedure more confusing and complex, an image division-shuffling process is put forward, where the plain-image is first divided into four sub-images, and then the position of the pixels in the whole image is shuffled. In order to generate initial conditions and parameters of two chaotic systems, a 280-bit long external secret key is employed. Key space analysis, various statistical analyses, information entropy analysis, differential analysis and key sensitivity analysis are introduced to test the security of the new image encryption algorithm. The cryptosystem speed is analyzed and tested as well. Experimental results confirm that, in comparison to other image encryption schemes, the new algorithm has higher security and is fast for practical image encryption. Moreover, an extensive tolerance analysis of some common image processing operations such as noise adding, cropping, JPEG compression, rotation, brightening and darkening has been performed on the proposed image encryption technique. Corresponding results reveal that the proposed image encryption method has good robustness against some image processing operations and geometric attacks. PMID:25826602

  5. Classical mutual information in mean-field spin glass models

    NASA Astrophysics Data System (ADS)

    Alba, Vincenzo; Inglis, Stephen; Pollet, Lode

    2016-03-01

We investigate the classical Rényi entropy S_n and the associated mutual information I_n in the Sherrington-Kirkpatrick (S-K) model, which is the paradigm model of mean-field spin glasses. Using classical Monte Carlo simulations and analytical tools we investigate the S-K model in the n-sheet booklet. This is achieved by gluing together n independent copies of the model, and it is the main ingredient for constructing the Rényi entanglement-related quantities. We find a glassy phase at low temperatures, whereas at high temperatures the model exhibits paramagnetic behavior, consistent with the regular S-K model. The temperature of the paramagnetic-glassy transition depends nontrivially on the geometry of the booklet. At high temperatures we provide the exact solution of the model by exploiting the replica symmetry. This is the permutation symmetry among the fictitious replicas that are used to perform disorder averages (via the replica trick). In the glassy phase the replica symmetry has to be broken. Using a generalization of the Parisi solution, we provide analytical results for S_n and I_n and for standard thermodynamic quantities. Both S_n and I_n exhibit a volume law in the whole phase diagram. We characterize the behavior of the corresponding densities, S_n/N and I_n/N, in the thermodynamic limit. Interestingly, at the critical point the mutual information does not exhibit any crossing for different system sizes, in contrast with local spin models.

  6. DGCA: A comprehensive R package for Differential Gene Correlation Analysis.

    PubMed

    McKenzie, Andrew T; Katsyv, Igor; Song, Won-Min; Wang, Minghui; Zhang, Bin

    2016-11-15

    Dissecting the regulatory relationships between genes is a critical step towards building accurate predictive models of biological systems. A powerful approach towards this end is to systematically study the differences in correlation between gene pairs in more than one distinct condition. In this study we develop an R package, DGCA (for Differential Gene Correlation Analysis), which offers a suite of tools for computing and analyzing differential correlations between gene pairs across multiple conditions. To minimize parametric assumptions, DGCA computes empirical p-values via permutation testing. To understand differential correlations at a systems level, DGCA performs higher-order analyses such as measuring the average difference in correlation and multiscale clustering analysis of differential correlation networks. Through a simulation study, we show that the straightforward z-score based method that DGCA employs significantly outperforms the existing alternative methods for calculating differential correlation. Application of DGCA to the TCGA RNA-seq data in breast cancer not only identifies key changes in the regulatory relationships between TP53 and PTEN and their target genes in the presence of inactivating mutations, but also reveals an immune-related differential correlation module that is specific to triple negative breast cancer (TNBC). DGCA is an R package for systematically assessing the difference in gene-gene regulatory relationships under different conditions. This user-friendly, effective, and comprehensive software tool will greatly facilitate the application of differential correlation analysis in many biological studies and thus will help identification of novel signaling pathways, biomarkers, and targets in complex biological systems and diseases.
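
    The z-score-based differential correlation with empirical permutation p-values can be sketched as follows. This is a minimal illustration of the general idea (Fisher z-transform of the two correlations plus shuffling of condition labels), not the DGCA implementation; all names are ours.

        import numpy as np

        def diff_corr_z(x1, y1, x2, y2):
            # Fisher z-transformed difference between the (x, y) correlations
            # observed in condition 1 and condition 2.
            r1 = np.corrcoef(x1, y1)[0, 1]
            r2 = np.corrcoef(x2, y2)[0, 1]
            se = np.sqrt(1.0 / (len(x1) - 3) + 1.0 / (len(x2) - 3))
            return (np.arctanh(r1) - np.arctanh(r2)) / se

        def permutation_pvalue(x1, y1, x2, y2, n_perm=2000, seed=0):
            # Empirical two-sided p-value: shuffle which samples belong to which condition.
            rng = np.random.default_rng(seed)
            obs = abs(diff_corr_z(x1, y1, x2, y2))
            x = np.concatenate([x1, x2])
            y = np.concatenate([y1, y2])
            n1, count = len(x1), 0
            for _ in range(n_perm):
                idx = rng.permutation(len(x))
                xs, ys = x[idx], y[idx]
                if abs(diff_corr_z(xs[:n1], ys[:n1], xs[n1:], ys[n1:])) >= obs:
                    count += 1
            return (count + 1) / (n_perm + 1)

        rng = np.random.default_rng(0)
        x1 = rng.normal(size=60); y1 = 0.8 * x1 + rng.normal(scale=0.5, size=60)
        x2 = rng.normal(size=60); y2 = rng.normal(size=60)
        print(permutation_pvalue(x1, y1, x2, y2))  # small p-value: the correlation differs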

  7. Chaos in Magnetic Flux Ropes

    NASA Astrophysics Data System (ADS)

    Gekelman, W. N.; DeHaas, T.; Van Compernolle, B.

    2013-12-01

Magnetic flux ropes immersed in a uniform magnetoplasma are observed to twist about themselves, writhe about each other and rotate about a central axis. They are kink unstable and smash into one another as they move. The full three-dimensional magnetic field and flows are measured at thousands of time steps. Each collision results in magnetic field line generation and the generation of a quasi-separatrix layer and induced electric fields. Three-dimensional magnetic field lines are computed by conditionally averaging the data using correlation techniques. The permutation entropy [1], which is related to the Lyapunov exponent, can be calculated from the time series of the magnetic field data (this is also done with flows) and used to calculate the positions of the data on a Jensen-Shannon complexity map [2]. The location of data on this map indicates if the magnetic fields are stochastic, or fall into regions of minimal or maximal complexity. The complexity is a function of space and time. The complexity map and analysis will be explained in the course of the talk. Other types of chaotic dynamical models, such as the Lorenz, Gissinger and Hénon processes, also fall on the map and can give a clue to the nature of the flux rope turbulence. The ropes fall in the region of the C-H plane where chaotic systems lie. The entropy and complexity change in space and time, which reflects the change, and possibly the type, of chaos associated with the ropes. The maps give insight into the type of chaos (deterministic chaos, fractional diffusion, Lévy flights, ...) and the underlying dynamical process. The power spectra of much of the magnetic and flow data are exponential, and Lorentzian structures in the time domain are embedded in them. Other quantities, such as the Hurst exponent, are evaluated for both magnetic fields and plasma flow. Work supported by a UC-LANL Lab fund and the Basic Plasma Science Facility, which is funded by DOE and NSF. 1) C. Bandt, B. Pompe, Phys. Rev. Lett. 88, 174102 (2002). 2) O. A. Rosso et al., Phys. Rev. Lett. 99, 154102 (2007); J. Maggs, G. Morales, 55, 085015 (2013).
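
    The Bandt-Pompe permutation entropy referred to above can be computed directly from the ordinal patterns of a time series. A minimal sketch follows; the embedding dimension m, delay tau and normalization to [0, 1] are conventional choices rather than values taken from the talk.

        import math
        from itertools import permutations
        import numpy as np

        def permutation_entropy(x, m=3, tau=1, normalize=True):
            # Count ordinal patterns of length m (delay tau) and return their Shannon entropy.
            x = np.asarray(x, dtype=float)
            counts = {p: 0 for p in permutations(range(m))}
            for i in range(len(x) - (m - 1) * tau):
                window = x[i:i + m * tau:tau]
                counts[tuple(np.argsort(window))] += 1
            p = np.array([c for c in counts.values() if c > 0], dtype=float)
            p /= p.sum()
            h = -np.sum(p * np.log2(p))
            return h / math.log2(math.factorial(m)) if normalize else h

        rng = np.random.default_rng(1)
        print(permutation_entropy(rng.standard_normal(5000)))  # close to 1 for white noise
        print(permutation_entropy(np.arange(100.0)))            # 0 for a monotone ramp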

  8. Evaluation of Feature Extraction and Recognition for Activity Monitoring and Fall Detection Based on Wearable sEMG Sensors.

    PubMed

    Xi, Xugang; Tang, Minyan; Miran, Seyed M; Luo, Zhizeng

    2017-05-27

As an essential subfield of context awareness, activity awareness, especially daily activity monitoring and fall detection, plays a significant role for elderly or frail people who need assistance in their daily activities. This study investigates the feature extraction and pattern recognition of surface electromyography (sEMG), with the purpose of determining the best features and classifiers of sEMG for daily living activities monitoring and fall detection. This is done through a series of experiments. In the experiments, four channels of sEMG signal from wireless, wearable sensors located on lower limbs are recorded from three subjects while they perform seven activities of daily living (ADL). A simulated trip fall scenario is also considered with a custom-made device attached to the ankle. With this experimental setting, 15 feature extraction methods of sEMG, including time, frequency, time/frequency domain and entropy, are analyzed based on class separability and calculation complexity, and five classification methods, each with 15 features, are estimated with respect to the accuracy rate of recognition and calculation complexity for activity monitoring and fall detection. It is shown that a high accuracy rate of recognition and a minimal calculation time for daily activity monitoring and fall detection can be achieved in the current experimental setting. Specifically, the Wilson Amplitude (WAMP) feature performs the best, and the classifier Gaussian Kernel Support Vector Machine (GK-SVM) with Permutation Entropy (PE) or WAMP results in the highest accuracy for activity monitoring with recognition rates of 97.35% and 96.43%. For fall detection, the classifier Fuzzy Min-Max Neural Network (FMMNN) has the best sensitivity and specificity at the cost of the longest calculation time, while the classifier Gaussian Kernel Fisher Linear Discriminant Analysis (GK-FDA) with the feature WAMP guarantees a high sensitivity (98.70%) and specificity (98.59%) with a short calculation time (65.586 ms), making it a possible choice for pre-impact fall detection. The thorough quantitative comparison of the features and classifiers in this study supports the feasibility of a wireless, wearable sEMG sensor system for automatic activity monitoring and fall detection.

  9. Evaluation of Feature Extraction and Recognition for Activity Monitoring and Fall Detection Based on Wearable sEMG Sensors

    PubMed Central

    Xi, Xugang; Tang, Minyan; Miran, Seyed M.; Luo, Zhizeng

    2017-01-01

As an essential subfield of context awareness, activity awareness, especially daily activity monitoring and fall detection, plays a significant role for elderly or frail people who need assistance in their daily activities. This study investigates the feature extraction and pattern recognition of surface electromyography (sEMG), with the purpose of determining the best features and classifiers of sEMG for daily living activities monitoring and fall detection. This is done through a series of experiments. In the experiments, four channels of sEMG signal from wireless, wearable sensors located on lower limbs are recorded from three subjects while they perform seven activities of daily living (ADL). A simulated trip fall scenario is also considered with a custom-made device attached to the ankle. With this experimental setting, 15 feature extraction methods of sEMG, including time, frequency, time/frequency domain and entropy, are analyzed based on class separability and calculation complexity, and five classification methods, each with 15 features, are estimated with respect to the accuracy rate of recognition and calculation complexity for activity monitoring and fall detection. It is shown that a high accuracy rate of recognition and a minimal calculation time for daily activity monitoring and fall detection can be achieved in the current experimental setting. Specifically, the Wilson Amplitude (WAMP) feature performs the best, and the classifier Gaussian Kernel Support Vector Machine (GK-SVM) with Permutation Entropy (PE) or WAMP results in the highest accuracy for activity monitoring with recognition rates of 97.35% and 96.43%. For fall detection, the classifier Fuzzy Min-Max Neural Network (FMMNN) has the best sensitivity and specificity at the cost of the longest calculation time, while the classifier Gaussian Kernel Fisher Linear Discriminant Analysis (GK-FDA) with the feature WAMP guarantees a high sensitivity (98.70%) and specificity (98.59%) with a short calculation time (65.586 ms), making it a possible choice for pre-impact fall detection. The thorough quantitative comparison of the features and classifiers in this study supports the feasibility of a wireless, wearable sEMG sensor system for automatic activity monitoring and fall detection. PMID:28555016
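
    The Wilson amplitude feature singled out in both records is straightforward to compute. A minimal sketch, with a threshold value that is purely illustrative and not taken from the study:

        import numpy as np

        def wilson_amplitude(emg, threshold=0.05):
            # WAMP: number of times the absolute difference between consecutive
            # sEMG samples exceeds the threshold.
            emg = np.asarray(emg, dtype=float)
            return int(np.sum(np.abs(np.diff(emg)) > threshold))

        rng = np.random.default_rng(0)
        print(wilson_amplitude(rng.normal(scale=0.1, size=1000)))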

  10. Unifying the rotational and permutation symmetry of nuclear spin states: Schur-Weyl duality in molecular physics.

    PubMed

    Schmiedt, Hanno; Jensen, Per; Schlemmer, Stephan

    2016-08-21

    In modern physics and chemistry concerned with many-body systems, one of the mainstays is identical-particle-permutation symmetry. In particular, both the intra-molecular dynamics of a single molecule and the inter-molecular dynamics associated, for example, with reactive molecular collisions are strongly affected by selection rules originating in nuclear-permutation symmetry operations being applied to the total internal wavefunctions, including nuclear spin, of the molecules involved. We propose here a general tool to determine coherently the permutation symmetry and the rotational symmetry (associated with the group of arbitrary rotations of the entire molecule in space) of molecular wavefunctions, in particular the nuclear-spin functions. Thus far, these two symmetries were believed to be mutually independent and it has even been argued that under certain circumstances, it is impossible to establish a one-to-one correspondence between them. However, using the Schur-Weyl duality theorem we show that the two types of symmetry are inherently coupled. In addition, we use the ingenious representation-theory technique of Young tableaus to represent the molecular nuclear-spin degrees of freedom in terms of well-defined mathematical objects. This simplifies the symmetry classification of the nuclear wavefunction even for large molecules. Also, the application to reactive collisions is very straightforward and provides a much simplified approach to obtaining selection rules.

  11. Unifying the rotational and permutation symmetry of nuclear spin states: Schur-Weyl duality in molecular physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmiedt, Hanno; Schlemmer, Stephan; Jensen, Per, E-mail: jensen@uni-wuppertal.de

In modern physics and chemistry concerned with many-body systems, one of the mainstays is identical-particle-permutation symmetry. In particular, both the intra-molecular dynamics of a single molecule and the inter-molecular dynamics associated, for example, with reactive molecular collisions are strongly affected by selection rules originating in nuclear-permutation symmetry operations being applied to the total internal wavefunctions, including nuclear spin, of the molecules involved. We propose here a general tool to determine coherently the permutation symmetry and the rotational symmetry (associated with the group of arbitrary rotations of the entire molecule in space) of molecular wavefunctions, in particular the nuclear-spin functions. Thus far, these two symmetries were believed to be mutually independent and it has even been argued that under certain circumstances, it is impossible to establish a one-to-one correspondence between them. However, using the Schur-Weyl duality theorem we show that the two types of symmetry are inherently coupled. In addition, we use the ingenious representation-theory technique of Young tableaus to represent the molecular nuclear-spin degrees of freedom in terms of well-defined mathematical objects. This simplifies the symmetry classification of the nuclear wavefunction even for large molecules. Also, the application to reactive collisions is very straightforward and provides a much simplified approach to obtaining selection rules.

  12. Potential energy surface fitting by a statistically localized, permutationally invariant, local interpolating moving least squares method for the many-body potential: Method and application to N{sub 4}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, Jason D.; Doraiswamy, Sriram; Candler, Graham V., E-mail: truhlar@umn.edu, E-mail: candler@aem.umn.edu

    2014-02-07

Fitting potential energy surfaces to analytic forms is an important first step for efficient molecular dynamics simulations. Here, we present an improved version of the local interpolating moving least squares method (L-IMLS) for such fitting. Our method has three key improvements. First, pairwise interactions are modeled separately from many-body interactions. Second, permutational invariance is incorporated in the basis functions, using permutationally invariant polynomials in Morse variables, and in the weight functions. Third, computational cost is reduced by statistical localization, in which we statistically correlate the cutoff radius with data point density. We motivate our discussion in this paper with a review of global and local least-squares-based fitting methods in one dimension. Then, we develop our method in six dimensions, and we note that it allows the analytic evaluation of gradients, a feature that is important for molecular dynamics. The approach, which we call statistically localized, permutationally invariant, local interpolating moving least squares fitting of the many-body potential (SL-PI-L-IMLS-MP, or, more simply, L-IMLS-G2), is used to fit a potential energy surface to an electronic structure dataset for N{sub 4}. We discuss its performance on the dataset and give directions for further research, including applications to trajectory calculations.

  13. Multiscale Modeling in Computational Biomechanics: Determining Computational Priorities and Addressing Current Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tawhai, Merryn; Bischoff, Jeff; Einstein, Daniel R.

    2009-05-01

In this article, we describe some current multiscale modeling issues in computational biomechanics from the perspective of the musculoskeletal and respiratory systems and mechanotransduction. First, we outline the necessity of multiscale simulations in these biological systems. Then we summarize challenges inherent to multiscale biomechanics modeling, regardless of the subdiscipline, followed by computational challenges that are system-specific. We discuss some of the current tools that have been utilized to aid research in multiscale mechanics simulations, and the priorities to further the field of multiscale biomechanics computation.

  14. Inferring Species Richness and Turnover by Statistical Multiresolution Texture Analysis of Satellite Imagery

    PubMed Central

    Convertino, Matteo; Mangoubi, Rami S.; Linkov, Igor; Lowry, Nathan C.; Desai, Mukund

    2012-01-01

Background: The quantification of species-richness and species-turnover is essential to effective monitoring of ecosystems. Wetland ecosystems are particularly in need of such monitoring due to their sensitivity to rainfall, water management and other external factors that affect hydrology, soil, and species patterns. A key challenge for environmental scientists is determining the linkage between natural and human stressors, and the effect of that linkage at the species level in space and time. We propose pixel-intensity-based Shannon entropy for estimating species-richness, and introduce a method based on statistical wavelet multiresolution texture analysis to quantitatively assess interseasonal and interannual species turnover. Methodology/Principal Findings: We model satellite images of regions of interest as textures. We define a texture in an image as a spatial domain where the variations in pixel intensity across the image are both stochastic and multiscale. To compare two textures quantitatively, we first obtain a multiresolution wavelet decomposition of each. Either an appropriate probability density function (pdf) model for the coefficients at each subband is selected, and its parameters estimated, or a non-parametric approach using histograms is adopted. We choose the former, where the wavelet coefficients of the multiresolution decomposition at each subband are modeled as samples from the generalized Gaussian pdf. We then obtain the joint pdf for the coefficients for all subbands, assuming independence across subbands; an approximation that simplifies the computational burden significantly without sacrificing the ability to statistically distinguish textures. We measure the difference between two textures' representative pdf's via the Kullback-Leibler divergence (KL). Species turnover, or diversity, is estimated using both this KL divergence and the difference in Shannon entropy. Additionally, we predict species richness, or diversity, based on the Shannon entropy of pixel intensity. To test our approach, we specifically use the green band of Landsat images for a water conservation area in the Florida Everglades. We validate our predictions against species occurrence data over a twenty-eight-year period for both wet and dry seasons. Our method correctly predicts 73% of species richness. For species turnover, the newly proposed KL divergence prediction performance is near 100% accurate. This represents a significant improvement over the more conventional Shannon entropy difference, which provides 85% accuracy. Furthermore, we find that changes in soil and water patterns, as measured by fluctuations of the Shannon entropy for the red and blue bands respectively, are positively correlated with changes in vegetation. The fluctuations are smaller in the wet season when compared to the dry season. Conclusions/Significance: Texture-based statistical multiresolution image analysis is a promising method for quantifying interseasonal differences and, consequently, the degree to which vegetation, soil, and water patterns vary. The proposed automated method for quantifying species richness and turnover can also provide analysis at higher spatial and temporal resolution than is currently obtainable from expensive monitoring campaigns, thus enabling more prompt, more cost-effective inference and decision-making support regarding anomalous variations in biodiversity.
Additionally, a matrix-based visualization of the statistical multiresolution analysis is presented to facilitate both insight and quick recognition of anomalous data. PMID:23115629
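
    The two quantities at the core of the method, pixel-intensity Shannon entropy as a richness proxy and a divergence between texture distributions for turnover, can be illustrated with plain intensity histograms. The study itself fits generalized Gaussian models to wavelet subbands, so the sketch below is a deliberate simplification; the bin count and function names are ours.

        import numpy as np

        def shannon_entropy(image, bins=256):
            # Shannon entropy of pixel intensities.
            hist, _ = np.histogram(np.asarray(image, dtype=float), bins=bins)
            p = hist[hist > 0] / hist.sum()
            return float(-np.sum(p * np.log2(p)))

        def kl_divergence(image_a, image_b, bins=256, eps=1e-12):
            # Histogram-based Kullback-Leibler divergence between two images'
            # intensity distributions (simplified stand-in for the subband-pdf comparison).
            a = np.asarray(image_a, dtype=float)
            b = np.asarray(image_b, dtype=float)
            lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
            pa, _ = np.histogram(a, bins=bins, range=(lo, hi))
            pb, _ = np.histogram(b, bins=bins, range=(lo, hi))
            pa = pa / pa.sum() + eps
            pb = pb / pb.sum() + eps
            return float(np.sum(pa * np.log(pa / pb)))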

  15. Brain Computation Is Organized via Power-of-Two-Based Permutation Logic.

    PubMed

    Xie, Kun; Fox, Grace E; Liu, Jun; Lyu, Cheng; Lee, Jason C; Kuang, Hui; Jacobs, Stephanie; Li, Meng; Liu, Tianming; Song, Sen; Tsien, Joe Z

    2016-01-01

There is considerable scientific interest in understanding how cell assemblies-the long-presumed computational motif-are organized so that the brain can generate intelligent cognition and flexible behavior. The Theory of Connectivity proposes that the origin of intelligence is rooted in a power-of-two-based permutation logic (N = 2^i - 1), producing specific-to-general cell-assembly architecture capable of generating specific perceptions and memories, as well as generalized knowledge and flexible actions. We show that this power-of-two-based permutation logic is widely used in cortical and subcortical circuits across animal species and is conserved for the processing of a variety of cognitive modalities including appetitive, emotional and social information. However, modulatory neurons, such as dopaminergic (DA) neurons, use a simpler logic despite their distinct subtypes. Interestingly, this specific-to-general permutation logic remained largely intact although NMDA receptors-the synaptic switch for learning and memory-were deleted throughout adulthood, suggesting that the logic is developmentally pre-configured. Moreover, this computational logic is implemented in the cortex via combining a random-connectivity strategy in superficial layers 2/3 with nonrandom organizations in deep layers 5/6. This randomness of layers 2/3 cliques-which preferentially encode specific and low-combinatorial features and project inter-cortically-is ideal for maximizing cross-modality novel pattern-extraction, pattern-discrimination and pattern-categorization using sparse code, consequently explaining why it requires hippocampal offline-consolidation. In contrast, the nonrandomness in layers 5/6-which consists of few specific cliques but a higher portion of more general cliques projecting mostly to subcortical systems-is ideal for feedback-control of motivation, emotion, consciousness and behaviors. These observations suggest that the brain's basic computational algorithm is indeed organized by the power-of-two-based permutation logic. This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex.

  16. Brain Computation Is Organized via Power-of-Two-Based Permutation Logic

    PubMed Central

    Xie, Kun; Fox, Grace E.; Liu, Jun; Lyu, Cheng; Lee, Jason C.; Kuang, Hui; Jacobs, Stephanie; Li, Meng; Liu, Tianming; Song, Sen; Tsien, Joe Z.

    2016-01-01

There is considerable scientific interest in understanding how cell assemblies—the long-presumed computational motif—are organized so that the brain can generate intelligent cognition and flexible behavior. The Theory of Connectivity proposes that the origin of intelligence is rooted in a power-of-two-based permutation logic (N = 2^i - 1), producing specific-to-general cell-assembly architecture capable of generating specific perceptions and memories, as well as generalized knowledge and flexible actions. We show that this power-of-two-based permutation logic is widely used in cortical and subcortical circuits across animal species and is conserved for the processing of a variety of cognitive modalities including appetitive, emotional and social information. However, modulatory neurons, such as dopaminergic (DA) neurons, use a simpler logic despite their distinct subtypes. Interestingly, this specific-to-general permutation logic remained largely intact although NMDA receptors—the synaptic switch for learning and memory—were deleted throughout adulthood, suggesting that the logic is developmentally pre-configured. Moreover, this computational logic is implemented in the cortex via combining a random-connectivity strategy in superficial layers 2/3 with nonrandom organizations in deep layers 5/6. This randomness of layers 2/3 cliques—which preferentially encode specific and low-combinatorial features and project inter-cortically—is ideal for maximizing cross-modality novel pattern-extraction, pattern-discrimination and pattern-categorization using sparse code, consequently explaining why it requires hippocampal offline-consolidation. In contrast, the nonrandomness in layers 5/6—which consists of few specific cliques but a higher portion of more general cliques projecting mostly to subcortical systems—is ideal for feedback-control of motivation, emotion, consciousness and behaviors. These observations suggest that the brain’s basic computational algorithm is indeed organized by the power-of-two-based permutation logic. This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex. PMID:27895562

  17. Community Multiscale Air Quality Model

    EPA Science Inventory

    The U.S. EPA developed the Community Multiscale Air Quality (CMAQ) system to apply a “one atmosphere” multiscale and multi-pollutant modeling approach based mainly on the “first principles” description of the atmosphere. The multiscale capability is supported by the governing di...

  18. Successful attack on permutation-parity-machine-based neural cryptography.

    PubMed

    Seoane, Luís F; Ruttor, Andreas

    2012-02-01

    An algorithm is presented which implements a probabilistic attack on the key-exchange protocol based on permutation parity machines. Instead of imitating the synchronization of the communicating partners, the strategy consists of a Monte Carlo method to sample the space of possible weights during inner rounds and an analytic approach to convey the extracted information from one outer round to the next one. The results show that the protocol under attack fails to synchronize faster than an eavesdropper using this algorithm.

  19. Crossbar Switches For Optical Data-Communication Networks

    NASA Technical Reports Server (NTRS)

    Monacos, Steve P.

    1994-01-01

    Optoelectronic and electro-optical crossbar switches called "permutation engines" (PE's) developed to route packets of data through fiber-optic communication networks. Basic network concept described in "High-Speed Optical Wide-Area Data-Communication Network" (NPO-18983). Nonblocking operation achieved by decentralized switching and control scheme. Each packet routed up or down in each column of this 5-input/5-output permutation engine. Routing algorithm ensures each packet arrives at its designated output port without blocking any other packet that does not contend for same output port.

  20. Security of the Five-Round KASUMI Type Permutation

    NASA Astrophysics Data System (ADS)

    Iwata, Tetsu; Yagi, Tohru; Kurosawa, Kaoru

KASUMI is a blockcipher that forms the heart of the 3GPP confidentiality and integrity algorithms. In this paper, we study the security of the five-round KASUMI type permutations, and derive a highly non-trivial security bound against adversaries with adaptive chosen plaintext and chosen ciphertext attacks. To derive our security bound, we make heavy use of tools from graph theory. Although the result does not establish super-pseudorandomness, it gives us strong evidence that the design of KASUMI is sound.

  1. Whole-Lesion Apparent Diffusion Coefficient-Based Entropy-Related Parameters for Characterizing Cervical Cancers: Initial Findings.

    PubMed

    Guan, Yue; Li, Weifeng; Jiang, Zhuoran; Chen, Ying; Liu, Song; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-12-01

This study aimed to develop whole-lesion apparent diffusion coefficient (ADC)-based entropy-related parameters of cervical cancer to preliminarily assess intratumoral heterogeneity of this lesion in comparison to adjacent normal cervical tissues. A total of 51 women (mean age, 49 years) with cervical cancers confirmed by biopsy underwent 3-T pelvic diffusion-weighted magnetic resonance imaging with b values of 0 and 800 s/mm² prospectively. ADC-based entropy-related parameters including first-order entropy and second-order entropies were derived from the whole tumor volume as well as adjacent normal cervical tissues. Intraclass correlation coefficient, Wilcoxon test with Bonferroni correction, Kruskal-Wallis test, and receiver operating characteristic curve were used for statistical analysis. All the parameters showed excellent interobserver agreement (all intraclass correlation coefficients > 0.900). Entropy, entropy(H)_0, entropy(H)_45, entropy(H)_90, entropy(H)_135, and entropy(H)_mean were significantly higher, whereas entropy(H)_range and entropy(H)_std were significantly lower in cervical cancers compared to adjacent normal cervical tissues (all P < .0001). Kruskal-Wallis test showed that there were no significant differences among the values of various second-order entropies including entropy(H)_0, entropy(H)_45, entropy(H)_90, entropy(H)_135, and entropy(H)_mean. All second-order entropies had larger area under the receiver operating characteristic curve than first-order entropy in differentiating cervical cancers from adjacent normal cervical tissues. Further, entropy(H)_45, entropy(H)_90, entropy(H)_135, and entropy(H)_mean had the same largest area under the receiver operating characteristic curve of 0.867. Whole-lesion ADC-based entropy-related parameters of cervical cancers were developed successfully, which showed initial potential in characterizing intratumoral heterogeneity in comparison to adjacent normal cervical tissues. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
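
    Directional second-order entropies of the kind reported here (entropy(H)_0 through entropy(H)_135) are commonly derived from gray-level co-occurrence matrices. The sketch below uses scikit-image's graycomatrix for that purpose; this tooling choice, the one-pixel distance and the function name are our assumptions, not the authors' ADC pipeline.

        import numpy as np
        from skimage.feature import graycomatrix

        def second_order_entropies(img_uint8):
            # Co-occurrence (second-order) entropy at 0, 45, 90 and 135 degrees.
            angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
            glcm = graycomatrix(img_uint8, distances=[1], angles=angles,
                                levels=256, symmetric=True, normed=True)
            out = {}
            for k, name in enumerate(["0", "45", "90", "135"]):
                p = glcm[:, :, 0, k]
                p = p[p > 0]
                out[name] = float(-np.sum(p * np.log2(p)))
            return out

        img = (np.random.default_rng(0).random((64, 64)) * 255).astype(np.uint8)
        print(second_order_entropies(img))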

  2. Development of Semantic Description for Multiscale Models of Thermo-Mechanical Treatment of Metal Alloys

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Regulski, Krzysztof

    2016-08-01

    We present a process of semantic meta-model development for data management in an adaptable multiscale modeling framework. The main problems in ontology design are discussed, and a solution achieved as a result of the research is presented. The main concepts concerning the application and data management background for multiscale modeling were derived from the AM3 approach—object-oriented Agile multiscale modeling methodology. The ontological description of multiscale models enables validation of semantic correctness of data interchange between submodels. We also present a possibility of using the ontological model as a supervisor in conjunction with a multiscale model controller and a knowledge base system. Multiscale modeling formal ontology (MMFO), designed for describing multiscale models' data and structures, is presented. A need for applying meta-ontology in the MMFO development process is discussed. Examples of MMFO application in describing thermo-mechanical treatment of metal alloys are discussed. Present and future applications of MMFO are described.

  3. Comparison of Multiscale Method of Cells-Based Models for Predicting Elastic Properties of Filament Wound C/C-SiC

    NASA Technical Reports Server (NTRS)

    Pineda, Evan J.; Fassin, Marek; Bednarcyk, Brett A.; Reese, Stefanie; Simon, Jaan-Willem

    2017-01-01

    Three different multiscale models, based on the method of cells (generalized and high fidelity) micromechanics models were developed and used to predict the elastic properties of C/C-SiC composites. In particular, the following multiscale modeling strategies were employed: Concurrent multiscale modeling of all phases using the generalized method of cells, synergistic (two-way coupling in space) multiscale modeling with the generalized method of cells, and hierarchical (one-way coupling in space) multiscale modeling with the high fidelity generalized method of cells. The three models are validated against data from a hierarchical multiscale finite element model in the literature for a repeating unit cell of C/C-SiC. Furthermore, the multiscale models are used in conjunction with classical lamination theory to predict the stiffness of C/C-SiC plates manufactured via a wet filament winding and liquid silicon infiltration process recently developed by the German Aerospace Institute.

  4. A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.

    PubMed

    Brusco, Michael J; Steinley, Douglas

    2012-02-01

    There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.

  5. Automated matching of corresponding seed images of three simulator radiographs to allow 3D triangulation of implanted seeds.

    PubMed

    Altschuler, M D; Kassaee, A

    1997-02-01

To match corresponding seed images in different radiographs so that the 3D seed locations can be triangulated automatically and without ambiguity requires (at least) three radiographs taken from different perspectives, and an algorithm that finds the proper permutations of the seed-image indices. Matching corresponding images in only two radiographs introduces inherent ambiguities which can be resolved only with the use of non-positional information obtained with intensive human effort. Matching images in three or more radiographs is an 'NP (non-deterministic polynomial time)-complete' problem. Although the matching problem is fundamental, current methods for three-radiograph seed-image matching use 'local' (seed-by-seed) methods that may lead to incorrect matchings. We describe a permutation-sampling method which not only gives good 'global' (full permutation) matches for the NP-complete three-radiograph seed-matching problem, but also determines the reliability of the radiographic data themselves, namely, whether the patient moved in the interval between radiographic perspectives.

  6. Automated matching of corresponding seed images of three simulator radiographs to allow 3D triangulation of implanted seeds

    NASA Astrophysics Data System (ADS)

    Altschuler, Martin D.; Kassaee, Alireza

    1997-02-01

To match corresponding seed images in different radiographs so that the 3D seed locations can be triangulated automatically and without ambiguity requires (at least) three radiographs taken from different perspectives, and an algorithm that finds the proper permutations of the seed-image indices. Matching corresponding images in only two radiographs introduces inherent ambiguities which can be resolved only with the use of non-positional information obtained with intensive human effort. Matching images in three or more radiographs is an `NP (non-deterministic polynomial time)-complete' problem. Although the matching problem is fundamental, current methods for three-radiograph seed-image matching use `local' (seed-by-seed) methods that may lead to incorrect matchings. We describe a permutation-sampling method which not only gives good `global' (full permutation) matches for the NP-complete three-radiograph seed-matching problem, but also determines the reliability of the radiographic data themselves, namely, whether the patient moved in the interval between radiographic perspectives.

  7. Exploiting Lipid Permutation Symmetry to Compute Membrane Remodeling Free Energies.

    PubMed

    Bubnis, Greg; Risselada, Herre Jelger; Grubmüller, Helmut

    2016-10-28

    A complete physical description of membrane remodeling processes, such as fusion or fission, requires knowledge of the underlying free energy landscapes, particularly in barrier regions involving collective shape changes, topological transitions, and high curvature, where Canham-Helfrich (CH) continuum descriptions may fail. To calculate these free energies using atomistic simulations, one must address not only the sampling problem due to high free energy barriers, but also an orthogonal sampling problem of combinatorial complexity stemming from the permutation symmetry of identical lipids. Here, we solve the combinatorial problem with a permutation reduction scheme to map a structural ensemble into a compact, nondegenerate subregion of configuration space, thereby permitting straightforward free energy calculations via umbrella sampling. We applied this approach, using a coarse-grained lipid model, to test the CH description of bending and found sharp increases in the bending modulus for curvature radii below 10 nm. These deviations suggest that an anharmonic bending term may be required for CH models to give quantitative energetics of highly curved states.

  8. A fast chaos-based image encryption scheme with a dynamic state variables selection mechanism

    NASA Astrophysics Data System (ADS)

    Chen, Jun-xin; Zhu, Zhi-liang; Fu, Chong; Yu, Hai; Zhang, Li-bo

    2015-03-01

    In recent years, a variety of chaos-based image cryptosystems have been investigated to meet the increasing demand for real-time secure image transmission. Most of them are based on permutation-diffusion architecture, in which permutation and diffusion are two independent procedures with fixed control parameters. This property results in two flaws. (1) At least two chaotic state variables are required for encrypting one plain pixel, in permutation and diffusion stages respectively. Chaotic state variables produced with high computation complexity are not sufficiently used. (2) The key stream solely depends on the secret key, and hence the cryptosystem is vulnerable against known/chosen-plaintext attacks. In this paper, a fast chaos-based image encryption scheme with a dynamic state variables selection mechanism is proposed to enhance the security and promote the efficiency of chaos-based image cryptosystems. Experimental simulations and extensive cryptanalysis have been carried out and the results prove the superior security and high efficiency of the scheme.
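
    A toy permutation-diffusion round illustrates the architecture under discussion: one chaotic sequence shuffles pixel positions, and a second one is turned into a byte stream for diffusion. It is deliberately simplified (plain XOR diffusion, a single logistic map, no dynamic state-variable selection), so it is neither the paper's scheme nor secure; all names and parameter values are ours.

        import numpy as np

        def logistic_stream(x0, n, mu=3.99):
            # Generate n samples from the logistic map as a toy key stream.
            xs, x = np.empty(n), x0
            for i in range(n):
                x = mu * x * (1.0 - x)
                xs[i] = x
            return xs

        def encrypt(pixels, x0=0.3141592):
            # Permutation stage: shuffle positions; diffusion stage: XOR with a byte stream.
            pixels = np.asarray(pixels, dtype=np.uint8).ravel()
            n = len(pixels)
            perm = np.argsort(logistic_stream(x0, n))
            key = (logistic_stream(x0 / 2.0, n) * 255).astype(np.uint8)
            return pixels[perm] ^ key, perm, key

        def decrypt(cipher, perm, key):
            shuffled = cipher ^ key
            plain = np.empty_like(shuffled)
            plain[perm] = shuffled
            return plain

        img = np.arange(16, dtype=np.uint8)
        cipher, perm, key = encrypt(img)
        assert np.array_equal(decrypt(cipher, perm, key), img)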

  9. Simultaneous and Sequential MS/MS Scan Combinations and Permutations in a Linear Quadrupole Ion Trap.

    PubMed

    Snyder, Dalton T; Szalwinski, Lucas J; Cooks, R Graham

    2017-10-17

    Methods of performing precursor ion scans as well as neutral loss scans in a single linear quadrupole ion trap have recently been described. In this paper we report methodology for performing permutations of MS/MS scan modes, that is, ordered combinations of precursor, product, and neutral loss scans following a single ion injection event. Only particular permutations are allowed; the sequences demonstrated here are (1) multiple precursor ion scans, (2) precursor ion scans followed by a single neutral loss scan, (3) precursor ion scans followed by product ion scans, and (4) segmented neutral loss scans. (5) The common product ion scan can be performed earlier in these sequences, under certain conditions. Simultaneous scans can also be performed. These include multiple precursor ion scans, precursor ion scans with an accompanying neutral loss scan, and multiple neutral loss scans. We argue that the new capability to perform complex simultaneous and sequential MS n operations on single ion populations represents a significant step in increasing the selectivity of mass spectrometry.

  10. Permutation coding technique for image recognition systems.

    PubMed

    Kussul, Ernst M; Baidyk, Tatiana N; Wunsch, Donald C; Makeyev, Oleksandr; Martín, Anabel

    2006-11-01

A feature extractor and neural classifier for image recognition systems are proposed. The proposed feature extractor is based on the concept of random local descriptors (RLDs). It is followed by the encoder, which is based on the permutation coding technique; this makes it possible to take into account not only the detected features but also the position of each feature in the image, and to make the recognition process invariant to small displacements. The combination of RLDs and permutation coding permits us to obtain a sufficiently general description of the image to be recognized. The code generated by the encoder is used as input data for the neural classifier. Different types of images were used to test the proposed image recognition system. It was tested in the handwritten digit recognition problem, the face recognition problem, and the microobject shape recognition problem. The results of testing are very promising. The error rate for the Modified National Institute of Standards and Technology (MNIST) database is 0.44% and for the Olivetti Research Laboratory (ORL) database it is 0.1%.

  11. A permutationally invariant full-dimensional ab initio potential energy surface for the abstraction and exchange channels of the H + CH{sub 4} system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jun, E-mail: jli15@cqu.edu.cn, E-mail: zhangdh@dicp.ac.cn; Department of Chemistry and Chemical Biology, University of New Mexico, Albuquerque, New Mexico 87131; Chen, Jun

    2015-05-28

We report a permutationally invariant global potential energy surface (PES) for the H + CH{sub 4} system based on ∼63 000 data points calculated at a high ab initio level (UCCSD(T)-F12a/AVTZ) using the recently proposed permutation invariant polynomial-neural network method. The small fitting error (5.1 meV) indicates a faithful representation of the ab initio points over a large configuration space. The rate coefficients calculated on the PES using tunneling corrected transition-state theory and quasi-classical trajectory are found to agree well with the available experimental and previous quantum dynamical results. The calculated total reaction probabilities (J{sub tot} = 0) including the abstraction and exchange channels using the new potential by a reduced dimensional quantum dynamic method are essentially the same as those on the Xu-Chen-Zhang PES [Chin. J. Chem. Phys. 27, 373 (2014)].

  12. Rank-based permutation approaches for non-parametric factorial designs.

    PubMed

    Umlauft, Maria; Konietschke, Frank; Pauly, Markus

    2017-11-01

    Inference methods for null hypotheses formulated in terms of distribution functions in general non-parametric factorial designs are studied. The methods can be applied to continuous, ordinal or even ordered categorical data in a unified way, and are based only on ranks. In this set-up Wald-type statistics and ANOVA-type statistics are the current state of the art. The first method is asymptotically exact but a rather liberal statistical testing procedure for small to moderate sample size, while the latter is only an approximation which does not possess the correct asymptotic α level under the null. To bridge these gaps, a novel permutation approach is proposed which can be seen as a flexible generalization of the Kruskal-Wallis test to all kinds of factorial designs with independent observations. It is proven that the permutation principle is asymptotically correct while keeping its finite exactness property when data are exchangeable. The results of extensive simulation studies foster these theoretical findings. A real data set exemplifies its applicability. © 2017 The British Psychological Society.
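
    A minimal rank-based permutation test for a one-way layout conveys the principle: group labels are shuffled and the rank statistic recomputed to build the reference distribution. The Kruskal-Wallis statistic is chosen here for brevity; the Wald-type and ANOVA-type statistics studied in the paper are more general, and the use of scipy is our own choice.

        import numpy as np
        from scipy.stats import kruskal

        def rank_permutation_test(groups, n_perm=5000, seed=0):
            # Permutation p-value for the Kruskal-Wallis statistic.
            rng = np.random.default_rng(seed)
            obs = kruskal(*groups).statistic
            pooled = np.concatenate(groups)
            sizes = [len(g) for g in groups]
            count = 0
            for _ in range(n_perm):
                perm = rng.permutation(pooled)
                resampled, start = [], 0
                for s in sizes:
                    resampled.append(perm[start:start + s])
                    start += s
                if kruskal(*resampled).statistic >= obs:
                    count += 1
            return (count + 1) / (n_perm + 1)

        rng = np.random.default_rng(0)
        groups = [rng.normal(0.0, 1, 12), rng.normal(0.8, 1, 12), rng.normal(0.0, 1, 12)]
        print(rank_permutation_test(groups))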

  13. On quantum Rényi entropies: A new generalization and some properties

    NASA Astrophysics Data System (ADS)

    Müller-Lennert, Martin; Dupuis, Frédéric; Szehr, Oleg; Fehr, Serge; Tomamichel, Marco

    2013-12-01

    The Rényi entropies constitute a family of information measures that generalizes the well-known Shannon entropy, inheriting many of its properties. They appear in the form of unconditional and conditional entropies, relative entropies, or mutual information, and have found many applications in information theory and beyond. Various generalizations of Rényi entropies to the quantum setting have been proposed, most prominently Petz's quasi-entropies and Renner's conditional min-, max-, and collision entropy. However, these quantum extensions are incompatible and thus unsatisfactory. We propose a new quantum generalization of the family of Rényi entropies that contains the von Neumann entropy, min-entropy, collision entropy, and the max-entropy as special cases, thus encompassing most quantum entropies in use today. We show several natural properties for this definition, including data-processing inequalities, a duality relation, and an entropic uncertainty relation.
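
    For orientation, the classical Rényi entropy that these quantum definitions generalize is reproduced below; the quantum (sandwiched) definition itself is not sketched here, and the example distribution is ours.

        import numpy as np

        def renyi_entropy(p, alpha):
            # H_alpha(p) = log2(sum_i p_i**alpha) / (1 - alpha); alpha -> 1 gives Shannon entropy.
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            if np.isclose(alpha, 1.0):
                return float(-np.sum(p * np.log2(p)))
            return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

        p = [0.5, 0.25, 0.125, 0.125]
        print(renyi_entropy(p, 0.5), renyi_entropy(p, 1.0), renyi_entropy(p, 2.0))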

  14. Structure-based Design of Cyclically Permuted HIV-1 gp120 Trimers That Elicit Neutralizing Antibodies*

    PubMed Central

    Kesavardhana, Sannula; Das, Raksha; Citron, Michael; Datta, Rohini; Ecto, Linda; Srilatha, Nonavinakere Seetharam; DiStefano, Daniel; Swoyer, Ryan; Joyce, Joseph G.; Dutta, Somnath; LaBranche, Celia C.; Montefiori, David C.; Flynn, Jessica A.; Varadarajan, Raghavan

    2017-01-01

A major goal for HIV-1 vaccine development is an ability to elicit strong and durable broadly neutralizing antibody (bNAb) responses. The trimeric envelope glycoprotein (Env) spikes on HIV-1 are known to contain multiple epitopes that are susceptible to bNAbs isolated from infected individuals. Nonetheless, all trimeric and monomeric Env immunogens designed to date have failed to elicit such antibodies. We report the structure-guided design of HIV-1 cyclically permuted gp120 that forms homogeneous, stable trimers, and displays enhanced binding to multiple bNAbs, including VRC01, VRC03, VRC-PG04, PGT128, and the quaternary epitope-specific bNAbs PGT145 and PGDM1400. Constructs that were cyclically permuted in the V1 loop region and contained an N-terminal trimerization domain to stabilize V1V2-mediated quaternary interactions showed the highest homogeneity and the best antigenic characteristics. In guinea pigs, a DNA prime-protein boost regimen with these new gp120 trimer immunogens elicited potent neutralizing antibody responses against highly sensitive Tier 1A isolates and weaker neutralizing antibody responses with an average titer of about 115 against a panel of heterologous Tier 2 isolates. A modest fraction of the Tier 2 virus neutralizing activity appeared to target the CD4 binding site on gp120. These results suggest that cyclically permuted HIV-1 gp120 trimers represent a viable platform in which further modifications may be made to eventually achieve protective bNAb responses. PMID:27879316

  15. Upper entropy axioms and lower entropy axioms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Jin-Li, E-mail: phd5816@163.com; Suo, Qi

    2015-04-15

The paper suggests the concepts of an upper entropy and a lower entropy. We propose a new axiomatic definition, namely, upper entropy axioms, inspired by axioms of metric spaces, and also formulate lower entropy axioms. We also develop weak upper entropy axioms and weak lower entropy axioms. Their conditions are weaker than those of Shannon–Khinchin axioms and Tsallis axioms, while these conditions are stronger than those of the axiomatics based on the first three Shannon–Khinchin axioms. The subadditivity and strong subadditivity of entropy are obtained in the new axiomatics. Tsallis statistics is a special case satisfying our axioms. Moreover, different forms of information measures, such as Shannon entropy, Daroczy entropy, Tsallis entropy and other entropies, can be unified under the same axiomatics.

  16. Two-level optimization of composite wing structures based on panel genetic optimization

    NASA Astrophysics Data System (ADS)

    Liu, Boyang

The design of complex composite structures used in aerospace or automotive vehicles presents a major challenge in terms of computational cost. Discrete choices for ply thicknesses and ply angles lead to a combinatorial optimization problem that is too expensive to solve with presently available computational resources. We developed the following methodology for handling this problem for wing structural design: we used a two-level optimization approach with response-surface approximations to optimize panel failure loads for the upper-level wing optimization. We tailored efficient permutation genetic algorithms to the panel stacking sequence design on the lower level. We also developed an approach for improving continuity of ply stacking sequences among adjacent panels. The decomposition approach led to a lower-level optimization of stacking sequence with a given number of plies in each orientation. An efficient permutation genetic algorithm (GA) was developed for handling this problem. We demonstrated through examples that the permutation GAs are more efficient for stacking sequence optimization than a standard GA. Repair strategies for standard GA and the permutation GAs for dealing with constraints were also developed. The repair strategies can significantly reduce computation costs for both standard GA and permutation GA. A two-level optimization procedure for composite wing design subject to strength and buckling constraints is presented. At wing-level design, continuous optimization of ply thicknesses with orientations of 0°, 90°, and +/-45° is performed to minimize weight. At the panel level, the number of plies of each orientation (rounded to integers) and in-plane loads are specified, and a permutation genetic algorithm is used to optimize the stacking sequence. The process begins with many panel genetic optimizations for a range of loads and numbers of plies of each orientation. Next, a cubic polynomial response surface is fitted to the optimum buckling load. The resulting response surface is used for wing-level optimization. In general, complex composite structures consist of several laminates. A common problem in the design of such structures is that some plies in the adjacent laminates terminate in the boundary between the laminates. These discontinuities may cause stress concentrations and may increase manufacturing difficulty and cost. We developed measures of continuity of two adjacent laminates. We studied tradeoffs between weight and continuity through a simple composite wing design. Finally, we compared the two-level optimization to a single-level optimization based on flexural lamination parameters. The single-level optimization is efficient and feasible for a wing consisting of unstiffened panels.

  17. Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Michalik, Kazimierz

    2016-10-01

Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demand. Among others, the parallelization of multiscale computations is a promising solution. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelization of multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models, employing a MatCalc thermodynamic simulator. The main issues investigated in this work are (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the degradation of computation quality caused by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law equations. The problem of `delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.

  18. 25th anniversary article: scalable multiscale patterned structures inspired by nature: the role of hierarchy.

    PubMed

    Bae, Won-Gyu; Kim, Hong Nam; Kim, Doogon; Park, Suk-Hee; Jeong, Hoon Eui; Suh, Kahp-Yang

    2014-02-01

Multiscale, hierarchically patterned surfaces, such as lotus leaves, butterfly wings, and the adhesion pads of gecko lizards, are abundantly found in nature, where microstructures are usually used to strengthen the mechanical stability while nanostructures offer the main functionality, i.e., wettability, structural color, or dry adhesion. To emulate such hierarchical structures in nature, multiscale, multilevel patterning has been extensively utilized for the last few decades towards various applications ranging from wetting control and structural colors to tissue scaffolds. In this review, we highlight recent advances in scalable multiscale patterning to bring about improved functions that can even surpass those found in nature, with particular focus on the analogy between natural and synthetic architectures in terms of the role of different length scales. This review is organized into four sections. First, the role and importance of multiscale, hierarchical structures is described with four representative examples. Second, recent achievements in multiscale patterning are introduced with their strengths and weaknesses. Third, four application areas of wetting control, dry adhesives, selectively filtrating membranes, and multiscale tissue scaffolds are overviewed by stressing how and why multiscale structures need to be incorporated to achieve their performance. Finally, we present future directions and challenges for scalable, multiscale patterned surfaces. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Graph Theory Meets Ab Initio Molecular Dynamics: Atomic Structures and Transformations at the Nanoscale

    NASA Astrophysics Data System (ADS)

    Pietrucci, Fabio; Andreoni, Wanda

    2011-08-01

    Social permutation invariant coordinates are introduced describing the bond network around a given atom. They originate from the largest eigenvalue and the corresponding eigenvector of the contact matrix, are invariant under permutation of identical atoms, and bear a clear signature of an order-disorder transition. Once combined with ab initio metadynamics, these coordinates are shown to be a powerful tool for the discovery of low-energy isomers of molecules and nanoclusters as well as for a blind exploration of isomerization, association, and dissociation reactions.
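
    A minimal sketch of permutation-invariant coordinates built from the leading eigenpair of a contact matrix is given below; the smooth switching function, the cutoff r0 and the function names are illustrative choices, not the ones used in the paper.

        import numpy as np

        def contact_matrix(coords, r0=1.6):
            # Smooth, symmetric contact matrix from pairwise distances.
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            c = 1.0 / (1.0 + (d / r0) ** 6)
            np.fill_diagonal(c, 0.0)
            return c

        def permutation_invariant_coords(coords):
            # Sorted components of the principal eigenvector, scaled by the largest
            # eigenvalue: invariant under relabeling of identical atoms.
            c = contact_matrix(coords)
            w, v = np.linalg.eigh(c)
            lead = np.abs(v[:, -1])
            return np.sort(w[-1] * lead)

        coords = np.random.default_rng(0).random((8, 3)) * 3.0
        print(permutation_invariant_coords(coords))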

  20. Finding fixed satellite service orbital allotments with a k-permutation algorithm

    NASA Technical Reports Server (NTRS)

    Reilly, Charles H.; Mount-Campbell, Clark A.; Gonsalvez, David J. A.

    1990-01-01

    A satellite system synthesis problem, the satellite location problem (SLP), is addressed. In SLP, orbital locations (longitudes) are allotted to geostationary satellites in the fixed satellite service. A linear mixed-integer programming model is presented that views SLP as a combination of two problems: the problem of ordering the satellites and the problem of locating the satellites given some ordering. A special-purpose heuristic procedure, a k-permutation algorithm, has been developed to find solutions to SLPs. Solutions to small sample problems are presented and analyzed on the basis of calculated interferences.
