Sample records for permutation entropy rate

  1. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS

    PubMed Central

    Kuai, Moshen; Cheng, Gang; Li, Yong

    2018-01-01

    Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed, high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. This paper proposes a method for diagnosing planetary gear faults based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The original signal is decomposed into 6 intrinsic mode functions (IMFs) and a residual component by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of each IMF is reflected by its permutation entropy to quantify the fault features. The permutation entropies of the IMF components are used as the input of the ANFIS, whose parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate of the test samples used for the ANFIS is 90%, and the recognition rate for the gear with one missing tooth is relatively high. The method also achieves good recognition rates for other gear faults. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis. PMID:29510569

  2. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS.

    PubMed

    Kuai, Moshen; Cheng, Gang; Pang, Yusong; Li, Yong

    2018-03-05

    Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed, high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. This paper proposes a method for diagnosing planetary gear faults based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The original signal is decomposed into 6 intrinsic mode functions (IMFs) and a residual component by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of each IMF is reflected by its permutation entropy to quantify the fault features. The permutation entropies of the IMF components are used as the input of the ANFIS, whose parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate of the test samples used for the ANFIS is 90%, and the recognition rate for the gear with one missing tooth is relatively high. The method also achieves good recognition rates for other gear faults. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis.
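
    Both records above, like most of the entries that follow, rest on the same basic quantity: the Bandt-Pompe permutation entropy of a scalar signal. The following is a minimal Python sketch of that standard definition for readers unfamiliar with it; it is not the authors' code, and the function name, default parameters and the white-noise/sine demonstration are purely illustrative.

      import numpy as np
      from math import factorial

      def permutation_entropy(x, m=3, tau=1, normalize=True):
          """Bandt-Pompe permutation entropy of a 1-D signal.

          m   : embedding dimension (order of the ordinal patterns)
          tau : time delay between the samples forming each pattern
          """
          x = np.asarray(x, dtype=float)
          n_patterns = len(x) - (m - 1) * tau
          if n_patterns <= 0:
              raise ValueError("series too short for the chosen m and tau")
          patterns = np.empty(n_patterns, dtype=np.int64)
          for i in range(n_patterns):
              window = x[i:i + m * tau:tau]
              ranks = np.argsort(np.argsort(window))                    # ordinal pattern as a rank vector
              patterns[i] = sum(r * m**k for k, r in enumerate(ranks))  # encode the pattern as an integer
          _, counts = np.unique(patterns, return_counts=True)           # frequency of each observed pattern
          p = counts / counts.sum()
          h = -np.sum(p * np.log(p))
          return float(h / np.log(factorial(m))) if normalize else float(h)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          print(permutation_entropy(rng.standard_normal(5000), m=4))       # near 1 for white noise
          print(permutation_entropy(np.sin(0.05 * np.arange(5000)), m=4))  # much lower for a regular signal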

  3. Automatic event detection in low SNR microseismic signals based on multi-scale permutation entropy and a support vector machine

    NASA Astrophysics Data System (ADS)

    Jia, Rui-Sheng; Sun, Hong-Mei; Peng, Yan-Jun; Liang, Yong-Quan; Lu, Xin-Ming

    2017-07-01

    Microseismic monitoring is an effective means of providing early warning of rock or coal dynamical disasters, and its first step is microseismic event detection, although low-SNR microseismic signals often cannot be detected effectively by routine methods. To solve this problem, this paper combines multi-scale permutation entropy and a support vector machine to detect low-SNR microseismic events. First, an extraction method for signal features based on multi-scale permutation entropy is proposed by studying the influence of the scale factor on the signal permutation entropy. Second, a detection model for low-SNR microseismic events based on the least squares support vector machine is built by performing a multi-scale permutation entropy calculation on the collected vibration signals and constructing a feature vector set for the signals. Finally, a comparative analysis of the microseismic events and noise signals in the experiment shows that the different characteristics of the two can be fully expressed by using multi-scale permutation entropy. The detection model of microseismic events combined with the support vector machine, which has the features of high classification accuracy and fast real-time computation, can meet the requirements of on-line, real-time extraction of microseismic events.
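
    As a rough illustration of the feature-extraction step described in the record above, the sketch below coarse-grains a signal over several scales and computes the permutation entropy of each coarse-grained series to build a multi-scale feature vector; such vectors could then be fed to a least squares support vector machine or any other classifier. All names, parameter values and the synthetic "noise-like" and "event-like" signals are illustrative assumptions, not material from the paper.

      import numpy as np
      from math import factorial

      def permutation_entropy(x, m=4, tau=1):
          """Normalized Bandt-Pompe permutation entropy."""
          n = len(x) - (m - 1) * tau
          codes = [tuple(np.argsort(x[i:i + m * tau:tau])) for i in range(n)]
          _, counts = np.unique(codes, axis=0, return_counts=True)
          p = counts / counts.sum()
          return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))

      def coarse_grain(x, scale):
          """Average consecutive, non-overlapping blocks of length `scale`."""
          x = np.asarray(x, dtype=float)
          n = (len(x) // scale) * scale
          return x[:n].reshape(-1, scale).mean(axis=1)

      def mpe_features(x, scales=range(1, 11), m=4):
          """Multi-scale permutation entropy feature vector for one signal."""
          return np.array([permutation_entropy(coarse_grain(x, s), m=m) for s in scales])

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          noise_like = rng.standard_normal(4096)             # stand-in for a background noise record
          event_like = np.cumsum(rng.standard_normal(4096))  # stand-in for a more correlated event record
          print(np.round(mpe_features(noise_like), 3))
          print(np.round(mpe_features(event_like), 3))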

  4. Permutation entropy of fractional Brownian motion and fractional Gaussian noise

    NASA Astrophysics Data System (ADS)

    Zunino, L.; Pérez, D. G.; Martín, M. T.; Garavaglia, M.; Plastino, A.; Rosso, O. A.

    2008-06-01

    We have worked out theoretical curves for the permutation entropy of the fractional Brownian motion and fractional Gaussian noise by using the Bandt and Shiha [C. Bandt, F. Shiha, J. Time Ser. Anal. 28 (2007) 646] theoretical predictions for their corresponding relative frequencies. Comparisons with numerical simulations show an excellent agreement. Furthermore, the entropy-gap in the transition between these processes, observed previously via numerical results, has been here theoretically validated. Also, we have analyzed the behaviour of the permutation entropy of the fractional Gaussian noise for different time delays.

  5. Generalized permutation entropy analysis based on the two-index entropic form Sq,δ

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian

    2015-05-01

    Permutation entropy (PE) is a novel measure to quantify the complexity of nonlinear time series. In this paper, we propose a generalized permutation entropy (PEq,δ) based on the recently postulated entropic form Sq,δ, which was proposed as a unification of the well-known Sq of nonextensive statistical mechanics and Sδ, a possibly appropriate candidate for the black-hole entropy. We find that PEq,δ with appropriate parameters can amplify minor changes and trends in complexity in comparison to PE. Experiments with this generalized permutation entropy method are performed on both synthetic and stock data, showing its power. Results show that PEq,δ is an exponential function of q and that the power k(δ) is a constant if δ is determined. Some discussion of k(δ) is provided. Besides, we also find some interesting power-law results.

  6. Weighted fractional permutation entropy and fractional sample entropy for nonlinear Potts financial dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Kaixuan; Wang, Jun

    2017-02-01

    In this paper, the recently introduced permutation entropy and sample entropy are further developed into their fractional cases, weighted fractional permutation entropy (WFPE) and fractional sample entropy (FSE). The fractional-order generalization of information entropy is utilized in these two complexity approaches to detect the statistical characteristics of fractional-order information in complex systems. The effectiveness analysis of the proposed methods on synthetic data and real-world data reveals that tuning the fractional order allows a higher sensitivity and a more accurate characterization of the signal evolution, which is useful in describing the dynamics of complex systems. Moreover, the nonlinear complexity behaviors of the return series of the Potts financial model are compared numerically with those of actual stock markets, and the empirical results confirm the feasibility of the proposed model.

  7. Multi-scale symbolic transfer entropy analysis of EEG

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-10-01

    From both global and local perspectives, we symbolize two kinds of EEG and analyze their dynamic and asymmetrical information using multi-scale transfer entropy. A multi-scale process with the scale factor ranging from 1 to 199 in steps of 2 is applied to the EEG of healthy people and of epileptic patients, and then permutation with an embedding dimension of 3 and a global approach are used to symbolize the sequences. The forward and reverse symbol sequences are taken as the inputs of transfer entropy. The scale-factor intervals for the permutation and global approaches are (37, 57) and (65, 85), respectively, where the two kinds of EEG show satisfactory entropy distinctions. At a scale factor of 67, the permutation-based transfer entropies of the healthy and epileptic subjects, 0.1137 and 0.1028, show the biggest difference, and the corresponding values for the global symbolization are 0.0641 and 0.0601, at a scale factor of 165. The results show that permutation, which takes the contribution of local information into account, gives better discrimination and is more effectively applied in our multi-scale transfer entropy analysis of EEG.
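
    The transfer-entropy step used in the record above can be sketched generically: both series are first mapped to ordinal symbols, and Schreiber's transfer entropy is then estimated from the joint symbol frequencies. The code below is such a generic sketch under simplifying assumptions (a history of one symbol, no multi-scale step, no reverse-sequence analysis); it is not the authors' implementation, and the toy driven-noise example is purely illustrative.

      import numpy as np
      from collections import Counter
      from itertools import permutations

      def ordinal_symbols(x, m=3, tau=1):
          """Map a series to ordinal-pattern symbols (Bandt-Pompe symbolization)."""
          n = len(x) - (m - 1) * tau
          index = {p: i for i, p in enumerate(permutations(range(m)))}
          return [index[tuple(np.argsort(x[i:i + m * tau:tau]))] for i in range(n)]

      def transfer_entropy(sym_x, sym_y):
          """Estimate T(X -> Y) in nats from two aligned symbol sequences (history length 1)."""
          triples = Counter(zip(sym_y[1:], sym_y[:-1], sym_x[:-1]))   # (y_{t+1}, y_t, x_t)
          pairs_yy = Counter(zip(sym_y[1:], sym_y[:-1]))              # (y_{t+1}, y_t)
          pairs_yx = Counter(zip(sym_y[:-1], sym_x[:-1]))             # (y_t, x_t)
          singles_y = Counter(sym_y[:-1])                             # y_t
          n = len(sym_y) - 1
          te = 0.0
          for (y1, y0, x0), c in triples.items():
              # p(y1, y0, x0) * log[ p(y1 | y0, x0) / p(y1 | y0) ]
              ratio = (c / pairs_yx[(y0, x0)]) / (pairs_yy[(y1, y0)] / singles_y[y0])
              te += (c / n) * np.log(ratio)
          return te

      if __name__ == "__main__":
          rng = np.random.default_rng(2)
          x = rng.standard_normal(10000)
          y = np.roll(x, 1) + 0.5 * rng.standard_normal(10000)   # y is driven by the past of x
          sx, sy = ordinal_symbols(x), ordinal_symbols(y)
          print("T(X->Y):", transfer_entropy(sx, sy))   # clearly positive
          print("T(Y->X):", transfer_entropy(sy, sx))   # much smaller; finite-sample bias keeps it above zero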

  8. The coupling analysis between stock market indices based on permutation measures

    NASA Astrophysics Data System (ADS)

    Shi, Wenbin; Shang, Pengjian; Xia, Jianan; Yeh, Chien-Hung

    2016-04-01

    Many information-theoretic methods have been proposed for analyzing the coupling dependence between time series, and it is significant to quantify the correlation between financial sequences since the financial market is a complex evolving dynamic system. Recently, we developed a new permutation-based entropy, called cross-permutation entropy (CPE), to detect the coupling structures between two synchronous time series. In this paper, we extend the CPE method to weighted cross-permutation entropy (WCPE) to address some of CPE's limitations, mainly its inability to differentiate between distinct patterns of a certain motif and the sensitivity of patterns close to the noise floor. It shows more stable and reliable results than CPE does when applied to spiky data and AR(1) processes. Besides, we adapt the CPE method to infer the complexity of short-length time series by freely changing the time delay, and test it with Gaussian random series and random walks. The modified method shows an advantage in reducing deviations of the entropy estimation compared with the conventional one. Finally, the weighted cross-permutation entropy of eight important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.

  9. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring

    PubMed Central

    Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Objective: Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Methods: Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of tracking the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among the MSPE measures. Results: CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of the EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC). Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of consciousness. Conclusions: MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis owing to their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales. PMID:27723803

  10. A Comparison of Multiscale Permutation Entropy Measures in On-Line Depth of Anesthesia Monitoring.

    PubMed

    Su, Cui; Liang, Zhenhu; Li, Xiaoli; Li, Duan; Li, Yongwang; Ursino, Mauro

    2016-01-01

    Multiscale permutation entropy (MSPE) has become an interesting tool for exploring neurophysiological mechanisms in recent years. In this study, six MSPE measures were proposed for on-line depth of anesthesia (DoA) monitoring to quantify the anesthetic effect on real-time EEG recordings. The performance of these measures in describing the transient characteristics of simulated neural populations and clinical anesthesia EEG was evaluated and compared. Six MSPE algorithms, derived from Shannon permutation entropy (SPE), Renyi permutation entropy (RPE) and Tsallis permutation entropy (TPE) combined with the decomposition procedures of the coarse-graining (CG) method and moving average (MA) analysis, were studied. A thalamo-cortical neural mass model (TCNMM) was used to generate noise-free EEG under anesthesia to quantitatively assess the robustness of each MSPE measure against noise. Then, the clinical anesthesia EEG recordings from 20 patients were analyzed with these measures. To validate their effectiveness, the six measures were compared in terms of tracking the dynamical changes in the EEG data and their performance in state discrimination. The Pearson correlation coefficient (R) was used to assess the relationship among the MSPE measures. CG-based MSPEs failed in on-line DoA monitoring at multiscale analysis. In on-line EEG analysis, the MA-based MSPE measures at 5 decomposed scales could track the transient changes of the EEG recordings and statistically distinguish the awake state, unconsciousness and recovery of consciousness (RoC). Compared to single-scale SPE and RPE, MSPEs had better anti-noise ability, and MA-RPE at scale 5 performed best in this respect. MA-TPE outperformed the other measures with a faster tracking speed of the loss of consciousness. MA-based multiscale permutation entropies have potential for on-line anesthesia EEG analysis owing to their simple computation and sensitivity to drug effect changes. CG-based multiscale permutation entropies may fail to describe the characteristics of EEG at high decomposition scales.
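
    The two decomposition procedures compared in these two records differ only in how the scale-s series is built from the raw signal: non-overlapping block averages (coarse-graining, CG) versus a sliding-window average that keeps roughly the original length (moving average, MA). A minimal sketch with illustrative names follows; the entropy applied afterwards (SPE, RPE or TPE) is independent of this step.

      import numpy as np

      def coarse_grain(x, scale):
          """CG: average non-overlapping blocks; the series shrinks by a factor of `scale`."""
          x = np.asarray(x, dtype=float)
          n = (len(x) // scale) * scale
          return x[:n].reshape(-1, scale).mean(axis=1)

      def moving_average(x, scale):
          """MA: average a sliding window; the series keeps (almost) its original length."""
          x = np.asarray(x, dtype=float)
          return np.convolve(x, np.ones(scale) / scale, mode="valid")

      if __name__ == "__main__":
          eeg_like = np.random.default_rng(3).standard_normal(1000)   # stand-in for a short EEG window
          for scale in (2, 5):
              print(scale, len(coarse_grain(eeg_like, scale)), len(moving_average(eeg_like, scale)))
          # CG at scale 5 leaves only 200 samples while MA leaves 996, one reason MA-based MSPE
          # can track transient changes in short on-line EEG windows more reliably.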

  11. Generalized composite multiscale permutation entropy and Laplacian score based rolling bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Zheng, Jinde; Pan, Haiyang; Yang, Shubao; Cheng, Junsheng

    2018-01-01

    Multiscale permutation entropy (MPE) is a recently proposed nonlinear dynamic method for measuring the randomness and detecting nonlinear dynamic changes of time series, and it can be used effectively to extract nonlinear dynamic fault features from the vibration signals of rolling bearings. To overcome the drawback of the coarse-graining process in MPE, an improved MPE method called generalized composite multiscale permutation entropy (GCMPE) is proposed in this paper. The influence of parameters on GCMPE and its comparison with MPE are also studied by analyzing simulated data. GCMPE was applied to fault feature extraction from the vibration signal of a rolling bearing, and then, based on GCMPE, the Laplacian score for feature selection and a particle swarm optimization based support vector machine, a new fault diagnosis method for rolling bearings is put forward in this paper. Finally, the proposed method was applied to analyze experimental data of rolling bearings. The analysis results show that the proposed method can effectively realize the fault diagnosis of rolling bearings and has a higher fault recognition rate than existing methods.

  12. Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

    PubMed Central

    Li, Zhaohui; Li, Xiaoli

    2013-01-01

    Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains, which quantifies the fraction of ordinal information in one neuron that is also present in another. The performance of this method is evaluated with spike trains generated by an Izhikevich neuronal model. Results show that the NPTE method can effectively estimate the causal interaction between two neurons without being influenced by data length. Considering both the precision of the estimated time delay and the robustness of the estimated information flow against neuronal firing rate, the NPTE method is superior to other information-theoretic methods including normalized transfer entropy, symbolic transfer entropy and permutation conditional mutual information. To test the performance of NPTE on analyzing simulated biophysically realistic synapses, an Izhikevich cortical network based on this neuronal model is employed. It is found that the NPTE method is able to characterize mutual interactions and identify spurious causality in a network of three neurons exactly. We conclude that the proposed method can obtain a more reliable comparison of interactions between different pairs of neurons and is a promising tool for uncovering more details of neural coding. PMID:23940662

  13. Multiscale permutation entropy analysis of laser beam wandering in isotropic turbulence.

    PubMed

    Olivares, Felipe; Zunino, Luciano; Gulich, Damián; Pérez, Darío G; Rosso, Osvaldo A

    2017-10-01

    We have experimentally quantified the temporal structural diversity from the coordinate fluctuations of a laser beam propagating through isotropic optical turbulence. The main focus here is on the characterization of the long-range correlations in the wandering of a thin Gaussian laser beam over a screen after propagating through a turbulent medium. To fulfill this goal, a laboratory-controlled experiment was conducted in which coordinate fluctuations of the laser beam were recorded at a sufficiently high sampling rate for a wide range of turbulent conditions. Horizontal and vertical displacements of the laser beam centroid were subsequently analyzed by implementing the symbolic technique based on ordinal patterns to estimate the well-known permutation entropy. We show that the permutation entropy estimations at multiple time scales evidence an interplay between different dynamical behaviors. More specifically, a crossover between two different scaling regimes is observed. We confirm a transition from an integrated stochastic process contaminated with electronic noise to a fractional Brownian motion with a Hurst exponent H=5/6 as the sampling time increases. Besides, we are able to quantify, from the estimated entropy, the amount of electronic noise as a function of the turbulence strength. We have also demonstrated that these experimental observations are in very good agreement with numerical simulations of noisy fractional Brownian motions with a well-defined crossover between two different scaling regimes.

  14. Measuring the uncertainty of coupling

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojun; Shang, Pengjian

    2015-06-01

    A new information-theoretic measure, called coupling entropy, is proposed here to detect the causal links in complex systems by taking into account the inner composition alignment of temporal structure. It is a permutation-based asymmetric association measure to infer the uncertainty of coupling between two time series. The coupling entropy is found to be effective in the analysis of Hénon maps, where different noises are added to test its accuracy and sensitivity. The coupling entropy is also applied to analyze the relationship between unemployment rate and CPI change in the U.S., where the CPI change turns out to be the driving variable while the unemployment rate is the responding one.

  15. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis

    PubMed Central

    Yasir, Muhammad Naveed; Koh, Bong-Hwan

    2018-01-01

    This paper presents the local mean decomposition (LMD) integrated with multi-scale permutation entropy (MPE), also known as LMD-MPE, to investigate rolling element bearing (REB) fault diagnosis from measured vibration signals. First, the LMD decomposes the vibration data or acceleration measurement into separate product functions that are composed of both amplitude and frequency modulation. MPE then calculates the permutation entropy of the product functions at multiple scales to extract nonlinear features with which to assess and classify the condition of the healthy and damaged REB system. Comparative experimental results for conventional LMD-based multi-scale entropy and for MPE are presented to verify the validity of the proposed technique. The study found that the integrated LMD-MPE approach provides reliable, damage-sensitive features when analyzing the bearing condition. The results on the REB experimental datasets show that the proposed approach yields more robust outcomes than existing methods. PMID:29690526

  16. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis.

    PubMed

    Yasir, Muhammad Naveed; Koh, Bong-Hwan

    2018-04-21

    This paper presents the local mean decomposition (LMD) integrated with multi-scale permutation entropy (MPE), also known as LMD-MPE, to investigate rolling element bearing (REB) fault diagnosis from measured vibration signals. First, the LMD decomposes the vibration data or acceleration measurement into separate product functions that are composed of both amplitude and frequency modulation. MPE then calculates the permutation entropy of the product functions at multiple scales to extract nonlinear features with which to assess and classify the condition of the healthy and damaged REB system. Comparative experimental results for conventional LMD-based multi-scale entropy and for MPE are presented to verify the validity of the proposed technique. The study found that the integrated LMD-MPE approach provides reliable, damage-sensitive features when analyzing the bearing condition. The results on the REB experimental datasets show that the proposed approach yields more robust outcomes than existing methods.

  17. A permutation information theory tour through different interest rate maturities: the Libor case.

    PubMed

    Bariviera, Aurelio Fernández; Guercio, María Belén; Martinez, Lisana B; Rosso, Osvaldo A

    2015-12-13

    This paper analyses Libor interest rates for seven different maturities, referring to operations in British pounds, euros, Swiss francs and Japanese yen, during the period 2001-2015. The analysis is performed by means of two quantifiers derived from information theory: the permutation Shannon entropy and the permutation Fisher information measure. An anomalous behaviour in the Libor is detected in all currencies except euros during the years 2006-2012. The stochastic switch is more severe in the one-, two- and three-month maturities. Given the special mechanism of Libor setting, we conjecture that the behaviour could have been produced by the manipulation that was uncovered by financial authorities. We argue that our methodology is pertinent as a market-overseeing instrument.

  18. Weighted multiscale Rényi permutation entropy of nonlinear time series

    NASA Astrophysics Data System (ADS)

    Chen, Shijian; Shang, Pengjian; Wu, Yue

    2018-04-01

    In this paper, based on Rényi permutation entropy (RPE), which has recently been suggested as a relative measure of complexity in nonlinear systems, we propose multiscale Rényi permutation entropy (MRPE) and weighted multiscale Rényi permutation entropy (WMRPE) to quantify the complexity of nonlinear time series over multiple time scales. First, we apply MRPE and WMRPE to synthetic data and compare the modified methods with RPE; the influence of parameter changes is also discussed. Besides, we explain why it is necessary to consider not only multiple scales but also weights that take the amplitude into account. Then the MRPE and WMRPE methods are applied to the closing prices of financial stock markets from different areas. By observing the WMRPE curves and analyzing common statistics, the stock markets are divided into 4 groups: (1) DJI, S&P500, and HSI, (2) NASDAQ and FTSE100, (3) DAX40 and CAC40, and (4) ShangZheng and ShenCheng. Results show that the standard deviations of the weighted methods are smaller, indicating that WMRPE gives more robust results. Besides, WMRPE can provide abundant dynamical properties of complex systems and reveal their intrinsic mechanisms.
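
    For reference, the Rényi permutation entropy replaces the Shannon sum over ordinal-pattern probabilities with the Rényi form of order q, recovering ordinary permutation entropy as q approaches 1. The sketch below is a generic implementation of that idea, not code from this paper; the multiscale and weighted refinements the authors propose are omitted, and the random-walk "price" series is only a stand-in.

      import numpy as np
      from itertools import permutations
      from math import factorial

      def ordinal_distribution(x, m=3, tau=1):
          """Relative frequencies of the m! ordinal patterns in the series."""
          n = len(x) - (m - 1) * tau
          index = {p: i for i, p in enumerate(permutations(range(m)))}
          counts = np.zeros(factorial(m))
          for i in range(n):
              counts[index[tuple(np.argsort(x[i:i + m * tau:tau]))]] += 1
          return counts / counts.sum()

      def renyi_permutation_entropy(x, q=2.0, m=3, tau=1, normalize=True):
          """Rényi permutation entropy of order q (q != 1); q -> 1 recovers the Shannon PE."""
          p = ordinal_distribution(x, m, tau)
          p = p[p > 0]
          h = np.log(np.sum(p ** q)) / (1.0 - q)
          return float(h / np.log(factorial(m))) if normalize else float(h)

      if __name__ == "__main__":
          rng = np.random.default_rng(4)
          price_like = np.cumsum(rng.standard_normal(5000))   # random-walk stand-in for a closing-price series
          for q in (0.5, 2.0, 5.0):
              print(q, round(renyi_permutation_entropy(price_like, q=q, m=4), 4))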

  19. Confidence intervals and hypothesis testing for the Permutation Entropy with an application to epilepsy

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-04-01

    In nonlinear dynamics, and to a lesser extent in other fields, a widely used measure of complexity is the Permutation Entropy. But there is still no known method to determine the accuracy of this measure, and there has been little research on the statistical properties of this quantity when it characterizes time series. The literature describes some resampling methods for quantities used in nonlinear dynamics, such as the largest Lyapunov exponent, but these seem to fail. In this contribution, we propose a parametric bootstrap methodology using a symbolic representation of the time series to obtain the distribution of the Permutation Entropy estimator. We perform several time series simulations of well-known stochastic processes, the 1/f^α noise family, and show in each case that the proposed accuracy measure is as efficient as the one obtained by the frequentist approach of repeating the experiment. The complexity of brain electrical activity, measured by the Permutation Entropy, has been extensively used in epilepsy research for detecting dynamical changes in the electroencephalogram (EEG) signal, with no consideration of the variability of this complexity measure. The parametric bootstrap methodology is applied to compare normal and pre-ictal EEG signals.
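
    A rough sketch of the bootstrap idea in this record: estimate the ordinal-pattern distribution from the observed series, resample pattern counts from the fitted multinomial many times, recompute the entropy of each resample, and read a confidence interval off the percentiles. This simplified multinomial version ignores the serial dependence between successive patterns, which the paper's symbolic parametric bootstrap may treat more carefully, so it should be read only as an illustration; all names and defaults are assumptions.

      import numpy as np
      from math import factorial

      def ordinal_counts(x, m=4, tau=1):
          """Counts of the observed ordinal patterns (Bandt-Pompe symbolization)."""
          n = len(x) - (m - 1) * tau
          codes = [tuple(np.argsort(x[i:i + m * tau:tau])) for i in range(n)]
          _, counts = np.unique(codes, axis=0, return_counts=True)
          return counts

      def entropy_from_counts(counts, m):
          p = counts[counts > 0] / counts.sum()
          return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))

      def pe_bootstrap_ci(x, m=4, tau=1, n_boot=2000, alpha=0.05, seed=0):
          """Point estimate and percentile confidence interval for the permutation entropy."""
          rng = np.random.default_rng(seed)
          counts = ordinal_counts(x, m, tau)
          p_hat = counts / counts.sum()
          boot = [entropy_from_counts(rng.multinomial(counts.sum(), p_hat), m) for _ in range(n_boot)]
          lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
          return entropy_from_counts(counts, m), (float(lo), float(hi))

      if __name__ == "__main__":
          rng = np.random.default_rng(5)
          eeg_like = np.cumsum(rng.standard_normal(2000))   # correlated surrogate series, not real EEG
          print(pe_bootstrap_ci(eeg_like))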

  20. Double symbolic joint entropy in nonlinear dynamic complexity analysis

    NASA Astrophysics Data System (ADS)

    Yao, Wenpo; Wang, Jun

    2017-07-01

    Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine by joint entropy in this work for nonlinear dynamic complexity analysis. Two global static methods, the symbolic transformations of Wessel N. symbolic entropy and of base-scale entropy, and two local dynamic ones, namely the symbolizations of permutation and of differential entropy, constitute four double symbolic joint entropies that detect complexity accurately in chaotic model series from the logistic and Hénon maps. In nonlinear dynamical analysis of different kinds of heart rate variability, the heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by the double symbolic joint entropy, among which the combination of base-scale and differential symbolizations gives the best complexity analysis. Test results prove that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.

  1. Permutation entropy and statistical complexity analysis of turbulence in laboratory plasmas and the solar wind.

    PubMed

    Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T

    2015-02-01

    The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
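
    The CH-plane coordinates used in this record (and in several of the financial entries further down) combine the normalized permutation entropy H with the Jensen-Shannon statistical complexity C = Q_J[P, P_e] * H, where Q_J is the Jensen-Shannon divergence between the ordinal-pattern distribution P and the uniform distribution P_e, normalized by its maximum value. The sketch below follows that standard recipe in a generic way; it is not code from any of these papers, and the white-noise/random-walk demonstration is illustrative only.

      import numpy as np
      from itertools import permutations
      from math import factorial

      def ordinal_probs(x, m=5, tau=1):
          """Ordinal-pattern distribution over all m! patterns (zeros kept for unobserved ones)."""
          n = len(x) - (m - 1) * tau
          index = {p: i for i, p in enumerate(permutations(range(m)))}
          counts = np.zeros(factorial(m))
          for i in range(n):
              counts[index[tuple(np.argsort(x[i:i + m * tau:tau]))]] += 1
          return counts / counts.sum()

      def shannon(p):
          p = p[p > 0]
          return float(-np.sum(p * np.log(p)))

      def ch_plane(x, m=5, tau=1):
          """Return (H, C): normalized permutation entropy and Jensen-Shannon statistical complexity."""
          P = ordinal_probs(x, m, tau)
          N = len(P)
          Pe = np.full(N, 1.0 / N)
          H = shannon(P) / np.log(N)
          js = shannon((P + Pe) / 2) - shannon(P) / 2 - shannon(Pe) / 2          # JS divergence to uniform
          js_max = -0.5 * ((N + 1) / N * np.log(N + 1) + np.log(N) - 2 * np.log(2 * N))
          return H, (js / js_max) * H

      if __name__ == "__main__":
          rng = np.random.default_rng(6)
          print("white noise:", np.round(ch_plane(rng.standard_normal(20000)), 3))
          print("random walk:", np.round(ch_plane(np.cumsum(rng.standard_normal(20000))), 3))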

  2. Permutation Entropy and Signal Energy Increase the Accuracy of Neuropathic Change Detection in Needle EMG

    PubMed Central

    2018-01-01

    Background and Objective. Needle electromyography can be used to detect changes in the number and morphology of motor unit potentials in patients with axonal neuropathy. General mathematical methods of pattern recognition and signal analysis were applied to recognize neuropathic changes. This study validates the possibility of extending and refining turns-amplitude analysis using permutation entropy and signal energy. Methods. In this study, we examined needle electromyography in 40 neuropathic individuals and 40 controls. The number of turns, the amplitude between turns, signal energy, and permutation entropy were used as features for support vector machine classification. Results. The obtained results proved the superior classification performance of the combination of all of the above-mentioned features compared to combinations of fewer features. Of the tested feature combinations, peak-ratio analysis had the lowest accuracy. Conclusion. Combining permutation entropy with signal energy, the number of turns and the mean amplitude in SVM classification can refine the diagnosis of polyneuropathies examined by needle electromyography. PMID:29606959

  3. Analysis of crude oil markets with improved multiscale weighted permutation entropy

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun; Liu, Cheng

    2018-03-01

    Entropy measures have recently been used extensively to study the complexity of nonlinear systems. Weighted permutation entropy (WPE) overcomes the neglect of amplitude information in PE and shows a distinctive ability to extract complexity information from data having abrupt changes in magnitude. The improved (sometimes called composite) multi-scale (MS) method has the advantage of reducing errors and improving accuracy when evaluating multiscale entropy values of time series that are not long enough. In this paper, we combine the merits of WPE and the improved MS method to propose the improved multiscale weighted permutation entropy (IMWPE) method for investigating the complexity of a time series. It is then validated on artificial data, white noise and 1/f noise, and on real market data for Brent and Daqing crude oil. Meanwhile, the complexity properties of the crude oil markets are explored for the return series, for volatility series with multiple exponents and for EEMD-produced intrinsic mode functions (IMFs), which represent different frequency components of the return series. Moreover, the instantaneous amplitude and frequency of Brent and Daqing crude oil are analyzed by applying the Hilbert transform to each IMF.
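
    Weighted permutation entropy, which also underlies the WPE, WCPE and MWPE variants appearing in several other records here, weights each ordinal pattern by the variance of the embedded window it came from, so that large-amplitude excursions count for more than near-flat segments. A generic sketch follows; names and defaults are illustrative, and the improved multi-scale step that this paper adds on top is not included.

      import numpy as np
      from itertools import permutations
      from math import factorial

      def weighted_permutation_entropy(x, m=3, tau=1, normalize=True):
          """Weighted PE: ordinal-pattern frequencies weighted by the variance of each window."""
          x = np.asarray(x, dtype=float)
          n = len(x) - (m - 1) * tau
          index = {p: i for i, p in enumerate(permutations(range(m)))}
          weights = np.zeros(factorial(m))
          for i in range(n):
              window = x[i:i + m * tau:tau]
              weights[index[tuple(np.argsort(window))]] += np.var(window)   # weight = window variance
          p = weights[weights > 0] / weights.sum()
          h = -np.sum(p * np.log(p))
          return float(h / np.log(factorial(m))) if normalize else float(h)

      if __name__ == "__main__":
          rng = np.random.default_rng(7)
          base = rng.standard_normal(5000)
          spiky = base.copy()
          spiky[::200] += 20.0                     # add abrupt large-magnitude spikes
          print("plain series:", round(weighted_permutation_entropy(base, m=4), 4))
          print("spiky series:", round(weighted_permutation_entropy(spiky, m=4), 4))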

  4. Spatiotemporal Permutation Entropy as a Measure for Complexity of Cardiac Arrhythmia

    NASA Astrophysics Data System (ADS)

    Schlemmer, Alexander; Berg, Sebastian; Lilienkamp, Thomas; Luther, Stefan; Parlitz, Ulrich

    2018-05-01

    Permutation entropy (PE) is a robust quantity for measuring the complexity of time series. In the cardiac community it is predominantly used in the context of electrocardiogram (ECG) signal analysis for diagnosis and prediction, with a major application found in heart rate variability parameters. In this article we combine spatial and temporal PE to form a spatiotemporal PE that captures both the complexity of spatial structures and temporal complexity at the same time. We demonstrate that the spatiotemporal PE (STPE) quantifies complexity using two datasets from simulated cardiac arrhythmia and compare it to phase singularity analysis and spatial PE (SPE). These datasets simulate ventricular fibrillation (VF) on two-dimensional and three-dimensional media using the Fenton-Karma model. We show that SPE and STPE are robust against noise and demonstrate their usefulness for extracting complexity features at different spatial scales.

  5. Characterising Dynamic Instability in High Water-Cut Oil-Water Flows Using High-Resolution Microwave Sensor Signals

    NASA Astrophysics Data System (ADS)

    Liu, Weixin; Jin, Ningde; Han, Yunfeng; Ma, Jing

    2018-06-01

    In the present study, a multi-scale entropy algorithm was used to characterise the complex flow phenomena of turbulent droplets in oil-water two-phase flow with high water-cut. First, we compared multi-scale weighted permutation entropy (MWPE), multi-scale approximate entropy (MAE), multi-scale sample entropy (MSE) and the multi-scale complexity measure (MCM) on typical nonlinear systems. The results show that MWPE exhibits satisfactory variability with scale and good anti-noise ability. Accordingly, we conducted an experiment on vertical upward oil-water two-phase flow with high water-cut and collected the signals of a high-resolution microwave resonant sensor, from which two indexes, the entropy rate and the mean value of MWPE, were extracted. Besides, the effects of total flow rate and water-cut on these two indexes were analysed. Our research shows that MWPE is an effective method for uncovering the dynamic instability of oil-water two-phase flow with high water-cut.

  6. Classifying epileptic EEG signals with delay permutation entropy and Multi-Scale K-means.

    PubMed

    Zhu, Guohun; Li, Yan; Wen, Peng Paul; Wang, Shuaifang

    2015-01-01

    Most epileptic EEG classification algorithms are supervised and require large training datasets, which hinders their use in real-time applications. This chapter proposes an unsupervised Multi-Scale K-means (MSK-means) algorithm to distinguish epileptic EEG signals and identify epileptic zones. The random initialization of the K-means algorithm can lead to wrong clusters. Based on the characteristics of EEGs, the MSK-means algorithm initializes the coarse-scale centroid of a cluster with a suitable scale factor. In this chapter, the MSK-means algorithm is proved theoretically to be more efficient than the K-means algorithm. In addition, three classifiers, K-means, MSK-means and the support vector machine (SVM), are used to identify seizures and localize the epileptogenic zone using delay permutation entropy features. The experimental results demonstrate that identifying seizures with the MSK-means algorithm and delay permutation entropy achieves 4.7% higher accuracy than K-means and 0.7% higher accuracy than the SVM.

  7. Estimation of absolute solvent and solvation shell entropies via permutation reduction

    NASA Astrophysics Data System (ADS)

    Reinhard, Friedemann; Grubmüller, Helmut

    2007-01-01

    Despite its prominent contribution to the free energy of solvated macromolecules such as proteins or DNA, and although principally contained within molecular dynamics simulations, the entropy of the solvation shell is inaccessible to straightforward application of established entropy estimation methods. The complication is twofold. First, the configurational space density of such systems is too complex for a sufficiently accurate fit. Second, and in contrast to the internal macromolecular dynamics, the configurational space volume explored by the diffusive motion of the solvent molecules is too large to be exhaustively sampled by current simulation techniques. Here, we develop a method to overcome the second problem and to significantly alleviate the first one. We propose to exploit the permutation symmetry of the solvent by transforming the trajectory in a way that renders established estimation methods applicable, such as the quasiharmonic approximation or principal component analysis. Our permutation-reduced approach involves a combinatorial problem, which is solved through its equivalence with the linear assignment problem, for which O(N³) methods exist. From test simulations of dense Lennard-Jones gases, enhanced convergence and improved entropy estimates are obtained. Moreover, our approach renders diffusive systems accessible to improved fit functions.
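
    The combinatorial step mentioned above, relabeling the identical solvent molecules in every frame so that the permuted trajectory becomes as compact as possible, maps onto the linear assignment problem, for which O(N³) solvers exist. The toy sketch below uses scipy's Hungarian-algorithm implementation to relabel the particles of each frame against a reference frame; it is a schematic illustration of the idea with made-up data, not the authors' software.

      import numpy as np
      from scipy.optimize import linear_sum_assignment
      from scipy.spatial.distance import cdist

      def permutation_reduce(frames, reference=None):
          """Relabel identical particles in each frame to minimize the squared displacement
          from a reference frame, exploiting permutation symmetry (linear assignment)."""
          frames = np.asarray(frames, dtype=float)             # shape: (n_frames, n_particles, 3)
          ref = frames[0] if reference is None else reference
          reduced = np.empty_like(frames)
          for t, frame in enumerate(frames):
              cost = cdist(ref, frame, metric="sqeuclidean")   # cost[i, j] = |ref_i - x_j|^2
              _, col = linear_sum_assignment(cost)             # optimal relabeling, cubic in N
              reduced[t] = frame[col]
          return reduced

      if __name__ == "__main__":
          rng = np.random.default_rng(8)
          ref = rng.uniform(0.0, 3.0, size=(50, 3))            # 50 identical "solvent" particles
          frames = np.array([ref[rng.permutation(50)] + 0.05 * rng.standard_normal((50, 3))
                             for _ in range(20)])              # shuffled labels plus small jitter
          reduced = permutation_reduce(frames, reference=ref)
          print("mean displacement before:", np.linalg.norm(frames - ref, axis=-1).mean())
          print("mean displacement after :", np.linalg.norm(reduced - ref, axis=-1).mean())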

  8. Multiscale permutation entropy analysis of electrocardiogram

    NASA Astrophysics Data System (ADS)

    Liu, Tiebing; Yao, Wenpo; Wu, Min; Shi, Zhaorong; Wang, Jun; Ning, Xinbao

    2017-04-01

    To make a comprehensive nonlinear analysis of ECG, multiscale permutation entropy (MPE) was applied to ECG characteristic extraction. Three kinds of ECG from the PhysioNet database, from congestive heart failure (CHF) patients, healthy young subjects and healthy elderly subjects, are analyzed in this paper. We set the embedding dimension to 4, adjust the scale factor from 2 to 100 with a step size of 2, and compare MPE with multiscale entropy (MSE). As the scale factor increases, the MPE complexity of the three ECG signals first decreases and then increases. When the scale factor is between 10 and 32, the complexities of the three ECG signals show the biggest difference: the entropy of the elderly is on average 0.146 less than that of the CHF patients and 0.025 larger than that of the healthy young, in line with normal physiological characteristics. Test results show that MPE can be applied effectively in nonlinear ECG analysis and can effectively distinguish different ECG signals.

  9. Refined composite multiscale weighted-permutation entropy of financial time series

    NASA Astrophysics Data System (ADS)

    Zhang, Yongping; Shang, Pengjian

    2018-04-01

    For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimated values show large fluctuations for slight variations of the data locations and show a significant distinction only for different lengths of time series. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). Comparing the RCMWPE results with those of other methods on both synthetic data and financial time series shows that RCMWPE not only retains the advantages inherited from MWPE but is also less sensitive to the data locations, more stable and much less dependent on the length of the time series. Moreover, we present and discuss the results of the RCMWPE method on the daily price return series from Asian and European stock markets. There are significant differences between Asian and European markets, and the entropy values of the Hang Seng Index (HSI) are close to but higher than those of the European markets. The reliability of the proposed RCMWPE method has been supported by simulations on generated and real data. It could be applied to a variety of fields to quantify the complexity of systems over multiple scales more accurately.
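
    The refined composite idea in this record can be sketched as follows: at scale s there are s distinct coarse-grained series, one per starting offset, and instead of averaging s separate entropy values the ordinal-pattern counts of all s series are pooled first, with a single entropy computed from the pooled distribution. For brevity the sketch uses unweighted patterns, whereas the paper weights them by amplitude; all names, parameters and the synthetic return series are illustrative.

      import numpy as np
      from itertools import permutations
      from math import factorial

      def ordinal_counts(x, m=3, tau=1):
          n = len(x) - (m - 1) * tau
          index = {p: i for i, p in enumerate(permutations(range(m)))}
          counts = np.zeros(factorial(m))
          for i in range(n):
              counts[index[tuple(np.argsort(x[i:i + m * tau:tau]))]] += 1
          return counts

      def rcmpe(x, scale, m=3):
          """Refined composite multiscale PE: pool pattern counts over all `scale` offsets."""
          x = np.asarray(x, dtype=float)
          pooled = np.zeros(factorial(m))
          for offset in range(scale):
              n = ((len(x) - offset) // scale) * scale
              cg = x[offset:offset + n].reshape(-1, scale).mean(axis=1)   # coarse-grained series
              pooled += ordinal_counts(cg, m)
          p = pooled[pooled > 0] / pooled.sum()
          return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))

      if __name__ == "__main__":
          rng = np.random.default_rng(9)
          return_like = rng.standard_normal(5000)   # stand-in for a daily price return series
          print([round(rcmpe(return_like, s), 3) for s in (1, 5, 10, 20)])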

  10. A practical comparison of algorithms for the measurement of multiscale entropy in neural time series data.

    PubMed

    Kuntzelman, Karl; Jack Rhodes, L; Harrington, Lillian N; Miskovic, Vladimir

    2018-06-01

    There is a broad family of statistical methods for capturing time series regularity, with increasingly widespread adoption by the neuroscientific community. A common feature of these methods is that they permit investigators to quantify the entropy of brain signals - an index of unpredictability/complexity. Despite the proliferation of algorithms for computing entropy from neural time series data there is scant evidence concerning their relative stability and efficiency. Here we evaluated several different algorithmic implementations (sample, fuzzy, dispersion and permutation) of multiscale entropy in terms of their stability across sessions, internal consistency and computational speed, accuracy and precision using a combination of electroencephalogram (EEG) and synthetic 1/ƒ noise signals. Overall, we report fair to excellent internal consistency and longitudinal stability over a one-week period for the majority of entropy estimates, with several caveats. Computational timing estimates suggest distinct advantages for dispersion and permutation entropy over other entropy estimates. Considered alongside the psychometric evidence, we suggest several ways in which researchers can maximize computational resources (without sacrificing reliability), especially when working with high-density M/EEG data or multivoxel BOLD time series signals.

  11. Information Measures for Multisensor Systems

    DTIC Science & Technology

    2013-12-11

    Record excerpt (fragmented source snippets): "...permuted to generate spectra that were non-physical but preserved the entropy of the source spectra. Another 1000 spectra were constructed to mimic co..." and "...Research Laboratory (NRL) has yielded probabilistic models for spectral data that enable the computation of information measures such as entropy and..." Keywords: Chemical sensing; Information theory; Spectral data; Information entropy; Information divergence; Mass spectrometry; Infrared spectroscopy; Multisensor.

  12. Revisiting the European sovereign bonds with a permutation-information-theory approach

    NASA Astrophysics Data System (ADS)

    Fernández Bariviera, Aurelio; Zunino, Luciano; Guercio, María Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-12-01

    In this paper we study the evolution of the informational efficiency in its weak form for seventeen European sovereign bond time series. We aim to assess the impact of two specific economic situations on the hypothetical random behavior of these time series: the establishment of a common currency and a wide and deep financial crisis. In order to evaluate the informational efficiency we use permutation quantifiers derived from information theory. Specifically, the time series are ranked according to two metrics that measure the intrinsic structure of their correlations: permutation entropy and permutation statistical complexity. These measures provide the rectangular coordinates of the complexity-entropy causality plane; the planar location of the time series in this representation space reveals their degree of informational efficiency. According to our results, the currency union contributed to homogenizing the stochastic characteristics of the time series and synchronized their random behavior. Additionally, the 2008 financial crisis uncovered differences within the apparently homogeneous European sovereign markets and revealed country-specific characteristics that were partially hidden during the monetary union's heyday.

  13. Permutation Entropy Applied to Movement Behaviors of Drosophila Melanogaster

    NASA Astrophysics Data System (ADS)

    Liu, Yuedan; Chon, Tae-Soo; Baek, Hunki; Do, Younghae; Choi, Jin Hee; Chung, Yun Doo

    Movement of different strains of Drosophila melanogaster was continuously observed using computer interfacing techniques and was analyzed by permutation entropy (PE) after exposure to the toxic chemicals toluene (0.1 mg/m³) and formaldehyde (0.01 mg/m³). The PE values based on one-dimensional (vertical) position time series varied according to the internal constraint (i.e. strain) and increased accordingly in response to the external constraint (i.e. chemicals), reflecting the diversity in movement patterns in both normal and intoxicated states. Cross-correlation functions revealed temporal associations between the PE values and between the component movement patterns for the different chemicals and strains through the period of intoxication. Entropy based on the order of position data could be a useful means of measuring the complexity of behavioral changes and of monitoring the impact of stressors in the environment.

  14. EEG entropy measures in anesthesia

    PubMed Central

    Liang, Zhenhu; Wang, Yinghua; Sun, Xue; Li, Duan; Voss, Logan J.; Sleigh, Jamie W.; Hagihira, Satoshi; Li, Xiaoli

    2015-01-01

    Highlights: Twelve entropy indices were systematically compared in monitoring depth of anesthesia and detecting burst suppression; Renyi permutation entropy performed best in tracking EEG changes associated with different anesthesia states; approximate entropy and sample entropy performed best in detecting burst suppression. Objective: Entropy algorithms have been widely used in analyzing EEG signals during anesthesia. However, a systematic comparison of these entropy algorithms in assessing the effect of anesthetic drugs is lacking. In this study, we compare the capability of 12 entropy indices for monitoring depth of anesthesia (DoA) and detecting the burst suppression pattern (BSP) in anesthesia induced by GABAergic agents. Methods: Twelve indices were investigated, namely Response Entropy (RE) and State Entropy (SE), three wavelet entropy (WE) measures (Shannon WE (SWE), Tsallis WE (TWE) and Renyi WE (RWE)), Hilbert-Huang spectral entropy (HHSE), approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy, and three permutation entropy (PE) measures (Shannon PE (SPE), Tsallis PE (TPE) and Renyi PE (RPE)). Two EEG data sets, from sevoflurane-induced and isoflurane-induced anesthesia respectively, were selected to assess the capability of each entropy index in DoA monitoring and BSP detection. To validate the effectiveness of these entropy algorithms, pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability (Pk) analysis were applied. Multifractal detrended fluctuation analysis (MDFA) was included as a non-entropy measure for comparison. Results: All the entropy and MDFA indices could track the changes in EEG pattern during different anesthesia states. The three PE measures outperformed the other entropy indices, with less baseline variability and higher coefficient of determination (R²) and prediction probability, and RPE performed best; ApEn and SampEn discriminated BSP best. Additionally, the entropy measures showed an advantage in computational efficiency compared with MDFA. Conclusion: Each entropy index has its advantages and disadvantages in estimating DoA. Overall, it is suggested that the RPE index is a superior measure. Investigating the advantages and disadvantages of these entropy indices could help improve current clinical indices for monitoring DoA. PMID:25741277

  15. Efficiency and credit ratings: a permutation-information-theory analysis

    NASA Astrophysics Data System (ADS)

    Fernandez Bariviera, Aurelio; Zunino, Luciano; Belén Guercio, M.; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-08-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity-entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification.

  16. Credit market Jitters in the course of the financial crisis: A permutation entropy approach in measuring informational efficiency in financial assets

    NASA Astrophysics Data System (ADS)

    Siokis, Fotios M.

    2018-06-01

    We explore the evolution of the informational efficiency of specific instruments of the U.S. money, bond and stock exchange markets prior to and after the outbreak of the Great Recession. We utilize the permutation entropy and the complexity-entropy causality plane to rank the time series and measure the degree of informational efficiency. We find that after the credit crunch and the collapse of Lehman Brothers the efficiency level of specific money market instruments' yields falls considerably. This is evidence of less uncertainty in predicting the related yields throughout the financial disarray. A similar trend is depicted in the indices of the stock exchange markets, but their efficiency remains at much higher levels. On the other hand, bond market instruments maintained their efficiency levels even after the outbreak of the crisis, which can be interpreted as greater randomness and less predictability of their yields.

  17. Refined two-index entropy and multiscale analysis for complex system

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2016-10-01

    As a fundamental concept in describing complex systems, the entropy measure has been proposed in various forms, like Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy, permutation entropy, etc. This paper proposes a new two-index entropy Sq,δ, and we find that the new two-index entropy is applicable to measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For more complex systems, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy Sq,δ is weaker. By combining the refined two-index entropy Sq,δ with the scaling exponent h(δ), this paper analyzes the complexities of simulated series and effectively classifies several financial markets in various regions of the world.

  18. Permutation auto-mutual information of electroencephalogram in anesthesia

    NASA Astrophysics Data System (ADS)

    Liang, Zhenhu; Wang, Yinghua; Ouyang, Gaoxiang; Voss, Logan J.; Sleigh, Jamie W.; Li, Xiaoli

    2013-04-01

    Objective. The dynamic change of brain activity in anesthesia is an interesting topic for clinical doctors and drug designers. To explore the dynamical features of brain activity in anesthesia, a permutation auto-mutual information (PAMI) method is proposed to measure the information coupling of electroencephalogram (EEG) time series obtained in anesthesia. Approach. The PAMI is developed and applied to EEG data collected from 19 patients under sevoflurane anesthesia. The results are compared with the traditional auto-mutual information (AMI), SynchFastSlow (SFS, derived from the BIS index), permutation entropy (PE), composite PE (CPE), response entropy (RE) and state entropy (SE). The performance of all indices is assessed by pharmacokinetic/pharmacodynamic (PK/PD) modeling and prediction probability. Main results. The PK/PD modeling and prediction probability analysis show that the PAMI index correlates closely with the anesthetic effect. The coefficient of determination R² between PAMI values and the sevoflurane effect-site concentrations, and the prediction probability Pk, are higher in comparison with the other indices. The information coupling in EEG series can thus indicate the effect of the anesthetic drug sevoflurane on brain activity, as the other indices do. The PAMI of the EEG signals is suggested as a new index to track drug concentration changes. Significance. The PAMI is a useful index for analyzing the EEG dynamics during general anesthesia.

  19. Exact Test of Independence Using Mutual Information

    DTIC Science & Technology

    2014-05-23

    Record excerpt (fragmented source snippets): "...1000 × 0.05 = 50. Importantly, the permutation test, which does not preserve Markov order, resulted in 489 Type I errors! Using..." Published in Entropy, Vol. 16, No. 7 (2014), pp. 2839-2849.

  20. A Novel Bearing Multi-Fault Diagnosis Approach Based on Weighted Permutation Entropy and an Improved SVM Ensemble Classifier.

    PubMed

    Zhou, Shenghan; Qian, Silin; Chang, Wenbing; Xiao, Yiyong; Cheng, Yang

    2018-06-14

    Timely and accurate state detection and fault diagnosis of rolling element bearings are very critical to ensuring the reliability of rotating machinery. This paper proposes a novel method of rolling bearing fault diagnosis based on a combination of ensemble empirical mode decomposition (EEMD), weighted permutation entropy (WPE) and an improved support vector machine (SVM) ensemble classifier. A hybrid voting (HV) strategy that combines SVM-based classifiers and cloud similarity measurement (CSM) was employed to improve the classification accuracy. First, the WPE value of the bearing vibration signal was calculated to detect the fault. Secondly, if a bearing fault occurred, the vibration signal was decomposed into a set of intrinsic mode functions (IMFs) by EEMD. The WPE values of the first several IMFs were calculated to form the fault feature vectors. Then, the SVM ensemble classifier was composed of binary SVM and the HV strategy to identify the bearing multi-fault types. Finally, the proposed model was fully evaluated by experiments and comparative studies. The results demonstrate that the proposed method can effectively detect bearing faults and maintain a high accuracy rate of fault recognition when a small number of training samples are available.

  1. Forbidden patterns in financial time series

    NASA Astrophysics Data System (ADS)

    Zanin, Massimiliano

    2008-03-01

    The existence of forbidden patterns, i.e., certain missing sequences in a given time series, is a recently proposed instrument of potential application in the study of time series. Forbidden patterns are related to the permutation entropy, which has the basic properties of classic chaos indicators, such as the Lyapunov exponent or the Kolmogorov entropy, thus allowing one to separate deterministic (usually chaotic) from random series; however, it requires fewer values of the series to be calculated, and it is suitable for use with small datasets. In this paper, the appearance of forbidden patterns is studied in different economic indicators such as stock indices (Dow Jones Industrial Average and Nasdaq Composite), NYSE stocks (IBM and Boeing), and others (ten-year Bond interest rate), to find evidence of deterministic behavior in their evolution. Moreover, the rate of appearance of the forbidden patterns is calculated, and some considerations about the underlying dynamics are suggested.
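
    The forbidden-pattern count described above falls out of the same ordinal symbolization used for permutation entropy: the order-m patterns that never occur in the series are the (possibly) forbidden ones, and a purely stochastic series should eventually exhibit all of them once enough data are available. A minimal generic sketch, not the author's code, is given below; the white-noise and logistic-map examples are illustrative.

      import numpy as np
      from itertools import permutations
      from math import factorial

      def forbidden_patterns(x, m=4, tau=1):
          """Return the set of ordinal patterns of order m that never appear in the series."""
          x = np.asarray(x, dtype=float)
          n = len(x) - (m - 1) * tau
          observed = {tuple(np.argsort(x[i:i + m * tau:tau])) for i in range(n)}
          return set(permutations(range(m))) - observed

      if __name__ == "__main__":
          rng = np.random.default_rng(10)
          white = rng.standard_normal(3000)
          logistic = np.empty(3000)       # chaotic logistic map: deterministic, so true forbidden
          logistic[0] = 0.4               # patterns persist no matter how long the series grows
          for i in range(1, 3000):
              logistic[i] = 4.0 * logistic[i - 1] * (1.0 - logistic[i - 1])
          print("missing patterns, white noise :", len(forbidden_patterns(white, m=4)), "of", factorial(4))
          print("missing patterns, logistic map:", len(forbidden_patterns(logistic, m=4)), "of", factorial(4))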

  2. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers.

    PubMed

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.
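
    A sketch of a complexity-entropy causality plane computation in the spirit used here: normalized permutation entropy on one axis and the Jensen-Shannon statistical complexity on the other. The divergence is normalized numerically against a delta distribution; D, tau and the series-length requirement (much longer than D!) are illustrative choices, not those of the paper.

        import numpy as np
        from itertools import permutations
        from math import factorial

        def ordinal_probs(x, D=5, tau=1):
            """Bandt-Pompe probabilities over all D! patterns (zeros kept)."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (D - 1) * tau
            counts = {p: 0 for p in permutations(range(D))}
            for i in range(n):
                counts[tuple(np.argsort(x[i:i + D * tau:tau]).tolist())] += 1
            return np.array(list(counts.values()), dtype=float) / n

        def shannon(p):
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        def complexity_entropy_plane(x, D=5, tau=1):
            """Return (normalized PE, Jensen-Shannon statistical complexity)."""
            P = ordinal_probs(x, D, tau)
            N = factorial(D)
            Pe = np.full(N, 1.0 / N)                       # uniform reference
            H = shannon(P) / np.log(N)                     # normalized permutation entropy
            js = shannon((P + Pe) / 2) - shannon(P) / 2 - shannon(Pe) / 2
            delta = np.zeros(N); delta[0] = 1.0            # maximally concentrated distribution
            js_max = shannon((delta + Pe) / 2) - shannon(Pe) / 2
            return float(H), float(H * js / js_max)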

  3. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  4. Permutation entropy analysis of financial time series based on Hill's diversity number

    NASA Astrophysics Data System (ADS)

    Zhang, Yali; Shang, Pengjian

    2017-12-01

    In this paper the permutation entropy based on Hill's diversity number (Nn,r) is introduced as a new way to assess the complexity of a complex dynamical system such as the stock market. We test the performance of this method with simulated data. Results show that Nn,r with appropriate parameters is more sensitive to changes in the system and describes the trends of complex systems clearly. In addition, we study stock closing price series for six indices (three US stock indices and three Chinese stock indices) over different periods, and Nn,r can quantify the changes of complexity in the stock market data. Moreover, Nn,r yields richer information and reveals some differences between the US and Chinese stock indices.
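
    One way to read an ordinal Hill-number measure of this type, sketched below under explicit assumptions: the length-n ordinal-pattern distribution is summarized by Hill's diversity number of order r, (sum_i p_i^r)^(1/(1-r)), whose r -> 1 limit is the exponential of the Shannon (permutation) entropy. The authors' exact Nn,r normalization may differ from this sketch.

        import numpy as np

        def hill_ordinal_diversity(x, n=4, r=2.0, tau=1):
            """Effective number of length-n ordinal patterns, Hill number of order r."""
            x = np.asarray(x, dtype=float)
            m = len(x) - (n - 1) * tau
            counts = {}
            for i in range(m):
                key = tuple(np.argsort(x[i:i + n * tau:tau]))
                counts[key] = counts.get(key, 0) + 1
            p = np.array(list(counts.values()), dtype=float) / m
            if np.isclose(r, 1.0):                 # r -> 1 limit: exp(Shannon entropy)
                return float(np.exp(-np.sum(p * np.log(p))))
            return float(np.sum(p ** r) ** (1.0 / (1.0 - r)))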

  5. Inferring the Presence of Reverse Proxies Through Timing Analysis

    DTIC Science & Technology

    2015-06-01

    Excerpt (figure-list and table fragments from the source): Figure 3.2 shows the three different instances of timing measurement configurations, and Figure 3.3 shows the permutation of a web request iteration. Prior work cited in the report showed that at least 6 bits of entropy could be detected between unlike devices, enough to determine that they are in fact distinct. Requests were ordered depending on the permutation being executed so that every iteration was conducted under the same distance.

  6. Multiscale permutation entropy analysis of EEG recordings during sevoflurane anesthesia

    NASA Astrophysics Data System (ADS)

    Li, Duan; Li, Xiaoli; Liang, Zhenhu; Voss, Logan J.; Sleigh, Jamie W.

    2010-08-01

    Electroencephalogram (EEG) monitoring of the effect of anesthetic drugs on the central nervous system has long been used in anesthesia research. Several methods based on nonlinear dynamics, such as permutation entropy (PE), have been proposed to analyze EEG series during anesthesia. However, these measures are still single-scale based and may not completely describe the dynamical characteristics of complex EEG series. In this paper, a novel measure combining multiscale PE information, called CMSPE (composite multi-scale permutation entropy), was proposed for quantifying the anesthetic drug effect on EEG recordings during sevoflurane anesthesia. Three sets of simulated EEG series during awake, light and deep anesthesia were used to select the parameters for the multiscale PE analysis: embedding dimension m, lag τ and scales to be integrated into the CMSPE index. Then, the CMSPE index and raw single-scale PE index were applied to EEG recordings from 18 patients who received sevoflurane anesthesia. Pharmacokinetic/pharmacodynamic (PKPD) modeling was used to relate the measured EEG indices and the anesthetic drug concentration. Prediction probability (Pk) statistics and correlation analysis with the response entropy (RE) index, derived from the spectral entropy (M-entropy module; GE Healthcare, Helsinki, Finland), were investigated to evaluate the effectiveness of the new proposed measure. It was found that raw single-scale PE was blind to subtle transitions between light and deep anesthesia, while the CMSPE index tracked these changes accurately. Around the time of loss of consciousness, CMSPE responded significantly more rapidly than the raw PE, with the absolute slopes of linearly fitted response versus time plots of 0.12 (0.09-0.15) and 0.10 (0.06-0.13), respectively. The prediction probability Pk of 0.86 (0.85-0.88) and 0.85 (0.80-0.86) for CMSPE and raw PE indicated that the CMSPE index correlated well with the underlying anesthetic effect. The correlation coefficient for the comparison between the CMSPE index and RE index of 0.84 (0.80-0.88) was significantly higher than the raw PE index of 0.75 (0.66-0.84). The results show that the CMSPE outperforms the raw single-scale PE in reflecting the sevoflurane drug effect on the central nervous system.
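
    One plausible reading of a composite multiscale PE index, sketched below: coarse-grain the signal at each selected scale by non-overlapping averaging, compute normalized PE at each scale, and combine the chosen scales by averaging. The embedding dimension, lag, scale set and the exact way the paper combines scales into the CMSPE index are assumptions here.

        import numpy as np
        from math import factorial, log

        def perm_entropy(x, m=6, tau=1):
            x = np.asarray(x, dtype=float)
            n = len(x) - (m - 1) * tau
            counts = {}
            for i in range(n):
                key = tuple(np.argsort(x[i:i + m * tau:tau]))
                counts[key] = counts.get(key, 0) + 1
            p = np.array(list(counts.values()), dtype=float) / n
            return float(-np.sum(p * np.log(p)) / log(factorial(m)))

        def coarse_grain(x, scale):
            """Non-overlapping averages of `scale` consecutive samples."""
            x = np.asarray(x, dtype=float)
            n = len(x) // scale
            return x[:n * scale].reshape(n, scale).mean(axis=1)

        def composite_multiscale_pe(x, m=6, tau=1, scales=(1, 2, 3, 4, 5)):
            """Average normalized PE of the coarse-grained series over the chosen scales."""
            return float(np.mean([perm_entropy(coarse_grain(x, s), m, tau) for s in scales]))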

  7. The complexity of gene expression dynamics revealed by permutation entropy

    PubMed Central

    2010-01-01

    Background High complexity is considered a hallmark of living systems. Here we investigate the complexity of temporal gene expression patterns using the concept of Permutation Entropy (PE) first introduced in dynamical systems theory. The analysis of gene expression data has so far focused primarily on the identification of differentially expressed genes, or on the elucidation of pathway and regulatory relationships. We aim to study gene expression time series data from the viewpoint of complexity. Results Applying the PE complexity metric to abiotic stress response time series data in Arabidopsis thaliana, genes involved in stress response and signaling were found to be associated with the highest complexity not only under stress, but surprisingly, also under reference, non-stress conditions. Genes with house-keeping functions exhibited lower PE complexity. Compared to reference conditions, the PE of temporal gene expression patterns generally increased upon stress exposure. High-complexity genes were found to have longer upstream intergenic regions and more cis-regulatory motifs in their promoter regions indicative of a more complex regulatory apparatus needed to orchestrate their expression, and to be associated with higher correlation network connectivity degree. Arabidopsis genes also present in other plant species were observed to exhibit decreased PE complexity compared to Arabidopsis specific genes. Conclusions We show that Permutation Entropy is a simple yet robust and powerful approach to identify temporal gene expression profiles of varying complexity that is equally applicable to other types of molecular profile data. PMID:21176199

  8. Effects of propofol, sevoflurane, remifentanil, and (S)-ketamine in subanesthetic concentrations on visceral and somatosensory pain-evoked potentials.

    PubMed

    Untergehrer, Gisela; Jordan, Denis; Eyl, Sebastian; Schneider, Gerhard

    2013-02-01

    Although electroencephalographic parameters and auditory evoked potentials (AEP) reflect the hypnotic component of anesthesia, there is currently no specific and mechanism-based monitoring tool for anesthesia-induced blockade of nociceptive inputs. The aim of this study was to assess visceral pain-evoked potentials (VPEP) and contact heat-evoked potentials (CHEP) as electroencephalographic indicators of drug-induced changes of visceral and somatosensory pain. Additionally, AEP and electroencephalographic permutation entropy were used to evaluate sedative components of the applied drugs. In a study enrolling 60 volunteers, VPEP, CHEP (amplitude N2-P1), and AEP (latency Nb, amplitude Pa-Nb) were recorded without drug application and at two subanesthetic concentration levels of propofol, sevoflurane, remifentanil, or (s)-ketamine. Drug-induced changes of evoked potentials were analyzed. VPEP were generated by electric stimuli using bipolar electrodes positioned in the distal esophagus. For CHEP, heat pulses were given to the medial aspect of the right forearm using a CHEP stimulator. In addition to AEP, electroencephalographic permutation entropy was used to indicate level of sedation. With increasing concentrations of propofol, sevoflurane, remifentanil, and (s)-ketamine, VPEP and CHEP N2-P1 amplitudes decreased. AEP and electroencephalographic permutation entropy showed neither clinically relevant nor statistically significant suppression of cortical activity during drug application. Decreasing VPEP and CHEP amplitudes under subanesthetic concentrations of propofol, sevoflurane, remifentanil, and (s)-ketamine indicate suppressive drug effects. These effects seem to be specific for analgesia.

  9. Regional Value Analysis at Threat Evaluation

    DTIC Science & Technology

    2014-06-01

    Excerpt (reference fragments only): the report cites work on threat evaluation of targets based on information entropy and fuzzy optimization theory (Industrial Engineering and Engineering Management, IEEM, 2011) and on assignment by virtual permutation and tabu search heuristics (IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 2010).

  10. Permutation entropy with vector embedding delays

    NASA Astrophysics Data System (ADS)

    Little, Douglas J.; Kane, Deb M.

    2017-12-01

    Permutation entropy (PE) is a statistic used widely for the detection of structure within a time series. Embedding delay times at which the PE is reduced are characteristic timescales for which such structure exists. Here, a generalized scheme is investigated where embedding delays are represented by vectors rather than scalars, permitting PE to be calculated over a (D-1)-dimensional space, where D is the embedding dimension. This scheme is applied to numerically generated noise, sine wave and logistic map series, and experimental data sets taken from a vertical-cavity surface emitting laser exhibiting temporally localized pulse structures within the round-trip time of the laser cavity. Results are visualized as PE maps as a function of embedding delay, with low PE values indicating combinations of embedding delays where correlation structure is present. It is demonstrated that vector embedding delays enable identification of structure that is ambiguous or masked, when the embedding delay is constrained to scalar form.
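
    A sketch of the vector-delay idea for D = 3, assuming cumulative delays so the embedded vector is (x_i, x_{i+d1}, x_{i+d1+d2}); whether the delays are cumulative or measured from the first sample is a convention choice, and the grid size here is illustrative.

        import numpy as np
        from math import factorial, log

        def pe_for_delays(x, delays):
            """Normalized PE for one vector of embedding delays, D = len(delays) + 1."""
            x = np.asarray(x, dtype=float)
            offsets = np.concatenate(([0], np.cumsum(delays)))
            n = len(x) - int(offsets[-1])
            counts = {}
            for i in range(n):
                key = tuple(np.argsort(x[i + offsets]))
                counts[key] = counts.get(key, 0) + 1
            p = np.array(list(counts.values()), dtype=float) / n
            return float(-np.sum(p * np.log(p)) / log(factorial(len(offsets))))

        def pe_map(x, max_delay=30):
            """2-D PE map over (d1, d2) for D = 3; low values flag delay pairs
            at which correlation structure is present."""
            grid = np.zeros((max_delay, max_delay))
            for d1 in range(1, max_delay + 1):
                for d2 in range(1, max_delay + 1):
                    grid[d1 - 1, d2 - 1] = pe_for_delays(x, (d1, d2))
            return grid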

  11. A fault diagnosis scheme for planetary gearboxes using adaptive multi-scale morphology filter and modified hierarchical permutation entropy

    NASA Astrophysics Data System (ADS)

    Li, Yongbo; Li, Guoyan; Yang, Yuantao; Liang, Xihui; Xu, Minqiang

    2018-05-01

    The fault diagnosis of planetary gearboxes is crucial to reduce the maintenance costs and economic losses. This paper proposes a novel fault diagnosis method based on adaptive multi-scale morphological filter (AMMF) and modified hierarchical permutation entropy (MHPE) to identify the different health conditions of planetary gearboxes. In this method, AMMF is firstly adopted to remove the fault-unrelated components and enhance the fault characteristics. Second, MHPE is utilized to extract the fault features from the denoised vibration signals. Third, Laplacian score (LS) approach is employed to refine the fault features. In the end, the obtained features are fed into the binary tree support vector machine (BT-SVM) to accomplish the fault pattern identification. The proposed method is numerically and experimentally demonstrated to be able to recognize the different fault categories of planetary gearboxes.

  12. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    DTIC Science & Technology

    2014-07-01

    Excerpt (garbled in the source extraction): subgraph permutations with the same mismatch cost relative to the query graph often arise for homogeneous and/or symmetrical data/queries. Decisions are generated in a bottom-up manner using the metric of entropy at the cluster level (Figure 9c); using the definition of belief messages for a cluster and a set of data nodes in this cluster, the entropy of the forward and backward messages is computed as a Shannon entropy (the formula itself is garbled in the source).

  13. Permutation entropy analysis of heart rate variability for the assessment of cardiovascular autonomic neuropathy in type 1 diabetes mellitus.

    PubMed

    Carricarte Naranjo, Claudia; Sanchez-Rodriguez, Lazaro M; Brown Martínez, Marta; Estévez Báez, Mario; Machado García, Andrés

    2017-07-01

    Heart rate variability (HRV) analysis is a relevant tool for the diagnosis of cardiovascular autonomic neuropathy (CAN). To our knowledge, no previous investigation on CAN has assessed the complexity of HRV from an ordinal perspective. Therefore, the aim of this work is to explore the potential of permutation entropy (PE) analysis of HRV complexity for the assessment of CAN. For this purpose, we performed a short-term PE analysis of HRV in healthy subjects and type 1 diabetes mellitus patients, including patients with CAN. Standard HRV indicators were also calculated in the control group. A discriminant analysis was used to select the combination of variables with the best discriminative power between the control and CAN patient groups, as well as for classifying cases. We found that for some specific temporal scales, PE indicators were significantly lower in CAN patients than in controls. In such cases, there were ordinal patterns with high probabilities of occurrence, while others were hardly found. We posit that this behavior occurs due to a decrease of HRV complexity in the diseased system. Discriminant functions based on PE measures or on probabilities of occurrence of ordinal patterns provided average classification accuracies of 75% and 96%, respectively. Correlations between PE and HRV measures were found to depend only on the temporal scale, regardless of pattern length. PE analysis at some specific temporal scales seems to provide additional information to that obtained with traditional HRV methods. We concluded that PE analysis of HRV is a promising method for the assessment of CAN. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Complexity analysis of the turbulent environmental fluid flow time series

    NASA Astrophysics Data System (ADS)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
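
    For orientation, a commonly used simplified estimate of Lempel-Ziv (Kolmogorov-type) complexity on a median-binarized series; it returns a single normalized value and does not reproduce the lower/upper KL variants or the SE and PE computations of the paper.

        import numpy as np

        def lempel_ziv_phrases(bits):
            """Count LZ76-style phrases in a 0/1 sequence (simplified parsing)."""
            s = ''.join(map(str, bits))
            i, c, n = 0, 0, len(s)
            while i < n:
                k = 1
                # grow the phrase while it already occurs in the text seen so far
                while i + k <= n and s[i:i + k] in s[:i + k - 1]:
                    k += 1
                c += 1
                i += k
            return c

        def kolmogorov_complexity(x):
            """Normalized complexity: binarize at the median, then c(n) * log2(n) / n."""
            x = np.asarray(x, dtype=float)
            b = (x > np.median(x)).astype(int)
            n = len(b)
            return lempel_ziv_phrases(b) * np.log2(n) / n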

  15. Amplitude-aware permutation entropy: Illustration in spike detection and signal segmentation.

    PubMed

    Azami, Hamed; Escudero, Javier

    2016-05-01

    Signal segmentation and spike detection are two important biomedical signal processing applications. Often, non-stationary signals must be segmented into piece-wise stationary epochs or spikes need to be found among a background of noise before being further analyzed. Permutation entropy (PE) has been proposed to evaluate the irregularity of a time series. PE is conceptually simple, structurally robust to artifacts, and computationally fast. It has been extensively used in many applications, but it has two key shortcomings. First, when a signal is symbolized using the Bandt-Pompe procedure, only the order of the amplitude values is considered and information regarding the amplitudes is discarded. Second, in the PE, the effect of equal amplitude values in each embedded vector is not addressed. To address these issues, we propose a new entropy measure based on PE: the amplitude-aware permutation entropy (AAPE). AAPE is sensitive to the changes in the amplitude, in addition to the frequency, of the signals thanks to it being more flexible than the classical PE in the quantification of the signal motifs. To demonstrate how the AAPE method can enhance the quality of the signal segmentation and spike detection, a set of synthetic and realistic synthetic neuronal signals, electroencephalograms and neuronal data are processed. We compare the performance of AAPE in these problems against state-of-the-art approaches and evaluate the significance of the differences with a repeated ANOVA with post hoc Tukey's test. In signal segmentation, the accuracy of AAPE-based method is higher than conventional segmentation methods. AAPE also leads to more robust results in the presence of noise. The spike detection results show that AAPE can detect spikes well, even when presented with single-sample spikes, unlike PE. For multi-sample spikes, the changes in AAPE are larger than in PE. We introduce a new entropy metric, AAPE, that enables us to consider amplitude information in the formulation of PE. The AAPE algorithm can be used in almost every irregularity-based application in various signal and image processing fields. We also made freely available the Matlab code of the AAPE. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
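
    A compact sketch of the AAPE idea as we read it from the abstract: instead of counting each ordinal pattern once, every embedded vector contributes a mixture of its mean absolute value (weight A) and its mean absolute difference (weight 1 - A) to its pattern. The authors' released Matlab code should be treated as the reference; parameter defaults here are illustrative.

        import numpy as np
        from math import factorial, log

        def aape(x, m=3, tau=1, A=0.5):
            """Amplitude-aware permutation entropy (normalized), sketch version."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (m - 1) * tau
            contrib = {}
            for i in range(n):
                v = x[i:i + m * tau:tau]
                w = (A / m) * np.sum(np.abs(v)) \
                    + ((1 - A) / (m - 1)) * np.sum(np.abs(np.diff(v)))
                key = tuple(np.argsort(v))
                contrib[key] = contrib.get(key, 0.0) + w
            total = sum(contrib.values())
            p = np.array([c / total for c in contrib.values() if c > 0])
            return float(-np.sum(p * np.log(p)) / log(factorial(m)))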

  16. Complexity of heart rate fluctuations in near-term sheep and human fetuses during sleep.

    PubMed

    Frank, Birgit; Frasch, Martin G; Schneider, Uwe; Roedel, Marcus; Schwab, Matthias; Hoyer, Dirk

    2006-10-01

    We investigated how the complexity of fetal heart rate fluctuations (fHRF) is related to the sleep states in sheep and human fetuses. The complexity as a function of time scale for fetal heart rate data for 7 sheep and 27 human fetuses was estimated in rapid eye movement (REM) and non-REM sleep by means of permutation entropy and the associated Kullback-Leibler entropy. We found that in humans, fHRF complexity is higher in non-REM than REM sleep, whereas in sheep this relationship is reversed. To show this relation, choice of the appropriate time scale is crucial. In sheep fetuses, we found differences in the complexity of fHRF between REM and non-REM sleep only for larger time scales (above 2.5 s), whereas in human fetuses the complexity was clearly different between REM and non-REM sleep over the whole range of time scales. This may be due to inherent time scales of complexity, which reflect species-specific functions of the autonomic nervous system. Such differences have to be considered when animal data are translated to the human situation.

  17. Extending Differential Fault Analysis to Dynamic S-Box Advanced Encryption Standard Implementations

    DTIC Science & Technology

    2014-09-18

    Excerpt (garbled in the source extraction): attackers exploit weaknesses in the algorithm or use side channels, such as Differential Fault Analysis (DFA), to reduce entropy; at the same time, continuing research strives to enhance AES and mitigate these growing threats. This paper researches the extension of existing DFA attacks to dynamic S-box AES implementations. The S-box is an 8-bit 16x16 table built from an affine transformation on multiplicative inverses, which guarantees a full permutation.

  18. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through a multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, namely multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. This implies that time series generated by the same model are more likely to show similar irregularity than others, and that differences between stock indices, caused by the country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but series generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in these experiments than MDSC.

  19. Shortening a loop can increase protein native state entropy.

    PubMed

    Gavrilov, Yulian; Dagan, Shlomi; Levy, Yaakov

    2015-12-01

    Protein loops are essential structural elements that influence not only function but also protein stability and folding rates. It was recently reported that shortening a loop in the AcP protein may increase its native-state conformational entropy. This effect on the entropy of the folded state can be much larger than the lower entropic penalty of ordering a shorter loop upon folding, and can therefore result in a more pronounced stabilization than predicted by polymer models for loop closure entropy. In this study, which aims at generalizing the effect of loop shortening on native-state dynamics, we use all-atom molecular dynamics simulations to study how gradually shortening a very long or solvent-exposed loop region in four different proteins can affect their stability. For two proteins, AcP and Ubc7, we show an increase in native-state entropy in addition to the known effect of the loop length on the unfolded-state entropy. However, for two permutants of the SH3 domain, shortening a loop results only in the expected change in the entropy of the unfolded state, which nicely reproduces the observed experimental stabilization. Here, we show that an increase in the native-state entropy following loop shortening is not unique to the AcP protein, yet nor is it a general rule that applies to all proteins following the truncation of any loop. The effect of loop-length modification on both the folded and the unfolded state may thus result in a greater overall effect on protein stability. © 2015 Wiley Periodicals, Inc.

  20. On the efficiency of sovereign bond markets

    NASA Astrophysics Data System (ADS)

    Zunino, Luciano; Fernández Bariviera, Aurelio; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2012-09-01

    The existence of memory in financial time series has been extensively studied for several stock markets around the world by means of different approaches. However, fixed income markets, i.e. those where corporate and sovereign bonds are traded, have been much less studied. We believe that, given the relevance of these markets, not only from the investors', but also from the issuers' point of view (government and firms), it is necessary to fill this gap in the literature. In this paper, we study the sovereign market efficiency of thirty bond indices of both developed and emerging countries, using an innovative statistical tool in the financial literature: the complexity-entropy causality plane. This representation space allows us to establish an efficiency ranking of different markets and distinguish different bond market dynamics. We conclude that the classification derived from the complexity-entropy causality plane is consistent with the qualifications assigned by major rating companies to the sovereign instruments. Additionally, we find a correlation between permutation entropy, economic development and market size that could be of interest for policy makers and investors.

  1. Cipher image damage and decisions in real time

    NASA Astrophysics Data System (ADS)

    Silva-García, Victor Manuel; Flores-Carapia, Rolando; Rentería-Márquez, Carlos; Luna-Benoso, Benjamín; Jiménez-Vázquez, Cesar Antonio; González-Ramírez, Marlon David

    2015-01-01

    This paper proposes a method for constructing permutations on m-position arrangements. Our objective is to encrypt color images using the advanced encryption standard (AES) with variable permutations, meaning a different permutation for each 128-bit block, in the first round after the x-or operation is applied. Furthermore, this research offers the possibility of recovering the original image when the encrypted figure has been damaged, whether by an attack or not. This is achieved by permuting the original image pixel positions before they are encrypted with AES variable permutations, which means building a pseudorandom permutation of arrays of 250,000 positions or more. To this end, an algorithm that defines a bijective function between the set of nonnegative integers and the set of permutations is built. From this algorithm, the way to build permutations on the 0,1,…,m-1 array, knowing m-1 constants, is presented. Transcendental numbers are used to select these m-1 constants in a pseudorandom way. The quality of the proposed encryption is evaluated according to the following criteria: the correlation coefficient, the entropy, and the discrete Fourier transform. A goodness-of-fit test for each basic color image is proposed to measure the degree of bit randomness of the encrypted figure. Furthermore, the cipher images are obtained in a lossless way, i.e., no JPEG file formats are used.

  2. Value Focused Thinking Applications to Supervised Pattern Classification With Extensions to Hyperspectral Anomaly Detection Algorithms

    DTIC Science & Technology

    2015-03-26

    Excerpt (garbled in the source extraction): all reasonable permutations of factors are used to develop a multitude of unique combinations, each considered different. The impurity measures follow Duda et al. (2001): entropy impurity i(N) = -Σ_j P(ω_j) log2 P(ω_j), and Gini impurity i(N) = (1/2)[1 - Σ_j P(ω_j)^2]. As the proportion of one class to another approaches 0.5, the impurity measure reaches its maximum, which is 1.0 for entropy and 0.5 for Gini.

  3. Phase Transitions in Definite Total Spin States of Two-Component Fermi Gases.

    PubMed

    Yurovsky, Vladimir A

    2017-05-19

    Second-order phase transitions have no latent heat and are characterized by a change in symmetry. In addition to the conventional symmetric and antisymmetric states under permutations of bosons and fermions, mathematical group-representation theory allows for non-Abelian permutation symmetry. Such symmetry can be hidden in states with defined total spins of spinor gases, which can be formed in optical cavities. The present work shows that the symmetry reveals itself in spin-independent or coordinate-independent properties of these gases, namely as non-Abelian entropy in thermodynamic properties. In weakly interacting Fermi gases, two phases appear associated with fermionic and non-Abelian symmetry under permutations of particle states, respectively. The second-order transitions between the phases are characterized by discontinuities in specific heat. Unlike other phase transitions, the present ones are not caused by interactions and can appear even in ideal gases. Similar effects in Bose gases and strong interactions are discussed.

  4. Monitoring the informational efficiency of European corporate bond markets with dynamical permutation min-entropy

    NASA Astrophysics Data System (ADS)

    Zunino, Luciano; Bariviera, Aurelio F.; Guercio, M. Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2016-08-01

    In this paper the permutation min-entropy has been implemented to unveil the presence of temporal structures in the daily values of European corporate bond indices from April 2001 to August 2015. More precisely, the evolution of the informational efficiency of the prices of fifteen sectorial indices has been carefully studied by estimating this information-theory-derived symbolic tool over a sliding time window. Such a dynamical analysis makes it possible to obtain relevant conclusions about the effect that the 2008 credit crisis has had on the different European corporate bond sectors. It is found that the informational efficiency of some sectors, namely banks, financial services, insurance, and basic resources, has been strongly reduced due to the financial crisis, whereas another set of sectors, comprising chemicals, automobiles, media, energy, construction, industrial goods & services, technology, and telecommunications, has only suffered a transitory loss of efficiency. Last but not least, the food & beverage, healthcare, and utilities sectors show a behavior close to a random walk practically throughout the whole period of analysis, confirming a remarkable immunity to the 2008 financial crisis.
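
    For concreteness, a minimal sketch of a sliding-window permutation min-entropy of the kind used here: the normalized min-entropy of the ordinal-pattern distribution is -ln(p_max) / ln(D!), which is driven by the single most probable pattern. Window length, step, D and tau are illustrative, not the paper's choices.

        import numpy as np
        from math import factorial, log

        def permutation_min_entropy(x, D=4, tau=1):
            """Normalized permutation min-entropy: -ln(max pattern probability) / ln(D!)."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (D - 1) * tau
            counts = {}
            for i in range(n):
                key = tuple(np.argsort(x[i:i + D * tau:tau]))
                counts[key] = counts.get(key, 0) + 1
            p_max = max(counts.values()) / n
            return -log(p_max) / log(factorial(D))

        def sliding_min_entropy(x, window=250, step=5, D=4, tau=1):
            """Sliding-window estimate used to track informational efficiency over time."""
            return [permutation_min_entropy(x[i:i + window], D, tau)
                    for i in range(0, len(x) - window + 1, step)]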

  5. Permutation entropy of finite-length white-noise time series.

    PubMed

    Little, Douglas J; Kane, Deb M

    2016-08-01

    Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ^{2} distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
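
    A Monte Carlo illustration of the kind of limit described above, under a simplifying assumption: with independent (non-overlapping) length-D windows of i.i.d. noise, the standard G-test argument gives 2N(ln D! - PE) approximately chi-squared with D! - 1 degrees of freedom. The paper handles the general overlapping case and the exact finite-N expectation and variance, which this sketch does not.

        import numpy as np
        from math import factorial, log

        def pe_from_windows(windows):
            """Unnormalized PE from a set of length-D windows (one ordinal trial each)."""
            counts = {}
            for w in windows:
                key = tuple(np.argsort(w))
                counts[key] = counts.get(key, 0) + 1
            p = np.array(list(counts.values()), dtype=float) / len(windows)
            return float(-np.sum(p * np.log(p)))

        rng = np.random.default_rng(1)
        D, N, trials = 3, 2000, 1000
        stat = [2 * N * (log(factorial(D)) - pe_from_windows(rng.standard_normal((N, D))))
                for _ in range(trials)]
        # For a chi-squared law with D! - 1 = 5 degrees of freedom the mean is 5
        # and the variance is 10; the empirical values should be close to these.
        print(np.mean(stat), np.var(stat))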

  6. Analysis of HD 73045 light curve data

    NASA Astrophysics Data System (ADS)

    Das, Mrinal Kanti; Bhatraju, Naveen Kumar; Joshi, Santosh

    2018-04-01

    In this work we analyzed the Kepler light curve data of HD 73045. The raw data were smoothed using standard filters. The power spectrum was obtained using a fast Fourier transform routine and shows the presence of more than one period. In order to take care of any non-stationary behavior, we carried out a wavelet analysis to obtain the wavelet power spectrum. In addition, to identify scale-invariant structure, the data were analyzed using multifractal detrended fluctuation analysis. Further, to characterize the diversity of embedded patterns in the HD 73045 flux time series, we computed various entropy-based complexity measures, e.g., sample entropy, spectral entropy and permutation entropy. The presence of periodic structure in the time series was further analyzed using the visibility network and horizontal visibility network models of the time series. The degree distributions in the two network models confirm such structures.

  7. Complexity of cardiovascular rhythms during head-up tilt test by entropy of patterns.

    PubMed

    Wejer, Dorota; Graff, Beata; Makowiec, Danuta; Budrejko, Szymon; Struzik, Zbigniew R

    2017-05-01

    The head-up tilt (HUT) test, which provokes transient dynamical alterations in the regulation of the cardiovascular system, provides insights into the complex organization of this system. Based on signals of heart period intervals (RR-intervals) and/or systolic blood pressure (SBP), differences in cardiovascular regulation between vasovagal patients (VVS) and a healthy control group (CG) are investigated. Short-term relations among signal data, represented symbolically by three-beat patterns, allow us to qualify and quantify the complexity of cardiovascular regulation by Shannon entropy. Four types of patterns are used: permutation, ordinal, deterministic and dynamical; different resolutions of signal values are applied in the symbolization in order to verify how the entropy of patterns depends on the way in which signal values are preprocessed. At rest, in the physiologically important signal resolution ranges and independently of the type of patterns used in the estimates, the complexity of SBP signals in VVS is different from the complexity found in CG. The entropy of VVS is higher than that of CG, which could be interpreted as a substantial presence of noisy components in the SBP of VVS. After tilting, this relation reverses: the entropy of CG becomes significantly higher than that of VVS for SBP signals. In the case of RR-intervals and large resolutions, the complexity after the tilt is reduced compared to the complexity of RR-intervals at rest for both groups. However, in the case of VVS patients this reduction is significantly stronger than in CG. Our observations of opposite switches in entropy between CG and VVS might support the hypothesis that in VVS the baroreflex affects the heart rate more strongly because of inefficient regulation of blood pressure (possibly impaired local vascular tone alternations).

  8. Characterization of Early Partial Seizure Onset: Frequency, Complexity and Entropy

    PubMed Central

    Jouny, Christophe C.; Bergey, Gregory K.

    2011-01-01

    Objective A clear classification of partial seizures onset features is not yet established. Complexity and entropy have been very widely used to describe dynamical systems, but a systematic evaluation of these measures to characterize partial seizures has never been performed. Methods Eighteen different measures including power in frequency bands up to 300Hz, Gabor atom density (GAD), Higuchi fractal dimension (HFD), Lempel-Ziv complexity, Shannon entropy, sample entropy, and permutation entropy, were selected to test sensitivity to partial seizure onset. Intracranial recordings from forty-five patients with mesial temporal, neocortical temporal and neocortical extratemporal seizure foci were included (331 partial seizures). Results GAD, Lempel-Ziv complexity, HFD, high frequency activity, and sample entropy were the most reliable measures to assess early seizure onset. Conclusions Increases in complexity and occurrence of high-frequency components appear to be commonly associated with early stages of partial seizure evolution from all regions. The type of measure (frequency-based, complexity or entropy) does not predict the efficiency of the method to detect seizure onset. Significance Differences between measures such as GAD and HFD highlight the multimodal nature of partial seizure onsets. Improved methods for early seizure detection may be achieved from a better understanding of these underlying dynamics. PMID:21872526

  9. Bubble Entropy: An Entropy Almost Free of Parameters.

    PubMed

    Manis, George; Aktaruzzaman, Md; Sassi, Roberto

    2017-11-01

    Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and count instead the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least, widely used. However, the parameters are still there, application and dataset dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
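
    A sketch following our reading of the definition above: for each embedded vector, count the swaps that bubble sort needs, take the Renyi-2 entropy of the swap-count distribution, and normalize the growth of that entropy from m to m + 1 samples. Consult the paper for the exact normalization; m here is illustrative.

        import numpy as np

        def bubble_sort_swaps(v):
            """Number of swaps bubble sort performs to order v (its inversion count)."""
            v = list(v)
            swaps = 0
            for i in range(len(v)):
                for j in range(len(v) - 1 - i):
                    if v[j] > v[j + 1]:
                        v[j], v[j + 1] = v[j + 1], v[j]
                        swaps += 1
            return swaps

        def swap_count_entropy(x, m):
            """Renyi-2 entropy of the swap-count distribution over embedded vectors."""
            x = np.asarray(x, dtype=float)
            n = len(x) - m + 1
            counts = np.zeros(m * (m - 1) // 2 + 1)     # swap counts range 0 .. m(m-1)/2
            for i in range(n):
                counts[bubble_sort_swaps(x[i:i + m])] += 1
            p = counts / n
            return -np.log(np.sum(p ** 2))

        def bubble_entropy(x, m=10):
            """Entropy growth from m to m + 1 embedded samples, normalized."""
            return float((swap_count_entropy(x, m + 1) - swap_count_entropy(x, m))
                         / np.log((m + 1) / (m - 1)))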

  10. Spectral Dynamics of Resting State fMRI Within the Ventral Tegmental Area and Dorsal Raphe Nuclei in Medication-Free Major Depressive Disorder in Young Adults.

    PubMed

    Wohlschläger, Afra; Karne, Harish; Jordan, Denis; Lowe, Mark J; Jones, Stephen E; Anand, Amit

    2018-01-01

    Background: The dorsal raphe nucleus (DRN) and ventral tegmental area (VTA) are major brainstem monoamine nuclei consisting of serotonin and dopamine neurons, respectively. Animal studies show that firing patterns in both nuclei are altered when animals exhibit depression-like behaviors. Functional MRI studies in humans have shown reduced VTA activation and DRN connectivity in depression. This study for the first time aims at investigating the functional integrity of local neuronal firing concurrently in both the VTA and DRN in vivo in humans, using spectral analysis of resting-state low-frequency-fluctuation fMRI. Method: A total of 97 medication-free subjects (67 young patients, ages 18-30, with major depressive disorder and 30 closely matched healthy controls) were included in the study to detect aberrant dynamics in the DRN and VTA. For the investigation of altered localized dynamics we conducted power spectral analysis and, beyond this, spectral cross-correlation between the two groups. Complementary to this, the spectral dependence of permutation entropy, an information-theoretical measure, was compared between groups. Results: Patients displayed significant spectral slowing in the VTA vs. controls (p = 0.035, corrected). In the DRN, spectral slowing was less pronounced, but the amount of slowing correlated significantly with 17-item Hamilton Depression Rating scores of depression severity (p = 0.038). Signal complexity as assessed via permutation entropy showed spectral alterations in line with the results on spectral slowing. Conclusion: Our results indicate that altered functional dynamics of the VTA and DRN in depression can be detected from the regional fMRI signal. On this basis, the impact of antidepressant treatment and treatment response can be assessed using these markers in future studies.

  11. Folding pathway of a multidomain protein depends on its topology of domain connectivity

    PubMed Central

    Inanami, Takashi; Terada, Tomoki P.; Sasai, Masaki

    2014-01-01

    How do the folding mechanisms of multidomain proteins depend on protein topology? We addressed this question by developing an Ising-like structure-based model and applying it for the analysis of free-energy landscapes and folding kinetics of an example protein, Escherichia coli dihydrofolate reductase (DHFR). DHFR has two domains, one comprising discontinuous N- and C-terminal parts and the other comprising a continuous middle part of the chain. The simulated folding pathway of DHFR is a sequential process during which the continuous domain folds first, followed by the discontinuous domain, thereby avoiding the rapid decrease in conformation entropy caused by the association of the N- and C-terminal parts during the early phase of folding. Our simulated results consistently explain the observed experimental data on folding kinetics and predict an off-pathway structural fluctuation at equilibrium. For a circular permutant for which the topological complexity of wild-type DHFR is resolved, the balance between energy and entropy is modulated, resulting in the coexistence of the two folding pathways. This coexistence of pathways should account for the experimentally observed complex folding behavior of the circular permutant. PMID:25267632

  12. Complexity-entropy causality plane: A useful approach for distinguishing songs

    NASA Astrophysics Data System (ADS)

    Ribeiro, Haroldo V.; Zunino, Luciano; Mendes, Renio S.; Lenzi, Ervin K.

    2012-04-01

    Nowadays we are often faced with huge databases resulting from the rapid growth of data storage technologies. This is particularly true when dealing with music databases. In this context, it is essential to have techniques and tools able to discriminate properties from these massive sets. In this work, we report on a statistical analysis of more than ten thousand songs aiming to obtain a complexity hierarchy. Our approach is based on the estimation of the permutation entropy combined with an intensive complexity measure, building up the complexity-entropy causality plane. The results obtained indicate that this representation space is very promising to discriminate songs as well as to allow a relative quantitative comparison among songs. Additionally, we believe that the here-reported method may be applied in practical situations since it is simple, robust and has a fast numerical implementation.

  13. Symmetric encryption algorithms using chaotic and non-chaotic generators: A review

    PubMed Central

    Radwan, Ahmed G.; AbdElHaleem, Sherif H.; Abd-El-Hafiz, Salwa K.

    2015-01-01

    This paper summarizes the symmetric image encryption results of 27 different algorithms, which include substitution-only, permutation-only or both phases. The cores of these algorithms are based on several discrete chaotic maps (Arnold’s cat map and a combination of three generalized maps), one continuous chaotic system (Lorenz) and two non-chaotic generators (fractals and chess-based algorithms). Each algorithm has been analyzed by the correlation coefficients between pixels (horizontal, vertical and diagonal), differential attack measures, Mean Square Error (MSE), entropy, sensitivity analyses and the 15 standard tests of the National Institute of Standards and Technology (NIST) SP-800-22 statistical suite. The analyzed algorithms include a set of new image encryption algorithms based on non-chaotic generators, either using substitution only (using fractals) and permutation only (chess-based) or both. Moreover, two different permutation scenarios are presented where the permutation-phase has or does not have a relationship with the input image through an ON/OFF switch. Different encryption-key lengths and complexities are provided from short to long key to persist brute-force attacks. In addition, sensitivities of those different techniques to a one bit change in the input parameters of the substitution key as well as the permutation key are assessed. Finally, a comparative discussion of this work versus many recent research with respect to the used generators, type of encryption, and analyses is presented to highlight the strengths and added contribution of this paper. PMID:26966561

  14. Development of isothermal-isobaric replica-permutation method for molecular dynamics and Monte Carlo simulations and its application to reveal temperature and pressure dependence of folded, misfolded, and unfolded states of chignolin

    NASA Astrophysics Data System (ADS)

    Yamauchi, Masataka; Okumura, Hisashi

    2017-11-01

    We developed a two-dimensional replica-permutation molecular dynamics method in the isothermal-isobaric ensemble. The replica-permutation method is a better alternative to the replica-exchange method and was originally developed in the canonical ensemble. This method employs the Suwa-Todo algorithm, instead of the Metropolis algorithm, to perform permutations of temperatures and pressures among more than two replicas so that the rejection ratio can be minimized. We showed that the isothermal-isobaric replica-permutation method has better sampling efficiency than the isothermal-isobaric replica-exchange method and the infinite swapping method. We applied this method to a β-hairpin mini protein, chignolin. In this simulation, we observed not only the folded state but also the misfolded state. We calculated the temperature and pressure dependence of the fractions of the folded, misfolded, and unfolded states. Differences in partial molar enthalpy, internal energy, entropy, partial molar volume, and heat capacity were also determined and agreed well with experimental data. We observed a new phenomenon in which misfolded chignolin becomes more stable under high-pressure conditions. We also revealed the mechanism of this stability as follows: the TYR2 and TRP9 side chains cover the hydrogen bonds that form the β-hairpin structure, and these hydrogen bonds are protected from the water molecules that approach the protein as the pressure increases.

  15. Detection of frequency-mode-shift during thermoacoustic combustion oscillations in a staged aircraft engine model combustor

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hiroaki; Gotoda, Hiroshi; Tachibana, Shigeru; Yoshida, Seiji

    2017-12-01

    We conduct an experimental study using time series analysis based on symbolic dynamics to detect a precursor of frequency-mode-shift during thermoacoustic combustion oscillations in a staged aircraft engine model combustor. With increasing amount of the main fuel, a significant shift in the dominant frequency-mode occurs in noisy periodic dynamics, leading to a notable increase in oscillation amplitudes. The sustainment of noisy periodic dynamics during thermoacoustic combustion oscillations is clearly shown by the multiscale complexity-entropy causality plane in terms of statistical complexity. A modified version of the permutation entropy allows us to detect a precursor of the frequency-mode-shift before the amplification of pressure fluctuations.

  16. Time-delay signature of chaos in 1550 nm VCSELs with variable-polarization FBG feedback.

    PubMed

    Li, Yan; Wu, Zheng-Mao; Zhong, Zhu-Qiang; Yang, Xian-Jie; Mao, Song; Xia, Guang-Qiong

    2014-08-11

    Based on the framework of the spin-flip model (SFM), the output characteristics of a 1550 nm vertical-cavity surface-emitting laser (VCSEL) subject to variable-polarization fiber Bragg grating (FBG) feedback (VPFBGF) have been investigated. With the aid of the self-correlation function (SF) and the permutation entropy (PE) function, the time-delay signature (TDS) of chaos in the VPFBGF-VCSEL is evaluated, and the influences of the operation parameters on the TDS of the chaos are analyzed. The results show that the TDS of the chaos can be suppressed efficiently by selecting a suitable coupling coefficient and feedback rate of the FBG, and is weaker than that of chaos generated by traditional variable-polarization mirror feedback VCSELs (VPMF-VCSELs) or polarization-preserved FBG feedback VCSELs (PPFBGF-VCSELs).
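
    A sketch of the two delay-signature diagnostics named above, applied to a sampled intensity series: the autocorrelation (self-correlation) as a function of lag and the normalized PE as a function of the embedding delay; a peak in the former and a dip in the latter near the feedback round-trip time expose the TDS. The embedding dimension and lag ranges are illustrative.

        import numpy as np
        from math import factorial, log

        def self_correlation(x, max_lag=400):
            """Autocorrelation coefficient for lags 1 .. max_lag."""
            x = np.asarray(x, dtype=float) - np.mean(x)
            return np.array([np.corrcoef(x[:-k], x[k:])[0, 1] for k in range(1, max_lag + 1)])

        def pe_vs_delay(x, D=4, max_delay=400):
            """Normalized PE as a function of the embedding delay tau."""
            x = np.asarray(x, dtype=float)
            out = []
            for tau in range(1, max_delay + 1):
                n = len(x) - (D - 1) * tau
                counts = {}
                for i in range(n):
                    key = tuple(np.argsort(x[i:i + D * tau:tau]))
                    counts[key] = counts.get(key, 0) + 1
                p = np.array(list(counts.values()), dtype=float) / n
                out.append(-np.sum(p * np.log(p)) / log(factorial(D)))
            return np.array(out)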

  17. Dispersion entropy for the analysis of resting-state MEG regularity in Alzheimer's disease.

    PubMed

    Azami, Hamed; Rostaghi, Mostafa; Fernandez, Alberto; Escudero, Javier

    2016-08-01

    Alzheimer's disease (AD) is a progressive degenerative brain disorder affecting memory, thinking, behaviour and emotion. It is the most common form of dementia and a major social problem in western societies. The analysis of brain activity may help to diagnose this disease. Changes in entropy measures have been reported to be useful for characterizing AD in research studies. We have recently proposed dispersion entropy (DisEn) as a very fast and powerful tool to quantify the irregularity of time series. The aim of this paper is to evaluate the ability of DisEn, in comparison with fuzzy entropy (FuzEn), sample entropy (SampEn), and permutation entropy (PerEn), to discriminate 36 AD patients from 26 elderly control subjects using resting-state magnetoencephalogram (MEG) signals. The results obtained by DisEn, FuzEn, and SampEn, unlike PerEn, show that the AD patients' signals are more regular than the controls' time series. The p-values obtained by the DisEn-, FuzEn-, SampEn-, and PerEn-based methods demonstrate the superiority of DisEn over FuzEn, SampEn, and PerEn. Moreover, the computation time for the newly proposed DisEn-based method is noticeably shorter than for the FuzEn, SampEn, and PerEn based approaches.
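
    For comparison with PE, a minimal sketch of dispersion entropy as commonly defined (Rostaghi and Azami): samples are mapped to c classes through the normal cumulative distribution function, and the Shannon entropy of the length-m class-pattern distribution is normalized by ln(c^m). Parameter defaults are illustrative rather than those used in the study.

        import numpy as np
        from math import erf, sqrt, log

        def dispersion_entropy(x, m=3, c=6, d=1):
            """Normalized dispersion entropy of a 1-D series (sketch version)."""
            x = np.asarray(x, dtype=float)
            mu, sigma = x.mean(), x.std()
            y = np.array([0.5 * (1 + erf((v - mu) / (sigma * sqrt(2)))) for v in x])
            z = np.clip(np.round(c * y + 0.5), 1, c).astype(int)   # classes 1 .. c
            n = len(z) - (m - 1) * d
            counts = {}
            for i in range(n):
                key = tuple(z[i:i + m * d:d])
                counts[key] = counts.get(key, 0) + 1
            p = np.array(list(counts.values()), dtype=float) / n
            return float(-np.sum(p * np.log(p)) / log(c ** m))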

  18. Comparison of background EEG activity of different groups of patients with idiopathic epilepsy using Shannon spectral entropy and cluster-based permutation statistical testing

    PubMed Central

    Artieda, Julio; Iriarte, Jorge

    2017-01-01

    Idiopathic epilepsy is characterized by generalized seizures with no apparent cause. One of its main problems is the lack of biomarkers to monitor the evolution of patients. The only tools clinicians can use are limited to inspecting the number of seizures during previous periods of time and assessing the existence of interictal discharges. As a result, there is a need to improve the tools that assist the diagnosis and follow-up of these patients. The goal of the present study is to compare and find a way to differentiate between two groups of patients suffering from idiopathic epilepsy: one group that could be followed up by means of specific electroencephalographic (EEG) signatures (intercritical activity present), and another that could not, due to the absence of these markers. To do that, we analyzed the background EEG activity of each group in the absence of seizures and epileptic intercritical activity. We used the Shannon spectral entropy (SSE) as a metric to discriminate between the two groups and performed permutation-based statistical tests to detect the set of frequencies that show significant differences. By constraining the spectral entropy estimation to the [6.25–12.89) Hz range, we detect statistical differences (below the 0.05 alpha level) between both types of epileptic patients at all available recording channels. Interestingly, entropy values follow a trend that is inversely related to the elapsed time from the last seizure. Indeed, this trend shows asymptotic convergence to the SSE values measured in a group of healthy subjects, who present SSE values lower than either of the two groups of patients. All these results suggest that the SSE, measured in a specific range of frequencies, could serve to follow up the evolution of patients suffering from idiopathic epilepsy. Future studies remain to be conducted in order to assess the predictive value of this approach for the anticipation of seizures. PMID:28922360
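
    A periodogram-based sketch of a band-limited Shannon spectral entropy of the sort described above; the paper's exact spectral estimator, windowing and normalization are not reproduced, and the band limits simply follow the abstract.

        import numpy as np

        def shannon_spectral_entropy(x, fs, f_lo=6.25, f_hi=12.89, normalize=True):
            """SSE of one channel restricted to [f_lo, f_hi) Hz: the power spectrum in
            that band is renormalized to a probability distribution and its Shannon
            entropy is computed (optionally divided by the log of the number of bins)."""
            x = np.asarray(x, dtype=float) - np.mean(x)
            psd = np.abs(np.fft.rfft(x)) ** 2
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            band = (freqs >= f_lo) & (freqs < f_hi)     # assumes fs is well above 2 * f_hi
            p = psd[band] / psd[band].sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log(p)) / (np.log(band.sum()) if normalize else 1.0))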

  19. EEG entropy measures indicate decrease of cortical information processing in Disorders of Consciousness.

    PubMed

    Thul, Alexander; Lechinger, Julia; Donis, Johann; Michitsch, Gabriele; Pichler, Gerald; Kochs, Eberhard F; Jordan, Denis; Ilg, Rüdiger; Schabus, Manuel

    2016-02-01

    Clinical assessments that rely on behavioral responses to differentiate Disorders of Consciousness are at times inapt because of some patients' motor disabilities. To objectify patients' conditions of reduced consciousness the present study evaluated the use of electroencephalography to measure residual brain activity. We analyzed entropy values of 18 scalp EEG channels of 15 severely brain-damaged patients with clinically diagnosed Minimally-Conscious-State (MCS) or Unresponsive-Wakefulness-Syndrome (UWS) and compared the results to a sample of 24 control subjects. Permutation entropy (PeEn) and symbolic transfer entropy (STEn), reflecting information processes in the EEG, were calculated for all subjects. Participants were tested on a modified active own-name paradigm to identify correlates of active instruction following. PeEn showed reduced local information content in the EEG in patients, that was most pronounced in UWS. STEn analysis revealed altered directed information flow in the EEG of patients, indicating impaired feed-backward connectivity. Responses to auditory stimulation yielded differences in entropy measures, indicating reduced information processing in MCS and UWS. Local EEG information content and information flow are affected in Disorders of Consciousness. This suggests local cortical information capacity and feedback information transfer as neural correlates of consciousness. The utilized EEG entropy analyses were able to relate to patient groups with different Disorders of Consciousness. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  20. A secure image encryption method based on dynamic harmony search (DHS) combined with chaotic map

    NASA Astrophysics Data System (ADS)

    Mirzaei Talarposhti, Khadijeh; Khaki Jamei, Mehrzad

    2016-06-01

    In recent years, there has been increasing interest in the security of digital images. This study focuses on grayscale image encryption using dynamic harmony search (DHS). In this research, a chaotic map is first used to create cipher images, and then the maximum entropy and minimum correlation coefficient are obtained by applying a harmony search algorithm to them. This process is divided into two steps. In the first step, diffusion of the plain image is performed using DHS with entropy maximization as the fitness function. In the second step, horizontal and vertical permutations are applied to the best cipher image obtained in the previous step, with DHS used to minimize the correlation coefficient as the fitness function. The simulation results have shown that by using the proposed method, the maximum entropy and the minimum correlation coefficient, which are approximately 7.9998 and 0.0001, respectively, have been obtained.
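    The two fitness functions named in this record (maximizing image entropy, minimizing adjacent-pixel correlation) can be evaluated for an 8-bit grayscale image as in the following sketch. The harmony search itself is not reproduced, and the random "cipher" image is a placeholder.

    ```python
    import numpy as np

    def image_entropy(img):
        """Shannon entropy (bits) of an 8-bit grayscale image's histogram."""
        hist = np.bincount(img.ravel().astype(np.int64), minlength=256).astype(float)
        p = hist[hist > 0] / hist.sum()
        return -np.sum(p * np.log2(p))

    def adjacent_correlation(img):
        """Pearson correlation between horizontally adjacent pixel pairs."""
        left = img[:, :-1].ravel().astype(float)
        right = img[:, 1:].ravel().astype(float)
        return np.corrcoef(left, right)[0, 1]

    # A well-encrypted image should approach entropy 8 bits and correlation 0.
    rng = np.random.default_rng(1)
    cipher = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
    print(image_entropy(cipher), adjacent_correlation(cipher))
    ```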

  1. Classification of Partial Discharge Signals by Combining Adaptive Local Iterative Filtering and Entropy Features

    PubMed Central

    Morison, Gordon; Boreham, Philip

    2018-01-01

    Electromagnetic Interference (EMI) is a technique for capturing Partial Discharge (PD) signals in High-Voltage (HV) power plant apparatus. EMI signals can be non-stationary, which makes their analysis difficult, particularly for pattern recognition applications. This paper elaborates upon a previously developed software condition-monitoring model for improved EMI event classification based on time-frequency signal decomposition and entropy features. The idea of the proposed method is to map multiple discharge source signals captured by EMI and labelled by experts, including PD, from the time domain to a feature space, which aids in the interpretation of subsequent fault information. Here, instead of using only one permutation entropy measure, a more robust measure, called Dispersion Entropy (DE), is added to the feature vector. Multi-Class Support Vector Machine (MCSVM) methods are utilized for classification of the different discharge sources. Results show an improved classification accuracy compared to previously proposed methods. This leads to the successful development of an expert-knowledge-based intelligent system. Since this method is demonstrated to be successful with real field data, it brings the benefit of possible real-world application for EMI condition monitoring. PMID:29385030

  2. Permutational symmetries for coincidence rates in multimode multiphotonic interferometry

    NASA Astrophysics Data System (ADS)

    Khalid, Abdullah; Spivak, Dylan; Sanders, Barry C.; de Guise, Hubert

    2018-06-01

    We obtain coincidence rates for passive optical interferometry by exploiting the permutational symmetries of partially distinguishable input photons, and our approach elucidates qualitative features of multiphoton coincidence landscapes. We treat the interferometer input as a product state of any number of photons in each input mode with photons distinguished by their arrival time. Detectors at the output of the interferometer count photons from each output mode over a long integration time. We generalize and prove the claim of Tillmann et al. [Phys. Rev. X 5, 041015 (2015), 10.1103/PhysRevX.5.041015] that coincidence rates can be elegantly expressed in terms of immanants. Immanants are functions of matrices that exhibit permutational symmetries and the immanants appearing in our coincidence-rate expressions share permutational symmetries with the input state. Our results are obtained by employing representation theory of the symmetric group to analyze systems of an arbitrary number of photons in arbitrarily sized interferometers.

  3. MEMD-enhanced multivariate fuzzy entropy for the evaluation of complexity in biomedical signals.

    PubMed

    Azami, Hamed; Smith, Keith; Escudero, Javier

    2016-08-01

    Multivariate multiscale entropy (mvMSE) has been proposed as a combination of the coarse-graining process and multivariate sample entropy (mvSE) to quantify the irregularity of multivariate signals. However, both the coarse-graining process and mvSE may not be reliable for short signals. Although the coarse-graining process can be replaced with multivariate empirical mode decomposition (MEMD), the relative instability of mvSE for short signals remains a problem. Here, we address this issue by proposing the multivariate fuzzy entropy (mvFE) with a new fuzzy membership function. The results using white Gaussian noise show that the mvFE leads to more reliable and stable results, especially for short signals, in comparison with mvSE. Accordingly, we propose MEMD-enhanced mvFE to quantify the complexity of signals. The characteristics of brain regions influenced by partial epilepsy are investigated by focal and non-focal electroencephalogram (EEG) time series. In this sense, the proposed MEMD-enhanced mvFE and mvSE are employed to discriminate focal EEG signals from non-focal ones. The results demonstrate the MEMD-enhanced mvFE values have a smaller coefficient of variation in comparison with those obtained by the MEMD-enhanced mvSE, even for long signals. The results also show that the MEMD-enhanced mvFE has better performance to quantify focal and non-focal signals compared with multivariate multiscale permutation entropy.

  4. Scale-specific effects: A report on multiscale analysis of acupunctured EEG in entropy and power

    NASA Astrophysics Data System (ADS)

    Song, Zhenxi; Deng, Bin; Wei, Xile; Cai, Lihui; Yu, Haitao; Wang, Jiang; Wang, Ruofan; Chen, Yingyuan

    2018-02-01

    Investigating acupuncture effects contributes to improving clinical application and to understanding neuronal dynamics under external stimulation. In this report, we recorded electroencephalography (EEG) signals evoked by acupuncture at the ST36 acupoint with three stimulus frequencies of 50, 100 and 200 times per minute, and selected non-acupuncture EEGs as the control group. Multiscale analyses were introduced to investigate the possible acupuncture effects on complexity and power at the multiscale level. Using multiscale weighted-permutation entropy, we found significant acupuncture-induced increases in the complexity of EEG signals. The comparison of the three stimulation manipulations showed that 100 times/min generated the most pronounced effects and affected the most cortical regions. By estimating average power spectral density, we found decreased power induced by acupuncture. The joint distribution of entropy and power indicated an inverse correlation, and this relationship was weakened by acupuncture effects, especially under the manipulation of 100 times/min frequency. The above findings are more evident and stable at large scales than at small scales, which suggests that multiscale analysis allows evaluating significant effects at specific scales and makes it possible to probe the inherent characteristics underlying physiological signals.
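    The multiscale analyses referred to in this record rest on coarse-graining: averaging consecutive non-overlapping windows of length s before computing an entropy (or power) measure at each scale s. A generic sketch follows; the weighting scheme of weighted-permutation entropy is omitted, and the stand-in measure is arbitrary.

    ```python
    import numpy as np

    def coarse_grain(x, scale):
        """Average consecutive, non-overlapping windows of length `scale`."""
        x = np.asarray(x, dtype=float)
        n = (len(x) // scale) * scale
        return x[:n].reshape(-1, scale).mean(axis=1)

    def multiscale_measure(x, measure, scales=range(1, 11)):
        """Apply a scalar measure (e.g. a permutation entropy function) per scale."""
        return {s: measure(coarse_grain(x, s)) for s in scales}

    # Illustrative use with a trivial stand-in measure (variance) on white noise.
    rng = np.random.default_rng(2)
    print(multiscale_measure(rng.standard_normal(10000), np.var, scales=[1, 2, 5, 10]))
    ```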

  5. Classification enhancement for post-stroke dementia using fuzzy neighborhood preserving analysis with QR-decomposition.

    PubMed

    Al-Qazzaz, Noor Kamal; Ali, Sawal; Ahmad, Siti Anom; Escudero, Javier

    2017-07-01

    The aim of the present study was to discriminate the electroencephalogram (EEG) of 5 patients with vascular dementia (VaD), 15 patients with stroke-related mild cognitive impairment (MCI), and 15 control normal subjects during a working memory (WM) task. We used independent component analysis (ICA) and wavelet transform (WT) as a hybrid preprocessing approach for EEG artifact removal. Three different features were extracted from the cleaned EEG signals: spectral entropy (SpecEn), permutation entropy (PerEn) and Tsallis entropy (TsEn). Two classification schemes were applied - support vector machine (SVM) and k-nearest neighbors (kNN) - with fuzzy neighborhood preserving analysis with QR-decomposition (FNPAQR) as a dimensionality reduction technique. The FNPAQR dimensionality reduction technique increased the SVM classification accuracy from 82.22% to 90.37% and from 82.6% to 86.67% for kNN. These results suggest that FNPAQR consistently improves the discrimination of VaD patients, MCI patients, and control normal subjects, and that it could be a useful feature-selection step to help identify patients with VaD and MCI.

  6. Design of an image encryption scheme based on a multiple chaotic map

    NASA Astrophysics Data System (ADS)

    Tong, Xiao-Jun

    2013-07-01

    In order to address the problems that chaos degenerates under limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy, and its chaotic characteristics are proved by the Devaney definition. In order to produce a large key space, a Cat map named the block Cat map is also designed for the permutation process based on multiple-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by different chaotic maps. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis depending on key and plaintext are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with the AES, DES and Logistic encryption methods, we conclude that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and has higher speed and higher security.

  7. Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map

    PubMed Central

    2014-01-01

    We present a novel image encryption algorithm using Chebyshev polynomial based on permutation and substitution and Duffing map based on substitution. Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity test, and speed test. The study demonstrates that the proposed image encryption algorithm shows the advantages of a key space larger than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970

  8. Characterization of complexities in combustion instability in a lean premixed gas-turbine model combustor.

    PubMed

    Gotoda, Hiroshi; Amano, Masahito; Miyano, Takaya; Ikawa, Takuya; Maki, Koshiro; Tachibana, Shigeru

    2012-12-01

    We characterize complexities in combustion instability in a lean premixed gas-turbine model combustor by nonlinear time series analysis to evaluate permutation entropy, fractal dimensions, and short-term predictability. The dynamic behavior in combustion instability near lean blowout exhibits a self-affine structure and is ascribed to fractional Brownian motion. It undergoes chaos by the onset of combustion oscillations with slow amplitude modulation. Our results indicate that nonlinear time series analysis is capable of characterizing complexities in combustion instability close to lean blowout.

  9. Distribution entropy analysis of epileptic EEG signals.

    PubMed

    Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun

    2015-01-01

    It is an open-ended challenge to accurately detect epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals by advanced nonlinear entropy methods, such as the approximate entropy, sample entropy, fuzzy entropy, and permutation entropy, etc. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three different measurement protocols were set for better understanding the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapping 5 second segments; and iii) calculate it by averaging the DistEn values for all possible non-overlapping segments of 1 second length. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Besides, the results obtained under the third protocol, which only used very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the interictal class in comparison with the normal class, whereas both analyses using relatively long EEG signals failed to track this difference between them, which may be due to a nonstationarity effect on the entropy algorithm. The capability of discriminating between normal and interictal EEG signals is of great clinical relevance since it may provide helpful tools for the detection of seizure onset. Therefore, our study suggests that the DistEn analysis of EEG signals is very promising for clinical and even portable EEG monitoring.
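    Protocols (ii) and (iii) above amount to splitting a recording into non-overlapping segments of fixed duration and averaging an entropy measure over them. The generic sketch below shows that procedure; the entropy function, sampling rate, and signal are placeholders rather than the study's actual DistEn implementation.

    ```python
    import numpy as np

    def segment_average(x, fs, seg_seconds, entropy_fn):
        """Average an entropy measure over non-overlapping segments of a recording."""
        seg_len = int(seg_seconds * fs)
        n_segs = len(x) // seg_len
        values = [entropy_fn(x[i * seg_len:(i + 1) * seg_len]) for i in range(n_segs)]
        return float(np.mean(values))

    # Illustrative use with a placeholder "entropy" (standard deviation) at a
    # hypothetical 200 Hz sampling rate and 1 s segments.
    rng = np.random.default_rng(3)
    signal = rng.standard_normal(200 * 20)   # 20 s of surrogate data
    print(segment_average(signal, fs=200, seg_seconds=1.0, entropy_fn=np.std))
    ```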

  10. Decreased Complexity in Alzheimer's Disease: Resting-State fMRI Evidence of Brain Entropy Mapping.

    PubMed

    Wang, Bin; Niu, Yan; Miao, Liwen; Cao, Rui; Yan, Pengfei; Guo, Hao; Li, Dandan; Guo, Yuxiang; Yan, Tianyi; Wu, Jinglong; Xiang, Jie; Zhang, Hui

    2017-01-01

    Alzheimer's disease (AD) is a frequently observed, irreversible brain function disorder among elderly individuals. Resting-state functional magnetic resonance imaging (rs-fMRI) has been introduced as an alternative approach to assessing brain functional abnormalities in AD patients. However, alterations in the brain rs-fMRI signal complexities in mild cognitive impairment (MCI) and AD patients remain unclear. Here, we described the novel application of permutation entropy (PE) to investigate the abnormal complexity of rs-fMRI signals in MCI and AD patients. The rs-fMRI signals of 30 normal controls (NCs), 33 early MCI (EMCI), 32 late MCI (LMCI), and 29 AD patients were obtained from the Alzheimer's disease Neuroimaging Initiative (ADNI) database. After preprocessing, whole-brain entropy maps of the four groups were extracted and subjected to Gaussian smoothing. We performed a one-way analysis of variance (ANOVA) on the brain entropy maps of the four groups. The results after adjusting for age and sex differences together revealed that the patients with AD exhibited lower complexity than did the MCI and NC controls. We found five clusters that exhibited significant differences and were distributed primarily in the occipital, frontal, and temporal lobes. The average PE of the five clusters exhibited a decreasing trend from MCI to AD. The AD group exhibited the least complexity. Additionally, the average PE of the five clusters was significantly positively correlated with the Mini-Mental State Examination (MMSE) scores and significantly negatively correlated with Functional Assessment Questionnaire (FAQ) scores and global Clinical Dementia Rating (CDR) scores in the patient groups. Significant correlations were also found between the PE and regional homogeneity (ReHo) in the patient groups. These results indicated that declines in PE might be related to changes in regional functional homogeneity in AD. These findings suggested that complexity analyses using PE in rs-fMRI signals can provide important information about the fMRI characteristics of cognitive impairments in MCI and AD.

  11. Decreased Complexity in Alzheimer's Disease: Resting-State fMRI Evidence of Brain Entropy Mapping

    PubMed Central

    Wang, Bin; Niu, Yan; Miao, Liwen; Cao, Rui; Yan, Pengfei; Guo, Hao; Li, Dandan; Guo, Yuxiang; Yan, Tianyi; Wu, Jinglong; Xiang, Jie; Zhang, Hui

    2017-01-01

    Alzheimer's disease (AD) is a frequently observed, irreversible brain function disorder among elderly individuals. Resting-state functional magnetic resonance imaging (rs-fMRI) has been introduced as an alternative approach to assessing brain functional abnormalities in AD patients. However, alterations in the brain rs-fMRI signal complexities in mild cognitive impairment (MCI) and AD patients remain unclear. Here, we described the novel application of permutation entropy (PE) to investigate the abnormal complexity of rs-fMRI signals in MCI and AD patients. The rs-fMRI signals of 30 normal controls (NCs), 33 early MCI (EMCI), 32 late MCI (LMCI), and 29 AD patients were obtained from the Alzheimer's disease Neuroimaging Initiative (ADNI) database. After preprocessing, whole-brain entropy maps of the four groups were extracted and subjected to Gaussian smoothing. We performed a one-way analysis of variance (ANOVA) on the brain entropy maps of the four groups. The results after adjusting for age and sex differences together revealed that the patients with AD exhibited lower complexity than did the MCI and NC controls. We found five clusters that exhibited significant differences and were distributed primarily in the occipital, frontal, and temporal lobes. The average PE of the five clusters exhibited a decreasing trend from MCI to AD. The AD group exhibited the least complexity. Additionally, the average PE of the five clusters was significantly positively correlated with the Mini-Mental State Examination (MMSE) scores and significantly negatively correlated with Functional Assessment Questionnaire (FAQ) scores and global Clinical Dementia Rating (CDR) scores in the patient groups. Significant correlations were also found between the PE and regional homogeneity (ReHo) in the patient groups. These results indicated that declines in PE might be related to changes in regional functional homogeneity in AD. These findings suggested that complexity analyses using PE in rs-fMRI signals can provide important information about the fMRI characteristics of cognitive impairments in MCI and AD. PMID:29209199

  12. Comparing vector-based and Bayesian memory models using large-scale datasets: User-generated hashtag and tag prediction on Twitter and Stack Overflow.

    PubMed

    Stanley, Clayton; Byrne, Michael D

    2016-12-01

    The growth of social media and user-created content on online sites provides unique opportunities to study models of human declarative memory. By framing the task of choosing a hashtag for a tweet and tagging a post on Stack Overflow as a declarative memory retrieval problem, 2 cognitively plausible declarative memory models were applied to millions of posts and tweets and evaluated on how accurately they predict a user's chosen tags. An ACT-R based Bayesian model and a random permutation vector-based model were tested on the large data sets. The results show that past user behavior of tag use is a strong predictor of future behavior. Furthermore, past behavior was successfully incorporated into the random permutation model that previously used only context. Also, ACT-R's attentional weight term was linked to an entropy-weighting natural language processing method used to attenuate high-frequency words (e.g., articles and prepositions). Word order was not found to be a strong predictor of tag use, and the random permutation model performed comparably to the Bayesian model without including word order. This shows that the strength of the random permutation model is not in the ability to represent word order, but rather in the way in which context information is successfully compressed. The results of the large-scale exploration show how the architecture of the 2 memory models can be modified to significantly improve accuracy, and may suggest task-independent general modifications that can help improve model fit to human data in a much wider range of domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Fourier-Mellin moment-based intertwining map for image encryption

    NASA Astrophysics Data System (ADS)

    Kaur, Manjit; Kumar, Vijay

    2018-03-01

    In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and intertwining logistic map is proposed. Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity of an input image. Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on input image using secret keys. The performance of proposed image encryption technique has been evaluated on five well-known benchmark images and also compared with seven well-known existing encryption techniques. The experimental results reveal that the proposed technique outperforms others in terms of entropy, correlation analysis, a unified average changing intensity and the number of changing pixel rate. The simulation results reveal that the proposed technique provides high level of security and robustness against various types of attacks.

  14. A Novel Bit-level Image Encryption Method Based on Chaotic Map and Dynamic Grouping

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Ji; Shen, Yan

    2012-10-01

    In this paper, a novel bit-level image encryption method based on dynamic grouping is proposed. In the proposed method, the plain-image is divided randomly into several groups, and then a permutation-diffusion process at the bit level is carried out. The keystream generated by the logistic map is related to the plain-image, which confuses the relationship between the plain-image and the cipher-image. The computer simulation results of statistical analysis, information entropy analysis and sensitivity analysis show that the proposed encryption method is secure and reliable enough to be used for communication applications.
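    As background to the permutation-diffusion scheme summarized above, the sketch below generates a byte keystream from the logistic map x(n+1) = r*x(n)*(1 - x(n)) and couples it to the plain image through its mean value. The coupling rule, parameters, and XOR diffusion step are illustrative assumptions, not the paper's actual algorithm.

    ```python
    import numpy as np

    def logistic_keystream(length, x0=0.3456, r=3.99, plain_image=None):
        """Illustrative byte keystream from the logistic map x -> r*x*(1 - x)."""
        if plain_image is not None:
            # Hypothetical plaintext dependence: perturb the seed by the image mean.
            x0 = (x0 + float(np.mean(plain_image)) / 256.0) % 1.0 or 0.5
        stream = np.empty(length, dtype=np.uint8)
        x = x0
        for i in range(length):
            x = r * x * (1.0 - x)
            stream[i] = int(x * 256) % 256
        return stream

    # Illustrative XOR diffusion of a flattened 8-bit image with the keystream.
    rng = np.random.default_rng(4)
    img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    ks = logistic_keystream(img.size, plain_image=img)
    cipher = (img.ravel() ^ ks).reshape(img.shape)
    print(cipher[:2, :8])
    ```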

  15. Electromyographic Permutation Entropy Quantifies Diaphragmatic Denervation and Reinnervation

    PubMed Central

    Kretschmer, Alexander; Lehmeyer, Veronika; Kellermann, Kristine; Schaller, Stephan J.; Blobner, Manfred; Kochs, Eberhard F.; Fink, Heidrun

    2014-01-01

    Spontaneous reinnervation after diaphragmatic paralysis due to trauma, surgery, tumors and spinal cord injuries is frequently observed. A possible explanation could be collateral reinnervation, since the diaphragm is commonly double-innervated by the (accessory) phrenic nerve. Permutation entropy (PeEn), a complexity measure for time series, may reflect a functional state of neuromuscular transmission by quantifying the complexity of interactions across neural and muscular networks. In an established rat model, electromyographic signals of the diaphragm after phrenicotomy were analyzed using PeEn quantifying denervation and reinnervation. Thirty-three anesthetized rats were unilaterally phrenicotomized. After 1, 3, 9, 27 and 81 days, diaphragmatic electromyographic PeEn was analyzed in vivo from sternal, mid-costal and crural areas of both hemidiaphragms. After euthanasia of the animals, both hemidiaphragms were dissected for fiber type evaluation. The electromyographic incidence of an accessory phrenic nerve was 76%. At day 1 after phrenicotomy, PeEn (normalized values) was significantly diminished in the sternal (median: 0.69; interquartile range: 0.66–0.75) and mid-costal area (0.68; 0.66–0.72) compared to the non-denervated side (0.84; 0.78–0.90) at threshold p<0.05. In the crural area, innervated by the accessory phrenic nerve, PeEn remained unchanged (0.79; 0.72–0.86). During reinnervation over 81 days, PeEn normalized in the mid-costal area (0.84; 0.77–0.86), whereas it remained reduced in the sternal area (0.77; 0.70–0.81). Fiber type grouping, a histological sign for reinnervation, was found in the mid-costal area in 20% after 27 days and in 80% after 81 days. Collateral reinnervation can restore diaphragm activity after phrenicotomy. Electromyographic PeEn represents a new, distinctive assessment characterizing intramuscular function following denervation and reinnervation. PMID:25532023

  16. OmpF, a nucleotide-sensing nanoprobe, computational evaluation of single channel activities

    NASA Astrophysics Data System (ADS)

    Abdolvahab, R. H.; Mobasheri, H.; Nikouee, A.; Ejtehadi, M. R.

    2016-09-01

    The results of high-throughput practical single-channel experiments should be formulated and validated by signal analysis approaches to increase the recognition precision of translocating molecules. For this purpose, the activities of the single nano-pore forming protein, OmpF, in the presence of nucleotides were recorded in real time by the voltage clamp technique and used as a means for nucleotide recognition. The results were analyzed based on the permutation entropy of current Time Series (TS), fractality, autocorrelation, structure function, spectral density, and peak fraction to recognize each nucleotide, based on its signature effect on the conductance, gating frequency and voltage sensitivity of the channel at different concentrations and membrane potentials. The amplitude and frequency of ion current fluctuation increased in the presence of Adenine more than Cytosine and Thymine in milli-molar (0.5 mM) concentrations. The variance of the current TS at various applied voltages showed a non-monotonic trend whose initial increasing slope in the presence of Thymine changed to a decreasing one in the second phase and was different from that of Adenine and Cytosine; e.g., by increasing the voltage from 40 to 140 mV at the 0.5 mM concentration of Adenine or Cytosine, the variance decreased by one third while for the case of Thymine it was doubled. Moreover, according to the structure function of the TS, the fractality of the current TS differed as a function of varying membrane potentials (pd) and nucleotide concentrations. Accordingly, the calculated permutation entropy of the TS validated the biophysical approach defined for the recognition of different nucleotides at various concentrations, pd's and polarities. Thus, the promising outcomes of the combined experimental and theoretical methodologies presented here can be implemented as a complementary means in pore-based nucleotide recognition approaches.

  17. Complexity-Entropy Causality Plane as a Complexity Measure for Two-Dimensional Patterns

    PubMed Central

    Ribeiro, Haroldo V.; Zunino, Luciano; Lenzi, Ervin K.; Santoro, Perseu A.; Mendes, Renio S.

    2012-01-01

    Complexity measures are essential to understand complex systems and there are numerous definitions to analyze one-dimensional data. However, extensions of these approaches to two or higher-dimensional data, such as images, are much less common. Here, we reduce this gap by applying the ideas of the permutation entropy combined with a relative entropic index. We build up a numerical procedure that can be easily implemented to evaluate the complexity of two or higher-dimensional patterns. We work out this method in different scenarios where numerical experiments and empirical data were taken into account. Specifically, we have applied the method to fractal landscapes generated numerically where we compare our measures with the Hurst exponent; liquid crystal textures where nematic-isotropic-nematic phase transitions were properly identified; 12 characteristic textures of liquid crystals where the different values show that the method can distinguish different phases; and Ising surfaces where our method identified the critical temperature and also proved to be stable. PMID:22916097
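    One possible reading of the two-dimensional extension described above is to form ordinal patterns from small d_x × d_y sub-matrices of the array and take the Shannon entropy of their distribution. The sketch below follows that idea with 2 × 2 patches; the patch size is an assumption, and the accompanying statistical complexity measure used by the authors is omitted.

    ```python
    import math
    from collections import Counter
    import numpy as np

    def permutation_entropy_2d(img, dx=2, dy=2, normalize=True):
        """Ordinal-pattern entropy of a 2-D array using dx-by-dy sliding patches."""
        img = np.asarray(img, dtype=float)
        rows, cols = img.shape
        patterns = Counter()
        for i in range(rows - dx + 1):
            for j in range(cols - dy + 1):
                patch = img[i:i + dx, j:j + dy].ravel()
                patterns[tuple(np.argsort(patch))] += 1
        total = sum(patterns.values())
        probs = np.array(list(patterns.values()), dtype=float) / total
        h = -np.sum(probs * np.log(probs))
        return h / math.log(math.factorial(dx * dy)) if normalize else h

    # Illustrative check: an uncorrelated random surface should score close to 1.
    rng = np.random.default_rng(5)
    print(permutation_entropy_2d(rng.standard_normal((128, 128))))
    ```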

  18. Analysis of spontaneous EEG activity in Alzheimer's disease using cross-sample entropy and graph theory.

    PubMed

    Gomez, Carlos; Poza, Jesus; Gomez-Pilar, Javier; Bachiller, Alejandro; Juan-Cruz, Celia; Tola-Arribas, Miguel A; Carreres, Alicia; Cano, Monica; Hornero, Roberto

    2016-08-01

    The aim of this pilot study was to analyze spontaneous electroencephalography (EEG) activity in Alzheimer's disease (AD) by means of Cross-Sample Entropy (Cross-SampEn) and two local measures derived from graph theory: clustering coefficient (CC) and characteristic path length (PL). Five minutes of EEG activity were recorded from 37 patients with dementia due to AD and 29 elderly controls. Our results showed that Cross-SampEn values were lower in the AD group than in the control one for all the interactions among EEG channels. This finding indicates that EEG activity in AD is characterized by a lower statistical dissimilarity among channels. Significant differences were found mainly for fronto-central interactions (p < 0.01, permutation test). Additionally, the application of graph theory measures revealed diverse neural network changes, i.e. lower CC and higher PL values in the AD group, leading to a less efficient brain organization. This study suggests the usefulness of our approach to provide further insights into the underlying brain dynamics associated with AD.

  19. Linear models: permutation methods

    USGS Publications Warehouse

    Cade, B.S.; Everitt, B.S.; Howell, D.C.

    2005-01-01

    Permutation tests (see Permutation Based Inference) for the linear model have applications in behavioral studies when traditional parametric assumptions about the error term in a linear model are not tenable. Improved validity of Type I error rates can be achieved with properly constructed permutation tests. Perhaps more importantly, increased statistical power, improved robustness to effects of outliers, and detection of alternative distributional differences can be achieved by coupling permutation inference with alternative linear model estimators. For example, it is well known that estimates of the mean in a linear model are extremely sensitive to even a single outlying value of the dependent variable compared to estimates of the median [7, 19]. Traditionally, linear modeling focused on estimating changes in the center of distributions (means or medians). However, quantile regression allows distributional changes to be estimated in all or any selected part of a distribution of responses, providing a more complete statistical picture that has relevance to many biological questions [6]...
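    As a concrete instance of permutation inference for a linear model, the sketch below tests the slope of a simple regression by repeatedly permuting the response and re-estimating the statistic. The data, the number of permutations, and the choice to permute raw responses (rather than residuals under a reduced model) are illustrative assumptions.

    ```python
    import numpy as np

    def permutation_test_slope(x, y, n_perm=5000, rng=None):
        """Two-sided permutation p-value for the OLS slope of y on x."""
        rng = np.random.default_rng() if rng is None else rng
        x, y = np.asarray(x, float), np.asarray(y, float)
        slope = np.polyfit(x, y, 1)[0]
        exceed = 0
        for _ in range(n_perm):
            perm_slope = np.polyfit(x, rng.permutation(y), 1)[0]
            if abs(perm_slope) >= abs(slope):
                exceed += 1
        return slope, (exceed + 1) / (n_perm + 1)

    # Hypothetical data: a weak linear trend buried in heavy-tailed noise.
    rng = np.random.default_rng(6)
    x = np.arange(100, dtype=float)
    y = 0.02 * x + rng.standard_t(df=2, size=100)
    print(permutation_test_slope(x, y, rng=rng))
    ```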

  20. Exact analytical thermodynamic expressions for a Brownian heat engine

    NASA Astrophysics Data System (ADS)

    Taye, Mesfin Asfaw

    2015-09-01

    The nonequilibrium thermodynamics feature of a Brownian motor operating between two different heat baths is explored as a function of time t . Using the Gibbs entropy and Schnakenberg microscopic stochastic approach, we find exact closed form expressions for the free energy, the rate of entropy production, and the rate of entropy flow from the system to the outside. We show that when the system is out of equilibrium, it constantly produces entropy and at the same time extracts entropy out of the system. Its entropy production and extraction rates decrease in time and saturate to a constant value. In the long time limit, the rate of entropy production balances the rate of entropy extraction, and at equilibrium both entropy production and extraction rates become zero. Furthermore, via the present model, many thermodynamic theories can be checked.

  1. Exact analytical thermodynamic expressions for a Brownian heat engine.

    PubMed

    Taye, Mesfin Asfaw

    2015-09-01

    The nonequilibrium thermodynamics feature of a Brownian motor operating between two different heat baths is explored as a function of time t. Using the Gibbs entropy and Schnakenberg microscopic stochastic approach, we find exact closed form expressions for the free energy, the rate of entropy production, and the rate of entropy flow from the system to the outside. We show that when the system is out of equilibrium, it constantly produces entropy and at the same time extracts entropy out of the system. Its entropy production and extraction rates decrease in time and saturate to a constant value. In the long time limit, the rate of entropy production balances the rate of entropy extraction, and at equilibrium both entropy production and extraction rates become zero. Furthermore, via the present model, many thermodynamic theories can be checked.

  2. Modulation of a protein free-energy landscape by circular permutation.

    PubMed

    Radou, Gaël; Enciso, Marta; Krivov, Sergei; Paci, Emanuele

    2013-11-07

    Circular permutations usually retain the native structure and function of a protein while inevitably perturbing its folding dynamics. By using simulations with a structure-based model and a rigorous methodology to determine free-energy surfaces from trajectories, we evaluate the effect of a circular permutation on the free-energy landscape of the protein T4 lysozyme. We observe changes which, although subtle, largely affect the cooperativity between the two subdomains. Such a change in cooperativity has been previously experimentally observed and recently also characterized using single molecule optical tweezers and the Crooks relation. The free-energy landscapes show that both the wild type and circular permutant have an on-pathway intermediate, previously experimentally characterized, in which one of the subdomains is completely formed. The landscapes, however, differ in the position of the rate-limiting step for folding, which occurs before the intermediate in the wild type and after in the circular permutant. This shift of transition state explains the observed change in the cooperativity. The underlying free-energy landscape thus provides a microscopic description of the folding dynamics and the connection between circular permutation and the loss of cooperativity experimentally observed.

  3. Monitoring the Depth of Anesthesia Using a New Adaptive Neurofuzzy System.

    PubMed

    Shalbaf, Ahmad; Saffar, Mohsen; Sleigh, Jamie W; Shalbaf, Reza

    2018-05-01

    Accurate and noninvasive monitoring of the depth of anesthesia (DoA) is highly desirable. Since anesthetic drugs act mainly on the central nervous system, the analysis of brain activity using the electroencephalogram (EEG) is very useful. This paper proposes a novel automated method for assessing the DoA using EEG. First, 11 features including spectral, fractal, and entropy measures are extracted from the EEG signal and then, by applying an algorithm based on an exhaustive search of all subsets of features, a combination of the best features (Beta-index, sample entropy, Shannon permutation entropy, and detrended fluctuation analysis) is selected. Accordingly, we feed these extracted features to a new neurofuzzy classification algorithm, the adaptive neurofuzzy inference system with linguistic hedges (ANFIS-LH). This structure can successfully model systems with nonlinear relationships between input and output, and also classify overlapping classes accurately. ANFIS-LH, which is based on modified classical fuzzy rules, reduces the effects of insignificant features in the input space, which cause overlapping, and modifies the output layer structure. The presented method classifies EEG data into awake, light, general, and deep states during anesthesia with sevoflurane in 17 patients. Its accuracy is 92% compared to a commercial monitoring system (response entropy index). Moreover, this method reaches a classification accuracy of 93% in categorizing EEG signals into awake and general anesthesia states on another database of propofol and volatile anesthesia in 50 patients. To sum up, this method is potentially applicable to a new real-time monitoring system to help the anesthesiologist with continuous assessment of DoA quickly and accurately.

  4. Entropy generation of nanofluid flow in a microchannel heat sink

    NASA Astrophysics Data System (ADS)

    Manay, Eyuphan; Akyürek, Eda Feyza; Sahin, Bayram

    2018-06-01

    The present study aims to investigate the effects of the presence of nano-sized TiO2 particles in the base fluid on the entropy generation rate in a microchannel heat sink. Pure water was chosen as the base fluid, and TiO2 particles were suspended in the pure water at five different particle volume fractions of 0.25%, 0.5%, 1.0%, 1.5% and 2.0%. Under laminar, steady state flow and constant heat flux boundary conditions, the thermal, frictional and total entropy generation rates and entropy generation number ratios of the nanofluids were experimentally analyzed in microchannel flow for different channel heights of 200 μm, 300 μm, 400 μm and 500 μm. It was observed that the frictional and total entropy generation rates increased while the thermal entropy generation rate decreased with an increase in particle volume fraction. In microchannel flows, thermal entropy generation could be neglected due to its very low rate, smaller than 1.10 × 10^-7, in the total entropy generation. Higher channel heights caused higher thermal entropy generation rates, and increasing the channel height yielded an increase from 30% to 52% in thermal entropy generation. When the channel height decreased, an increase of 66%-98% in frictional entropy generation was obtained. Adding TiO2 nanoparticles into the base fluid caused thermal entropy generation to decrease by about 1.8%-32.4% and frictional entropy generation to increase by about 3.3%-21.6%.

  5. Exact solutions for the entropy production rate of several irreversible processes.

    PubMed

    Ross, John; Vlad, Marcel O

    2005-11-24

    We investigate thermal conduction described by Newton's law of cooling and by Fourier's transport equation and chemical reactions based on mass action kinetics where we detail a simple example of a reaction mechanism with one intermediate. In these cases we derive exact expressions for the entropy production rate and its differential. We show that at a stationary state the entropy production rate is an extremum if and only if the stationary state is a state of thermodynamic equilibrium. These results are exact and independent of any expansions of the entropy production rate. In the case of thermal conduction we compare our exact approach with the conventional approach based on the expansion of the entropy production rate near equilibrium. If we expand the entropy production rate in a series and keep terms up to the third order in the deviation variables and then differentiate, we find out that the entropy production rate is not an extremum at a nonequilibrium steady state. If there is a strict proportionality between fluxes and forces, then the entropy production rate is an extremum at the stationary state even if the stationary state is far away from equilibrium.

  6. Possible signatures of dissipation from time-series analysis techniques using a turbulent laboratory magnetohydrodynamic plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaffner, D. A.; Brown, M. R.; Rock, A. B.

    The frequency spectrum of magnetic fluctuations as measured on the Swarthmore Spheromak Experiment is broadband and exhibits a nearly Kolmogorov 5/3 scaling. It features a steepening region which is indicative of dissipation of magnetic fluctuation energy similar to that observed in fluid and magnetohydrodynamic turbulence systems. Two non-spectrum based time-series analysis techniques are implemented on this data set in order to seek other possible signatures of turbulent dissipation beyond just the steepening of fluctuation spectra. Presented here are results for the flatness, permutation entropy, and statistical complexity, each of which exhibits a particular character at spectral steepening scales which can then be compared to the behavior of the frequency spectrum.

  7. An entropy-based nonparametric test for the validation of surrogate endpoints.

    PubMed

    Miao, Xiaopeng; Wang, Yong-Cheng; Gangopadhyay, Ashis

    2012-06-30

    We present a nonparametric test to validate surrogate endpoints based on measure of divergence and random permutation. This test is a proposal to directly verify the Prentice statistical definition of surrogacy. The test does not impose distributional assumptions on the endpoints, and it is robust to model misspecification. Our simulation study shows that the proposed nonparametric test outperforms the practical test of the Prentice criterion in terms of both robustness of size and power. We also evaluate the performance of three leading methods that attempt to quantify the effect of surrogate endpoints. The proposed method is applied to validate magnetic resonance imaging lesions as the surrogate endpoint for clinical relapses in a multiple sclerosis trial. Copyright © 2012 John Wiley & Sons, Ltd.

  8. Radiative entropy generation in a gray absorbing, emitting, and scattering planar medium at radiative equilibrium

    NASA Astrophysics Data System (ADS)

    Sadeghi, Pegah; Safavinejad, Ali

    2017-11-01

    Radiative entropy generation through a gray absorbing, emitting, and scattering planar medium at radiative equilibrium with diffuse-gray walls is investigated. The radiative transfer equation and the radiative entropy generation equations are solved using the discrete ordinates method. Components of the radiative entropy generation are considered for two different boundary conditions: both walls at a prescribed temperature, and mixed boundary conditions in which one wall is at a prescribed temperature and the other at a prescribed heat flux. The effects of wall emissivities, optical thickness, single scattering albedo, and anisotropic-scattering factor on the entropy generation are investigated in detail. The results reveal that entropy generation in the system arises mainly from irreversible radiative transfer at the wall with lower temperature. The total entropy generation rate for the system with prescribed temperature at the walls increases remarkably as wall emissivity increases; conversely, for the system with mixed boundary conditions, the total entropy generation rate slightly decreases. Furthermore, as the optical thickness increases, the total entropy generation rate decreases remarkably for the system with prescribed temperature at the walls; nevertheless, for the system with mixed boundary conditions, the total entropy generation rate increases. The variation of the single scattering albedo does not considerably affect the total entropy generation rate. This parametric analysis demonstrates that the optical thickness and wall emissivities have a significant effect on the entropy generation in systems at radiative equilibrium. Accounting for the parameters that significantly affect radiative entropy generation provides an opportunity to optimize the design or to increase the overall performance and efficiency of systems at radiative equilibrium by applying entropy minimization techniques.

  9. Full spinal cord regeneration after total transection is not possible due to entropy change.

    PubMed

    Zielinski, P; Sokal, P

    2016-09-01

    Regeneration of the transected spinal cord is a main challenge of regenerative medicine. The mainstream of research is focused on the promotion of spinal axon growth, which is strongly inhibited in mammals. Assuming that the inhibition of axonal growth may ever be overcome, the complexity of neural reconnections may be the second serious obstacle to overcome. Regenerating peripheral nerve axons seem to form a random pattern of reconnections with their targets. The hypothesis is that, due to the laws of entropy or irreversible information loss, full restoration of the spinal cord after transection is not possible. The hypothesis is discussed based on several assumptions. To simplify the discussion, the spinal cord is represented by 2 million pyramidal axons. After the transection, each of these axons has to grow and reconnect with exactly matching targets below the transection, in the same number. Axons are guided by neurotrophic factors and afterwards reconnected through neuroplasticity mechanisms. Assuming random reconnections, there are 2,000,000! permutations; therefore, the chance of an ideally random but correct reconnection of pyramidal axons with their adequate targets is 1/2,000,000!. Apart from pyramidal axons, there are other axons, such as extrapyramidal, sensory and associative ones. Empirical data and analysis of neurotrophic factors and organogenesis mechanisms may seem to slightly contradict the hypothesis, but, strictly adhering to the second law of thermodynamics and the laws of entropy, full restoration of the transected cord may never be possible. Copyright © 2016 Elsevier Ltd. All rights reserved.
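    To give a sense of the scale of the factorial invoked above, the log-gamma function puts 2,000,000! at roughly 10 to the power of 1.17 × 10^7, so the quoted probability 1/2,000,000! is vanishingly small. A two-line check:

    ```python
    import math

    # log10(2,000,000!) via the log-gamma function, since Gamma(n + 1) = n!
    log10_factorial = math.lgamma(2_000_000 + 1) / math.log(10)
    print(f"2,000,000! is about 10^{log10_factorial:.3e}")   # roughly 10^(1.17e7)
    ```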

  10. Renyi entropy measures of heart rate Gaussianity.

    PubMed

    Lake, Douglas E

    2006-01-01

    Sample entropy and approximate entropy are measures that have been successfully utilized to study the deterministic dynamics of heart rate (HR). A complementary stochastic point of view and a heuristic argument using the Central Limit Theorem suggests that the Gaussianity of HR is a complementary measure of the physiological complexity of the underlying signal transduction processes. Renyi entropy (or q-entropy) is a widely used measure of Gaussianity in many applications. Particularly important members of this family are differential (or Shannon) entropy (q = 1) and quadratic entropy (q = 2). We introduce the concepts of differential and conditional Renyi entropy rate and, in conjunction with Burg's theorem, develop a measure of the Gaussianity of a linear random process. Robust algorithms for estimating these quantities are presented along with estimates of their standard errors.
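    For reference, the Renyi (q-)entropy family mentioned above is sketched below for a discrete distribution, with the Shannon case recovered in the limit q -> 1; estimating the differential and conditional rates for heart rate series, as the record describes, is a separate matter not attempted here.

    ```python
    import numpy as np

    def renyi_entropy(p, q):
        """Renyi entropy H_q = log(sum p_i^q) / (1 - q); q -> 1 gives Shannon."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if np.isclose(q, 1.0):
            return -np.sum(p * np.log(p))      # Shannon (q = 1) limit
        return np.log(np.sum(p ** q)) / (1.0 - q)

    # Illustrative use: Shannon (q = 1) and quadratic (q = 2) entropies
    # of a simple discrete distribution.
    p = [0.5, 0.25, 0.125, 0.125]
    print(renyi_entropy(p, 1.0), renyi_entropy(p, 2.0))
    ```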

  11. A Computationally Efficient Hypothesis Testing Method for Epistasis Analysis using Multifactor Dimensionality Reduction

    PubMed Central

    Pattin, Kristine A.; White, Bill C.; Barney, Nate; Gui, Jiang; Nelson, Heather H.; Kelsey, Karl R.; Andrew, Angeline S.; Karagas, Margaret R.; Moore, Jason H.

    2008-01-01

    Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free data mining method for detecting, characterizing, and interpreting epistasis in the absence of significant main effects in genetic and epidemiologic studies of complex traits such as disease susceptibility. The goal of MDR is to change the representation of the data using a constructive induction algorithm to make nonadditive interactions easier to detect using any classification method such as naïve Bayes or logistic regression. Traditionally, MDR-constructed variables have been evaluated with a naïve Bayes classifier that is combined with 10-fold cross validation to obtain an estimate of predictive accuracy or generalizability of epistasis models. Traditionally, we have used permutation testing to statistically evaluate the significance of models obtained through MDR. The advantage of permutation testing is that it controls for false-positives due to multiple testing. The disadvantage is that permutation testing is computationally expensive. This is an important issue that arises in the context of detecting epistasis on a genome-wide scale. The goal of the present study was to develop and evaluate several alternatives to large-scale permutation testing for assessing the statistical significance of MDR models. Using data simulated from 70 different epistasis models, we compared the power and type I error rate of MDR using a 1000-fold permutation test with hypothesis testing using an extreme value distribution (EVD). We find that this new hypothesis testing method provides a reasonable alternative to the computationally expensive 1000-fold permutation test and is 50 times faster. We then demonstrate this new method by applying it to a genetic epidemiology study of bladder cancer susceptibility that was previously analyzed using MDR and assessed using a 1000-fold permutation test. PMID:18671250

  12. Evaluation of Feature Extraction and Recognition for Activity Monitoring and Fall Detection Based on Wearable sEMG Sensors.

    PubMed

    Xi, Xugang; Tang, Minyan; Miran, Seyed M; Luo, Zhizeng

    2017-05-27

    As an essential subfield of context awareness, activity awareness, especially daily activity monitoring and fall detection, plays a significant role for elderly or frail people who need assistance in their daily activities. This study investigates the feature extraction and pattern recognition of surface electromyography (sEMG), with the purpose of determining the best features and classifiers of sEMG for daily living activity monitoring and fall detection. This is done through a series of experiments. In the experiments, four channels of sEMG signal from wireless, wearable sensors located on the lower limbs are recorded from three subjects while they perform seven activities of daily living (ADL). A simulated trip fall scenario is also considered with a custom-made device attached to the ankle. With this experimental setting, 15 feature extraction methods of sEMG, including time, frequency, time/frequency domain and entropy, are analyzed based on class separability and calculation complexity, and five classification methods, each with 15 features, are estimated with respect to the accuracy rate of recognition and calculation complexity for activity monitoring and fall detection. It is shown that a high accuracy rate of recognition and a minimal calculation time for daily activity monitoring and fall detection can be achieved in the current experimental setting. Specifically, the Wilson Amplitude (WAMP) feature performs the best, and the classifier Gaussian Kernel Support Vector Machine (GK-SVM) with Permutation Entropy (PE) or WAMP results in the highest accuracy for activity monitoring with recognition rates of 97.35% and 96.43%. For fall detection, the classifier Fuzzy Min-Max Neural Network (FMMNN) has the best sensitivity and specificity at the cost of the longest calculation time, while the classifier Gaussian Kernel Fisher Linear Discriminant Analysis (GK-FDA) with the feature WAMP guarantees a high sensitivity (98.70%) and specificity (98.59%) with a short calculation time (65.586 ms), making it a possible choice for pre-impact fall detection. The thorough quantitative comparison of the features and classifiers in this study supports the feasibility of a wireless, wearable sEMG sensor system for automatic activity monitoring and fall detection.
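    The Wilson Amplitude (WAMP) feature highlighted above counts how often the absolute difference between consecutive sEMG samples exceeds a threshold. A sketch follows; the threshold value and synthetic signal are purely illustrative.

    ```python
    import numpy as np

    def wamp(x, threshold=0.02):
        """Wilson Amplitude: count of consecutive-sample differences above a threshold."""
        x = np.asarray(x, dtype=float)
        return int(np.sum(np.abs(np.diff(x)) >= threshold))

    # Synthetic burst-like signal (units and threshold are arbitrary).
    rng = np.random.default_rng(7)
    emg = 0.01 * rng.standard_normal(2000)
    emg[800:1200] += 0.05 * rng.standard_normal(400)   # simulated activity burst
    print(wamp(emg))
    ```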

  13. Evaluation of Feature Extraction and Recognition for Activity Monitoring and Fall Detection Based on Wearable sEMG Sensors

    PubMed Central

    Xi, Xugang; Tang, Minyan; Miran, Seyed M.; Luo, Zhizeng

    2017-01-01

    As an essential subfield of context awareness, activity awareness, especially daily activity monitoring and fall detection, plays a significant role for elderly or frail people who need assistance in their daily activities. This study investigates the feature extraction and pattern recognition of surface electromyography (sEMG), with the purpose of determining the best features and classifiers of sEMG for daily living activity monitoring and fall detection. This is done through a series of experiments. In the experiments, four channels of sEMG signal from wireless, wearable sensors located on the lower limbs are recorded from three subjects while they perform seven activities of daily living (ADL). A simulated trip fall scenario is also considered with a custom-made device attached to the ankle. With this experimental setting, 15 feature extraction methods of sEMG, including time, frequency, time/frequency domain and entropy, are analyzed based on class separability and calculation complexity, and five classification methods, each with 15 features, are estimated with respect to the accuracy rate of recognition and calculation complexity for activity monitoring and fall detection. It is shown that a high accuracy rate of recognition and a minimal calculation time for daily activity monitoring and fall detection can be achieved in the current experimental setting. Specifically, the Wilson Amplitude (WAMP) feature performs the best, and the classifier Gaussian Kernel Support Vector Machine (GK-SVM) with Permutation Entropy (PE) or WAMP results in the highest accuracy for activity monitoring with recognition rates of 97.35% and 96.43%. For fall detection, the classifier Fuzzy Min-Max Neural Network (FMMNN) has the best sensitivity and specificity at the cost of the longest calculation time, while the classifier Gaussian Kernel Fisher Linear Discriminant Analysis (GK-FDA) with the feature WAMP guarantees a high sensitivity (98.70%) and specificity (98.59%) with a short calculation time (65.586 ms), making it a possible choice for pre-impact fall detection. The thorough quantitative comparison of the features and classifiers in this study supports the feasibility of a wireless, wearable sEMG sensor system for automatic activity monitoring and fall detection. PMID:28555016

  14. Using permutations to detect dependence between time series

    NASA Astrophysics Data System (ADS)

    Cánovas, Jose S.; Guillamón, Antonio; Ruíz, María del Carmen

    2011-07-01

    In this paper, we propose an independence test between two time series which is based on permutations. The proposed test can be carried out by means of different common statistics such as Pearson’s chi-square or the likelihood ratio. We also point out why an exact test is necessary. Simulated and real data (return exchange rates between several currencies) reveal the capacity of this test to detect linear and nonlinear dependences.
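    One way to build a permutation-based independence check in the spirit of this record is to map each series to its ordinal patterns and apply Pearson's chi-square to the resulting contingency table. The sketch below does that; it relies on the asymptotic chi-square approximation and ignores the serial dependence of overlapping patterns, which is precisely why the authors argue for an exact test.

    ```python
    from collections import Counter
    import numpy as np
    from scipy.stats import chi2_contingency

    def ordinal_symbols(x, m=3):
        """Map a series to its sequence of ordinal patterns of order m."""
        x = np.asarray(x, dtype=float)
        return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

    def ordinal_independence_chi2(x, y, m=3):
        """Chi-square statistic and p-value on the joint ordinal-pattern table."""
        sx, sy = ordinal_symbols(x, m), ordinal_symbols(y, m)
        joint = Counter(zip(sx, sy))
        labels_x, labels_y = sorted(set(sx)), sorted(set(sy))
        table = np.array([[joint[(a, b)] for b in labels_y] for a in labels_x])
        stat, p_value, _, _ = chi2_contingency(table)
        return stat, p_value

    # Illustrative use on two independent white-noise series.
    rng = np.random.default_rng(8)
    print(ordinal_independence_chi2(rng.standard_normal(2000), rng.standard_normal(2000)))
    ```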

  15. Testing of Error-Correcting Sparse Permutation Channel Codes

    NASA Technical Reports Server (NTRS)

    Shcheglov, Kirill, V.; Orlov, Sergei S.

    2008-01-01

    A computer program performs Monte Carlo direct numerical simulations for testing sparse permutation channel codes, which offer strong error-correction capabilities at high code rates and are considered especially suitable for storage of digital data in holographic and volume memories. A word in a code of this type is characterized by, among other things, a sparseness parameter (M) and a fixed number (K) of 1 or "on" bits in a channel block length of N.

  16. Neuronal Entropy-Rate Feature of Entopeduncular Nucleus in Rat Model of Parkinson's Disease.

    PubMed

    Darbin, Olivier; Jin, Xingxing; Von Wrangel, Christof; Schwabe, Kerstin; Nambu, Atsushi; Naritoku, Dean K; Krauss, Joachim K; Alam, Mesbah

    2016-03-01

    The function of the nigro-striatal pathway on neuronal entropy in the basal ganglia (BG) output nucleus, i.e. the entopeduncular nucleus (EPN), was investigated in the unilaterally 6-hydroxydopamine (6-OHDA)-lesioned rat model of Parkinson's disease (PD). In both control subjects and subjects with a 6-OHDA lesion of the dopamine (DA) nigro-striatal pathway, a histological hallmark of parkinsonism, neuronal entropy in the EPN was maximal in neurons with firing rates ranging between 15 and 25 Hz. In 6-OHDA lesioned rats, neuronal entropy in the EPN was specifically higher in neurons with firing rates above 25 Hz. Our data establish that the nigro-striatal pathway controls neuronal entropy in motor circuitry and that the parkinsonian condition is associated with an abnormal relationship between firing rate and neuronal entropy in BG output nuclei. The relationship between neuronal firing rates and entropy provides putatively relevant electrophysiological information for investigating sensory-motor processing in the normal condition and in conditions such as movement disorders.

  17. RELATIONSHIP BETWEEN ENTROPY OF SPIKE TIMING AND FIRING RATE IN ENTOPEDUNCULAR NUCLEUS NEURONS IN ANESTHETIZED RATS: FUNCTION OF THE NIGRO-STRIATAL PATHWAY

    PubMed Central

    Darbin, Olivier; Jin, Xingxing; von Wrangel, Christof; Schwabe, Kerstin; Nambu, Atsushi; Naritoku, Dean K; Krauss, Joachim K.; Alam, Mesbah

    2016-01-01

    The function of the nigro-striatal pathway on neuronal entropy in the basal ganglia (BG) output nucleus (entopeduncular nucleus, EPN) was investigated in the unilaterally 6-hydroxydopamine (6-OHDA)-lesioned rat model of Parkinson's disease (PD). In both control subjects and subjects with a 6-OHDA lesion of the nigro-striatal pathway, a histological hallmark for parkinsonism, neuronal entropy in the EPN was maximal in neurons with firing rates ranging between 15 Hz and 25 Hz. In 6-OHDA lesioned rats, neuronal entropy in the EPN was specifically higher in neurons with firing rates above 25 Hz. Our data establish that the nigro-striatal pathway controls neuronal entropy in motor circuitry and that the parkinsonian condition is associated with an abnormal relationship between firing rate and neuronal entropy in BG output nuclei. The relationship between neuronal firing rates and entropy provides putatively relevant electrophysiological information for investigating sensory-motor processing in the normal condition and in conditions with movement disorders. PMID:26711712

  18. Searching for the fastest dynamo: laminar ABC flows.

    PubMed

    Alexakis, Alexandros

    2011-08-01

    The growth rate of the dynamo instability as a function of the magnetic Reynolds number R(M) is investigated by means of numerical simulations for the family of the Arnold-Beltrami-Childress (ABC) flows and for two different forcing scales. For the ABC flows that are driven at the largest available length scale, it is found that, as the magnetic Reynolds number is increased: (a) The flow that results first in a dynamo is the 2 1/2-dimensional flow for which A=B and C=0 (and all permutations). (b) The second type of flow that results in a dynamo is the one for which A=B≃2C/5 (and permutations). (c) The most symmetric flow, A=B=C, is the third type of flow that results in a dynamo. (d) As R(M) is increased, the A=B=C flow stops being a dynamo and transitions from a local maximum to a third-order saddle point. (e) At larger R(M), the A=B=C flow reestablishes itself as a dynamo but remains a saddle point. (f) At the largest examined R(M), the growth rate of the 2 1/2-dimensional flows starts to decay, the A=B=C flow comes close to a local maximum again, and the flow A=B≃2C/5 (and permutations) results in the fastest dynamo with growth rate γ≃0.12 at the largest examined R(M). For the ABC flows that are driven at the second largest available length scale, it is found that (a) the 2 1/2-dimensional flows A=B,C=0 (and permutations) are again the first flows that result in a dynamo with a decreased onset. (b) The most symmetric flow, A=B=C, is the second type of flow that results in a dynamo. It is, and it remains, a local maximum. (c) At larger R(M), the flow A=B≃2C/5 (and permutations) appears as the third type of flow that results in a dynamo. As R(M) is increased, it becomes the flow with the largest growth rate. The growth rates appear to have some correlation with the Lyapunov exponents, but constructive refolding of the field lines appears equally important in determining the fastest dynamo flow.
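
    For readers unfamiliar with the flow family, the velocity field underlying these simulations can be written down in a few lines. The sketch below assumes the standard textbook form of the Arnold-Beltrami-Childress flow on a 2π-periodic box; the two forcing scales and the induction solver used in the paper are not reproduced.

        import numpy as np

        def abc_flow(x, y, z, A=1.0, B=1.0, C=1.0):
            """Velocity field of the classical Arnold-Beltrami-Childress flow."""
            u = A * np.sin(z) + C * np.cos(y)
            v = B * np.sin(x) + A * np.cos(z)
            w = C * np.sin(y) + B * np.cos(x)
            return u, v, w

        # The 2 1/2-dimensional member discussed above: A = B, C = 0.
        print(abc_flow(0.5, 1.0, 1.5, A=1.0, B=1.0, C=0.0))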

  19. Tree tensor network approach to simulating Shor's algorithm

    NASA Astrophysics Data System (ADS)

    Dumitrescu, Eugene

    2017-12-01

    Constructively simulating quantum systems furthers our understanding of qualitative and quantitative features which may be analytically intractable. In this paper, we directly simulate and explore the entanglement structure present in the paradigmatic example for exponential quantum speedups: Shor's algorithm. To perform our simulation, we construct a dynamic tree tensor network which manifestly captures two salient circuit features for modular exponentiation. These are the natural two-register bipartition and the invariance of entanglement with respect to permutations of the top-register qubits. Our construction helps identify the entanglement entropy properties, which we summarize by a scaling relation. Further, the tree network is efficiently projected onto a matrix product state, from which the quantum Fourier transform is then executed. Future simulation of quantum information states with tensor networks exploiting circuit symmetries is discussed.

  20. A Theoretical Basis for Entropy-Scaling Effects in Human Mobility Patterns.

    PubMed

    Osgood, Nathaniel D; Paul, Tuhin; Stanley, Kevin G; Qian, Weicheng

    2016-01-01

    Characterizing how people move through space has been an important component of many disciplines. With the advent of automated data collection through GPS and other location sensing systems, researchers have the opportunity to examine human mobility at spatio-temporal resolution heretofore impossible. However, the copious and complex data collected through these logging systems can be difficult for humans to fully exploit, leading many researchers to propose novel metrics for encapsulating movement patterns in succinct and useful ways. A particularly salient proposed metric is the mobility entropy rate of the string representing the sequence of locations visited by an individual. However, mobility entropy rate is not scale invariant: entropy rate calculations based on measurements of the same trajectory at varying spatial or temporal granularity do not yield the same value, limiting the utility of mobility entropy rate as a metric by confounding inter-experimental comparisons. In this paper, we derive a scaling relationship for mobility entropy rate of non-repeating straight line paths from the definition of Lempel-Ziv compression. We show that the resulting formulation predicts the scaling behavior of simulated mobility traces, and provides an upper bound on mobility entropy rate under certain assumptions. We further show that this formulation has a maximum value for a particular sampling rate, implying that optimal sampling rates for particular movement patterns exist.
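
    The mobility literature typically estimates the entropy rate from Lempel-Ziv match lengths rather than from symbol histograms. The sketch below implements one common match-length estimator in that spirit; it is a generic illustration under that assumption, not the paper's exact estimator or its scaling derivation.

        import math

        def lz_entropy_rate(symbols):
            """Lempel-Ziv-style entropy-rate estimate (bits/symbol): roughly
            n * log2(n) / sum_i Lambda_i, where Lambda_i is the length of the
            shortest substring starting at i that has not appeared before i."""
            s = list(symbols)
            n = len(s)
            lambdas = []
            for i in range(n):
                k = 1
                # grow the substring until it no longer occurs in the prefix s[:i]
                while i + k <= n and _contains(s[:i], s[i:i + k]):
                    k += 1
                lambdas.append(k)
            return n * math.log2(n) / sum(lambdas)

        def _contains(prefix, pattern):
            m, p = len(prefix), len(pattern)
            return any(prefix[j:j + p] == pattern for j in range(m - p + 1))

        # A repetitive location string scores lower than an irregular one.
        print(lz_entropy_rate("ababababababab"))
        print(lz_entropy_rate("abcadbcadcbadc"))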

  1. Folding behavior of ribosomal protein S6 studied by a modified Gō-like model

    NASA Astrophysics Data System (ADS)

    Wu, L.; Zhang, J.; Wang, J.; Li, W. F.; Wang, W.

    2007-03-01

    Recent experimental and theoretical studies suggest that, although topology is the determinant factor in protein folding, especially for small single-domain proteins, energetic factors also play an important role in the folding process. The ribosomal protein S6 has been the subject of intensive study. A radical change of the transition state in its circular permutants has been observed, which is believed to be caused by a biased distribution of contact energies. Since the simplistic topology-only Gō-like model is not able to reproduce such an observation, we modify the model by introducing variable contact energies between residues based on their physicochemical properties. The modified Gō-like model can successfully reproduce the Φ-value distributions, folding nucleus, and folding pathways of both the wild-type and circular permutants of S6. Furthermore, by comparing the results of the modified and the simplistic models, we find that the hydrophobic effect constitutes the major force that balances the loop entropies. This may indicate that nature maintains the folding cooperativity of this protein by carefully arranging the location of hydrophobic residues in the sequence. Our study reveals a strategy or mechanism used by nature to resolve the dilemma that arises when the native structure, possibly required by biological function, conflicts with folding cooperativity. Finally, the possible relationship between such a design of nature and amyloidosis is also discussed.

  2. Non-parametric combination and related permutation tests for neuroimaging.

    PubMed

    Winkler, Anderson M; Webster, Matthew A; Brooks, Jonathan C; Tracey, Irene; Smith, Stephen M; Nichols, Thomas E

    2016-04-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well-known definition of union-intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume-based representations of the brain, as well as non-imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non-parametric combination (NPC) methodology, such that instead of a two-phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one-way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
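
    A stripped-down numerical illustration of the combination idea may be useful: synchronized sign-flip permutations across K partial one-sample tests, partial p-values read off the permutation distributions, and Tippett's minimum-p rule as the combining function. This toy sketch omits the imaging-specific machinery (spatial statistics, closed testing, multiple-comparison correction) described in the paper; the function name and sample data are illustrative.

        import numpy as np

        def npc_tippett(data, n_perm=999, seed=0):
            """Sketch of non-parametric combination with Tippett's min-p rule under
            synchronized one-sample sign-flip permutations.  `data` is an
            (n_subjects, K) array of K partial tests on the same subjects;
            H0 is zero mean in every column."""
            rng = np.random.default_rng(seed)
            n, K = data.shape
            signs = np.vstack([np.ones((1, n)),                      # row 0: observed data
                               rng.choice([-1.0, 1.0], size=(n_perm, n))])
            stats = np.abs(signs @ data) / n                         # (R, K) mean statistics
            # partial p-values of every permutation, computed column-wise
            p_partial = (stats[None, :, :] >= stats[:, None, :]).mean(axis=1)
            combined = p_partial.min(axis=1)                         # Tippett: smallest partial p
            return (combined <= combined[0]).mean()                  # joint (NPC) p-value

        # Two correlated endpoints with a small shared shift away from zero.
        x = np.random.default_rng(1).normal(0.4, 1.0, size=(30, 2))
        print(npc_tippett(x))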

  3. Comment on: The cancer Warburg effect may be a testable example of the minimum entropy production rate principle.

    PubMed

    Sadeghi Ghuchani, Mostafa

    2018-02-08

    This comment argues against the view that cancer cells produce less entropy than normal cells as stated in a recent paper by Marín and Sabater. The basic principle of estimation of entropy production rate in a living cell is discussed, emphasizing the fact that entropy production depends on both the amount of heat exchange during the metabolism and the entropy difference between products and substrates.

  4. Comment on: The cancer Warburg effect may be a testable example of the minimum entropy production rate principle

    NASA Astrophysics Data System (ADS)

    Sadeghi Ghuchani, Mostafa

    2018-03-01

    This comment argues against the view that cancer cells produce less entropy than normal cells as stated in a recent paper by Marín and Sabater. The basic principle of estimation of entropy production rate in a living cell is discussed, emphasizing the fact that entropy production depends on both the amount of heat exchange during the metabolism and the entropy difference between products and substrates.

  5. Scaling of the entropy budget with surface temperature in radiative-convective equilibrium

    NASA Astrophysics Data System (ADS)

    Singh, Martin S.; O'Gorman, Paul A.

    2016-09-01

    The entropy budget of the atmosphere is examined in simulations of radiative-convective equilibrium with a cloud-system resolving model over a wide range of surface temperatures from 281 to 311 K. Irreversible phase changes and the diffusion of water vapor account for more than half of the irreversible entropy production within the atmosphere, even in the coldest simulation. As the surface temperature is increased, the atmospheric radiative cooling rate increases, driving a greater entropy sink that must be matched by greater irreversible entropy production. The entropy production resulting from irreversible moist processes increases at a similar fractional rate as the entropy sink and at a lower rate than that implied by Clausius-Clapeyron scaling. This allows the entropy production from frictional drag on hydrometeors and on the atmospheric flow to also increase with warming, in contrast to recent results for simulations with global climate models in which the work output decreases with warming. A set of approximate scaling relations is introduced for the terms in the entropy budget as the surface temperature is varied, and many of the terms are found to scale with the mean surface precipitation rate. The entropy budget provides some insight into changes in frictional dissipation in response to warming or changes in model resolution, but it is argued that frictional dissipation is not closely linked to other measures of convective vigor.

  6. The cancer Warburg effect may be a testable example of the minimum entropy production rate principle

    NASA Astrophysics Data System (ADS)

    Marín, Dolores; Sabater, Bartolomé

    2017-04-01

    Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole varies linearly with the fraction (x) of glucose consumed by fermentation, which is frequently estimated at around 0.9. Hence, calculation shows that, with respect to pure respiration, the predominant fermentative metabolism decreases the production of entropy per mole of glucose consumed in cancer cells by around 10%. We hypothesize that increased fermentation could allow cancer cells to follow the Prigogine theorem on the tendency to minimize the rate of entropy production. According to the theorem, open cellular systems near the steady state could evolve to minimize their rates of entropy production, a state that may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.

  7. The cancer Warburg effect may be a testable example of the minimum entropy production rate principle.

    PubMed

    Marín, Dolores; Sabater, Bartolomé

    2017-04-28

    Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole varies linearly with the fraction (x) of glucose consumed by fermentation, which is frequently estimated at around 0.9. Hence, calculation shows that, with respect to pure respiration, the predominant fermentative metabolism decreases the production of entropy per mole of glucose consumed in cancer cells by around 10%. We hypothesize that increased fermentation could allow cancer cells to follow the Prigogine theorem on the tendency to minimize the rate of entropy production. According to the theorem, open cellular systems near the steady state could evolve to minimize their rates of entropy production, a state that may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.

  8. Compression based entropy estimation of heart rate variability on multiple time scales.

    PubMed

    Baumert, Mathias; Voss, Andreas; Javorka, Michal

    2013-01-01

    Heart rate fluctuates beat by beat in a complex manner. The aim of this study was to develop a framework for entropy assessment of heart rate fluctuations on multiple time scales. We employed the Lempel-Ziv algorithm for lossless data compression to investigate the compressibility of RR interval time series on different time scales, using a coarse-graining procedure. We estimated the entropy of RR interval time series of 20 young and 20 old subjects and also investigated the compressibility of randomly shuffled surrogate RR time series. The original RR time series displayed significantly smaller compression entropy values than randomized RR interval data. The RR interval time series of older subjects showed significantly different entropy characteristics over multiple time scales from those of younger subjects. In conclusion, data compression may be a useful approach for multiscale entropy assessment of heart rate variability.
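
    A rough sketch of the pipeline described above: coarse-grain the RR series over non-overlapping windows, quantize it, and take the achieved compression ratio as an entropy index. Here zlib's DEFLATE coder stands in for the Lempel-Ziv compressor used in the study, and the synthetic RR series, bin count, and scales are illustrative choices.

        import zlib
        import numpy as np

        def coarse_grain(rr, scale):
            """Average non-overlapping windows of `scale` beats (multiscale step)."""
            n = len(rr) // scale
            return np.asarray(rr[:n * scale]).reshape(n, scale).mean(axis=1)

        def compression_entropy(rr, scale=1, n_bins=64):
            """Crude compression-entropy index: compressed size / original size of a
            quantised, coarse-grained RR series (zlib stands in for LZ77)."""
            x = coarse_grain(rr, scale)
            q = np.digitize(x, np.linspace(x.min(), x.max(), n_bins)).astype(np.uint8)
            raw = q.tobytes()
            return len(zlib.compress(raw, 9)) / len(raw)

        rr = np.random.default_rng(2).normal(0.85, 0.05, 2000)   # synthetic RR intervals (s)
        for s in (1, 2, 4, 8):
            print(s, round(compression_entropy(rr, scale=s), 3))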

  9. A Theoretical Basis for Entropy-Scaling Effects in Human Mobility Patterns

    PubMed Central

    2016-01-01

    Characterizing how people move through space has been an important component of many disciplines. With the advent of automated data collection through GPS and other location sensing systems, researchers have the opportunity to examine human mobility at spatio-temporal resolution heretofore impossible. However, the copious and complex data collected through these logging systems can be difficult for humans to fully exploit, leading many researchers to propose novel metrics for encapsulating movement patterns in succinct and useful ways. A particularly salient proposed metric is the mobility entropy rate of the string representing the sequence of locations visited by an individual. However, mobility entropy rate is not scale invariant: entropy rate calculations based on measurements of the same trajectory at varying spatial or temporal granularity do not yield the same value, limiting the utility of mobility entropy rate as a metric by confounding inter-experimental comparisons. In this paper, we derive a scaling relationship for mobility entropy rate of non-repeating straight line paths from the definition of Lempel-Ziv compression. We show that the resulting formulation predicts the scaling behavior of simulated mobility traces, and provides an upper bound on mobility entropy rate under certain assumptions. We further show that this formulation has a maximum value for a particular sampling rate, implying that optimal sampling rates for particular movement patterns exist. PMID:27571423

  10. Comparative analysis of electric field influence on the quantum wells with different boundary conditions.: I. Energy spectrum, quantum information entropy and polarization.

    PubMed

    Olendski, Oleg

    2015-04-01

    Analytical solutions of the Schrödinger equation for the one-dimensional quantum well with all possible permutations of the Dirichlet and Neumann boundary conditions (BCs), in a uniform electric field [Formula: see text] perpendicular to the interfaces, are used for a comparative investigation of their interaction and its influence on the properties of the system. The limiting cases of weak and strong voltages allow an easy mathematical treatment and a clear physical explanation; in particular, for small [Formula: see text], perturbation theory yields for all geometries a linear dependence of the polarization on the field, with the BC-dependent proportionality coefficient being positive (negative) for the ground (excited) states. A simple two-level approximation provides an elementary explanation of the negative polarizations as a result of the field-induced destructive interference of the unperturbed modes and shows that in this case the admixture of only the neighboring states plays a dominant role. Different magnitudes of the polarization for different BCs in this regime are explained physically and confirmed numerically. The Hellmann-Feynman theorem reveals a fundamental relation between the polarization and the speed of the energy change with the field. It is proved that the zero-voltage position entropies [Formula: see text] are BC independent and, for all states but the ground Neumann level (which has [Formula: see text]), are equal to [Formula: see text], while the momentum entropies [Formula: see text] depend on the edge requirements and the level. A varying electric field changes the position and momentum entropies in opposite directions such that the entropic uncertainty relation is satisfied. Other physical quantities, such as the BC-dependent zero-energy and zero-polarization fields, are also studied both numerically and analytically. Applications to different branches of physics, such as ocean fluid dynamics and atmospheric and metallic waveguide electrodynamics, are discussed.

  11. Correntropy-based partial directed coherence for testing multivariate Granger causality in nonlinear processes

    NASA Astrophysics Data System (ADS)

    Kannan, Rohit; Tangirala, Arun K.

    2014-06-01

    Identification of directional influences in multivariate systems is of prime importance in several applications of engineering and sciences such as plant topology reconstruction, fault detection and diagnosis, and neurosciences. A spectrum of related directionality measures, ranging from linear measures such as partial directed coherence (PDC) to nonlinear measures such as transfer entropy, has emerged over the past two decades. The PDC-based technique is simple and effective, but, being a linear directionality measure, it has limited applicability. On the other hand, transfer entropy, despite being a robust nonlinear measure, is computationally intensive and practically implementable only for bivariate processes. The objective of this work is to develop a nonlinear directionality measure, termed KPDC, that possesses the simplicity of PDC but is still applicable to nonlinear processes. The technique is founded on a nonlinear measure called correntropy, a recently proposed generalized correlation measure. The proposed method is equivalent to constructing PDC in a kernel space where the PDC is estimated using a vector autoregressive model built on correntropy. A consistent estimator of the KPDC is developed and important theoretical results are established. A permutation scheme combined with the sequential Bonferroni procedure is proposed for testing the hypothesis of absence of causality. It is demonstrated through several case studies that the proposed methodology effectively detects Granger causality in nonlinear processes.

  12. GABAergic excitation of spider mechanoreceptors increases information capacity by increasing entropy rather than decreasing jitter.

    PubMed

    Pfeiffer, Keram; French, Andrew S

    2009-09-02

    Neurotransmitter chemicals excite or inhibit a range of sensory afferents and sensory pathways. These changes in firing rate or static sensitivity can also be associated with changes in dynamic sensitivity or membrane noise and thus action potential timing. We measured action potential firing produced by random mechanical stimulation of spider mechanoreceptor neurons during long-duration excitation by the GABAA agonist muscimol. Information capacity was estimated from signal-to-noise ratio by averaging responses to repeated identical stimulation sequences. Information capacity was also estimated from the coherence function between input and output signals. Entropy rate was estimated by a data compression algorithm and maximum entropy rate from the firing rate. Action potential timing variability, or jitter, was measured as normalized interspike interval distance. Muscimol increased firing rate, information capacity, and entropy rate, but jitter was unchanged. We compared these data with the effects of increasing firing rate by current injection. Our results indicate that the major increase in information capacity by neurotransmitter action arose from the increased entropy rate produced by increased firing rate, not from reduction in membrane noise and action potential jitter.
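
    One standard way to estimate an information capacity from the input-output coherence, as mentioned above, is the lower bound R = -∫ log2(1 - γ²(f)) df. The sketch below applies SciPy's coherence estimator to a toy linear channel; it is a generic illustration under that formula, not the exact estimator, parameters, or data of the study.

        import numpy as np
        from scipy.signal import coherence

        def coherence_info_rate(stimulus, response, fs, nperseg=1024):
            """Coherence-based lower bound on the information rate (bits/s):
            R = -integral of log2(1 - gamma^2(f)) df."""
            f, gamma2 = coherence(stimulus, response, fs=fs, nperseg=nperseg)
            gamma2 = np.clip(gamma2, 0.0, 1.0 - 1e-12)       # guard the logarithm
            return float(np.trapz(-np.log2(1.0 - gamma2), f))

        # Toy channel: the response is a noisy, filtered copy of the stimulus.
        rng = np.random.default_rng(6)
        fs, n = 1000.0, 2 ** 16
        stim = rng.normal(size=n)
        resp = np.convolve(stim, np.ones(5) / 5, mode="same") + 0.5 * rng.normal(size=n)
        print(coherence_info_rate(stim, resp, fs))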

  13. Non‐parametric combination and related permutation tests for neuroimaging

    PubMed Central

    Webster, Matthew A.; Brooks, Jonathan C.; Tracey, Irene; Smith, Stephen M.; Nichols, Thomas E.

    2016-01-01

    In this work, we show how permutation methods can be applied to combination analyses such as those that include multiple imaging modalities, multiple data acquisitions of the same modality, or simply multiple hypotheses on the same data. Using the well‐known definition of union‐intersection tests and closed testing procedures, we use synchronized permutations to correct for such multiplicity of tests, allowing flexibility to integrate imaging data with different spatial resolutions, surface and/or volume‐based representations of the brain, as well as non‐imaging data. For the problem of joint inference, we propose and evaluate a modification of the recently introduced non‐parametric combination (NPC) methodology, such that instead of a two‐phase algorithm and large data storage requirements, the inference can be performed in a single phase, with reasonable computational demands. The method compares favorably to classical multivariate tests (such as MANCOVA), even when the latter is assessed using permutations. We also evaluate, in the context of permutation tests, various combining methods that have been proposed in the past decades, and identify those that provide the best control over error rate and power across a range of situations. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination. Finally, we discuss how the correction can solve certain problems of multiple comparisons in one‐way ANOVA designs, and how the combination is distinguished from conjunctions, even though both can be assessed using permutation tests. We also provide a common algorithm that accommodates combination and correction. Hum Brain Mapp 37:1486‐1511, 2016. © 2016 Wiley Periodicals, Inc. PMID:26848101

  14. Foreign exchange rate entropy evolution during financial crises

    NASA Astrophysics Data System (ADS)

    Stosic, Darko; Stosic, Dusan; Ludermir, Teresa; de Oliveira, Wilson; Stosic, Tatijana

    2016-05-01

    This paper examines the effects of financial crises on foreign exchange (FX) markets, where entropy evolution is measured for different exchange rates, using the time-dependent block entropy method. Empirical results suggest that financial crises are associated with significant increase of exchange rate entropy, reflecting instability in FX market dynamics. In accordance with phenomenological expectations, it is found that FX markets with large liquidity and large trading volume are more inert - they recover quicker from a crisis than markets with small liquidity and small trading volume. Moreover, our numerical analysis shows that periods of economic uncertainty are preceded by periods of low entropy values, which may serve as a tool for anticipating the onset of financial crises.

  15. Phase transitions between lower and higher level management learning in times of crisis: an experimental study based on synergetics.

    PubMed

    Liening, Andreas; Strunk, Guido; Mittelstadt, Ewald

    2013-10-01

    Much has been written about the differences between single- and double-loop learning or, more generally, between lower-level and higher-level learning. Especially in times of a fundamental crisis, a transition between lower- and higher-level learning would be an appropriate reaction to a challenge coming entirely out of the dark. However, so far there has been no quantitative method to monitor such a transition. Therefore, we introduce the theory and methods of synergetics and present results from an experimental study based on the simulation of a crisis within a business simulation game. Hypothesized critical fluctuations, as a marker for so-called phase transitions, were assessed with permutation entropy. The results show evidence for a phase transition during the crisis, which can be interpreted as a transition between lower- and higher-level learning.
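
    Since permutation entropy is the quantity used to track the hypothesized critical fluctuations, a compact reference implementation may help. The sketch below computes the normalized Shannon entropy of ordinal patterns for a one-dimensional series; the order, delay, and test series are illustrative choices, not those of the study.

        import numpy as np
        from math import factorial

        def permutation_entropy(x, order=3, delay=1, normalize=True):
            """Permutation entropy of a 1-D series: Shannon entropy of the
            distribution of ordinal patterns of length `order`."""
            x = np.asarray(x, dtype=float)
            n = len(x) - (order - 1) * delay
            patterns = np.array([np.argsort(x[i:i + order * delay:delay])
                                 for i in range(n)])
            _, counts = np.unique(patterns, axis=0, return_counts=True)
            p = counts / counts.sum()
            h = -np.sum(p * np.log2(p))
            return h / np.log2(factorial(order)) if normalize else h

        # A noisy series scores near 1, a monotone trend near 0.
        rng = np.random.default_rng(3)
        print(permutation_entropy(rng.normal(size=1000)))
        print(permutation_entropy(np.arange(1000.0)))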

  16. Permutation approach, high frequency trading and variety of micro patterns in financial time series

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, Cina; Ebrahimian, Mehran; Tahmooresi, Hamed

    2014-11-01

    A permutation approach is suggested as a method to investigate financial time series on micro scales. The method is used to see how high-frequency trading in recent years has affected the micro patterns that may be seen in financial time series. Tick-by-tick exchange rates are considered as examples. It is seen that a variety of patterns evolves through time, and that the scale over which the target markets have no dominant patterns has decreased steadily over time with the emergence of higher-frequency trading.

  17. Novel permutation measures for image encryption algorithms

    NASA Astrophysics Data System (ADS)

    Abd-El-Hafiz, Salwa K.; AbdElHaleem, Sherif H.; Radwan, Ahmed G.

    2016-10-01

    This paper proposes two measures for the evaluation of permutation techniques used in image encryption. First, a general mathematical framework for describing the permutation phase used in image encryption is presented. Using this framework, six different permutation techniques, based on chaotic and non-chaotic generators, are described. The two new measures are, then, introduced to evaluate the effectiveness of permutation techniques. These measures are (1) Percentage of Adjacent Pixels Count (PAPC) and (2) Distance Between Adjacent Pixels (DBAP). The proposed measures are used to evaluate and compare the six permutation techniques in different scenarios. The permutation techniques are applied on several standard images and the resulting scrambled images are analyzed. Moreover, the new measures are used to compare the permutation algorithms on different matrix sizes irrespective of the actual parameters used in each algorithm. The analysis results show that the proposed measures are good indicators of the effectiveness of the permutation technique.

  18. Blocks in cycles and k-commuting permutations.

    PubMed

    Moreno, Rutilo; Rivera, Luis Manuel

    2016-01-01

    We introduce and study k-commuting permutations. One of our main results is a characterization of permutations that k-commute with a given permutation. Using this characterization, we obtain formulas for the number of permutations that k-commute with a permutation [Formula: see text], for some cycle types of [Formula: see text]. Our enumerative results are related to integer sequences in "The On-line Encyclopedia of Integer Sequences", and in some cases provide new interpretations for such sequences.
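
    A brute-force illustration of the concept, assuming the natural reading that two permutations k-commute when their two composites differ in exactly k points (the formal definition behind the formula placeholders above may differ in details); the example permutation is illustrative.

        from itertools import permutations

        def compose(p, q):
            """Composite permutation: (p o q)(i) = p(q(i))."""
            return tuple(p[q[i]] for i in range(len(q)))

        def k_commuters(pi, k):
            """Count permutations sigma on the same points such that
            sigma o pi and pi o sigma differ in exactly k positions."""
            n = len(pi)
            count = 0
            for sigma in permutations(range(n)):
                diff = sum(1 for i in range(n)
                           if compose(sigma, pi)[i] != compose(pi, sigma)[i])
                if diff == k:
                    count += 1
            return count

        pi = (1, 2, 0, 3)   # a 3-cycle plus a fixed point on 4 elements
        print([k_commuters(pi, k) for k in range(5)])   # the counts sum to 4! = 24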

  19. Fast algorithms for transforming back and forth between a signed permutation and its equivalent simple permutation.

    PubMed

    Gog, Simon; Bader, Martin

    2008-10-01

    The problem of sorting signed permutations by reversals is a well-studied problem in computational biology. The first polynomial time algorithm was presented by Hannenhalli and Pevzner in 1995. The algorithm was improved several times, and nowadays the most efficient algorithm has a subquadratic running time. Simple permutations played an important role in the development of these algorithms. Although the latest result of Tannier et al. does not require simple permutations, the preliminary version of their algorithm as well as the first polynomial time algorithm of Hannenhalli and Pevzner use the structure of simple permutations. More precisely, the latter algorithms require a precomputation that transforms a permutation into an equivalent simple permutation. To the best of our knowledge, all published algorithms for this transformation have at least a quadratic running time. For further investigations on genome rearrangement problems, the existence of a fast algorithm for the transformation could be crucial. Another important task is the back transformation, i.e. if we have a sorting on the simple permutation, transform it into a sorting on the original permutation. Again, the naive approach results in an algorithm with quadratic running time. In this paper, we present a linear time algorithm for transforming a permutation into an equivalent simple permutation, and an O(n log n) algorithm for the back transformation of the sorting sequence.

  20. A Random Variable Related to the Inversion Vector of a Partial Random Permutation

    ERIC Educational Resources Information Center

    Laghate, Kavita; Deshpande, M. N.

    2005-01-01

    In this article, we define the inversion vector of a permutation of the integers 1, 2,..., n. We set up a particular kind of permutation, called a partial random permutation. The sum of the elements of the inversion vector of such a permutation is a random variable of interest.
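
    A small sketch of the basic object, under one common convention for the inversion vector (conventions differ on whether larger elements are counted to the left or to the right of each value); the example permutation is illustrative.

        def inversion_vector(perm):
            """Inversion vector: entry j counts the elements larger than value j
            that appear to its left in the permutation."""
            pos = {v: i for i, v in enumerate(perm)}
            return [sum(1 for w in perm[:pos[v]] if w > v) for v in sorted(perm)]

        p = [3, 1, 4, 5, 2]
        iv = inversion_vector(p)
        print(iv, sum(iv))   # the sum equals the total number of inversions of p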

  1. A transposase strategy for creating libraries of circularly permuted proteins.

    PubMed

    Mehta, Manan M; Liu, Shirley; Silberg, Jonathan J

    2012-05-01

    A simple approach for creating libraries of circularly permuted proteins is described that is called PERMutation Using Transposase Engineering (PERMUTE). In PERMUTE, the transposase MuA is used to randomly insert a minitransposon that can function as a protein expression vector into a plasmid that contains the open reading frame (ORF) being permuted. A library of vectors that express different permuted variants of the ORF-encoded protein is created by: (i) using bacteria to select for target vectors that acquire an integrated minitransposon; (ii) excising the ensemble of ORFs that contain an integrated minitransposon from the selected vectors; and (iii) circularizing the ensemble of ORFs containing integrated minitransposons using intramolecular ligation. Construction of a Thermotoga neapolitana adenylate kinase (AK) library using PERMUTE revealed that this approach produces vectors that express circularly permuted proteins with distinct sequence diversity from existing methods. In addition, selection of this library for variants that complement the growth of Escherichia coli with a temperature-sensitive AK identified functional proteins with novel architectures, suggesting that PERMUTE will be useful for the directed evolution of proteins with new functions.

  2. A transposase strategy for creating libraries of circularly permuted proteins

    PubMed Central

    Mehta, Manan M.; Liu, Shirley; Silberg, Jonathan J.

    2012-01-01

    A simple approach for creating libraries of circularly permuted proteins is described that is called PERMutation Using Transposase Engineering (PERMUTE). In PERMUTE, the transposase MuA is used to randomly insert a minitransposon that can function as a protein expression vector into a plasmid that contains the open reading frame (ORF) being permuted. A library of vectors that express different permuted variants of the ORF-encoded protein is created by: (i) using bacteria to select for target vectors that acquire an integrated minitransposon; (ii) excising the ensemble of ORFs that contain an integrated minitransposon from the selected vectors; and (iii) circularizing the ensemble of ORFs containing integrated minitransposons using intramolecular ligation. Construction of a Thermotoga neapolitana adenylate kinase (AK) library using PERMUTE revealed that this approach produces vectors that express circularly permuted proteins with distinct sequence diversity from existing methods. In addition, selection of this library for variants that complement the growth of Escherichia coli with a temperature-sensitive AK identified functional proteins with novel architectures, suggesting that PERMUTE will be useful for the directed evolution of proteins with new functions. PMID:22319214

  3. Bootstrapping on Undirected Binary Networks Via Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Fushing, Hsieh; Chen, Chen; Liu, Shan-Yu; Koehl, Patrice

    2014-09-01

    We propose a new method inspired from statistical mechanics for extracting geometric information from undirected binary networks and generating random networks that conform to this geometry. In this method an undirected binary network is perceived as a thermodynamic system with a collection of permuted adjacency matrices as its states. The task of extracting information from the network is then reformulated as a discrete combinatorial optimization problem of searching for its ground state. To solve this problem, we apply multiple ensembles of temperature regulated Markov chains to establish an ultrametric geometry on the network. This geometry is equipped with a tree hierarchy that captures the multiscale community structure of the network. We translate this geometry into a Parisi adjacency matrix, which has a relative low energy level and is in the vicinity of the ground state. The Parisi adjacency matrix is then further optimized by making block permutations subject to the ultrametric geometry. The optimal matrix corresponds to the macrostate of the original network. An ensemble of random networks is then generated such that each of these networks conforms to this macrostate; the corresponding algorithm also provides an estimate of the size of this ensemble. By repeating this procedure at different scales of the ultrametric geometry of the network, it is possible to compute its evolution entropy, i.e. to estimate the evolution of its complexity as we move from a coarse to a fine description of its geometric structure. We demonstrate the performance of this method on simulated as well as real data networks.

  4. Time-series analysis of multiple foreign exchange rates using time-dependent pattern entropy

    NASA Astrophysics Data System (ADS)

    Ishizaki, Ryuji; Inoue, Masayoshi

    2018-01-01

    Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in multiple foreign exchange rates. The time-dependent pattern entropy of 7 foreign exchange rates (AUD/USD, CAD/USD, CHF/USD, EUR/USD, GBP/USD, JPY/USD, and NZD/USD) was found to be high in the long period after the Lehman shock and to be low in the long period after March 2012. We compared the correlation matrices between exchange rates in periods of high and low time-dependent pattern entropy.
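
    The method reduces daily variations to binary symbols and evaluates pattern statistics inside a sliding window. The sketch below is a generic stand-in that assumes up/down symbolization of daily changes and the Shannon entropy (in bits) of fixed-length symbol patterns; the exact symbolization, window, and pattern definitions used by Ishizaki and Inoue may differ.

        import numpy as np
        from collections import Counter

        def pattern_entropy_series(rates, window=100, pattern_len=4):
            """Sliding-window pattern-entropy sketch: daily changes become binary
            up/down symbols, and the Shannon entropy of fixed-length symbol
            patterns is computed inside each window."""
            symbols = (np.diff(rates) > 0).astype(int)
            out = []
            for start in range(len(symbols) - window + 1):
                w = symbols[start:start + window]
                pats = Counter(tuple(w[i:i + pattern_len])
                               for i in range(window - pattern_len + 1))
                p = np.array(list(pats.values()), dtype=float)
                p /= p.sum()
                out.append(float(-(p * np.log2(p)).sum()))
            return np.array(out)

        rng = np.random.default_rng(4)
        fx = np.cumsum(rng.normal(0, 0.5, 1500)) + 100.0   # synthetic exchange-rate path
        print(pattern_entropy_series(fx)[:5])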

  5. Ecosystem functioning and maximum entropy production: a quantitative test of hypotheses.

    PubMed

    Meysman, Filip J R; Bruers, Stijn

    2010-05-12

    The idea that entropy production puts a constraint on ecosystem functioning is quite popular in ecological thermodynamics. Yet, until now, such claims have received little quantitative verification. Here, we examine three 'entropy production' hypotheses that have been put forward in the past. The first states that increased entropy production serves as a fingerprint of living systems. The other two hypotheses invoke stronger constraints. The state selection hypothesis states that when a system can attain multiple steady states, the stable state will show the highest entropy production rate. The gradient response principle requires that when the thermodynamic gradient increases, the system's new stable state should always be accompanied by a higher entropy production rate. We test these three hypotheses by applying them to a set of conventional food web models. Each time, we calculate the entropy production rate associated with the stable state of the ecosystem. This analysis shows that the first hypothesis holds for all the food webs tested: the living state always shows increased entropy production relative to the abiotic state. In contrast, the state selection and gradient response hypotheses break down when the food web incorporates more than one trophic level, indicating that they are not generally valid.

  6. Interictal cardiorespiratory variability in temporal lobe and absence epilepsy in childhood.

    PubMed

    Varon, Carolina; Montalto, Alessandro; Jansen, Katrien; Lagae, Lieven; Marinazzo, Daniele; Faes, Luca; Van Huffel, Sabine

    2015-04-01

    It is well known that epilepsy has a profound effect on the autonomic nervous system, especially on the autonomic control of heart rate and respiration. This effect has been widely studied during seizure activity, but less attention has been given to interictal (i.e. seizure-free) activity. The studies that have been done on this topic showed that heart rate and respiration can be affected individually, even without the occurrence of seizures. In this work, the interactions between these two physiological variables are analysed during interictal activity in temporal lobe and absence epilepsy in childhood. These interactions are assessed by decomposing the predictive information about heart rate variability into different components such as the transfer entropy, cross-entropy, self-entropy, and conditional self-entropy. Each of these components quantifies a different type of shared information. However, when using the cross-entropy and the conditional self-entropy, it is possible to split the information carried by the heart rate into two main components, one related to respiration and one related to other mechanisms, such as sympathetic activation. This can be done after assuming a directional link going from respiration to heart rate. After analysing all the entropy components, it is shown that in subjects with absence epilepsy the information shared by respiration and heart rate is significantly lower than in normal subjects. A more remarkable finding indicates that this type of epilepsy seems to have a long-term effect on the cardiac and respiratory control mechanisms of the autonomic nervous system.

  7. Numerical investigation for entropy generation in hydromagnetic flow of fluid with variable properties and slip

    NASA Astrophysics Data System (ADS)

    Khan, M. Ijaz; Hayat, Tasawar; Alsaedi, Ahmed

    2018-02-01

    This work models and computes the flow of a viscous fluid with variable properties due to a rotating stretchable disk. The rotating flow is generated by a nonlinearly stretching rotating surface. Nonlinear thermal radiation and heat generation/absorption are studied. The flow is electrically conducting under a constant applied magnetic field; polarization effects and the induced magnetic field are neglected. Attention is focused on the entropy generation rate and the Bejan number, which clearly depend on the velocity and thermal fields. The von Kármán approach is utilized to convert the partial differential expressions into ordinary ones. These expressions are non-dimensionalized, and numerical results are obtained for the flow variables. The effects of the magnetic parameter, Prandtl number, radiative parameter, heat generation/absorption parameter, and slip parameter on the velocity and temperature fields, as well as on the entropy generation rate and Bejan number, are discussed. Drag forces (radial and tangential) and heat transfer rates are calculated and discussed. Furthermore, the entropy generation rate is a decreasing function of the magnetic variable and the Reynolds number, while the Bejan number shows the reverse behavior with respect to the magnetic variable. Opposite behaviors of the heat transfer rate are also observed for varying values of the radiative and slip variables.

  8. Encoding Sequential Information in Semantic Space Models: Comparing Holographic Reduced Representation and Random Permutation

    PubMed Central

    Recchia, Gabriel; Sahlgren, Magnus; Kanerva, Pentti; Jones, Michael N.

    2015-01-01

    Circular convolution and random permutation have each been proposed as neurally plausible binding operators capable of encoding sequential information in semantic memory. We perform several controlled comparisons of circular convolution and random permutation as means of encoding paired associates as well as encoding sequential information. Random permutations outperformed convolution with respect to the number of paired associates that can be reliably stored in a single memory trace. Performance was equal on semantic tasks when using a small corpus, but random permutations were ultimately capable of achieving superior performance due to their higher scalability to large corpora. Finally, “noisy” permutations in which units are mapped to other units arbitrarily (no one-to-one mapping) perform nearly as well as true permutations. These findings increase the neurological plausibility of random permutations and highlight their utility in vector space models of semantics. PMID:25954306
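
    A toy comparison of the two binding operators on random vectors: circular convolution with circular-correlation decoding (the holographic reduced representation) versus permutation of one operand before superposition, decoded with the inverse permutation. This is only a minimal illustration of the operators themselves, not the semantic-space experiments of the paper; the dimensionality and vectors are arbitrary.

        import numpy as np

        rng = np.random.default_rng(5)
        d = 1024
        a, b = rng.normal(0, 1 / np.sqrt(d), (2, d))   # random "environment" vectors

        # Binding by circular convolution (holographic reduced representation).
        conv_bound = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

        # Binding by random permutation: shuffle one operand before superposition.
        perm = rng.permutation(d)
        perm_bound = a + b[perm]

        # Decoding: circular correlation for HRR, inverse permutation for RP.
        decoded_hrr = np.real(np.fft.ifft(np.fft.fft(conv_bound) * np.conj(np.fft.fft(a))))
        decoded_rp = (perm_bound - a)[np.argsort(perm)]

        print(np.corrcoef(decoded_hrr, b)[0, 1], np.corrcoef(decoded_rp, b)[0, 1])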

  9. Finite state model and compatibility theory - New analysis tools for permutation networks

    NASA Technical Reports Server (NTRS)

    Huang, S.-T.; Tripathi, S. K.

    1986-01-01

    A simple model to describe the fundamental operation theory of shuffle-exchange-type permutation networks, the finite permutation machine (FPM), is described, and theorems which transform the control matrix result to a continuous compatible vector result are developed. It is found that only 2n-1 shuffle exchange passes are necessary, and that 3n-3 passes are sufficient, to realize all permutations, reducing the sufficient number of passes by two from previous results. The flexibility of the approach is demonstrated by the description of a stack permutation machine (SPM) which can realize all permutations, and by showing that the FPM corresponding to the Benes (1965) network belongs to the SPM. The FPM corresponding to the network with two cascaded reverse-exchange networks is found to realize all permutations, and a simple mechanism to verify several equivalence relationships of various permutation networks is discussed.

  10. Sorting permutations by prefix and suffix rearrangements.

    PubMed

    Lintzmayer, Carla Negri; Fertin, Guillaume; Dias, Zanoni

    2017-02-01

    Some interesting combinatorial problems have been motivated by genome rearrangements, which are mutations that affect large portions of a genome. When we represent genomes as permutations, the goal is to transform a given permutation into the identity permutation with the minimum number of rearrangements. When they affect segments from the beginning (respectively end) of the permutation, they are called prefix (respectively suffix) rearrangements. This paper presents results for rearrangement problems that involve prefix and suffix versions of reversals and transpositions considering unsigned and signed permutations. We give 2-approximation and ([Formula: see text])-approximation algorithms for these problems, where [Formula: see text] is a constant divided by the number of breakpoints (pairs of consecutive elements that should not be consecutive in the identity permutation) in the input permutation. We also give bounds for the diameters concerning these problems and provide ways of improving the practical results of our algorithms.
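
    Since the approximation guarantee above is stated in terms of breakpoints, a short sketch of how they are counted for an unsigned permutation may help. It assumes the common convention of padding the permutation with the sentinels 0 and n+1 and counting adjacent pairs whose values are not consecutive integers.

        def breakpoints(perm):
            """Number of breakpoints of an unsigned permutation of 1..n, with the
            sentinels 0 and n+1 appended at the ends."""
            ext = [0] + list(perm) + [len(perm) + 1]
            return sum(1 for a, b in zip(ext, ext[1:]) if abs(a - b) != 1)

        print(breakpoints([1, 2, 3, 4]))   # already sorted: 0 breakpoints
        print(breakpoints([3, 1, 4, 2]))   # scrambled: 5 breakpoints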

  11. Entropy generation in magnetohydrodynamic radiative flow due to rotating disk in presence of viscous dissipation and Joule heating

    NASA Astrophysics Data System (ADS)

    Hayat, Tasawar; Qayyum, Sumaira; Khan, Muhammad Ijaz; Alsaedi, Ahmed

    2018-01-01

    The simultaneous effects of viscous dissipation and Joule heating in flow due to a rotating disk of variable thickness are examined. Radiative flow saturating a porous space is considered. Particular attention is given to the entropy generation. The developed nonlinear ordinary differential systems are computed for convergent series solutions. Specifically, the results for velocity, temperature, entropy generation, Bejan number, skin-friction coefficient, and local Nusselt number are discussed. Clearly, the entropy generation rate depends on the velocity and temperature distributions. Moreover, the entropy generation rate is a decreasing function of the Hartmann, Eckert, and Reynolds numbers, while the Bejan number shows the opposite behavior.

  12. Low-Complexity Discriminative Feature Selection From EEG Before and After Short-Term Memory Task.

    PubMed

    Behzadfar, Neda; Firoozabadi, S Mohammad P; Badie, Kambiz

    2016-10-01

    A reliable and unobtrusive quantification of changes in cortical activity during a short-term memory task can be used to evaluate the efficacy of interfaces and to provide real-time user-state information. In this article, we investigate changes in electroencephalogram signals during short-term memory with respect to the baseline activity. The electroencephalogram signals were analyzed using 9 linear and nonlinear/dynamic measures. We applied the Wilcoxon statistical test and the Davies-Bouldin criterion to select optimal discriminative features. The results show that, among the features, the permutation entropy significantly increased in the frontal lobe and the occipital second lower alpha band activity decreased during the memory task. These 2 features reflect the same mental task; however, their correlation with the memory task varies over different intervals. In conclusion, it is suggested that the combination of the 2 features would improve the performance of memory-based neurofeedback systems. © EEG and Clinical Neuroscience Society (ECNS) 2016.

  13. Polarization-resolved time-delay signatures of chaos induced by FBG-feedback in VCSEL.

    PubMed

    Zhong, Zhu-Qiang; Li, Song-Sui; Chan, Sze-Chun; Xia, Guang-Qiong; Wu, Zheng-Mao

    2015-06-15

    Polarization-resolved chaotic emission intensities from a vertical-cavity surface-emitting laser (VCSEL) subject to feedback from a fiber Bragg grating (FBG) are numerically investigated. Time-delay (TD) signatures of the feedback are examined through various means, including self-correlations of the intensity time series of the individual polarizations, the cross-correlation of the intensity time series between the two polarizations, and permutation entropies calculated for the individual polarizations. The results show that the TD signatures can be clearly suppressed by selecting suitable operation parameters such as the feedback strength, FBG bandwidth, and Bragg frequency. Also, in the operational parameter space, numerical maps of TD signatures and effective bandwidths are obtained, which show regions of chaotic signals with both wide bandwidths and weak TD signatures. Finally, by comparing with a VCSEL subject to feedback from a mirror, the VCSEL subject to feedback from the FBG generally shows better concealment of the TD signatures with similar, or even wider, bandwidths.

  14. A Reversible Logical Circuit Synthesis Algorithm Based on Decomposition of Cycle Representations of Permutations

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Li, Zhiqiang; Zhang, Gaoman; Pan, Suhan; Zhang, Wei

    2018-05-01

    A reversible function is isomorphic to a permutation, and an arbitrary permutation can be represented by a series of cycles. A new synthesis algorithm for 3-qubit reversible circuits is presented. It consists of two parts. The first part uses the Number of the reversible function's Different Bits (NDBs) to decide whether NOT gates should be added to decrease the Hamming distance between the input and output vectors. The second part is based on exploring properties of the cycle representation of permutations: the cycles are decomposed so that the permutation moves closer to, and finally becomes, the identity permutation, which is realized using fully controlled Toffoli gates with positive and negative controls.
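
    The cycle representation that drives the synthesis is straightforward to compute. The sketch below decomposes a permutation, given as a 0-indexed mapping, into disjoint cycles; the example permutation of the eight 3-qubit basis states is illustrative, not one from the paper.

        def cycle_decomposition(perm):
            """Disjoint-cycle representation of a permutation given as a
            0-indexed mapping i -> perm[i]; fixed points are omitted."""
            seen, cycles = set(), []
            for start in range(len(perm)):
                if start in seen:
                    continue
                cycle, i = [], start
                while i not in seen:
                    seen.add(i)
                    cycle.append(i)
                    i = perm[i]
                if len(cycle) > 1:
                    cycles.append(tuple(cycle))
            return cycles

        # A 3-qubit reversible function is a permutation of the 8 basis states.
        print(cycle_decomposition([1, 0, 3, 2, 4, 5, 7, 6]))   # [(0, 1), (2, 3), (6, 7)]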

  15. Time evolution of Rényi entropy under the Lindblad equation.

    PubMed

    Abe, Sumiyoshi

    2016-08-01

    In recent years, the Rényi entropy has repeatedly been discussed for characterization of quantum critical states and entanglement. Here, time evolution of the Rényi entropy is studied. A compact general formula is presented for the lower bound on the entropy rate.

  16. Fundamental limits on quantum dynamics based on entropy change

    NASA Astrophysics Data System (ADS)

    Das, Siddhartha; Khatri, Sumeet; Siopsis, George; Wilde, Mark M.

    2018-01-01

    It is well known in the realm of quantum mechanics and information theory that the entropy is non-decreasing for the class of unital physical processes. However, in general, the entropy does not exhibit monotonic behavior. This has restricted the use of entropy change in characterizing evolution processes. Recently, a lower bound on the entropy change was provided in the work of Buscemi, Das, and Wilde [Phys. Rev. A 93(6), 062314 (2016)]. We explore the limit that this bound places on the physical evolution of a quantum system and discuss how these limits can be used as witnesses to characterize quantum dynamics. In particular, we derive a lower limit on the rate of entropy change for memoryless quantum dynamics, and we argue that it provides a witness of non-unitality. This limit on the rate of entropy change leads to definitions of several witnesses for testing memory effects in quantum dynamics. Furthermore, from the aforementioned lower bound on entropy change, we obtain a measure of non-unitarity for unital evolutions.

  17. The maximum entropy production and maximum Shannon information entropy in enzyme kinetics

    NASA Astrophysics Data System (ADS)

    Dobovišek, Andrej; Markovič, Rene; Brumen, Milan; Fajmut, Aleš

    2018-04-01

    We demonstrate that the maximum entropy production principle (MEPP) serves as a physical selection principle for the description of the most probable non-equilibrium steady states in simple enzymatic reactions. A theoretical approach is developed that enables maximization of the density of entropy production with respect to the enzyme rate constants for an enzyme reaction in a steady state. Mass and Gibbs free energy conservation are imposed as optimization constraints. The optimal enzyme rate constants computed in this way also yield the most uniform probability distribution of the enzyme states in the steady state, which corresponds to the maximal Shannon information entropy. By means of a stability analysis, it is also demonstrated that maximal density of entropy production in the enzyme reaction requires a flexible enzyme structure, which enables rapid transitions between different enzyme states. These results are supported by an example in which the density of entropy production and the Shannon information entropy are numerically maximized for the enzyme glucose isomerase.

  18. Consciousness Indexing and Outcome Prediction with Resting-State EEG in Severe Disorders of Consciousness.

    PubMed

    Stefan, Sabina; Schorr, Barbara; Lopez-Rolon, Alex; Kolassa, Iris-Tatjana; Shock, Jonathan P; Rosenfelder, Martin; Heck, Suzette; Bender, Andreas

    2018-04-17

    We applied the following methods to resting-state EEG data from patients with disorders of consciousness (DOC) for consciousness indexing and outcome prediction: microstates, entropy (i.e. approximate, permutation), power in alpha and delta frequency bands, and connectivity (i.e. weighted symbolic mutual information, symbolic transfer entropy, complex network analysis). Patients with unresponsive wakefulness syndrome (UWS) and patients in a minimally conscious state (MCS) were classified into these two categories by fitting and testing a generalised linear model. We aimed subsequently to develop an automated system for outcome prediction in severe DOC by selecting an optimal subset of features using sequential floating forward selection (SFFS). The two outcome categories were defined as UWS or dead, and MCS or emerged from MCS. Percentage of time spent in microstate D in the alpha frequency band performed best at distinguishing MCS from UWS patients. The average clustering coefficient obtained from thresholding beta coherence performed best at predicting outcome. The optimal subset of features selected with SFFS consisted of the frequency of microstate A in the 2-20 Hz frequency band, path length obtained from thresholding alpha coherence, and average path length obtained from thresholding alpha coherence. Combining these features seemed to afford high prediction power. Python and MATLAB toolboxes for the above calculations are freely available under the GNU public license for non-commercial use ( https://qeeg.wordpress.com ).

  19. Time-series analysis of foreign exchange rates using time-dependent pattern entropy

    NASA Astrophysics Data System (ADS)

    Ishizaki, Ryuji; Inoue, Masayoshi

    2013-08-01

    Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in foreign exchange rates, in particular, the dollar-yen rate. The time-dependent pattern entropy of the dollar-yen rate was found to be high in the following periods: before and after the turning points of the yen from strong to weak or from weak to strong, and the period after the Lehman shock.

  20. Calculation of heat transfer on shuttle type configurations including the effects of variable entropy at boundary layer edge

    NASA Technical Reports Server (NTRS)

    Dejarnette, F. R.

    1972-01-01

    A relatively simple method is presented for including the effect of variable entropy at the boundary-layer edge in a heat transfer method developed previously. For each inviscid surface streamline, an approximate shock-wave shape is calculated using a modified form of Maslen's method for inviscid axisymmetric flows. The entropy for the streamline at the edge of the boundary layer is determined by equating the mass flux through the shock wave to that inside the boundary layer. Approximations used in this technique allow the heating rates along each inviscid surface streamline to be calculated independently of the other streamlines. The shock standoff distances computed by the present method are found to compare well with those computed by Maslen's asymmetric method. Heating rates are presented for blunted circular and elliptical cones and a typical space shuttle orbiter at angles of attack. Variable-entropy effects are found to increase heating rates downstream of the nose significantly above those computed using normal-shock entropy, and turbulent heating rates increase more than laminar rates. Effects of Reynolds number and angle of attack are also shown.

  1. Effective Iterated Greedy Algorithm for Flow-Shop Scheduling Problems with Time lags

    NASA Astrophysics Data System (ADS)

    ZHAO, Ning; YE, Song; LI, Kaidian; CHEN, Siyu

    2017-05-01

    The flow shop scheduling problem with time lags is a practical scheduling problem that has attracted many studies. The permutation problem (PFSP with time lags) has received concentrated attention, but the non-permutation problem (non-PFSP with time lags) seems to have been neglected. With the aim of minimizing the makespan while satisfying the time lag constraints, efficient algorithms are proposed for the PFSP and non-PFSP problems: an iterated greedy algorithm for the permutation case (IGTLP) and an iterated greedy algorithm for the non-permutation case (IGTLNP). The proposed algorithms are verified using well-known simple and complex instances of permutation and non-permutation problems with various time lag ranges. The permutation results indicate that the proposed IGTLP can reach a near-optimal solution within roughly 11% of the computational time of the traditional GA approach. The non-permutation results indicate that the proposed IG can reach nearly the same solution in less than 1% of the computational time of the traditional GA approach. The proposed research treats the PFSP and non-PFSP together with minimal and maximal time lag considerations, which provides an interesting viewpoint for industrial implementation.
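
    The iterated greedy scheme itself is compact. The sketch below shows the destruction/construction skeleton for the plain permutation flow shop makespan; the time-lag constraints of the article are omitted for brevity, and the acceptance rule and parameter values are illustrative rather than the authors' settings.

```python
import random

def makespan(perm, p):
    """Makespan of a permutation flow shop; p[j][i] = processing time of job j on machine i."""
    m = len(p[0])
    completion = [0.0] * m
    for j in perm:
        for i in range(m):
            earliest = completion[i - 1] if i > 0 else 0.0
            completion[i] = max(completion[i], earliest) + p[j][i]
    return completion[-1]

def greedy_insert(partial, job, p):
    """Insert `job` at the position of the partial sequence that minimizes makespan."""
    best = None
    for pos in range(len(partial) + 1):
        cand = partial[:pos] + [job] + partial[pos:]
        c = makespan(cand, p)
        if best is None or c < best[0]:
            best = (c, cand)
    return best[1]

def iterated_greedy(p, d=3, iters=200, seed=0):
    random.seed(seed)
    current = list(range(len(p)))
    random.shuffle(current)
    best = current[:]
    for _ in range(iters):
        # Destruction: remove d random jobs; construction: greedily reinsert them.
        removed = random.sample(current, d)
        partial = [j for j in current if j not in removed]
        for job in removed:
            partial = greedy_insert(partial, job, p)
        # Simple acceptance: keep the candidate only if it does not worsen the makespan.
        if makespan(partial, p) <= makespan(current, p):
            current = partial
        if makespan(current, p) < makespan(best, p):
            best = current[:]
    return best, makespan(best, p)

# Tiny example: 6 jobs on 3 machines with random processing times.
random.seed(42)
times = [[random.randint(1, 9) for _ in range(3)] for _ in range(6)]
print(iterated_greedy(times))
```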

  2. n-Order and maximum fuzzy similarity entropy for discrimination of signals of different complexity: Application to fetal heart rate signals.

    PubMed

    Zaylaa, Amira; Oudjemia, Souad; Charara, Jamal; Girault, Jean-Marc

    2015-09-01

    This paper presents two new concepts for discrimination of signals of different complexity. The first focused initially on solving the problem of setting entropy descriptors by varying the pattern size instead of the tolerance. This led to the search for the optimal pattern size that maximized the similarity entropy. The second paradigm was based on the n-order similarity entropy that encompasses the 1-order similarity entropy. To improve the statistical stability, n-order fuzzy similarity entropy was proposed. Fractional Brownian motion was simulated to validate the different methods proposed, and fetal heart rate signals were used to discriminate normal from abnormal fetuses. In all cases, it was found that it was possible to discriminate time series of different complexity such as fractional Brownian motion and fetal heart rate signals. The best levels of performance in terms of sensitivity (90%) and specificity (90%) were obtained with the n-order fuzzy similarity entropy. However, it was shown that the optimal pattern size and the maximum similarity measurement were related to intrinsic features of the time series. Copyright © 2015 Elsevier Ltd. All rights reserved.
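
    As background, similarity (sample) entropies of this family compare template vectors of a chosen pattern size m against templates of size m+1. The sketch below is a plain sample-entropy illustration; it does not reproduce the authors' n-order fuzzy variant, and the names and parameters are ours.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Plain sample entropy: -ln(A/B), with templates of length m and m+1."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def count_matches(length):
        # Use N - m templates for both lengths (standard convention).
        templates = np.array([x[i:i + length] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance between template i and all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)      # matches of length-m templates
    a = count_matches(m + 1)  # matches of length-(m+1) templates
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(2)
print(sample_entropy(rng.standard_normal(1000)))                  # irregular: higher value
print(sample_entropy(np.sin(np.linspace(0, 40 * np.pi, 1000))))   # regular: lower value
```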

  3. Heat Transfer and Entropy Generation Analysis of an Intermediate Heat Exchanger in ADS

    NASA Astrophysics Data System (ADS)

    Wang, Yongwei; Huai, Xiulan

    2018-04-01

    The intermediate heat exchanger, used to enhance heat transfer, is an important piece of equipment in the utilization of nuclear energy. In the present work, heat transfer and entropy generation of an intermediate heat exchanger (IHX) in the accelerator driven subcritical system (ADS) are investigated experimentally. The variation of the entropy generation number with the performance parameters of the IHX is analyzed, and the effects of the IHX inlet conditions on the entropy generation number and heat transfer are discussed. For the two working conditions of constant mass flow rate of liquid lead-bismuth eutectic (LBE) and of helium gas, the total pumping power tends to decrease with decreasing entropy generation number, but the variations of the effectiveness, number of transfer units and thermal capacity rate ratio are inconsistent and need to be analyzed separately. With increasing inlet mass flow rate or LBE inlet temperature, the entropy generation number increases and the heat transfer is enhanced, while the opposite trend occurs with increasing helium gas inlet temperature. Further study is necessary to obtain the optimized operating parameters of the IHX that minimize entropy generation and enhance heat transfer.

  4. Entanglement entropy and mutual information production rates in acoustic black holes.

    PubMed

    Giovanazzi, Stefano

    2011-01-07

    A method to investigate acoustic Hawking radiation is proposed, where entanglement entropy and mutual information are measured from the fluctuations of the number of particles. The rate of entropy radiated per one-dimensional (1D) channel is given by S=κ/12, where κ is the sound acceleration on the sonic horizon. This entropy production is accompanied by a corresponding formation of mutual information to ensure the overall conservation of information. The predictions are confirmed using an ab initio analytical approach in transonic flows of 1D degenerate ideal Fermi fluids.

  5. Nonlinear radiative heat flux and heat source/sink on entropy generation minimization rate

    NASA Astrophysics Data System (ADS)

    Hayat, T.; Khan, M. Waleed Ahmed; Khan, M. Ijaz; Alsaedi, A.

    2018-06-01

    Entropy generation minimization in nonlinear radiative mixed convective flow towards a surface of variable thickness is addressed. An entropy generation analysis for momentum and temperature is carried out. The flow is driven by the stretching velocity of the sheet. Transformations are used to reduce the system of partial differential equations to ordinary ones. The total entropy generation rate is determined. Series solutions for the zeroth and mth order deformation systems are computed. The domain of convergence of the obtained solutions is identified. Velocity, temperature and concentration fields are plotted and interpreted. The entropy equation is studied through nonlinear mixed convection and radiative heat flux. Velocity and temperature gradients are discussed through graphs. Meaningful results are summarized in the final remarks.

  6. The equivalence of minimum entropy production and maximum thermal efficiency in endoreversible heat engines.

    PubMed

    Haseli, Y

    2016-05-01

    The objective of this study is to investigate the thermal efficiency and power production of typical models of endoreversible heat engines at the regime of minimum entropy generation rate. The study considers the Curzon-Ahlborn engine, the Novikov's engine, and the Carnot vapor cycle. The operational regimes at maximum thermal efficiency, maximum power output and minimum entropy production rate are compared for each of these engines. The results reveal that in an endoreversible heat engine, a reduction in entropy production corresponds to an increase in thermal efficiency. The three criteria of minimum entropy production, the maximum thermal efficiency, and the maximum power may become equivalent at the condition of fixed heat input.
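
    For orientation, two standard textbook results for the engines named above (not taken from the abstract itself): the Carnot efficiency and the Curzon-Ahlborn efficiency at maximum power of an endoreversible engine operating between reservoirs at temperatures T_H and T_C are

```latex
\eta_{\mathrm{Carnot}} = 1 - \frac{T_C}{T_H},
\qquad
\eta_{\mathrm{CA}} = 1 - \sqrt{\frac{T_C}{T_H}} .
```

    The study's comparison of the minimum-entropy-production, maximum-efficiency and maximum-power regimes can be read against these two benchmarks.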

  7. Decryption of pure-position permutation algorithms.

    PubMed

    Zhao, Xiao-Yu; Chen, Gang; Zhang, Dan; Wang, Xiao-Hong; Dong, Guang-Chang

    2004-07-01

    Pure position permutation image encryption algorithms, commonly used for image encryption and investigated in this work, are unfortunately fragile under known-plaintext attack. In view of this weakness of pure position permutation algorithms, we put forward an effective decryption algorithm for all pure-position permutation algorithms. First, a summary of the pure position permutation image encryption algorithms is given by introducing the concept of ergodic matrices. Then, by using probability theory and algebraic principles, the decryption probability of pure-position permutation algorithms is verified theoretically; and then, by defining the operation system of fuzzy ergodic matrices, we improve a specific decryption algorithm. Finally, some simulation results are shown.
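
    As a generic illustration of what a pure position permutation cipher does (not the specific algorithms analyzed in the paper): pixel values are left untouched and only their positions are shuffled by a key-derived permutation, which is why a known plaintext/ciphertext pair can expose the permutation.

```python
import numpy as np

def position_permutation_encrypt(image, key):
    """Encrypt by shuffling pixel positions only; pixel values are unchanged."""
    flat = image.ravel()
    perm = np.random.default_rng(key).permutation(flat.size)  # key-derived permutation
    return flat[perm].reshape(image.shape), perm

def position_permutation_decrypt(cipher, perm):
    flat = np.empty(cipher.size, dtype=cipher.dtype)
    flat[perm] = cipher.ravel()          # invert the permutation
    return flat.reshape(cipher.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
enc, perm = position_permutation_encrypt(img, key=1234)
assert np.array_equal(position_permutation_decrypt(enc, perm), img)
# A known plaintext/ciphertext pair with distinct pixel values reveals `perm` directly,
# which is the weakness exploited by known-plaintext attacks on such ciphers.
```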

  8. Weight distributions for turbo codes using random and nonrandom permutations

    NASA Technical Reports Server (NTRS)

    Dolinar, S.; Divsalar, D.

    1995-01-01

    This article takes a preliminary look at the weight distributions achievable for turbo codes using random, nonrandom, and semirandom permutations. Due to the recursiveness of the encoders, it is important to distinguish between self-terminating and non-self-terminating input sequences. The non-self-terminating sequences have little effect on decoder performance, because they accumulate high encoded weight until they are artificially terminated at the end of the block. From probabilistic arguments based on selecting the permutations randomly, it is concluded that the self-terminating weight-2 data sequences are the most important consideration in the design of constituent codes; higher-weight self-terminating sequences have successively decreasing importance. Also, increasing the number of codes and, correspondingly, the number of permutations makes it more and more likely that the bad input sequences will be broken up by one or more of the permuters. It is possible to design nonrandom permutations that ensure that the minimum distance due to weight-2 input sequences grows roughly as the square root of (2N), where N is the block length. However, these nonrandom permutations amplify the bad effects of higher-weight inputs, and as a result they are inferior in performance to randomly selected permutations. But there are 'semirandom' permutations that perform nearly as well as the designed nonrandom permutations with respect to weight-2 input sequences and are not as susceptible to being foiled by higher-weight inputs.
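
    One widely used way to realize such 'semirandom' permutations is the S-random construction: indices are drawn at random and a candidate is accepted only if it differs by more than S from the values placed in the previous S positions. The sketch below follows that generic definition and is not necessarily the exact construction used in the article.

```python
import random

def s_random_permutation(n, s, max_restarts=1000, seed=0):
    """Draw a permutation of range(n) such that positions within distance s
    are mapped to values that differ by more than s (an S-random interleaver)."""
    rng = random.Random(seed)
    for _ in range(max_restarts):
        remaining = list(range(n))
        rng.shuffle(remaining)
        chosen = []
        stuck = False
        while remaining:
            for idx, candidate in enumerate(remaining):
                if all(abs(candidate - prev) > s for prev in chosen[-s:]):
                    chosen.append(remaining.pop(idx))
                    break
            else:
                stuck = True   # no admissible candidate left; restart with a new shuffle
                break
        if not stuck:
            return chosen
    raise RuntimeError("no S-random permutation found; try a smaller s")

perm = s_random_permutation(64, s=4)
print(perm[:10])
```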

  9. Permutation coding technique for image recognition systems.

    PubMed

    Kussul, Ernst M; Baidyk, Tatiana N; Wunsch, Donald C; Makeyev, Oleksandr; Martín, Anabel

    2006-11-01

    A feature extractor and neural classifier for image recognition systems are proposed. The proposed feature extractor is based on the concept of random local descriptors (RLDs). It is followed by an encoder based on the permutation coding technique, which makes it possible to take into account not only the detected features but also the position of each feature in the image, and to make the recognition process invariant to small displacements. The combination of RLDs and permutation coding permits us to obtain a sufficiently general description of the image to be recognized. The code generated by the encoder is used as input data for the neural classifier. Different types of images were used to test the proposed image recognition system. It was tested on the handwritten digit recognition problem, the face recognition problem, and the microobject shape recognition problem. The results of testing are very promising. The error rate for the Modified National Institute of Standards and Technology (MNIST) database is 0.44% and for the Olivetti Research Laboratory (ORL) database it is 0.1%.

  10. A permutationally invariant full-dimensional ab initio potential energy surface for the abstraction and exchange channels of the H + CH₄ system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jun, E-mail: jli15@cqu.edu.cn, E-mail: zhangdh@dicp.ac.cn; Department of Chemistry and Chemical Biology, University of New Mexico, Albuquerque, New Mexico 87131; Chen, Jun

    2015-05-28

    We report a permutationally invariant global potential energy surface (PES) for the H + CH₄ system based on ∼63 000 data points calculated at a high ab initio level (UCCSD(T)-F12a/AVTZ) using the recently proposed permutation invariant polynomial-neural network method. The small fitting error (5.1 meV) indicates a faithful representation of the ab initio points over a large configuration space. The rate coefficients calculated on the PES using tunneling corrected transition-state theory and quasi-classical trajectory are found to agree well with the available experimental and previous quantum dynamical results. The calculated total reaction probabilities (J_tot = 0) including the abstraction and exchange channels using the new potential by a reduced dimensional quantum dynamic method are essentially the same as those on the Xu-Chen-Zhang PES [Chin. J. Chem. Phys. 27, 373 (2014)].

  11. PERMutation Using Transposase Engineering (PERMUTE): A Simple Approach for Constructing Circularly Permuted Protein Libraries.

    PubMed

    Jones, Alicia M; Atkinson, Joshua T; Silberg, Jonathan J

    2017-01-01

    Rearrangements that alter the order of a protein's sequence are used in the lab to study protein folding, improve activity, and build molecular switches. One of the simplest ways to rearrange a protein sequence is through random circular permutation, where native protein termini are linked together and new termini are created elsewhere through random backbone fission. Transposase mutagenesis has emerged as a simple way to generate libraries encoding different circularly permuted variants of proteins. With this approach, a synthetic transposon (called a permuteposon) is randomly inserted throughout a circularized gene to generate vectors that express different permuted variants of a protein. In this chapter, we outline the protocol for constructing combinatorial libraries of circularly permuted proteins using transposase mutagenesis, and we describe the different permuteposons that have been developed to facilitate library construction.

  12. Excess Entropy Production in Quantum System: Quantum Master Equation Approach

    NASA Astrophysics Data System (ADS)

    Nakajima, Satoshi; Tokura, Yasuhiro

    2017-12-01

    For open systems described by the quantum master equation (QME), we investigate the excess entropy production under quasistatic operations between nonequilibrium steady states. The average entropy production is composed of the time integral of the instantaneous steady entropy production rate and the excess entropy production. We propose to define the average entropy production rate using the average energy and particle currents, which are calculated by using full counting statistics with the QME. The excess entropy production is given by a line integral in the control parameter space and its integrand is called the Berry-Sinitsyn-Nemenman (BSN) vector. In the weakly nonequilibrium regime, we show that the BSN vector is described by ln ρ̆_0 and ρ_0, where ρ_0 is the instantaneous steady state of the QME and ρ̆_0 is that of the QME obtained by reversing the sign of the Lamb shift term. If the system Hamiltonian is non-degenerate or the Lamb shift term is negligible, the excess entropy production approximately reduces to the difference between the von Neumann entropies of the system. Additionally, we point out that the expression for the entropy production obtained in the classical Markov jump process is different from our result and show that the two are approximately equivalent only in the weakly nonequilibrium regime.

  13. Respiration and heart rate complexity: Effects of age and gender assessed by band-limited transfer entropy

    PubMed Central

    Nemati, Shamim; Edwards, Bradley A.; Lee, Joon; Pittman-Polletta, Benjamin; Butler, James P.; Malhotra, Atul

    2013-01-01

    Aging and disease are accompanied by a reduction of complex variability in the temporal patterns of heart rate. This reduction has been attributed to a breakdown of the underlying regulatory feedback mechanisms that maintain a homeodynamic state. Previous work has established the utility of entropy as an index of disorder for quantification of changes in heart rate complexity. However, questions remain regarding the origin of heart rate complexity and the mechanisms involved in its reduction with aging and disease. In this work we use a newly developed technique based on the concept of band-limited transfer entropy to assess aging-related changes in the contribution of respiration and blood pressure to the entropy of heart rate in different frequency bands. Noninvasive measurements of heart beat interval, respiration, and systolic blood pressure were recorded from 20 young (21–34 years) and 20 older (68–85 years) healthy adults. Band-limited transfer entropy analysis revealed a reduction in the high-frequency contribution of respiration to heart rate complexity (p < 0.001) with normal aging, particularly in men. These results have the potential for dissecting the relative contributions of respiration and blood pressure-related reflexes to heart rate complexity and their degeneration with normal aging. PMID:23811194

  14. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering the entropy production has been verified to approach asymptotically its maximum rate, by using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. This latter quantity is obtained from the entropy balance for open systems considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of the open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  15. Ensembles of physical states and random quantum circuits on graphs

    NASA Astrophysics Data System (ADS)

    Hamma, Alioscia; Santra, Siddhartha; Zanardi, Paolo

    2012-11-01

    In this paper we continue and extend the investigations of the ensembles of random physical states introduced in Hamma et al. [Phys. Rev. Lett. 109, 040502 (2012)]. These ensembles are constructed by finite-length random quantum circuits (RQC) acting on the (hyper)edges of an underlying (hyper)graph structure. The latter encodes the locality structure associated with finite-time quantum evolutions generated by physical, i.e., local, Hamiltonians. Our goal is to analyze physical properties of typical states in these ensembles; in particular here we focus on proxies of quantum entanglement such as purity and α-Renyi entropies. The problem is formulated in terms of matrix elements of superoperators which depend on the graph structure, the choice of probability measure over the local unitaries, and the circuit length. In the α=2 case these superoperators act on a restricted multiqubit space generated by permutation operators associated to the subsets of vertices of the graph. For permutationally invariant interactions the dynamics can be further restricted to an exponentially smaller subspace. We consider different families of RQCs and study their typical entanglement properties for finite time as well as their asymptotic behavior. We find that the area law holds on average and that the volume law is a typical property of physical states (that is, it holds on average and the fluctuations around the average vanish for large systems). The area law arises when the evolution time is O(1) with respect to the size L of the system, while the volume law, as is typical, arises when the evolution time scales like O(L).

  16. Modulation of frustration in folding by sequence permutation.

    PubMed

    Nobrega, R Paul; Arora, Karunesh; Kathuria, Sagar V; Graceffa, Rita; Barrea, Raul A; Guo, Liang; Chakravarthy, Srinivas; Bilsel, Osman; Irving, Thomas C; Brooks, Charles L; Matthews, C Robert

    2014-07-22

    Folding of globular proteins can be envisioned as the contraction of a random coil unfolded state toward the native state on an energy surface rough with local minima trapping frustrated species. These substructures impede productive folding and can serve as nucleation sites for aggregation reactions. However, little is known about the relationship between frustration and its underlying sequence determinants. Chemotaxis response regulator Y (CheY), a 129-amino acid bacterial protein, has been shown previously to populate an off-pathway kinetic trap in the microsecond time range. The frustration has been ascribed to premature docking of the N- and C-terminal subdomains or, alternatively, to the formation of an unproductive local-in-sequence cluster of branched aliphatic side chains, isoleucine, leucine, and valine (ILV). The roles of the subdomains and ILV clusters in frustration were tested by altering the sequence connectivity using circular permutations. Surprisingly, the stability and buried surface area of the intermediate could be increased or decreased depending on the location of the termini. Comparison with the results of small-angle X-ray-scattering experiments and simulations points to the accelerated formation of a more compact, on-pathway species for the more stable intermediate. The effect of chain connectivity in modulating the structures and stabilities of the early kinetic traps in CheY is better understood in terms of the ILV cluster model. However, the subdomain model captures the requirement for an intact N-terminal domain to access the native conformation. Chain entropy and aliphatic-rich sequences play crucial roles in biasing the early events leading to frustration in the folding of CheY.

  17. Mixing times towards demographic equilibrium in insect populations with temperature variable age structures.

    PubMed

    Damos, Petros

    2015-08-01

    In this study, we use entropy-related mixing rate modules to measure the effects of temperature on insect population stability and demographic breakdown. The uncertainty in the age of the mother of a randomly chosen newborn, and how it evolves after a finite number of time steps, is modeled using a stochastic transformation of the Leslie matrix. Age classes are represented as a cycle graph, and the transitions towards the stable age distribution are described as an exact Markov chain. The dynamics of divergence from a non-equilibrium state towards equilibrium are evaluated using the Kolmogorov-Sinai entropy. Moreover, the Kullback-Leibler distance is applied as an information-theoretic measure to estimate exact mixing times of the age transition probabilities towards equilibrium. Using empirical data and simulated projections through time, we show that population entropy can effectively be applied to detect demographic variability towards equilibrium under different temperature conditions. Changes in entropy are correlated with the fluctuations of the insect population decay rates (i.e. demographic stability towards equilibrium). Moreover, shorter mixing times are directly linked to lower entropy rates and vice versa. This may be linked to the properties of the insect model system, which in contrast to warm-blooded animals has the ability to greatly change its metabolic and demographic rates. Moreover, population entropy and the related distance measures that are applied provide a means to measure these rates. The current results and model projections provide clear biological evidence of why dynamic population entropy may be useful for measuring population stability. Copyright © 2015 Elsevier Inc. All rights reserved.
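
    A schematic version of this kind of calculation (our own minimal sketch, with made-up vital rates rather than the study's data): project an age distribution with a Leslie matrix and track its Kullback-Leibler distance to the stable age distribution as a proxy for mixing towards demographic equilibrium.

```python
import numpy as np

# Hypothetical Leslie matrix for 4 age classes: fecundities in the first row,
# survival probabilities on the subdiagonal (values are illustrative only).
L = np.array([[0.0, 1.2, 1.5, 0.8],
              [0.6, 0.0, 0.0, 0.0],
              [0.0, 0.5, 0.0, 0.0],
              [0.0, 0.0, 0.4, 0.0]])

# Stable age distribution = normalized leading right eigenvector of L.
eigvals, eigvecs = np.linalg.eig(L)
lead = np.argmax(eigvals.real)
stable = np.abs(eigvecs[:, lead].real)
stable /= stable.sum()

def kl(p, q):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

n = np.array([1.0, 0.0, 0.0, 0.0])   # start with newborns only
for t in range(15):
    dist = n / n.sum()
    print(f"t={t:2d}  KL to stable age distribution = {kl(dist, stable):.4f}")
    n = L @ n
```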

  18. Time reversibility and nonequilibrium thermodynamics of second-order stochastic processes.

    PubMed

    Ge, Hao

    2014-02-01

    Nonequilibrium thermodynamics of a general second-order stochastic system is investigated. We prove that at steady state, under inversion of velocities, the condition of time reversibility over the phase space is equivalent to the antisymmetry of spatial flux and the symmetry of velocity flux. Then we show that the condition of time reversibility alone cannot always guarantee the Maxwell-Boltzmann distribution. Comparing the two conditions together, we find that the frictional force naturally emerges as the unique odd term of the total force at thermodynamic equilibrium, and is followed by the Einstein relation. The two conditions respectively correspond to two previously reported different entropy production rates. In the case where the external force is only position dependent, the two entropy production rates become one. We prove that such an entropy production rate can be decomposed into two non-negative terms, expressed respectively by the conditional mean and variance of the thermodynamic force associated with the irreversible velocity flux at any given spatial coordinate. In the small inertia limit, the former term becomes the entropy production rate of the corresponding overdamped dynamics, while the anomalous entropy production rate originates from the latter term. Furthermore, regarding the connection between the first law and second law, we find that in the steady state of such a limit, the anomalous entropy production rate is also the leading order of the Boltzmann-factor weighted difference between the spatial heat dissipation densities of the underdamped and overdamped dynamics, while their unweighted difference always tends to vanish.

  19. Opposition-Based Memetic Algorithm and Hybrid Approach for Sorting Permutations by Reversals.

    PubMed

    Soncco-Álvarez, José Luis; Muñoz, Daniel M; Ayala-Rincón, Mauricio

    2018-02-21

    Sorting unsigned permutations by reversals is a difficult problem; indeed, it was proved to be NP-hard by Caprara (1997). Because of its high complexity, many approximation algorithms for computing the minimal reversal distance were proposed, culminating in the currently best-known theoretical ratio of 1.375. In this article, two memetic algorithms to compute the reversal distance are proposed. The first one uses the technique of opposition-based learning, leading to an opposition-based memetic algorithm; the second one improves the previous algorithm by applying the heuristic of two-breakpoint elimination, leading to a hybrid approach. Several experiments were performed with one hundred randomly generated permutations, single benchmark permutations, and biological permutations. Results of the experiments showed that the proposed OBMA and Hybrid-OBMA algorithms achieve the best results for practical cases, that is, for permutations of length up to 120. Also, Hybrid-OBMA was shown to improve on the results of OBMA for permutations of length greater than or equal to 60. The applicability of the proposed algorithms was checked by processing permutations based on biological data, in which case OBMA gave the best average results for all instances.

  20. [Local fractal analysis of noise-like time series by all permutations method for 1-115 min periods].

    PubMed

    Panchelyuga, V A; Panchelyuga, M S

    2015-01-01

    Results of local fractal analysis of 329-per-day time series of ²³⁹Pu alpha-decay rate fluctuations by means of the all permutations method (APM) are presented. The APM analysis reveals a steady set of frequencies in the time series. The coincidence of this frequency set with the Earth's natural oscillations was demonstrated. A short review of works by different authors who analyzed time series of fluctuations in processes of different nature is given. We show that the periods observed in those works correspond to the periods revealed in our study. This points to a common mechanism for the observed phenomenon.

  1. Stochastic approach to equilibrium and nonequilibrium thermodynamics.

    PubMed

    Tomé, Tânia; de Oliveira, Mário J

    2015-04-01

    We develop the stochastic approach to thermodynamics based on stochastic dynamics, which can be discrete (master equation) and continuous (Fokker-Planck equation), and on two assumptions concerning entropy. The first is the definition of entropy itself and the second the definition of entropy production rate, which is non-negative and vanishes in thermodynamic equilibrium. Based on these assumptions, we study interacting systems with many degrees of freedom in equilibrium or out of thermodynamic equilibrium and how the macroscopic laws are derived from the stochastic dynamics. These studies include the quasiequilibrium processes; the convexity of the equilibrium surface; the monotonic time behavior of thermodynamic potentials, including entropy; the bilinear form of the entropy production rate; the Onsager coefficients and reciprocal relations; and the nonequilibrium steady states of chemical reactions.

  2. Visual recognition of permuted words

    NASA Astrophysics Data System (ADS)

    Rashid, Sheikh Faisal; Shafait, Faisal; Breuel, Thomas M.

    2010-02-01

    In the current study we examine how letter permutation affects the visual recognition of words in two orthographically dissimilar languages, Urdu and German. We present the hypothesis that recognition or reading of permuted and non-permuted words are two distinct mental-level processes, and that people use different strategies in handling permuted words as compared to normal words. A comparison between the reading behavior of people in these languages is also presented. We present our study in the context of dual-route theories of reading, and we observe that dual-route theory is consistent with our hypothesis of a distinction in the underlying cognitive behavior for reading permuted and non-permuted words. We conducted three experiments with lexical decision tasks to analyze how reading is degraded or affected by letter permutation. We performed analysis of variance (ANOVA), a distribution-free rank test, and a t-test to determine significant differences in response time latencies for the two classes of data. Results showed that recognition accuracy for permuted words decreased by 31% in the case of Urdu and by 11% in the case of German. We also found a considerable difference in reading behavior between the cursive and alphabetic languages, and we observed that reading Urdu is comparatively slower than reading German due to the characteristics of its cursive script.

  3. Four applications of permutation methods to testing a single-mediator model.

    PubMed

    Taylor, Aaron B; MacKinnon, David P

    2012-09-01

    Four applications of permutation tests to the single-mediator model are described and evaluated in this study. Permutation tests work by rearranging data in many possible ways in order to estimate the sampling distribution for the test statistic. The four applications to mediation evaluated here are the permutation test of ab, the permutation joint significance test, and the noniterative and iterative permutation confidence intervals for ab. A Monte Carlo simulation study was used to compare these four tests with the four best available tests for mediation found in previous research: the joint significance test, the distribution of the product test, and the percentile and bias-corrected bootstrap tests. We compared the different methods on Type I error, power, and confidence interval coverage. The noniterative permutation confidence interval for ab was the best performer among the new methods. It successfully controlled Type I error, had power nearly as good as the most powerful existing methods, and had better coverage than any existing method. The iterative permutation confidence interval for ab had lower power than do some existing methods, but it performed better than any other method in terms of coverage. The permutation confidence interval methods are recommended when estimating a confidence interval is a primary concern. SPSS and SAS macros that estimate these confidence intervals are provided.
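
    To make the general idea concrete, here is one simple permutation scheme for the null hypothesis of no mediated effect (an illustration, not necessarily any of the four procedures evaluated in the article): permuting the mediator breaks its association with both X and Y, and the observed product ab is compared with the products obtained under many such rearrangements.

```python
import numpy as np

def ab_estimate(x, m, y):
    """Estimate a (X -> M) and b (M -> Y adjusting for X) by least squares; return a*b."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), x, m])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    b = coef[2]
    return a * b

def permutation_test_ab(x, m, y, n_perm=2000, seed=0):
    rng = np.random.default_rng(seed)
    observed = ab_estimate(x, m, y)
    null = np.array([ab_estimate(x, rng.permutation(m), y) for _ in range(n_perm)])
    p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
    return observed, p_value

# Simulated single-mediator data with a genuine indirect effect.
rng = np.random.default_rng(3)
x = rng.standard_normal(200)
m = 0.5 * x + rng.standard_normal(200)
y = 0.4 * m + rng.standard_normal(200)
print(permutation_test_ab(x, m, y))
```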

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Asplund, Curtis T., E-mail: ca2621@columbia.edu; Berenstein, David, E-mail: dberens@physics.ucsb.edu

    We consider oscillators evolving subject to a periodic driving force that dynamically entangles them, and argue that this gives the linearized evolution around periodic orbits in a general chaotic Hamiltonian dynamical system. We show that the entanglement entropy, after tracing over half of the oscillators, generically asymptotes to linear growth at a rate given by the sum of the positive Lyapunov exponents of the system. These exponents give a classical entropy growth rate, in the sense of Kolmogorov, Sinai and Pesin. We also calculate the dependence of this entropy on linear mixtures of the oscillator Hilbert-space factors, to investigate the dependence of the entanglement entropy on the choice of coarse graining. We find that for almost all choices the asymptotic growth rate is the same.

  5. The large deviation function for entropy production: the optimal trajectory and the role of fluctuations

    NASA Astrophysics Data System (ADS)

    Speck, Thomas; Engel, Andreas; Seifert, Udo

    2012-12-01

    We study the large deviation function for the entropy production rate in two driven one-dimensional systems: the asymmetric random walk on a discrete lattice and Brownian motion in a continuous periodic potential. We compare two approaches: using the Donsker-Varadhan theory and using the Freidlin-Wentzell theory. We show that the wings of the large deviation function are dominated by a single optimal trajectory: either in the forward direction (positive rate) or in the backward direction (negative rate). The joining of the two branches at zero entropy production implies a non-differentiability and thus the appearance of a ‘kink’. However, around zero entropy production, many trajectories contribute and thus the ‘kink’ is smeared out.

  6. Minimization of the Renyi entropy production in the stationary states of the Brownian process with matched death and birth rates.

    PubMed

    Cybulski, Olgierd; Babin, Volodymyr; Hołyst, Robert

    2004-01-01

    We analyze the Fleming-Viot process. The system is confined in a box, whose boundaries act as a sink of Brownian particles. The death rate at the boundaries is matched by the branching (birth) rate in the system and thus the number of particles is kept constant. We show that such a process is described by the Renyi entropy whose production is minimized in the stationary state. The entropy production in this process is a monotonically decreasing function of time irrespective of the initial conditions. The first Laplacian eigenvalue is shown to be equal to the Renyi entropy production in the stationary state. As an example we simulate the process in a two-dimensional box.

  7. Spin-phase-space-entropy production

    NASA Astrophysics Data System (ADS)

    Santos, Jader P.; Céleri, Lucas C.; Brito, Frederico; Landi, Gabriel T.; Paternostro, Mauro

    2018-05-01

    Quantifying the degree of irreversibility of an open system's dynamics represents a problem of both fundamental and applied relevance. Even though a well-known framework exists for thermal baths, it gives diverging results in the limit of zero temperature and is also not readily extended to nonequilibrium reservoirs, such as dephasing baths. Aimed at filling this gap, in this paper we introduce a phase-space-entropy production framework for quantifying the irreversibility of spin systems undergoing Lindblad dynamics. The theory is based on the spin Husimi-Q function and its corresponding phase-space entropy, known as the Wehrl entropy. We show that, unlike the von Neumann entropy production rate, the Wehrl entropy production rate in our framework remains valid at any temperature and is also readily extended to arbitrary nonequilibrium baths. As an application, we discuss the irreversibility associated with the interaction of a two-level system with a single-photon pulse, a problem which cannot be treated using the conventional approach.

  8. Efficient computation of significance levels for multiple associations in large studies of correlated data, including genomewide association studies.

    PubMed

    Dudbridge, Frank; Koeleman, Bobby P C

    2004-09-01

    Large exploratory studies, including candidate-gene-association testing, genomewide linkage-disequilibrium scans, and array-expression experiments, are becoming increasingly common. A serious problem for such studies is that statistical power is compromised by the need to control the false-positive rate for a large family of tests. Because multiple true associations are anticipated, methods have been proposed that combine evidence from the most significant tests, as a more powerful alternative to individually adjusted tests. The practical application of these methods is currently limited by a reliance on permutation testing to account for the correlated nature of single-nucleotide polymorphism (SNP)-association data. On a genomewide scale, this is both very time-consuming and impractical for repeated explorations with standard marker panels. Here, we alleviate these problems by fitting analytic distributions to the empirical distribution of combined evidence. We fit extreme-value distributions for fixed lengths of combined evidence and a beta distribution for the most significant length. An initial phase of permutation sampling is required to fit these distributions, but it can be completed more quickly than a simple permutation test and need be done only once for each panel of tests, after which the fitted parameters give a reusable calibration of the panel. Our approach is also a more efficient alternative to a standard permutation test. We demonstrate the accuracy of our approach and compare its efficiency with that of permutation tests on genomewide SNP data released by the International HapMap Consortium. The estimation of analytic distributions for combined evidence will allow these powerful methods to be applied more widely in large exploratory studies.

  9. Optimization of a Circular Microchannel With Entropy Generation Minimization Method

    NASA Astrophysics Data System (ADS)

    Jafari, Arash; Ghazali, Normah Mohd

    2010-06-01

    New advances at the micro and nano scales are being realized, and the contributions of micro and nano heat dissipation devices are of high importance to this technology development. Past studies showed that microchannel design depends on thermal resistance and pressure drop. However, entropy generation minimization (EGM), as an optimization theory, states that the rate of entropy generation should also be optimized. The application of EGM to microchannel heat sink design is reviewed and discussed in this paper. The latest principles for deriving the entropy generation relations are discussed to show how this approach can be carried out. An optimization procedure using the EGM method with the entropy generation rate is derived for a circular microchannel heat sink based upon thermal resistance and pressure drop. The equations are solved using MATLAB and the obtained results are compared to similar past studies. The effects of channel diameter, number of channels, heat flux, and pumping power on the entropy generation rate and Reynolds number are investigated. Analytical correlations are utilized for heat transfer and friction coefficients. A minimum entropy generation rate is observed for N = 40 channels with a hydraulic diameter of 90 μm; at this point, the circular microchannel heat sink operates at its optimum based on the second law of thermodynamics.

  10. Species Entropies in the Kinetic Range of Collisionless Plasma Turbulence: Particle-in-cell Simulations

    NASA Astrophysics Data System (ADS)

    Gary, S. Peter; Zhao, Yinjian; Hughes, R. Scott; Wang, Joseph; Parashar, Tulasi N.

    2018-06-01

    Three-dimensional particle-in-cell simulations of the forward cascade of decaying turbulence in the relatively short-wavelength kinetic range have been carried out as initial-value problems on collisionless, homogeneous, magnetized electron-ion plasma models. The simulations have addressed both whistler turbulence at β_i = β_e = 0.25 and kinetic Alfvén turbulence at β_i = β_e = 0.50, computing the species energy dissipation rates as well as the increase of the Boltzmann entropies for both ions and electrons as functions of the initial dimensionless fluctuating magnetic field energy density ε_o in the range 0 ≤ ε_o ≤ 0.50. This study shows that electron and ion entropies display similar rates of increase and that all four entropy rates increase approximately as ε_o, consistent with the assumption that the quasilinear premise is valid for the initial conditions assumed for these simulations. The simulations further predict that the time rates of ion entropy increase should be substantially greater for kinetic Alfvén turbulence than for whistler turbulence.

  11. Circular Permutation of a Chaperonin Protein: Biophysics and Application to Nanotechnology

    NASA Technical Reports Server (NTRS)

    Paavola, Chad; Chan, Suzanne; Li, Yi-Fen; McMillan, R. Andrew; Trent, Jonathan

    2004-01-01

    We have designed five circular permutants of a chaperonin protein derived from the hyperthermophilic organism Sulfolobus shibatae. These permuted proteins were expressed in E. coli and are well-folded. Furthermore, all the permutants assemble into 18-mer double rings of the same form as the wild-type protein. We characterized the thermodynamics of folding for each permutant by both guanidine denaturation and differential scanning calorimetry. We also examined the assembly of chaperonin rings into higher order structures that may be used as nanoscale templates. The results show that circular permutation can be used to tune the thermodynamic properties of a protein template as well as facilitating the fusion of peptides, binding proteins or enzymes onto nanostructured templates.

  12. The structure of a thermophilic kinase shapes fitness upon random circular permutation

    PubMed Central

    Jones, Alicia M.; Mehta, Manan M.; Thomas, Emily E.; Atkinson, Joshua T.; Segall-Shapiro, Thomas H.; Liu, Shirley; Silberg, Jonathan J.

    2016-01-01

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement where native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein’s functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AK with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and they reveal a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection. PMID:26976658

  13. The Structure of a Thermophilic Kinase Shapes Fitness upon Random Circular Permutation.

    PubMed

    Jones, Alicia M; Mehta, Manan M; Thomas, Emily E; Atkinson, Joshua T; Segall-Shapiro, Thomas H; Liu, Shirley; Silberg, Jonathan J

    2016-05-20

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement in which native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein's functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AKs with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and it reveals a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection.

  14. Teaching Tip: When a Matrix and Its Inverse Are Stochastic

    ERIC Educational Resources Information Center

    Ding, J.; Rhee, N. H.

    2013-01-01

    A stochastic matrix is a square matrix with nonnegative entries and row sums 1. The simplest example is a permutation matrix, whose rows permute the rows of an identity matrix. A permutation matrix and its inverse are both stochastic. We prove the converse, that is, if a matrix and its inverse are both stochastic, then it is a permutation matrix.
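
    A sketch of one standard argument for the converse (ours, not necessarily the authors' proof), writing B = A⁻¹ with both A and B row-stochastic:

```latex
(AB)_{ij} = \sum_k a_{ik} b_{kj} = \delta_{ij}
\quad\Longrightarrow\quad
a_{ik} b_{kj} = 0 \ \text{ for all } k \ \text{ whenever } i \neq j,
```

    since every term in the sum is nonnegative. If some row i of A had two positive entries a_{ik} and a_{ik'} with k ≠ k', then rows k and k' of B would have to vanish outside column i and hence both equal the unit row e_i^T, contradicting the invertibility of B. So each row of A has exactly one positive entry, which must equal 1 because the row sums to 1, and no two rows can place their 1 in the same column, again by invertibility; A is therefore a permutation matrix.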

  15. A new algorithm for finding survival coefficients employed in reliability equations

    NASA Technical Reports Server (NTRS)

    Bouricius, W. G.; Flehinger, B. J.

    1973-01-01

    Product reliabilities are predicted from past failure rates and a reasonable estimate of future failure rates. The algorithm is used to calculate the probability that the product will function correctly. It sums the probability of each survival pattern, multiplied by the number of permutations for that pattern, over all possible ways in which the product can survive.

  16. Modelling the spreading rate of controlled communicable epidemics through an entropy-based thermodynamic model

    NASA Astrophysics Data System (ADS)

    Wang, WenBin; Wu, ZiNiu; Wang, ChunFeng; Hu, RuiFeng

    2013-11-01

    A model based on a thermodynamic approach is proposed for predicting the dynamics of communicable epidemics assumed to be governed by control efforts on multiple scales, so that an entropy is associated with the system. All the epidemic details are factored into a single, time-dependent coefficient; the functional form of this coefficient is found through four constraints, including notably the existence of an inflexion point and a maximum. The model is solved to give a log-normal distribution for the spread rate, for which a Shannon entropy can be defined. The only parameter, which characterizes the width of the distribution function, is uniquely determined by maximizing the rate of entropy production. This entropy-based thermodynamic (EBT) model predicts the number of hospitalized cases with reasonable accuracy for SARS in the year 2003. The EBT model can be of use for potential epidemics such as avian influenza and H7N9 in China.
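
    For reference, a standard result not spelled out in the abstract: if the spread rate follows a log-normal distribution with parameters μ and σ, its Shannon (differential) entropy is

```latex
h = \mu + \tfrac{1}{2}\,\ln\!\left(2 \pi e \, \sigma^{2}\right),
```

    so the width parameter σ is exactly the quantity that the maximization of the entropy production rate described above pins down.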

  17. Permutation-based inference for the AUC: A unified approach for continuous and discontinuous data.

    PubMed

    Pauly, Markus; Asendorf, Thomas; Konietschke, Frank

    2016-11-01

    We investigate rank-based studentized permutation methods for the nonparametric Behrens-Fisher problem, that is, inference methods for the area under the ROC curve. We hereby prove that the studentized permutation distribution of the Brunner-Munzel rank statistic is asymptotically standard normal, even under the alternative, thus incidentally providing the hitherto missing theoretical foundation for the Neubert and Brunner studentized permutation test. In particular, we not only show its consistency but also show that confidence intervals for the underlying treatment effects can be computed by inverting this permutation test. In addition, we derive permutation-based range-preserving confidence intervals. Extensive simulation studies show that the permutation-based confidence intervals appear to maintain the preassigned coverage probability quite accurately (even for rather small sample sizes). For a convenient application of the proposed methods, a freely available software package for the statistical software R has been developed. A real data example illustrates the application. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
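
    A minimal illustration of a studentized permutation test built around the Brunner-Munzel statistic, using SciPy's implementation of the statistic; this sketch is not the authors' R package, and the sample sizes and permutation count are arbitrary.

```python
import numpy as np
from scipy.stats import brunnermunzel

def studentized_permutation_test(x, y, n_perm=5000, seed=0):
    rng = np.random.default_rng(seed)
    observed = brunnermunzel(x, y).statistic
    pooled = np.concatenate([x, y])
    n_x = len(x)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        stat = brunnermunzel(perm[:n_x], perm[n_x:]).statistic
        if abs(stat) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)

rng = np.random.default_rng(4)
x = rng.standard_normal(30)
y = rng.standard_normal(25) + 0.8      # shifted second sample
print(studentized_permutation_test(x, y))
```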

  18. Visual field progression in glaucoma: estimating the overall significance of deterioration with permutation analyses of pointwise linear regression (PoPLR).

    PubMed

    O'Leary, Neil; Chauhan, Balwantray C; Artes, Paul H

    2012-10-01

    The aim was to establish a method for estimating the overall statistical significance of visual field deterioration from an individual patient's data, and to compare its performance to pointwise linear regression. The Truncated Product Method was used to calculate a statistic S that combines evidence of deterioration from individual test locations in the visual field. The overall statistical significance (P value) of visual field deterioration was inferred by comparing S with its permutation distribution, derived from repeated reordering of the visual field series. Permutation of pointwise linear regression (PoPLR) and pointwise linear regression were evaluated in data from patients with glaucoma (944 eyes, median mean deviation -2.9 dB, interquartile range: -6.3, -1.2 dB) followed for more than 4 years (median 10 examinations over 8 years). False-positive rates were estimated from randomly reordered series of this dataset, and hit rates (proportion of eyes with significant deterioration) were estimated from the original series. The false-positive rates of PoPLR were indistinguishable from the corresponding nominal significance levels and were independent of baseline visual field damage and length of follow-up. At P < 0.05, the hit rates of PoPLR were 12, 29, and 42% at the fifth, eighth, and final examinations, respectively, and at matching specificities they were consistently higher than those of pointwise linear regression. In contrast to population-based progression analyses, PoPLR provides a continuous estimate of statistical significance for visual field deterioration individualized to a particular patient's data. This allows close control over specificity, which is essential for monitoring patients in clinical practice and in clinical trials.
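
    A schematic re-implementation of the idea (not the authors' code; the truncation threshold and data are illustrative): pointwise linear regression gives one one-sided p-value per visual field location, the evidence is combined with a truncated product statistic, and the permutation distribution is obtained by reordering the examinations in the series.

```python
import numpy as np
from scipy.stats import linregress

def combined_statistic(series, times, tau=0.05):
    """Truncated product of pointwise one-sided (deterioration) p-values, on a -log scale."""
    s = 0.0
    for loc in range(series.shape[1]):
        fit = linregress(times, series[:, loc])
        p_two = fit.pvalue
        p_one = p_two / 2 if fit.slope < 0 else 1 - p_two / 2   # one-sided: negative slope
        if p_one <= tau:
            s += -np.log(p_one)
    return s

def poplr_like(series, times, n_perm=1000, seed=0):
    rng = np.random.default_rng(seed)
    observed = combined_statistic(series, times)
    null = np.array([combined_statistic(series[rng.permutation(len(times))], times)
                     for _ in range(n_perm)])
    return observed, (np.sum(null >= observed) + 1) / (n_perm + 1)

# Synthetic example: 10 examinations x 52 locations, a few locations deteriorating.
rng = np.random.default_rng(5)
times = np.arange(10, dtype=float)
series = rng.normal(28, 1.5, size=(10, 52))
series[:, :5] -= 0.6 * times[:, None]        # true deterioration at 5 locations
print(poplr_like(series, times))
```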

  19. Efficient Blockwise Permutation Tests Preserving Exchangeability

    PubMed Central

    Zhou, Chunxiao; Zwilling, Chris E.; Calhoun, Vince D.; Wang, Michelle Y.

    2014-01-01

    In this paper, we present a new blockwise permutation test approach based on the moments of the test statistic. The method is of importance to neuroimaging studies. In order to preserve the exchangeability condition required in permutation tests, we divide the entire set of data into certain exchangeability blocks. In addition, computationally efficient moments-based permutation tests are performed by approximating the permutation distribution of the test statistic with the Pearson distribution series. This involves the calculation of the first four moments of the permutation distribution within each block and then over the entire set of data. The accuracy and efficiency of the proposed method are demonstrated through simulated experiment on the magnetic resonance imaging (MRI) brain data, specifically the multi-site voxel-based morphometry analysis from structural MRI (sMRI). PMID:25289113

  20. Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.

    PubMed

    Groppe, David M; Urbach, Thomas P; Kutas, Marta

    2011-12-01

    Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.

  1. An AUC-based permutation variable importance measure for random forests

    PubMed Central

    2013-01-01

    Background The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. Results We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. Conclusions The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html. PMID:23560875
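
    The article's implementation lives in the R package party; as a conceptually similar illustration in Python (not the authors' code, and computed on a hold-out set rather than on out-of-bag samples), scikit-learn's permutation importance can be scored with the AUC instead of accuracy:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Unbalanced binary classification problem with a few informative predictors.
X, y = make_classification(n_samples=2000, n_features=10, n_informative=3,
                           weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Permutation importance scored by AUC versus the usual accuracy-style scoring.
auc_vim = permutation_importance(rf, X_test, y_test, scoring="roc_auc",
                                 n_repeats=20, random_state=0)
acc_vim = permutation_importance(rf, X_test, y_test, scoring="accuracy",
                                 n_repeats=20, random_state=0)
for j in np.argsort(auc_vim.importances_mean)[::-1][:5]:
    print(f"feature {j}: AUC-based VIM = {auc_vim.importances_mean[j]:.4f}, "
          f"accuracy-based VIM = {acc_vim.importances_mean[j]:.4f}")
```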

  2. An AUC-based permutation variable importance measure for random forests.

    PubMed

    Janitza, Silke; Strobl, Carolin; Boulesteix, Anne-Laure

    2013-04-05

    The random forest (RF) method is a commonly used tool for classification with high dimensional data as well as for ranking candidate predictors based on the so-called random forest variable importance measures (VIMs). However the classification performance of RF is known to be suboptimal in case of strongly unbalanced data, i.e. data where response class sizes differ considerably. Suggestions were made to obtain better classification performance based either on sampling procedures or on cost sensitivity analyses. However to our knowledge the performance of the VIMs has not yet been examined in the case of unbalanced response classes. In this paper we explore the performance of the permutation VIM for unbalanced data settings and introduce an alternative permutation VIM based on the area under the curve (AUC) that is expected to be more robust towards class imbalance. We investigated the performance of the standard permutation VIM and of our novel AUC-based permutation VIM for different class imbalance levels using simulated data and real data. The results suggest that the new AUC-based permutation VIM outperforms the standard permutation VIM for unbalanced data settings while both permutation VIMs have equal performance for balanced data settings. The standard permutation VIM loses its ability to discriminate between associated predictors and predictors not associated with the response for increasing class imbalance. It is outperformed by our new AUC-based permutation VIM for unbalanced data settings, while the performance of both VIMs is very similar in the case of balanced classes. The new AUC-based VIM is implemented in the R package party for the unbiased RF variant based on conditional inference trees. The codes implementing our study are available from the companion website: http://www.ibe.med.uni-muenchen.de/organisation/mitarbeiter/070_drittmittel/janitza/index.html.

  3. Intracellular signaling entropy can be a biomarker for predicting the development of cervical intraepithelial neoplasia.

    PubMed

    Sato, Masakazu; Kawana, Kei; Adachi, Katsuyuki; Fujimoto, Asaha; Yoshida, Mitsuyo; Nakamura, Hiroe; Nishida, Haruka; Inoue, Tomoko; Taguchi, Ayumi; Ogishima, Juri; Eguchi, Satoko; Yamashita, Aki; Tomio, Kensuke; Wada-Hiraike, Osamu; Oda, Katsutoshi; Nagamatsu, Takeshi; Osuga, Yutaka; Fujii, Tomoyuki

    2017-01-01

    While mortality rates for cervical cancer have been drastically reduced since the introduction of the Pap smear test, it is still one of the leading causes of death in women worldwide. Additionally, studies that appropriately evaluate the risk of developing cervical lesions are needed. Therefore, we investigated whether intracellular signaling entropy, which is measured with microarray data, could be useful for predicting the risks of developing cervical lesions. We used three datasets, GSE63514 (histology), GSE27678 (cytology) and GSE75132 (cytology, a prospective study). From the data in GSE63514, the entropy rate was significantly increased with disease progression (normal < cervical intraepithelial neoplasia, CIN < cancer) (Kruskal-Wallis test, p < 0.0001). From the data in GSE27678, similar results (normal < low-grade squamous intraepithelial lesions, LSILs < high-grade squamous intraepithelial lesions, HSILs ≤ cancer) were obtained (Kruskal-Wallis test, p < 0.001). From the data in GSE75132, the entropy rate tended to be higher in the HPV-persistent groups than in the HPV-negative group. The group destined to progress to CIN 3 or higher tended to have a higher entropy rate than the HPV16-positive group without progression. In conclusion, signaling entropy appears to differ between lesion statuses and could be a useful biomarker for predicting the development of cervical intraepithelial neoplasia.

  4. Intracellular signaling entropy can be a biomarker for predicting the development of cervical intraepithelial neoplasia

    PubMed Central

    Sato, Masakazu; Adachi, Katsuyuki; Fujimoto, Asaha; Yoshida, Mitsuyo; Nakamura, Hiroe; Nishida, Haruka; Inoue, Tomoko; Taguchi, Ayumi; Ogishima, Juri; Eguchi, Satoko; Yamashita, Aki; Tomio, Kensuke; Wada-Hiraike, Osamu; Oda, Katsutoshi; Nagamatsu, Takeshi; Osuga, Yutaka; Fujii, Tomoyuki

    2017-01-01

    While mortality rates for cervical cancer have been drastically reduced since the introduction of the Pap smear test, it is still one of the leading causes of death in women worldwide. Additionally, studies that appropriately evaluate the risk of developing cervical lesions are needed. Therefore, we investigated whether intracellular signaling entropy, which is measured with microarray data, could be useful for predicting the risks of developing cervical lesions. We used three datasets, GSE63514 (histology), GSE27678 (cytology) and GSE75132 (cytology, a prospective study). From the data in GSE63514, the entropy rate was significantly increased with disease progression (normal < cervical intraepithelial neoplasia, CIN < cancer) (Kruskal-Wallis test, p < 0.0001). From the data in GSE27678, similar results (normal < low-grade squamous intraepithelial lesions, LSILs < high-grade squamous intraepithelial lesions, HSILs ≤ cancer) were obtained (Kruskal-Wallis test, p < 0.001). From the data in GSE75132, the entropy rate tended to be higher in the HPV-persistent groups than in the HPV-negative group. The group destined to progress to CIN 3 or higher tended to have a higher entropy rate than the HPV16-positive group without progression. In conclusion, signaling entropy appears to differ between lesion statuses and could be a useful biomarker for predicting the development of cervical intraepithelial neoplasia. PMID:28453530

  5. Inference With Difference-in-Differences With a Small Number of Groups: A Review, Simulation Study, and Empirical Application Using SHARE Data.

    PubMed

    Rokicki, Slawa; Cohen, Jessica; Fink, Günther; Salomon, Joshua A; Landrum, Mary Beth

    2018-01-01

    Difference-in-differences (DID) estimation has become increasingly popular as an approach to evaluate the effect of a group-level policy on individual-level outcomes. Several statistical methodologies have been proposed to correct for the within-group correlation of model errors resulting from the clustering of data. Little is known about how well these corrections perform with the often small number of groups observed in health research using longitudinal data. First, we review the most commonly used modeling solutions in DID estimation for panel data, including generalized estimating equations (GEE), permutation tests, clustered standard errors (CSE), wild cluster bootstrapping, and aggregation. Second, we compare the empirical coverage rates and power of these methods using a Monte Carlo simulation study in scenarios in which we vary the degree of error correlation, the group size balance, and the proportion of treated groups. Third, we provide an empirical example using the Survey of Health, Ageing, and Retirement in Europe. When the number of groups is small, CSE are systematically biased downwards in scenarios when data are unbalanced or when there is a low proportion of treated groups. This can result in over-rejection of the null even when data are composed of up to 50 groups. Aggregation, permutation tests, bias-adjusted GEE, and wild cluster bootstrap produce coverage rates close to the nominal rate for almost all scenarios, though GEE may suffer from low power. In DID estimation with a small number of groups, analysis using aggregation, permutation tests, wild cluster bootstrap, or bias-adjusted GEE is recommended.
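
    A minimal Python sketch of the group-level permutation (randomization) test discussed above. The column names (group, treated, post, y), the synthetic data, and the simple two-by-two-of-means DID estimate are illustrative assumptions, not the authors' code or data.

      import numpy as np
      import pandas as pd

      def did_estimate(df):
          # DID from cell means: (treated post - treated pre) - (control post - control pre).
          m = df.groupby(["treated", "post"])["y"].mean()
          return (m.loc[(1, 1)] - m.loc[(1, 0)]) - (m.loc[(0, 1)] - m.loc[(0, 0)])

      def permutation_pvalue(df, n_perm=999, seed=0):
          rng = np.random.default_rng(seed)
          observed = did_estimate(df)
          groups = df["group"].unique()
          n_treated = int(df.groupby("group")["treated"].first().sum())
          stats = []
          for _ in range(n_perm):
              # Reassign treatment at the group level and recompute the estimate.
              treated_groups = rng.choice(groups, size=n_treated, replace=False)
              shuffled = df.assign(treated=df["group"].isin(treated_groups).astype(int))
              stats.append(did_estimate(shuffled))
          stats = np.array(stats)
          p = (np.sum(np.abs(stats) >= abs(observed)) + 1) / (n_perm + 1)
          return observed, p

      # Synthetic example: 8 groups, 3 treated, true treatment effect of 1.0.
      rng = np.random.default_rng(0)
      rows = []
      for g in range(8):
          treated, group_effect = int(g < 3), rng.normal(0, 0.5)
          for post in (0, 1):
              for _ in range(30):
                  rows.append({"group": g, "treated": treated, "post": post,
                               "y": group_effect + 0.3 * post + 1.0 * treated * post + rng.normal()})
      df = pd.DataFrame(rows)
      print(permutation_pvalue(df))

    With only eight groups there are few distinct reassignments, which is exactly the small-cluster regime the review addresses; the permutation p-value remains valid even then.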

  6. The Primordial Entropy of Jupiter

    NASA Astrophysics Data System (ADS)

    Cumming, Andrew; Helled, Ravit; Venturini, Julia

    2018-04-01

    The formation history of giant planets determines their primordial structure and consequent evolution. We simulate various formation paths of Jupiter to determine its primordial entropy, and find that a common outcome is for proto-Jupiter to have non-convective regions in its interior. We use planet formation models to calculate how the entropy and post-formation luminosity depend on model properties such as the solid accretion rate and opacity, and show that the gas accretion rate and its time evolution play a key role in determining the entropy profile. The predicted luminosity of Jupiter shortly after formation varies by a factor of 2-3 for different choices of model parameters. We find that entropy gradients inside Jupiter persist for ˜10 Myr after formation. We suggest that these gradients should be considered together with heavy-element composition gradients when modeling Jupiter's evolution and internal structure.

  7. The primordial entropy of Jupiter

    NASA Astrophysics Data System (ADS)

    Cumming, Andrew; Helled, Ravit; Venturini, Julia

    2018-07-01

    The formation history of giant planets determines their primordial structure and consequent evolution. We simulate various formation paths of Jupiter to determine its primordial entropy, and find that a common outcome is for proto-Jupiter to have non-convective regions in its interior. We use planet formation models to calculate how the entropy and post-formation luminosity depend on model properties such as the solid accretion rate and opacity, and show that the gas accretion rate and its time evolution play a key role in determining the entropy profile. The predicted luminosity of Jupiter shortly after formation varies by a factor of 2-3 for different choices of model parameters. We find that entropy gradients inside Jupiter persist for ˜10 Myr after formation. We suggest that these gradients should be considered together with heavy-element composition gradients when modelling Jupiter's evolution and internal structure.

  8. Metabolic networks evolve towards states of maximum entropy production.

    PubMed

    Unrean, Pornkamol; Srienc, Friedrich

    2011-11-01

    A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such a state, we carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such a reduced metabolic network, metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations, the specific growth rate of the strain continuously increased, together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted from the state when the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles. Copyright © 2011 Elsevier Inc. All rights reserved.

  9. Stochastic thermodynamics and entropy production of chemical reaction systems

    NASA Astrophysics Data System (ADS)

    Tomé, Tânia; de Oliveira, Mário J.

    2018-06-01

    We investigate the nonequilibrium stationary states of systems consisting of chemical reactions among molecules of several chemical species. To this end, we introduce and develop a stochastic formulation of nonequilibrium thermodynamics of chemical reaction systems based on a master equation defined on the space of microscopic chemical states and on appropriate definitions of entropy and entropy production. The system is in contact with a heat reservoir and is placed out of equilibrium by the contact with particle reservoirs. In our approach, the fluxes of various types, such as the heat and particle fluxes, play a fundamental role in characterizing the nonequilibrium chemical state. We show that the rate of entropy production in the stationary nonequilibrium state is a bilinear form in the affinities and the fluxes of reaction, which are expressed in terms of rate constants and transition rates, respectively. We also show how the description in terms of microscopic states can be reduced to a description in terms of the numbers of particles of each species, from which follows the chemical master equation. As an example, we calculate the rate of entropy production of the first and second Schlögl reaction models.

  10. Using heteroclinic orbits to quantify topological entropy in fluid flows

    NASA Astrophysics Data System (ADS)

    Sattari, Sulimon; Chen, Qianting; Mitchell, Kevin A.

    2016-03-01

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or "ghost," rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.

  11. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S [Santa Fe, NM; Cabantous, Stephanie [Los Alamos, NM

    2008-06-24

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  12. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S; Cabantous, Stephanie

    2013-02-12

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  13. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S [Santa Fe, NM; Cabantous, Stephanie [Los Alamos, NM

    2011-06-14

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  14. Circular permutant GFP insertion folding reporters

    DOEpatents

    Waldo, Geoffrey S.; Cabantous, Stephanie

    2013-04-16

    Provided are methods of assaying and improving protein folding using circular permutants of fluorescent proteins, including circular permutants of GFP variants and combinations thereof. The invention further provides various nucleic acid molecules and vectors incorporating such nucleic acid molecules, comprising polynucleotides encoding fluorescent protein circular permutants derived from superfolder GFP, which polynucleotides include an internal cloning site into which a heterologous polynucleotide may be inserted in-frame with the circular permutant coding sequence, and which when expressed are capable of reporting on the degree to which a polypeptide encoded by such an inserted heterologous polynucleotide is correctly folded by correlation with the degree of fluorescence exhibited.

  15. Sample entropy analysis for the estimating depth of anaesthesia through human EEG signal at different levels of unconsciousness during surgeries.

    PubMed

    Liu, Quan; Ma, Li; Fan, Shou-Zen; Abbod, Maysam F; Shieh, Jiann-Shing

    2018-01-01

    Estimating the depth of anaesthesia (DoA) in operations has always been a challenging issue due to the underlying complexity of the brain mechanisms. Electroencephalogram (EEG) signals are undoubtedly the most widely used signals for measuring DoA. In this paper, a novel EEG-based index is proposed to evaluate DoA for 24 patients receiving general anaesthesia with different levels of unconsciousness. The Sample Entropy (SampEn) algorithm was utilised in order to acquire the chaotic features of the signals. After calculating the SampEn from the EEG signals, Random Forest was utilised for developing learning regression models with the Bispectral index (BIS) as the target. Correlation coefficient, mean absolute error, and area under the curve (AUC) were used to verify the perioperative performance of the proposed method. Validation comparisons with typical nonstationary signal analysis methods (i.e., recurrence analysis and permutation entropy) and regression methods (i.e., neural network and support vector machine) were conducted. To further verify the accuracy and validity of the proposed methodology, the data were divided into four unconsciousness-level groups on the basis of BIS levels. Subsequently, analysis of variance (ANOVA) was applied to the corresponding index (i.e., regression output). Results indicate that the correlation coefficient improved to 0.72 ± 0.09 after filtering and to 0.90 ± 0.05 after regression from the initial values of 0.51 ± 0.17. Similarly, the final mean absolute error dramatically declined to 5.22 ± 2.12. In addition, the ultimate AUC increased to 0.98 ± 0.02, and the ANOVA analysis indicates that each of the four groups of different anaesthetic levels demonstrated a significant difference from the nearest levels. Furthermore, the Random Forest output was largely linear in relation to BIS, and thus gave better DoA prediction accuracy. In conclusion, the proposed method provides a concrete basis for monitoring patients' anaesthetic level during surgeries.
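
    For reference, a minimal Python sketch of the sample entropy feature used above; m = 2 and r = 0.2 times the signal's standard deviation are common defaults, not necessarily the paper's settings.

      import numpy as np

      def sample_entropy(x, m=2, r=None):
          x = np.asarray(x, dtype=float)
          r = 0.2 * np.std(x) if r is None else r
          n = len(x)

          def count_matches(dim):
              # Count template pairs within tolerance r (Chebyshev distance),
              # excluding self-matches.
              templates = np.array([x[i:i + dim] for i in range(n - dim)])
              total = 0
              for i in range(len(templates) - 1):
                  dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                  total += int(np.sum(dist <= r))
              return total

          b, a = count_matches(m), count_matches(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      # A regular sine wave scores much lower than white noise.
      rng = np.random.default_rng(0)
      t = np.linspace(0, 10, 1000)
      print(sample_entropy(np.sin(2 * np.pi * t)), sample_entropy(rng.normal(size=1000)))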

  16. Entropy as a measure of diffusion

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, Amir; Fatollahi, Amir H.; Khorrami, Mohammad; Shariati, Ahmad

    2013-10-01

    The time variation of entropy, as an alternative to the variance, is proposed as a measure of the diffusion rate. It is shown that for linear and time-translationally invariant systems having a large-time limit for the density, at large times the entropy tends exponentially to a constant. For systems with no stationary density, at large times the entropy is logarithmic with a coefficient specifying the speed of the diffusion. As an example, the large-time behaviors of the entropy and the variance are compared for various types of fractional-derivative diffusions.
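
    A small numerical illustration of the logarithmic growth described above, under the assumption of ordinary one-dimensional Brownian diffusion (a sketch, not the paper's calculation): the differential entropy of the walker distribution should grow like (1/2) ln t, since the exact value for a Gaussian of variance 2Dt is 0.5 ln(4*pi*e*D*t).

      import numpy as np

      rng = np.random.default_rng(1)
      n_walkers, n_steps, dt, D = 50_000, 1000, 0.01, 1.0
      x = np.zeros(n_walkers)

      def hist_entropy(samples, bins=200):
          # Differential entropy estimate from a normalized histogram.
          p, edges = np.histogram(samples, bins=bins, density=True)
          widths = np.diff(edges)
          mask = p > 0
          return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

      times, entropies = [], []
      for step in range(1, n_steps + 1):
          x += np.sqrt(2 * D * dt) * rng.normal(size=n_walkers)
          if step % 100 == 0:
              times.append(step * dt)
              entropies.append(hist_entropy(x))

      slope = np.polyfit(np.log(times), entropies, 1)[0]
      print("coefficient of ln t:", round(slope, 3))   # expected to approach 0.5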

  17. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    NASA Astrophysics Data System (ADS)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.
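
    As a concrete, hedged example of the irreversibility quantity involved, the sketch below computes the information entropy production rate of a stationary finite Markov chain from its transition matrix; the three-state matrix is a hypothetical example, not taken from the paper.

      # e_p = sum_{i,j} pi_i P_ij ln( pi_i P_ij / (pi_j P_ji) ),
      # which vanishes exactly when detailed balance holds.
      import numpy as np

      def stationary_distribution(P):
          # Left eigenvector of P for eigenvalue 1, normalized to sum to 1.
          w, v = np.linalg.eig(P.T)
          pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
          return pi / pi.sum()

      def entropy_production_rate(P):
          pi = stationary_distribution(P)
          ep = 0.0
          for i in range(len(pi)):
              for j in range(len(pi)):
                  if P[i, j] > 0 and P[j, i] > 0:
                      flux = pi[i] * P[i, j]
                      ep += flux * np.log(flux / (pi[j] * P[j, i]))
          return ep

      P = np.array([[0.1, 0.6, 0.3],
                    [0.2, 0.2, 0.6],
                    [0.7, 0.2, 0.1]])
      print(entropy_production_rate(P))   # > 0, so this chain is irreversible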

  18. Probing and exploiting the chaotic dynamics of a hydrodynamic photochemical oscillator to implement all the basic binary logic functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hayashi, Kenta; Department of Chemistry, Biology, and Biotechnology, University of Perugia, 06123 Perugia; Gotoda, Hiroshi

    2016-05-15

    The convective motions within a solution of a photochromic spiro-oxazine irradiated by UV light only on the bottom part of its volume give rise to aperiodic spectrophotometric dynamics. In this paper, we study three nonlinear properties of the aperiodic time series: permutation entropy, short-term predictability and long-term unpredictability, and the degree distribution of the visibility graph networks. After ascertaining the extracted chaotic features, we show how the aperiodic time series can be exploited to implement all the fundamental two-input binary logic functions (AND, OR, NAND, NOR, XOR, and XNOR) and some basic arithmetic operations (half-adder, full-adder, half-subtractor). This is possible due to the wide range of states a nonlinear system accesses in the course of its evolution. Therefore, the solution of the convective photochemical oscillator provides hardware for chaos computing as an alternative to conventional complementary metal-oxide-semiconductor-based integrated circuits.

  19. Modelling the structure of Zr-rich Pb(Zr1-xTix)O3, x = 0.4 by a multiphase approach.

    PubMed

    Bogdanov, Alexander; Mysovsky, Andrey; Pickard, Chris J; Kimmel, Anna V

    2016-10-12

    Solid solution perovskite Pb(Zr1-xTix)O3 (PZT) is an industrially important material. Despite the long history of experimental and theoretical studies, the structure of this material is still under intensive discussion. In this work, we have applied structure searching coupled with density functional theory methods to provide a multiphase description of this material at x = 0.4. We demonstrate that the permutational freedom of B-site cations leads to the stabilisation of a variety of local phases reflecting a relatively flat energy landscape of PZT. Using a set of predicted local phases we reproduce the experimental pair distribution function (PDF) profile with high accuracy. We introduce a complex multiphase picture of the structure of PZT and show that additional monoclinic and rhombohedral phases account for a better description of the experimental PDF profile. We propose that such a multiphase picture reflects the entropy reached in the sample during the preparation process.

  20. [Portable Epileptic Seizure Monitoring Intelligent System Based on Android System].

    PubMed

    Liang, Zhenhu; Wu, Shufeng; Yang, Chunlin; Jiang, Zhenzhou; Yu, Tao; Lu, Chengbiao; Li, Xiaoli

    2016-02-01

    Clinical electroencephalogram (EEG) monitoring systems based on personal computers cannot meet the requirements of portability and home usage. Epilepsy patients have to be monitored in hospital for extended periods of time, which imposes a heavy burden on hospitals. In the present study, we designed a portable 16-lead networked monitoring system based on an Android smartphone. The system uses technologies including active electrodes, WiFi wireless transmission, the multi-scale permutation entropy (MPE) algorithm, and the back-propagation (BP) neural network algorithm. Moreover, the Android mobile application software can process and analyze EEG data, display EEG waveforms, and raise an alarm on epileptic seizure. The system has been tested on mobile phones running Android 2.3 or higher, and the results showed that the software ran accurately and stably in the detection of epileptic seizures. In conclusion, this paper provides a portable and reliable solution for epileptic seizure monitoring in clinical and home applications.
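
    A minimal Python sketch of the multi-scale permutation entropy (MPE) feature mentioned above; the order m, delay, and scale range are common choices rather than the system's actual parameters.

      import math
      from itertools import permutations
      import numpy as np

      def permutation_entropy(x, m=3, delay=1, normalize=True):
          # Count ordinal patterns of order m along the signal.
          counts = {p: 0 for p in permutations(range(m))}
          n = len(x) - (m - 1) * delay
          for i in range(n):
              window = x[i:i + m * delay:delay]
              counts[tuple(int(k) for k in np.argsort(window))] += 1
          p = np.array([c for c in counts.values() if c > 0], dtype=float)
          p /= p.sum()
          h = -np.sum(p * np.log(p))
          return h / math.log(math.factorial(m)) if normalize else h

      def coarse_grain(x, scale):
          # Non-overlapping averages of length `scale` (standard multi-scale step).
          n = len(x) // scale
          return x[:n * scale].reshape(n, scale).mean(axis=1)

      def multiscale_permutation_entropy(x, m=3, max_scale=5):
          x = np.asarray(x, dtype=float)
          return [permutation_entropy(coarse_grain(x, s), m=m) for s in range(1, max_scale + 1)]

      rng = np.random.default_rng(0)
      print(np.round(multiscale_permutation_entropy(rng.normal(size=3000)), 3))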

  1. A new methodology for automated diagnosis of mild cognitive impairment (MCI) using magnetoencephalography (MEG).

    PubMed

    Amezquita-Sanchez, Juan P; Adeli, Anahita; Adeli, Hojjat

    2016-05-15

    Mild cognitive impairment (MCI) is a cognitive disorder characterized by memory impairment, greater than expected for age. A new methodology is presented to identify MCI patients during a working memory task using MEG signals. The methodology consists of four steps: In step 1, the complete ensemble empirical mode decomposition (CEEMD) is used to decompose the MEG signal into a set of adaptive sub-bands according to its contained frequency information. In step 2, a nonlinear dynamics measure based on permutation entropy (PE) analysis is employed to analyze the sub-bands and detect features to be used for MCI detection. In step 3, analysis of variance (ANOVA) is used for feature selection. In step 4, the enhanced probabilistic neural network (EPNN) classifier is applied to the selected features to distinguish between MCI and healthy patients. The usefulness and effectiveness of the proposed methodology are validated using the sensed MEG data obtained experimentally from 18 MCI and 19 control patients. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. An improvement of the measurement of time series irreversibility with visibility graph approach

    NASA Astrophysics Data System (ADS)

    Wu, Zhenyu; Shang, Pengjian; Xiong, Hui

    2018-07-01

    We propose a method to improve the measurement of real-valued time series irreversibility that combines two tools: the directed horizontal visibility graph and the Kullback-Leibler divergence. The degree of time irreversibility is estimated by the Kullback-Leibler divergence between the in- and out-degree distributions of the associated visibility graph. In our work, we reframe the in- and out-degree distributions by encoding them with the different embedding dimensions used in calculating permutation entropy (PE). With this improved method, we can not only estimate time series irreversibility efficiently but also detect it across multiple dimensions. We verify the validity of our method and then estimate the amount of time irreversibility of series generated by chaotic maps as well as of global stock markets over the period 2005-2015. The results show that the amount of time irreversibility peaks at embedding dimension d = 3 for both the simulated and the financial-market series.
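
    For orientation, a hedged Python sketch of the base measure only (directed horizontal visibility graph plus Kullback-Leibler divergence between out- and in-degree distributions). It does not include the paper's embedding-dimension refinement, and dropping bins where either probability is zero is a practical simplification.

      import numpy as np

      def dhvg_degrees(x):
          x = np.asarray(x, dtype=float)
          n = len(x)
          k_out = np.zeros(n, dtype=int)
          k_in = np.zeros(n, dtype=int)
          for i in range(n):
              top = -np.inf                      # running max of values strictly between i and j
              for j in range(i + 1, n):
                  if x[i] > top and x[j] > top:  # horizontal visibility criterion
                      k_out[i] += 1
                      k_in[j] += 1
                  top = max(top, x[j])
                  if top >= x[i]:                # node i is blocked from now on
                      break
          return k_out, k_in

      def degree_kl(k_out, k_in):
          k_max = max(k_out.max(), k_in.max())
          p_out = np.bincount(k_out, minlength=k_max + 1) / len(k_out)
          p_in = np.bincount(k_in, minlength=k_max + 1) / len(k_in)
          mask = (p_out > 0) & (p_in > 0)
          return float(np.sum(p_out[mask] * np.log(p_out[mask] / p_in[mask])))

      # Reversible white noise should score near zero; the fully chaotic logistic
      # map should give a clearly positive (irreversible) value.
      rng = np.random.default_rng(0)
      logistic = np.empty(3000); logistic[0] = 0.4
      for t in range(1, len(logistic)):
          logistic[t] = 4.0 * logistic[t - 1] * (1.0 - logistic[t - 1])
      print(degree_kl(*dhvg_degrees(rng.normal(size=3000))),
            degree_kl(*dhvg_degrees(logistic)))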

  3. Experiments and Model for Serration Statistics in Low-Entropy, Medium-Entropy, and High-Entropy Alloys

    DOE PAGES

    Carroll, Robert; Lee, Chi; Tsai, Che-Wei; ...

    2015-11-23

    In this study, high-entropy alloys (HEAs) are new alloys that contain five or more elements in roughly equal proportion. We present new experiments and theory on the deformation behavior of HEAs under slow stretching (straining), and observe differences compared to conventional alloys with fewer elements. For a specific range of temperatures and strain-rates, HEAs deform in a jerky way, with sudden slips that make it difficult to precisely control the deformation. An analytic model explains these slips as avalanches of slipping weak spots and predicts the observed slip statistics, stress-strain curves, and their dependence on temperature, strain-rate, and material composition. The ratio of the weak spots’ healing rate to the strain-rate is the main tuning parameter, reminiscent of the Portevin-Le Chatelier effect and time-temperature superposition in polymers. Our model predictions agree with the experimental results. The proposed widely-applicable deformation mechanism is useful for deformation control and alloy design.

  4. Image encryption using a synchronous permutation-diffusion technique

    NASA Astrophysics Data System (ADS)

    Enayatifar, Rasul; Abdullah, Abdul Hanan; Isnin, Ismail Fauzi; Altameem, Ayman; Lee, Malrey

    2017-03-01

    In the past decade, interest in digital image security has increased among scientists. A synchronous permutation and diffusion technique is designed in order to protect gray-level image content while sending it over the Internet. To implement the proposed method, the two-dimensional plain image is converted to one dimension. Afterward, in order to reduce the transmission time, the permutation and diffusion steps for each pixel are performed at the same time. The permutation step uses a chaotic map and deoxyribonucleic acid (DNA) coding to permute a pixel, while diffusion employs a DNA sequence and DNA operators to encrypt the pixel. Experimental results and extensive security analyses demonstrate the feasibility and validity of the proposed image encryption method.
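
    A deliberately simplified toy sketch of the synchronous permutation-diffusion structure: a logistic chaotic map drives both the pixel permutation and an XOR diffusion in a single pass. The paper's DNA coding and DNA operators are replaced here by plain XOR, so this illustrates the structure only; it is neither a faithful reimplementation nor a secure cipher.

      import numpy as np

      def logistic_stream(n, x0=0.3141, r=3.9999):
          # Chaotic key stream in (0, 1) from the logistic map; x0 acts as the key.
          x, out = x0, np.empty(n)
          for i in range(n):
              x = r * x * (1.0 - x)
              out[i] = x
          return out

      def encrypt(flat_img, x0=0.3141):
          n = flat_img.size
          keys = logistic_stream(2 * n, x0)
          perm = np.argsort(keys[:n])                 # chaotic permutation of pixel positions
          stream = (keys[n:] * 256).astype(np.uint8)  # diffusion key stream
          cipher = np.empty(n, dtype=np.uint8)
          prev = np.uint8(0)
          for k in range(n):                          # permutation and diffusion in one pass
              cipher[k] = flat_img[perm[k]] ^ stream[k] ^ prev
              prev = cipher[k]
          return cipher

      def decrypt(cipher, x0=0.3141):
          n = cipher.size
          keys = logistic_stream(2 * n, x0)
          perm = np.argsort(keys[:n])
          stream = (keys[n:] * 256).astype(np.uint8)
          flat = np.empty(n, dtype=np.uint8)
          prev = np.uint8(0)
          for k in range(n):
              flat[perm[k]] = cipher[k] ^ stream[k] ^ prev
              prev = cipher[k]
          return flat

      rng = np.random.default_rng(0)
      img = rng.integers(0, 256, size=64 * 64, dtype=np.uint8)   # flattened toy image
      assert np.array_equal(decrypt(encrypt(img)), img)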

  5. Linear growth of the entanglement entropy and the Kolmogorov-Sinai rate

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Hackl, Lucas; Yokomizo, Nelson

    2018-03-01

    The rate of entropy production in a classical dynamical system is characterized by the Kolmogorov-Sinai entropy rate h KS given by the sum of all positive Lyapunov exponents of the system. We prove a quantum version of this result valid for bosonic systems with unstable quadratic Hamiltonian. The derivation takes into account the case of time-dependent Hamiltonians with Floquet instabilities. We show that the entanglement entropy S A of a Gaussian state grows linearly for large times in unstable systems, with a rate Λ A ≤ h KS determined by the Lyapunov exponents and the choice of the subsystem A. We apply our results to the analysis of entanglement production in unstable quadratic potentials and due to periodic quantum quenches in many-body quantum systems. Our results are relevant for quantum field theory, for which we present three applications: a scalar field in a symmetry-breaking potential, parametric resonance during post-inflationary reheating and cosmological perturbations during inflation. Finally, we conjecture that the same rate Λ A appears in the entanglement growth of chaotic quantum systems prepared in a semiclassical state.

  6. Modeling the Overalternating Bias with an Asymmetric Entropy Measure

    PubMed Central

    Gronchi, Giorgio; Raglianti, Marco; Noventa, Stefano; Lazzeri, Alessandro; Guazzini, Andrea

    2016-01-01

    Psychological research has found that human perception of randomness is biased. In particular, people consistently show the overalternating bias: they rate binary sequences of symbols (such as Heads and Tails in coin flipping) with an excess of alternation as more random than prescribed by the normative criteria of Shannon's entropy. Within data mining for medical applications, Marcellin proposed an asymmetric measure of entropy that can be ideal to account for such bias and to quantify subjective randomness. We fitted Marcellin's entropy and Renyi's entropy (a generalized form of uncertainty measure comprising many different kinds of entropies) to experimental data found in the literature with the Differential Evolution algorithm. We observed a better fit for Marcellin's entropy compared to Renyi's entropy. The fitted asymmetric entropy measure also showed good predictive properties when applied to different datasets of randomness-related tasks. We concluded that Marcellin's entropy can be a parsimonious and effective measure of subjective randomness that can be useful in psychological research about randomness perception. PMID:27458418

  7. Effect of entropy change of lithium intercalation in cathodes and anodes on Li-ion battery thermal management

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vilayanur V.; Choi, Daiwon; Wang, Donghai; Xu, Wu; Towne, Silas; Williford, Ralph E.; Zhang, Ji-Guang; Liu, Jun; Yang, Zhenguo

    The entropy changes (ΔS) in various cathode and anode materials, as well as in complete Li-ion batteries, were measured using an electrochemical thermodynamic measurement system (ETMS). LiCoO2 has a much larger entropy change than electrodes based on LiNixCoyMnzO2 and LiFePO4, while lithium titanate-based anodes have a lower entropy change compared to graphite anodes. The reversible heat generation rate was found to be a significant portion of the total heat generation rate. The appropriate combinations of cathode and anode were investigated to minimize the reversible heat generation rate across the 0-100% state of charge (SOC) range. In addition to screening for battery electrode materials with low reversible heat, the techniques described in this paper can be a useful engineering tool for battery thermal management in stationary and transportation applications.
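
    A hedged back-of-the-envelope sketch of how the measured entropy coefficient dOCV/dT feeds into thermal management: the reversible (entropic) heat rate is roughly I*T*dOCV/dT, compared with the irreversible ohmic heat I^2*R, and the molar entropy change is ΔS = n*F*dOCV/dT. All numerical values below are hypothetical placeholders, and the sign convention depends on current direction.

      FARADAY = 96485.0   # C/mol

      def heat_rates(current_a, temp_k, docv_dt_v_per_k, resistance_ohm):
          q_rev = current_a * temp_k * docv_dt_v_per_k   # reversible heat rate, W
          q_irr = current_a ** 2 * resistance_ohm        # irreversible (ohmic) heat rate, W
          return q_rev, q_irr

      # Hypothetical cell: 2 A discharge at 298 K, dOCV/dT = -0.2 mV/K, 30 mOhm.
      q_rev, q_irr = heat_rates(2.0, 298.0, -0.2e-3, 0.030)
      print(f"reversible: {q_rev:.3f} W, irreversible: {q_irr:.3f} W")
      print(f"entropy change (n = 1): {1 * FARADAY * (-0.2e-3):.1f} J/(mol*K)")

    Even with these placeholder numbers the two heat terms are of comparable magnitude, which is consistent with the abstract's point that the reversible contribution cannot be neglected.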

  8. Necessary conditions for the optimality of variable rate residual vector quantizers

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Smith, Mark J. T.; Barnes, Christopher F.

    1993-01-01

    Residual vector quantization (RVQ), or multistage VQ, as it is also called, has recently been shown to be a competitive technique for data compression. The competitive performance reported for RVQ results from the joint optimization of variable-rate encoding and RVQ direct-sum code books. In this paper, necessary conditions for the optimality of variable-rate RVQ's are derived, and an iterative descent algorithm based on a Lagrangian formulation is introduced for designing RVQ's having minimum average distortion subject to an entropy constraint. Simulation results for these entropy-constrained RVQ's (EC-RVQ's) are presented for memoryless Gaussian, Laplacian, and uniform sources. A Gauss-Markov source is also considered. The performance is superior to that of entropy-constrained scalar quantizers (EC-SQ's) and practical entropy-constrained vector quantizers (EC-VQ's), and is competitive with that of some of the best source coding techniques that have appeared in the literature.

  9. Using heteroclinic orbits to quantify topological entropy in fluid flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattari, Sulimon, E-mail: ssattari2@ucmerced.edu; Chen, Qianting, E-mail: qchen2@ucmerced.edu; Mitchell, Kevin A., E-mail: kmitchell@ucmerced.edu

    Topological approaches to mixing are important tools to understand chaotic fluid flows, ranging from oceanic transport to the design of micro-mixers. Typically, topological entropy, the exponential growth rate of material lines, is used to quantify topological mixing. Computing topological entropy from the direct stretching rate is computationally expensive and sheds little light on the source of the mixing. Earlier approaches emphasized that topological entropy could be viewed as generated by the braiding of virtual, or “ghost,” rods stirring the fluid in a periodic manner. Here, we demonstrate that topological entropy can also be viewed as generated by the braiding of ghost rods following heteroclinic orbits instead. We use the machinery of homotopic lobe dynamics, which extracts symbolic dynamics from finite-length pieces of stable and unstable manifolds attached to fixed points of the fluid flow. As an example, we focus on the topological entropy of a bounded, chaotic, two-dimensional, double-vortex cavity flow. Over a certain parameter range, the topological entropy is primarily due to the braiding of a period-three orbit. However, this orbit does not explain the topological entropy for parameter values where it does not exist, nor does it explain the excess of topological entropy for the entire range of its existence. We show that braiding by heteroclinic orbits provides an accurate computation of topological entropy when the period-three orbit does not exist, and that it provides an explanation for some of the excess topological entropy when the period-three orbit does exist. Furthermore, the computation of symbolic dynamics using heteroclinic orbits has been automated and can be used to compute topological entropy for a general 2D fluid flow.

  10. How to Calculate Renyi Entropy from Heart Rate Variability, and Why it Matters for Detecting Cardiac Autonomic Neuropathy.

    PubMed

    Cornforth, David J; Tarvainen, Mika P; Jelinek, Herbert F

    2014-01-01

    Cardiac autonomic neuropathy (CAN) is a disease that involves nerve damage leading to an abnormal control of heart rate. An open question is to what extent this condition is detectable from heart rate variability (HRV), which provides information only on successive intervals between heart beats, yet is non-invasive and easy to obtain from a three-lead ECG recording. A variety of measures may be extracted from HRV, including time domain, frequency domain, and more complex non-linear measures. Among the latter, Renyi entropy has been proposed as a suitable measure that can be used to discriminate CAN from controls. However, all entropy methods require estimation of probabilities, and there are a number of ways in which this estimation can be made. In this work, we calculate Renyi entropy using several variations of the histogram method and a density method based on sequences of RR intervals. In all, we calculate Renyi entropy using nine methods and compare their effectiveness in separating the different classes of participants. We found that the histogram method using single RR intervals yields an entropy measure that is either incapable of discriminating CAN from controls, or that it provides little information that could not be gained from the SD of the RR intervals. In contrast, probabilities calculated using a density method based on sequences of RR intervals yield an entropy measure that provides good separation between groups of participants and provides information not available from the SD. The main contribution of this work is that different approaches to calculating probability may affect the success of detecting disease. Our results bring new clarity to the methods used to calculate the Renyi entropy in general, and in particular, to the successful detection of CAN.
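
    To make the comparison concrete, a hedged Python sketch of Renyi entropy estimated two ways: (i) from a histogram of single RR intervals and (ii) from a histogram over consecutive RR-interval pairs, used here as a simple stand-in for the sequence-based density method. The synthetic RR series, alpha = 2, and the 8 ms bin width are placeholders, not the paper's settings.

      import numpy as np

      def renyi_from_probs(p, alpha):
          p = p[p > 0]
          if alpha == 1.0:                       # alpha -> 1 limit is Shannon entropy
              return float(-np.sum(p * np.log(p)))
          return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

      def renyi_single(rr, alpha=2.0, bin_ms=8.0):
          # Histogram of single RR intervals.
          bins = np.arange(rr.min(), rr.max() + bin_ms, bin_ms)
          counts, _ = np.histogram(rr, bins=bins)
          return renyi_from_probs(counts / counts.sum(), alpha)

      def renyi_pairs(rr, alpha=2.0, bin_ms=8.0):
          # Joint histogram of consecutive RR-interval pairs ("sequences").
          bins = np.arange(rr.min(), rr.max() + bin_ms, bin_ms)
          counts, _, _ = np.histogram2d(rr[:-1], rr[1:], bins=[bins, bins])
          return renyi_from_probs(counts.ravel() / counts.sum(), alpha)

      rng = np.random.default_rng(0)
      rr = 850 + np.cumsum(rng.normal(0, 5, size=2000))   # synthetic RR series (ms)
      print(renyi_single(rr), renyi_pairs(rr))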

  11. How to Calculate Renyi Entropy from Heart Rate Variability, and Why it Matters for Detecting Cardiac Autonomic Neuropathy

    PubMed Central

    Cornforth, David J.;  Tarvainen, Mika P.; Jelinek, Herbert F.

    2014-01-01

    Cardiac autonomic neuropathy (CAN) is a disease that involves nerve damage leading to an abnormal control of heart rate. An open question is to what extent this condition is detectable from heart rate variability (HRV), which provides information only on successive intervals between heart beats, yet is non-invasive and easy to obtain from a three-lead ECG recording. A variety of measures may be extracted from HRV, including time domain, frequency domain, and more complex non-linear measures. Among the latter, Renyi entropy has been proposed as a suitable measure that can be used to discriminate CAN from controls. However, all entropy methods require estimation of probabilities, and there are a number of ways in which this estimation can be made. In this work, we calculate Renyi entropy using several variations of the histogram method and a density method based on sequences of RR intervals. In all, we calculate Renyi entropy using nine methods and compare their effectiveness in separating the different classes of participants. We found that the histogram method using single RR intervals yields an entropy measure that is either incapable of discriminating CAN from controls, or that it provides little information that could not be gained from the SD of the RR intervals. In contrast, probabilities calculated using a density method based on sequences of RR intervals yield an entropy measure that provides good separation between groups of participants and provides information not available from the SD. The main contribution of this work is that different approaches to calculating probability may affect the success of detecting disease. Our results bring new clarity to the methods used to calculate the Renyi entropy in general, and in particular, to the successful detection of CAN. PMID:25250311

  12. Entropy information of heart rate variability and its power spectrum during day and night

    NASA Astrophysics Data System (ADS)

    Jin, Li; Jun, Wang

    2013-07-01

    Physiologic systems generate complex fluctuations in their output signals that reflect the underlying dynamics. We employed the base-scale entropy method and power spectral analysis to study 24-hour heart rate variability (HRV) signals. The results show that profound circadian-, age- and pathology-dependent changes are accompanied by changes in base-scale entropy and power spectral distribution. Moreover, the base-scale entropy changes reflect the corresponding changes in autonomic nerve outflow. With the suppression of vagal tone and dominance of sympathetic tone in congestive heart failure (CHF) subjects, there is more variability in the data fluctuation mode, so the base-scale entropy is higher in CHF subjects. With the decrease of sympathetic tone, and with respiratory sinus arrhythmia (RSA) becoming more pronounced during the slower breathing of sleep, the base-scale entropy drops in CHF subjects. The HRV series of the two healthy groups have the same diurnal/nocturnal trend as the CHF series. The fluctuation dynamics of the data in all three groups can be described as an “HF effect”.

  13. Irreversible entropy model for damage diagnosis in resistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuadras, Angel, E-mail: angel.cuadras@upc.edu; Crisóstomo, Javier; Ovejas, Victoria J.

    2015-10-28

    We propose a method to characterize electrical resistor damage based on entropy measurements. Irreversible entropy and the rate at which it is generated are more convenient parameters than resistance for describing damage because they are essentially positive in virtue of the second law of thermodynamics, whereas resistance may increase or decrease depending on the degradation mechanism. Commercial resistors were tested in order to characterize the damage induced by power surges. Resistors were biased with constant and pulsed voltage signals, leading to power dissipation in the range of 4–8 W, which is well above the 0.25 W nominal power to initiate failure. Entropy was inferred from the added power and temperature evolution. A model is proposed to understand the relationship among resistance, entropy, and damage. The power surge dissipates into heat (Joule effect) and damages the resistor. The results show a correlation between entropy generation rate and resistor failure. We conclude that damage can be conveniently assessed from irreversible entropy generation. Our results for resistors can be easily extrapolated to other systems or machines that can be modeled based on their resistance.

  14. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.

  15. A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection

    PubMed Central

    Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B

    2015-01-01

    Summary: We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO. In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050
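
    One way to sketch the permutation-selection idea in Python (a paraphrase under stated assumptions, not the authors' exact algorithm): take a quantile, over permutations of the response, of the smallest penalty that keeps the permuted-response LASSO model empty, so that noise predictors rarely enter at the chosen penalty. The sketch uses scikit-learn's parametrization of the LASSO.

      import numpy as np
      from sklearn.linear_model import Lasso

      def entry_lambda(X, y):
          # Smallest alpha for which the intercept-only LASSO solution is optimal:
          # alpha_max = max_j |x_j'(y - ybar)| / n with centered predictors.
          Xc = X - X.mean(axis=0)
          return np.max(np.abs(Xc.T @ (y - y.mean()))) / len(y)

      def permutation_lambda(X, y, n_perm=100, quantile=0.75, seed=0):
          rng = np.random.default_rng(seed)
          lams = [entry_lambda(X, rng.permutation(y)) for _ in range(n_perm)]
          return float(np.quantile(lams, quantile))

      rng = np.random.default_rng(1)
      n, p = 200, 50
      X = rng.normal(size=(n, p))
      beta = np.zeros(p); beta[:3] = [2.0, -1.5, 1.0]   # three true predictors
      y = X @ beta + rng.normal(size=n)

      lam = permutation_lambda(X, y)
      fit = Lasso(alpha=lam).fit(X, y)
      print("selected predictors:", np.flatnonzero(fit.coef_))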

  16. Recommendations and illustrations for the evaluation of photonic random number generators

    NASA Astrophysics Data System (ADS)

    Hart, Joseph D.; Terashima, Yuta; Uchida, Atsushi; Baumgartner, Gerald B.; Murphy, Thomas E.; Roy, Rajarshi

    2017-09-01

    The never-ending quest to improve the security of digital information combined with recent improvements in hardware technology has caused the field of random number generation to undergo a fundamental shift from relying solely on pseudo-random algorithms to employing optical entropy sources. Despite these significant advances on the hardware side, commonly used statistical measures and evaluation practices remain ill-suited to understand or quantify the optical entropy that underlies physical random number generation. We review the state of the art in the evaluation of optical random number generation and recommend a new paradigm: quantifying entropy generation and understanding the physical limits of the optical sources of randomness. In order to do this, we advocate for the separation of the physical entropy source from deterministic post-processing in the evaluation of random number generators and for the explicit consideration of the impact of the measurement and digitization process on the rate of entropy production. We present the Cohen-Procaccia estimate of the entropy rate h(ε, τ) as one way to do this. In order to provide an illustration of our recommendations, we apply the Cohen-Procaccia estimate as well as the entropy estimates from the new NIST draft standards for physical random number generators to evaluate and compare three common optical entropy sources: single photon time-of-arrival detection, chaotic lasers, and amplified spontaneous emission.

  17. Entropy change of biological dynamics in COPD.

    PubMed

    Jin, Yu; Chen, Chang; Cao, Zhixin; Sun, Baoqing; Lo, Iek Long; Liu, Tzu-Ming; Zheng, Jun; Sun, Shixue; Shi, Yan; Zhang, Xiaohua Douglas

    2017-01-01

    In this century, the rapid development of large data storage technologies, mobile network technology, and portable medical devices makes it possible to measure, record, store, and analyze large amounts of data from human physiological signals. Entropy is a key metric for quantifying the irregularity contained in physiological signals. In this review, we focus on how entropy changes in various physiological signals in COPD. Our review concludes that the direction of the entropy change depends on the type of physiological signal under investigation. For major physiological signals related to respiratory diseases, such as airflow, heart rate variability, and gait variability, the entropy of a patient with COPD is lower than that of a healthy person. However, in the case of hormone secretion and respiratory sound, the entropy of a patient is higher than that of a healthy person. For the mechanomyogram signal, the entropy increases with increasing severity of COPD. These results should give valuable guidance for the use of entropy for physiological signals measured by wearable medical devices as well as for further research on entropy in COPD.

  18. Energy conservation and maximal entropy production in enzyme reactions.

    PubMed

    Dobovišek, Andrej; Vitas, Marko; Brumen, Milan; Fajmut, Aleš

    2017-08-01

    A procedure for maximization of the density of entropy production in a single stationary two-step enzyme reaction is developed. Under the constraints of mass conservation, a fixed equilibrium constant of the reaction, and fixed products of forward and backward enzyme rate constants, the existence of a maximum in the density of entropy production is demonstrated. In the state with maximal density of entropy production, the optimal enzyme rate constants, the stationary concentrations of the substrate and the product, the stationary product yield, and the stationary reaction flux are calculated. The test of whether these calculated values of the reaction parameters are consistent with their corresponding measured values is performed for the enzyme Glucose Isomerase. It is found that calculated and measured rate constants agree within an order of magnitude, whereas the calculated reaction flux and the product yield differ from their corresponding measured values by less than 20% and 5%, respectively. This indicates that the enzyme Glucose Isomerase, considered in a non-equilibrium stationary state, as found in experiments using continuous stirred-tank reactors, possibly operates close to the state with the maximum in the density of entropy production. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Universal bounds on the time evolution of entanglement entropy.

    PubMed

    Avery, Steven G; Paulos, Miguel F

    2014-12-05

    Using relative entropy, we derive bounds on the time rate of change of geometric entanglement entropy for any relativistic quantum field theory in any dimension. The bounds apply to both mixed and pure states, and may be extended to curved space. We illustrate the bounds in a few examples and comment on potential applications and future extensions.

  20. Overlap Cycles for Permutations: Necessary and Sufficient Conditions

    DTIC Science & Technology

    2013-09-19


  1. Multi-response permutation procedure as an alternative to the analysis of variance: an SPSS implementation.

    PubMed

    Cai, Li

    2006-02-01

    A permutation test typically requires fewer assumptions than does a comparable parametric counterpart. The multi-response permutation procedure (MRPP) is a class of multivariate permutation tests of group difference useful for the analysis of experimental data. However, psychologists seldom make use of the MRPP in data analysis, in part because the MRPP is not implemented in popular statistical packages that psychologists use. A set of SPSS macros implementing the MRPP test is provided in this article. The use of the macros is illustrated by analyzing example data sets.
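
    Although the article provides SPSS macros, the MRPP statistic itself is simple to sketch. The Python code below uses the common group-size weights C_i = n_i/N and Euclidean distances, which may differ from the macros' defaults; the two-group data are synthetic.

      import numpy as np
      from scipy.spatial.distance import pdist, squareform

      def mrpp_delta(D, labels):
          # Weighted mean of average within-group pairwise distances.
          delta, n_total = 0.0, len(labels)
          for g in np.unique(labels):
              idx = np.flatnonzero(labels == g)
              if len(idx) < 2:
                  continue
              within = D[np.ix_(idx, idx)]
              xi = within[np.triu_indices(len(idx), k=1)].mean()
              delta += (len(idx) / n_total) * xi
          return delta

      def mrpp_test(X, labels, n_perm=999, seed=0):
          rng = np.random.default_rng(seed)
          D = squareform(pdist(X))                    # pairwise Euclidean distances
          observed = mrpp_delta(D, labels)
          perm = np.array([mrpp_delta(D, rng.permutation(labels)) for _ in range(n_perm)])
          p_value = (np.sum(perm <= observed) + 1) / (n_perm + 1)   # small delta = cohesive groups
          return observed, p_value

      rng = np.random.default_rng(2)
      X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(1.0, 1.0, (20, 2))])
      labels = np.repeat([0, 1], 20)
      print(mrpp_test(X, labels))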

  2. Entropy production and nonlinear Fokker-Planck equations.

    PubMed

    Casas, G A; Nobre, F D; Curado, E M F

    2012-12-01

    The time rate of change of the entropy of systems described by nonlinear Fokker-Planck equations, which are directly related to generalized entropic forms, is analyzed. Both entropy production, associated with irreversible processes, and entropy flux from the system to its surroundings are studied. Some examples of known generalized entropic forms are considered, and particularly, the flux and production of the Boltzmann-Gibbs entropy, obtained from the linear Fokker-Planck equation, are recovered as particular cases. Since nonlinear Fokker-Planck equations are appropriate for the dynamical behavior of several physical phenomena in nature, like many within the realm of complex systems, the present analysis should be applicable to irreversible processes in a large class of nonlinear systems, such as those described by Tsallis and Kaniadakis entropies.

  3. Use and validity of principles of extremum of entropy production in the study of complex systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heitor Reis, A., E-mail: ahr@uevora.pt

    2014-07-15

    It is shown how both the principles of extremum of entropy production, which are often used in the study of complex systems, follow from the maximization of overall system conductivities, under appropriate constraints. In this way, the maximum rate of entropy production (MEP) occurs when all the forces in the system are kept constant. On the other hand, the minimum rate of entropy production (mEP) occurs when all the currents that cross the system are kept constant. A brief discussion on the validity of the application of the mEP and MEP principles in several cases, and in particular to the Earth’s climate is also presented. -- Highlights: •The principles of extremum of entropy production are not first principles. •They result from the maximization of conductivities under appropriate constraints. •The conditions of their validity are set explicitly. •Some long-standing controversies are discussed and clarified.

  4. Logarithmic entropy of Kehagias-Sfetsos black hole with self-gravitation in asymptotically flat IR modified Hořava gravity

    NASA Astrophysics Data System (ADS)

    Liu, Molin; Lu, Junwang

    2011-05-01

    Motivated by the recent logarithmic entropy of Hořava-Lifshitz gravity, we investigate Hawking radiation for the Kehagias-Sfetsos black hole from the tunneling perspective. After considering the effect of self-gravitation, we calculate the emission rate and entropy of quantum tunneling by using the Kraus-Parikh-Wilczek method. Meanwhile, both massless and massive particles are considered in this Letter. Interestingly, both types of tunneling particles have the same emission rate Γ and entropy S_b, whose analytical formulae are Γ = exp[π(r_in² - r_out²)/2 + (π/α) ln(r_in/r_out)] and S_b = A/4 + (π/α) ln(A/4), respectively. Here, α is the Hořava-Lifshitz field parameter. The results show that the logarithmic entropy of Hořava-Lifshitz gravity could be explained well by the self-gravitation, which is totally different from other methods. The study of this semiclassical tunneling process may shed light on understanding the Hořava-Lifshitz gravity.

  5. Using R to Simulate Permutation Distributions for Some Elementary Experimental Designs

    ERIC Educational Resources Information Center

    Eudey, T. Lynn; Kerr, Joshua D.; Trumbo, Bruce E.

    2010-01-01

    Null distributions of permutation tests for two-sample, paired, and block designs are simulated using the R statistical programming language. For each design and type of data, permutation tests are compared with standard normal-theory and nonparametric tests. These examples (often using real data) provide for classroom discussion use of metrics…

  6. Is the catalytic activity of triosephosphate isomerase fully optimized? An investigation based on maximization of entropy production.

    PubMed

    Bonačić Lošić, Željana; Donđivić, Tomislav; Juretić, Davor

    2017-03-01

    Triosephosphate isomerase (TIM) is often described as a fully evolved housekeeping enzyme with near-maximal possible reaction rate. The assumption that an enzyme is perfectly evolved has not been easy to confirm or refute. In this paper, we use maximization of entropy production within known constraints to examine this assumption by calculating steady-state cyclic flux, corresponding entropy production, and catalytic activity in a reversible four-state scheme of TIM functional states. The maximal entropy production (MaxEP) requirement for any of the first three transitions between TIM functional states leads to decreased total entropy production. Only the MaxEP requirement for the product (R-glyceraldehyde-3-phosphate) release step led to a 30% increase in enzyme activity, specificity constant k_cat/K_M, and overall entropy production. The product release step, due to the TIM molecular machine working in the physiological direction of glycolysis, has not been identified before as the rate-limiting step by using irreversible thermodynamics. Together with structural studies, our results open the possibility for finding amino acid substitutions leading to an increased frequency of loop six opening and product release.

  7. Circular permutation of a WW domain: Folding still occurs after excising the turn of the folding-nucleating hairpin

    PubMed Central

    Kier, Brandon L.; Anderson, Jordan M.; Andersen, Niels H.

    2014-01-01

    A hyperstable Pin1 WW domain has been circularly permuted via excision of the fold-nucleating turn; it still folds to form the native three-strand sheet and hydrophobic core features. Multiprobe folding dynamics studies of the normal and circularly permuted sequences, as well as their constituent hairpin fragments and comparable-length β-strand-loop-β-strand models, indicate 2-state folding for all topologies. N-terminal hairpin formation is the fold nucleating event for the wild-type sequence; the slower folding circular permutant has a more distributed folding transition state. PMID:24350581

  8. An estimator for the relative entropy rate of path measures for stochastic differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opper, Manfred, E-mail: manfred.opper@tu-berlin.de

    2017-02-01

    We address the problem of estimating the relative entropy rate (RER) for two stochastic processes described by stochastic differential equations. For the case where the drift of one process is known analytically, but one has only observations from the second process, we use a variational bound on the RER to construct an estimator.

  9. Image coding using entropy-constrained residual vector quantization

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Smith, Mark J. T.; Barnes, Christopher F.

    1993-01-01

    The residual vector quantization (RVQ) structure is exploited to produce a variable length codeword RVQ. Necessary conditions for the optimality of this RVQ are presented, and a new entropy-constrained RVQ (EC-RVQ) design algorithm is shown to be very effective in designing RVQ codebooks over a wide range of bit rates and vector sizes. The new EC-RVQ has several important advantages. It can outperform entropy-constrained VQ (ECVQ) in terms of peak signal-to-noise ratio (PSNR), memory, and computation requirements. It can also be used to design high rate codebooks and codebooks with relatively large vector sizes. Experimental results indicate that when the new EC-RVQ is applied to image coding, very high quality is achieved at relatively low bit rates.

  10. The increase of the functional entropy of the human brain with age.

    PubMed

    Yao, Y; Lu, W L; Xu, B; Li, C B; Lin, C P; Waxman, D; Feng, J F

    2013-10-09

    We use entropy to characterize intrinsic ageing properties of the human brain. Analysis of fMRI data from a large dataset of individuals, using resting state BOLD signals, demonstrated that a functional entropy associated with brain activity increases with age. During an average lifespan, the entropy, which was calculated from a population of individuals, increased by approximately 0.1 bits, due to correlations in BOLD activity becoming more widely distributed. We attribute this to the number of excitatory neurons and the excitatory conductance decreasing with age. Incorporating these properties into a computational model leads to quantitatively similar results to the fMRI data. Our dataset involved males and females and we found significant differences between them. The entropy of males at birth was lower than that of females. However, the entropies of the two sexes increase at different rates, and intersect at approximately 50 years; after this age, males have a larger entropy.

  11. Exploring stability of entropy analysis for signal with different trends

    NASA Astrophysics Data System (ADS)

    Zhang, Yin; Li, Jin; Wang, Jun

    2017-03-01

    Considering the effects of environmental disturbances and the measuring instruments, real detected signals always carry trends, which makes it difficult to capture signal complexity accurately, so choosing stable and effective analysis methods is very important. In this paper, we applied two entropy measures, the base-scale entropy and approximate entropy, to analyze signal complexity, and studied the effect of trends on an ideal signal and on heart rate variability (HRV) signals, namely linear, periodic, and power-law trends, which are likely to occur in actual signals. The results show that approximate entropy is unstable when different trends are embedded into the signals, so it is not suitable for analyzing signals with trends. The base-scale entropy, however, has preferable stability and accuracy for signals with different trends. Therefore, the base-scale entropy is an effective method for analyzing actual signals.
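
    As a rough illustration of the instability discussed here, the sketch below computes approximate entropy (standard Pincus definition) for the same noise series with and without a linear trend; m = 2, r = 0.2σ and the synthetic data are illustrative choices rather than values from the paper, and the base-scale entropy is not reproduced.

    ```python
    import numpy as np

    def approximate_entropy(u, m=2, r=None):
        """Approximate entropy ApEn(m, r), standard Pincus definition
        (self-matches included, Chebyshev distance)."""
        u = np.asarray(u, dtype=float)
        n = len(u)
        if r is None:
            r = 0.2 * u.std()

        def phi(mm):
            x = np.array([u[i:i + mm] for i in range(n - mm + 1)])
            d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
            c = np.mean(d <= r, axis=1)       # fraction of templates within tolerance r
            return np.mean(np.log(c))

        return phi(m) - phi(m + 1)

    rng = np.random.default_rng(0)
    noise = rng.standard_normal(500)
    trend = np.linspace(0.0, 5.0, 500)        # a linear trend added to the same noise
    print("ApEn, noise only   :", approximate_entropy(noise))
    print("ApEn, noise + trend:", approximate_entropy(noise + trend))
    ```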

  12. The Increase of the Functional Entropy of the Human Brain with Age

    PubMed Central

    Yao, Y.; Lu, W. L.; Xu, B.; Li, C. B.; Lin, C. P.; Waxman, D.; Feng, J. F.

    2013-01-01

    We use entropy to characterize intrinsic ageing properties of the human brain. Analysis of fMRI data from a large dataset of individuals, using resting state BOLD signals, demonstrated that a functional entropy associated with brain activity increases with age. During an average lifespan, the entropy, which was calculated from a population of individuals, increased by approximately 0.1 bits, due to correlations in BOLD activity becoming more widely distributed. We attribute this to the number of excitatory neurons and the excitatory conductance decreasing with age. Incorporating these properties into a computational model leads to quantitatively similar results to the fMRI data. Our dataset involved males and females and we found significant differences between them. The entropy of males at birth was lower than that of females. However, the entropies of the two sexes increase at different rates, and intersect at approximately 50 years; after this age, males have a larger entropy. PMID:24103922

  13. Physical Connectivity Mapping by Circular Permutation of Human Telomerase RNA Reveals New Regions Critical for Activity and Processivity.

    PubMed

    Mefford, Melissa A; Zappulla, David C

    2016-01-15

    Telomerase is a specialized ribonucleoprotein complex that extends the 3' ends of chromosomes to counteract telomere shortening. However, increased telomerase activity is associated with ∼90% of human cancers. The telomerase enzyme minimally requires an RNA (hTR) and a specialized reverse transcriptase protein (TERT) for activity in vitro. Understanding the structure-function relationships within hTR has important implications for human disease. For the first time, we have tested the physical-connectivity requirements in the 451-nucleotide hTR RNA using circular permutations, which reposition the 5' and 3' ends. Our extensive in vitro analysis identified three classes of hTR circular permutants with altered function. First, circularly permuting 3' of the template causes specific defects in repeat-addition processivity, revealing that the template recognition element found in ciliates is conserved in human telomerase RNA. Second, seven circular permutations residing within the catalytically important core and CR4/5 domains completely abolish telomerase activity, unveiling mechanistically critical portions of these domains. Third, several circular permutations between the core and CR4/5 significantly increase telomerase activity. Our extensive circular permutation results provide insights into the architecture and coordination of human telomerase RNA and highlight where the RNA could be targeted for the development of antiaging and anticancer therapeutics. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  14. Physical Connectivity Mapping by Circular Permutation of Human Telomerase RNA Reveals New Regions Critical for Activity and Processivity

    PubMed Central

    Mefford, Melissa A.

    2015-01-01

    Telomerase is a specialized ribonucleoprotein complex that extends the 3′ ends of chromosomes to counteract telomere shortening. However, increased telomerase activity is associated with ∼90% of human cancers. The telomerase enzyme minimally requires an RNA (hTR) and a specialized reverse transcriptase protein (TERT) for activity in vitro. Understanding the structure-function relationships within hTR has important implications for human disease. For the first time, we have tested the physical-connectivity requirements in the 451-nucleotide hTR RNA using circular permutations, which reposition the 5′ and 3′ ends. Our extensive in vitro analysis identified three classes of hTR circular permutants with altered function. First, circularly permuting 3′ of the template causes specific defects in repeat-addition processivity, revealing that the template recognition element found in ciliates is conserved in human telomerase RNA. Second, seven circular permutations residing within the catalytically important core and CR4/5 domains completely abolish telomerase activity, unveiling mechanistically critical portions of these domains. Third, several circular permutations between the core and CR4/5 significantly increase telomerase activity. Our extensive circular permutation results provide insights into the architecture and coordination of human telomerase RNA and highlight where the RNA could be targeted for the development of antiaging and anticancer therapeutics. PMID:26503788

  15. Set-Based Discrete Particle Swarm Optimization Based on Decomposition for Permutation-Based Multiobjective Combinatorial Optimization Problems.

    PubMed

    Yu, Xue; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Yuan, Huaqiang; Kwong, Sam; Zhang, Jun

    2018-07-01

    This paper studies a specific class of multiobjective combinatorial optimization problems (MOCOPs), namely the permutation-based MOCOPs. Many commonly seen MOCOPs, e.g., multiobjective traveling salesman problem (MOTSP), multiobjective project scheduling problem (MOPSP), belong to this problem class and they can be very different. However, as the permutation-based MOCOPs share the inherent similarity that the structure of their search space is usually in the shape of a permutation tree, this paper proposes a generic multiobjective set-based particle swarm optimization methodology based on decomposition, termed MS-PSO/D. In order to coordinate with the property of permutation-based MOCOPs, MS-PSO/D utilizes an element-based representation and a constructive approach. Through this, feasible solutions under constraints can be generated step by step following the permutation-tree-shaped structure. And problem-related heuristic information is introduced in the constructive approach for efficiency. In order to address the multiobjective optimization issues, the decomposition strategy is employed, in which the problem is converted into multiple single-objective subproblems according to a set of weight vectors. Besides, a flexible mechanism for diversity control is provided in MS-PSO/D. Extensive experiments have been conducted to study MS-PSO/D on two permutation-based MOCOPs, namely the MOTSP and the MOPSP. Experimental results validate that the proposed methodology is promising.

  16. Entropy change of biological dynamics in COPD

    PubMed Central

    Cao, Zhixin; Sun, Baoqing; Lo, Iek Long; Liu, Tzu-Ming; Zheng, Jun; Sun, Shixue; Shi, Yan; Zhang, Xiaohua Douglas

    2017-01-01

    In this century, the rapid development of large data storage technologies, mobile network technology, and portable medical devices makes it possible to measure, record, store, and track large amounts of data from human physiological signals. Entropy is a key metric for quantifying the irregularity contained in physiological signals. In this review, we focus on how entropy changes in various physiological signals in COPD. Our review concludes that the entropy change relies on the types of physiological signals under investigation. For major physiological signals related to respiratory diseases, such as airflow, heart rate variability, and gait variability, the entropy of a patient with COPD is lower than that of a healthy person. However, in the case of hormone secretion and respiratory sound, the entropy of a patient is higher than that of a healthy person. For the mechanomyogram signal, the entropy increases with the increased severity of COPD. These results should give valuable guidance for the use of entropy for physiological signals measured by wearable medical devices as well as for further research on entropy in COPD. PMID:29066881

  17. Correlations of multiscale entropy in the FX market

    NASA Astrophysics Data System (ADS)

    Stosic, Darko; Stosic, Dusan; Ludermir, Teresa; Stosic, Tatijana

    2016-09-01

    The regularity of price fluctuations in exchange rates plays a crucial role in FX market dynamics. Distinct variations in regularity arise from economic, social and political events, such as interday trading and financial crises. This paper applies a multiscale time-dependent entropy method to thirty-three exchange rates to analyze price fluctuations in the FX market. Correlation matrices of entropy values, termed entropic correlations, are in turn used to describe the global behavior of the market. Empirical results suggest a weakly correlated market with pronounced collective behavior at bi-weekly trends. Correlations arise from cycles of low and high regularity in long-term trends. Eigenvalues of the correlation matrix also indicate a dominant European market, followed by shifting American, Asian, African, and Pacific influences. As a result, we find that entropy is a powerful tool for extracting important information from the FX market.

  18. Analysis of the phase transition in the two-dimensional Ising ferromagnet using a Lempel-Ziv string-parsing scheme and black-box data-compression utilities

    NASA Astrophysics Data System (ADS)

    Melchert, O.; Hartmann, A. K.

    2015-02-01

    In this work we consider information-theoretic observables to analyze short symbolic sequences, comprising time series that represent the orientation of a single spin in a two-dimensional (2D) Ising ferromagnet on a square lattice of size L^2 = 128^2 for different system temperatures T. The latter were chosen from an interval enclosing the critical point T_c of the model. At small temperatures the sequences are thus very regular; at high temperatures they are maximally random. In the vicinity of the critical point, nontrivial, long-range correlations appear. Here we implement estimators for the entropy rate, excess entropy (i.e., "complexity"), and multi-information. First, we implement a Lempel-Ziv string-parsing scheme, providing seemingly elaborate entropy rate and multi-information estimates and an approximate estimator for the excess entropy. Furthermore, we apply easy-to-use black-box data-compression utilities, providing approximate estimators only. For comparison and to yield results for benchmarking purposes, we implement the information-theoretic observables also based on the well-established M-block Shannon entropy, which is more tedious to apply compared to the first two "algorithmic" entropy estimation procedures. To test how well one can exploit the potential of such data-compression techniques, we aim at detecting the critical point of the 2D Ising ferromagnet. Among the above observables, the multi-information, which is known to exhibit an isolated peak at the critical point, is very easy to replicate by means of both efficient algorithmic entropy estimation procedures. Finally, we assess how well the various algorithmic entropy estimates compare to the more conventional block entropy estimates and illustrate a simple modification that yields enhanced results.
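
    A minimal sketch of the black-box data-compression route only (the LZ string-parsing and M-block Shannon estimators of the paper are not reproduced): compressing a symbolic spin sequence with zlib gives a crude upper-bound estimate of the entropy rate in bits per symbol. The sequences below are stand-ins for the low- and high-temperature regimes, not actual Ising simulations.

    ```python
    import zlib
    import numpy as np

    def compression_entropy_rate(symbols):
        """Crude entropy-rate estimate (bits per symbol) of a symbolic sequence,
        obtained by running it through a black-box compressor (zlib)."""
        raw = np.asarray(symbols, dtype=np.uint8).tobytes()   # one byte per symbol
        compressed = zlib.compress(raw, level=9)
        return 8.0 * len(compressed) / len(symbols)           # includes format overhead

    rng = np.random.default_rng(1)
    ordered = np.zeros(100_000, dtype=np.uint8)               # stand-in for T << Tc: frozen spin
    disordered = rng.integers(0, 2, 100_000, dtype=np.uint8)  # stand-in for T >> Tc: random spin
    print("low-temperature-like sequence :", compression_entropy_rate(ordered))
    print("high-temperature-like sequence:", compression_entropy_rate(disordered))
    ```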

  19. The Evolution of Gas Giant Entropy During Formation by Runaway Accretion

    NASA Astrophysics Data System (ADS)

    Berardo, David; Cumming, Andrew; Marleau, Gabriel-Dominique

    2017-01-01

    We calculate the evolution of gas giant planets during the runaway gas accretion phase of formation, to understand how the luminosity of young giant planets depends on the accretion conditions. We construct steady-state envelope models, and run time-dependent simulations of accreting planets with the code Modules for Experiments in Stellar Astrophysics. We show that the evolution of the internal entropy depends on the contrast between the internal adiabat and the entropy of the accreted material, parametrized by the shock temperature T_0 and pressure P_0. At low temperatures (T_0 ≲ 300-1000 K, depending on model parameters), the accreted material has a lower entropy than the interior. The convection zone extends to the surface and can drive a high luminosity, leading to rapid cooling and cold starts. For higher temperatures, the accreted material has a higher entropy than the interior, giving a radiative zone that stalls cooling. For T_0 ≳ 2000 K, the surface-interior entropy contrast cannot be accommodated by the radiative envelope, and the accreted matter accumulates with high entropy, forming a hot start. The final state of the planet depends on the shock temperature, accretion rate, and starting entropy at the onset of runaway accretion. Cold starts with L ≲ 5 × 10^-6 L_⊙ require low accretion rates and starting entropy, and the temperature of the accreting material needs to be maintained close to the nebula temperature. If instead the temperature is near the value required to radiate the accretion luminosity, 4πR^2 σ T_0^4 ~ GM Ṁ/R, as suggested by previous work on radiative shocks in the context of star formation, gas giant planets form in a hot start with L ~ 10^-4 L_⊙.

  20. It is not the entropy you produce, rather, how you produce it

    PubMed Central

    Volk, Tyler; Pauluis, Olivier

    2010-01-01

    The principle of maximum entropy production (MEP) seeks to better understand a large variety of the Earth's environmental and ecological systems by postulating that processes far from thermodynamic equilibrium will ‘adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate’. Our aim in this ‘outside view’, invited by Axel Kleidon, is to focus on what we think is an outstanding challenge for MEP and for irreversible thermodynamics in general: making specific predictions about the relative contribution of individual processes to entropy production. Using studies that compared entropy production in the atmosphere of a dry versus humid Earth, we show that two systems might have the same entropy production rate but very different internal dynamics of dissipation. Using the results of several of the papers in this special issue and a thought experiment, we show that components of life-containing systems can evolve to either lower or raise the entropy production rate. Our analysis makes explicit fundamental questions for MEP that should be brought into focus: can MEP predict not just the overall state of entropy production of a system but also the details of the sub-systems of dissipaters within the system? Which fluxes of the system are those that are most likely to be maximized? How is it possible for MEP theory to be so domain-neutral that it can claim to apply equally to both purely physical–chemical systems and also systems governed by the ‘laws’ of biological evolution? We conclude that the principle of MEP needs to take on the issue of exactly how entropy is produced. PMID:20368249

  1. Altering the orientation of a fused protein to the RNA-binding ribosomal protein L7Ae and its derivatives through circular permutation.

    PubMed

    Ohuchi, Shoji J; Sagawa, Fumihiko; Sakamoto, Taiichi; Inoue, Tan

    2015-10-23

    RNA-protein complexes (RNPs) are useful for constructing functional nano-objects because a variety of functional proteins can be displayed on a designed RNA scaffold. Here, we report circular permutations of an RNA-binding protein L7Ae based on the three-dimensional structure information to alter the orientation of the displayed proteins on the RNA scaffold. An electrophoretic mobility shift assay and atomic force microscopy (AFM) analysis revealed that most of the designed circular permutants formed an RNP nano-object. Moreover, the alteration of the enhanced green fluorescent protein (EGFP) orientation was confirmed with AFM by employing EGFP on the L7Ae permutant on the RNA. The results demonstrate that targeted fine-tuning of the stereo-specific fixation of a protein on a protein-binding RNA is feasible by using the circular permutation technique. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Altering the orientation of a fused protein to the RNA-binding ribosomal protein L7Ae and its derivatives through circular permutation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohuchi, Shoji J.; Sagawa, Fumihiko; Sakamoto, Taiichi

    RNA-protein complexes (RNPs) are useful for constructing functional nano-objects because a variety of functional proteins can be displayed on a designed RNA scaffold. Here, we report circular permutations of an RNA-binding protein L7Ae based on the three-dimensional structure information to alter the orientation of the displayed proteins on the RNA scaffold. An electrophoretic mobility shift assay and atomic force microscopy (AFM) analysis revealed that most of the designed circular permutants formed an RNP nano-object. Moreover, the alteration of the enhanced green fluorescent protein (EGFP) orientation was confirmed with AFM by employing EGFP on the L7Ae permutant on the RNA. The results demonstrate that targeted fine-tuning of the stereo-specific fixation of a protein on a protein-binding RNA is feasible by using the circular permutation technique.

  3. Statistical Mechanics of the Human Placenta: A Stationary State of a Near-Equilibrium System in a Linear Regime.

    PubMed

    Lecarpentier, Yves; Claes, Victor; Hébert, Jean-Louis; Krokidis, Xénophon; Blanc, François-Xavier; Michel, Francine; Timbely, Oumar

    2015-01-01

    All near-equilibrium systems under a linear regime evolve to stationary states in which there is a constant entropy production rate. In an open chemical system that exchanges matter and energy with the exterior, we can identify both the energy and entropy flows associated with the exchange of matter and energy. This can be achieved by applying statistical mechanics (SM), which links the microscopic properties of a system to its bulk properties. In the case of contractile tissues such as the human placenta, Huxley's equations offer a phenomenological formalism for applying SM. SM was investigated in human placental stem villi (PSV) (n = 40). PSV were stimulated by means of KCl exposure (n = 20) and tetanic electrical stimulation (n = 20). This made it possible to determine statistical entropy (S), internal energy (E), affinity (A), thermodynamic force (A/T) (T: temperature), thermodynamic flow (v) and entropy production rate (A/T × v). We found that PSV operated near equilibrium, i.e., A ≪ 2500 J/mol, and in a stationary linear regime, i.e., (A/T) varied linearly with v. As v was dramatically low, the entropy production rate, which quantifies the irreversibility of the chemical processes, appeared to be the lowest ever observed in any contractile system.
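
    For reference, the quantities reported here fit the standard linear-regime relations of irreversible thermodynamics; a compact restatement is given below (the phenomenological coefficient L is an assumption of the linear regime, not a value from the study).

    ```latex
    % Linear-regime relations consistent with the abstract; L is a phenomenological
    % coefficient (an assumption of the linear regime, not a value from the study).
    \[
      \sigma = \frac{A}{T}\,v \;\ge\; 0,
      \qquad
      v = L\,\frac{A}{T}
      \quad\Longrightarrow\quad
      \sigma = L\left(\frac{A}{T}\right)^{2}.
    \]
    ```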

  4. Computation, prediction, and experimental tests of fitness for bacteriophage T7 mutants with permuted genomes

    NASA Astrophysics Data System (ADS)

    Endy, Drew; You, Lingchong; Yin, John; Molineux, Ian J.

    2000-05-01

    We created a simulation based on experimental data from bacteriophage T7 that computes the developmental cycle of the wild-type phage and also of mutants that have an altered genome order. We used the simulation to compute the fitness of more than 10^5 mutants. We tested these computations by constructing and experimentally characterizing T7 mutants in which we repositioned gene 1, coding for T7 RNA polymerase. Computed protein synthesis rates for ectopic gene 1 strains were in moderate agreement with observed rates. Computed phage-doubling rates were close to observations for two of four strains, but significantly overestimated those of the other two. Computations indicate that the genome organization of wild-type T7 is nearly optimal for growth: only 2.8% of random genome permutations were computed to grow faster than wild type, the fastest of them 31% faster. Specific discrepancies between computations and observations suggest that a better understanding of the translation efficiency of individual mRNAs and the functions of qualitatively "nonessential" genes will be needed to improve the T7 simulation. In silico representations of biological systems can serve to assess and advance our understanding of the underlying biology. Iteration between computation, prediction, and observation should increase the rate at which biological hypotheses are formulated and tested.

  5. Dimensionality and entropy of spontaneous and evoked rate activity

    NASA Astrophysics Data System (ADS)

    Engelken, Rainer; Wolf, Fred

    Cortical circuits exhibit complex activity patterns both spontaneously and evoked by external stimuli. Finding low-dimensional structure in population activity is a challenge. What is the diversity of the collective neural activity and how is it affected by an external stimulus? Using concepts from ergodic theory, we calculate the attractor dimensionality and dynamical entropy production of these networks. We obtain these two canonical measures of the collective network dynamics from the full set of Lyapunov exponents. We consider a randomly-wired firing-rate network that exhibits chaotic rate fluctuations for sufficiently strong synaptic weights. We show that dynamical entropy scales logarithmically with synaptic coupling strength, while the attractor dimensionality saturates. Thus, despite the increasing uncertainty, the diversity of collective activity saturates for strong coupling. We find that a time-varying external stimulus drastically reduces both entropy and dimensionality. Finally, we analytically approximate the full Lyapunov spectrum in several limiting cases by random matrix theory. Our study opens a novel avenue to characterize the complex dynamics of rate networks and the geometric structure of the corresponding high-dimensional chaotic attractor. This work received funding from Evangelisches Studienwerk Villigst, the DFG through CRC 889, and the Volkswagen Foundation.
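
    The two quantities named here are commonly computed from a Lyapunov spectrum via the Kaplan-Yorke formula and the sum of positive exponents (a Pesin-type bound); the sketch below shows those generic formulas applied to a made-up spectrum and is not the authors' network calculation.

    ```python
    import numpy as np

    def kaplan_yorke_dimension(lyap):
        """Kaplan-Yorke (Lyapunov) dimension from a spectrum of exponents."""
        lyap = np.sort(np.asarray(lyap, dtype=float))[::-1]   # decreasing order
        cum = np.cumsum(lyap)
        k = int(np.sum(cum >= 0))       # largest k with a nonnegative partial sum
        if k == 0:
            return 0.0
        if k == len(lyap):
            return float(len(lyap))
        return k + cum[k - 1] / abs(lyap[k])

    def entropy_rate_upper_bound(lyap):
        """Sum of positive Lyapunov exponents: a Pesin-type upper bound on
        (and in many chaotic systems an estimate of) the dynamical entropy rate."""
        lyap = np.asarray(lyap, dtype=float)
        return float(lyap[lyap > 0].sum())

    # Made-up spectrum of a small chaotic rate network (illustrative numbers only).
    spectrum = [0.31, 0.12, 0.02, -0.05, -0.20, -0.55, -1.10]
    print("attractor dimensionality ~", kaplan_yorke_dimension(spectrum))
    print("dynamical entropy rate  <=", entropy_rate_upper_bound(spectrum))
    ```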

  6. On Entropy Generation and the Effect of Heat and Mass Transfer Coupling in a Distillation Process

    NASA Astrophysics Data System (ADS)

    Burgos-Madrigal, Paulina; Mendoza, Diego F.; López de Haro, Mariano

    2018-01-01

    The entropy production rates as obtained from the exergy analysis, entropy balance and the nonequilibrium thermodynamics approach are compared for two distillation columns. The first case is a depropanizer column involving a mixture of ethane, propane, n-butane and n-pentane. The other is a weighed sample of Mexican crude oil distilled with a pilot scale fractionating column. The composition, temperature and flow profiles, for a given duty and operating conditions in each column, are obtained with the Aspen Plus V8.4 software by using the RateFrac model with a rate-based nonequilibrium column. For the depropanizer column the highest entropy production rate is found in the central trays where most of the mass transfer occurs, while in the second column the highest values correspond to the first three stages (where the vapor mixture is in contact with the cold liquid reflux), and to the last three stages (where the highest temperatures take place). The importance of the explicit inclusion of thermal diffusion in these processes is evaluated. In the depropanizer column, the effect of the coupling between heat and mass transfer is found to be negligible, while for the fractionating column it becomes appreciable.

  7. Selection of entropy-measure parameters for knowledge discovery in heart rate variability data

    PubMed Central

    2014-01-01

    Background: Heart rate variability is the variation of the time interval between consecutive heartbeats. Entropy is a commonly used tool to describe the regularity of data sets. Entropy functions are defined using multiple parameters, the selection of which is controversial and depends on the intended purpose. This study describes the results of tests conducted to support parameter selection, towards the goal of enabling further biomarker discovery. Methods: This study deals with approximate, sample, fuzzy, and fuzzy measure entropies. All data were obtained from PhysioNet, a free-access, on-line archive of physiological signals, and represent various medical conditions. Five tests were defined and conducted to examine the influence of: varying the threshold value r (as multiples of the sample standard deviation σ, or the entropy-maximizing r_Chon), the data length N, the weighting factors n for fuzzy and fuzzy measure entropies, and the thresholds r_F and r_L for fuzzy measure entropy. The results were tested for normality using Lilliefors' composite goodness-of-fit test. Consequently, the p-value was calculated with either a two sample t-test or a Wilcoxon rank sum test. Results: The first test shows a cross-over of entropy values with regard to a change of r. Thus, a clear statement that a higher entropy corresponds to a high irregularity is not possible, but is rather an indicator of differences in regularity. N should be at least 200 data points for r = 0.2σ and should even exceed a length of 1000 for r = r_Chon. The results for the weighting parameters n for the fuzzy membership function show different behavior when coupled with different r values, therefore the weighting parameters have been chosen independently for the different threshold values. The tests concerning r_F and r_L showed that there is no optimal choice, but r = r_F = r_L is reasonable with r = r_Chon or r = 0.2σ. Conclusions: Some of the tests showed a dependency of the test significance on the data at hand. Nevertheless, as the medical conditions are unknown beforehand, compromises had to be made. Optimal parameter combinations are suggested for the methods considered. Yet, due to the high number of potential parameter combinations, further investigations of entropy for heart rate variability data will be necessary. PMID:25078574
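
    For concreteness, a minimal sample entropy implementation using the r = 0.2σ heuristic discussed above; m, r and the synthetic series are illustrative choices, and the fuzzy and fuzzy measure entropies studied in the paper are not reproduced.

    ```python
    import numpy as np

    def sample_entropy(u, m=2, r=None):
        """Sample entropy SampEn(m, r) = -ln(A/B), counting template matches of
        length m (B) and m+1 (A) over the same N-m templates, self-matches excluded."""
        u = np.asarray(u, dtype=float)
        N = len(u)
        if r is None:
            r = 0.2 * u.std()          # the common r = 0.2*sigma heuristic discussed above

        def matches(mm):
            x = np.array([u[i:i + mm] for i in range(N - m)])
            d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
            np.fill_diagonal(d, np.inf)            # exclude self-matches
            return np.sum(d <= r)

        B, A = matches(m), matches(m + 1)
        return np.inf if A == 0 or B == 0 else float(-np.log(A / B))

    # Synthetic RR-interval-like series (illustrative, not PhysioNet data).
    rng = np.random.default_rng(7)
    rr = 0.8 + 0.05 * rng.standard_normal(1000)
    print("SampEn(m=2, r=0.2*sigma):", sample_entropy(rr))
    ```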

  8. Selection of entropy-measure parameters for knowledge discovery in heart rate variability data.

    PubMed

    Mayer, Christopher C; Bachler, Martin; Hörtenhuber, Matthias; Stocker, Christof; Holzinger, Andreas; Wassertheurer, Siegfried

    2014-01-01

    Heart rate variability is the variation of the time interval between consecutive heartbeats. Entropy is a commonly used tool to describe the regularity of data sets. Entropy functions are defined using multiple parameters, the selection of which is controversial and depends on the intended purpose. This study describes the results of tests conducted to support parameter selection, towards the goal of enabling further biomarker discovery. This study deals with approximate, sample, fuzzy, and fuzzy measure entropies. All data were obtained from PhysioNet, a free-access, on-line archive of physiological signals, and represent various medical conditions. Five tests were defined and conducted to examine the influence of: varying the threshold value r (as multiples of the sample standard deviation σ, or the entropy-maximizing r_Chon), the data length N, the weighting factors n for fuzzy and fuzzy measure entropies, and the thresholds r_F and r_L for fuzzy measure entropy. The results were tested for normality using Lilliefors' composite goodness-of-fit test. Consequently, the p-value was calculated with either a two sample t-test or a Wilcoxon rank sum test. The first test shows a cross-over of entropy values with regard to a change of r. Thus, a clear statement that a higher entropy corresponds to a high irregularity is not possible, but is rather an indicator of differences in regularity. N should be at least 200 data points for r = 0.2σ and should even exceed a length of 1000 for r = r_Chon. The results for the weighting parameters n for the fuzzy membership function show different behavior when coupled with different r values, therefore the weighting parameters have been chosen independently for the different threshold values. The tests concerning r_F and r_L showed that there is no optimal choice, but r = r_F = r_L is reasonable with r = r_Chon or r = 0.2σ. Some of the tests showed a dependency of the test significance on the data at hand. Nevertheless, as the medical conditions are unknown beforehand, compromises had to be made. Optimal parameter combinations are suggested for the methods considered. Yet, due to the high number of potential parameter combinations, further investigations of entropy for heart rate variability data will be necessary.

  9. Permutation inference for the general linear model

    PubMed Central

    Winkler, Anderson M.; Ridgway, Gerard R.; Webster, Matthew A.; Smith, Stephen M.; Nichols, Thomas E.

    2014-01-01

    Permutation methods can provide exact control of false positives and allow the use of non-standard statistics, making only weak assumptions about the data. With the availability of fast and inexpensive computing, their main limitation would be some lack of flexibility to work with arbitrary experimental designs. In this paper we report on results on approximate permutation methods that are more flexible with respect to the experimental design and nuisance variables, and conduct detailed simulations to identify the best method for settings that are typical for imaging research scenarios. We present a generic framework for permutation inference for complex general linear models (glms) when the errors are exchangeable and/or have a symmetric distribution, and show that, even in the presence of nuisance effects, these permutation inferences are powerful while providing excellent control of false positives in a wide range of common and relevant imaging research scenarios. We also demonstrate how the inference on glm parameters, originally intended for independent data, can be used in certain special but useful cases in which independence is violated. Detailed examples of common neuroimaging applications are provided, as well as a complete algorithm – the “randomise” algorithm – for permutation inference with the glm. PMID:24530839
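
    One of the special cases mentioned here, errors with a symmetric distribution, admits a particularly simple permutation scheme (sign flipping); below is a minimal one-sample sketch with made-up subject-level estimates, not the "randomise" algorithm itself.

    ```python
    import numpy as np

    def sign_flip_pvalue(estimates, n_perm=10_000, rng=None):
        """One-sample permutation test by sign flipping, valid when the errors are
        symmetrically distributed about zero; statistic = mean of the estimates."""
        rng = np.random.default_rng(rng)
        e = np.asarray(estimates, dtype=float)
        observed = e.mean()
        signs = rng.choice([-1.0, 1.0], size=(n_perm, e.size))
        null = (signs * e).mean(axis=1)
        return (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)

    # Made-up subject-level contrast estimates (not imaging data).
    print("p =", sign_flip_pvalue([0.8, 1.1, 0.5, 0.9, 1.4, 0.7, 1.0, 0.6], rng=0))
    ```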

  10. Force-Time Entropy of Isometric Impulse.

    PubMed

    Hsieh, Tsung-Yu; Newell, Karl M

    2016-01-01

    The relation between force and temporal variability in discrete impulse production has been viewed as independent (R. A. Schmidt, H. Zelaznik, B. Hawkins, J. S. Frank, & J. T. Quinn, 1979 ) or dependent on the rate of force (L. G. Carlton & K. M. Newell, 1993 ). Two experiments in an isometric single finger force task investigated the joint force-time entropy with (a) fixed time to peak force and different percentages of force level and (b) fixed percentage of force level and different times to peak force. The results showed that the peak force variability increased either with the increment of force level or through a shorter time to peak force that also reduced timing error variability. The peak force entropy and entropy of time to peak force increased on the respective dimension as the parameter conditions approached either maximum force or a minimum rate of force production. The findings show that force error and timing error are dependent but complementary when considered in the same framework with the joint force-time entropy at a minimum in the middle parameter range of discrete impulse.

  11. Thermodynamic criteria for estimating the kinetic parameters of catalytic reactions

    NASA Astrophysics Data System (ADS)

    Mitrichev, I. I.; Zhensa, A. V.; Kol'tsova, E. M.

    2017-01-01

    Kinetic parameters are estimated using two criteria in addition to the traditional criterion that considers the consistency between experimental and modeled conversion data: thermodynamic consistency and the consistency with entropy production (i.e., the absolute rate of the change in entropy due to exchange with the environment is consistent with the rate of entropy production in the steady state). A special procedure is developed and executed on a computer to achieve the thermodynamic consistency of a set of kinetic parameters with respect to both the standard entropy of a reaction and the standard enthalpy of a reaction. A problem of multi-criterion optimization, reduced to a single-criterion problem by summing weighted values of the three criteria listed above, is solved. Using the reaction of NO reduction with CO on a platinum catalyst as an example, it is shown that the set of parameters proposed by D.B. Mantri and P. Aghalayam gives much worse agreement with experimental values than the set obtained on the basis of three criteria: the sum of the squares of deviations for conversion, the thermodynamic consistency, and the consistency with entropy production.

  12. Toward a general theory of conical intersections in systems of identical nuclei

    NASA Astrophysics Data System (ADS)

    Keating, Sean P.; Mead, C. Alden

    1987-02-01

    It has been shown previously that the Herzberg-Longuet-Higgins sign change produced in Born-Oppenheimer electronic wave functions when the nuclei traverse a closed path around a conical intersection has implications for the symmetry of wave functions under permutations of identical nuclei. For systems of three or four identical nuclei, there are special features present which have facilitated the detailed analysis. The present paper reports progress toward a general theory for systems of n nuclei. For n=3 or 4, the two key functions which locate conical intersections and define compensating phase factors can conveniently be defined so as to transform under permutations according to a two-dimensional irreducible representation of the permutation group. Since such representations do not exist for n>4, we have chosen to develop a formalism in terms of lab-fixed electronic basis functions, and we show how to define the two key functions in principle. The functions so defined both turn out to be totally symmetric under permutations. We show how they can be used to define compensating phase factors so that all modified electronic wave functions are either totally symmetric or totally antisymmetric under permutations. A detailed analysis is made to cyclic permutations in the neighborhood of Dnh symmetry, which can be extended by continuity arguments to more general configurations, and criteria are obtained for sign changes. There is a qualitative discussion of the treatment of more general permutations.

  13. Hawking radiation and entropy of a black hole in Lovelock-Born-Infeld gravity from the quantum tunneling approach

    NASA Astrophysics Data System (ADS)

    Li, Gu-Qiang

    2017-04-01

    The tunneling radiation of particles from black holes in Lovelock-Born-Infeld (LBI) gravity is studied by using the Parikh-Wilczek (PW) method, and the emission rate of a particle is calculated. It is shown that the emission spectrum deviates from the purely thermal spectrum but is consistent with an underlying unitary theory. Compared to the conventional tunneling rate related to the increment of black hole entropy, the entropy of the black hole in LBI gravity is obtained. The entropy does not obey the area law unless all the Lovelock coefficients equal zero, but it satisfies the first law of thermodynamics and is in accordance with earlier results. It is distinctly shown that the PW tunneling framework is related to the thermodynamic laws of the black hole. Supported by Guangdong Natural Science Foundation (2016A030307051, 2015A030313789)

  14. A permutation-based non-parametric analysis of CRISPR screen data.

    PubMed

    Jia, Gaoxiang; Wang, Xinlei; Xiao, Guanghua

    2017-07-19

    Clustered regularly-interspaced short palindromic repeats (CRISPR) screens are usually implemented in cultured cells to identify genes with critical functions. Although several methods have been developed or adapted to analyze CRISPR screening data, no single specific algorithm has gained popularity. Thus, rigorous procedures are needed to overcome the shortcomings of existing algorithms. We developed a Permutation-Based Non-Parametric Analysis (PBNPA) algorithm, which computes p-values at the gene level by permuting sgRNA labels, and thus it avoids restrictive distributional assumptions. Although PBNPA is designed to analyze CRISPR data, it can also be applied to analyze genetic screens implemented with siRNAs or shRNAs and drug screens. We compared the performance of PBNPA with competing methods on simulated data as well as on real data. PBNPA outperformed recent methods designed for CRISPR screen analysis, as well as methods used for analyzing other functional genomics screens, in terms of Receiver Operating Characteristics (ROC) curves and False Discovery Rate (FDR) control for simulated data under various settings. Remarkably, the PBNPA algorithm showed better consistency and FDR control on published real data as well. PBNPA yields more consistent and reliable results than its competitors, especially when the data quality is low. R package of PBNPA is available at: https://cran.r-project.org/web/packages/PBNPA/ .
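
    A minimal sketch of the core permute-the-sgRNA-labels idea described here; the statistic, sidedness handling, and FDR step of the published PBNPA package are more elaborate, and the toy scores below are made up.

    ```python
    import numpy as np

    def gene_level_permutation_pvalues(sgrna_scores, gene_labels, n_perm=10_000, rng=None):
        """Gene-level p-values for a pooled screen: the gene statistic is the mean
        score of its sgRNAs, and the null is built by permuting sgRNA labels.
        One-sided test for depletion (negative scores)."""
        rng = np.random.default_rng(rng)
        scores = np.asarray(sgrna_scores, dtype=float)
        labels = np.asarray(gene_labels)
        genes = sorted(set(labels.tolist()))
        idx = {g: np.flatnonzero(labels == g) for g in genes}
        observed = {g: scores[idx[g]].mean() for g in genes}

        exceed = {g: 0 for g in genes}
        for _ in range(n_perm):
            shuffled = rng.permutation(scores)
            for g in genes:
                if shuffled[idx[g]].mean() <= observed[g]:
                    exceed[g] += 1
        return {g: (exceed[g] + 1) / (n_perm + 1) for g in genes}

    # Toy example: three genes with four sgRNAs each, log-fold-change-like scores.
    scores = [-2.1, -1.8, -2.5, -1.9,  0.1, -0.2, 0.3, 0.0,  0.4, 0.2, -0.1, 0.5]
    labels = ["geneA"] * 4 + ["geneB"] * 4 + ["geneC"] * 4
    print(gene_level_permutation_pvalues(scores, labels, n_perm=2000, rng=0))
    ```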

  15. Entropic bounds on currents in Langevin systems

    NASA Astrophysics Data System (ADS)

    Dechant, Andreas; Sasa, Shin-ichi

    2018-06-01

    We derive a bound on generalized currents for Langevin systems in terms of the total entropy production in the system and its environment. For overdamped dynamics, any generalized current is bounded by the total rate of entropy production. We show that this entropic bound on the magnitude of generalized currents imposes power-efficiency tradeoff relations for ratchets in contact with a heat bath: Maximum efficiency—Carnot efficiency for a Smoluchowski-Feynman ratchet and unity for a flashing or rocking ratchet—can only be reached at vanishing power output. For underdamped dynamics, while there may be reversible currents that are not bounded by the entropy production rate, we show that the output power and heat absorption rate are irreversible currents and thus obey the same bound. As a consequence, a power-efficiency tradeoff relation holds not only for underdamped ratchets but also for periodically driven heat engines. For weak driving, the bound results in additional constraints on the Onsager matrix beyond those imposed by the second law. Finally, we discuss the connection between heat and entropy in a nonthermal situation where the friction and noise intensity are state dependent.

  16. Hidden cross-correlation patterns in stock markets based on permutation cross-sample entropy and PCA

    NASA Astrophysics Data System (ADS)

    Lin, Aijing; Shang, Pengjian; Zhong, Bo

    2014-12-01

    In this article, we investigate the hidden cross-correlation structures in the Chinese and US stock markets by performing permutation cross-sample entropy (PCSE) analysis combined with a PCA approach. It is suggested that PCSE can provide a more faithful and more interpretable description of the dynamic mechanism between time series than the cross-correlation matrix. We show that this new technique can be adapted to observe stock markets especially during financial crises. In order to identify and compare the interactions and structures of stock markets during financial crises, as well as in normal periods, all the samples are divided into four sub-periods. The results imply that the cross-correlations within the Chinese group are stronger than those within the US group in most sub-periods. In particular, it is likely that the US stock markets were more integrated with each other during the global financial crisis than during the Asian financial crisis. However, our results illustrate that the Chinese stock markets are not immune from the global financial crisis, although they are less integrated with other markets than the US stock markets are.
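
    The ordinal-pattern ingredient underlying PCSE can be illustrated with plain permutation entropy of a single series (the Bandt-Pompe construction); the cross-sample extension and the PCA step of the paper are not reproduced, and the order/delay values below are common illustrative choices.

    ```python
    import math
    from collections import Counter

    import numpy as np

    def permutation_entropy(x, order=3, delay=1, normalize=True):
        """Permutation entropy: Shannon entropy of the ordinal-pattern distribution
        (Bandt-Pompe construction) for embedding dimension `order` and time `delay`."""
        x = np.asarray(x, dtype=float)
        n = len(x) - (order - 1) * delay
        counts = Counter(tuple(np.argsort(x[i:i + order * delay:delay])) for i in range(n))
        probs = np.array(list(counts.values()), dtype=float) / n
        h = -np.sum(probs * np.log2(probs))
        return h / math.log2(math.factorial(order)) if normalize else h

    rng = np.random.default_rng(3)
    noise = rng.standard_normal(5000)                      # white-noise stand-in for returns
    smooth = np.sin(np.linspace(0.0, 60.0, 5000))          # highly regular series
    print("normalized PE, noise:", permutation_entropy(noise))   # close to 1
    print("normalized PE, sine :", permutation_entropy(smooth))  # much lower
    ```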

  17. Permutation parity machines for neural cryptography.

    PubMed

    Reyes, Oscar Mauricio; Zimmermann, Karl-Heinz

    2010-06-01

    Recently, synchronization was proved for permutation parity machines, multilayer feed-forward neural networks proposed as a binary variant of the tree parity machines. This ability was already used in the case of tree parity machines to introduce a key-exchange protocol. In this paper, a protocol based on permutation parity machines is proposed and its performance against common attacks (simple, geometric, majority and genetic) is studied.

  18. Inference for Distributions over the Permutation Group

    DTIC Science & Technology

    2008-05-01

    Distributions over permutations arise in real-world problems, such as voting, ranking, and data association. Representing uncertainty over permutations is challenging, since there are n! possibilities. ... the Kronecker (or Tensor) Product Representation. In general, the Kronecker product representation is reducible, and so it can be decomposed into a direct ...

  19. Students' Errors in Solving the Permutation and Combination Problems Based on Problem Solving Steps of Polya

    ERIC Educational Resources Information Center

    Sukoriyanto; Nusantara, Toto; Subanji; Chandra, Tjang Daniel

    2016-01-01

    This article was written based on the results of a study evaluating students' errors in solving permutation and combination problems in terms of the problem solving steps according to Polya. Twenty-five students were asked to do four problems related to permutation and combination. The research results showed that the students still made mistakes in…

  20. Permutation parity machines for neural cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyes, Oscar Mauricio; Escuela de Ingenieria Electrica, Electronica y Telecomunicaciones, Universidad Industrial de Santander, Bucaramanga; Zimmermann, Karl-Heinz

    2010-06-15

    Recently, synchronization was proved for permutation parity machines, multilayer feed-forward neural networks proposed as a binary variant of the tree parity machines. This ability was already used in the case of tree parity machines to introduce a key-exchange protocol. In this paper, a protocol based on permutation parity machines is proposed and its performance against common attacks (simple, geometric, majority and genetic) is studied.

  1. Sorting signed permutations by short operations.

    PubMed

    Galvão, Gustavo Rodrigues; Lee, Orlando; Dias, Zanoni

    2015-01-01

    During evolution, global mutations may alter the order and the orientation of the genes in a genome. Such mutations are referred to as rearrangement events, or simply operations. In unichromosomal genomes, the most common operations are reversals, which are responsible for reversing the order and orientation of a sequence of genes, and transpositions, which are responsible for switching the location of two contiguous portions of a genome. The problem of computing the minimum sequence of operations that transforms one genome into another - which is equivalent to the problem of sorting a permutation into the identity permutation - is a well-studied problem that finds application in comparative genomics. There are a number of works concerning this problem in the literature, but they generally do not take into account the length of the operations (i.e. the number of genes affected by the operations). Since it has been observed that short operations are prevalent in the evolution of some species, algorithms that efficiently solve this problem in the special case of short operations are of interest. In this paper, we investigate the problem of sorting a signed permutation by short operations. More precisely, we study four flavors of this problem: (i) the problem of sorting a signed permutation by reversals of length at most 2; (ii) the problem of sorting a signed permutation by reversals of length at most 3; (iii) the problem of sorting a signed permutation by reversals and transpositions of length at most 2; and (iv) the problem of sorting a signed permutation by reversals and transpositions of length at most 3. We present polynomial-time solutions for problems (i) and (iii), a 5-approximation for problem (ii), and a 3-approximation for problem (iv). Moreover, we show that the expected approximation ratio of the 5-approximation algorithm is not greater than 3 for random signed permutations with more than 12 elements. Finally, we present experimental results that show that the approximation ratios of the approximation algorithms cannot be smaller than 3. In particular, this means that the approximation ratio of the 3-approximation algorithm is tight.
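
    To make the operations concrete, here is a hedged sketch that sorts a signed permutation using only reversals of length 1 and 2 with a greedy bubble-sort strategy; it always reaches the identity, but it is not the paper's algorithm and makes no claim of using the minimum number of operations.

    ```python
    def sort_by_short_reversals(perm):
        """Sort a signed permutation of {1..n} into the identity using only reversals
        of length 1 (flip one sign) and length 2 (swap two adjacent elements and flip
        both signs). Greedy strategy: always succeeds, but the number of operations is
        not guaranteed to be minimal."""
        perm = list(perm)
        n = len(perm)
        ops = []
        # Phase 1: bubble-sort by absolute value with length-2 reversals.
        for _ in range(n):
            for i in range(n - 1):
                if abs(perm[i]) > abs(perm[i + 1]):
                    perm[i], perm[i + 1] = -perm[i + 1], -perm[i]
                    ops.append(("reversal_2", i))
        # Phase 2: clear remaining negative signs with length-1 reversals.
        for i in range(n):
            if perm[i] < 0:
                perm[i] = -perm[i]
                ops.append(("reversal_1", i))
        assert perm == list(range(1, n + 1))
        return ops

    print(sort_by_short_reversals([3, -1, -2]))   # toy signed permutation of {1, 2, 3}
    ```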

  2. Simulating the component counts of combinatorial structures.

    PubMed

    Arratia, Richard; Barbour, A D; Ewens, W J; Tavaré, Simon

    2018-02-09

    This article describes and compares methods for simulating the component counts of random logarithmic combinatorial structures such as permutations and mappings. We exploit the Feller coupling for simulating permutations to provide a very fast method for simulating logarithmic assemblies more generally. For logarithmic multisets and selections, this approach is replaced by an acceptance/rejection method based on a particular conditioning relationship that represents the distribution of the combinatorial structure as that of independent random variables conditioned on a weighted sum. We show how to improve its acceptance rate. We illustrate the method by estimating the probability that a random mapping has no repeated component sizes, and establish the asymptotic distribution of the difference between the number of components and the number of distinct component sizes for a very general class of logarithmic structures. Copyright © 2018. Published by Elsevier Inc.
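
    A minimal sketch of the Feller-coupling simulation for the permutation case only: independent Bernoulli variables with success probability θ/(θ + i − 1) are generated, a final 1 is appended, and the gaps between successive 1s give the cycle lengths; θ = 1 corresponds to a uniform random permutation. The multiset/selection machinery and the random-mapping example of the paper are not reproduced.

    ```python
    from collections import Counter

    import numpy as np

    def cycle_counts_feller(n, theta=1.0, rng=None):
        """Cycle-type of an Ewens(theta) random permutation of n elements via the
        Feller coupling: draw independent Bernoulli xi_i with
        P(xi_i = 1) = theta / (theta + i - 1) for i = 1..n (so xi_1 = 1), append a
        final 1, and read the gaps between successive 1s as the cycle lengths."""
        rng = np.random.default_rng(rng)
        i = np.arange(1, n + 1)
        xi = rng.random(n) < theta / (theta + i - 1)
        ones = np.flatnonzero(np.append(xi, True))     # positions of 1s, plus sentinel at n
        return Counter(np.diff(ones).tolist())

    # Example: estimate the probability that all cycle sizes of a random permutation
    # of 20 elements are distinct (analogous in spirit to the mapping example above).
    rng = np.random.default_rng(42)
    trials = 20_000
    hits = sum(
        all(c == 1 for c in cycle_counts_feller(20, rng=rng).values())
        for _ in range(trials)
    )
    print("P(all cycle sizes distinct) ~", hits / trials)
    ```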

  3. Coherence and entanglement measures based on Rényi relative entropies

    NASA Astrophysics Data System (ADS)

    Zhu, Huangjun; Hayashi, Masahito; Chen, Lin

    2017-11-01

    We study systematically resource measures of coherence and entanglement based on Rényi relative entropies, which include the logarithmic robustness of coherence, geometric coherence, and conventional relative entropy of coherence together with their entanglement analogues. First, we show that each Rényi relative entropy of coherence is equal to the corresponding Rényi relative entropy of entanglement for any maximally correlated state. By virtue of this observation, we establish a simple operational connection between entanglement measures and coherence measures based on Rényi relative entropies. We then prove that all these coherence measures, including the logarithmic robustness of coherence, are additive. Accordingly, all these entanglement measures are additive for maximally correlated states. In addition, we derive analytical formulas for Rényi relative entropies of entanglement of maximally correlated states and bipartite pure states, which reproduce a number of classic results on the relative entropy of entanglement and logarithmic robustness of entanglement in a unified framework. Several nontrivial bounds for Rényi relative entropies of coherence (entanglement) are further derived, which improve over results known previously. Moreover, we determine all states whose relative entropy of coherence is equal to the logarithmic robustness of coherence. As an application, we provide an upper bound for the exact coherence distillation rate, which is saturated for pure states.

  4. Random versus maximum entropy models of neural population activity

    NASA Astrophysics Data System (ADS)

    Ferrari, Ulisse; Obuchi, Tomoyuki; Mora, Thierry

    2017-04-01

    The principle of maximum entropy provides a useful method for inferring statistical mechanics models from observations in correlated systems, and is widely used in a variety of fields where accurate data are available. While the assumptions underlying maximum entropy are intuitive and appealing, its adequacy for describing complex empirical data has been little studied in comparison to alternative approaches. Here, data from the collective spiking activity of retinal neurons is reanalyzed. The accuracy of the maximum entropy distribution constrained by mean firing rates and pairwise correlations is compared to a random ensemble of distributions constrained by the same observables. For most of the tested networks, maximum entropy approximates the true distribution better than the typical or mean distribution from that ensemble. This advantage improves with population size, with groups as small as eight being almost always better described by maximum entropy. Failure of maximum entropy to outperform random models is found to be associated with strong correlations in the population.

  5. A statistical method for the conservative adjustment of false discovery rate (q-value).

    PubMed

    Lai, Yinglei

    2017-03-14

    q-value is a widely used statistical method for estimating false discovery rate (FDR), which is a conventional significance measure in the analysis of genome-wide expression data. q-value is a random variable and it may underestimate FDR in practice. An underestimated FDR can lead to unexpected false discoveries in the follow-up validation experiments. This issue has not been well addressed in literature, especially in the situation when the permutation procedure is necessary for p-value calculation. We proposed a statistical method for the conservative adjustment of q-value. In practice, it is usually necessary to calculate p-value by a permutation procedure. This was also considered in our adjustment method. We used simulation data as well as experimental microarray or sequencing data to illustrate the usefulness of our method. The conservativeness of our approach has been mathematically confirmed in this study. We have demonstrated the importance of conservative adjustment of q-value, particularly in the situation that the proportion of differentially expressed genes is small or the overall differential expression signal is weak.
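
    For orientation, the baseline q-value computation that a conservative adjustment would start from is the standard Benjamini-Hochberg step-up estimate; the sketch below implements only that baseline and is not the adjustment proposed in this paper.

    ```python
    import numpy as np

    def bh_qvalues(pvalues):
        """Benjamini-Hochberg step-up adjusted p-values (the pi0 = 1 baseline q-value):
        q_(i) = min over j >= i of m * p_(j) / j on the sorted p-values."""
        p = np.asarray(pvalues, dtype=float)
        m = len(p)
        order = np.argsort(p)
        q = m * p[order] / np.arange(1, m + 1)
        q = np.minimum(np.minimum.accumulate(q[::-1])[::-1], 1.0)   # enforce monotonicity, cap at 1
        out = np.empty(m)
        out[order] = q
        return out

    p = np.array([0.0002, 0.009, 0.04, 0.12, 0.33, 0.76])
    print(bh_qvalues(p))
    ```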

  6. Quantum key distribution with finite resources: Secret key rates via Renyi entropies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abruzzo, Silvestre; Kampermann, Hermann; Mertz, Markus

    A realistic quantum key distribution (QKD) protocol necessarily deals with finite resources, such as the number of signals exchanged by the two parties. We derive a bound on the secret key rate which is expressed as an optimization problem over Renyi entropies. Under the assumption of collective attacks by an eavesdropper, a computable estimate of our bound for the six-state protocol is provided. This bound leads to improved key rates in comparison to previous results.

  7. Sample entropy analysis for the estimating depth of anaesthesia through human EEG signal at different levels of unconsciousness during surgeries

    PubMed Central

    Fan, Shou-Zen; Abbod, Maysam F.

    2018-01-01

    Estimating the depth of anaesthesia (DoA) in operations has always been a challenging issue due to the underlying complexity of the brain mechanisms. Electroencephalogram (EEG) signals are undoubtedly the most widely used signals for measuring DoA. In this paper, a novel EEG-based index is proposed to evaluate DoA for 24 patients receiving general anaesthesia with different levels of unconsciousness. Sample Entropy (SampEn) algorithm was utilised in order to acquire the chaotic features of the signals. After calculating the SampEn from the EEG signals, Random Forest was utilised for developing learning regression models with Bispectral index (BIS) as the target. Correlation coefficient, mean absolute error, and area under the curve (AUC) were used to verify the perioperative performance of the proposed method. Validation comparisons with typical nonstationary signal analysis methods (i.e., recurrence analysis and permutation entropy) and regression methods (i.e., neural network and support vector machine) were conducted. To further verify the accuracy and validity of the proposed methodology, the data is divided into four unconsciousness-level groups on the basis of BIS levels. Subsequently, analysis of variance (ANOVA) was applied to the corresponding index (i.e., regression output). Results indicate that the correlation coefficient improved to 0.72 ± 0.09 after filtering and to 0.90 ± 0.05 after regression from the initial values of 0.51 ± 0.17. Similarly, the final mean absolute error dramatically declined to 5.22 ± 2.12. In addition, the ultimate AUC increased to 0.98 ± 0.02, and the ANOVA analysis indicates that each of the four groups of different anaesthetic levels demonstrated significant difference from the nearest levels. Furthermore, the Random Forest output was extensively linear in relation to BIS, thus with better DoA prediction accuracy. In conclusion, the proposed method provides a concrete basis for monitoring patients’ anaesthetic level during surgeries. PMID:29844970

  8. Conditional Entropy-Constrained Residual VQ with Application to Image Coding

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Chung, Wilson C.; Smith, Mark J. T.

    1996-01-01

    This paper introduces an extension of entropy-constrained residual vector quantization (VQ) where intervector dependencies are exploited. The method, which we call conditional entropy-constrained residual VQ, employs a high-order entropy conditioning strategy that captures local information in the neighboring vectors. When applied to coding images, the proposed method is shown to achieve better rate-distortion performance than that of entropy-constrained residual vector quantization with less computational complexity and lower memory requirements. Moreover, it can be designed to support progressive transmission in a natural way. It is also shown to outperform some of the best predictive and finite-state VQ techniques reported in the literature. This is due partly to the joint optimization between the residual vector quantizer and a high-order conditional entropy coder as well as the efficiency of the multistage residual VQ structure and the dynamic nature of the prediction.

  9. Determining distinct circuit in complete graphs using permutation

    NASA Astrophysics Data System (ADS)

    Karim, Sharmila; Ibrahim, Haslinda; Darus, Maizon Mohd

    2017-11-01

    The Half Butterfly Method (HBM) is a method for constructing the distinct circuits in complete graphs that uses the concept of isomorphism. The HBM has been applied in the field of combinatorics, for example in listing permutations of n elements. However, determining distinct circuits using the HBM becomes tedious for n > 4. Thus, in this paper, we present a method for generating distinct circuits using permutations.
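
    The abstract does not give the HBM construction itself, but the brute-force permutation approach it improves on is easy to sketch: fix a start vertex and drop reversed duplicates, which yields the (n-1)!/2 distinct Hamiltonian circuits of K_n. The code below is an illustrative assumption of that baseline, not the Half Butterfly Method.

```python
from itertools import permutations

def distinct_circuits(n):
    """Enumerate distinct Hamiltonian circuits of the complete graph K_n.

    Fix vertex 0 as the starting point and discard the reversed duplicate of
    each cycle, so each circuit appears exactly once: (n-1)!/2 in total.
    """
    circuits = []
    for rest in permutations(range(1, n)):
        # a cycle and its reverse describe the same circuit
        if rest[0] < rest[-1]:
            circuits.append((0,) + rest)
    return circuits

for n in (4, 5, 6):
    print(n, len(distinct_circuits(n)))   # 3, 12, 60 -> matches (n-1)!/2
```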

  10. Relative entropy of entanglement and restricted measurements.

    PubMed

    Piani, M

    2009-10-16

    We introduce variants of relative entropy of entanglement based on the optimal distinguishability from unentangled states by means of restricted measurements. In this way we are able to prove that the standard regularized entropy of entanglement is strictly positive for all multipartite entangled states. This implies that the asymptotic creation of a multipartite entangled state by means of local operations and classical communication always requires the consumption of a nonlocal resource at a strictly positive rate.

  11. A Versatile Platform for Nanotechnology Based on Circular Permutation of a Chaperonin Protein

    NASA Technical Reports Server (NTRS)

    Paavola, Chad; McMillan, Andrew; Trent, Jonathan; Chan, Suzanne; Mazzarella, Kellen; Li, Yi-Fen

    2004-01-01

    A number of protein complexes have been developed as nanoscale templates. These templates can be functionalized using the peptide sequences that bind inorganic materials. However, it is difficult to integrate peptides into a specific position within a protein template. Integrating intact proteins with desirable binding or catalytic activities is an even greater challenge. We present a general method for modifying protein templates using circular permutation so that additional peptide sequence can be added in a wide variety of specific locations. Circular permutation is a reordering of the polypeptide chain such that the original termini are joined and new termini are created elsewhere in the protein. New sequence can be joined to the protein termini without perturbing the protein structure and with minimal limitation on the size and conformation of the added sequence. We have used this approach to modify a chaperonin protein template, placing termini at five different locations distributed across the surface of the protein complex. These permutants are competent to form the double-ring structures typical of chaperonin proteins. The permuted double-rings also form the same assemblies as the unmodified protein. We fused a fluorescent protein to two representative permutants and demonstrated that it assumes its active structure and does not interfere with assembly of chaperonin double-rings.

  12. Determination of LEDs degradation with entropy generation rate

    NASA Astrophysics Data System (ADS)

    Cuadras, Angel; Yao, Jiaqiang; Quilez, Marcos

    2017-10-01

    We propose a method to assess the degradation and aging of light emitting diodes (LEDs) based on the irreversible entropy generation rate. We degraded several LEDs and monitored their entropy generation rate Ṡ in accelerated tests. We compared the thermoelectrical results with the evolution of optical light emission during degradation. We find a good relationship between aging and Ṡ(t), because Ṡ is related to both device parameters and optical performance. We propose a threshold of Ṡ(t) as a reliable damage indicator of LED end-of-life that can avoid the need to perform optical measurements to assess optical aging. The method goes beyond the typical statistical laws for lifetime prediction provided by manufacturers. We tested different LED colors and electrical stresses to validate the electrical LED model, and we analyzed the degradation mechanisms of the devices.

  13. The effect of orthostatic stress on multiscale entropy of heart rate and blood pressure.

    PubMed

    Turianikova, Zuzana; Javorka, Kamil; Baumert, Mathias; Calkovska, Andrea; Javorka, Michal

    2011-09-01

    Cardiovascular control acts over multiple time scales, which introduces a significant amount of complexity to heart rate and blood pressure time series. Multiscale entropy (MSE) analysis has been developed to quantify the complexity of a time series over multiple time scales. In previous studies, MSE analyses identified impaired cardiovascular control and increased cardiovascular risk in various pathological conditions. Despite the increasing acceptance of the MSE technique in clinical research, information underpinning the involvement of the autonomic nervous system in the MSE of heart rate and blood pressure is lacking. The objective of this study is to investigate the effect of orthostatic challenge on the MSE of heart rate and blood pressure variability (HRV, BPV) and the correlation between MSE (complexity measures) and traditional linear (time and frequency domain) measures. MSE analysis of HRV and BPV was performed in 28 healthy young subjects on 1000 consecutive heart beats in the supine and standing positions. Sample entropy values were assessed on scales of 1-10. We found that MSE of heart rate and blood pressure signals is sensitive to changes in autonomic balance caused by postural change from the supine to the standing position. The effect of orthostatic challenge on heart rate and blood pressure complexity depended on the time scale under investigation. Entropy values did not correlate with the mean values of heart rate and blood pressure and showed only weak correlations with linear HRV and BPV measures. In conclusion, the MSE analysis of heart rate and blood pressure provides a sensitive tool to detect changes in autonomic balance as induced by postural change.
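
    A minimal sketch of the MSE computation described here, assuming the usual coarse-graining scheme (non-overlapping means at scales 1..τ_max) followed by sample entropy. The compact sampen helper and the choice of m and r are illustrative assumptions, and implementations differ on whether r is taken from the original series or recomputed per scale.

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Compact sample entropy (same template-counting convention as above)."""
    x = np.asarray(x, float); tol = r * x.std()
    def count(mm):
        T = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        return sum(np.sum(np.max(np.abs(T[i + 1:] - T[i]), axis=1) < tol)
                   for i in range(len(T) - 1))
    A, B = count(m + 1), count(m)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

def multiscale_entropy(x, max_scale=10, m=2, r=0.2):
    """MSE curve: coarse-grain the series at scales 1..max_scale, then SampEn.

    Note: here r is recomputed from each coarse-grained series; some
    implementations instead fix r from the SD of the original series."""
    x = np.asarray(x, float)
    curve = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)  # non-overlapping means
        curve.append(sampen(coarse, m, r))
    return curve

rng = np.random.default_rng(2)
print([round(v, 2) for v in multiscale_entropy(rng.normal(size=2000), max_scale=5)])
```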

  14. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality, which may not hold in small samples. Creating a distribution from the observed trials using permutation methods to calculate P values may allow for fewer spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large-sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large-sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large-sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
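
    The study's backwards stepwise procedure is not reproduced here; the sketch below only illustrates the underlying idea of a permutation p-value for a single meta-regression covariate, shuffling the covariate across trials and recomputing the weighted slope each time. The helper names, the simple inverse-variance-style weighting, and the toy data are assumptions.

```python
import numpy as np

def wls_slope(y, x, w):
    """Weighted least-squares slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[1]

def permutation_p(y, x, w, n_perm=5000, rng=None):
    """Two-sided permutation p-value for the covariate slope: shuffle the
    covariate across trials and recompute the slope each time."""
    rng = rng or np.random.default_rng()
    obs = abs(wls_slope(y, x, w))
    perm = np.array([abs(wls_slope(y, rng.permutation(x), w))
                     for _ in range(n_perm)])
    return (np.sum(perm >= obs) + 1) / (n_perm + 1)

# toy example: 8 trials, effect sizes y, quality score x, trial weights w
rng = np.random.default_rng(3)
x = np.array([1, 2, 2, 3, 3, 4, 4, 5], float)
y = 0.3 * x + rng.normal(scale=0.5, size=8)
w = np.full(8, 1.0)
print("permutation p =", permutation_p(y, x, w, rng=rng))
```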

  15. Rank score and permutation testing alternatives for regression quantile estimates

    USGS Publications Warehouse

    Cade, B.S.; Richards, J.D.; Mielke, P.W.

    2006-01-01

    Performance of quantile rank score tests used for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1) was evaluated by simulation for models with p = 2 and 6 predictors, moderate collinearity among predictors, homogeneous and heterogeneous errors, small to moderate samples (n = 20–300), and central to upper quantiles (0.50–0.99). Test statistics evaluated were the conventional quantile rank score T statistic, distributed as a χ² random variable with q degrees of freedom (where q parameters are constrained by H0), and an F statistic with its sampling distribution approximated by permutation. The permutation F-test maintained better Type I errors than the T-test for homogeneous error models with smaller n and more extreme quantiles τ. An F distributional approximation of the F statistic provided some improvement in Type I errors over the T-test for models with > 2 parameters, smaller n, and more extreme quantiles, but not as much improvement as the permutation approximation. Both rank score tests required weighting to maintain correct Type I errors when heterogeneity under the alternative model increased to 5 standard deviations across the domain of X. A double permutation procedure was developed to provide valid Type I errors for the permutation F-test when null models were forced through the origin. Power was similar for conditions where both T- and F-tests maintained correct Type I errors, but the F-test provided some power at smaller n and extreme quantiles when the T-test had no power because of excessively conservative Type I errors. When the double permutation scheme was required for the permutation F-test to maintain valid Type I errors, power was less than for the T-test with decreasing sample size and increasing quantiles. Confidence intervals on parameters and tolerance intervals for future predictions were constructed based on test inversion for an example application relating trout densities to stream channel width:depth.

  16. Rapid and Accurate Multiple Testing Correction and Power Estimation for Millions of Correlated Markers

    PubMed Central

    Han, Buhm; Kang, Hyun Min; Eskin, Eleazar

    2009-01-01

    With the development of high-throughput sequencing and genotyping technologies, the number of markers collected in genetic association studies is growing rapidly, increasing the importance of methods for correcting for multiple hypothesis testing. The permutation test is widely considered the gold standard for accurate multiple testing correction, but it is often computationally impractical for these large datasets. Recently, several studies proposed efficient alternative approaches to the permutation test based on the multivariate normal distribution (MVN). However, they cannot accurately correct for multiple testing in genome-wide association studies for two reasons. First, these methods require partitioning of the genome into many disjoint blocks and ignore all correlations between markers from different blocks. Second, the true null distribution of the test statistic often fails to follow the asymptotic distribution at the tails of the distribution. We propose an accurate and efficient method for multiple testing correction in genome-wide association studies—SLIDE. Our method accounts for all correlation within a sliding window and corrects for the departure of the true null distribution of the statistic from the asymptotic distribution. In simulations using the Wellcome Trust Case Control Consortium data, the error rate of SLIDE's corrected p-values is more than 20 times smaller than the error rate of the previous MVN-based methods' corrected p-values, while SLIDE is orders of magnitude faster than the permutation test and other competing methods. We also extend the MVN framework to the problem of estimating the statistical power of an association study with correlated markers and propose an efficient and accurate power estimation method SLIP. SLIP and SLIDE are available at http://slide.cs.ucla.edu. PMID:19381255

  17. The right thalamus may play an important role in anesthesia-awakening regulation in frogs

    PubMed Central

    Fan, Yanzhu; Yue, Xizi; Xue, Fei; Brauth, Steven E.; Tang, Yezhong

    2018-01-01

    Background Previous studies have shown that the mammalian thalamus is a key structure for anesthesia-induced unconsciousness and anesthesia-awakening regulation. However, both the dynamic characteristics and probable lateralization of thalamic functioning during anesthesia-awakening regulation are not fully understood, and little is known of the evolutionary basis of the role of the thalamus in anesthesia-awakening regulation. Methods An amphibian species, the South African clawed frog (Xenopus laevis), was used in the present study. The frogs were immersed in tricaine methanesulfonate (MS-222) for general anesthesia. Electroencephalogram (EEG) signals were recorded continuously from both sides of the telencephalon, diencephalon (thalamus) and mesencephalon during the pre-anesthesia stage, administration stage, recovery stage and post-anesthesia stage. EEG data were analyzed, including calculation of approximate entropy (ApEn) and permutation entropy (PE). Results Both ApEn and PE values differed significantly between anesthesia stages, with the highest values occurring during the awakening period and the lowest values during the anesthesia period. There was a significant correlation between the stage durations and ApEn or PE values during the anesthesia-awakening cycle, primarily for the right diencephalon (right thalamus). ApEn and PE values for females were significantly higher than those for males. Discussion ApEn and PE measurements are suitable for estimating depth of anesthesia and complexity of amphibian brain activity. The right thalamus appears physiologically positioned to play an important role in anesthesia-awakening regulation in frogs, indicating an early evolutionary origin of the role of the thalamus in arousal and consciousness in land vertebrates. Sex differences exist in the neural regulation of general anesthesia in frogs. PMID:29576980
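
    For readers unfamiliar with the permutation entropy (PE) measure applied to these EEG recordings, a minimal Bandt-Pompe sketch follows; the embedding order, delay, and normalization are common defaults assumed for illustration, not the parameters used in the study.

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D signal.

    Each window of `order` samples (spaced by `delay`) is mapped to the
    ordinal pattern given by the ranks of its values; the entropy of the
    pattern distribution is returned, normalized by log(order!) if requested.
    """
    x = np.asarray(x, float)
    n = len(x) - (order - 1) * delay
    patterns = Counter(
        tuple(np.argsort(x[i:i + order * delay:delay])) for i in range(n)
    )
    probs = np.array(list(patterns.values()), float) / n
    H = -np.sum(probs * np.log(probs))
    return H / math.log(math.factorial(order)) if normalize else H

rng = np.random.default_rng(4)
t = np.linspace(0, 20 * np.pi, 3000)
print("sine :", round(permutation_entropy(np.sin(t)), 3))              # low: regular signal
print("noise:", round(permutation_entropy(rng.normal(size=3000)), 3))  # close to 1
```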

  18. Segment swapping aided the evolution of enzyme function: The case of uroporphyrinogen III synthase.

    PubMed

    Szilágyi, András; Györffy, Dániel; Závodszky, Péter

    2017-01-01

    In an earlier study, we showed that two-domain segment-swapped proteins can evolve by domain swapping and fusion, resulting in a protein with two linkers connecting its domains. We proposed that a potential evolutionary advantage of this topology may be the restriction of interdomain motions, which may facilitate domain closure by a hinge-like movement, crucial for the function of many enzymes. Here, we test this hypothesis computationally on uroporphyrinogen III synthase, a two-domain segment-swapped enzyme essential in porphyrin metabolism. To compare the interdomain flexibility between the wild-type, segment-swapped enzyme (having two interdomain linkers) and circular permutants of the same enzyme having only one interdomain linker, we performed geometric and molecular dynamics simulations for these species in their ligand-free and ligand-bound forms. We find that in the ligand-free form, interdomain motions in the wild-type enzyme are significantly more restricted than they would be with only one interdomain linker, while the flexibility difference is negligible in the ligand-bound form. We also estimated the entropy costs of ligand binding associated with the interdomain motions, and find that the change in domain connectivity due to segment swapping results in a reduction of this entropy cost, corresponding to ∼20% of the total ligand binding free energy. In addition, the restriction of interdomain motions may also help the functional domain-closure motion required for catalysis. This suggests that the evolution of the segment-swapped topology facilitated the evolution of enzyme function for this protein by influencing its dynamic properties. Proteins 2016; 85:46-53. © 2016 Wiley Periodicals, Inc.

  19. An analog scrambler for speech based on sequential permutations in time and frequency

    NASA Astrophysics Data System (ADS)

    Cox, R. V.; Jayant, N. S.; McDermott, B. J.

    Permutation of speech segments is an operation that is frequently used in the design of scramblers for analog speech privacy. In this paper, a sequential procedure for segment permutation is considered. This procedure can be extended to two dimensional permutation of time segments and frequency bands. By subjective testing it is shown that this combination gives a residual intelligibility for spoken digits of 20 percent with a delay of 256 ms. (A lower bound for this test would be 10 percent). The complexity of implementing such a system is considered and the issues of synchronization and channel equalization are addressed. The computer simulation results for the system using both real and simulated channels are examined.

  20. A 1.375-approximation algorithm for sorting by transpositions.

    PubMed

    Elias, Isaac; Hartman, Tzvika

    2006-01-01

    Sorting permutations by transpositions is an important problem in genome rearrangements. A transposition is a rearrangement operation in which a segment is cut out of the permutation and pasted in a different location. The complexity of this problem is still open and it has been a 10-year-old open problem to improve the best known 1.5-approximation algorithm. In this paper, we provide a 1.375-approximation algorithm for sorting by transpositions. The algorithm is based on a new upper bound on the diameter of 3-permutations. In addition, we present some new results regarding the transposition diameter: we improve the lower bound for the transposition diameter of the symmetric group and determine the exact transposition diameter of simple permutations.
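
    The 1.375-approximation algorithm itself is beyond a short sketch, but the transposition operation it sorts by (cut a segment out of the permutation and paste it elsewhere) can be illustrated directly; the indexing convention below (segment [i, j) pasted immediately before the element originally at index k) is one common choice and is assumed here.

```python
def transpose(perm, i, j, k):
    """Apply a transposition: cut the segment perm[i:j] and paste it
    immediately before the element originally at index k (k <= i or k >= j;
    k == len(perm) pastes the segment at the end)."""
    segment, rest = list(perm[i:j]), list(perm[:i]) + list(perm[j:])
    insert_at = k if k <= i else k - (j - i)
    return tuple(rest[:insert_at] + segment + rest[insert_at:])

p = (3, 1, 4, 2, 5)
# move segment (1, 4) so it sits just before the element 5 (index 4)
print(transpose(p, 1, 3, 4))   # -> (3, 2, 1, 4, 5)
```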

  1. Permutational distribution of the log-rank statistic under random censorship with applications to carcinogenicity assays.

    PubMed

    Heimann, G; Neuhaus, G

    1998-03-01

    In the random censorship model, the log-rank test is often used for comparing a control group with different dose groups. If the number of tumors is small, so-called exact methods are often applied for computing critical values from a permutational distribution. Two of these exact methods are discussed and shown to be incorrect. The correct permutational distribution is derived and studied with respect to its behavior under unequal censoring in the light of recent results proving that the permutational version and the unconditional version of the log-rank test are asymptotically equivalent even under unequal censoring. The log-rank test is studied by simulations of a realistic scenario from a bioassay with small numbers of tumors.

  2. The Shannon entropy as a measure of diffusion in multidimensional dynamical systems

    NASA Astrophysics Data System (ADS)

    Giordano, C. M.; Cincotta, P. M.

    2018-05-01

    In the present work, we introduce two new estimators of chaotic diffusion based on the Shannon entropy. Using theoretical, heuristic and numerical arguments, we show that the entropy, S, provides a measure of the diffusion extent of a given small initial ensemble of orbits, while an indicator related to the time derivative of the entropy, S', estimates the diffusion rate. We show that in the limiting case of near ergodicity, after an appropriate normalization, S' coincides with the standard homogeneous diffusion coefficient. The very first application of this formulation to a 4D symplectic map and to the Arnold Hamiltonian reveals very successful and encouraging results.

  3. Maximum Entropy Production As a Framework for Understanding How Living Systems Evolve, Organize and Function

    NASA Astrophysics Data System (ADS)

    Vallino, J. J.; Algar, C. K.; Huber, J. A.; Fernandez-Gonzalez, N.

    2014-12-01

    The maximum entropy production (MEP) principle holds that nonequilibrium systems with sufficient degrees of freedom will likely be found in a state that maximizes entropy production or, analogously, maximizes the potential energy destruction rate. The theory does not distinguish between abiotic and biotic systems; however, we will show that systems that can coordinate function over time and/or space can potentially dissipate more free energy than purely Markovian processes (such as fire or a rock rolling down a hill) that only maximize instantaneous entropy production. Biological systems have the ability to store useful information acquired via evolution and curated by natural selection in genomic sequences that allow them to execute temporal strategies and coordinate function over space. For example, circadian rhythms allow phototrophs to "predict" that sunlight will return and can orchestrate metabolic machinery appropriately before sunrise, which not only gives them a competitive advantage but also increases the total entropy production rate compared to systems that lack such anticipatory control. Similarly, coordination over space, such as quorum sensing in microbial biofilms, can increase acquisition of spatially distributed resources and free energy and thereby enhance entropy production. In this talk we will develop a modeling framework to describe microbial biogeochemistry based on the MEP conjecture constrained by information and resource availability. Results from model simulations will be compared to laboratory experiments to demonstrate the usefulness of the MEP approach.

  4. Gender-specific heart rate dynamics in severe intrauterine growth-restricted fetuses.

    PubMed

    Gonçalves, Hernâni; Bernardes, João; Ayres-de-Campos, Diogo

    2013-06-01

    Management of intrauterine growth restriction (IUGR) remains a major issue in perinatology. The objective of this paper was the assessment of gender-specific fetal heart rate (FHR) dynamics as a diagnostic tool in severe IUGR. FHR was analyzed in the antepartum period in 15 severe IUGR fetuses and 18 controls, matched for gestational age, in relation to fetal gender. Linear and entropy measures were computed, including mean FHR (mFHR); low (LF), high (HF) and movement frequency (MF) components; and approximate, sample and multiscale entropy. Sensitivities and specificities were estimated using Fisher linear discriminant analysis and the leave-one-out method. Overall, IUGR fetuses presented significantly lower mFHR and entropy compared with controls. However, gender-specific analysis showed that significantly lower mFHR was only evident in IUGR males and lower entropy in IUGR females. In addition, lower LF/(MF+HF) was evident in IUGR females compared with controls, but not in males. Rather high sensitivities and specificities were achieved in the detection of FHR recordings related to IUGR male fetuses when gender-specific analysis was performed at gestational ages less than 34 weeks. Severe IUGR fetuses present gender-specific linear and entropy FHR changes compared with controls, characterized by significantly lower entropy and sympathetic-vagal balance in females than in males. These findings need to be considered in order to achieve better diagnostic results. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Object-Based Land Use Classification of Agricultural Land by Coupling Multi-Temporal Spectral Characteristics and Phenological Events in Germany

    NASA Astrophysics Data System (ADS)

    Knoefel, Patrick; Loew, Fabian; Conrad, Christopher

    2015-04-01

    Crop maps based on classification of remotely sensed data are receiving increased attention in agricultural management. This calls for more detailed knowledge about the reliability of such spatial information. However, classification of agricultural land use is often limited by the high spectral similarity of the studied crop types. Moreover, spatially and temporally varying agro-ecological conditions can introduce confusion in crop mapping. Classification errors in crop maps may in turn influence model outputs, such as agricultural production monitoring. One major goal of the PhenoS project ("Phenological structuring to determine optimal acquisition dates for Sentinel-2 data for field crop classification") is the detection of optimal phenological time windows for land cover classification purposes. Since many crop species are spectrally highly similar, accurate classification requires the right selection of satellite images for a certain classification task. In the course of one growing season, phenological phases exist during which crops are separable with higher accuracy. For this purpose, coupling multi-temporal spectral characteristics and phenological events is promising. The focus of this study is the separation of spectrally similar cereal crops such as winter wheat, barley, and rye at two test sites in Germany, "Harz/Central German Lowland" and "Demmin". This study uses object-based random forest (RF) classification to investigate the impact of image acquisition frequency and timing on crop classification uncertainty by permuting all possible combinations of available RapidEye time series recorded at the test sites between 2010 and 2014. The permutations were applied to different segmentation parameters. Classification uncertainty was then assessed and analysed based on the probabilistic soft output from the RF algorithm at the per-field level. From this soft output, entropy was calculated as a spatial measure of classification uncertainty. The results indicate that uncertainty estimates provide a valuable addition to traditional accuracy assessments and help the user to locate errors in crop maps.

  6. Coexpression of Human α- and Circularly Permuted β-Globins Yields a Hemoglobin with Normal R State but Modified T State Properties†

    PubMed Central

    Asmundson, Anna L.; Taber, Alexandria M.; van der Walde, Adella; Lin, Danielle H.; Olson, John S.; Anthony-Cahill, Spencer J.

    2009-01-01

    For the first time, a circularly permuted human β-globin (cpβ) has been coexpressed with human α-globin in bacterial cells and shown to associate to form α-cpβ hemoglobin in solution. Flash photolysis studies of α-cpβ show markedly biphasic CO and O2 kinetics, with the amplitudes of the fast association phases being dominant due to the presence of large amounts of high-affinity liganded hemoglobin dimers. Extensive dimerization of liganded but not deoxygenated α-cpβ was observed by gel chromatography. The rate constants for O2 and CO binding to the R state forms of α-cpβ are almost identical to those of native HbA (k′R(CO) ≈ 5.0 μM⁻¹ s⁻¹; k′R(O2) ≈ 50 μM⁻¹ s⁻¹), and the rate of O2 dissociation from fully oxygenated α-cpβ is also very similar to that observed for HbA (kR(O2) ≈ 21–28 s⁻¹). When the equilibrium deoxyHb form of α-cpβ is reacted with CO in rapid mixing experiments, the observed time courses are monophasic and the observed bimolecular association rate constant is ∼1.0 μM⁻¹ s⁻¹, which is intermediate between the R state rate measured in partial photolysis experiments (∼5 μM⁻¹ s⁻¹) and that observed for T state deoxyHbA (k′T(CO) ≈ 0.1 to 0.2 μM⁻¹ s⁻¹). Thus the deoxygenated permuted β subunits generate an intermediate, higher-affinity, deoxyHb quaternary state. This conclusion is supported by equilibrium oxygen binding measurements in which α-cpβ exhibits a P50 of ∼1.5 mmHg and a low n-value (∼1.3) at pH 7, 20 °C, compared to 8.5 mmHg and n ≈ 2.8 for native HbA under identical, dilute conditions. PMID:19397368

  7. An efficient genome-wide association test for mixed binary and continuous phenotypes with applications to substance abuse research.

    PubMed

    Buu, Anne; Williams, L Keoki; Yang, James J

    2018-03-01

    We propose a new genome-wide association test for mixed binary and continuous phenotypes that uses an efficient numerical method to estimate the empirical distribution of the Fisher's combination statistic under the null hypothesis. Our simulation study shows that the proposed method controls the type I error rate and also maintains its power at the level of the permutation method. More importantly, the computational efficiency of the proposed method is much higher than that of the permutation method. The simulation results also indicate that the power of the test increases when the genetic effect increases, the minor allele frequency increases, and the correlation between responses decreases. The statistical analysis on the database of the Study of Addiction: Genetics and Environment demonstrates that the proposed method combining multiple phenotypes can increase the power of identifying markers that may not be, otherwise, chosen using marginal tests.
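
    The paper's efficient numerical estimation of the null distribution is not reproduced here; the sketch below only illustrates the ingredients named in the abstract: Fisher's combination of marginal p-values for a continuous and a binary phenotype, with an empirical null obtained by brute-force genotype permutation. The marginal tests chosen (linear regression and a two-sample t-test), the helper names, and the toy data are assumptions.

```python
import numpy as np
from scipy import stats

def fisher_stat(pvals):
    """Fisher's combination statistic: -2 * sum(log p)."""
    return -2.0 * np.sum(np.log(pvals))

def marker_pvalues(g, y_cont, y_bin):
    """Marginal association p-values of a genotype vector with a continuous
    phenotype (regression slope test) and a binary one (t-test on genotype
    between groups), used as simple stand-ins for the marginal tests."""
    p1 = stats.linregress(g, y_cont).pvalue
    p2 = stats.ttest_ind(g[y_bin == 1], g[y_bin == 0]).pvalue
    return np.array([p1, p2])

def combined_permutation_p(g, y_cont, y_bin, n_perm=2000, rng=None):
    """Permute the genotype vector to build the empirical null of the combined
    statistic; keeping the phenotypes fixed preserves their correlation."""
    rng = rng or np.random.default_rng()
    obs = fisher_stat(marker_pvalues(g, y_cont, y_bin))
    null = np.array([fisher_stat(marker_pvalues(rng.permutation(g), y_cont, y_bin))
                     for _ in range(n_perm)])
    return (np.sum(null >= obs) + 1) / (n_perm + 1)

rng = np.random.default_rng(5)
n = 300
g = rng.binomial(2, 0.3, n).astype(float)           # additive genotype coding 0/1/2
y_cont = 0.2 * g + rng.normal(size=n)               # weak effect on the continuous trait
y_bin = (0.2 * g + rng.normal(size=n) > 0.5).astype(int)
print("combined permutation p =", combined_permutation_p(g, y_cont, y_bin, rng=rng))
```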

  8. Accurate Memory for Object Location by Individuals with Intellectual Disability: Absolute Spatial Tagging Instead of Configural Processing?

    ERIC Educational Resources Information Center

    Giuliani, Fabienne; Favrod, Jerome; Grasset, Francois; Schenk, Francoise

    2011-01-01

    Using head-mounted eye tracker material, we assessed spatial recognition abilities (e.g., reaction to object permutation, removal or replacement with a new object) in participants with intellectual disabilities. The "Intellectual Disabilities (ID)" group (n = 40) obtained a score totalling a 93.7% success rate, whereas the "Normal Control" group…

  9. Evaluation of Second-Level Inference in fMRI Analysis

    PubMed Central

    Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs

    2016-01-01

    We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not take into account this variability. Second, one proceeds via inference based on parametrical assumptions or via permutation-based inference. Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with minimal cluster size. Based on a simulation study and real data we find that the two-step procedure with minimal cluster size results in most stable results, followed by the familywise error rate correction. The FDR results in most variable results, for both permutation-based inference and parametrical inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578

  10. Classification based upon gene expression data: bias and precision of error rates.

    PubMed

    Wood, Ian A; Visscher, Peter M; Mengersen, Kerrie L

    2007-06-01

    Gene expression data offer a large number of potentially useful predictors for the classification of tissue samples into classes, such as diseased and non-diseased. The predictive error rate of classifiers can be estimated using methods such as cross-validation. We have investigated issues of interpretation and potential bias in the reporting of error rate estimates. The issues considered here are optimization and selection biases, sampling effects, measures of misclassification rate, baseline error rates, two-level external cross-validation and a novel proposal for detection of bias using the permutation mean. Reporting an optimal estimated error rate incurs an optimization bias. Downward bias of 3-5% was found in an existing study of classification based on gene expression data and may be endemic in similar studies. Using a simulated non-informative dataset and two example datasets from existing studies, we show how bias can be detected through the use of label permutations and avoided using two-level external cross-validation. Some studies avoid optimization bias by using single-level cross-validation and a test set, but error rates can be more accurately estimated via two-level cross-validation. In addition to estimating the simple overall error rate, we recommend reporting class error rates plus where possible the conditional risk incorporating prior class probabilities and a misclassification cost matrix. We also describe baseline error rates derived from three trivial classifiers which ignore the predictors. R code which implements two-level external cross-validation with the PAMR package, experiment code, dataset details and additional figures are freely available for non-commercial use from http://www.maths.qut.edu.au/profiles/wood/permr.jsp
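
    One way to picture the permutation-mean idea mentioned above: re-estimate the cross-validated error after shuffling the class labels. For non-informative labels the mean error should sit near the chance baseline, and a value clearly below it signals leakage, such as feature selection performed outside the cross-validation. The sketch uses scikit-learn's random forest as a stand-in classifier (the study used R with the PAMR package), so the model choice and parameters are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def permutation_mean_error(X, y, model, n_perm=20, cv=5, rng=None):
    """Mean cross-validated error over label permutations.

    With shuffled labels the expected error equals the chance (baseline) rate;
    an estimate clearly below that suggests the evaluation scheme is biased."""
    rng = rng or np.random.default_rng()
    errors = []
    for _ in range(n_perm):
        y_perm = rng.permutation(y)
        acc = cross_val_score(model, X, y_perm, cv=cv).mean()
        errors.append(1.0 - acc)
    return float(np.mean(errors))

rng = np.random.default_rng(6)
X = rng.normal(size=(80, 500))             # many non-informative "genes"
y = np.array([0] * 40 + [1] * 40)
model = RandomForestClassifier(n_estimators=100, random_state=0)
print("permutation-mean error:", round(permutation_mean_error(X, y, model, rng=rng), 3))
# expected to be close to the 0.5 baseline for balanced classes
```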

  11. Estrogen pathway polymorphisms in relation to primary open angle glaucoma: An analysis accounting for gender from the United States

    PubMed Central

    Loomis, Stephanie J.; Weinreb, Robert N.; Kang, Jae H.; Yaspan, Brian L.; Bailey, Jessica Cooke; Gaasterland, Douglas; Gaasterland, Terry; Lee, Richard K.; Scott, William K.; Lichter, Paul R.; Budenz, Donald L.; Liu, Yutao; Realini, Tony; Friedman, David S.; McCarty, Catherine A.; Moroi, Sayoko E.; Olson, Lana; Schuman, Joel S.; Singh, Kuldev; Vollrath, Douglas; Wollstein, Gadi; Zack, Donald J.; Brilliant, Murray; Sit, Arthur J.; Christen, William G.; Fingert, John; Kraft, Peter; Zhang, Kang; Allingham, R. Rand; Pericak-Vance, Margaret A.; Richards, Julia E.; Hauser, Michael A.; Haines, Jonathan L.; Wiggs, Janey L.

    2013-01-01

    Purpose Circulating estrogen levels are relevant in glaucoma phenotypic traits. We assessed the association between an estrogen metabolism single nucleotide polymorphism (SNP) panel in relation to primary open angle glaucoma (POAG), accounting for gender. Methods We included 3,108 POAG cases and 3,430 controls of both genders from the Glaucoma Genes and Environment (GLAUGEN) study and the National Eye Institute Glaucoma Human Genetics Collaboration (NEIGHBOR) consortium genotyped on the Illumina 660W-Quad platform. We assessed the relation between the SNP panels representative of estrogen metabolism and POAG using pathway- and gene-based approaches with the Pathway Analysis by Randomization Incorporating Structure (PARIS) software. PARIS executes a permutation algorithm to assess statistical significance relative to the pathways and genes of comparable genetic architecture. These analyses were performed using the meta-analyzed results from the GLAUGEN and NEIGHBOR data sets. We evaluated POAG overall as well as two subtypes of POAG defined as intraocular pressure (IOP) ≥22 mmHg (high-pressure glaucoma [HPG]) or IOP <22 mmHg (normal pressure glaucoma [NPG]) at diagnosis. We conducted these analyses for each gender separately and then jointly in men and women. Results Among women, the estrogen SNP pathway was associated with POAG overall (permuted p=0.006) and HPG (permuted p<0.001) but not NPG (permuted p=0.09). Interestingly, there was no relation between the estrogen SNP pathway and POAG when men were considered alone (permuted p>0.99). Among women, gene-based analyses revealed that the catechol-O-methyltransferase gene showed strong associations with HTG (permuted gene p≤0.001) and NPG (permuted gene p=0.01). Conclusions The estrogen SNP pathway was associated with POAG among women. PMID:23869166

  12. Error-free holographic frames encryption with CA pixel-permutation encoding algorithm

    NASA Astrophysics Data System (ADS)

    Li, Xiaowei; Xiao, Dan; Wang, Qiong-Hua

    2018-01-01

    Securing video data is necessary for network transmission; cryptography is a technique for making video data unreadable to unauthorized users. In this paper, we propose a holographic frames encryption technique based on a cellular automata (CA) pixel-permutation encoding algorithm. The concise pixel-permutation algorithm is used to address the drawbacks of traditional CA encoding methods. The effectiveness of the proposed video encoding method is demonstrated by simulation examples.

  13. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    PubMed

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense: 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scans with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, and performed 2×10^5 permutations for a 2D QTL problem in 15 hours, using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.

  14. On the rates of decay to equilibrium in degenerate and defective Fokker-Planck equations

    NASA Astrophysics Data System (ADS)

    Arnold, Anton; Einav, Amit; Wöhrer, Tobias

    2018-06-01

    We establish sharp long time asymptotic behaviour for a family of entropies to defective Fokker-Planck equations and show that, much like defective finite dimensional ODEs, their decay rate is an exponential multiplied by a polynomial in time. The novelty of our study lies in the amalgamation of spectral theory and a quantitative non-symmetric hypercontractivity result, as opposed to the usual approach of the entropy method.

  15. Photographs and Committees: Activities That Help Students Discover Permutations and Combinations.

    ERIC Educational Resources Information Center

    Szydlik, Jennifer Earles

    2000-01-01

    Presents problem situations that support students when discovering the multiplication principle, permutations, combinations, Pascal's triangle, and relationships among those objects in a concrete context. (ASK)

  16. A permutation characterization of Sturm global attractors of Hamiltonian type

    NASA Astrophysics Data System (ADS)

    Fiedler, Bernold; Rocha, Carlos; Wolfrum, Matthias

    We consider Neumann boundary value problems of the form u_t = u_xx + f on the interval 0 ⩽ x ⩽ π for dissipative nonlinearities f = f(u). A permutation characterization for the global attractors of the semiflows generated by these equations is well known, even in the much more general case f = f(x, u, u_x). We present a permutation characterization for the global attractors in the restrictive class of nonlinearities f = f(u). In this class the stationary solutions of the parabolic equation satisfy the second order ODE v_xx + f(v) = 0, and we obtain the permutation characterization from a characterization of the set of 2π-periodic orbits of this planar Hamiltonian system. Our results are based on a diligent discussion of this mere pendulum equation.

  17. Statistical Entropy of Vaidya-de Sitter Black Hole to All Orders in Planck Length

    NASA Astrophysics Data System (ADS)

    Sun, HangBin; He, Feng; Huang, Hai

    2012-06-01

    Considering corrections to all orders in Planck length on the quantum state density from the generalized uncertainty principle, we calculate the statistical entropy of a scalar field near the event horizon and cosmological horizon of the Vaidya-de Sitter black hole without any artificial cutoff. It is shown that the entropy is a linear sum of the event horizon area and the cosmological horizon area, and there are similar proportional parameters related to the changing rate of the horizon position. This is different from the static and stationary cases.

  18. Cycling-Induced Changes in the Entropy Profiles of Lithium Cobalt Oxide Electrodes

    DOE PAGES

    Hudak, N. S.; Davis, L. E.; Nagasubramanian, G.

    2014-12-09

    Entropy profiles of lithium cobalt oxide (LiCoO2) electrodes were measured at various stages in the cycle life to examine performance degradation and cycling-induced changes, or lack thereof, in thermodynamics. LiCoO2 electrodes were cycled at C/2 rate in half-cells (vs. lithium anodes) up to 20 cycles or C/5 rate in full cells (vs. MCMB anodes) up to 500 cycles. The electrodes were then subjected to entropy measurements (∂E/∂T, where E is open-circuit potential and T is temperature) in half-cells at regular intervals over the approximate range 0.5 ≤ x ≤ 1 in LixCoO2. Despite significant losses in capacity upon cycling, neither cycling rate resulted in any change to the overall shape of the entropy profile relative to an uncycled electrode, indicating retention of the basic LiCoO2 structure, lithium insertion mechanism, and thermodynamics. This confirms that cycling-induced performance degradation in LiCoO2 electrodes is primarily caused by kinetic barriers that increase with cycling. In the case of electrodes cycled at C/5, there was a subtle, quantitative, and gradual change in the entropy profile in the narrow potential range of the hexagonal-to-monoclinic phase transition. The observed change is indicative of a decrease in the intralayer lithium ordering that occurs at these potentials, and it demonstrates that a cycling-induced structural disorder accompanies the kinetic degradation mechanisms.

  19. The third order correction on Hawking radiation and entropy conservation during black hole evaporation process

    NASA Astrophysics Data System (ADS)

    Yan, Hao-Peng; Liu, Wen-Biao

    2016-08-01

    Using the Parikh-Wilczek tunneling framework, we calculate the tunneling rate from a Schwarzschild black hole under the third order WKB approximation, and then obtain the expressions for the emission spectrum and black hole entropy to the third order correction. The entropy contains four terms: the Bekenstein-Hawking entropy, the logarithmic term, the inverse area term, and the square of the inverse area term. In addition, we analyse the correlation between sequential emissions under this approximation. It is shown that the entropy is conserved during the process of black hole evaporation, which is consistent with the requirements of quantum mechanics and implies that information is conserved during this process. We also compare the above result with that of the pure thermal spectrum case, and find that the non-thermal correction plays an important role.

  20. PBOOST: a GPU-based tool for parallel permutation tests in genome-wide association studies.

    PubMed

    Yang, Guangyuan; Jiang, Wei; Yang, Qiang; Yu, Weichuan

    2015-05-01

    The importance of testing associations allowing for interactions has been demonstrated by Marchini et al. (2005). A fast method detecting associations allowing for interactions has been proposed by Wan et al. (2010a). The method is based on the likelihood ratio test with the assumption that the statistic follows the χ² distribution. Many single nucleotide polymorphism (SNP) pairs with significant associations allowing for interactions have been detected using their method. However, the assumption of the χ² test requires the expected values in each cell of the contingency table to be at least five. This assumption is violated in some identified SNP pairs. In this case, the likelihood ratio test may not be applicable any more. The permutation test is an ideal approach to checking the P-values calculated in the likelihood ratio test because of its non-parametric nature. The P-values of SNP pairs having significant associations with disease are always extremely small. Thus, we need a huge number of permutations to achieve correspondingly high resolution for the P-values. In order to investigate whether the P-values from likelihood ratio tests are reliable, a fast permutation tool to accomplish a large number of permutations is desirable. We developed a permutation tool named PBOOST. It is based on GPU with highly reliable P-value estimation. By using simulation data, we found that the P-values from likelihood ratio tests will have relative error of >100% when 50% of cells in the contingency table have expected count less than five or when there is zero expected count in any of the contingency table cells. In terms of speed, PBOOST completed 10^7 permutations for a single SNP pair from the Wellcome Trust Case Control Consortium (WTCCC) genome data (Wellcome Trust Case Control Consortium, 2007) within 1 min on a single Nvidia Tesla M2090 device, while it took 60 min on a single CPU Intel Xeon E5-2650 to finish the same task. More importantly, when simultaneously testing 256 SNP pairs for 10^7 permutations, our tool took only 5 min, while the CPU program took 10 h. By permuting on a GPU cluster consisting of 40 nodes, we completed 10^12 permutations for all 280 SNP pairs reported with P-values smaller than 1.6 × 10⁻¹² in the WTCCC datasets in 1 week. The source code and sample data are available at http://bioinformatics.ust.hk/PBOOST.zip. gyang@ust.hk; eeyu@ust.hk Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
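
    PBOOST's GPU implementation and exact statistic are not reproduced here; the sketch below is a plain CPU illustration of the underlying test: permute case/control labels and recompute a likelihood-ratio (G²) statistic on the 9-genotype-combination by 2-phenotype table for one SNP pair. The statistic, table layout, and toy data are assumptions for illustration.

```python
import numpy as np

def g2_statistic(snp1, snp2, pheno):
    """Likelihood-ratio (G^2) statistic of the 9-genotype-combination x 2-phenotype table."""
    cell = 3 * snp1 + snp2                     # 0..8 joint genotype code
    table = np.zeros((9, 2))
    np.add.at(table, (cell, pheno), 1.0)
    expected = table.sum(1, keepdims=True) * table.sum(0, keepdims=True) / table.sum()
    mask = table > 0
    return 2.0 * np.sum(table[mask] * np.log(table[mask] / expected[mask]))

def permutation_p(snp1, snp2, pheno, n_perm=10000, rng=None):
    """Empirical p-value obtained by shuffling the phenotype labels."""
    rng = rng or np.random.default_rng()
    obs = g2_statistic(snp1, snp2, pheno)
    hits = sum(g2_statistic(snp1, snp2, rng.permutation(pheno)) >= obs
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

rng = np.random.default_rng(7)
n = 2000
snp1 = rng.binomial(2, 0.3, n)
snp2 = rng.binomial(2, 0.4, n)
# phenotype depends weakly on an interaction term, leaving some sparse cells
logit = -1.0 + 0.5 * (snp1 == 2) * (snp2 == 2)
pheno = rng.binomial(1, 1 / (1 + np.exp(-logit)))
print("permutation p =", permutation_p(snp1, snp2, pheno, n_perm=2000, rng=rng))
```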

  1. Engineering calculations for solving the orbital allotment problem

    NASA Technical Reports Server (NTRS)

    Reilly, C.; Walton, E. K.; Mount-Campbell, C.; Caldecott, R.; Aebker, E.; Mata, F.

    1988-01-01

    Four approaches for calculating downlink interferences for shaped-beam antennas are described. An investigation of alternative mixed-integer programming models for satellite synthesis is summarized. Plans for coordinating the various programs developed under this grant are outlined. Two procedures for ordering satellites to initialize the k-permutation algorithm are proposed. Results are presented for the k-permutation algorithms. Feasible solutions are found for 5 of the 6 problems considered. Finally, it is demonstrated that the k-permutation algorithm can be used to solve arc allotment problems.

  2. Quantifying selection and diversity in viruses by entropy methods, with application to the haemagglutinin of H3N2 influenza

    PubMed Central

    Pan, Keyao; Deem, Michael W.

    2011-01-01

    Many viruses evolve rapidly. For example, haemagglutinin (HA) of the H3N2 influenza A virus evolves to escape antibody binding. This evolution of the H3N2 virus means that people who have previously been exposed to an influenza strain may be infected by a newly emerged virus. In this paper, we use Shannon entropy and relative entropy to measure the diversity and selection pressure by an antibody in each amino acid site of H3 HA between the 1992–1993 season and the 2009–2010 season. Shannon entropy and relative entropy are two independent state variables that we use to characterize H3N2 evolution. The entropy method estimates future H3N2 evolution and migration using currently available H3 HA sequences. First, we show that the rate of evolution increases with the virus diversity in the current season. The Shannon entropy of the sequence in the current season predicts relative entropy between sequences in the current season and those in the next season. Second, a global migration pattern of H3N2 is assembled by comparing the relative entropy flows of sequences sampled in China, Japan, the USA and Europe. We verify this entropy method by describing two aspects of historical H3N2 evolution. First, we identify 54 amino acid sites in HA that have evolved in the past to evade the immune system. Second, the entropy method shows that epitopes A and B on the top of HA evolve most vigorously to escape antibody binding. Our work provides a novel entropy-based method to predict and quantify future H3N2 evolution and to describe the evolutionary history of H3N2. PMID:21543352
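
    A minimal sketch of the two per-site quantities described: the Shannon entropy of the amino-acid distribution at one HA site within a season, and the relative entropy (Kullback-Leibler divergence) between that site's distributions in consecutive seasons. The pseudocount, the direction of the divergence, and the toy alignment columns are assumptions; alignment handling and sequence weighting are omitted.

```python
import numpy as np
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def site_frequencies(column, pseudocount=0.5):
    """Amino-acid frequency vector for one alignment column (with pseudocounts)."""
    counts = Counter(column)
    freq = np.array([counts.get(a, 0) + pseudocount for a in AMINO_ACIDS], float)
    return freq / freq.sum()

def shannon_entropy(p):
    return float(-np.sum(p * np.log2(p)))

def relative_entropy(p, q):
    """KL divergence D(p || q); direction (current vs. next season) is a convention here."""
    return float(np.sum(p * np.log2(p / q)))

# toy "alignment columns" for one HA site in two consecutive seasons
season_now  = list("KKKKKKKKKKEEK")      # mostly K, some E
season_next = list("EEEEEEKKEEEEE")      # the E variant has taken over
p = site_frequencies(season_now)
q = site_frequencies(season_next)
print("entropy (current season):", round(shannon_entropy(p), 3))
print("relative entropy        :", round(relative_entropy(p, q), 3))
```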

  3. Prediction of Metabolite Concentrations, Rate Constants and Post-Translational Regulation Using Maximum Entropy-Based Simulations with Application to Central Metabolism of Neurospora crassa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, William; Zucker, Jeremy; Baxter, Douglas

    We report the application of a recently proposed approach for modeling biological systems using a maximum entropy production rate principle in lieu of having in vivo rate constants. The method is applied in four steps: (1) a new ODE-based optimization approach based on Marcelin’s 1910 mass action equation is used to obtain the maximum entropy distribution, (2) the predicted metabolite concentrations are compared to those generally expected from experiment using a loss function from which post-translational regulation of enzymes is inferred, (3) the system is re-optimized with the inferred regulation, from which rate constants are determined from the metabolite concentrations and reaction fluxes, and finally (4) a full ODE-based, mass action simulation with rate parameters and allosteric regulation is obtained. From the last step, the power characteristics and resistance of each reaction can be determined. The method is applied to the central metabolism of Neurospora crassa, and the flow of material through the three competing pathways of upper glycolysis, the non-oxidative pentose phosphate pathway, and the oxidative pentose phosphate pathway is evaluated as a function of the NADP/NADPH ratio. It is predicted that regulation of phosphofructokinase (PFK) and flow through the pentose phosphate pathway are essential for preventing an extreme level of fructose 1,6-bisphosphate accumulation. Such an extreme level of fructose 1,6-bisphosphate would otherwise result in a glassy cytoplasm with limited diffusion, dramatically decreasing the entropy and energy production rate and, consequently, biological competitiveness.

  4. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
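
    The ε-machine construction for hidden semi-Markov processes is well beyond a short example, but the baseline quantity it generalizes is easy to compute: the Shannon entropy rate of an ordinary finite Markov chain, h = -Σ_i π_i Σ_j P_ij log P_ij. The sketch below is that simpler illustration only, not the paper's method.

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution of an irreducible transition matrix P (rows sum to 1)."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def markov_entropy_rate(P, base=2):
    """Shannon entropy rate h = -sum_i pi_i sum_j P_ij log P_ij (bits/step for base 2)."""
    pi = stationary_distribution(P)
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P) / np.log(base), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))

# example: a biased two-state chain; an i.i.d. fair coin would give exactly 1 bit/step
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(round(markov_entropy_rate(P), 4))
```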

  5. Influence of hypobaric hypoxia on bispectral index and spectral entropy in volunteers.

    PubMed

    Ikeda, T; Yamada, S; Imada, T; Matsuda, H; Kazama, T

    2009-08-01

    Hypoxia has been shown to change electroencephalogram parameters including frequency and amplitude, and may thus change bispectral index (BIS) and spectral entropy values. If hypoxia per se changes BIS and spectral entropy values, BIS and spectral entropy values may not correctly reflect the depth of anaesthesia during hypoxia. The aim of this study was to examine the changes in BIS and spectral entropy values during hypobaric hypoxia in volunteers. The study was conducted in a high-altitude chamber with 11 volunteers. After the subjects breathed 100% oxygen for 15 min at ground level, the simulated altitude was increased gradually to the 7620 m (25,000 ft) level while the subjects continued to breathe oxygen. The subjects then stopped breathing oxygen and breathed room air at the 7620 m level for up to 5 min, until they requested to stop the hypoxic exposure. Oxygen saturation (SpO2), heart rate, 95% spectral edge frequency (SEF), BIS, response entropy (RE), and state entropy (SE) of spectral entropy were recorded throughout the study period. Of the 11 subjects, the seven subjects who underwent hypoxic exposure for 4 min were analysed. SpO2 decreased to 69% at the 7620 m level without oxygen. However, SEF, BIS, RE, and SE before and during hypoxic exposure were almost identical. These data suggest that hypoxia with an oxygen saturation around 70% does not have a strong effect on BIS and spectral entropy.

  6. Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level

    NASA Astrophysics Data System (ADS)

    Silva, Carlos; Annamalai, Kalyan

    2008-06-01

    The first and second laws of thermodynamics were applied to biochemical reactions typical of human metabolism. An open-system model was used for the human body. Energy conservation, availability and entropy balances were performed to obtain the entropy generated for the main food components. Quantitative results for entropy generation were obtained as a function of age using the databases from the U.S. Food and Nutrition Board (FNB) and Centers for Disease Control and Prevention (CDC), which provide energy requirements and food intake composition as a function of age, weight and stature. Numerical integration was performed over the human lifespan for different levels of physical activity. Results were presented and analyzed. Entropy generated over the lifespan of average individuals (natural death) was found to be 11,404 kJ/K per kg of body mass, with a rate of generation three times higher in infants than in the elderly. The entropy generated predicts a lifespan of 73.78 and 81.61 years for the average U.S. male and female individuals, respectively, values that closely match the average lifespans from statistics (74.63 and 80.36 years). From the analysis of the effect of different activity levels, it is shown that entropy generation increases with physical activity, suggesting that exercise should be kept to a “healthy minimum” if entropy generation is to be minimized.

  7. Enzyme catalysis by entropy without Circe effect.

    PubMed

    Kazemi, Masoud; Himo, Fahmi; Åqvist, Johan

    2016-03-01

    Entropic effects have often been invoked to explain the extraordinary catalytic power of enzymes. In particular, the hypothesis that enzymes can use part of the substrate-binding free energy to reduce the entropic penalty associated with the subsequent chemical transformation has been very influential. The enzymatic reaction of cytidine deaminase appears to be a distinct example. Here, substrate binding is associated with a significant entropy loss that closely matches the activation entropy penalty for the uncatalyzed reaction in water, whereas the activation entropy for the rate-limiting catalytic step in the enzyme is close to zero. Herein, we report extensive computer simulations of the cytidine deaminase reaction and its temperature dependence. The energetics of the catalytic reaction is first evaluated by density functional theory calculations. These results are then used to parametrize an empirical valence bond description of the reaction, which allows efficient sampling by molecular dynamics simulations and computation of Arrhenius plots. The thermodynamic activation parameters calculated by this approach are in excellent agreement with experimental data and indeed show an activation entropy close to zero for the rate-limiting transition state. However, the origin of this effect is a change of reaction mechanism compared to the uncatalyzed reaction. The enzyme operates by hydroxide ion attack, which is intrinsically associated with a favorable activation entropy. Hence, this has little to do with utilization of binding free energy to pay the entropic penalty but rather reflects how a preorganized active site can stabilize a reaction path that is not operational in solution.

  8. Enzyme catalysis by entropy without Circe effect

    PubMed Central

    Kazemi, Masoud; Himo, Fahmi; Åqvist, Johan

    2016-01-01

    Entropic effects have often been invoked to explain the extraordinary catalytic power of enzymes. In particular, the hypothesis that enzymes can use part of the substrate-binding free energy to reduce the entropic penalty associated with the subsequent chemical transformation has been very influential. The enzymatic reaction of cytidine deaminase appears to be a distinct example. Here, substrate binding is associated with a significant entropy loss that closely matches the activation entropy penalty for the uncatalyzed reaction in water, whereas the activation entropy for the rate-limiting catalytic step in the enzyme is close to zero. Herein, we report extensive computer simulations of the cytidine deaminase reaction and its temperature dependence. The energetics of the catalytic reaction is first evaluated by density functional theory calculations. These results are then used to parametrize an empirical valence bond description of the reaction, which allows efficient sampling by molecular dynamics simulations and computation of Arrhenius plots. The thermodynamic activation parameters calculated by this approach are in excellent agreement with experimental data and indeed show an activation entropy close to zero for the rate-limiting transition state. However, the origin of this effect is a change of reaction mechanism compared with the uncatalyzed reaction. The enzyme operates by hydroxide ion attack, which is intrinsically associated with a favorable activation entropy. Hence, this has little to do with utilization of binding free energy to pay the entropic penalty but rather reflects how a preorganized active site can stabilize a reaction path that is not operational in solution. PMID:26755610

  9. Physical and Biological Regulation of Carbon Sequestration in Tidal Marshes

    NASA Astrophysics Data System (ADS)

    Morris, J. T.; Callaway, J.

    2017-12-01

    The rate of carbon sequestration in tidal marshes is regulated by complex feedbacks among biological and physical factors including the rate of sea-level rise (SLR), biomass production, tidal amplitude, and the concentration of suspended sediment. We used the Marsh Equilibrium Model (MEM) to explore the effects on C-sequestration across a wide range of permutations of these variables. C-sequestration increased with the rate of SLR to a maximum and then declined to a vanishing point at higher SLR, when marshes convert to mudflats. An acceleration in SLR will increase C-sequestration in marshes that can keep pace, but at high rates of SLR this is only possible with high biomass and suspended sediment concentrations. We found that there were no feasible solutions at SLR >13 mm/yr for permutations of variables that characterize the great majority of tidal marshes, i.e., the equilibrium elevation lies below the lower vertical limit for survival of marsh vegetation. The rate of SLR resulting in maximum C-sequestration varies with biomass production. C-sequestration rates at SLR = 1 mm/yr averaged only 36 g C m-2 yr-1, but at the highest maximum biomass tested (5000 g/m2) the mean C-sequestration reached 399 g C m-2 yr-1 at SLR = 14 mm/yr. The empirical estimate of C-sequestration in a core dated to 50 years overestimates the theoretical long-term rate by 34% for realistic values of decomposition rate and belowground production. The overestimate of the empirical method arises from the live and decaying biomass contained within the carbon inventory above the marker horizon, and overestimates were even greater for shorter surface cores.

  10. Permutation invariant polynomial neural network approach to fitting potential energy surfaces. II. Four-atom systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jun; Jiang, Bin; Guo, Hua, E-mail: hguo@unm.edu

    2013-11-28

    A rigorous, general, and simple method to fit global and permutation invariant potential energy surfaces (PESs) using neural networks (NNs) is discussed. This so-called permutation invariant polynomial neural network (PIP-NN) method imposes permutation symmetry by using in its input a set of symmetry functions based on PIPs. For systems with more than three atoms, it is shown that the number of symmetry functions in the input vector needs to be larger than the number of internal coordinates in order to include both the primary and secondary invariant polynomials. This PIP-NN method is successfully demonstrated in three atom-triatomic reactive systems, resulting in full-dimensional global PESs with average errors on the order of meV. These PESs are used in full-dimensional quantum dynamical calculations.
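
    To make the idea of permutation-invariant inputs concrete, here is a minimal sketch for the simplest nontrivial case, an A2B molecule: interatomic distances are mapped to Morse-like variables and then combined into functions that are unchanged when the two identical atoms are swapped. The range parameter and the particular invariants are illustrative; the PIP-NN fits for four-atom systems use larger, fitted polynomial bases.

```python
# Illustrative construction of permutation-invariant NN inputs for an A2B
# molecule (atoms A1, A2, B).  The Morse range parameter `ALPHA` and the
# specific invariants are illustrative; the PIP-NN papers use fitted choices
# and larger polynomial bases for four-atom systems.
import numpy as np

ALPHA = 1.0  # Morse-variable range parameter (assumed value)

def pip_inputs(r_a1a2, r_a1b, r_a2b):
    """Map the three interatomic distances to inputs that are invariant
    under exchange of the two identical A atoms."""
    y_aa = np.exp(-ALPHA * r_a1a2)   # Morse-like variables
    y_1b = np.exp(-ALPHA * r_a1b)
    y_2b = np.exp(-ALPHA * r_a2b)
    # Symmetrized (permutation-invariant) combinations:
    g1 = y_aa                # unaffected by swapping A1 <-> A2
    g2 = y_1b + y_2b         # symmetric sum
    g3 = y_1b * y_2b         # symmetric product
    return np.array([g1, g2, g3])

# Swapping the two A atoms leaves the NN input unchanged:
print(pip_inputs(2.0, 1.8, 2.2))
print(pip_inputs(2.0, 2.2, 1.8))   # identical output
```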

  11. Naturalistic stimulation changes the dynamic response of action potential encoding in a mechanoreceptor

    PubMed Central

    Pfeiffer, Keram; French, Andrew S.

    2015-01-01

    Naturalistic signals were created from vibrations made by locusts walking on a Sansevieria plant. Both naturalistic and Gaussian noise signals were used to mechanically stimulate VS-3 slit-sense mechanoreceptor neurons of the spider, Cupiennius salei, with stimulus amplitudes adjusted to give similar firing rates for either stimulus. Intracellular microelectrodes recorded action potentials, receptor potential, and receptor current, using current clamp and voltage clamp. Frequency response analysis showed that naturalistic stimulation contained relatively more power at low frequencies, and caused increased neuronal sensitivity to higher frequencies. In contrast, varying the amplitude of Gaussian stimulation did not change neuronal dynamics. Naturalistic stimulation contained less entropy than Gaussian stimulation, but entropy in the resultant receptor current was higher than in the stimulus, indicating the addition of uncorrelated noise during transduction. The presence of added noise was supported by measuring linear information capacity in the receptor current. Total entropy and information capacity in action potentials produced by either stimulus were much lower than in earlier stages, and limited to the maximum entropy of binary signals. We conclude that the dynamics of action potential encoding in VS-3 neurons are sensitive to the form of stimulation, but entropy and information capacity of action potentials are limited by firing rate. PMID:26578975

  12. Spatio-temporal scaling effects on longshore sediment transport pattern along the nearshore zone

    NASA Astrophysics Data System (ADS)

    Khorram, Saeed; Ergil, Mustafa

    2018-03-01

    Entropy, a measure of uncertainty, has been employed in applications as diverse as probabilistic inference in coastal engineering. Theories that integrate entropy with sediment transport offer new perspectives for coastal analysis and modelling, although their development and application remain far from complete. This paper proposes a method based on an entropy-power index for the analysis of spatio-temporal patterns. Results from a beach-area case study show that the index is suitable for analysing the components of marine and hydrological ecosystems. The method uses monthly data (1970-2015) from six Makran coastal stations and studies variables such as spatio-temporal patterns, long-shore sediment transport rate (LSTR), wind speed, and wave height, all of which are time-dependent and play considerable roles in coastal investigations; these variables show meaningful spatio-temporal variability most of the time, but their combined behaviour is not easy to explain. Accordingly, an entropy-power index can reveal significant signals that facilitate the evaluation of water resources and provide insight into the interactions of hydrological parameters at scales as large as beach areas. Results show that an STDDPI (entropy-based spatio-temporal disorder dynamics power index) can simulate wave height, long-shore sediment transport rate, and wind when granulometry, concentration, and flow conditions vary.

  13. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study

    NASA Astrophysics Data System (ADS)

    Gao, Yun; Kontoyiannis, Ioannis; Bienenstock, Elie

    2008-06-01

    Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: The plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced; two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases.
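
    For orientation, the following is a minimal sketch of the plug-in estimator discussed above: estimate the empirical distribution of overlapping length-k words and divide the block entropy by k. The word lengths and the test sequence are illustrative, and they also illustrate the trade-off noted in the abstract (short words miss longer-range structure, long words are undersampled).

```python
# Minimal sketch of the plug-in entropy-rate estimator for a binary series:
# estimate the empirical distribution of overlapping length-k words and report
# the block entropy divided by k.  Word lengths and data are illustrative.
from collections import Counter
import math, random

def plugin_entropy_rate(bits, k):
    """Plug-in estimate (bits per symbol) from overlapping length-k words."""
    words = [tuple(bits[i:i + k]) for i in range(len(bits) - k + 1)]
    counts = Counter(words)
    n = len(words)
    block_entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return block_entropy / k

random.seed(0)
# Biased i.i.d. bits with P(1) = 0.3; true entropy rate ~ 0.881 bits/symbol
bits = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
for k in (2, 5, 10):
    print(k, round(plugin_entropy_rate(bits, k), 3))
```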

  14. Entropy emission properties of near-extremal Reissner-Nordström black holes

    NASA Astrophysics Data System (ADS)

    Hod, Shahar

    2016-05-01

    Bekenstein and Mayo have revealed an interesting property of evaporating (3+1)-dimensional Schwarzschild black holes: their entropy emission rates Ṡ_Sch are related to their energy emission rates P by the simple relation Ṡ_Sch = C_Sch × (P/ħ)^(1/2), where C_Sch is a numerically computed dimensionless coefficient. Remembering that (1+1)-dimensional perfect black-body emitters are characterized by the same functional relation, Ṡ_(1+1) = C_(1+1) × (P/ħ)^(1/2) [with C_(1+1) = (π/3)^(1/2)], Bekenstein and Mayo have concluded that, in their entropy emission properties, (3+1)-dimensional Schwarzschild black holes behave effectively as (1+1)-dimensional entropy emitters. Later studies have shown that this intriguing property is actually a generic feature of all radiating (D+1)-dimensional Schwarzschild black holes. One naturally wonders whether all black holes behave as simple (1+1)-dimensional entropy emitters. In order to address this interesting question, we shall study in this paper the entropy emission properties of Reissner-Nordström black holes. We shall show, in particular, that the physical properties which characterize the neutral sector of the Hawking emission spectra of these black holes can be studied analytically in the near-extremal T_BH → 0 regime (here T_BH is the Bekenstein-Hawking temperature of the black hole). We find that the Hawking radiation spectra of massless neutral scalar fields and coupled electromagnetic-gravitational fields are characterized by the nontrivial entropy-energy relations Ṡ_RN^Scalar = -C_RN^Scalar × (A P^3/ħ^3)^(1/4) ln(A P/ħ) and Ṡ_RN^Elec-Grav = -C_RN^Elec-Grav × (A^4 P^9/ħ^9)^(1/10) ln(A P/ħ) in the near-extremal T_BH → 0 limit (here {C_RN^Scalar, C_RN^Elec-Grav} are analytically calculated dimensionless coefficients and A is the surface area of the Reissner-Nordström black hole). Our analytical results therefore indicate that not all black holes behave as simple (1+1)-dimensional entropy emitters.

  15. Pattern formation in nonextensive thermodynamics: selection criterion based on the Renyi entropy production.

    PubMed

    Cybulski, Olgierd; Matysiak, Daniel; Babin, Volodymyr; Holyst, Robert

    2005-05-01

    We analyze a system of two different types of Brownian particles confined in a cubic box with periodic boundary conditions. Particles of different types annihilate when they come into close contact. The annihilation rate is matched by the birth rate, thus the total number of each kind of particles is conserved. When in a stationary state, the system is divided by an interface into two subregions, each occupied by one type of particles. All possible stationary states correspond to the Laplacian eigenfunctions. We show that the system evolves towards those stationary distributions of particles which minimize the Renyi entropy production. In all cases, the Renyi entropy production decreases monotonically during the evolution despite the fact that the topology and geometry of the interface exhibit abrupt and violent changes.
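
    For reference, the standard definition of the Rényi entropy on which the selection criterion's production rate is built is given below; this is the usual discrete form and not necessarily the exact normalization or continuous analogue used by the authors.

```latex
% Rényi entropy of order q for a discrete distribution {p_i} (standard form;
% the paper's normalization/continuous analogue may differ).
S_q \;=\; \frac{1}{1-q}\,\ln\!\left(\sum_i p_i^{\,q}\right),
\qquad q > 0,\ q \neq 1,
\qquad \lim_{q \to 1} S_q = -\sum_i p_i \ln p_i .
```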

  16. Inferring Markov chains: Bayesian estimation, model comparison, entropy rate, and out-of-class modeling.

    PubMed

    Strelioff, Christopher C; Crutchfield, James P; Hübler, Alfred W

    2007-07-01

    Markov chains are a natural and well understood tool for describing one-dimensional patterns in time or space. We show how to infer kth order Markov chains, for arbitrary k , from finite data by applying Bayesian methods to both parameter estimation and model-order selection. Extending existing results for multinomial models of discrete data, we connect inference to statistical mechanics through information-theoretic (type theory) techniques. We establish a direct relationship between Bayesian evidence and the partition function which allows for straightforward calculation of the expectation and variance of the conditional relative entropy and the source entropy rate. Finally, we introduce a method that uses finite data-size scaling with model-order comparison to infer the structure of out-of-class processes.
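
    A minimal sketch of the kind of estimate the abstract describes is given below: Dirichlet-smoothed (posterior-mean) transition probabilities for a first-order binary Markov chain, followed by the entropy rate of the estimated chain. The prior strength and the simulated data are illustrative, and the paper's evidence calculation and model-order selection are not reproduced.

```python
# Sketch: Dirichlet posterior-mean estimate of a first-order Markov chain from
# a binary sequence, and the entropy rate of the estimated chain.  The prior
# strength and data are illustrative; the Bayesian evidence and model-order
# selection discussed in the paper are omitted here.
import numpy as np

def estimate_entropy_rate(seq, alpha=1.0):
    counts = np.zeros((2, 2))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    # Posterior-mean transition matrix under a Dirichlet(alpha, ..., alpha) prior
    T = (counts + alpha) / (counts + alpha).sum(axis=1, keepdims=True)
    # Stationary distribution = left eigenvector of T for eigenvalue 1
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi = pi / pi.sum()
    # Entropy rate h = -sum_i pi_i sum_j T_ij ln T_ij   (nats per symbol)
    return -np.sum(pi[:, None] * T * np.log(T))

rng = np.random.default_rng(1)
# Simulate a "sticky" two-state chain with P(stay) = 0.9; true rate ~ 0.325 nats
seq, s = [], 0
for _ in range(50_000):
    s = s if rng.random() < 0.9 else 1 - s
    seq.append(s)
print("estimated entropy rate (nats/symbol):", round(estimate_entropy_rate(seq), 4))
```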

  17. Influence of measurement error on Maxwell's demon

    NASA Astrophysics Data System (ADS)

    Sørdal, Vegard; Bergli, Joakim; Galperin, Y. M.

    2017-06-01

    In any general cycle of measurement, feedback, and erasure, the measurement will reduce the entropy of the system when information about the state is obtained, while erasure, according to Landauer's principle, is accompanied by a corresponding increase in entropy due to the compression of logical and physical phase space. The total process can in principle be fully reversible. A measurement error reduces the information obtained and the entropy decrease in the system. The erasure still gives the same increase in entropy, and the total process is irreversible. Another consequence of measurement error is that a bad feedback is applied, which further increases the entropy production if the proper protocol adapted to the expected error rate is not applied. We consider the effect of measurement error on a realistic single-electron box Szilard engine, and we find the optimal protocol for the cycle as a function of the desired power P and error ɛ .
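
    As generic background (not the specific single-electron-box protocol optimized in the paper), a symmetric binary measurement with error probability ε yields the information and work bound below; the bound assumes ideal feedback and erasure.

```latex
% Information gained by a symmetric binary measurement with error probability
% \varepsilon (uniform prior), and the generic second-law bound on the work
% extractable per cycle of a Szilard-type engine at temperature T.
I(\varepsilon) \;=\; \ln 2 + \varepsilon \ln \varepsilon
                 + (1-\varepsilon)\ln(1-\varepsilon)
\quad (\text{nats}),
\qquad
W_{\mathrm{ext}} \;\le\; k_B T\, I(\varepsilon).
```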

  18. Effect of extreme data loss on heart rate signals quantified by entropy analysis

    NASA Astrophysics Data System (ADS)

    Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao

    2015-02-01

    The phenomenon of data loss always occurs in the analysis of large databases, so maintaining the stability of analysis results in the event of data loss is very important. In this paper, we used a segmentation approach to generate synthetic signals in which data segments are randomly removed from the original signal according to a Gaussian or an exponential distribution. The logistic map is then used for verification. Finally, two methods of measuring entropy, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) Two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to the logistic map tests. (2) Base-scale entropy analysis gives more stable results and is not sensitive to data loss. (3) The loss percentage of HRV signals should be kept below about 30% (p = 30%), at which level the signals can still provide useful information in clinical applications.
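
    For concreteness, a minimal approximate-entropy implementation is sketched below. The embedding dimension m = 2 and tolerance r = 0.2·SD are common defaults rather than the settings used in the paper, and the base-scale entropy variant is not shown.

```python
# Minimal approximate entropy (ApEn) sketch.  m = 2 and r = 0.2*std are common
# defaults, not necessarily the paper's settings; base-scale entropy is not
# reproduced here.
import numpy as np

def apen(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])            # embedded vectors
        # Chebyshev distance between all pairs of embedded vectors
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).mean(axis=1)                                  # C_i^m(r), self-match included
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
print("white noise :", round(apen(rng.standard_normal(1000)), 3))
print("sine wave   :", round(apen(np.sin(np.linspace(0, 40 * np.pi, 1000))), 3))
```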

  19. Entropy Production in Collisionless Systems. II. Arbitrary Phase-space Occupation Numbers

    NASA Astrophysics Data System (ADS)

    Barnes, Eric I.; Williams, Liliya L. R.

    2012-04-01

    We present an analysis of two thermodynamic techniques for determining equilibria of self-gravitating systems. One is the Lynden-Bell (LB) entropy maximization analysis that introduced violent relaxation. Since we do not use the Stirling approximation, which is invalid at small occupation numbers, our systems have finite mass, unlike LB's isothermal spheres. (Instead of Stirling, we utilize a very accurate smooth approximation for ln x!.) The second analysis extends entropy production extremization to self-gravitating systems, also without the use of the Stirling approximation. In addition to the LB statistical family characterized by the exclusion principle in phase space, and designed to treat collisionless systems, we also apply the two approaches to the Maxwell-Boltzmann (MB) families, which have no exclusion principle and hence represent collisional systems. We implicitly assume that all of the phase space is equally accessible. We derive entropy production expressions for both families and give the extremum conditions for entropy production. Surprisingly, our analysis indicates that extremizing entropy production rate results in systems that have maximum entropy, in both LB and MB statistics. In other words, both thermodynamic approaches lead to the same equilibrium structures.

  20. Trends in entropy production during ecosystem development in the Amazon Basin.

    PubMed

    Holdaway, Robert J; Sparrow, Ashley D; Coomes, David A

    2010-05-12

    Understanding successional trends in energy and matter exchange across the ecosystem-atmosphere boundary layer is an essential focus in ecological research; however, a general theory describing the observed pattern remains elusive. This paper examines whether the principle of maximum entropy production could provide the solution. A general framework is developed for calculating entropy production using data from terrestrial eddy covariance and micrometeorological studies. We apply this framework to data from eight tropical forest and pasture flux sites in the Amazon Basin and show that forest sites had consistently higher entropy production rates than pasture sites (0.461 versus 0.422 W m^-2 K^-1, respectively). It is suggested that during development, changes in canopy structure minimize surface albedo, and development of deeper root systems optimizes access to soil water and thus potential transpiration, resulting in lower surface temperatures and increased entropy production. We discuss our results in the context of a theoretical model of entropy production versus ecosystem developmental stage. We conclude that, although further work is required, entropy production could potentially provide a much-needed theoretical basis for understanding the effects of deforestation and land-use change on the land-surface energy balance.

  1. Comparison of entropy and bispectral index during propofol and fentanyl sedation in monitored anaesthesia care.

    PubMed

    Balci, Canan; Karabekir, H S; Kahraman, F; Sivaci, R G

    2009-01-01

    Comparison of entropy (state entropy [SE] and response entropy [RE]) with the bispectral index (BIS) during propofol sedation in monitored anaesthesia care (MAC) was carried out in patients undergoing hand surgery. Thirty candidates for elective hand surgery were pre-medicated with midazolam 0.06 mg/kg and atropine 0.01 mg/kg. Sedation was induced with intravenous propofol and fentanyl was also administered. The Modified Observer's Assessment of Alertness/Sedation Scale (MOAA/S) was used to determine sedation level and pain was maintained at < 4 on a 0 - 10 verbal rating scale. The BIS, entropy, MOAA/S and pain values were recorded before initiation of sedation (control), during initiation of sedation, during surgery, and for 30 min after the end of surgery and anaesthesia. On initiation of sedation, entropy decreased more rapidly than BIS. At 10 min after initiation of sedation, the mean +/- SD values for MOAA/S, BIS, RE and SE were 3.00 +/- 0.36, 85.45 +/- 0.15, 74.00 +/- 0.60 and 72.02 +/- 0.12, respectively. During recovery, BIS and RE and SE increased in parallel with MOAA/S. It is concluded that entropy monitoring is as reliable as BIS monitoring in MAC.

  2. Approximate convective heating equations for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Zoby, E. V.; Moss, J. N.; Sutton, K.

    1979-01-01

    Laminar and turbulent heating-rate equations appropriate for engineering predictions of the convective heating rates about blunt reentry spacecraft at hypersonic conditions are developed. The approximate methods are applicable to both nonreacting and reacting gas mixtures for either constant or variable-entropy edge conditions. A procedure which accounts for variable-entropy effects and is not based on mass balancing is presented. Results of the approximate heating methods are in good agreement with existing experimental results as well as boundary-layer and viscous-shock-layer solutions.

  3. Entropy in Postmerger and Acquisition Integration from an Information Technology Perspective

    ERIC Educational Resources Information Center

    Williams, Gloria S.

    2012-01-01

    Mergers and acquisitions have historically experienced failure rates from 50% to more than 80%. Successful integration of information technology (IT) systems can be the difference between postmerger success or failure. The purpose of this phenomenological study was to explore the entropy phenomenon during postmerger IT integration. To that end, a…

  4. Multiple comparisons permutation test for image based data mining in radiotherapy.

    PubMed

    Chen, Chun; Witte, Marnix; Heemsbergen, Wilma; van Herk, Marcel

    2013-12-23

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy.
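
    The sketch below illustrates a max-statistic permutation test in the spirit of the Tmax procedure: compute a voxelwise two-sample statistic, summarize it by its maximum, permute the outcome labels to build the null distribution of that maximum, and read off multiplicity-adjusted p-values. The voxelwise t-statistic, image sizes and permutation count are illustrative choices, not necessarily the authors' exact definitions.

```python
# Sketch of a max-statistic permutation test on images (e.g. dose
# distributions), in the spirit of the Tmax procedure described above.  The
# exact statistic used in the paper may differ; here a voxelwise two-sample
# t-statistic is summarized by its maximum.
import numpy as np

def voxelwise_t(images, labels):
    g1, g2 = images[labels == 1], images[labels == 0]
    m1, m2 = g1.mean(axis=0), g2.mean(axis=0)
    s = np.sqrt(g1.var(axis=0, ddof=1) / len(g1) + g2.var(axis=0, ddof=1) / len(g2))
    return (m1 - m2) / (s + 1e-12)

def tmax_permutation_test(images, labels, n_perm=2000, seed=0):
    rng = np.random.default_rng(seed)
    t_obs = voxelwise_t(images, labels)
    null_max = np.empty(n_perm)
    for b in range(n_perm):
        null_max[b] = np.abs(voxelwise_t(images, rng.permutation(labels))).max()
    # Adjusted p-value per voxel: fraction of permuted maxima exceeding |t_obs|
    p_adj = (null_max[None, :] >= np.abs(t_obs)[..., None]).mean(axis=-1)
    return t_obs, p_adj

# Toy example: 40 patients, 16x16 "dose images", outcome associated with one corner
rng = np.random.default_rng(1)
labels = np.array([1] * 20 + [0] * 20)
images = rng.normal(size=(40, 16, 16))
images[labels == 1, :4, :4] += 1.5       # true dose-outcome difference region
t_obs, p_adj = tmax_permutation_test(images, labels)
print("significant voxels (p_adj < 0.05):", int((p_adj < 0.05).sum()))
```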

  5. Combining early post-resuscitation EEG and HRV features improves the prognostic performance in cardiac arrest model of rats.

    PubMed

    Dai, Chenxi; Wang, Zhi; Wei, Liang; Chen, Gang; Chen, Bihua; Zuo, Feng; Li, Yongqin

    2018-04-09

    Early and reliable prediction of neurological outcome remains a challenge for comatose survivors of cardiac arrest (CA). The purpose of this study was to evaluate the predictive ability of EEG features, heart rate variability (HRV) features and their combination for outcome prognostication in a CA model of rats. Forty-eight male Sprague-Dawley rats were randomized into 6 groups (n=8 each) with different cause and duration of untreated arrest. Cardiopulmonary resuscitation was initiated after 5, 6 and 7min of ventricular fibrillation or 4, 6 and 8min of asphyxia. EEG and ECG were continuously recorded for 4h under normothermia after resuscitation. The relationships between features of early post-resuscitation EEG, HRV and 96-hour outcome were investigated. Prognostic performances were evaluated using the area under receiver operating characteristic curve (AUC). All of the animals were successfully resuscitated and 27 of them survived to 96h. Weighted-permutation entropy (WPE) and normalized high frequency (nHF) outperformed other EEG and HRV features for the prediction of survival. The AUC of WPE was markedly higher than that of nHF (0.892 vs. 0.759, p<0.001). The AUC was 0.954 when WPE and nHF were combined using a logistic regression model, which was significantly higher than the individual EEG (p=0.018) and HRV (p<0.001) features. Early post-resuscitation HRV provided prognostic information complementary to quantitative EEG in the CA model of rats. The combination of EEG and HRV features improves the performance of outcome prognostication compared with either EEG- or HRV-based features alone. Copyright © 2018. Published by Elsevier Inc.
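
    A minimal sketch of weighted-permutation entropy, the EEG feature highlighted above, is given below: ordinal patterns of embedded windows are weighted by each window's variance (following Fadlallah et al.). The embedding dimension, delay and normalization are common choices and not necessarily those used in the study.

```python
# Sketch of weighted-permutation entropy (WPE): ordinal patterns of embedded
# windows are weighted by each window's variance.  Embedding dimension, delay
# and normalization are common defaults, not necessarily the study's settings.
import numpy as np
from math import factorial

def weighted_permutation_entropy(x, m=4, tau=1):
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    pattern_weights = {}
    total_w = 0.0
    for i in range(n):
        v = x[i:i + m * tau:tau]
        pat = tuple(np.argsort(v))          # ordinal pattern of the window
        w = v.var()                         # weight = variance of the window
        pattern_weights[pat] = pattern_weights.get(pat, 0.0) + w
        total_w += w
    p = np.array([c / total_w for c in pattern_weights.values()])
    return -np.sum(p * np.log(p)) / np.log(factorial(m))   # normalized to [0, 1]

rng = np.random.default_rng(0)
print("white noise:", round(weighted_permutation_entropy(rng.standard_normal(5000)), 3))
print("sine wave  :", round(weighted_permutation_entropy(np.sin(np.linspace(0, 100 * np.pi, 5000))), 3))
```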

  6. Complex-enhanced chaotic signals with time-delay signature suppression based on vertical-cavity surface-emitting lasers subject to chaotic optical injection

    NASA Astrophysics Data System (ADS)

    Chen, Jianjun; Duan, Yingni; Zhong, Zhuqiang

    2018-06-01

    A chaotic system is constructed on the basis of vertical-cavity surface-emitting lasers (VCSELs), in which a slave VCSEL is subject to chaotic optical injection (COI) from a master VCSEL with external feedback. The complex degree (CD) and time-delay signature (TDS) of chaotic signals generated by this chaotic system are investigated numerically via permutation entropy (PE) and self-correlation function (SF) methods, respectively. The results show that, compared with the master VCSEL subject to optical feedback, complex-enhanced chaotic signals with TDS suppression can be achieved for the S-VCSEL subject to COI. Meanwhile, the influences of several controllable parameters on the evolution maps of the CD of chaotic signals are carefully considered. It is shown that the CD of chaotic signals for the S-VCSEL is always higher than that for the M-VCSEL due to the COI effect. The TDS of chaotic signals can be significantly suppressed by choosing reasonable parameters in this system. Furthermore, TDS suppression and high-CD chaos can be obtained simultaneously in specific parameter ranges. The results confirm that this chaotic system may effectively improve the security of a chaos-based communication scheme.
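
    To illustrate how a self-correlation (autocorrelation) function exposes a time-delay signature, the toy sketch below adds a delayed echo to a chaotic logistic-map sequence and recovers the delay from the location of the correlation peak; no VCSEL rate equations are simulated.

```python
# Sketch of time-delay-signature (TDS) identification with a self-correlation
# (autocorrelation) function.  A chaotic logistic-map sequence with a delayed
# echo stands in for the laser intensity; actual VCSEL rate equations are not
# simulated here.
import numpy as np

def self_correlation(x, max_lag):
    x = (x - x.mean()) / x.std()
    return np.array([np.mean(x[:-k] * x[k:]) for k in range(1, max_lag + 1)])

# Chaotic carrier: logistic map at r = 4
n, delay = 50_000, 120
z = np.empty(n + delay)
z[0] = 0.37
for t in range(len(z) - 1):
    z[t + 1] = 4.0 * z[t] * (1.0 - z[t])

signal = z[delay:] + 0.8 * z[:-delay]      # delayed "feedback" echo at lag 120
sf = self_correlation(signal, max_lag=300)
print("estimated TDS (lag of peak |SF|):", int(np.argmax(np.abs(sf))) + 1)
```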

  7. Complex-enhanced chaotic signals with time-delay signature suppression based on vertical-cavity surface-emitting lasers subject to chaotic optical injection

    NASA Astrophysics Data System (ADS)

    Chen, Jianjun; Duan, Yingni; Zhong, Zhuqiang

    2018-03-01

    A chaotic system is constructed on the basis of vertical-cavity surface-emitting lasers (VCSELs), in which a slave VCSEL is subject to chaotic optical injection (COI) from a master VCSEL with external feedback. The complex degree (CD) and time-delay signature (TDS) of chaotic signals generated by this chaotic system are investigated numerically via permutation entropy (PE) and self-correlation function (SF) methods, respectively. The results show that, compared with the master VCSEL subject to optical feedback, complex-enhanced chaotic signals with TDS suppression can be achieved for the S-VCSEL subject to COI. Meanwhile, the influences of several controllable parameters on the evolution maps of the CD of chaotic signals are carefully considered. It is shown that the CD of chaotic signals for the S-VCSEL is always higher than that for the M-VCSEL due to the COI effect. The TDS of chaotic signals can be significantly suppressed by choosing reasonable parameters in this system. Furthermore, TDS suppression and high-CD chaos can be obtained simultaneously in specific parameter ranges. The results confirm that this chaotic system may effectively improve the security of a chaos-based communication scheme.

  8. A novel algorithm for thermal image encryption.

    PubMed

    Hussain, Iqtadar; Anees, Amir; Algarni, Abdulmohsen

    2018-04-16

    Thermal images play a vital role at nuclear plants, power stations, forensic labs, biological research facilities, and in petroleum product extraction, so the security of thermal images is very important. Image data have unique features, such as intensity, contrast, homogeneity, entropy and correlation among pixels, which make image encryption somewhat trickier than other kinds of encryption, and these features are normally hard to handle with conventional image encryption schemes. Cryptographers have therefore paid attention to attractive properties of chaotic maps, such as randomness and sensitivity, to build novel cryptosystems, and recently proposed image encryption techniques increasingly depend on chaotic maps. This paper proposes an image encryption algorithm based on the Chebyshev chaotic map and substitution boxes built from the S8 symmetric group of permutations. First, the parameters of the Chebyshev chaotic map are chosen as a secret key to confuse the plain image. Then, the image is encrypted by the method generated from the substitution boxes and the Chebyshev map, producing a ciphertext image that is thoroughly confused and diffused. The outcomes of standard experiments, key sensitivity tests and statistical analyses confirm that the proposed algorithm offers a secure and efficient approach for real-time image encryption.
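
    The sketch below shows only the chaotic component of such a scheme: the Chebyshev map x_{n+1} = cos(k·arccos(x_n)) used as a key-dependent keystream generator, followed by a simple XOR diffusion step. The byte quantization is illustrative and the S8 substitution boxes of the published algorithm are omitted.

```python
# Sketch: Chebyshev chaotic map x_{n+1} = cos(k * arccos(x_n)) used as a
# key-dependent keystream generator, with a simple XOR diffusion step.  The
# byte quantization and the omission of the S8 substitution boxes make this an
# illustration of the chaotic component only, not the full published scheme.
import numpy as np

def chebyshev_keystream(length, x0=0.32, k=5.0, burn_in=500):
    """Generate `length` pseudo-random bytes from the Chebyshev map."""
    x = x0
    out = np.empty(length, dtype=np.uint8)
    for i in range(burn_in + length):
        x = np.cos(k * np.arccos(x))           # Chebyshev map, x stays in [-1, 1]
        if i >= burn_in:
            out[i - burn_in] = int(((x + 1.0) / 2.0) * 255.999)  # quantize to a byte
    return out

rng = np.random.default_rng(0)
plain = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)     # toy 8x8 "image"
ks = chebyshev_keystream(plain.size).reshape(plain.shape)
cipher = plain ^ ks                                            # diffusion by XOR
assert np.array_equal(cipher ^ ks, plain)                      # decryption recovers the image
print(cipher)
```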

  9. A new feedback image encryption scheme based on perturbation with dynamical compound chaotic sequence cipher generator

    NASA Astrophysics Data System (ADS)

    Tong, Xiaojun; Cui, Minggen; Wang, Zhu

    2009-07-01

    The design of a new compound two-dimensional chaotic function is presented, exploiting two one-dimensional chaotic functions that switch randomly; the design is used as a chaotic sequence generator, whose chaotic behaviour is proved using Devaney's definition of chaos. The properties of the compound chaotic functions are also proved rigorously. In order to improve robustness against differential cryptanalysis and produce an avalanche effect, a new feedback image encryption scheme is proposed using the new compound chaos, selecting one of the two one-dimensional chaotic functions randomly, and a new pixel permutation and substitution method is designed in detail, with image rows and columns controlled randomly on the basis of the compound chaos. The results of entropy analysis, difference analysis, statistical analysis, sequence randomness analysis, and cipher sensitivity analysis with respect to key and plaintext show that the compound chaotic sequence cipher can resist cryptanalytic, statistical and brute-force attacks, while accelerating encryption and achieving a higher level of security. Through dynamical compound chaos and perturbation technology, the paper addresses the problem of the limited computational precision of one-dimensional chaotic functions.

  10. Entropy production in photovoltaic-thermoelectric nanodevices from the non-equilibrium Green’s function formalism

    NASA Astrophysics Data System (ADS)

    Michelini, Fabienne; Crépieux, Adeline; Beltako, Katawoura

    2017-05-01

    We discuss some thermodynamic aspects of energy conversion in electronic nanosystems able to convert light energy into electrical or/and thermal energy using the non-equilibrium Green’s function formalism. In a first part, we derive the photon energy and particle currents inside a nanosystem interacting with light and in contact with two electron reservoirs at different temperatures. Energy conservation is verified, and radiation laws are discussed from electron non-equilibrium Green’s functions. We further use the photon currents to formulate the rate of entropy production for steady-state nanosystems, and we recast this rate in terms of efficiency for specific photovoltaic-thermoelectric nanodevices. In a second part, a quantum dot based nanojunction is closely examined using a two-level model. We show analytically that the rate of entropy production is always positive, but we find numerically that it can reach negative values when the derived particle and energy currents are empirically modified as is usually done for modeling realistic photovoltaic systems.

  11. Entropy production in photovoltaic-thermoelectric nanodevices from the non-equilibrium Green's function formalism.

    PubMed

    Michelini, Fabienne; Crépieux, Adeline; Beltako, Katawoura

    2017-05-04

    We discuss some thermodynamic aspects of energy conversion in electronic nanosystems able to convert light energy into electrical or/and thermal energy using the non-equilibrium Green's function formalism. In a first part, we derive the photon energy and particle currents inside a nanosystem interacting with light and in contact with two electron reservoirs at different temperatures. Energy conservation is verified, and radiation laws are discussed from electron non-equilibrium Green's functions. We further use the photon currents to formulate the rate of entropy production for steady-state nanosystems, and we recast this rate in terms of efficiency for specific photovoltaic-thermoelectric nanodevices. In a second part, a quantum dot based nanojunction is closely examined using a two-level model. We show analytically that the rate of entropy production is always positive, but we find numerically that it can reach negative values when the derived particle and energy currents are empirically modified as is usually done for modeling realistic photovoltaic systems.

  12. Thermodynamics of Terrestrial Evolution

    PubMed Central

    Kirkaldy, J. S.

    1965-01-01

    The causal element of biological evolution and development can be understood in terms of a potential function which is generalized from the variational principles of irreversible thermodynamics. This potential function is approximated by the rate of entropy production in a configuration space which admits of macroscopic excursions by fluctuation and regression as well as microscopic ones. Analogously to Onsager's dissipation function, the potential takes the form of a saddle surface in this configuration space. The path of evolution following from an initial high dissipation state within the fixed constraint provided by the invariant energy flux from the sun tends toward the stable saddle point by a series of spontaneous regressions which lower the entropy production rate and by an alternating series of spontaneous fluctuations which introduce new internal constraints and lead to a higher entropy production rate. The potential thus rationalizes the system's observed tendency toward “chemical imperialism” (high dissipation) while simultaneously accommodating the development of “dynamic efficiency” and complication (low dissipation). PMID:5884019

  13. Memory behaviors of entropy production rates in heat conduction

    NASA Astrophysics Data System (ADS)

    Li, Shu-Nan; Cao, Bing-Yang

    2018-02-01

    Based on the relaxation time approximation and first-order expansion, memory behaviors in heat conduction are found between the macroscopic and Boltzmann-Gibbs-Shannon (BGS) entropy production rates with exponentially decaying memory kernels. In the frameworks of classical irreversible thermodynamics (CIT) and BGS statistical mechanics, the memory dependency on the integrated history is unidirectional, while for the extended irreversible thermodynamics (EIT) and BGS entropy production rates, the memory dependences are bidirectional and coexist with the linear terms. When macroscopic and microscopic relaxation times satisfy a specific relationship, the entropic memory dependences will be eliminated. There also exist initial effects in entropic memory behaviors, which decay exponentially. The second-order term, which can be understood as the global degree of non-equilibrium, is also discussed. Its effects consist of three parts: a memory dependence, an initial value and a linear term. The corresponding memory kernels are still exponential, and the initial effects of the global degree of non-equilibrium also decay exponentially.

  14. Entropy Generation Minimization in Dimethyl Ether Synthesis: A Case Study

    NASA Astrophysics Data System (ADS)

    Kingston, Diego; Razzitte, Adrián César

    2018-04-01

    Entropy generation minimization is a method that helps improve the efficiency of real processes and devices. In this article, we study the entropy production (due to chemical reactions, heat exchange and friction) in a conventional reactor that synthesizes dimethyl ether and minimize it by modifying different operating variables of the reactor, such as composition, temperature and pressure, while aiming at a fixed production of dimethyl ether. Our results indicate that it is possible to reduce the entropy production rate by nearly 70 % and that, by changing only the inlet composition, it is possible to cut it by nearly 40 %, though this comes at the expense of greater dissipation due to heat transfer. We also study the alternative of coupling the reactor with another, where dehydrogenation of methylcyclohexane takes place. In that case, entropy generation can be reduced by 54 %, when pressure, temperature and inlet molar flows are varied. These examples show that entropy generation analysis can be a valuable tool in engineering design and applications aiming at process intensification and efficient operation of plant equipment.

  15. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
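
    The filtering viewpoint can be made explicit with a short sketch: classical MSE coarse-graining is a moving-average (piecewise-constant) filter followed by downsampling, and any other kernel can be substituted, which is the idea behind FME. The entropy measure applied to each filtered series (e.g. sample entropy) is not shown here.

```python
# Sketch of the filtering view of multiscale entropy coarse-graining: the
# classical MSE step is a piecewise-constant (moving-average) filter followed
# by downsampling; any other kernel could be substituted, which is the idea
# behind FME.  The per-scale entropy computation is not repeated here.
import numpy as np

def coarse_grain(x, scale):
    """Classical MSE coarse-graining: average non-overlapping windows."""
    n = (len(x) // scale) * scale
    return np.asarray(x[:n], dtype=float).reshape(-1, scale).mean(axis=1)

def filtered_scale(x, kernel, scale):
    """Generalized version: filter with an arbitrary kernel, then downsample."""
    y = np.convolve(x, kernel / kernel.sum(), mode="valid")
    return y[::scale]

x = np.random.default_rng(0).standard_normal(1000)
print(np.allclose(coarse_grain(x, 4),
                  filtered_scale(x, np.ones(4), 4)))   # True: the same operation
```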

  16. Entropy Measurement for Biometric Verification Systems.

    PubMed

    Lim, Meng-Hui; Yuen, Pong C

    2016-05-01

    Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intrauser variations in the biometric data. This is important to preserve reasonable acceptance rate of genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases the imposter's difficulty of obtaining a system-acceptable measurement, thus resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accepts multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic and a benchmark face dataset justify the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.

  17. Higher order explicit symmetric integrators for inseparable forms of coordinates and momenta

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Wu, Xin; Huang, Guoqing; Liu, Fuyao

    2016-06-01

    Pihajoki proposed the extended phase-space second-order explicit symmetric leapfrog methods for inseparable Hamiltonian systems. On the basis of this work, we survey a critical problem on how to mix the variables in the extended phase space. Numerical tests show that sequent permutations of coordinates and momenta can make the leapfrog-like methods yield the most accurate results and the optimal long-term stabilized error behaviour. We also present a novel method to construct many fourth-order extended phase-space explicit symmetric integration schemes. Each scheme represents the symmetric product of six usual second-order leapfrogs without any permutations. This construction consists of four segments: the permuted coordinates, triple product of the usual second-order leapfrog without permutations, the permuted momenta and the triple product of the usual second-order leapfrog without permutations. Similarly, extended phase-space sixth, eighth and other higher order explicit symmetric algorithms are available. We used several inseparable Hamiltonian examples, such as the post-Newtonian approach of non-spinning compact binaries, to show that one of the proposed fourth-order methods is more efficient than the existing methods; examples include the fourth-order explicit symplectic integrators of Chin and the fourth-order explicit and implicit mixed symplectic integrators of Zhong et al. Given a moderate choice for the related mixing and projection maps, the extended phase-space explicit symplectic-like methods are well suited for various inseparable Hamiltonian problems. Samples of these problems involve the algorithmic regularization of gravitational systems with velocity-dependent perturbations in the Solar system and post-Newtonian Hamiltonian formulations of spinning compact objects.

  18. Quantitative comparison of entropy analysis of fetal heart rate variability related to the different stages of labor.

    PubMed

    Lim, Jongil; Kwon, Ji Young; Song, Juhee; Choi, Hosoon; Shin, Jong Chul; Park, In Yang

    2014-02-01

    The interpretation of the fetal heart rate (FHR) signal considering labor progression may improve perinatal morbidity and mortality. However, there have been few studies that evaluate the fetus in each labor stage quantitatively. To evaluate whether the entropy indices of FHR are different according to labor progression. A retrospective comparative study of FHR recordings in three groups: 280 recordings in the second stage of labor before vaginal delivery, 31 recordings in the first stage of labor before emergency cesarean delivery, and 23 recordings in the pre-labor before elective cesarean delivery. The stored FHR recordings of external cardiotocography during labor. Approximate entropy (ApEn) and sample entropy (SampEn) for the final 2000 RR intervals. The median ApEn and SampEn for the 2000 RR intervals showed the lowest values in the second stage of labor, followed by the emergency cesarean group and the elective cesarean group for all time segments (all P<0.001). Also, in the second stage of labor, the final 5 min of 2000 RR intervals had a significantly lower median ApEn (0.49 vs. 0.44, P=0.001) and lower median SampEn (0.34 vs. 0.29, P<0.001) than the initial 5 min of 2000 RR intervals. Entropy indices of FHR were significantly different according to labor progression. This result supports the necessity of considering labor progression when developing intrapartum fetal monitoring using the entropy indices of FHR. Copyright © 2013 Elsevier Ltd. All rights reserved.
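
    A minimal sample-entropy implementation on an RR-interval-like series is sketched below. The choices m = 2 and r = 0.2·SD are common defaults rather than the study's settings, and the counting convention is a standard simplification.

```python
# Minimal sample entropy (SampEn) sketch for RR-interval-like series.
# m = 2 and r = 0.2*std are common defaults, not necessarily the study's
# settings; self-matches are excluded, as SampEn requires.
import numpy as np

def sampen(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_pairs(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return float((d <= r).sum() - n)       # exclude the n self-matches

    b = count_pairs(m)        # template matches of length m
    a = count_pairs(m + 1)    # template matches of length m + 1
    return -np.log(a / b)

rng = np.random.default_rng(2)
rr = 800 + np.cumsum(rng.normal(0, 5, size=1000))   # toy RR series (ms)
print("SampEn of toy RR series :", round(sampen(rr), 3))
print("SampEn of shuffled copy :", round(sampen(rng.permutation(rr)), 3))
```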

  19. Glass transition memorized by the enthalpy-entropy compensation in the shear thinning of supercooled metallic liquids.

    PubMed

    Zhang, Meng; Liu, Lin

    2018-05-03

    To unravel the true nature of glass transition, broader insights into glass forming have been gained by examining the stress-driven glassy systems, where strong shear thinning, i.e., a reduced viscosity under increasing shear rate, is encountered. It is argued that arbitrarily small stress-driven shear rates would "melt" the glass and erase any memory of its thermal history. In this work, we report a glass transition memorized by the enthalpy-entropy compensation in strongly shear-thinned supercooled metallic liquids, which coincides with the thermal glass transition in both the transition temperature and the activation Gibbs free energy. Our findings provide distinctive insights into both glass forming and shear thinning, and enrich current knowledge on the ubiquitous enthalpy-entropy compensation empirical law in condensed matter physics. © 2018 IOP Publishing Ltd.

  20. Glass transition memorized by the enthalpy-entropy compensation in the shear thinning of supercooled metallic liquids

    NASA Astrophysics Data System (ADS)

    Zhang, Meng; Liu, Lin

    2018-06-01

    To unravel the true nature of glass transition, broader insights into glass forming have been gained by examining the stress-driven glassy systems, where strong shear thinning, i.e. a reduced viscosity under increasing shear rate, is encountered. It is argued that arbitrarily small stress-driven shear rates would ‘melt’ the glass and erase any memory of its thermal history. In this work, we report a glass transition memorized by the enthalpy-entropy compensation in strongly shear-thinned supercooled metallic liquids, which coincides with the thermal glass transition in both the transition temperature and the activation Gibbs free energy. Our findings provide distinctive insights into both glass forming and shear thinning, and enrich current knowledge on the ubiquitous enthalpy-entropy compensation empirical law in condensed matter physics.

  1. Photospheric Magnetic Field Properties of Flaring versus Flare-quiet Active Regions. II. Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, G.

    2003-10-01

    We apply statistical tests based on discriminant analysis to the wide range of photospheric magnetic parameters described in a companion paper by Leka & Barnes, with the goal of identifying those properties that are important for the production of energetic events such as solar flares. The photospheric vector magnetic field data from the University of Hawai'i Imaging Vector Magnetograph are well sampled both temporally and spatially, and we include here data covering 24 flare-event and flare-quiet epochs taken from seven active regions. The mean value and rate of change of each magnetic parameter are treated as separate variables, thus evaluating both the parameter's state and its evolution, to determine which properties are associated with flaring. Considering single variables first, Hotelling's T2-tests show small statistical differences between flare-producing and flare-quiet epochs. Even pairs of variables considered simultaneously, which do show a statistical difference for a number of properties, have high error rates, implying a large degree of overlap of the samples. To better distinguish between flare-producing and flare-quiet populations, larger numbers of variables are simultaneously considered; lower error rates result, but no unique combination of variables is clearly the best discriminator. The sample size is too small to directly compare the predictive power of large numbers of variables simultaneously. Instead, we rank all possible four-variable permutations based on Hotelling's T2-test and look for the most frequently appearing variables in the best permutations, with the interpretation that they are most likely to be associated with flaring. These variables include an increasing kurtosis of the twist parameter and a larger standard deviation of the twist parameter, but a smaller standard deviation of the distribution of the horizontal shear angle and a horizontal field that has a smaller standard deviation but a larger kurtosis. To support the "sorting all permutations" method of selecting the most frequently occurring variables, we show that the results of a single 10-variable discriminant analysis are consistent with the ranking. We demonstrate that individually, the variables considered here have little ability to differentiate between flaring and flare-quiet populations, but with multivariable combinations, the populations may be distinguished.

  2. Linear algebra of the permutation invariant Crow-Kimura model of prebiotic evolution.

    PubMed

    Bratus, Alexander S; Novozhilov, Artem S; Semenov, Yuri S

    2014-10-01

    A particular case of the famous quasispecies model - the Crow-Kimura model with a permutation invariant fitness landscape - is investigated. Using the fact that the mutation matrix in the case of a permutation invariant fitness landscape has a special tridiagonal form, a change of the basis is suggested such that in the new coordinates a number of analytical results can be obtained. In particular, using the eigenvectors of the mutation matrix as the new basis, we show that the quasispecies distribution approaches a binomial one and give simple estimates for the speed of convergence. Another consequence of the suggested approach is a parametric solution to the system of equations determining the quasispecies. Using this parametric solution we show that our approach leads to exact asymptotic results in some cases, which are not covered by the existing methods. In particular, we are able to present not only the limit behavior of the leading eigenvalue (mean population fitness), but also the exact formulas for the limit quasispecies eigenvector for special cases. For instance, this eigenvector has a geometric distribution in the case of the classical single peaked fitness landscape. On the biological side, we propose a mathematical definition, based on the closeness of the quasispecies to the binomial distribution, which can be used as an operational definition of the notorious error threshold. Using this definition, we suggest two approximate formulas to estimate the critical mutation rate after which the quasispecies delocalization occurs. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Sylow p-groups of polynomial permutations on the integers mod p^n

    PubMed Central

    Frisch, Sophie; Krenn, Daniel

    2013-01-01

    We enumerate and describe the Sylow p-groups of the groups of polynomial permutations of the integers mod p^n for n ≥ 1 and of the pro-finite group which is the projective limit of these groups. PMID:26869732

  4. Storage and computationally efficient permutations of factorized covariance and square-root information arrays

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector stored Upper triangular Diagonal factorized covariance and vector stored upper triangular Square Root Information arrays is presented. The method involves cyclic permutation of the rows and columns of the arrays and retriangularization with fast (slow) Givens rotations (reflections). Minimal computation is performed, and a one dimensional scratch array is required. To make the method efficient for large arrays on a virtual memory machine, computations are arranged so as to avoid expensive paging faults. This method is potentially important for processing large volumes of radio metric data in the Deep Space Network.

  5. Note on new KLT relations

    NASA Astrophysics Data System (ADS)

    Feng, Bo; He, Song; Huang, Rijun; Jia, Yin

    2010-10-01

    In this short note, we present two results about the KLT relations discussed in several recent papers. Our first result is the re-derivation of the Mason-Skinner MHV amplitude by applying the S_{n-3} permutation symmetric KLT relations directly to the MHV amplitude. Our second result is the equivalence proof of the newly discovered S_{n-2} permutation symmetric KLT relations and the well-known S_{n-3} permutation symmetric KLT relations. Although both formulas have been shown to be correct by BCFW recursion relations, our result is the first direct check using the regularized definition of the new formula.

  6. Combating HER2-overexpressing breast cancer through induction of calreticulin exposure by Tras-Permut CrossMab

    PubMed Central

    Zhang, Fan; Zhang, Jie; Liu, Moyan; Zhao, Lichao; LingHu, RuiXia; Feng, Fan; Gao, Xudong; Jiao, Shunchang; Zhao, Lei; Hu, Yi; Yang, Junlan

    2015-01-01

    Although trastuzumab has succeeded in breast cancer treatment, acquired resistance is one of the prime obstacles for breast cancer therapies. There is an urgent need to develop novel HER2 antibodies against trastuzumab resistance. Here, we first rationally designed avidity-improved trastuzumab and pertuzumab variants and explored the correlation between binding avidity improvement and their antitumor activities. After characterization of a pertuzumab variant L56TY with potent antitumor activities, a bispecific immunoglobulin G-like CrossMab (Tras-Permut CrossMab) was generated from trastuzumab and the binding avidity-improved pertuzumab variant L56TY. Although the antitumor efficacy of trastuzumab was not enhanced by improving its binding avidity, binding avidity improvement could significantly increase the anti-proliferative and antibody-dependent cellular cytotoxicity (ADCC) activities of pertuzumab. Further studies showed that Tras-Permut CrossMab exhibited exceptionally high efficiency in inhibiting the progression of trastuzumab-resistant breast cancer. Notably, we found that calreticulin (CRT) exposure induced by Tras-Permut CrossMab was essential for induction of tumor-specific T cell immunity against tumor recurrence. These data indicated that simultaneous blockade of HER2 protein by Tras-Permut CrossMab could trigger CRT exposure and subsequently induce potent tumor-specific T cell immunity, suggesting it could be a promising therapeutic strategy against trastuzumab resistance. PMID:25949918

  7. Periodic matrix population models: growth rate, basic reproduction number, and entropy.

    PubMed

    Bacaër, Nicolas

    2009-10-01

    This article considers three different aspects of periodic matrix population models. First, a formula for the sensitivity analysis of the growth rate lambda is obtained that is simpler than the one obtained by Caswell and Trevisan. Secondly, the formula for the basic reproduction number R0 in a constant environment is generalized to the case of a periodic environment. Some inequalities between lambda and R0 proved by Cushing and Zhou are also generalized to the periodic case. Finally, we add some remarks on Demetrius' notion of evolutionary entropy H and its relationship to the growth rate lambda in the periodic case.
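
    The basic object in such models can be illustrated briefly: for seasonal projection matrices applied in sequence, the per-cycle growth rate lambda is the dominant eigenvalue of their product. The two-stage matrices below are invented for illustration, and the paper's generalized R0 and entropy H are not computed.

```python
# Sketch: growth rate of a periodic matrix population model.  For seasonal
# projection matrices A_1, ..., A_T applied in sequence, the per-cycle growth
# rate lambda is the dominant eigenvalue of the product A_T ... A_2 A_1.
# The two-stage matrices below are illustrative only.
import numpy as np

A_summer = np.array([[0.2, 1.8],    # fertilities and survival in summer
                     [0.5, 0.6]])
A_winter = np.array([[0.1, 0.4],    # harsher season: lower fertility/survival
                     [0.3, 0.7]])

annual = A_winter @ A_summer        # one full period (summer then winter)
lam = max(abs(np.linalg.eigvals(annual)))
print("per-cycle growth rate lambda:", round(float(lam), 4))
print("population", "grows" if lam > 1 else "declines", "over a full cycle")
```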

  8. An entropy-based statistic for genomewide association studies.

    PubMed

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-07-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard chi2 statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the differences in allele and haplotype frequencies to maintain statistical power with large numbers of marker loci. We investigate the relationship between the entropy-based test statistic and the standard chi2 statistic and show that, in most cases, the power of the entropy-based statistic is greater than that of the standard chi2 statistic. The distribution of the entropy-based statistic and the type I error rates are validated using simulation studies. Finally, we apply the new entropy-based test statistic to two real data sets, one for the COMT gene and schizophrenia and one for the MMP-2 gene and esophageal carcinoma, to evaluate the performance of the new method for genetic association studies. The results show that the entropy-based statistic obtained smaller P values than did the standard chi2 statistic.
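
    To illustrate the contrast between a chi-square statistic and an entropy-based one on a case-control allele-count table, the sketch below uses the familiar mutual-information (G-test) form 2N·I. This is an illustration of the general idea only, not the authors' specific nonlinear statistic, and the counts are invented.

```python
# Sketch: chi-square versus an entropy-based statistic on a 2x2 case-control
# allele-count table.  The entropy form below is the mutual-information
# (G-test) statistic 2*N*I(disease; allele); the paper's statistic uses a
# different nonlinear function of the allele frequencies.
import numpy as np

counts = np.array([[620, 380],     # cases:    allele A, allele a
                   [540, 460]])    # controls: allele A, allele a

N = counts.sum()
expected = np.outer(counts.sum(axis=1), counts.sum(axis=0)) / N
chi2 = ((counts - expected) ** 2 / expected).sum()

p_joint = counts / N
p_row = p_joint.sum(axis=1, keepdims=True)
p_col = p_joint.sum(axis=0, keepdims=True)
mutual_info = np.sum(p_joint * np.log(p_joint / (p_row * p_col)))   # nats
g_stat = 2 * N * mutual_info        # asymptotically chi-square with 1 d.f.

print("chi2 =", round(float(chi2), 2), " G =", round(float(g_stat), 2))
```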

  9. Entropy production and optimization of geothermal power plants

    NASA Astrophysics Data System (ADS)

    Michaelides, Efstathios E.

    2012-09-01

    Geothermal power plants are currently producing reliable and low-cost, base load electricity. Three basic types of geothermal power plants are currently in operation: single-flashing, dual-flashing, and binary power plants. Typically, the single-flashing and dual-flashing geothermal power plants utilize geothermal water (brine) at temperatures in the range of 550-430 K. Binary units utilize geothermal resources at lower temperatures, typically 450-380 K. The entropy production in the various components of the three types of geothermal power plants determines the efficiency of the plants. It is axiomatic that a lower entropy production would improve significantly the energy utilization factor of the corresponding power plant. For this reason, the entropy production in the major components of the three types of geothermal power plants has been calculated. It was observed that binary power plants generate the lowest amount of entropy and, thus, convert the highest rate of geothermal energy into mechanical energy. The single-flashing units generate the highest amount of entropy, primarily because they re-inject fluid at relatively high temperature. The calculations for entropy production provide information on the equipment where the highest irreversibilities occur, and may be used to optimize the design of geothermal processes in future geothermal power plants and thermal cycles used for the harnessing of geothermal energy.

  10. Solid-solution CrCoCuFeNi high-entropy alloy thin films synthesized by sputter deposition

    DOE PAGES

    An, Zhinan; Jia, Haoling; Wu, Yueying; ...

    2015-05-04

    The concept of high configurational entropy requires that high-entropy alloys (HEAs) yield single-phase solid solutions. However, phase separations are quite common in bulk HEAs. A five-element alloy, CrCoCuFeNi, was deposited via radio frequency magnetron sputtering and confirmed to be a single-phase solid solution through high-energy synchrotron X-ray diffraction, energy-dispersive spectroscopy, wavelength-dispersive spectroscopy, and transmission electron microscopy. The formation of the solid-solution phase is presumed to be due to the high cooling rate of the sputter-deposition process.

  11. Automatic NEPHIS Coding of Descriptive Titles for Permuted Index Generation.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    1982-01-01

    Describes a system for the automatic coding of most descriptive titles which generates Nested Phrase Indexing System (NEPHIS) input strings of sufficient quality for permuted index production. A series of examples and an 11-item reference list accompany the text. (JL)

  12. Evaluation of spectral entropy to measure anaesthetic depth and antinociception in sevoflurane-anaesthetised Beagle dogs.

    PubMed

    Morgaz, Juan; Granados, María del Mar; Domínguez, Juan Manuel; Navarrete, Rocío; Fernández, Andrés; Galán, Alba; Muñoz, Pilar; Gómez-Villamandos, Rafael J

    2011-06-01

    The use of spectral entropy to determine anaesthetic depth and antinociception was evaluated in sevoflurane-anaesthetised Beagle dogs. Dogs were anaesthetised at each of five multiples of their individual minimum alveolar concentrations (MAC; 0.75, 1, 1.25, 1.5 and 1.75 MAC), and response entropy (RE), state entropy (SE), RE-SE difference, burst suppression rate (BSR) and cardiorespiratory parameters were recorded before and after a painful stimulus. RE, SE and the RE-SE difference did not change significantly after the stimuli. The correlation between MAC and the entropy parameters was weak, but these values increased when the 1.75 MAC results were excluded from the analysis. BSR was different from zero at 1.5 and 1.75 MAC. It was concluded that RE and the RE-SE difference were not adequate indicators of antinociception, and that SE and RE were unable to detect deep planes of anaesthesia in dogs, although they both distinguished the awake and unconscious states. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. [Identification of special quality eggs with NIR spectroscopy technology based on symbol entropy feature extraction method].

    PubMed

    Zhao, Yong; Hong, Wen-Xue

    2011-11-01

    Fast, nondestructive and accurate identification of special quality eggs is an urgent problem. The present paper proposes a new feature extraction method based on symbol entropy to identify near infrared spectra of special quality eggs. The authors selected normal eggs, free range eggs, selenium-enriched eggs and zinc-enriched eggs as research objects and measured the near-infrared diffuse reflectance spectra in the range of 12 000-4 000 cm(-1). Raw spectra were symbolically represented with an aggregation approximation algorithm and symbolic entropy was extracted as the feature vector. An error-correcting output codes multiclass support vector machine classifier was designed to identify the spectra. The symbolic entropy feature is robust to parameter changes, and the highest recognition rate reaches 100%. The results show that the identification of special quality eggs using near-infrared spectroscopy is feasible and that symbol entropy can be used as a new feature extraction method for near-infrared spectra.

  14. Entropy Information of Cardiorespiratory Dynamics in Neonates during Sleep.

    PubMed

    Lucchini, Maristella; Pini, Nicolò; Fifer, William P; Burtchen, Nina; Signorini, Maria G

    2017-05-01

    Sleep is a central activity in human adults and characterizes most of the newborn infant life. During sleep, autonomic control acts to modulate heart rate variability (HRV) and respiration. Mechanisms underlying cardiorespiratory interactions in different sleep states have been studied but are not yet fully understood. Signal processing approaches have focused on cardiorespiratory analysis to elucidate this co-regulation. This manuscript proposes to analyze heart rate (HR), respiratory variability and their interrelationship in newborn infants to characterize cardiorespiratory interactions in different sleep states (active vs. quiet). We are searching for indices that could detect regulation alteration or malfunction, potentially leading to infant distress. We have analyzed inter-beat (RR) interval series and respiration in a population of 151 newborns, and followed up with 33 at 1 month of age. RR interval series were obtained by recognizing peaks of the QRS complex in the electrocardiogram (ECG), corresponding to ventricular depolarization. Univariate time domain, frequency domain and entropy measures were applied. In addition, Transfer Entropy was considered as a bivariate approach able to quantify the bidirectional information flow from one signal (respiration) to another (RR series). Results confirm the validity of the proposed approach. Overall, HRV is higher in active sleep, while high frequency (HF) power characterizes quiet sleep more. Entropy analysis provides higher indices for SampEn and Quadratic Sample Entropy (QSE) in quiet sleep. Transfer Entropy values were higher in quiet sleep and point to a major influence of respiration on the RR series. At 1 month of age, time domain parameters show an increase in HR and a decrease in variability. No entropy differences were found across ages. The parameters employed in this study help to quantify the potential for infants to adapt their cardiorespiratory responses as they mature. Thus, they could be useful as early markers of risk for infant cardiorespiratory vulnerabilities.
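
    Transfer entropy itself can be estimated in many ways; the abstract does not specify the estimator used. The sketch below is a crude histogram (plug-in) estimator with a one-sample history, applied to synthetic 'respiration' and 'RR' series that are only stand-ins; real cardiorespiratory analyses typically use more careful embeddings and bias-corrected estimators.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, bins=8, lag=1):
    """Crude plug-in estimate (in nats) of TE source -> target.

    TE = sum p(x_t+1, x_t, y_t) * log[ p(x_t+1 | x_t, y_t) / p(x_t+1 | x_t) ],
    with both series discretized into equal-width bins.
    """
    s = np.digitize(source, np.histogram_bin_edges(source, bins=bins)[1:-1])
    x = np.digitize(target, np.histogram_bin_edges(target, bins=bins)[1:-1])
    xn, xp, yp = x[lag:], x[:-lag], s[:-lag]
    n = len(xn)
    p_abc = Counter(zip(xn, xp, yp))   # joint (x_t+1, x_t, y_t)
    p_bc = Counter(zip(xp, yp))        # joint (x_t,   y_t)
    p_ab = Counter(zip(xn, xp))        # joint (x_t+1, x_t)
    p_b = Counter(xp)                  # marginal x_t
    te = 0.0
    for (a, b, c), k in p_abc.items():
        # counts combine so that all 1/n factors cancel
        te += (k / n) * np.log((k * p_b[b]) / (p_bc[(b, c)] * p_ab[(a, b)]))
    return te

# Synthetic example: 'resp' drives 'rr' with a one-sample delay plus noise.
rng = np.random.default_rng(0)
resp = rng.standard_normal(5000)
rr = 0.8 * np.roll(resp, 1) + 0.2 * rng.standard_normal(5000)
print("TE resp->rr:", transfer_entropy(resp, rr))
print("TE rr->resp:", transfer_entropy(rr, resp))
```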

  15. Statistical mechanical theory for steady state systems. VI. Variational principles

    NASA Astrophysics Data System (ADS)

    Attard, Phil

    2006-12-01

    Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.

  16. Global Existence Analysis of Cross-Diffusion Population Systems for Multiple Species

    NASA Astrophysics Data System (ADS)

    Chen, Xiuqing; Daus, Esther S.; Jüngel, Ansgar

    2018-02-01

    The existence of global-in-time weak solutions to reaction-cross-diffusion systems for an arbitrary number of competing population species is proved. The equations can be derived from an on-lattice random-walk model with general transition rates. In the case of linear transition rates, it extends the two-species population model of Shigesada, Kawasaki, and Teramoto. The equations are considered in a bounded domain with homogeneous Neumann boundary conditions. The existence proof is based on a refined entropy method and a new approximation scheme. Global existence follows under a detailed balance or weak cross-diffusion condition. The detailed balance condition is related to the symmetry of the mobility matrix, which mirrors Onsager's principle in thermodynamics. Under detailed balance (and without reaction) the entropy is nonincreasing in time, but counter-examples show that the entropy may increase initially if detailed balance does not hold.

  17. Entropy Inequalities for Stable Densities and Strengthened Central Limit Theorems

    NASA Astrophysics Data System (ADS)

    Toscani, Giuseppe

    2016-10-01

    We consider the central limit theorem for stable laws in the case of the standardized sum of independent and identically distributed random variables with regular probability density function. By showing decay of different entropy functionals along the sequence, we prove convergence with explicit rate in various norms to a Lévy centered density of parameter λ > 1. This introduces a new information-theoretic approach to the central limit theorem for stable laws, in which the main argument is shown to be the relative fractional Fisher information, recently introduced in Toscani (Ricerche Mat 65(1):71-91, 2016). In particular, it is proven that, with respect to the relative fractional Fisher information, the Lévy density satisfies an analogue of the logarithmic Sobolev inequality, which allows one to pass from the monotonicity and decay to zero of the relative fractional Fisher information in the standardized sum to the decay to zero in relative entropy with an explicit decay rate.

  18. Creation of a Ligand-Dependent Enzyme by Fusing Circularly Permuted Antibody Variable Region Domains.

    PubMed

    Iwai, Hiroto; Kojima-Misaizu, Miki; Dong, Jinhua; Ueda, Hiroshi

    2016-04-20

    Allosteric control of enzyme activity with exogenous substances has been hard to achieve, especially using antibody domains that potentially allow control by any antigen of choice. Here, in order to attain this goal, we developed a novel antibody variable region format introduced with circular permutations, called Clampbody. The two variable-region domains of the anti-bone Gla protein (BGP) antibody were each circularly permuted to have novel termini at the loops near their domain interface. Through their attachment to the N- and C-termini of a circularly permuted TEM-1 β-lactamase (cpBLA), we created a molecular switch that responds to the antigen peptide. The fusion protein specifically recognized the antigen, and in the presence of some detergent or denaturant, its catalytic activity was enhanced up to 4.7-fold in an antigen-dependent manner, due to increased resistance to these reagents. Hence, Clampbody will be a powerful tool for the allosteric regulation of enzyme and other protein activities, and especially useful for designing robust biosensors.

  19. Multiple comparisons permutation test for image based data mining in radiotherapy

    PubMed Central

    2013-01-01

    Comparing incidental dose distributions (i.e. images) of patients with different outcomes is a straightforward way to explore dose-response hypotheses in radiotherapy. In this paper, we introduced a permutation test that compares images, such as dose distributions from radiotherapy, while tackling the multiple comparisons problem. A test statistic Tmax was proposed that summarizes the differences between the images into a single value and a permutation procedure was employed to compute the adjusted p-value. We demonstrated the method in two retrospective studies: a prostate study that relates 3D dose distributions to failure, and an esophagus study that relates 2D surface dose distributions of the esophagus to acute esophagus toxicity. As a result, we were able to identify suspicious regions that are significantly associated with failure (prostate study) or toxicity (esophagus study). Permutation testing allows direct comparison of images from different patient categories and is a useful tool for data mining in radiotherapy. PMID:24365155
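
    The paper's exact Tmax definition is not reproduced here, but the general max-statistic permutation recipe it relies on can be sketched as follows: compute a voxel-wise statistic, permute outcome labels, record the maximum statistic over all voxels per permutation, and use that null distribution to produce multiplicity-adjusted p-values (a Westfall-Young-style construction). The array shapes and the t-like statistic below are assumptions for illustration.

```python
import numpy as np

def maxT_permutation_test(images_a, images_b, n_perm=2000, seed=0):
    """Voxel-wise two-sample comparison with a max-statistic permutation test.

    images_a, images_b: arrays of shape (n_patients, n_voxels).
    Returns the observed voxel-wise t-like statistics and adjusted p-values.
    """
    rng = np.random.default_rng(seed)
    data = np.vstack([images_a, images_b])
    labels = np.array([0] * len(images_a) + [1] * len(images_b))

    def voxel_stats(lab):
        d = data[lab == 0].mean(0) - data[lab == 1].mean(0)
        s = np.sqrt(data[lab == 0].var(0, ddof=1) / (lab == 0).sum()
                    + data[lab == 1].var(0, ddof=1) / (lab == 1).sum())
        return np.abs(d / (s + 1e-12))

    t_obs = voxel_stats(labels)
    t_max_null = np.empty(n_perm)
    for i in range(n_perm):
        t_max_null[i] = voxel_stats(rng.permutation(labels)).max()

    # Adjusted p-value per voxel: fraction of permutations whose maximum
    # statistic exceeds the observed voxel statistic (controls the FWER).
    p_adj = (t_max_null[None, :] >= t_obs[:, None]).mean(axis=1)
    return t_obs, p_adj

# Toy usage with random 'dose images' of 500 voxels.
rng = np.random.default_rng(1)
a = rng.normal(size=(20, 500))
b = rng.normal(size=(25, 500))
b[:, :10] += 1.0  # a small region with a true difference
t, p = maxT_permutation_test(a, b)
print("voxels with adjusted p < 0.05:", np.where(p < 0.05)[0])
```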

  20. Quantum one-way permutation over the finite field of two elements

    NASA Astrophysics Data System (ADS)

    de Castro, Alexandre

    2017-06-01

    In quantum cryptography, a one-way permutation is a bounded unitary operator U: H → H on a Hilbert space H that is easy to compute on every input, but hard to invert given the image of a random input. Levin (Probl Inf Transm 39(1):92-103, 2003) has conjectured that the unitary transformation g(a, x) = (a, f(x) + ax), where f is any length-preserving function and a, x ∈ GF(2^||x||), is an information-theoretically secure operator within a polynomial factor. Here, we show that Levin's one-way permutation is provably secure because its output values are four maximally entangled two-qubit states, whose probability of being factored approaches zero faster than the multiplicative inverse of any positive polynomial poly(x) over the Boolean ring of all subsets of x. Our results demonstrate through well-known theorems that the existence of classical one-way functions implies the existence of a universal quantum one-way permutation that cannot be inverted in subexponential time in the worst case.

  1. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    PubMed

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejecting the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
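
    A hedged sketch of the minimum p-value construction described above: several pre-specified two-sample statistics are computed, the smallest p-value is taken as the summary, and its reference distribution is built by permuting group labels so that the overall type I error stays at the nominal level. The particular statistics chosen here (Welch t and Mann-Whitney) are illustrative, not the ones used in the paper.

```python
import numpy as np
from scipy import stats

def minp_permutation_test(x, y, n_perm=2000, seed=0):
    """Combine several two-sample statistics via the minimum p-value.

    The null distribution of the minimum p-value is built by permuting
    group labels, so the overall type I error stays at its nominal level.
    """
    rng = np.random.default_rng(seed)
    stat_fns = [
        lambda a, b: stats.ttest_ind(a, b, equal_var=False).pvalue,
        lambda a, b: stats.mannwhitneyu(a, b, alternative="two-sided").pvalue,
    ]
    pooled = np.concatenate([x, y])
    n_x = len(x)

    def min_p(a, b):
        return min(fn(a, b) for fn in stat_fns)

    observed = min_p(x, y)
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)
        null[i] = min_p(perm[:n_x], perm[n_x:])
    return observed, (null <= observed).mean()

# Toy usage: a shift alternative.
rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 30)
y = rng.normal(0.6, 1.0, 30)
obs, p = minp_permutation_test(x, y)
print("min p =", obs, "permutation p-value =", p)
```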

  2. Permutation modulation for quantization and information reconciliation in CV-QKD systems

    NASA Astrophysics Data System (ADS)

    Daneshgaran, Fred; Mondin, Marina; Olia, Khashayar

    2017-08-01

    This paper is focused on the problem of Information Reconciliation (IR) for continuous variable Quantum Key Distribution (QKD). The main problem is quantization and assignment of labels to the samples of the Gaussian variables observed at Alice and Bob. The trouble is that most of the samples, given that the Gaussian variable is zero mean (which is de facto the case), tend to have small magnitudes and are easily disturbed by noise. Transmission over longer and longer distances increases the losses, corresponding to a lower effective Signal to Noise Ratio (SNR) and exacerbating the problem. Here we propose to use Permutation Modulation (PM) as a means of quantization of Gaussian vectors at Alice and Bob over a d-dimensional space with d ≫ 1. The goal is to achieve the necessary coding efficiency to extend the achievable range of continuous variable QKD by quantizing over larger and larger dimensions. A fractional bit rate per sample is easily achieved using PM at very reasonable computational cost. Ordered statistics are used extensively throughout the development, from generation of the seed vector in PM to analysis of the error rates associated with the signs of the Gaussian samples at Alice and Bob as a function of the magnitude of the observed samples at Bob.
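
    The core permutation-modulation idea can be sketched very compactly: a d-dimensional observation is quantized by rearranging a fixed seed vector into the same rank order as the observed samples, so the transmitted information is the permutation itself (at most log2(d!) bits, i.e., a fractional rate per sample). The Gaussian-quantile seed below is an assumption for illustration, not the construction used in the paper.

```python
import math
import numpy as np
from scipy.stats import norm

def pm_quantize(x, seed_vector):
    """Permutation-modulation style quantization of a d-dimensional vector.

    The codeword is the fixed seed vector rearranged so its components appear
    in the same rank order as those of x; the information carried is the
    permutation itself (at most log2(d!) bits per vector).
    """
    ranks = np.argsort(np.argsort(x))        # rank of each component of x
    return np.sort(seed_vector)[ranks], ranks

d = 16
# Illustrative seed: standard-normal quantiles, so the codeword roughly matches
# the source statistics (an assumption for this sketch, not the paper's design).
seed = norm.ppf((np.arange(1, d + 1) - 0.5) / d)

x = np.random.default_rng(3).standard_normal(d)
codeword, ranks = pm_quantize(x, seed)
print("codeword:", np.round(codeword, 2))
print("fractional rate ~", math.lgamma(d + 1) / math.log(2) / d, "bits/sample")
```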

  3. Mass, Energy, Entropy and Exergy Rate Balance in a Ranque-Hilsch Vortex Tube

    ERIC Educational Resources Information Center

    Carrascal Lecumberri, Edorta; Sala Lizarraga, José María

    2013-01-01

    The objective of this paper is to present a laboratory program designed for the Thermodynamics course offered in the Department of Thermal Engineering at the University of the Basque Country. With reference to one of the examples given in the textbook by Moran, Shapiro, Boettner and Bailey (2012), the balances of mass, energy, entropy and exergy…

  4. Prediction of G-protein-coupled receptor classes in low homology using Chou's pseudo amino acid composition with approximate entropy and hydrophobicity patterns.

    PubMed

    Gu, Q; Ding, Y S; Zhang, T L

    2010-05-01

    We use approximate entropy and hydrophobicity patterns to predict G-protein-coupled receptors. An Adaboost classifier is adopted as the prediction engine. A low homology dataset is used to validate the proposed method. Compared with previously reported results, the success rate is encouraging. The source code is written in Matlab.

  5. Towards the minimization of thermodynamic irreversibility in an electrically actuated microflow of a viscoelastic fluid under electrical double layer phenomenon

    NASA Astrophysics Data System (ADS)

    Sarma, Rajkumar; Jain, Manish; Mondal, Pranab Kumar

    2017-10-01

    In this study, we discuss entropy generation minimization for the electro-osmotic flow of a viscoelastic fluid through a parallel plate microchannel under the combined influences of interfacial slip and conjugate heat transport. We use the simplified Phan-Thien-Tanner model to describe the rheological behavior of the viscoelastic fluid. Using Navier's slip law and thermal boundary conditions of the third kind, we solve the transport equations analytically and evaluate the global entropy generation rate of the system. We examine the influential role of the following parameters on the entropy generation rate of the system, viz., the viscoelastic parameter (εDe²), the Debye-Hückel parameter (κ̄), the channel wall thickness (δ), the thermal conductivity of the wall (γ), the Biot number (Bi), the Peclet number (Pe), and the axial temperature gradient (B). This investigation finally establishes the optimum values of the abovementioned parameters, leading to the minimum entropy generation of the system. We believe that the results of this analysis could be helpful in optimizing the second-law performance of microscale thermal management devices, including micro-heat exchangers, micro-reactors, and micro-heat pipes.

  6. Stationary gaze entropy predicts lane departure events in sleep-deprived drivers.

    PubMed

    Shiferaw, Brook A; Downey, Luke A; Westlake, Justine; Stevens, Bronwyn; Rajaratnam, Shantha M W; Berlowitz, David J; Swann, Phillip; Howard, Mark E

    2018-02-02

    Performance decrement associated with sleep deprivation is a leading contributor to traffic accidents and fatalities. While current research has focused on eye blink parameters as physiological indicators of driver drowsiness, little is understood of how gaze behaviour alters as a result of sleep deprivation. In particular, the effect of sleep deprivation on gaze entropy has not been previously examined. In this randomised, repeated measures study, 9 (4 male, 5 female) healthy participants completed two driving sessions in a fully instrumented vehicle (1 after a night of sleep deprivation and 1 after normal sleep) on a closed track, during which eye movement activity and lane departure events were recorded. Following sleep deprivation, the rate of fixations reduced while blink rate and duration as well as saccade amplitude increased. In addition, stationary and transition entropy of gaze also increased following sleep deprivation as well as with amount of time driven. An increase in stationary gaze entropy in particular was associated with higher odds of a lane departure event occurrence. These results highlight how fatigue induced by sleep deprivation and time-on-task effects can impair drivers' visual awareness through disruption of gaze distribution and scanning patterns.

  7. Charged Dirac Particles' Hawking Radiation via Tunneling of Both Horizons and Thermodynamics Properties of Kerr-Newman-Kasuya-Taub-NUT-AdS Black Holes

    NASA Astrophysics Data System (ADS)

    Ali, M. Hossain; Sultana, Kausari

    2013-12-01

    We investigate Hawking radiation of electrically and magnetically charged Dirac particles from a dyonic Kerr-Newman-Kasuya-Taub-NUT-Anti-de Sitter (KNKTN-AdS) black hole by considering the thermal characters of both the outer and inner horizons. We apply the Damour-Ruffini method and the membrane method to calculate the temperature and the entropy of the inner horizon of the KNKTN-AdS black hole. The inner horizon admits a thermal character with positive temperature and entropy proportional to its area. The inner horizon entropy contributes to the total entropy of the black hole in the context of the Nernst theorem. Considering conservation of energy, charges and angular momentum, and the back-reaction of the emitted particles on the spacetime, we obtain the emission spectra for both the inner and outer horizons. The total emission rate is obtained as the product of the emission rates of the inner and outer horizons. It deviates from the purely thermal spectrum, with the leading term being exactly the Boltzmann factor, and can carry some information out. The result can thus be treated as an explanation of the information loss paradox.

  8. Entropy generation minimization (EGM) of nanofluid flow by a thin moving needle with nonlinear thermal radiation

    NASA Astrophysics Data System (ADS)

    Waleed Ahmed Khan, M.; Ijaz Khan, M.; Hayat, T.; Alsaedi, A.

    2018-04-01

    Entropy generation minimization (EGM) and heat transport in the nonlinear radiative flow of nanomaterials over a thin moving needle are discussed. Nonlinear thermal radiation and viscous dissipation terms are merged in the energy expression. Water is treated as the ordinary fluid, while the nanomaterials comprise titanium dioxide, copper and aluminum oxide. The nonlinear governing equations of the flow problem are transformed into ordinary differential equations and then solved numerically by a built-in shooting technique. In the first section of this investigation, the entropy expression is derived as a function of the temperature and velocity gradients. Geometrical and physical flow field variables are utilized to make it nondimensional. The entropy generation analysis is carried out through the second law of thermodynamics. Results for the temperature, velocity, concentration, surface drag force and heat transfer rate are explored. Our outcomes reveal that the surface drag force and Nusselt number (heat transfer) are enhanced linearly with higher nanoparticle volume fraction. Furthermore, the drag force decays for aluminum oxide and is enhanced for copper nanoparticles. In addition, the lowest heat transfer rate is obtained for a higher radiative parameter. The temperature field is enhanced with an increase in the temperature ratio parameter.
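
    The transformed equations of this particular problem are not given in the abstract, so as a stand-in illustration of the shooting technique mentioned above, the sketch below solves the classical Blasius boundary-layer equation by adjusting the unknown initial curvature until the far-field condition is met.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Illustrative shooting method on the classical Blasius equation
#   f''' + 0.5 f f'' = 0,  f(0) = f'(0) = 0,  f'(inf) = 1
# (a stand-in for the transformed flow equations, which are not given here).

def rhs(eta, y):
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def residual(fpp0, eta_max=10.0):
    sol = solve_ivp(rhs, (0.0, eta_max), [0.0, 0.0, fpp0],
                    rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0   # mismatch in the far-field condition f'(inf) = 1

# Adjust the unknown initial curvature f''(0) until the residual vanishes.
fpp0 = brentq(residual, 0.1, 1.0)
print("f''(0) =", fpp0)   # ~0.332, the well-known Blasius value
```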

  9. MHD effects on heat transfer and entropy generation of nanofluid flow in an open cavity

    NASA Astrophysics Data System (ADS)

    Mehrez, Zouhaier; El Cafsi, Afif; Belghith, Ali; Le Quéré, Patrick

    2015-01-01

    The present numerical work investigates the effect of an external oriented magnetic field on heat transfer and entropy generation of Cu-water nanofluid flow in an open cavity heated from below. The governing equations are solved numerically by the finite-volume method. The study has been carried out for a wide range of solid volume fraction 0≤φ≤0.06, Hartmann number 0≤Ha≤100, Reynolds number 100≤Re≤500 and Richardson number 0.001≤Ri≤1 at three inclination angles of magnetic field γ: 0°, 45° and 90°. The numerical results are given by streamlines, isotherms, average Nusselt number, average entropy generation and Bejan number. The results show that flow behavior, temperature distribution, heat transfer and entropy generation are strongly affected by the presence of a magnetic field. The average Nusselt number and entropy generation, which increase by increasing volume fraction of nanoparticles, depend mainly on the Hartmann number and inclination angle of the magnetic field. The variation rates of heat transfer and entropy generation while adding nanoparticles or applying a magnetic field depend on the Richardson and Reynolds numbers.

  10. Discrete Bat Algorithm for Optimal Problem of Permutation Flow Shop Scheduling

    PubMed Central

    Luo, Qifang; Zhou, Yongquan; Xie, Jian; Ma, Mingzhi; Li, Liangliang

    2014-01-01

    A discrete bat algorithm (DBA) is proposed for the optimal permutation flow shop scheduling problem (PFSP). Firstly, the discrete bat algorithm is constructed based on the idea of the basic bat algorithm, which divides the whole scheduling problem into many sub-scheduling problems; the NEH heuristic is then introduced to solve each sub-scheduling problem. Secondly, some subsequences are operated on with a certain probability in the pulse emission and loudness phases. An intensive virtual population neighborhood search is integrated into the discrete bat algorithm to further improve the performance. Finally, the experimental results show the suitability and efficiency of the present discrete bat algorithm for the optimal permutation flow shop scheduling problem. PMID:25243220

  11. Discrete bat algorithm for optimal problem of permutation flow shop scheduling.

    PubMed

    Luo, Qifang; Zhou, Yongquan; Xie, Jian; Ma, Mingzhi; Li, Liangliang

    2014-01-01

    A discrete bat algorithm (DBA) is proposed for the optimal permutation flow shop scheduling problem (PFSP). Firstly, the discrete bat algorithm is constructed based on the idea of the basic bat algorithm, which divides the whole scheduling problem into many sub-scheduling problems; the NEH heuristic is then introduced to solve each sub-scheduling problem. Secondly, some subsequences are operated on with a certain probability in the pulse emission and loudness phases. An intensive virtual population neighborhood search is integrated into the discrete bat algorithm to further improve the performance. Finally, the experimental results show the suitability and efficiency of the present discrete bat algorithm for the optimal permutation flow shop scheduling problem.
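
    The NEH constructive heuristic referenced in both records is simple enough to sketch directly: jobs are ordered by decreasing total processing time and inserted one at a time at the position that minimizes the partial-sequence makespan. The toy instance below is random; the bat-algorithm components (pulse emission, loudness, neighborhood search) are not reproduced.

```python
import numpy as np

def makespan(p, order):
    """Completion time of the last job on the last machine.

    p: processing-time matrix of shape (n_jobs, n_machines).
    """
    n_m = p.shape[1]
    c = np.zeros(n_m)  # rolling row of completion times
    for j in order:
        for m in range(n_m):
            # C[j][m] = max(C[j-1][m], C[j][m-1]) + p[j][m]
            c[m] = max(c[m], c[m - 1] if m else 0.0) + p[j, m]
    return c[-1]

def neh(p):
    """NEH constructive heuristic for the permutation flow shop problem."""
    n_jobs = p.shape[0]
    # 1) order jobs by decreasing total processing time
    jobs = sorted(range(n_jobs), key=lambda j: -p[j].sum())
    seq = [jobs[0]]
    # 2) insert each remaining job at its best position in the partial sequence
    for j in jobs[1:]:
        best = min((makespan(p, seq[:k] + [j] + seq[k:]), k)
                   for k in range(len(seq) + 1))
        seq.insert(best[1], j)
    return seq, makespan(p, seq)

# Toy instance: 6 jobs x 3 machines with random processing times.
p = np.random.default_rng(4).integers(1, 20, size=(6, 3))
seq, cmax = neh(p)
print("NEH sequence:", seq, "makespan:", cmax)
```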

  12. Levels of Conceptual Development in Melodic Permutation Concepts Based on Piaget's Theory

    ERIC Educational Resources Information Center

    Larn, Ronald L.

    1973-01-01

    Article considered different ways in which subjects at different age levels solved a musical task involving melodic permutation. The differences in responses to the musical task between age groups were judged to be compatible with Piaget's theory of cognitive development. (Author/RK)

  13. In Response to Rowland on "Realism and Debateability in Policy Advocacy."

    ERIC Educational Resources Information Center

    Herbeck, Dale A.; Katsulas, John P.

    1986-01-01

    Argues that Robert Rowland has overstated the case against the permutation process for assessing counterplan competitiveness. Claims that the permutation standard is a viable method for ascertaining counterplan competitiveness. Examines Rowland's alternative and argues that it is an unsatisfactory method for determining counterplan…

  14. EPEPT: A web service for enhanced P-value estimation in permutation tests

    PubMed Central

    2011-01-01

    Background In computational biology, permutation tests have become a widely used tool to assess the statistical significance of an event under investigation. However, the common way of computing the P-value, which expresses the statistical significance, requires a very large number of permutations when small (and thus interesting) P-values are to be accurately estimated. This is computationally expensive and often infeasible. Recently, we proposed an alternative estimator, which requires far fewer permutations compared to the standard empirical approach while still reliably estimating small P-values [1]. Results The proposed P-value estimator has been enriched with additional functionalities and is made available to the general community through a public website and web service, called EPEPT. This means that the EPEPT routines can be accessed not only via a website, but also programmatically using any programming language that can interact with the web. Examples of web service clients in multiple programming languages can be downloaded. Additionally, EPEPT accepts data of various common experiment types used in computational biology. For these experiment types EPEPT first computes the permutation values and then performs the P-value estimation. Finally, the source code of EPEPT can be downloaded. Conclusions Different types of users, such as biologists, bioinformaticians and software engineers, can use the method in an appropriate and simple way. Availability http://informatics.systemsbiology.net/EPEPT/ PMID:22024252

  15. Automated EEG entropy measurements in coma, vegetative state/unresponsive wakefulness syndrome and minimally conscious state

    PubMed Central

    Gosseries, Olivia; Schnakers, Caroline; Ledoux, Didier; Vanhaudenhuyse, Audrey; Bruno, Marie-Aurélie; Demertzi, Athéna; Noirhomme, Quentin; Lehembre, Rémy; Damas, Pierre; Goldman, Serge; Peeters, Erika; Moonen, Gustave; Laureys, Steven

    Summary Monitoring the level of consciousness in brain-injured patients with disorders of consciousness is crucial as it provides diagnostic and prognostic information. Behavioral assessment remains the gold standard for assessing consciousness but previous studies have shown a high rate of misdiagnosis. This study aimed to investigate the usefulness of electroencephalography (EEG) entropy measurements in differentiating unconscious (coma or vegetative) from minimally conscious patients. Left fronto-temporal EEG recordings (10-minute resting state epochs) were prospectively obtained in 56 patients and 16 age-matched healthy volunteers. Patients were assessed in the acute (≤1 month post-injury; n=29) or chronic (>1 month post-injury; n=27) stage. The etiology was traumatic in 23 patients. Automated online EEG entropy calculations (providing an arbitrary value ranging from 0 to 91) were compared with behavioral assessments (Coma Recovery Scale-Revised) and outcome. EEG entropy correlated with Coma Recovery Scale total scores (r=0.49). Mean EEG entropy values were higher in minimally conscious (73±19; mean and standard deviation) than in vegetative/unresponsive wakefulness syndrome patients (45±28). Receiver operating characteristic analysis revealed an entropy cut-off value of 52 differentiating acute unconscious from minimally conscious patients (sensitivity 89% and specificity 90%). In chronic patients, entropy measurements offered no reliable diagnostic information. EEG entropy measurements did not allow prediction of outcome. User-independent time-frequency balanced spectral EEG entropy measurements seem to constitute an interesting diagnostic – albeit not prognostic – tool for assessing neural network complexity in disorders of consciousness in the acute setting. Future studies are needed before using this tool in routine clinical practice, and these should seek to improve automated EEG quantification paradigms in order to reduce the remaining false negative and false positive findings. PMID:21693085

  16. Subband Image Coding with Jointly Optimized Quantizers

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Chung, Wilson C.; Smith, Mark J. T.

    1995-01-01

    An iterative design algorithm for the joint design of complexity- and entropy-constrained subband quantizers and associated entropy coders is proposed. Unlike conventional subband design algorithms, the proposed algorithm does not require the use of various bit allocation algorithms. Multistage residual quantizers are employed here because they provide greater control of the complexity-performance tradeoffs, and also because they allow efficient and effective high-order statistical modeling. The resulting subband coder exploits statistical dependencies within subbands, across subbands, and across stages, mainly through complexity-constrained high-order entropy coding. Experimental results demonstrate that the complexity-rate-distortion performance of the new subband coder is exceptional.

  17. Introduction to Permutation and Resampling-Based Hypothesis Tests

    ERIC Educational Resources Information Center

    LaFleur, Bonnie J.; Greevy, Robert A.

    2009-01-01

    A resampling-based method of inference--permutation tests--is often used when distributional assumptions are questionable or unmet. Not only are these methods useful for obvious departures from parametric assumptions (e.g., normality) and small sample sizes, but they are also more robust than their parametric counterparts in the presence of…
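
    The basic procedure discussed in this and the following record can be stated in a few lines: pool the observations, repeatedly reshuffle the group labels, recompute the statistic, and compare the observed value with the resulting null distribution. A minimal sketch for a difference in means, on toy data:

```python
import numpy as np

def permutation_test_mean_diff(x, y, n_perm=10000, seed=0):
    """Two-sample permutation test for a difference in means.

    Under the null of exchangeability, group labels are reshuffled and the
    statistic recomputed; the p-value is the fraction of permuted statistics
    at least as extreme as the observed one.
    """
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    n_x = len(x)
    observed = x.mean() - y.mean()
    null = np.empty(n_perm)
    for i in range(n_perm):
        z = rng.permutation(pooled)
        null[i] = z[:n_x].mean() - z[n_x:].mean()
    return observed, (np.abs(null) >= abs(observed)).mean()

# Toy data for illustration only.
x = np.array([12.1, 9.8, 11.4, 10.9, 12.7])
y = np.array([9.2, 10.1, 8.8, 9.9, 9.5])
obs, p = permutation_test_mean_diff(x, y)
print("observed difference:", obs, "permutation p-value:", p)
```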

  18. Explorations in Statistics: Permutation Methods

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2012-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eighth installment of "Explorations in Statistics" explores permutation methods, empiric procedures we can use to assess an experimental result--to test a null hypothesis--when we are reluctant to trust statistical…

  19. Suppression of new particle formation from monoterpene oxidation by NOx

    NASA Astrophysics Data System (ADS)

    Wildt, J.; Mentel, T. F.; Kiendler-Scharr, A.; Hoffmann, T.; Andres, S.; Ehn, M.; Kleist, E.; Müsgen, P.; Rohrer, F.; Rudich, Y.; Springer, M.; Tillmann, R.; Wahner, A.

    2013-10-01

    The impact of nitrogen oxides (NOx = NO + NO2) on new particle formation (NPF) and on photochemical ozone production from real plant volatile organic compound (BVOC) emissions was studied in a laboratory set up. At high NOx conditions (BVOC/NOx < 7, NOx > 23 ppb) no new particles were formed. Instead, photochemical ozone formation was observed, resulting in higher hydroxyl radical (OH) and lower nitrogen monoxide (NO) concentrations. As soon as [NO] was reduced to below 1 ppb by OH reactions, NPF was observed. Adding high amounts of NOx caused NPF to be orders of magnitude slower than in analogous experiments at low NOx conditions (NOx ~ 300 ppt), although OH concentrations were higher. Varying the NO2 photolysis rate showed that NO was responsible for the suppression of NPF, suggesting that peroxy radicals are involved in NPF. The rates of NPF and photochemical ozone production were related by a power law dependence with an exponent of approximately -2. This exponent indicated that the overall peroxy radical concentration must have been the same whenever NPF appeared. Thus permutation reactions of first generation peroxy radicals cannot be the rate limiting step in NPF from monoterpene oxidation. It was concluded that permutation reactions of higher generation peroxy radical-like molecules limit the rate of new particle formation. In contrast to the strong effects on particle numbers, the formation of particle mass was much less sensitive, if at all, to NOx concentrations. Only at very high NOx concentrations were yields reduced, by about an order of magnitude.

  20. Suppression of new particle formation from monoterpene oxidation by NOx

    NASA Astrophysics Data System (ADS)

    Wildt, J.; Mentel, T. F.; Kiendler-Scharr, A.; Hoffmann, T.; Andres, S.; Ehn, M.; Kleist, E.; Müsgen, P.; Rohrer, F.; Rudich, Y.; Springer, M.; Tillmann, R.; Wahner, A.

    2014-03-01

    The impact of nitrogen oxides (NOx = NO + NO2) on new particle formation (NPF) and on photochemical ozone production from real plant volatile organic compound (BVOC) emissions was studied in a laboratory setup. At high NOx conditions ([BVOC] / [NOx] < 7, [NOx] > 23 ppb) new particle formation was suppressed. Instead, photochemical ozone formation was observed resulting in higher hydroxyl radical (OH) and lower nitrogen monoxide (NO) concentrations. When [NO] was reduced back to levels below 1 ppb by OH reactions, NPF was observed. Adding high amounts of NOx caused NPF to be slowed by orders of magnitude compared to analogous experiments at low NOx conditions ([NOx] ~300 ppt), although OH concentrations were higher. Varying NO2 photolysis enabled showing that NO was responsible for suppression of NPF. This suggests that peroxy radicals are involved in NPF. The rates of NPF and photochemical ozone production were related by power law dependence with an exponent approaching -2. This exponent indicated that the overall peroxy radical concentration must have been similar when NPF occurred. Thus, permutation reactions of first-generation peroxy radicals cannot be the rate limiting step in NPF from monoterpene oxidation. It was concluded that permutation reactions of higher generation peroxy-radical-like intermediates limit the rate of new particle formation. In contrast to the strong effects on the particle numbers, the formation of particle mass was substantially less sensitive to NOx concentrations. If at all, yields were reduced by about an order of magnitude only at very high NOx concentrations.

  1. Towards a novel look on low-frequency climate reconstructions

    NASA Astrophysics Data System (ADS)

    Kamenik, Christian; Goslar, Tomasz; Hicks, Sheila; Barnekow, Lena; Huusko, Antti

    2010-05-01

    Information on low-frequency (millennial to sub-centennial) climate change is often derived from sedimentary archives, such as peat profiles or lake sediments. Usually, these archives have non-annual and varying time resolution. Their dating is mainly based on radionuclides, which provide probabilistic age-depth relationships with complex error structures. Dating uncertainties impede the interpretation of sediment-based climate reconstructions. They complicate the calculation of time-dependent rates. In most cases, they make any calibration in time impossible. Sediment-based climate proxies are therefore often presented as a single, best-guess time series without proper calibration and error estimation. Errors along time and dating errors that propagate into the calculation of time-dependent rates are neglected. Our objective is to overcome the aforementioned limitations by using a 'swarm' or 'ensemble' of reconstructions instead of a single best-guess. The novelty of our approach is to take into account age-depth uncertainties by permuting through a large number of potential age-depth relationships of the archive of interest. For each individual permutation we can then calculate rates, calibrate proxies in time, and reconstruct the climate-state variable of interest. From the resulting swarm of reconstructions, we can derive realistic estimates of even complex error structures. The likelihood of reconstructions is visualized by a grid of two-dimensional kernels that take into account probabilities along time and the climate-state variable of interest simultaneously. For comparison and regional synthesis, likelihoods can be scored against other independent climate time series.

  2. A survey of the role of thermodynamic stability in viscous flow

    NASA Technical Reports Server (NTRS)

    Horne, W. C.; Smith, C. A.; Karamcheti, K.

    1991-01-01

    The stability of near-equilibrium states has been studied as a branch of the general field of nonequilibrium thermodynamics. By treating steady viscous flow as an open thermodynamic system, nonequilibrium principles such as the condition of minimum entropy-production rate for steady, near-equilibrium processes can be used to generate flow distributions from variational analyses. Examples considered in this paper are steady heat conduction, channel flow, and unconstrained three-dimensional flow. The entropy-production-rate condition has also been used for hydrodynamic stability criteria, and calculations of the stability of a laminar wall jet support this interpretation.

  3. Optimal behavior of viscoelastic flow at resonant frequencies.

    PubMed

    Lambert, A A; Ibáñez, G; Cuevas, S; del Río, J A

    2004-11-01

    The global entropy generation rate in the zero-mean oscillatory flow of a Maxwell fluid in a pipe is analyzed with the aim of determining its behavior at resonant flow conditions. This quantity is calculated explicitly using the analytic expression for the velocity field and assuming isothermal conditions. The global entropy generation rate shows well-defined peaks at the resonant frequencies where the flow displays maximum velocities. It was found that resonant frequencies can be considered optimal in the sense that they maximize the power transmitted to the pulsating flow at the expense of maximum dissipation.

  4. Entropy-Based Approach To Nonlinear Stability

    NASA Technical Reports Server (NTRS)

    Merriam, Marshal L.

    1991-01-01

    NASA technical memorandum suggests that schemes for the numerical solution of the differential equations of flow can be made more accurate and robust by invoking the second law of thermodynamics. Proposes that, instead of using artificial viscosity to suppress unphysical solutions such as spurious numerical oscillations and nonlinear instabilities, one should formulate the equations so that the rate of entropy production within each cell of the computational grid is nonnegative, as required by the second law.

  5. Entropy of orthogonal polynomials with Freud weights and information entropies of the harmonic oscillator potential

    NASA Astrophysics Data System (ADS)

    Van Assche, W.; Yáñez, R. J.; Dehesa, J. S.

    1995-08-01

    The information entropy of the harmonic oscillator potential V(x) = (1/2)λx² in both position and momentum spaces can be expressed in terms of the so-called "entropy of Hermite polynomials," i.e., the quantity S_n(H) := -∫_{-∞}^{+∞} H_n²(x) log H_n²(x) e^{-x²} dx. These polynomials are instances of the polynomials orthogonal with respect to the Freud weights w(x) = exp(-|x|^m), m > 0. Here, a very precise and general result on the entropy of Freud polynomials recently established by Aptekarev et al. [J. Math. Phys. 35, 4423-4428 (1994)], specialized to the Hermite kernel (case m = 2), leads to an important refined asymptotic expression for the information entropies of very excited states (i.e., for large n) in both position and momentum spaces, denoted by Sρ and Sγ, respectively. Briefly, it is shown that, for large values of n, Sρ + (1/2) log λ ≃ log(π√(2n)/e) + o(1) and Sγ - (1/2) log λ ≃ log(π√(2n)/e) + o(1), so that Sρ + Sγ ≃ log(2π²n/e²) + o(1), in agreement with the generalized indetermination relation of Bialynicki-Birula and Mycielski [Commun. Math. Phys. 44, 129-132 (1975)]. Finally, the rate of convergence of these two information entropies is numerically analyzed. In addition, using a result of Rakhmanov, we describe a totally new proof of the leading term of the entropy of Freud polynomials which, naturally, is just a weak version of the aforementioned general result.
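
    A rough numerical check of the quoted large-n behavior is straightforward for λ = 1, where the position and momentum entropies coincide, so Sρ + Sγ = 2Sρ. The grid limits and step below are ad hoc choices; the two printed values should approach each other as n grows.

```python
import math
import numpy as np
from numpy.polynomial.hermite import hermval

def oscillator_position_entropy(n, dx=1e-3):
    """Shannon entropy of |psi_n(x)|^2 for the harmonic oscillator (lambda = 1)."""
    x_max = math.sqrt(2 * n + 1) + 10.0          # well past the classical turning point
    x = np.arange(-x_max, x_max, dx)
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0                              # selects H_n in hermval
    norm = math.sqrt(2.0 ** n * math.factorial(n) * math.sqrt(math.pi))
    psi = hermval(x, coeffs) * np.exp(-x ** 2 / 2.0) / norm
    rho = psi ** 2
    integrand = np.zeros_like(rho)
    mask = rho > 0
    integrand[mask] = -rho[mask] * np.log(rho[mask])
    return integrand.sum() * dx                  # simple Riemann sum is enough here

n = 50
s_rho = oscillator_position_entropy(n)
print("numerical   S_rho + S_gamma  =", 2 * s_rho)   # S_gamma = S_rho when lambda = 1
print("asymptotic  log(2 pi^2 n/e^2) =", math.log(2 * math.pi ** 2 * n / math.e ** 2))
```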

  6. Shear viscosity to entropy density ratios and implications for (im)perfect fluidity in Fermionic and Bosonic superfluids

    NASA Astrophysics Data System (ADS)

    Boyack, Rufus; Guo, Hao; Levin, K.

    2015-03-01

    Recent experiments on both unitary Fermi gases and high temperature superconductors (arxiv:1410.4835 [cond-mat.quant-gas], arxiv:1409.5820 [cond-mat.str-el].) have led to renewed interest in near perfect fluidity in condensed matter systems. This is quantified by studying the ratio of shear viscosity to entropy density. In this talk we present calculations of this ratio in homogeneous bosonic and fermionic superfluids, with the latter ranging from BCS to BEC. While the shear viscosity exhibits a power law (for bosons) or exponential suppression (for fermions), a similar dependence is found for the respective entropy densities. As a result, strict BCS and (true) bosonic superfluids have an analogous viscosity to entropy density ratio, behaving linearly with temperature times the (T-dependent) dissipation rate; this is characteristic of imperfect fluidity in weakly coupled fluids. This is contrasted with the behavior of fermions at unitarity which we argue is a consequence of additional terms in the entropy density thereby leading to more perfect fluidity. (arXiv:1407.7572v1 [cond-mat.quant-gas])

  7. Measurement of entropy generation within bypass transitional flow

    NASA Astrophysics Data System (ADS)

    Skifton, Richard; Budwig, Ralph; McEligot, Donald; Crepeau, John

    2012-11-01

    A flat plate made from quartz was submersed in the Idaho National Laboratory's Matched Index of Refraction (MIR) flow facility. PIV was utilized to capture spatial vector maps at near-wall locations, with five to ten points within the viscous sublayer. Entropy generation was calculated directly from measured velocity fluctuation derivatives. Two flows were studied: a zero pressure gradient and an adverse pressure gradient (β = -0.039). The free stream turbulence intensity used to drive bypass transition ranged between 3% (near the trailing edge) and 8% (near the leading edge). The pointwise entropy generation rate will be utilized as a design parameter to systematically reduce losses. As a second observation, the pointwise entropy can be shown to predict the onset of transitional flow. This research was partially supported by the DOE EPSCoR program, grant DE-SC0004751, and by the Idaho National Laboratory, Center for Advanced Energy Studies.

  8. A method for the fast estimation of a battery entropy-variation high-resolution curve - Application on a commercial LiFePO4/graphite cell

    NASA Astrophysics Data System (ADS)

    Damay, Nicolas; Forgez, Christophe; Bichat, Marie-Pierre; Friedrich, Guy

    2016-11-01

    The entropy-variation of a battery is responsible for heat generation or consumption during operation and its prior measurement is mandatory for developing a thermal model. It is generally done through the potentiometric method which is considered as a reference. However, it requires several days or weeks to get a look-up table with a 5 or 10% SoC (State of Charge) resolution. In this study, a calorimetric method based on the inversion of a thermal model is proposed for the fast estimation of a nearly continuous curve of entropy-variation. This is achieved by separating the heats produced while charging and discharging the battery. The entropy-variation is then deduced from the extracted entropic heat. The proposed method is validated by comparing the results obtained with several current rates to measurements made with the potentiometric method.

  9. Improved wavelet packet classification algorithm for vibrational intrusions in distributed fiber-optic monitoring systems

    NASA Astrophysics Data System (ADS)

    Wang, Bingjie; Pi, Shaohua; Sun, Qi; Jia, Bo

    2015-05-01

    An improved classification algorithm that considers multiscale wavelet packet Shannon entropy is proposed. Decomposition coefficients at all levels are obtained to build the initial Shannon entropy feature vector. After subtracting the Shannon entropy map of the background signal, components of the strongest discriminating power in the initial feature vector are picked out to rebuild the Shannon entropy feature vector, which is transferred to radial basis function (RBF) neural network for classification. Four types of man-made vibrational intrusion signals are recorded based on a modified Sagnac interferometer. The performance of the improved classification algorithm has been evaluated by the classification experiments via RBF neural network under different diffusion coefficients. An 85% classification accuracy rate is achieved, which is higher than the other common algorithms. The classification results show that this improved classification algorithm can be used to classify vibrational intrusion signals in an automatic real-time monitoring system.
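
    One common way to build such a multiscale wavelet-packet Shannon-entropy feature vector is sketched below with PyWavelets; the wavelet family, decomposition depth, and the energy-normalized entropy convention are assumptions, and the background subtraction and feature-selection steps of the paper are only indicated in a comment.

```python
import numpy as np
import pywt

def wp_shannon_entropy_features(signal, wavelet="db4", max_level=4):
    """Shannon entropy of every wavelet-packet node up to max_level."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                            mode="symmetric", maxlevel=max_level)
    features = []
    for level in range(1, max_level + 1):
        for node in wp.get_level(level, order="natural"):
            c = np.asarray(node.data, dtype=float)
            p = c ** 2 / np.sum(c ** 2)           # normalized subband energy
            p = p[p > 0]
            features.append(-np.sum(p * np.log(p)))
    return np.array(features)

# Toy vibration-like signal: a chirp buried in noise.
t = np.linspace(0, 1, 4096)
x = (np.sin(2 * np.pi * (50 + 200 * t) * t)
     + 0.5 * np.random.default_rng(5).standard_normal(t.size))

feats = wp_shannon_entropy_features(x)
# In the paper's pipeline these vectors (after subtracting a background entropy
# map and keeping the most discriminative components) feed an RBF neural network.
print("feature vector length:", feats.size)
```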

  10. NASA Thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The NASA Thesaurus -- Volume 2, Access Vocabulary -- contains an alphabetical listing of all Thesaurus terms (postable and nonpostable) and permutations of all multiword and pseudo-multiword terms. Also included are Other Words (non-Thesaurus terms) consisting of abbreviations, chemical symbols, etc. The permutations and Other Words provide 'access' to the appropriate postable entries in the Thesaurus.

  11. A Permutation Test for Correlated Errors in Adjacent Questionnaire Items

    ERIC Educational Resources Information Center

    Hildreth, Laura A.; Genschel, Ulrike; Lorenz, Frederick O.; Lesser, Virginia M.

    2013-01-01

    Response patterns are of importance to survey researchers because of the insight they provide into the thought processes respondents use to answer survey questions. In this article we propose the use of structural equation modeling to examine response patterns and develop a permutation test to quantify the likelihood of observing a specific…

  12. The Parity Theorem Shuffle

    ERIC Educational Resources Information Center

    Smith, Michael D.

    2016-01-01

    The Parity Theorem states that any permutation can be written as a product of transpositions, but no permutation can be written as a product of both an even number and an odd number of transpositions. Most proofs of the Parity Theorem take several pages of mathematical formalism to complete. This article presents an alternative but equivalent…

  13. Heuristic Implementation of Dynamic Programming for Matrix Permutation Problems in Combinatorial Data Analysis

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie

    2008-01-01

    Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…

  14. permGPU: Using graphics processing units in RNA microarray association studies.

    PubMed

    Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros

    2010-06-16

    Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, that are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.

  15. Testing for the Presence of Correlation Changes in a Multivariate Time Series: A Permutation Based Approach.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva

    2018-01-15

    Detecting abrupt correlation changes in multivariate time series is crucial in many application fields, such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from a severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs on par with or better than the state-of-the-art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.

  16. A new EEG synchronization strength analysis method: S-estimator based normalized weighted-permutation mutual information.

    PubMed

    Cui, Dong; Pu, Weiting; Liu, Jing; Bian, Zhijie; Li, Qiuli; Wang, Lei; Gu, Guanghua

    2016-10-01

    Synchronization is an important mechanism for understanding information processing in normal or abnormal brains. In this paper, we propose a new method called normalized weighted-permutation mutual information (NWPMI) for two-variable signal synchronization analysis and combine NWPMI with the S-estimator measure to generate a new method named S-estimator based normalized weighted-permutation mutual information (SNWPMI) for analyzing multi-channel electroencephalographic (EEG) synchronization strength. The performance of the NWPMI, including the effects of time delay, embedding dimension, coupling coefficients, signal-to-noise ratios (SNRs) and data length, is evaluated using a coupled Hénon map model. The results show that the NWPMI is superior in describing synchronization compared with the normalized permutation mutual information (NPMI). Furthermore, the proposed SNWPMI method is applied to analyze scalp EEG data from 26 amnestic mild cognitive impairment (aMCI) subjects and 20 age-matched controls with normal cognitive function, who both suffer from type 2 diabetes mellitus (T2DM). The proposed methods NWPMI and SNWPMI are suggested to be effective indices for estimating synchronization strength. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Sorting signed permutations by inversions in O(n log n) time.

    PubMed

    Swenson, Krister M; Rajan, Vaibhav; Lin, Yu; Moret, Bernard M E

    2010-03-01

    The study of genomic inversions (or reversals) has been a mainstay of computational genomics for nearly 20 years. After the initial breakthrough of Hannenhalli and Pevzner, who gave the first polynomial-time algorithm for sorting signed permutations by inversions, improved algorithms have been designed, culminating with an optimal linear-time algorithm for computing the inversion distance and a subquadratic algorithm for providing a shortest sequence of inversions--also known as sorting by inversions. Remaining open was the question of whether sorting by inversions could be done in O(n log n) time. In this article, we present a qualified answer to this question, by providing two new sorting algorithms, a simple and fast randomized algorithm and a deterministic refinement. The deterministic algorithm runs in time O(n log n + kn), where k is a data-dependent parameter. We provide the results of extensive experiments showing that both the average and the standard deviation for k are small constants, independent of the size of the permutation. We conclude (but do not prove) that almost all signed permutations can be sorted by inversions in O(n log n) time.

  18. NOTE: Entropy-based automated classification of independent components separated from fMCG

    NASA Astrophysics Data System (ADS)

    Comani, S.; Srinivasan, V.; Alleva, G.; Romani, G. L.

    2007-03-01

    Fetal magnetocardiography (fMCG) is a noninvasive technique suitable for the prenatal diagnosis of fetal heart function. Reliable fetal cardiac signals can be reconstructed from multi-channel fMCG recordings by means of independent component analysis (ICA). However, the identification of the separated components is usually accomplished by visual inspection. This paper discusses a novel automated system based on entropy estimators, namely approximate entropy (ApEn) and sample entropy (SampEn), for the classification of independent components (ICs). The system was validated on 40 fMCG datasets of normal fetuses with gestational ages ranging from 22 to 37 weeks. Both ApEn and SampEn were able to measure the stability and predictability of the physiological signals separated with ICA, and the entropy values of the three categories were significantly different at p < 0.01. The system's performance was compared with that of a method based on the analysis of the time and frequency content of the components. The outcomes of this study showed a superior performance of the entropy-based system, in particular for early gestation, with an overall IC detection rate of 98.75% and 97.92% for ApEn and SampEn respectively, as against a value of 94.50% obtained with the time-frequency-based system.
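    For readers unfamiliar with the entropy estimators mentioned here, a minimal brute-force sample entropy (SampEn) implementation is sketched below; the choice of tolerance r as a fraction of the signal's standard deviation is a common convention and an assumption, not necessarily the authors' exact setting.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=None):
        """Brute-force SampEn(m, r) of a 1-D series (sketch)."""
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.2 * x.std()            # common choice; an assumption here

        def count_matches(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            count = 0
            for i in range(len(templates)):
                # Chebyshev distance to all later templates (each pair once).
                d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(d <= r)
            return count

        B, A = count_matches(m), count_matches(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf
    ```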

  19. EXPLICIT SYMPLECTIC-LIKE INTEGRATORS WITH MIDPOINT PERMUTATIONS FOR SPINNING COMPACT BINARIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Junjie; Wu, Xin; Huang, Guoqing

    2017-01-01

    We refine the recently developed fourth-order extended phase space explicit symplectic-like methods for inseparable Hamiltonians using Yoshida's triple product combined with a midpoint permuted map. The midpoint between the original variables and their corresponding extended variables at every integration step is readjusted as the initial values of the original variables and their corresponding extended ones at the next integration step. The triple-product construction is apparently superior to the composition of two triple products in computational efficiency. Above all, the new midpoint permutations are more effective in restraining the equality of the original variables and their corresponding extended ones at each integration step than the existing sequent permutations of momenta and coordinates. As a result, our new construction shares the benefit of implicit symplectic integrators in the conservation of the second post-Newtonian Hamiltonian of spinning compact binaries. Especially for the chaotic case, it can work well, but the existing sequent permuted algorithm cannot. When dissipative effects from the gravitational radiation reaction are included, the new symplectic-like method has a secular drift in the energy error of the dissipative system for the orbits that are regular in the absence of radiation, as an implicit symplectic integrator does. In spite of this, it is superior to the same-order implicit symplectic integrator in accuracy and efficiency. The new method is particularly useful in discussing the long-term evolution of inseparable Hamiltonian problems.

  20. A studentized permutation test for three-arm trials in the 'gold standard' design.

    PubMed

    Mütze, Tobias; Konietschke, Frank; Munk, Axel; Friede, Tim

    2017-03-15

    The 'gold standard' design for three-arm trials refers to trials with an active control and a placebo control in addition to the experimental treatment group. This trial design is recommended when it is ethically justifiable, as it allows the simultaneous comparison of experimental treatment, active control, and placebo. Parametric testing methods have been studied extensively over the past years. However, these methods often tend to be liberal or conservative when distributional assumptions are not met, particularly with small sample sizes. In this article, we introduce a studentized permutation test for testing non-inferiority and superiority of the experimental treatment compared with the active control in three-arm trials in the 'gold standard' design. The performance of the studentized permutation test for finite sample sizes is assessed in a Monte Carlo simulation study under various parameter constellations. Emphasis is put on whether the studentized permutation test meets the target significance level. For comparison purposes, commonly used Wald-type tests, which do not make any distributional assumptions, are included in the simulation study. The simulation study shows that the presented studentized permutation test for assessing non-inferiority in three-arm trials in the 'gold standard' design outperforms its competitors, for instance the test based on a quasi-Poisson model, for count data. The methods discussed in this paper are implemented in the R package ThreeArmedTrials which is available on the comprehensive R archive network (CRAN). Copyright © 2016 John Wiley & Sons, Ltd.
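    The general idea of a studentized permutation test can be illustrated with a simple two-sample version (studentize the statistic, then permute group labels); this is a sketch of the principle only, not the three-arm non-inferiority test implemented in the ThreeArmedTrials package.

    ```python
    import numpy as np

    def studentized_permutation_test(x, y, n_perm=10000, seed=0):
        """Two-sample studentized permutation test (sketch of the principle)."""
        rng = np.random.default_rng(seed)
        x, y = np.asarray(x, float), np.asarray(y, float)
        pooled, nx = np.concatenate([x, y]), len(x)

        def stat(a, b):
            # Welch-type studentized mean difference.
            return (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / len(a)
                                                   + b.var(ddof=1) / len(b))

        t_obs = stat(x, y)
        t_perm = np.empty(n_perm)
        for k in range(n_perm):
            idx = rng.permutation(len(pooled))
            t_perm[k] = stat(pooled[idx[:nx]], pooled[idx[nx:]])
        # Two-sided p-value with the usual +1 correction.
        return (np.sum(np.abs(t_perm) >= abs(t_obs)) + 1) / (n_perm + 1)
    ```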

  1. LiDAR-Derived Surface Roughness Signatures of Basaltic Lava Types at the Muliwai a Pele Lava Channel, Mauna Ulu, Hawai'i

    NASA Technical Reports Server (NTRS)

    Whelley, Patrick L.; Garry, W. Brent; Hamilton, Christopher W.; Bleacher, Jacob E.

    2017-01-01

    We used light detection and ranging (LiDAR) data to calculate roughness patterns (homogeneity, mean-roughness, and entropy) for five lava types at two different resolutions (1.5 and 0.1 m/pixel). We found that end-member types ('a'a and pahoehoe) are separable (with 95% confidence) at both scales, indicating that roughness patterns are well suited for analyzing types of lava. Intermediate lavas were also explored, and we found that slabby-pahoehoe is separable from the other end-members using 1.5 m/pixel data, but not in the 0.1 m/pixel analysis. This suggests that the conversion from pahoehoe to slabby-pahoehoe is a meter-scale process, and the finer roughness characteristics of pahoehoe, such as ropes and toes, are not significantly affected. Furthermore, we introduce the ratio ENT/HOM (derived from lava roughness) as a proxy for assessing local lava flow rate from topographic data. High entropy and low homogeneity regions correlate with high flow rate while low entropy and high homogeneity regions correlate with low flow rate. We suggest that this relationship is not directional; rather, it is apparent through roughness differences of the associated lava types emplaced at high and low rates, respectively.
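    Homogeneity and entropy of this kind are classic grey-level co-occurrence matrix (GLCM) texture measures; a simplified single-offset version for a quantized roughness or elevation raster is sketched below. This is an illustrative sketch, not the authors' processing chain, and the number of grey levels and the offset are assumptions.

    ```python
    import numpy as np

    def glcm_features(img, levels=32, dx=1, dy=0):
        """Homogeneity and entropy from a single-offset GLCM (sketch)."""
        img = np.asarray(img, dtype=float)
        # Quantize the raster to `levels` grey levels.
        edges = np.linspace(img.min(), img.max(), levels + 1)[1:-1]
        q = np.digitize(img, edges)
        glcm = np.zeros((levels, levels))
        h, w = q.shape
        for r in range(h - dy):
            for c in range(w - dx):
                glcm[q[r, c], q[r + dy, c + dx]] += 1
        p = glcm / glcm.sum()
        i, j = np.indices(p.shape)
        homogeneity = np.sum(p / (1.0 + (i - j) ** 2))
        nz = p > 0
        entropy = -np.sum(p[nz] * np.log2(p[nz]))
        return homogeneity, entropy
    ```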

  2. Sex differences in the fetal heart rate variability indices of twins.

    PubMed

    Tendais, Iva; Figueiredo, Bárbara; Gonçalves, Hernâni; Bernardes, João; Ayres-de-Campos, Diogo; Montenegro, Nuno

    2015-03-01

    To evaluate the differences in linear and complex heart rate dynamics in twin pairs according to fetal sex combination [male-female (MF), male-male (MM), and female-female (FF)]. Fourteen twin pairs (6 MF, 3 MM, and 5 FF) were monitored between 31 and 36.4 weeks of gestation. Twenty-six fetal heart rate (FHR) recordings of both twins were simultaneously acquired and analyzed with a system for computerized analysis of cardiotocograms. Linear and nonlinear FHR indices were calculated. Overall, MM twins presented higher intrapair average in linear indices than the other pairs, whereas FF twins showed higher sympathetic-vagal balance. MF twins exhibited higher intrapair average in entropy indices and MM twins presented lower entropy values than FF twins considering the (automatically selected) threshold rLu. MM twin pairs showed higher intrapair differences in linear heart rate indices than MF and FF twins, whereas FF twins exhibited lower intrapair differences in entropy indices. The results of this exploratory study suggest that twins have sex-specific differences in linear and nonlinear indices of FHR. MM twins expressed signs of a more active autonomic nervous system and MF twins showed the most active complexity control system. These results suggest that fetal sex combination should be taken into consideration when performing detailed evaluation of the FHR in twins.

  3. Aeroacoustic and aerodynamic applications of the theory of nonequilibrium thermodynamics

    NASA Technical Reports Server (NTRS)

    Horne, W. Clifton; Smith, Charles A.; Karamcheti, Krishnamurty

    1991-01-01

    Recent developments in the field of nonequilibrium thermodynamics associated with viscous flows are examined and related to developments in the understanding of specific phenomena in aerodynamics and aeroacoustics. A key element of the nonequilibrium theory is the principle of minimum entropy production rate for steady dissipative processes near equilibrium, and variational calculus is used to apply this principle to several examples of viscous flow. A review of nonequilibrium thermodynamics and its role in fluid motion is presented. Several formulations are presented of the local entropy production rate and the local energy dissipation rate, two quantities that are of central importance to the theory. These expressions and the principle of minimum entropy production rate for steady viscous flows are used to identify parallel-wall channel flow and irrotational flow as having minimally dissipative velocity distributions. Features of irrotational, steady, viscous flow near an airfoil, such as the effect of trailing-edge radius on circulation, are also found to be compatible with the minimum principle. Finally, the minimum principle is used to interpret the stability of infinitesimal and finite amplitude disturbances in an initially laminar, parallel shear flow, with results that are consistent with experiment and linearized hydrodynamic stability theory. These results suggest that a thermodynamic approach may be useful in unifying the understanding of many diverse phenomena in aerodynamics and aeroacoustics.

  4. Predicting depressed patients with suicidal ideation from ECG recordings.

    PubMed

    Khandoker, A H; Luthra, V; Abouallaban, Y; Saha, S; Ahmed, K I; Mostafa, R; Chowdhury, N; Jelinek, H F

    2017-05-01

    Globally, suicidal behavior is the third most common cause of death among patients with major depressive disorder (MDD). This study presents multi-lag tone-entropy (T-E) analysis of heart rate variability (HRV) as a screening tool for identifying MDD patients with suicidal ideation. Sixty-one ECG recordings (10 min) were acquired and analyzed from control subjects (29 CONT), 16 MDD subjects with suicidal ideation (MDDSI+) and 16 without (MDDSI-). After ECG preprocessing, tone and entropy values were calculated for multiple lags (m: 1-10). The MDDSI+ group was found to have a higher mean tone value than the MDDSI- group for lags 1-8, whereas the mean entropy value was lower in the MDDSI+ than in the CONT group at all lags (1-10). Leave-one-out cross-validation tests, using a classification and regression tree (CART), obtained 94.83% accuracy in predicting MDDSI+ subjects by using a combination of tone and entropy values at all lags together with demographic factors (age, BMI and waist circumference), compared to results with time- and frequency-domain HRV analysis. The results of this pilot study demonstrate the usefulness of multi-lag T-E analysis in identifying MDD patients with suicidal ideation and highlight the change in autonomic nervous system modulation of the heart rate associated with depression and suicidal ideation.
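    A minimal sketch of the tone and entropy indices at a given lag, following the usual definition in the tone-entropy literature (percentage index of lagged RR differences; the 1% bin width and lag handling here are assumptions, not the authors' exact settings), is given below.

    ```python
    import numpy as np

    def tone_entropy(rr, lag=1, bin_width=1.0):
        """Tone and entropy of an RR-interval series at a given lag (sketch).

        PI(n) = 100 * (RR(n) - RR(n + lag)) / RR(n); tone is the mean PI and
        entropy is the Shannon entropy of the PI histogram (bin width in %).
        """
        rr = np.asarray(rr, dtype=float)
        pi = 100.0 * (rr[:-lag] - rr[lag:]) / rr[:-lag]
        tone = pi.mean()
        edges = np.arange(pi.min(), pi.max() + bin_width, bin_width)
        counts, _ = np.histogram(pi, bins=edges)
        p = counts[counts > 0] / counts.sum()
        entropy = -np.sum(p * np.log2(p))
        return tone, entropy
    ```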

  5. The Anterior Insula Tracks Behavioral Entropy during an Interpersonal Competitive Game

    PubMed Central

    Matsumoto, Madoka; Matsumoto, Kenji; Omori, Takashi

    2015-01-01

    In competitive situations, individuals need to adjust their behavioral strategy dynamically in response to their opponent’s behavior. In the present study, we investigated the neural basis of how individuals adjust their strategy during a simple, competitive game of matching pennies. We used entropy as a behavioral index of randomness in decision-making, because maximizing randomness is thought to be an optimal strategy in the game, according to game theory. While undergoing functional magnetic resonance imaging (fMRI), subjects played matching pennies with either a human or computer opponent in each block, although in reality they played the game with the same computer algorithm under both conditions. The winning rate of each block was also manipulated. Both the opponent (human or computer), and the winning rate, independently affected subjects’ block-wise entropy during the game. The fMRI results revealed that activity in the bilateral anterior insula was positively correlated with subjects’ (not opponent’s) behavioral entropy during the game, which indicates that during an interpersonal competitive game, the anterior insula tracked how uncertain subjects’ behavior was, rather than how uncertain subjects felt their opponent's behavior was. Our results suggest that intuitive or automatic processes based on somatic markers may be a key to optimally adjusting behavioral strategies in competitive situations. PMID:26039634

  6. Analysis of cardiac signals using spatial filling index and time-frequency domain

    PubMed Central

    Faust, Oliver; Acharya U, Rajendra; Krishnan, SM; Min, Lim Choo

    2004-01-01

    Background Analysis of heart rate variability (HRV) has become a popular noninvasive tool for assessing the activities of the autonomic nervous system (ANS). HRV analysis is based on the concept that fast fluctuations may specifically reflect changes of sympathetic and vagal activity. It shows that the structure generating the signal is not simply linear, but also involves nonlinear contributions. These signals are essentially non-stationary; they may contain indicators of current disease, or even warnings about impending diseases. The indicators may be present at all times or may occur at random in the time scale. However, to study and pinpoint abnormalities in voluminous data collected over several hours is strenuous and time consuming. Methods This paper presents the spatial filling index and time-frequency analysis of the heart rate variability signal for disease identification. Renyi's entropy is evaluated for the signal in the Wigner-Ville and Continuous Wavelet Transform (CWT) domains. Results The Renyi entropy gives a lower p-value for the scalogram than for the Wigner-Ville distribution, and the contours of the scalogram visually show the features of the diseases; thus, in the time-frequency analysis, the Renyi entropy gives better results for the scalogram than for the Wigner-Ville distribution. Conclusion The spatial filling index and Renyi's entropy have distinct regions for various diseases, with an accuracy of more than 95%. PMID:15361254
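    A generic Rényi entropy of a normalized time-frequency distribution is sketched below (order 3 is a common choice in time-frequency analysis); taking the absolute value of the distribution is a simplification for the Wigner-Ville case, which can be negative, and is not tied to the paper's exact code.

    ```python
    import numpy as np

    def renyi_entropy_tfd(tfd, alpha=3):
        """Renyi entropy of order alpha for a time-frequency distribution."""
        p = np.abs(np.asarray(tfd, dtype=float))
        p = p / p.sum()                 # normalize to a distribution
        p = p[p > 0]
        # H_alpha = log2(sum p^alpha) / (1 - alpha)
        return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))
    ```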

  7. MCMC genome rearrangement.

    PubMed

    Miklós, István

    2003-10-01

    As more and more genomes have been sequenced, genomic data are rapidly accumulating. Genome-wide mutations are believed to be more neutral than local mutations such as substitutions, insertions and deletions; therefore, phylogenetic investigations based on inversions, transpositions and inverted transpositions are less biased by the hypothesis of neutral evolution. Although efficient algorithms exist for obtaining the inversion distance of two signed permutations, there is no reliable algorithm when both inversions and transpositions are considered. Moreover, different types of mutations happen at different rates, and it is not clear how to weight them in a distance-based approach. We introduce a Markov chain Monte Carlo method for genome rearrangement based on a stochastic model of evolution, which can estimate the number of different evolutionary events needed to sort a signed permutation. The performance of the method was tested on simulated data, and the estimated numbers of different types of mutations were reliable. Human and Drosophila mitochondrial data were also analysed with the new method. The mixing time of the Markov chain is short both in terms of CPU time and number of proposals. The source code in C is available on request from the author.

  8. The convergence rate of approximate solutions for nonlinear scalar conservation laws

    NASA Technical Reports Server (NTRS)

    Nessyahu, Haim; Tadmor, Eitan

    1991-01-01

    The convergence rate of approximate solutions for the nonlinear scalar conservation law is discussed. The linear convergence theory is extended into a weak regime. The extension is based on the usual two ingredients of stability and consistency. On the one hand, counterexamples show that one must strengthen the linearized L²-stability requirement. It is assumed that the approximate solutions are Lip⁺-stable in the sense that they satisfy a one-sided Lipschitz condition, in agreement with Oleinik's E-condition for the entropy solution. On the other hand, the lack of smoothness requires weakening the consistency requirement, which is measured in the Lip'-(semi)norm. It is proved, for Lip⁺-stable approximate solutions, that their Lip'-convergence rate to the entropy solution is of the same order as their Lip'-consistency. The Lip'-convergence rate is then converted into stronger Lᵖ convergence rate estimates.

  9. The evolutionary synchronization of the exchange rate system in ASEAN+6

    NASA Astrophysics Data System (ADS)

    Feng, Xiaobing; Hu, Haibo; Wang, Xiaofan

    2010-12-01

    Although there is extensive research on the behavior of the world currency network, the complexity of the Asian regional currency system is not well understood despite its importance. Using daily exchange rates, this paper examines exchange rate co-movements in the region before and after the China exchange rate reform. It was found that the correlation between Asian currencies and the US Dollar, the previous regional key currency, has become weaker, and intra-Asia interactions have increased. Cross-sample entropy and cross entropy approaches are also applied to examine the synchrony behavior among the Asian currencies. The study also shows that the Asian exchange rate markets studied are neither stochastic nor efficient. These findings may shed some light on the in-depth understanding of collective behaviors in a regional currency network; they will also lay a theoretical foundation for further policy formulation in Asian currency integration.
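    Cross-sample entropy quantifies the asynchrony of two series by matching templates taken from one series against the other; a brute-force sketch is given below (standardization and the tolerance r expressed in standard-deviation units are common conventions and assumptions, not necessarily the paper's settings).

    ```python
    import numpy as np

    def cross_sample_entropy(u, v, m=2, r=0.2):
        """Cross-SampEn(m, r) between two standardized series (sketch)."""
        u = (np.asarray(u, float) - np.mean(u)) / np.std(u)
        v = (np.asarray(v, float) - np.mean(v)) / np.std(v)
        n = min(len(u), len(v))

        def matches(mm):
            tu = np.array([u[i:i + mm] for i in range(n - m)])
            tv = np.array([v[j:j + mm] for j in range(n - m)])
            # Chebyshev distance between every template pair.
            d = np.max(np.abs(tu[:, None, :] - tv[None, :, :]), axis=2)
            return np.sum(d <= r)

        B, A = matches(m), matches(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf
    ```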

  10. Does horizon entropy satisfy a quantum null energy conjecture?

    NASA Astrophysics Data System (ADS)

    Fu, Zicao; Marolf, Donald

    2016-12-01

    A modern version of the idea that the area of event horizons gives 4G times an entropy is the Hubeny-Rangamani causal holographic information (CHI) proposal for holographic field theories. Given a region R of a holographic QFT, CHI computes A/4G on a certain cut of an event horizon in the gravitational dual. The result is naturally interpreted as a coarse-grained entropy for the QFT. CHI is known to be finitely greater than the fine-grained Hubeny-Rangamani-Takayanagi (HRT) entropy when ∂R lies on a Killing horizon of the QFT spacetime, and in this context satisfies other non-trivial properties expected of an entropy. Here we present evidence that it also satisfies the quantum null energy condition (QNEC), which bounds the second derivative of the entropy of a quantum field theory on one side of a non-expanding null surface by the flux of stress-energy across the surface. In particular, we show CHI to satisfy the QNEC in 1+1 holographic CFTs when evaluated in states dual to conical defects in AdS3. This surprising result further supports the idea that CHI defines a useful notion of coarse-grained holographic entropy, and suggests unprecedented bounds on the rate at which bulk horizon generators emerge from a caustic. To supplement our motivation, we include an appendix deriving a corresponding coarse-grained generalized second law for 1+1 holographic CFTs perturbatively coupled to dilaton gravity.

  11. Comparison of hemodynamic effects of intravenous etomidate versus propofol during induction and intubation using entropy guided hypnosis levels.

    PubMed

    Shah, Shagun Bhatia; Chowdhury, Itee; Bhargava, Ajay Kumar; Sabbharwal, Bhawnish

    2015-01-01

    This study aimed to compare the hemodynamic responses during induction and intubation between propofol and etomidate using entropy guided hypnosis. Sixty ASA I & II patients in the age group 20-60 yrs, scheduled for modified radical mastectomy, were randomly allocated to two groups based on the induction agent, etomidate or propofol. Both groups received intravenous midazolam 0.03 mg kg(-1) and fentanyl 2 μg kg(-1) as premedication. After induction with the desired agent titrated to entropy 40, vecuronium 0.1 mg kg(-1) was administered for neuromuscular blockade. Heart rate, systolic, diastolic and mean arterial pressures, response entropy [RE] and state entropy [SE] were recorded at baseline, at induction and up to three minutes post intubation. Data were subjected to statistical analysis in SPSS (version 12.0) using paired and unpaired Student's t-tests for equality of means. Etomidate provided hemodynamic stability without the requirement of any rescue drug in 96.6% of patients, whereas the rescue drug ephedrine was required in 36.6% of patients in the propofol group. Reduced induction doses, 0.15 mg kg(-1) for etomidate and 0.98 mg kg(-1) for propofol, sufficed to give an adequate anaesthetic depth based on entropy. Etomidate provides more hemodynamic stability than propofol during induction and intubation. Reduced induction doses of etomidate and propofol titrated to entropy translated into increased hemodynamic stability for both drugs and sufficed to give an adequate anaesthetic depth.

  12. Comparison of hemodynamic effects of intravenous etomidate versus propofol during induction and intubation using entropy guided hypnosis levels

    PubMed Central

    Shah, Shagun Bhatia; Chowdhury, Itee; Bhargava, Ajay Kumar; Sabbharwal, Bhawnish

    2015-01-01

    Background and Aims: This study aimed to compare the hemodynamic responses during induction and intubation between propofol and etomidate using entropy guided hypnosis. Material and Methods: Sixty ASA I & II patients in the age group 20-60 yrs, scheduled for modified radical mastectomy, were randomly allocated to two groups based on the induction agent, etomidate or propofol. Both groups received intravenous midazolam 0.03 mg kg-1 and fentanyl 2 μg kg-1 as premedication. After induction with the desired agent titrated to entropy 40, vecuronium 0.1 mg kg-1 was administered for neuromuscular blockade. Heart rate, systolic, diastolic and mean arterial pressures, response entropy [RE] and state entropy [SE] were recorded at baseline, at induction and up to three minutes post intubation. Data were subjected to statistical analysis in SPSS (version 12.0) using paired and unpaired Student's t-tests for equality of means. Results: Etomidate provided hemodynamic stability without the requirement of any rescue drug in 96.6% of patients, whereas the rescue drug ephedrine was required in 36.6% of patients in the propofol group. Reduced induction doses, 0.15 mg kg-1 for etomidate and 0.98 mg kg-1 for propofol, sufficed to give an adequate anaesthetic depth based on entropy. Conclusion: Etomidate provides more hemodynamic stability than propofol during induction and intubation. Reduced induction doses of etomidate and propofol titrated to entropy translated into increased hemodynamic stability for both drugs and sufficed to give an adequate anaesthetic depth. PMID:25948897

  13. Time domain structures in a colliding magnetic flux rope experiment

    NASA Astrophysics Data System (ADS)

    Tang, Shawn Wenjie; Gekelman, Walter; Dehaas, Timothy; Vincena, Steve; Pribyl, Patrick

    2017-10-01

    Electron phase-space holes, regions of positive potential on the scale of the Debye length, have been observed in auroras as well as in laboratory experiments. These potential structures, also known as Time Domain Structures (TDS), are packets of intense electric field spikes that have significant components parallel to the local magnetic field. In an ongoing investigation at UCLA, TDS were observed on the surface of two magnetized flux ropes produced within the Large Plasma Device (LAPD). A barium oxide (BaO) cathode was used to produce an 18 m long magnetized plasma column and a lanthanum hexaboride (LaB6) source was used to create 11 m long kink unstable flux ropes. Using two probes capable of measuring the local electric and magnetic fields, correlation analysis was performed on tens of thousands of these structures and their propagation velocities, probability distribution function and spatial distribution were determined. The TDS became abundant as the flux ropes collided and appear to emanate from the reconnection region in between them. In addition, a preliminary analysis of the permutation entropy and statistical complexity of the data suggests that the TDS signals may be chaotic in nature. Work done at the Basic Plasma Science Facility (BaPSF) at UCLA which is supported by DOE and NSF.
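    Since several records in this collection rely on permutation entropy and the associated statistical complexity, a compact sketch of both quantities is given below (normalized permutation entropy plus the Jensen-Shannon statistical complexity often paired with it); the embedding order and delay are illustrative choices, not those of the experiment.

    ```python
    import math
    from itertools import permutations
    import numpy as np

    def ordinal_distribution(x, m=5, tau=1):
        """Relative frequencies of ordinal patterns of order m in a 1-D series."""
        lookup = {p: i for i, p in enumerate(permutations(range(m)))}
        counts = np.zeros(math.factorial(m))
        for i in range(len(x) - (m - 1) * tau):
            pattern = tuple(int(k) for k in np.argsort(x[i:i + m * tau:tau]))
            counts[lookup[pattern]] += 1
        return counts / counts.sum()

    def shannon(p):
        q = p[p > 0]
        return -np.sum(q * np.log(q))

    def pe_and_complexity(x, m=5, tau=1):
        """Normalized permutation entropy H and Jensen-Shannon complexity C."""
        p = ordinal_distribution(np.asarray(x, float), m, tau)
        n = len(p)
        h = shannon(p) / np.log(n)
        uniform = np.full(n, 1.0 / n)
        js = shannon(0.5 * (p + uniform)) - 0.5 * shannon(p) - 0.5 * shannon(uniform)
        # Maximum JS divergence (delta distribution vs uniform) for normalization.
        delta = np.zeros(n)
        delta[0] = 1.0
        js_max = (shannon(0.5 * (delta + uniform))
                  - 0.5 * shannon(delta) - 0.5 * shannon(uniform))
        return h, (js / js_max) * h
    ```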

  14. A Hybrid Generalized Hidden Markov Model-Based Condition Monitoring Approach for Rolling Bearings

    PubMed Central

    Liu, Jie; Hu, Youmin; Wu, Bo; Wang, Yan; Xie, Fengyun

    2017-01-01

    The operating condition of rolling bearings affects productivity and quality in the rotating machine process. Developing an effective rolling bearing condition monitoring approach is critical to accurately identify the operating condition. In this paper, a hybrid generalized hidden Markov model-based condition monitoring approach for rolling bearings is proposed, where interval valued features are used to efficiently recognize and classify machine states in the machine process. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition (VMD). Parameters of the VMD, in the form of generalized intervals, provide a concise representation for aleatory and epistemic uncertainty and improve the robustness of identification. The multi-scale permutation entropy method is applied to extract state features from the decomposed signals in different operating conditions. Traditional principal component analysis is adopted to reduce feature size and computational cost. With the extracted features’ information, the generalized hidden Markov model, based on generalized interval probability, is used to recognize and classify the fault types and fault severity levels. Finally, the experiment results show that the proposed method is effective at recognizing and classifying the fault types and fault severity levels of rolling bearings. This monitoring method is also efficient enough to quantify the two uncertainty components. PMID:28524088
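    The multi-scale step consists of coarse-graining the signal at each scale and computing the permutation entropy of the coarse-grained series; a sketch is below, where permutation_entropy(x, m, tau) is an assumed helper (any standard implementation would do) and is not the authors' exact feature-extraction code.

    ```python
    import numpy as np

    def coarse_grain(x, scale):
        """Non-overlapping mean coarse-graining used in multi-scale entropy."""
        x = np.asarray(x, dtype=float)
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def multiscale_permutation_entropy(x, scales, permutation_entropy, m=3, tau=1):
        """Permutation entropy of the coarse-grained series at each scale (sketch);
        `permutation_entropy` is an assumed helper with signature (x, m, tau)."""
        return [permutation_entropy(coarse_grain(x, s), m, tau) for s in scales]
    ```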

  15. Analysis of EEG signals regularity in adults during video game play in 2D and 3D.

    PubMed

    Khairuddin, Hamizah R; Malik, Aamir S; Mumtaz, Wajid; Kamel, Nidal; Xia, Likun

    2013-01-01

    Video games have long been part of the entertainment industry. Nonetheless, it is not well known how video games affect us with the advancement of 3D technology. The purpose of this study is to investigate the regularity of EEG signals when playing video games in 2D and 3D modes. A total of 29 healthy subjects (24 male, 5 female) with a mean age of 21.79 (1.63) years participated. Subjects were asked to play a car racing video game in three different modes (2D, 3D passive and 3D active). In 3D passive mode, subjects wore passive polarized glasses (cinema type), while for 3D active, active shutter glasses were used. Scalp EEG data were recorded during game play using a 19-channel EEG machine with linked ears as reference. After the data were pre-processed, the signal irregularity for all conditions was computed. Two parameters were used to measure signal complexity for time series data: i) Hjorth complexity and ii) the Composite Permutation Entropy Index (CPEI). Based on these two parameters, our results showed that the complexity level increased from the eyes-closed to the eyes-open condition, and further increased in the case of 3D as compared to 2D game play.
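    The Hjorth complexity used here is a standard descriptor; a minimal version is sketched below (the CPEI, a composite of permutation entropies over several parameter settings, is omitted for brevity).

    ```python
    import numpy as np

    def hjorth_parameters(x):
        """Hjorth activity, mobility and complexity of a 1-D signal (sketch)."""
        x = np.asarray(x, dtype=float)
        dx = np.diff(x)
        ddx = np.diff(dx)
        activity = np.var(x)
        mobility = np.sqrt(np.var(dx) / np.var(x))
        # Complexity = mobility of the derivative divided by mobility of the signal.
        complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
        return activity, mobility, complexity
    ```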

  16. Optimized mixed Markov models for motif identification

    PubMed Central

    Huang, Weichun; Umbach, David M; Ohler, Uwe; Li, Leping

    2006-01-01

    Background Identifying functional elements, such as transcription factor binding sites, is a fundamental step in reconstructing gene regulatory networks and remains a challenging issue, largely due to limited availability of training samples. Results We introduce a novel and flexible model, the Optimized Mixture Markov model (OMiMa), and related methods to allow adjustment of model complexity for different motifs. In comparison with other leading methods, OMiMa can incorporate more than NNSplice's pairwise dependencies; OMiMa avoids model over-fitting better than the Permuted Variable Length Markov Model (PVLMM); and OMiMa requires smaller training samples than the Maximum Entropy Model (MEM). Testing on both simulated and actual data (regulatory cis-elements and splice sites), we found OMiMa's performance superior to the other leading methods in terms of prediction accuracy, required size of training data or computational time. Our OMiMa system, to our knowledge, is the only motif finding tool that incorporates automatic selection of the best model. OMiMa is freely available at [1]. Conclusion Our optimized mixture of Markov models represents an alternative to the existing methods for modeling dependent structures within a biological motif. Our model is conceptually simple and effective, and can improve prediction accuracy and/or computational speed over other leading methods. PMID:16749929

  17. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.

  18. Fractal Based Analysis of the Influence of Odorants on Heart Activity

    NASA Astrophysics Data System (ADS)

    Namazi, Hamidreza; Kulish, Vladimir V.

    2016-12-01

    An important challenge in heart research is to relate the features of external stimuli to heart activity. Olfactory stimulation is an important type of stimulation that affects heart activity, which is mapped onto the electrocardiogram (ECG) signal. Yet, no relation between the structure of olfactory stimuli and the ECG signal has been established. This study investigates the relation between the structure of the heart rate and that of the olfactory stimulus (odorant). We show that the complexity of the heart rate is coupled with the molecular complexity of the odorant, where a more structurally complex odorant causes a less fractal heart rate. Also, an odorant with higher entropy causes a heart rate with lower approximate entropy. The method discussed here can be applied and investigated in the case of patients with heart disease for rehabilitation purposes.

  19. Quantum image encryption based on restricted geometric and color transformations

    NASA Astrophysics Data System (ADS)

    Song, Xian-Hua; Wang, Shen; Abd El-Latif, Ahmed A.; Niu, Xia-Mu

    2014-08-01

    A novel encryption scheme for quantum images based on restricted geometric and color transformations is proposed. The new strategy combines efficient permutation and diffusion operations for quantum image encryption. The core idea of the permutation stage is to scramble the codes of the pixel positions through restricted geometric transformations. Then, a new quantum diffusion operation is implemented on the permuted quantum image based on restricted color transformations. The encryption keys of the two stages are generated by two sensitive chaotic maps, which ensure the security of the scheme. The final step, measurement, is built on the probabilistic model. Experiments based on statistical analysis demonstrate significant improvements in favor of the proposed approach.

  20. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.

    PubMed

    Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai

    2017-02-20

    Chaotic external-cavity semiconductor lasers (ECLs) are a promising entropy source for the generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using physical broadband white chaos generated by optical heterodyning of two ECLs as an entropy source to construct high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum without signatures of relaxation oscillation and external-cavity resonance but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which implies a spectrum efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
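    The final extraction step described here (keeping a few least significant bits of each 8-bit sample) can be illustrated with the sketch below; the sampling and digitization hardware are of course outside the scope of the snippet, and the example values are illustrative.

    ```python
    import numpy as np

    def extract_lsbs(samples, n_lsb=4):
        """Keep the n_lsb least significant bits of each 8-bit ADC sample and
        concatenate them into a bit stream (sketch of the extraction step)."""
        samples = np.asarray(samples, dtype=np.uint8)
        kept = samples & ((1 << n_lsb) - 1)                       # mask low bits
        bits = ((kept[:, None] >> np.arange(n_lsb - 1, -1, -1)) & 1).astype(np.uint8)
        return bits.ravel()                                        # n_lsb bits/sample

    # Example: 4 LSBs per sample at an 80-GHz sampling rate corresponds to the
    # 320-Gbps figure quoted in the record.
    print(extract_lsbs([0b10110101, 0b01001110], n_lsb=4))   # [0 1 0 1 1 1 1 0]
    ```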

  2. Thermodynamics of organisms in the context of dynamic energy budget theory.

    PubMed

    Sousa, Tânia; Mota, Rui; Domingos, Tiago; Kooijman, S A L M

    2006-11-01

    We carry out a thermodynamic analysis of an organism. It is applicable to any type of organism because (1) it is based on a thermodynamic formalism applicable to all open thermodynamic systems and (2) it uses a general model to describe the internal structure of the organism--the dynamic energy budget (DEB) model. Our results on the thermodynamics of DEB organisms are the following. (1) Thermodynamic constraints for the following types of organisms: (a) aerobic and exothermic, (b) anaerobic and exothermic, and (c) anaerobic and endothermic; showing that anaerobic organisms have a higher thermodynamic flexibility. (2) A way to compute the changes in the enthalpy and in the entropy of living biomass that accompany changes in growth rate, solving the problem of evaluating the thermodynamic properties of biomass as a function of the amount of reserves. (3) Two expressions for Thornton's coefficient that explain its experimental variability and theoretically underpin its use in metabolic studies. (4) A mechanism that organisms in non-steady state use to rid themselves of internal entropy production: "dilution of entropy production by growth." To demonstrate the practical applicability of DEB theory to quantify thermodynamic changes in organisms, we use published data on Klebsiella aerogenes growing aerobically in a continuous culture. We obtain different values for the molar entropies of the reserve and the structure of Klebsiella aerogenes, proving that the reserve density concept of DEB theory is essential in discussions concerning (a) the relationship between organization and entropy and (b) the mechanism of storing entropy in new biomass. Additionally, our results suggest that the entropy of dead biomass is significantly different from the entropy of living biomass.

  3. Modeling of groundwater productivity in northeastern Wasit Governorate, Iraq using frequency ratio and Shannon's entropy models

    NASA Astrophysics Data System (ADS)

    Al-Abadi, Alaa M.

    2017-05-01

    In recent years, the delineation of groundwater productivity zones has played an increasingly important role in the sustainable management of groundwater resources throughout the world. In this study, the groundwater productivity index (GWPI) of northeastern Wasit Governorate was delineated using probabilistic frequency ratio (FR) and Shannon's entropy models in a GIS framework. Eight factors believed to influence groundwater occurrence in the study area were selected and used as the input data. These factors were elevation (m), slope angle (degree), geology, soil, aquifer transmissivity (m2/d), storativity (dimensionless), distance to river (m), and distance to faults (m). In the first step, a borehole location inventory map consisting of 68 boreholes with relatively high yield (>8 l/sec) was prepared; 47 boreholes (70%) were used as training data and the remaining 21 (30%) were used for validation. The predictive capability of each model was determined using the relative operating characteristic technique. The results of the analysis indicate that the FR model, with a success rate of 87.4% and prediction rate of 86.9%, performed slightly better than the Shannon's entropy model, with a success rate of 84.4% and prediction rate of 82.4%. The resultant groundwater productivity index was classified into five classes using the natural break classification scheme: very low, low, moderate, high, and very high. The high-very high classes for the FR and Shannon's entropy models occurred within 30% (217 km2) and 31% (220 km2) of the area, respectively, indicating low productivity conditions of the aquifer system. Both models were capable of mapping the GWPI with very good results, but FR was better in terms of success and prediction rates. The results of this study could be helpful for better management of groundwater resources in the study area and give planners and decision makers an opportunity to prepare appropriate groundwater investment plans.

  4. Measuring Ambiguity in HLA Typing Methods

    PubMed Central

    Madbouly, Abeer; Freeman, John; Maiers, Martin

    2012-01-01

    In hematopoietic stem cell transplantation, donor selection is based primarily on matching donor and patient HLA genes. These genes are highly polymorphic and their typing can result in exact allele assignment at each gene (the resolution at which patients and donors are matched), but it can also result in a set of ambiguous assignments, depending on the typing methodology used. To facilitate rapid identification of matched donors, registries employ statistical algorithms to infer HLA alleles from ambiguous genotypes. Linkage disequilibrium information encapsulated in haplotype frequencies is used to facilitate prediction of the most likely haplotype assignment. An HLA typing with less ambiguity produces fewer high-probability haplotypes and a more reliable prediction. We estimated ambiguity for several HLA typing methods across four continental populations using an information theory-based measure, Shannon's entropy. We used allele and haplotype frequencies to calculate entropy for different sets of 1,000 subjects with simulated HLA typing. Using allele frequencies we calculated an average entropy in Caucasians of 1.65 for serology, 1.06 for allele family level, 0.49 for a 2002-era SSO kit, and 0.076 for single-pass SBT. When using haplotype frequencies in entropy calculations, we found average entropies of 0.72 for serology, 0.73 for allele family level, 0.05 for SSO, and 0.002 for single-pass SBT. Application of haplotype frequencies further reduces HLA typing ambiguity. We also estimated expected confirmatory typing mismatch rates for simulated subjects. In a hypothetical registry with all donors typed using the same method, the entropy values based on haplotype frequencies correspond to confirmatory typing mismatch rates of 1.31% for SSO versus only 0.08% for SBT. Intermediate-resolution single-pass SBT contains the least ambiguity of the methods we evaluated and therefore the most certainty in allele prediction. The presented measure objectively evaluates HLA typing methods and can help define acceptable HLA typing for donor recruitment. PMID:22952712
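    The ambiguity measure used here is the Shannon entropy of the probability distribution over the assignments consistent with an ambiguous typing; a minimal version with illustrative probabilities is sketched below (it is not tied to any particular allele or haplotype frequency table).

    ```python
    import numpy as np

    def typing_entropy(probs):
        """Shannon entropy (bits) of the possible assignments of one ambiguous typing."""
        p = np.asarray(probs, dtype=float)
        p = p / p.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    # An unambiguous typing carries 0 bits of ambiguity; four equally likely
    # assignments carry 2 bits.
    print(typing_entropy([1.0]), typing_entropy([0.25, 0.25, 0.25, 0.25]))
    ```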

  5. Activation energy and entropy for viscosity of wormlike micelle solutions.

    PubMed

    Chandler, H D

    2013-11-01

    The viscosities of two surfactant solutions which form wormlike micelles (WLMs) were studied over a range of temperatures and strain rates. WLM solutions appear to differ from many other shear-thinning systems in that, as the shear rate increases, stress-shear rate curves tend to converge with temperature rather than diverge, and this can sometimes lead to higher-temperature curves crossing those at lower temperatures. Behaviour was analysed in terms of activation kinetics. It is suggested that two mechanisms are involved: Newtonian flow following an Arrhenius law, superimposed on non-Newtonian flow described by a stress-assisted kinetic law, the latter being a more general form of the Arrhenius law. Anomalous flow is introduced into the kinetic equation via a stress-dependent activation entropy term. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. NASA thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Access Vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries, and pseudo-multiword terms that are permutations of words that contain words within words. The Access Vocabulary contains 40,738 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.

  7. NASA Thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Access Vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries, and pseudo-multiword terms that are permutations of words that contain words within words. The Access Vocabulary contains 40,661 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.

  8. Instability of Hierarchical Cluster Analysis Due to Input Order of the Data: The PermuCLUSTER Solution

    ERIC Educational Resources Information Center

    van der Kloot, Willem A.; Spaans, Alexander M. J.; Heiser, Willem J.

    2005-01-01

    Hierarchical agglomerative cluster analysis (HACA) may yield different solutions under permutations of the input order of the data. This instability is caused by ties, either in the initial proximity matrix or arising during agglomeration. The authors recommend to repeat the analysis on a large number of random permutations of the rows and columns…

  9. Optimal control of hybrid qubits: Implementing the quantum permutation algorithm

    NASA Astrophysics Data System (ADS)

    Rivera-Ruiz, C. M.; de Lima, E. F.; Fanchini, F. F.; Lopez-Richard, V.; Castelano, L. K.

    2018-03-01

    The optimal quantum control theory is employed to determine electric pulses capable of producing quantum gates with a fidelity higher than 0.9997 when noise is not taken into account. In particular, these quantum gates were chosen to perform the permutation algorithm in hybrid qubits in double quantum dots (DQDs). The permutation algorithm is an oracle-based quantum algorithm that solves the problem of permutation parity faster than a classical algorithm, without the need for entanglement between particles. The only requirement for achieving the speedup is the use of a one-particle quantum system with at least three levels. The high fidelity found in our results is closely related to the quantum speed limit, which is a measure of how fast a quantum state can be manipulated. Furthermore, we model charge noise by considering an average over the optimal field centered at different values of the reference detuning, which follows a Gaussian distribution. When the Gaussian spread is of the order of 5 μeV (10% of the correct value), the fidelity is still higher than 0.95. Our scheme can also be used for the practical realization of different quantum algorithms in DQDs.

  10. PsiQuaSP-A library for efficient computation of symmetric open quantum systems.

    PubMed

    Gegg, Michael; Richter, Marten

    2017-11-24

    In a recent publication we showed that permutation symmetry reduces the numerical complexity of Lindblad quantum master equations for identical multi-level systems from exponential to polynomial scaling. This is important for open-system dynamics including realistic system-bath interactions and dephasing in, for instance, the Dicke model, multi-Λ system setups, etc. Here we present an object-oriented C++ library that allows one to set up and solve arbitrary quantum optical Lindblad master equations, especially those that are permutationally symmetric in the multi-level systems. PsiQuaSP (Permutation symmetry for identical Quantum Systems Package) uses the PETSc package for sparse linear algebra methods and differential equations as its basis. The aim of PsiQuaSP is to provide flexible, storage-efficient and scalable code while being as user friendly as possible. It is easily applied to many quantum optical or quantum information systems with more than one multi-level system. We first review the basics of the permutation symmetry for multi-level systems in quantum master equations. The application of PsiQuaSP to quantum dynamical problems is illustrated with several typical, simple examples of open quantum optical systems.

  11. Experimental and analytical investigation of direct and indirect noise generated from non-isentropic boundaries

    NASA Astrophysics Data System (ADS)

    de Domenico, Francesca; Rolland, Erwan; Hochgreb, Simone

    2017-11-01

    Pressure fluctuations in combustors arise either directly from the heat release rate perturbations of the flame (direct noise), or indirectly from the acceleration of entropy, vorticity or compositional perturbations through nozzles or turbine guide vanes (indirect noise). In this work, the second mechanism is experimentally investigated in a simplified rig. Synthetic entropy spots are generated via the Joule effect or helium injection and then accelerated through orifice plates of different area contraction and thickness. The objective of the study is to parametrically analyse the entropy-to-sound conversion in non-isentropic contractions (e.g. with pressure losses), represented by the orifice plates. Acoustic measurements are performed to reconstruct the acoustic and entropic transfer functions of the orifices and to compare experimental data with analytical predictions, investigating the effect of orifice thickness and area ratio on the transfer functions. PIV measurements are performed to study the stretching and dispersion of the entropy waves due to mean flow effects. Secondly, PIV images taken in the jet exiting downstream of the orifices are used to investigate the coupling of the acoustic and entropy fields with the hydrodynamic field. EPSRC, Qualcomm.

  12. The Root Cause of the Overheating Problem

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing

    2017-01-01

    Previously we identified the receding flow, where two fluid streams recede from each other, as an open numerical problem, because all well-known numerical fluxes give an anomalous temperature rise, hence called the overheating problem. This phenomenon, although presented in several textbooks and many previous publications, has scarcely been satisfactorily addressed, and the root cause of the overheating problem is not well understood. We found that this temperature rise is solely connected to an entropy rise and proposed using the method of characteristics to eradicate the problem. However, the root cause of the entropy production was still unclear. In the present study, we identify the cause of this problem: the entropy rise is rooted in the pressure flux in a finite volume formulation and is implanted at the first time step. It is found to be theoretically inevitable for all existing numerical flux schemes used in the finite volume setting, as confirmed by numerical tests. This difficulty cannot be eliminated by manipulating time step, grid size, spatial accuracy, etc., although the rate of overheating depends on the flux scheme used. Finally, we incorporate the entropy transport equation, in place of the energy equation, to ensure preservation of entropy, thus correcting this temperature anomaly. Its applicability is demonstrated for some relevant 1D and 2D problems. Thus, the present study validates that the entropy generated ab initio is the genesis of the overheating problem.

  13. Numerical study of entropy generation in MHD water-based carbon nanotubes along an inclined permeable surface

    NASA Astrophysics Data System (ADS)

    Soomro, Feroz Ahmed; Rizwan-ul-Haq; Khan, Z. H.; Zhang, Qiang

    2017-10-01

    The main theme of this article is to examine the entropy generation analysis for the magneto-hydrodynamic mixed convection flow of water-functionalized carbon nanotubes along an inclined stretching surface. Thermophysical properties of both the particles and the working fluid are incorporated in the system of governing partial differential equations. The nonlinear system of equations is reduced via similarity transformations. Moreover, solutions of these equations are further utilized to determine the volumetric entropy and characteristic entropy generation. Solutions of the governing boundary layer equations are obtained numerically using the finite difference method. The effects of two types of carbon nanotubes, namely single-wall carbon nanotubes (SWCNTs) and multi-wall carbon nanotubes (MWCNTs), with water as the base fluid have been analyzed for the physical quantities of interest, namely surface skin friction, heat transfer rate and entropy generation coefficients. Results for the velocities, temperature, entropy generation and isotherms are plotted against the emerging parameters, namely nanoparticle fraction 0 ≤ φ ≤ 0.2, thermal convective parameter 0 ≤ λ ≤ 5, Hartmann number 0 ≤ M ≤ 2, suction/injection parameter -1 ≤ S ≤ 1, and Eckert number 0 ≤ Ec ≤ 2. It is finally concluded that the skin friction increases with the magnetic parameter, suction/injection and nanoparticle volume fraction, whereas the Nusselt number shows an increasing trend with the suction parameter, mixed convection parameter and nanoparticle volume fraction. Similarly, entropy generation shows opposite behavior for the Hartmann number and the mixed convection parameter for both single-wall and multi-wall carbon nanotubes.

  14. Controllability of symmetric spin networks

    NASA Astrophysics Data System (ADS)

    Albertini, Francesca; D'Alessandro, Domenico

    2018-05-01

    We consider a network of n spin 1/2 systems which are pairwise interacting via Ising interaction and are controlled by the same electro-magnetic control field. Such a system presents symmetries since the Hamiltonian is unchanged if we permute two spins. This prevents full (operator) controllability, in that not every unitary evolution can be obtained. We prove however that controllability is verified if we restrict ourselves to unitary evolutions which preserve the above permutation invariance. For low dimensional cases, n = 2 and n = 3, we provide an analysis of the Lie group of available evolutions and give explicit control laws to transfer between two arbitrary permutation invariant states. This class of states includes highly entangled states such as Greenberger-Horne-Zeilinger (GHZ) states and W states, which are of interest in quantum information.

  15. Storage and computationally efficient permutations of factorized covariance and square-root information matrices

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector-stored upper-triangular diagonal factorized covariance (UD) and vector stored upper-triangular square-root information filter (SRIF) arrays is presented. The method involves cyclical permutation of the rows and columns of the arrays and retriangularization with appropriate square-root-free fast Givens rotations or elementary slow Givens reflections. A minimal amount of computation is performed and only one scratch vector of size N is required, where N is the column dimension of the arrays. To make the method efficient for large SRIF arrays on a virtual memory machine, three additional scratch vectors each of size N are used to avoid expensive paging faults. The method discussed is compared with the methods and routines of Bierman's Estimation Subroutine Library (ESL).

  16. Megahertz-Rate Semi-Device-Independent Quantum Random Number Generators Based on Unambiguous State Discrimination

    NASA Astrophysics Data System (ADS)

    Brask, Jonatan Bohr; Martin, Anthony; Esposito, William; Houlmann, Raphael; Bowles, Joseph; Zbinden, Hugo; Brunner, Nicolas

    2017-05-01

    An approach to quantum random number generation based on unambiguous quantum state discrimination is developed. We consider a prepare-and-measure protocol, where two nonorthogonal quantum states can be prepared, and a measurement device aims at unambiguously discriminating between them. Because the states are nonorthogonal, this necessarily leads to a minimal rate of inconclusive events whose occurrence must be genuinely random and which provide the randomness source that we exploit. Our protocol is semi-device-independent in the sense that the output entropy can be lower bounded based on experimental data and a few general assumptions about the setup alone. It is also practically relevant, which we demonstrate by realizing a simple optical implementation, achieving rates of 16.5 Mbit/s. Combining ease of implementation, a high rate, and a real-time entropy estimation, our protocol represents a promising approach intermediate between fully device-independent protocols and commercial quantum random number generators.

  17. Mathematical model for thermal and entropy analysis of thermal solar collectors by using Maxwell nanofluids with slip conditions, thermal radiation and variable thermal conductivity

    NASA Astrophysics Data System (ADS)

    Aziz, Asim; Jamshed, Wasim; Aziz, Taha

    2018-04-01

    In the present research a simplified mathematical model for the solar thermal collectors is considered in the form of non-uniform unsteady stretching surface. The non-Newtonian Maxwell nanofluid model is utilized for the working fluid along with slip and convective boundary conditions and comprehensive analysis of entropy generation in the system is also observed. The effect of thermal radiation and variable thermal conductivity are also included in the present model. The mathematical formulation is carried out through a boundary layer approach and the numerical computations are carried out for Cu-water and TiO2-water nanofluids. Results are presented for the velocity, temperature and entropy generation profiles, skin friction coefficient and Nusselt number. The discussion is concluded on the effect of various governing parameters on the motion, temperature variation, entropy generation, velocity gradient and the rate of heat transfer at the boundary.

  18. Ectopic beats in approximate entropy and sample entropy-based HRV assessment

    NASA Astrophysics Data System (ADS)

    Singh, Butta; Singh, Dilbag; Jaryal, A. K.; Deepak, K. K.

    2012-05-01

Approximate entropy (ApEn) and sample entropy (SampEn) are promising techniques for extracting complex characteristics of cardiovascular variability. Ectopic beats, which originate from sites other than the normal pacemaker site, are artefacts that seriously limit heart rate variability (HRV) analysis. Approaches such as deletion and interpolation are currently used to eliminate the bias produced by ectopic beats. In this study, normal R-R interval time series of 10 healthy subjects and 10 acute myocardial infarction (AMI) patients were analysed after inserting artificial ectopic beats. The effects of editing ectopic beats by deletion, degree-zero interpolation and degree-one interpolation on ApEn and SampEn were then assessed. Adding ectopic beats (even 2%) led to reduced complexity, i.e. decreased ApEn and SampEn, for both healthy and AMI patient data, and this reduction depended on the level of ectopic beats. Editing ectopic beats by degree-one interpolation was found to be superior to the other methods.
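
    A minimal sketch, on a hypothetical RR series with a simplified ectopic-beat model, of the two editing strategies compared here (deletion versus degree-one, i.e. linear, interpolation); ApEn or SampEn would then be computed on the edited series:

      import numpy as np

      rng = np.random.default_rng(1)
      rr = 800 + 40 * rng.standard_normal(500)      # hypothetical normal RR series (ms)

      # Insert artificial "ectopic" beats at 2% of positions: a premature beat
      # shortens one interval and lengthens the next (simplified model).
      ect = rng.choice(len(rr) - 1, size=int(0.02 * len(rr)), replace=False)
      rr_ect = rr.copy()
      rr_ect[ect] *= 0.6
      rr_ect[ect + 1] *= 1.4

      def edit_delete(x, idx):
          """Editing by deletion: drop the intervals affected by ectopy."""
          bad = np.union1d(idx, idx + 1)
          return np.delete(x, bad)

      def edit_interp_degree1(x, idx):
          """Editing by degree-one interpolation: linear fit across affected samples."""
          bad = np.union1d(idx, idx + 1)
          good = np.setdiff1d(np.arange(len(x)), bad)
          y = x.copy()
          y[bad] = np.interp(bad, good, x[good])
          return y

      rr_del = edit_delete(rr_ect, ect)
      rr_lin = edit_interp_degree1(rr_ect, ect)
      # ApEn/SampEn computed on rr_ect, rr_del and rr_lin would reproduce the
      # kind of comparison reported in the abstract.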

  19. A pairwise maximum entropy model accurately describes resting-state human brain networks

    PubMed Central

    Watanabe, Takamitsu; Hirose, Satoshi; Wada, Hiroyuki; Imai, Yoshio; Machida, Toru; Shirouzu, Ichiro; Konishi, Seiki; Miyashita, Yasushi; Masuda, Naoki

    2013-01-01

    The resting-state human brain networks underlie fundamental cognitive functions and consist of complex interactions among brain regions. However, the level of complexity of the resting-state networks has not been quantified, which has prevented comprehensive descriptions of the brain activity as an integrative system. Here, we address this issue by demonstrating that a pairwise maximum entropy model, which takes into account region-specific activity rates and pairwise interactions, can be robustly and accurately fitted to resting-state human brain activities obtained by functional magnetic resonance imaging. Furthermore, to validate the approximation of the resting-state networks by the pairwise maximum entropy model, we show that the functional interactions estimated by the pairwise maximum entropy model reflect anatomical connexions more accurately than the conventional functional connectivity method. These findings indicate that a relatively simple statistical model not only captures the structure of the resting-state networks but also provides a possible method to derive physiological information about various large-scale brain networks. PMID:23340410
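
    The pairwise maximum entropy model referred to is the Ising-type distribution P(σ) ∝ exp(Σ_i h_i σ_i + Σ_{i<j} J_ij σ_i σ_j) over binarized regional activity patterns. A minimal sketch (toy ±1 data, exact enumeration, plain gradient ascent, feasible only for a handful of regions) of fitting h and J by matching the empirical means and pairwise correlations:

      import numpy as np
      from itertools import product

      rng = np.random.default_rng(2)
      N, T = 5, 2000
      data = (rng.random((T, N)) < 0.4).astype(float) * 2 - 1   # toy +/-1 activity patterns

      emp_m = data.mean(axis=0)                  # empirical means <sigma_i>
      emp_C = (data.T @ data) / T                # empirical pairwise <sigma_i sigma_j>

      states = np.array(list(product([-1, 1], repeat=N)), dtype=float)   # all 2^N patterns

      def model_moments(h, J):
          E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
          p = np.exp(E - E.max()); p /= p.sum()
          m = p @ states
          C = states.T @ (states * p[:, None])
          return m, C

      h = np.zeros(N)
      J = np.zeros((N, N))
      for _ in range(2000):                      # plain gradient ascent on the log-likelihood
          m, C = model_moments(h, J)
          h += 0.1 * (emp_m - m)
          dJ = 0.1 * (emp_C - C)
          np.fill_diagonal(dJ, 0.0)              # no self-couplings
          J += dJ

      print(np.max(np.abs(model_moments(h, J)[0] - emp_m)))   # small after convergence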

  20. Cardiorespiratory Information Dynamics during Mental Arithmetic and Sustained Attention

    PubMed Central

    Widjaja, Devy; Montalto, Alessandro; Vlemincx, Elke; Marinazzo, Daniele; Van Huffel, Sabine; Faes, Luca

    2015-01-01

    An analysis of cardiorespiratory dynamics during mental arithmetic, which induces stress, and sustained attention was conducted using information theory. The information storage and internal information of heart rate variability (HRV) were determined respectively as the self-entropy of the tachogram, and the self-entropy of the tachogram conditioned to the knowledge of respiration. The information transfer and cross information from respiration to HRV were assessed as the transfer and cross-entropy, both measures of cardiorespiratory coupling. These information-theoretic measures identified significant nonlinearities in the cardiorespiratory time series. Additionally, it was shown that, although mental stress is related to a reduction in vagal activity, no difference in cardiorespiratory coupling was found when several mental states (rest, mental stress, sustained attention) are compared. However, the self-entropy of HRV conditioned to respiration was very informative to study the predictability of RR interval series during mental tasks, and showed higher predictability during mental arithmetic compared to sustained attention or rest. PMID:26042824

  1. Multiscale entropy analysis of heart rate variability in heart failure, hypertensive, and sinoaortic-denervated rats: classical and refined approaches.

    PubMed

    Silva, Luiz Eduardo Virgilio; Lataro, Renata Maria; Castania, Jaci Airton; da Silva, Carlos Alberto Aguiar; Valencia, Jose Fernando; Murta, Luiz Otavio; Salgado, Helio Cesar; Fazan, Rubens; Porta, Alberto

    2016-07-01

    The analysis of heart rate variability (HRV) by nonlinear methods has been gaining increasing interest due to their ability to quantify the complexity of cardiovascular regulation. In this study, multiscale entropy (MSE) and refined MSE (RMSE) were applied to track the complexity of HRV as a function of time scale in three pathological conscious animal models: rats with heart failure (HF), spontaneously hypertensive rats (SHR), and rats with sinoaortic denervation (SAD). Results showed that HF did not change HRV complexity, although there was a tendency to decrease the entropy in HF animals. On the other hand, SHR group was characterized by reduced complexity at long time scales, whereas SAD animals exhibited a smaller short- and long-term irregularity. We propose that short time scales (1 to 4), accounting for fast oscillations, are more related to vagal and respiratory control, whereas long time scales (5 to 20), accounting for slow oscillations, are more related to sympathetic control. The increased sympathetic modulation is probably the main reason for the lower entropy observed at high scales for both SHR and SAD groups, acting as a negative factor for the cardiovascular complexity. This study highlights the contribution of the multiscale complexity analysis of HRV for understanding the physiological mechanisms involved in cardiovascular regulation. Copyright © 2016 the American Physiological Society.
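
    For reference, a compact sketch of the classical multiscale entropy procedure used here: coarse-grain the RR series at each scale by non-overlapping averaging, then compute the sample entropy of each coarse-grained series (refined MSE modifies the coarse-graining and tolerance steps and is not shown):

      import numpy as np

      def sample_entropy(x, m=2, tol=None, r=0.2):
          """SampEn(m): -ln of the conditional probability that templates close at
          length m remain close at length m + 1 (Chebyshev distance, no self-matches)."""
          x = np.asarray(x, float)
          if tol is None:
              tol = r * x.std()
          def matches(mm):
              emb = np.array([x[i:i + mm] for i in range(len(x) - m)])   # same count for m, m+1
              d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
              return (d <= tol).sum() - len(emb)                         # exclude self-matches
          A, B = matches(m + 1), matches(m)
          return -np.log(A / B) if A > 0 and B > 0 else np.nan

      def multiscale_entropy(x, scales=range(1, 21), m=2, r=0.2):
          """Classical MSE: non-overlapping averaging, tolerance fixed at r*SD of the original series."""
          x = np.asarray(x, float)
          tol = r * x.std()
          out = []
          for tau in scales:
              n = len(x) // tau
              cg = x[:n * tau].reshape(n, tau).mean(axis=1)              # coarse-graining
              out.append(sample_entropy(cg, m=m, tol=tol))
          return np.array(out)

      rr = 800.0 + 40.0 * np.random.default_rng(3).standard_normal(1500)   # hypothetical RR series (ms)
      print(multiscale_entropy(rr, scales=range(1, 6)))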

  2. Soil Bacteria and Fungi Respond on Different Spatial Scales to Invasion by the Legume Lespedeza cuneata

    DTIC Science & Technology

    2011-05-24

… of community similarity (Legendre and Legendre 1998). Permutational Multivariate Analysis of Variance (PerMANOVA) (McArdle… null hypothesis can be rejected with a type I error rate of α. We used an implementation of PerMANOVA that involved sequential removal… TEXTURE, and HABITAT. The null distribution for PerMANOVA tests for site-scale effects was generated using a restricted…

  3. Statistical physics of self-replication.

    PubMed

    England, Jeremy L

    2013-09-28

    Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids.

  4. NASA thesaurus. Volume 2: Access vocabulary

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The access vocabulary, which is essentially a permuted index, provides access to any word or number in authorized postable and nonpostable terms. Additional entries include postable and nonpostable terms, other word entries and pseudo-multiword terms that are permutations of words that contain words within words. The access vocabulary contains almost 42,000 entries that give increased access to the hierarchies in Volume 1 - Hierarchical Listing.
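
    A permuted index of this kind can be generated mechanically: each multiword term contributes one entry per word, rotated so that every word in turn leads the entry. A toy sketch with made-up terms (not the actual NASA Thesaurus data):

      # Toy sketch of building a permuted index from multiword terms.
      terms = ["HEAT TRANSFER COEFFICIENTS", "LAMINAR HEAT TRANSFER", "ENTROPY"]

      entries = []
      for term in terms:
          words = term.split()
          for i, word in enumerate(words):
              rotated = " ".join(words[i:] + words[:i])   # cyclic permutation of the words
              entries.append((word, rotated, term))

      for word, rotated, term in sorted(entries):
          print(f"{word:14s} {rotated:30s} -> {term}")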

  5. Genomic Analysis of Complex Microbial Communities in Wounds

    DTIC Science & Technology

    2012-01-01

… thoroughly in the ecology literature. Permutation Multivariate Analysis of Variance (PerMANOVA). We used PerMANOVA to test the null hypothesis of no… difference between the bacterial communities found within a single wound compared to those from different patients (α = 0.05). PerMANOVA is a… permutation-based version of the multivariate analysis of variance (MANOVA). PerMANOVA uses the distances between samples to partition variance and…

  6. Circular permutation of the starch-binding domain: inversion of ligand selectivity with increased affinity.

    PubMed

    Stephen, Preyesh; Tseng, Kai-Li; Liu, Yu-Nan; Lyu, Ping-Chiang

    2012-03-07

Proteins containing starch-binding domains (SBDs) are used in a variety of scientific and technological applications. A circularly permutated SBD (CP90) with improved affinity and selectivity toward longer-chain carbohydrates was synthesized, suggesting that a new starch-binding protein may be developed for specific scientific and industrial applications.

  7. Sampling solution traces for the problem of sorting permutations by signed reversals

    PubMed Central

    2012-01-01

    Background Traditional algorithms to solve the problem of sorting by signed reversals output just one optimal solution while the space of all optimal solutions can be huge. A so-called trace represents a group of solutions which share the same set of reversals that must be applied to sort the original permutation following a partial ordering. By using traces, we therefore can represent the set of optimal solutions in a more compact way. Algorithms for enumerating the complete set of traces of solutions were developed. However, due to their exponential complexity, their practical use is limited to small permutations. A partial enumeration of traces is a sampling of the complete set of traces and can be an alternative for the study of distinct evolutionary scenarios of big permutations. Ideally, the sampling should be done uniformly from the space of all optimal solutions. This is however conjectured to be ♯P-complete. Results We propose and evaluate three algorithms for producing a sampling of the complete set of traces that instead can be shown in practice to preserve some of the characteristics of the space of all solutions. The first algorithm (RA) performs the construction of traces through a random selection of reversals on the list of optimal 1-sequences. The second algorithm (DFALT) consists in a slight modification of an algorithm that performs the complete enumeration of traces. Finally, the third algorithm (SWA) is based on a sliding window strategy to improve the enumeration of traces. All proposed algorithms were able to enumerate traces for permutations with up to 200 elements. Conclusions We analysed the distribution of the enumerated traces with respect to their height and average reversal length. Various works indicate that the reversal length can be an important aspect in genome rearrangements. The algorithms RA and SWA show a tendency to lose traces with high average reversal length. Such traces are however rare, and qualitatively our results show that, for testable-sized permutations, the algorithms DFALT and SWA produce distributions which approximate the reversal length distributions observed with a complete enumeration of the set of traces. PMID:22704580
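
    For reference, the elementary operation these algorithms enumerate is the signed reversal ρ(i, j), which reverses the order and flips the signs of the elements between positions i and j. A minimal sketch (0-based indices):

      def signed_reversal(perm, i, j):
          """Apply the signed reversal rho(i, j) (0-based, inclusive) to a signed permutation."""
          return perm[:i] + [-x for x in reversed(perm[i:j + 1])] + perm[j + 1:]

      # Example: an optimal two-reversal scenario sorting a small signed permutation.
      p = [-3, 1, 2]
      p = signed_reversal(p, 0, 2)   # -> [-2, -1, 3]
      p = signed_reversal(p, 0, 1)   # -> [1, 2, 3], the sorted identity
      print(p)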

  8. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test

    PubMed Central

    2013-01-01

    Background The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective and those methods currently implemented are often permutation-based. Results One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, both of whose power does not vary as a function of LD structure. One method uses dimension reduction to “filter” redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. Conclusion We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the summary-statistic based approach. We also implement the summary-statistic test using Z-statistics from an already-published GWAS of Chronic Obstructive Pulmonary Disorder (COPD) and correlation structure obtained from HapMap. We experiment with the modification of this test because the correlation structure is assumed imperfectly known. PMID:24199751
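
    A generic quadratic-form version of this idea (a hedged stand-in, not necessarily the authors' exact statistic) combines the marker Z-statistics through their estimated correlation (LD) matrix Σ; under the null, and when Σ is correctly specified, T = z'Σ⁻¹z is chi-squared with k degrees of freedom:

      import numpy as np
      from scipy import stats

      # Hypothetical inputs: per-marker Z-statistics from univariate regressions and
      # an estimate of the marker correlation (LD) matrix, e.g. from a reference panel.
      z = np.array([1.9, 2.3, 0.4, -1.1])
      Sigma = np.array([[1.0, 0.6, 0.2, 0.0],
                        [0.6, 1.0, 0.3, 0.1],
                        [0.2, 0.3, 1.0, 0.4],
                        [0.0, 0.1, 0.4, 1.0]])

      T = z @ np.linalg.solve(Sigma, z)          # quadratic form z' Sigma^{-1} z
      p_value = stats.chi2.sf(T, df=len(z))      # valid only if Sigma is well specified;
      print(T, p_value)                          # a misspecified Sigma needs the paper's modification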

  9. Properties of permutation-based gene tests and controlling type 1 error using a summary statistic based gene test.

    PubMed

    Swanson, David M; Blacker, Deborah; Alchawa, Taofik; Ludwig, Kerstin U; Mangold, Elisabeth; Lange, Christoph

    2013-11-07

    The advent of genome-wide association studies has led to many novel disease-SNP associations, opening the door to focused study on their biological underpinnings. Because of the importance of analyzing these associations, numerous statistical methods have been devoted to them. However, fewer methods have attempted to associate entire genes or genomic regions with outcomes, which is potentially more useful knowledge from a biological perspective and those methods currently implemented are often permutation-based. One property of some permutation-based tests is that their power varies as a function of whether significant markers are in regions of linkage disequilibrium (LD) or not, which we show from a theoretical perspective. We therefore develop two methods for quantifying the degree of association between a genomic region and outcome, both of whose power does not vary as a function of LD structure. One method uses dimension reduction to "filter" redundant information when significant LD exists in the region, while the other, called the summary-statistic test, controls for LD by scaling marker Z-statistics using knowledge of the correlation matrix of markers. An advantage of this latter test is that it does not require the original data, but only their Z-statistics from univariate regressions and an estimate of the correlation structure of markers, and we show how to modify the test to protect the type 1 error rate when the correlation structure of markers is misspecified. We apply these methods to sequence data of oral cleft and compare our results to previously proposed gene tests, in particular permutation-based ones. We evaluate the versatility of the modification of the summary-statistic test since the specification of correlation structure between markers can be inaccurate. We find a significant association in the sequence data between the 8q24 region and oral cleft using our dimension reduction approach and a borderline significant association using the summary-statistic based approach. We also implement the summary-statistic test using Z-statistics from an already-published GWAS of Chronic Obstructive Pulmonary Disorder (COPD) and correlation structure obtained from HapMap. We experiment with the modification of this test because the correlation structure is assumed imperfectly known.

  10. Permutation glass.

    PubMed

    Williams, Mobolaji

    2018-01-01

    The field of disordered systems in statistical physics provides many simple models in which the competing influences of thermal and nonthermal disorder lead to new phases and nontrivial thermal behavior of order parameters. In this paper, we add a model to the subject by considering a disordered system where the state space consists of various orderings of a list. As in spin glasses, the disorder of such "permutation glasses" arises from a parameter in the Hamiltonian being drawn from a distribution of possible values, thus allowing nominally "incorrect orderings" to have lower energies than "correct orderings" in the space of permutations. We analyze a Gaussian, uniform, and symmetric Bernoulli distribution of energy costs, and, by employing Jensen's inequality, derive a simple condition requiring the permutation glass to always transition to the correctly ordered state at a temperature lower than that of the nondisordered system, provided that this correctly ordered state is accessible. We in turn find that in order for the correctly ordered state to be accessible, the probability that an incorrectly ordered component is energetically favored must be less than the inverse of the number of components in the system. We show that all of these results are consistent with a replica symmetric ansatz of the system. We conclude by arguing that there is no distinct permutation glass phase for the simplest model considered here and by discussing how to extend the analysis to more complex Hamiltonians capable of novel phase behavior and replica symmetry breaking. Finally, we outline an apparent correspondence between the presented system and a discrete-energy-level fermion gas. In all, the investigation introduces a class of exactly soluble models into statistical mechanics and provides a fertile ground to investigate statistical models of disorder.

  11. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    PubMed

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from the next generation sequencing technology based studies. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.

  12. Multiscale entropy-based methods for heart rate variability complexity analysis

    NASA Astrophysics Data System (ADS)

    Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio

    2015-03-01

Physiologic complexity is an important concept for characterizing time series from biological systems, which, combined with multiscale analysis, can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as a function of time scale. The complexity metrics are derived from nonadditive entropy supported by the generation of surrogate data, i.e. SDiffqmax, qmax and qzero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the area under the curves was computed for three physiological situations. Heart rate variability (HRV) time series from normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiffqmax is the most accurate for lower scales, whereas qmax and qzero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in a wider context.

  13. Information Theory to Probe Intrapartum Fetal Heart Rate Dynamics

    NASA Astrophysics Data System (ADS)

    Granero-Belinchon, Carlos; Roux, Stéphane; Abry, Patrice; Doret, Muriel; Garnier, Nicolas

    2017-11-01

Intrapartum fetal heart rate (FHR) monitoring constitutes a reference tool in clinical practice to assess the baby's health status and to detect fetal acidosis. It is usually analyzed by visual inspection grounded on FIGO criteria. Characterization of intrapartum fetal heart rate temporal dynamics remains a challenging task and continuously receives academic research efforts. Complexity measures, often implemented with tools referred to as Approximate Entropy (ApEn) or Sample Entropy (SampEn), have regularly been reported as significant features for intrapartum FHR analysis. We explore how Information Theory, and especially auto mutual information (AMI), is connected to ApEn and SampEn and can be used to probe FHR dynamics. Applied to a large (1404 subjects) and documented database of FHR data, collected in a French academic hospital, it is shown that i) auto mutual information outperforms ApEn and SampEn for acidosis detection in the first stage of labor and continues to yield the best performance in the second stage; ii) Shannon entropy increases as labor progresses, and is always much larger in the second stage; iii) babies suffering from fetal acidosis additionally show more structured temporal dynamics than healthy ones, and this progressive structuring can be used for early acidosis detection.
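
    Auto mutual information at lag τ is the mutual information between x(t) and x(t+τ); a basic histogram estimate on a hypothetical FHR-like series (crude binning, illustration only) looks like this:

      import numpy as np

      def auto_mutual_information(x, lag, bins=16):
          """Histogram estimate of I(x_t ; x_{t+lag}) in bits."""
          a, b = x[:-lag], x[lag:]
          pxy, _, _ = np.histogram2d(a, b, bins=bins)
          pxy /= pxy.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          nz = pxy > 0
          return np.sum(pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz]))

      rng = np.random.default_rng(4)
      fhr = 140 + 0.1 * np.cumsum(rng.standard_normal(5000))    # hypothetical FHR-like series (bpm)
      print([round(auto_mutual_information(fhr, lag), 3) for lag in (1, 5, 20)])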

  14. Stability, Consistency and Performance of Distribution Entropy in Analysing Short Length Heart Rate Variability (HRV) Signal.

    PubMed

    Karmakar, Chandan; Udhayakumar, Radhagayathri K; Li, Peng; Venkatesh, Svetha; Palaniswami, Marimuthu

    2017-01-01

Distribution entropy (DistEn) is a recently developed measure of complexity that is used to analyse heart rate variability (HRV) data. Its calculation requires two input parameters: the embedding dimension m and the number of bins M, which replaces the tolerance parameter r used by the existing approximate entropy (ApEn) and sample entropy (SampEn) measures. The performance of DistEn can also be affected by the data length N. In our previous studies, we have analyzed the stability and performance of DistEn with respect to one parameter (m or M) or a combination of two parameters (N and M). However, the impact of varying all three input parameters on DistEn has not yet been studied. Since DistEn is predominantly aimed at analysing short-length heart rate variability (HRV) signals, it is important to comprehensively study the stability, consistency and performance of the measure using multiple case studies. In this study, we examined the impact of changing input parameters on DistEn for synthetic and physiological signals. We also compared the variations of DistEn, and its performance in distinguishing physiological (Elderly from Young) and pathological (Healthy from Arrhythmia) conditions, with ApEn and SampEn. The results showed that DistEn values are minimally affected by variations of the input parameters compared to ApEn and SampEn. DistEn also showed the most consistent and the best performance in differentiating physiological and pathological conditions under varying input parameters among the reported complexity measures. In conclusion, DistEn is found to be the best measure for analysing short-length HRV time series.
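
    For reference, a minimal sketch of the DistEn computation as it is usually described (embed with dimension m, collect all inter-vector Chebyshev distances, histogram them into M bins, and take the normalized Shannon entropy of that histogram); the parameter values here are arbitrary:

      import numpy as np

      def dist_en(x, m=2, M=256):
          """Distribution entropy of a (short) time series, normalized to [0, 1]."""
          x = np.asarray(x, float)
          emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
          d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)   # Chebyshev distances
          d = d[np.triu_indices_from(d, k=1)]                             # exclude self-comparisons
          p, _ = np.histogram(d, bins=M)
          p = p / p.sum()
          nz = p > 0
          return -np.sum(p[nz] * np.log2(p[nz])) / np.log2(M)

      rr = 800 + 40 * np.random.default_rng(5).standard_normal(300)       # hypothetical short RR series
      print(dist_en(rr, m=2, M=256))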

  15. Stability, Consistency and Performance of Distribution Entropy in Analysing Short Length Heart Rate Variability (HRV) Signal

    PubMed Central

    Karmakar, Chandan; Udhayakumar, Radhagayathri K.; Li, Peng; Venkatesh, Svetha; Palaniswami, Marimuthu

    2017-01-01

Distribution entropy (DistEn) is a recently developed measure of complexity that is used to analyse heart rate variability (HRV) data. Its calculation requires two input parameters: the embedding dimension m and the number of bins M, which replaces the tolerance parameter r used by the existing approximate entropy (ApEn) and sample entropy (SampEn) measures. The performance of DistEn can also be affected by the data length N. In our previous studies, we have analyzed the stability and performance of DistEn with respect to one parameter (m or M) or a combination of two parameters (N and M). However, the impact of varying all three input parameters on DistEn has not yet been studied. Since DistEn is predominantly aimed at analysing short-length heart rate variability (HRV) signals, it is important to comprehensively study the stability, consistency and performance of the measure using multiple case studies. In this study, we examined the impact of changing input parameters on DistEn for synthetic and physiological signals. We also compared the variations of DistEn, and its performance in distinguishing physiological (Elderly from Young) and pathological (Healthy from Arrhythmia) conditions, with ApEn and SampEn. The results showed that DistEn values are minimally affected by variations of the input parameters compared to ApEn and SampEn. DistEn also showed the most consistent and the best performance in differentiating physiological and pathological conditions under varying input parameters among the reported complexity measures. In conclusion, DistEn is found to be the best measure for analysing short-length HRV time series. PMID:28979215

  16. Estimating the mutual information of an EEG-based Brain-Computer Interface.

    PubMed

    Schlögl, A; Neuper, C; Pfurtscheller, G

    2002-01-01

An EEG-based Brain-Computer Interface (BCI) could be used as an additional communication channel between human thoughts and the environment. The efficacy of such a BCI depends mainly on the transmitted information rate. Shannon's communication theory was used to quantify the information rate of BCI data. For this purpose, experimental EEG data from four BCI experiments were analyzed off-line. Subjects imagined left and right hand movements during EEG recording from the sensorimotor area. Adaptive autoregressive (AAR) parameters were used as features of single trial EEG and classified with linear discriminant analysis. The intra-trial variation as well as the inter-trial variability, the signal-to-noise ratio, the entropy of information, and the information rate were estimated. The entropy difference was used as a measure of the separability of two classes of EEG patterns.
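
    One simple way to turn two-class separability into an information rate, in the spirit of the entropy-difference measure mentioned here, is the Gaussian-channel estimate I ≈ ½ log₂(1 + SNR), with the SNR taken as the ratio of between-class to within-class variance of the classifier output (a sketch under a Gaussian assumption, not necessarily the paper's exact estimator):

      import numpy as np

      def gaussian_information(y_left, y_right):
          """Rough mutual information (bits/trial) between class and classifier output."""
          within = 0.5 * (np.var(y_left) + np.var(y_right))
          between = 0.25 * (np.mean(y_left) - np.mean(y_right)) ** 2
          snr = between / within
          return 0.5 * np.log2(1.0 + snr)

      rng = np.random.default_rng(6)
      y_left = rng.normal(-1.0, 1.0, 200)     # hypothetical discriminant outputs, left-hand trials
      y_right = rng.normal(+1.0, 1.0, 200)    # hypothetical discriminant outputs, right-hand trials
      bits_per_trial = gaussian_information(y_left, y_right)
      print(bits_per_trial)                   # multiply by the trial rate to get bits per second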

  17. How to think about indiscernible particles

    NASA Astrophysics Data System (ADS)

    Giglio, Daniel Joseph

    Permutation symmetries which arise in quantum mechanics pose an intriguing problem. It is not clear that particles which exhibit permutation symmetries (i.e. particles which are indiscernible, meaning that they can be swapped with each other without this yielding a new physical state) qualify as "objects" in any reasonable sense of the term. One solution to this puzzle, which I attribute to W.V. Quine, would have us eliminate such particles from our ontology altogether in order to circumvent the metaphysical vexations caused by permutation symmetries. In this essay I argue that Quine's solution is too rash, and in its place I suggest a novel solution based on altering some of the language of quantum mechanics. Before launching into the technical details of indiscernible particles, however, I begin this essay with some remarks on the methodology -- instrumentalism -- which motivates my arguments.

  18. Entropy and energy spectra in low-Prandtl-number convection with rotation.

    PubMed

    Pharasi, Hirdesh K; Kumar, Krishna; Bhattacharjee, Jayanta K

    2014-02-01

We present results for entropy and kinetic energy spectra computed from direct numerical simulations for low-Prandtl-number (Pr < 1) turbulent flow in Rayleigh-Bénard convection with uniform rotation about a vertical axis. The simulations are performed in a three-dimensional periodic box for a range of the Taylor number (0 ≤ Ta ≤ 10^8) and reduced Rayleigh number r = Ra/Ra_∘(Ta, Pr) (1.0 × 10^2 ≤ r ≤ 5.0 × 10^3). The Rossby number Ro varies in the range 1.34 ≤ Ro ≤ 73. The entropy spectrum E_θ(k) shows bisplitting into two branches for lower values of wave number k. The entropy in the lower branch scales with k as k^(-1.4 ± 0.1) for r > 10^3 for the rotation rates considered here. The entropy in the upper branch also shows scaling behavior with k, but the scaling exponent decreases with increasing Ta for all r. The energy spectrum E_v(k) is also found to scale with the wave number k as k^(-1.4 ± 0.1) for r > 10^3. The scaling exponent for the energy spectrum and the lower branch of the entropy spectrum vary between -1.7 and -2.4 for lower values of r (< 10^3). We also provide some simple arguments based on the variation of the Kolmogorov picture to support the results of simulations.

  19. Comparison of intravenous labetalol and bupivacaine scalp block on the hemodynamic and entropy changes following skull pin application: A randomized, open label clinical trial.

    PubMed

    Bharne, Sidhesh; Bidkar, Prasanna Udupi; Badhe, Ashok Shankar; Parida, Satyen; Ramesh, Andi Sadayandi

    2016-01-01

    The application of skull pins in neurosurgical procedures is a highly noxious stimulus that causes hemodynamic changes and a rise in spectral entropy levels. We designed a study to compare intravenous (IV) labetalol and bupivacaine scalp block in blunting these changes. Sixty-six patients undergoing elective neurosurgical procedures were randomized into two groups, L (labetalol) and B (bupivacaine) of 33 each. After a standard induction sequence using fentanyl, propofol and vecuronium, patients were intubated. Baseline hemodynamic parameters and entropy levels were noted. Five minutes before, application of the pins, group L patients received IV labetalol 0.25 mg/kg and group B patients received scalp block with 30 ml of 0.25% bupivacaine. Following application of the pins, heart rate (HR), systolic arterial pressure (SAP), diastolic arterial pressure (DAP), mean arterial pressure (MAP), and response entropy (RE)/state entropy (SE) were noted at regular time points up to 5 min. The two groups were comparable with respect to their demographic characteristics. Baseline hemodynamic parameters and entropy levels were also similar. After pinning, the HR, SAP, DAP, MAP, and RE/SE all increased in both groups but were lower in the scalp block group patients. HR increased by 19.8% in group L and by 11% in group B. SAP increased by 11.9% in group L and remained unchanged in group B. DAP increased by 19.7% in group L and by 9.9% in group B, MAP increased by 15.6% in group L and 5% in group B (P < 0.05). No adverse effects were noted. Scalp block with bupivacaine is more effective than IV labetalol in attenuating the rise in hemodynamic parameters and entropy changes following skull pin application.

  20. Entropy of Masseter Muscle Pain Sensitivity: A New Technique for Pain Assessment.

    PubMed

    Castrillon, Eduardo E; Exposto, Fernando G; Sato, Hitoshi; Tanosoto, Tomohiro; Arima, Taro; Baad-Hansen, Lene; Svensson, Peter

    2017-01-01

    To test whether manipulation of mechanical pain sensitivity (MPS) of the masseter muscle is reflected in quantitative measures of entropy. In a randomized, single-blinded, placebo-controlled design, 20 healthy volunteers had glutamate, lidocaine, and isotonic saline injected into the masseter muscle. Self-assessed pain intensity on a numeric rating scale (NRS) was evaluated up to 10 minutes following the injection, and MPS was evaluated after application (at 5 minutes and 30 minutes) of three different forces (0.5 kg, 1 kg, and 2 kg) to 15 different sites of the masseter muscle. Finally, the entropy and center of gravity (COG) of the pain sensitivity scores were calculated. Analysis of variance was used to test differences in means of tested outcomes and Tukey post hoc tests were used to adjust for multiple comparisons. The main findings were: (1) Compared with both lidocaine and isotonic saline, glutamate injections caused an increase in peak, duration, and area under the NRS pain curve (P < .01); (2) A pressure of 2 kg caused the highest NRS pain scores (P < .03) and entropy values (P < .02); (3) Glutamate injections caused increases in entropy values when assessed with 0.5 kg and 1.0 kg but not with 2.0 kg of pressure; and (4) COG coordinates revealed differences between the x coordinates for time (P < .01) and time and force for the y coordinates (P < .01). These results suggest that manipulation of MPS of the masseter muscle with painful glutamate injections can increase the diversity of MPS, which is reflected in entropy measures. Entropy allows quantification of the diversity of MPS, which may be important in clinical assessment of pain states such as myofascial temporomandibular disorders.

  1. Fermion systems in discrete space-time

    NASA Astrophysics Data System (ADS)

    Finster, Felix

    2007-05-01

    Fermion systems in discrete space-time are introduced as a model for physics on the Planck scale. We set up a variational principle which describes a non-local interaction of all fermions. This variational principle is symmetric under permutations of the discrete space-time points. We explain how for minimizers of the variational principle, the fermions spontaneously break this permutation symmetry and induce on space-time a discrete causal structure.

  2. Dynamic Testing and Automatic Repair of Reconfigurable Wiring Harnesses

    DTIC Science & Technology

    2006-11-27

Switch: an M × N grid of switches configured to provide an M-input, N-output routing network. Permutation Network: a permutation network performs an… wiring reduces the effective advantage of their reduced switch count, particularly when considering that regular grids (crossbar switches being a… are connected to. The outline circuit shown in Fig. 20 shows how a suitable 'discovery probe' might be implemented. The circuit shows a UART…

  3. Tolerance of a Knotted Near-Infrared Fluorescent Protein to Random Circular Permutation.

    PubMed

    Pandey, Naresh; Kuypers, Brianna E; Nassif, Barbara; Thomas, Emily E; Alnahhas, Razan N; Segatori, Laura; Silberg, Jonathan J

    2016-07-12

    Bacteriophytochrome photoreceptors (BphP) are knotted proteins that have been developed as near-infrared fluorescent protein (iRFP) reporters of gene expression. To explore how rearrangements in the peptides that interlace into the knot within the BphP photosensory core affect folding, we subjected iRFPs to random circular permutation using an improved transposase mutagenesis strategy and screened for variants that fluoresce. We identified 27 circularly permuted iRFPs that display biliverdin-dependent fluorescence in Escherichia coli. The variants with the brightest whole cell fluorescence initiated translation at residues near the domain linker and knot tails, although fluorescent variants that initiated translation within the PAS and GAF domains were discovered. Circularly permuted iRFPs retained sufficient cofactor affinity to fluoresce in tissue culture without the addition of biliverdin, and one variant displayed enhanced fluorescence when expressed in bacteria and tissue culture. This variant displayed a quantum yield similar to that of iRFPs but exhibited increased resistance to chemical denaturation, suggesting that the observed increase in the magnitude of the signal arose from more efficient protein maturation. These results show how the contact order of a knotted BphP can be altered without disrupting chromophore binding and fluorescence, an important step toward the creation of near-infrared biosensors with expanded chemical sensing functions for in vivo imaging.

  4. Tolerance of a knotted near infrared fluorescent protein to random circular permutation

    PubMed Central

    Pandey, Naresh; Kuypers, Brianna E.; Nassif, Barbara; Thomas, Emily E.; Alnahhas, Razan N.; Segatori, Laura; Silberg, Jonathan J.

    2016-01-01

    Bacteriophytochrome photoreceptors (BphP) are knotted proteins that have been developed as near-infrared fluorescent protein (iRFP) reporters of gene expression. To explore how rearrangements in the peptides that interlace into the knot within the BphP photosensory core affect folding, we subjected iRFP to random circular permutation using an improved transposase mutagenesis strategy and screened for variants that fluoresce. We identified twenty seven circularly permuted iRFP that display biliverdin-dependent fluorescence in Escherichia coli. The variants with the brightest whole cell fluorescence initiated translation at residues near the domain linker and knot tails, although fluorescent variants were discovered that initiated translation within the PAS and GAF domains. Circularly permuted iRFP retained sufficient cofactor affinity to fluoresce in tissue culture without the addition of biliverdin, and one variant displayed enhanced fluorescence when expressed in bacteria and tissue culture. This variant displayed a similar quantum yield as iRFP, but exhibited increased resistance to chemical denaturation, suggesting that the observed signal increase arose from more efficient protein maturation. These results show how the contact order of a knotted BphP can be altered without disrupting chromophore binding and fluorescence, an important step towards the creation of near-infrared biosensors with expanded chemical-sensing functions for in vivo imaging. PMID:27304983

  5. Nonlinear digital signal processing in mental health: characterization of major depression using instantaneous entropy measures of heartbeat dynamics.

    PubMed

    Valenza, Gaetano; Garcia, Ronald G; Citi, Luca; Scilingo, Enzo P; Tomaz, Carlos A; Barbieri, Riccardo

    2015-01-01

    Nonlinear digital signal processing methods that address system complexity have provided useful computational tools for helping in the diagnosis and treatment of a wide range of pathologies. More specifically, nonlinear measures have been successful in characterizing patients with mental disorders such as Major Depression (MD). In this study, we propose the use of instantaneous measures of entropy, namely the inhomogeneous point-process approximate entropy (ipApEn) and the inhomogeneous point-process sample entropy (ipSampEn), to describe a novel characterization of MD patients undergoing affective elicitation. Because these measures are built within a nonlinear point-process model, they allow for the assessment of complexity in cardiovascular dynamics at each moment in time. Heartbeat dynamics were characterized from 48 healthy controls and 48 patients with MD while emotionally elicited through either neutral or arousing audiovisual stimuli. Experimental results coming from the arousing tasks show that ipApEn measures are able to instantaneously track heartbeat complexity as well as discern between healthy subjects and MD patients. Conversely, standard heart rate variability (HRV) analysis performed in both time and frequency domains did not show any statistical significance. We conclude that measures of entropy based on nonlinear point-process models might contribute to devising useful computational tools for care in mental health.

  6. From Maximum Entropy Models to Non-Stationarity and Irreversibility

    NASA Astrophysics Data System (ADS)

    Cofre, Rodrigo; Cessac, Bruno; Maldonado, Cesar

The maximum entropy distribution can be obtained from a variational principle. This is important as a matter of principle and for the purpose of finding approximate solutions. One can exploit this fact to obtain relevant information about the underlying stochastic process. We report here on recent progress in three aspects of this approach. 1) Biological systems are expected to show some degree of irreversibility in time. Based on the transfer matrix technique for finding the spatio-temporal maximum entropy distribution, we build a framework to quantify the degree of irreversibility of any maximum entropy distribution. 2) The maximum entropy solution is characterized by a functional called the Gibbs free energy (the solution of the variational principle). The Legendre transformation of this functional is the rate function, which controls the speed of convergence of empirical averages to their ergodic mean. We show how the correct description of this functional is decisive for a more rigorous characterization of first- and higher-order phase transitions. 3) We assess the impact of a weak time-dependent external stimulus on the collective statistics of spiking neuronal networks. We show how to evaluate this impact on any higher-order spatio-temporal correlation. RC supported by ERC Advanced Grant "Bridges"; BC: KEOPS ANR-CONICYT and Renvision; CM: CONICYT-FONDECYT No. 3140572.
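
    For point 2, the relation referred to is the standard Legendre duality between the scaled cumulant generating functional (playing the role of the Gibbs free energy) and the large-deviation rate function that controls the convergence of empirical averages; schematically, in LaTeX notation,

      F(\lambda) \;=\; \lim_{T \to \infty} \frac{1}{T}
          \log \mathbb{E}\!\left[ e^{\lambda \sum_{t=1}^{T} f(X_t)} \right],
      \qquad
      I(x) \;=\; \sup_{\lambda} \bigl[ \lambda x - F(\lambda) \bigr],
      \qquad
      P\!\left( \tfrac{1}{T} \textstyle\sum_{t=1}^{T} f(X_t) \approx x \right) \asymp e^{-T\, I(x)} .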

  7. Why We Should Not Be Indifferent to Specification Choices for Difference-in-Differences.

    PubMed

    Ryan, Andrew M; Burgess, James F; Dimick, Justin B

    2015-08-01

    To evaluate the effects of specification choices on the accuracy of estimates in difference-in-differences (DID) models. Process-of-care quality data from Hospital Compare between 2003 and 2009. We performed a Monte Carlo simulation experiment to estimate the effect of an imaginary policy on quality. The experiment was performed for three different scenarios in which the probability of treatment was (1) unrelated to pre-intervention performance; (2) positively correlated with pre-intervention levels of performance; and (3) positively correlated with pre-intervention trends in performance. We estimated alternative DID models that varied with respect to the choice of data intervals, the comparison group, and the method of obtaining inference. We assessed estimator bias as the mean absolute deviation between estimated program effects and their true value. We evaluated the accuracy of inferences through statistical power and rates of false rejection of the null hypothesis. Performance of alternative specifications varied dramatically when the probability of treatment was correlated with pre-intervention levels or trends. In these cases, propensity score matching resulted in much more accurate point estimates. The use of permutation tests resulted in lower false rejection rates for the highly biased estimators, but the use of clustered standard errors resulted in slightly lower false rejection rates for the matching estimators. When treatment and comparison groups differed on pre-intervention levels or trends, our results supported specifications for DID models that include matching for more accurate point estimates and models using clustered standard errors or permutation tests for better inference. Based on our findings, we propose a checklist for DID analysis. © Health Research and Educational Trust.
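
    A minimal sketch (simulated two-period panel with a hypothetical treatment effect) of the basic DID estimator with inference by permutation of the treatment assignment across units, one of the inference choices evaluated in the paper:

      import numpy as np

      rng = np.random.default_rng(7)
      n_units, effect = 60, 0.5
      treated = np.zeros(n_units, dtype=bool)
      treated[:20] = True                                        # 20 treated, 40 comparison units
      pre = rng.normal(70, 5, n_units)                           # pre-period quality score
      post = pre + rng.normal(1.0, 2.0, n_units) + effect * treated

      def did(pre, post, treated):
          """Two-period difference-in-differences estimate."""
          return (post[treated] - pre[treated]).mean() - (post[~treated] - pre[~treated]).mean()

      obs = did(pre, post, treated)

      # Permutation test: re-assign "treatment" labels across units to build the null.
      null = np.array([did(pre, post, rng.permutation(treated)) for _ in range(5000)])
      p_value = np.mean(np.abs(null) >= abs(obs))
      print(obs, p_value)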

  8. Approximate Entropies for Stochastic Time Series and EKG Time Series of Patients with Epilepsy and Pseudoseizures

    NASA Astrophysics Data System (ADS)

    Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron

    2009-10-01

A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we have retrospectively identified (2003-8) two groups of patients from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures and the experimental group had confirmed right temporal lobe epilepsy through a seizure-free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either sleep or the period preceding seizure onset. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow gamma statistics, i.e., p(ξ) = Γ⁻¹(k) ξ^k exp(−ξ). This probability function has well-known properties and its Shannon entropy can be expressed in terms of the Γ function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn values calculated for the generated time series for different values of k mimic the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in statistics of heart rate fluctuations.
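
    A sketch of the surrogate construction described here: draw fluctuations from the gamma family indexed by k (p(ξ) ∝ ξ^k e^(−ξ) is a gamma density with shape k + 1) and compute ApEn for each k; the parameter values are arbitrary:

      import numpy as np

      def approximate_entropy(x, m=2, r=0.2):
          """Standard ApEn(m, r*std) via the phi_m - phi_{m+1} construction."""
          x = np.asarray(x, float)
          tol = r * x.std()
          def phi(mm):
              emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
              d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
              c = (d <= tol).mean(axis=1)        # self-matches included, as in ApEn
              return np.mean(np.log(c))
          return phi(m) - phi(m + 1)

      rng = np.random.default_rng(8)
      for k in (0.5, 2.0, 8.0):
          xi = rng.gamma(shape=k + 1.0, scale=1.0, size=1000)   # gamma surrogate for a given k
          print(k, round(approximate_entropy(xi), 3))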

  9. Heart rate variability analysis based on time-frequency representation and entropies in hypertrophic cardiomyopathy patients.

    PubMed

    Clariá, F; Vallverdú, M; Baranowski, R; Chojnowska, L; Caminal, P

    2008-03-01

    In hypertrophic cardiomyopathy (HCM) patients there is an increased risk of premature death, which can occur with little or no warning. Furthermore, classification for sudden cardiac death on patients with HCM is very difficult. The aim of our study was to improve the prognostic value of heart rate variability (HRV) in HCM patients, giving insight into changes of the autonomic nervous system. In this way, the suitability of linear and nonlinear measures was studied to assess the HRV. These measures were based on time-frequency representation (TFR) and on Shannon and Rényi entropies, and compared with traditional HRV measures. Holter recordings of 64 patients with HCM and 55 healthy subjects were analyzed. The HCM patients consisted of two groups: 13 high risk patients, after aborted sudden cardiac death (SCD); 51 low risk patients, without SCD. Five-hour RR signals, corresponding to the sleep period of the subjects, were considered for the analysis as a comparable standard situation. These RR signals were filtered in the three frequency bands: very low frequency band (VLF, 0-0.04 Hz), low frequency band (LF, 0.04-0.15 Hz) and high frequency band (HF, 0.15-0.45 Hz). TFR variables based on instantaneous frequency and energy functions were able to classify HCM patients and healthy subjects (control group). Results revealed that measures obtained from TFR analysis of the HRV better classified the groups of subjects than traditional HRV parameters. However, results showed that nonlinear measures improved group classification. It was observed that entropies calculated in the HF band showed the highest statistically significant levels comparing the HCM group and the control group, p-value < 0.0005. The values of entropy measures calculated in the HCM group presented lower values, indicating a decreasing of complexity, than those calculated from the control group. Moreover, similar behavior was observed comparing high and low risk of premature death, the values of the entropy being lower in high risk patients, p-value < 0.05, indicating an increase of predictability. Furthermore, measures from information entropy, but not from TFR, seem to be useful for enhanced risk stratification in HCM patients with an increased risk of sudden cardiac death.

  10. Maximum entropy modeling of metabolic networks by constraining growth-rate moments predicts coexistence of phenotypes

    NASA Astrophysics Data System (ADS)

    De Martino, Daniele

    2017-12-01

    In this work maximum entropy distributions in the space of steady states of metabolic networks are considered upon constraining the first and second moments of the growth rate. Coexistence of fast and slow phenotypes, with bimodal flux distributions, emerges upon considering control on the average growth (optimization) and its fluctuations (heterogeneity). This is applied to the carbon catabolic core of Escherichia coli where it quantifies the metabolic activity of slow growing phenotypes and it provides a quantitative map with metabolic fluxes, opening the possibility to detect coexistence from flux data. A preliminary analysis on data for E. coli cultures in standard conditions shows degeneracy for the inferred parameters that extend in the coexistence region.
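
    Schematically, constraining the first two moments of the growth rate λ(v) over the space of steady states leads to a maximum entropy distribution of the form below; the symbols β, γ and \mathcal{P} are notation chosen here for the two Lagrange multipliers and the steady-state flux polytope, not necessarily the paper's:

      p(\mathbf{v}) \;=\; \frac{1}{Z(\beta,\gamma)}
          \exp\!\bigl[ \beta\, \lambda(\mathbf{v}) \;+\; \gamma\, \lambda(\mathbf{v})^{2} \bigr],
      \qquad \mathbf{v} \in \mathcal{P},

    with β and γ fixed by the constraints on the mean and the second moment of λ; bimodal marginals of λ, i.e. coexisting slow and fast phenotypes, can then arise for suitable multiplier values.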

  11. Frenetic Bounds on the Entropy Production

    NASA Astrophysics Data System (ADS)

    Maes, Christian

    2017-10-01

    We give a systematic derivation of positive lower bounds for the expected entropy production (EP) rate in classical statistical mechanical systems obeying a dynamical large deviation principle. The logic is the same for the return to thermodynamic equilibrium as it is for steady nonequilibria working under the condition of local detailed balance. We recover there recently studied "uncertainty" relations for the EP, appearing in studies about the effectiveness of mesoscopic machines. In general our refinement of the positivity of the expected EP rate is obtained in terms of a positive and even function of the expected current(s) which measures the dynamical activity in the system, a time-symmetric estimate of the changes in the system's configuration. Also underdamped diffusions can be included in the analysis.

  12. General Rotorcraft Aeromechanical Stability Program (GRASP) - Theory Manual

    DTIC Science & Technology

    1990-10-01

… the A basis. Two symbols frequently encountered in vector operations that use index notation are the Kronecker delta δ_ij and the Levi-Civita epsilon… [nomenclature fragment: blade root cutout; ε_ijk, Levi-Civita epsilon permutation symbol; θ, pretwist angle; θ', pretwist per unit length; θ_i, Tait-Bryan angles; κ_i, moment strains] …the components of the identity tensor in a Cartesian coordinate system, while the Levi-Civita epsilon consists of components of the permutation…

  13. Scrambled Sobol Sequences via Permutation

    DTIC Science & Technology

    2009-01-01

[Indexed fragment: Figure 3 class-diagram labels (LCG, LCG64, LFG, MLFG, PMLCG, Sobol; Scrambler, PermutationScrambler, LinearScrambler; PermuationFactory, StaticFactory, DynamicFactory) and bibliography entries on generating and scrambling Sobol sequences (Atanassov, NMA '02; Evans and Mascagni, ICCS 2005; Durstenfeld, Algorithm 235).]

  14. Optimization and experimental realization of the quantum permutation algorithm

    NASA Astrophysics Data System (ADS)

    Yalçınkaya, I.; Gedik, Z.

    2017-12-01

    The quantum permutation algorithm provides computational speed-up over classical algorithms for determining the parity of a given cyclic permutation. For its n -qubit implementations, the number of required quantum gates scales quadratically with n due to the quantum Fourier transforms included. We show here for the n -qubit case that the algorithm can be simplified so that it requires only O (n ) quantum gates, which theoretically reduces the complexity of the implementation. To test our results experimentally, we utilize IBM's 5-qubit quantum processor to realize the algorithm by using the original and simplified recipes for the 2-qubit case. It turns out that the latter results in a significantly higher success probability which allows us to verify the algorithm more precisely than the previous experimental realizations. We also verify the algorithm for the first time for the 3-qubit case with a considerable success probability by taking the advantage of our simplified scheme.

  15. A faster 1.375-approximation algorithm for sorting by transpositions.

    PubMed

    Cunha, Luís Felipe I; Kowada, Luis Antonio B; Hausen, Rodrigo de A; de Figueiredo, Celina M H

    2015-11-01

Sorting by Transpositions is an NP-hard problem for which several polynomial-time approximation algorithms have been developed. Hartman and Shamir (2006) developed a 1.5-approximation algorithm, whose running time was improved to O(n log n) by Feng and Zhu (2007) with a data structure they defined, the permutation tree. Elias and Hartman (2006) developed a 1.375-approximation O(n^2) algorithm, and Firoz et al. (2011) claimed an improvement to the running time, from O(n^2) to O(n log n), by using the permutation tree. We provide counter-examples to the correctness of Firoz et al.'s strategy, showing that it is not possible to reach a component by sufficient extensions using the method proposed by them. In addition, we propose a 1.375-approximation algorithm, modifying Elias and Hartman's approach with the use of permutation trees and achieving O(n log n) time.
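
    The elementary move here is the block transposition ρ(i, j, k), which (in one common 0-based convention) exchanges the two adjacent blocks delimited by positions i, j and k. A minimal sketch of the operation the approximation algorithms search over:

      def transposition(perm, i, j, k):
          """Apply rho(i, j, k): swap the adjacent blocks perm[i:j] and perm[j:k]."""
          assert 0 <= i < j < k <= len(perm)
          return perm[:i] + perm[j:k] + perm[i:j] + perm[k:]

      p = [3, 1, 2, 4]
      print(transposition(p, 0, 1, 3))   # swaps [3] with [1, 2] -> [1, 2, 3, 4]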

  16. A Weak Quantum Blind Signature with Entanglement Permutation

    NASA Astrophysics Data System (ADS)

    Lou, Xiaoping; Chen, Zhigang; Guo, Ying

    2015-09-01

    Motivated by the permutation encryption algorithm, a weak quantum blind signature (QBS) scheme is proposed. It involves three participants, including the sender Alice, the signatory Bob and the trusted entity Charlie, in four phases, i.e., initializing phase, blinding phase, signing phase and verifying phase. In a small-scale quantum computation network, Alice blinds the message based on a quantum entanglement permutation encryption algorithm that embraces the chaotic position string. Bob signs the blinded message with private parameters shared beforehand while Charlie verifies the signature's validity and recovers the original message. Analysis shows that the proposed scheme achieves the secure blindness for the signer and traceability for the message owner with the aid of the authentic arbitrator who plays a crucial role when a dispute arises. In addition, the signature can neither be forged nor disavowed by the malicious attackers. It has a wide application to E-voting and E-payment system, etc.

  17. An extended continuous estimation of distribution algorithm for solving the permutation flow-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Shao, Zhongshi; Pi, Dechang; Shao, Weishi

    2017-11-01

    This article proposes an extended continuous estimation of distribution algorithm (ECEDA) to solve the permutation flow-shop scheduling problem (PFSP). In ECEDA, to make a continuous estimation of distribution algorithm (EDA) suitable for the PFSP, the largest order value rule is applied to convert continuous vectors to discrete job permutations. A probabilistic model based on a mixed Gaussian and Cauchy distribution is built to maintain the exploration ability of the EDA. Two effective local search methods, i.e. revolver-based variable neighbourhood search and Hénon chaotic-based local search, are designed and incorporated into the EDA to enhance the local exploitation. The parameters of the proposed ECEDA are calibrated by means of a design of experiments approach. Simulation results and comparisons based on some benchmark instances show the efficiency of the proposed algorithm for solving the PFSP.
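
    The largest order value (LOV) rule mentioned here maps a continuous vector sampled from the probabilistic model to a job permutation by ranking its components (descending order in the convention sketched below, which may differ in detail from the paper's):

      import numpy as np

      def largest_order_value(x):
          """Convert a continuous vector into a job permutation (largest value scheduled first)."""
          return list(np.argsort(-np.asarray(x)))

      x = [0.34, 1.72, -0.05, 0.91, 0.40]      # continuous individual sampled by the EDA
      print(largest_order_value(x))            # -> [1, 3, 4, 0, 2]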

  18. Structural predictor for nonlinear sheared dynamics in simple glass-forming liquids

    NASA Astrophysics Data System (ADS)

    Ingebrigtsen, Trond S.; Tanaka, Hajime

    2018-01-01

    Glass-forming liquids subjected to sufficiently strong shear universally exhibit striking nonlinear behavior; for example, a power-law decrease of the viscosity with increasing shear rate. This phenomenon has attracted considerable attention over the years from both fundamental and applied viewpoints. However, the out-of-equilibrium and nonlinear nature of sheared fluids has made theoretical understanding of this phenomenon very challenging, and progress has therefore been slow. We find here that the structural relaxation time as a function of the two-body excess entropy, calculated for the extensional axis of the shear flow, collapses onto the corresponding equilibrium curve for a wide range of pair potentials ranging from harsh repulsive to soft and finite. This two-body excess entropy collapse provides a powerful approach to predicting the dynamics of nonequilibrium liquids from their equilibrium counterparts. Furthermore, the two-body excess entropy scaling suggests that sheared dynamics is controlled purely by the liquid structure captured in the form of the two-body excess entropy along the extensional direction, shedding light on the perplexing mechanism behind shear thinning.
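
    For reference, the two-body excess entropy in its standard isotropic form is s2/kB = -2*pi*rho * integral of [g(r) ln g(r) - (g(r) - 1)] r^2 dr per particle; the paper evaluates a pair correlation resolved along the extensional axis, which the sketch below does not attempt. The toy g(r) and density are placeholders chosen only to show the numerical evaluation.

        import numpy as np

        def two_body_excess_entropy(r, g, rho):
            """s2 per particle (units of k_B): -2*pi*rho * int [g ln g - (g - 1)] r^2 dr."""
            r = np.asarray(r, dtype=float)
            g = np.asarray(g, dtype=float)
            gln = np.zeros_like(g)
            mask = g > 0
            gln[mask] = g[mask] * np.log(g[mask])                    # g ln g, with the g = 0 limit set to 0
            y = (gln - (g - 1.0)) * r ** 2
            integral = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(r))   # trapezoid rule
            return -2.0 * np.pi * rho * integral

        # Toy g(r): an excluded core below r = 0.9 and a single coordination peak near r = 1.0.
        r = np.linspace(0.0, 5.0, 2001)
        g = np.where(r < 0.9, 0.0, 1.0 + 1.2 * np.exp(-((r - 1.0) / 0.1) ** 2))
        print(two_body_excess_entropy(r, g, rho=0.85))   # negative, as expected for a dense liquid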

  19. Maximum entropy production in environmental and ecological systems.

    PubMed

    Kleidon, Axel; Malhi, Yadvinder; Cox, Peter M

    2010-05-12

    The coupled biosphere-atmosphere system entails a vast range of processes at different scales, from ecosystem exchange fluxes of energy, water and carbon to the processes that drive global biogeochemical cycles, atmospheric composition and, ultimately, the planetary energy balance. These processes are generally complex, with numerous interactions and feedbacks, and they are irreversible in nature, thereby producing entropy. The proposed principle of maximum entropy production (MEP), based on statistical mechanics and information theory, states that thermodynamic processes far from thermodynamic equilibrium will adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate. This issue focuses on the latest developments in applying MEP to the biosphere-atmosphere system, including aspects of the atmospheric circulation, the role of clouds, hydrology, vegetation effects, ecosystem exchange of energy and mass, biogeochemical interactions and the Gaia hypothesis. The examples shown in this special issue demonstrate the potential of MEP to contribute to improved understanding and modelling of the biosphere and the wider Earth system, and also explore limitations and constraints to the application of the MEP principle.
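
    The MEP selection principle can be made concrete with the familiar two-box toy often used in this literature: a heat flux F between a warm box and a cold box produces entropy at rate F(1/T_cold - 1/T_warm), with the box temperatures set by their energy balance, and MEP selects the F at which this rate peaks. The sketch below uses made-up numbers purely to illustrate the maximization; it is not a model taken from the special issue.

        import numpy as np

        I1, I2 = 300.0, 170.0     # absorbed shortwave heating of each box (W m^-2), illustrative
        B = 1.0                   # linearized longwave feedback (W m^-2 K^-1), illustrative

        def temperatures(F):
            T1 = (I1 - F) / B     # steady-state energy balance of the warm box: I1 = B*T1 + F
            T2 = (I2 + F) / B     # the cold box receives the transported heat: I2 + F = B*T2
            return T1, T2

        def entropy_production(F):
            T1, T2 = temperatures(F)
            return F * (1.0 / T2 - 1.0 / T1)

        F = np.linspace(0.0, (I1 - I2) / 2.0, 10001)   # fluxes up to the temperature-equalizing value
        sigma = entropy_production(F)
        F_mep = F[np.argmax(sigma)]                    # the MEP-selected heat transport
        print(F_mep, temperatures(F_mep), sigma.max())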

  20. Structural predictor for nonlinear sheared dynamics in simple glass-forming liquids.

    PubMed

    Ingebrigtsen, Trond S; Tanaka, Hajime

    2018-01-02

    Glass-forming liquids subjected to sufficiently strong shear universally exhibit striking nonlinear behavior; for example, a power-law decrease of the viscosity with increasing shear rate. This phenomenon has attracted considerable attention over the years from both fundamental and applied viewpoints. However, the out-of-equilibrium and nonlinear nature of sheared fluids has made theoretical understanding of this phenomenon very challenging, and progress has therefore been slow. We find here that the structural relaxation time as a function of the two-body excess entropy, calculated for the extensional axis of the shear flow, collapses onto the corresponding equilibrium curve for a wide range of pair potentials ranging from harsh repulsive to soft and finite. This two-body excess entropy collapse provides a powerful approach to predicting the dynamics of nonequilibrium liquids from their equilibrium counterparts. Furthermore, the two-body excess entropy scaling suggests that sheared dynamics is controlled purely by the liquid structure captured in the form of the two-body excess entropy along the extensional direction, shedding light on the perplexing mechanism behind shear thinning.
