Change point detection of the Persian Gulf sea surface temperature
NASA Astrophysics Data System (ADS)
Shirvani, A.
2017-01-01
In this study, the Student's t parametric and Mann-Whitney nonparametric change point models (CPMs) were applied to detect a change point in the annual Persian Gulf sea surface temperature anomalies (PGSSTA) time series for the period 1951-2013. The PGSSTA time series, which were serially correlated, were transformed to produce an uncorrelated, pre-whitened time series. The pre-whitened PGSSTA time series served as the input to the change point models. Both the parametric and nonparametric CPMs estimated the change point in the PGSSTA in 1992. The PGSSTA follow a normal distribution both before and after 1992, but with a different mean value after that year. The estimated slope of the linear trend in the PGSSTA time series for the period 1951-1992 was negative, whereas it was positive after the detected change point. Unlike the PGSSTA, the applied CPMs suggested no change point in the Niño3.4 SSTA time series.
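As a rough illustration of the pipeline described above, the following sketch (not the authors' code) pre-whitens a serially correlated anomaly series with an AR(1) filter and scans all candidate splits with a Mann-Whitney test; production CPMs additionally control the repeated-testing error rate:

```python
import numpy as np
from scipy import stats

def prewhiten_ar1(x):
    """Remove lag-1 serial correlation: x'_t = x_t - r1 * x_(t-1)."""
    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    return x[1:] - r1 * x[:-1]

def mann_whitney_change_point(x, min_seg=10):
    """Return the split index minimizing the two-sample p-value."""
    best_k, best_p = None, 1.0
    for k in range(min_seg, len(x) - min_seg):
        _, p = stats.mannwhitneyu(x[:k], x[k:], alternative="two-sided")
        if p < best_p:
            best_k, best_p = k, p
    return best_k, best_p

years = np.arange(1951, 2014)
anoms = np.random.default_rng(0).standard_normal(years.size)  # stand-in for PGSSTA
pw = prewhiten_ar1(anoms)
k, p = mann_whitney_change_point(pw)
print("candidate change point:", years[1 + k], "minimum p-value:", p)
```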
Constructing networks from a dynamical system perspective for multivariate nonlinear time series.
Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael
2016-03-01
We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.
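The following hedged sketch illustrates the small-shuffle surrogate (SSS) idea in its simplest form; the correlation measure and decision rule used here are illustrative stand-ins for the test prescribed in the paper:

```python
import numpy as np

def small_shuffle_surrogate(x, amplitude=1.0, rng=None):
    """Perturb each index with Gaussian noise and re-sort: short-term
    structure is destroyed while long-term behavior is preserved."""
    rng = np.random.default_rng(rng)
    jittered = np.arange(len(x)) + amplitude * rng.standard_normal(len(x))
    return x[np.argsort(jittered)]

def connected(x, y, n_surr=99, seed=0):
    """Illustrative link test: observed |corr| outside the surrogate range."""
    rng = np.random.default_rng(seed)
    obs = abs(np.corrcoef(x, y)[0, 1])
    surr = [abs(np.corrcoef(small_shuffle_surrogate(x, rng=rng), y)[0, 1])
            for _ in range(n_surr)]
    return obs > max(surr)

rng = np.random.default_rng(1)
series = rng.standard_normal((5, 500))      # five measured series = five nodes
adj = np.zeros((5, 5), dtype=int)           # undirected adjacency matrix
for i in range(5):
    for j in range(i + 1, 5):
        adj[i, j] = adj[j, i] = connected(series[i], series[j])
print(adj)
```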
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-08
... will not apply to such long-term options series until the time to expiration is less than nine months... trading on the Exchange, the Exchange shall from time to time open for trading series of options therein... not apply to options series until the time to expiration is less than nine months (that is, until such...
NASA Astrophysics Data System (ADS)
Ningrum, R. W.; Surarso, B.; Farikhin; Safarudin, Y. M.
2018-03-01
This paper proposes a combination of the Firefly Algorithm (FA) and Chen's fuzzy time series forecasting. Most existing fuzzy forecasting methods based on fuzzy time series use intervals of static length. We therefore apply an artificial intelligence technique, the Firefly Algorithm (FA), to set non-stationary interval lengths for each cluster in Chen's method. The method is evaluated by applying it to the Jakarta Composite Index (IHSG) and comparing it with classical Chen fuzzy time series forecasting. Its performance is verified through simulation using Matlab.
Reconstruction of ensembles of coupled time-delay systems from time series.
Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P
2014-06-01
We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.
Scale-dependent intrinsic entropies of complex time series.
Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E
2016-04-13
Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.
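For reference, a minimal multiscale-entropy sketch in the standard coarse-graining formulation; the authors' variant instead detrends with empirical mode decomposition before measuring irregularity:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.15):
    """SampEn: -log of the chance that templates matching for m points
    (Chebyshev distance <= r) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    def pair_count(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        return ((d <= r).sum() - len(emb)) / 2.0   # unordered, no self-matches
    b, a = pair_count(m), pair_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6)):
    out = []
    for s in scales:
        n = len(x) // s
        coarse = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)  # coarse-grain
        out.append(sample_entropy(coarse))
    return out

print(multiscale_entropy(np.random.default_rng(0).standard_normal(1000)))
```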
Application of an Entropic Approach to Assessing Systems Integration
2012-03-01
two econometrical measures of information efficiency – Shannon entropy and Hurst exponent. Shannon entropy (which is explained in Chapter III) can be...applied to evaluate long-term correlation of time series, while Hurst exponent can be applied to classify the time series in accordance to existence...of trend. Hurst exponent is the statistical measure of time series long-range dependence, and its value falls in the interval [0, 1] – a value in
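A hedged sketch of the rescaled-range (R/S) estimate of the Hurst exponent mentioned in the snippet; window sizes are illustrative choices:

```python
import numpy as np

def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())            # cumulative deviations
            r, s = z.max() - z.min(), w.std()
            if s > 0:
                rs.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs)))
    h, _ = np.polyfit(log_n, log_rs, 1)            # slope of log R/S vs log n
    return h

print(hurst_rs(np.random.default_rng(0).standard_normal(4000)))  # ~0.5: no memory
```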
Athanasiou, Christos I; Kopsini, Angeliki
2018-06-12
In the field of antimicrobial resistance, the number of studies that use time series data has increased recently. The purpose of this study is a systematic review of all studies on antibacterial consumption and on Pseudomonas aeruginosa resistance in healthcare settings that have used time series data. A systematic review of the literature up to June 2017 was conducted. All studies that used time series data and examined in-hospital antibiotic consumption and Ps. aeruginosa resistance rates or incidence were eligible. No other exclusion criteria were applied. Data on the structure, terminology, methods and results of each article were recorded and analyzed where possible. A total of thirty-six studies were retrieved, twenty-three of which met our criteria. Thirteen of them were quasi-experimental studies and ten were ecological observational studies. Eighteen studies collected time series data for both parameters, and the statistical methodology of "time series analysis" was applied in nine studies. Most of the studies were published in the last eight years. The interrupted time series design was the most widespread. As expected, there was high heterogeneity in study design, terminology and statistical methods applied.
The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis
NASA Astrophysics Data System (ADS)
Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.
2017-12-01
The vast majority of data analyzed by climate researchers are repeated observations of a physical process, i.e., time series data. These data lend themselves to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) in repeated observations. Often, the same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g., interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
Graph theory applied to the analysis of motor activity in patients with schizophrenia and depression
Fasmer, Erlend Eindride; Berle, Jan Øystein; Oedegaard, Ketil J.; Hauge, Erik R.
2018-01-01
Depression and schizophrenia are defined only by their clinical features, and diagnostic separation between them can be difficult. Disturbances in motor activity patterns are central features of both types of disorder. We introduce a new method to analyze time series, called the similarity graph algorithm. Time series of motor activity, obtained from actigraph registrations over 12 days in depressed and schizophrenic patients, were mapped into graphs, and we then applied techniques from graph theory to characterize these time series, primarily looking for changes in complexity. The most marked finding was that depressed patients differed significantly from both controls and schizophrenic patients, with evidence of less regularity of the time series, when the recordings were analyzed at one-hour intervals. These findings support the contention that there are important differences in the control systems regulating motor behavior in patients with depression and schizophrenia. The similarity graph algorithm we have described can easily be applied to the study of other types of time series. PMID:29668743
Cross-correlation of point series using a new method
NASA Technical Reports Server (NTRS)
Strothers, Richard B.
1994-01-01
Traditional methods of cross-correlation of two time series do not apply to point time series. Here, a new method, devised specifically for point series, utilizes a correlation measure that is based on the rms difference (or, alternatively, the median absolute difference) between nearest neighbors in overlapped segments of the two series. Error estimates for the observed locations of the points, as well as a systematic shift of one series with respect to the other to accommodate a constant but unknown lead or lag, are easily incorporated into the analysis using Monte Carlo techniques. A methodological restriction adopted here is that one series be treated as a template series against which the other, called the target series, is cross-correlated. To estimate a significance level for the correlation measure, the adopted null hypothesis is that the target series arises from a homogeneous Poisson process. The new method is applied to cross-correlating the times of the greatest geomagnetic storms with the times of maximum in the undecennial solar activity cycle.
A comparison of high-frequency cross-correlation measures
NASA Astrophysics Data System (ADS)
Precup, Ovidiu V.; Iori, Giulia
2004-12-01
On a high-frequency scale the time series are not homogeneous, therefore standard correlation measures cannot be directly applied to the raw data. There are two ways to deal with this problem. The time series can be homogenised through an interpolation method, linear or previous-tick (An Introduction to High-Frequency Finance, Academic Press, NY, 2001), and the Pearson correlation statistic then computed. Alternatively, methods that can handle raw non-synchronous time series have recently been developed (Int. J. Theor. Appl. Finance 6(1) (2003) 87; J. Empirical Finance 4 (1997) 259). This paper compares two traditional methods that use interpolation with an alternative method applied directly to the actual time series.
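A minimal sketch of the previous-tick homogenisation step described above, after which the ordinary Pearson statistic applies; the grid spacing is an arbitrary choice:

```python
import numpy as np

def previous_tick(times, values, grid):
    """For each grid point, take the most recent observation at or before it."""
    idx = np.searchsorted(times, grid, side="right") - 1
    return values[np.clip(idx, 0, None)]

rng = np.random.default_rng(0)
t1, t2 = np.sort(rng.uniform(0, 100, 300)), np.sort(rng.uniform(0, 100, 200))
p1, p2 = np.cumsum(rng.standard_normal(300)), np.cumsum(rng.standard_normal(200))
grid = np.arange(1.0, 100.0, 1.0)                 # common homogeneous grid
x, y = previous_tick(t1, p1, grid), previous_tick(t2, p2, grid)
rets = np.diff(np.column_stack([x, y]), axis=0)   # synchronized returns
print(np.corrcoef(rets.T)[0, 1])                  # Pearson correlation
```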
Time-reversibility in seismic sequences: Application to the seismicity of Mexican subduction zone
NASA Astrophysics Data System (ADS)
Telesca, L.; Flores-Márquez, E. L.; Ramírez-Rojas, A.
2018-02-01
In this paper we investigate the time-reversibility of series associated with the seismicity of five seismic areas of the subduction zone beneath the southwest Pacific Mexican coast, applying the horizontal visibility graph method to the series of earthquake magnitudes, interevent times, interdistances and magnitude increments. We applied the Kullback-Leibler divergence D, which quantifies the degree of time-irreversibility of a time series. Our findings suggest that among the five seismic areas, Jalisco-Colima is characterized by time-reversibility in all four seismic series. This result is consistent with the peculiar seismo-tectonic characteristics of Jalisco-Colima, which is the closest to the Middle American Trench and belongs to the Mexican volcanic arc.
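A hedged sketch of the ingredients named above: a horizontal visibility graph whose forward (out-) and backward (in-) degree distributions are compared with a Kullback-Leibler divergence; binning choices are illustrative:

```python
import numpy as np

def hvg_degrees(x):
    """Horizontal visibility graph: i sees j if every value strictly between
    them is lower than both. Returns forward (out) and backward (in) degrees."""
    n = len(x)
    k_out, k_in = np.zeros(n, int), np.zeros(n, int)
    for i in range(n):
        level = -np.inf                    # running max of intermediate values
        for j in range(i + 1, n):
            if level < x[j] and level < x[i]:
                k_out[i] += 1
                k_in[j] += 1
            level = max(level, x[j])
            if level >= x[i]:              # nothing further can be visible
                break
    return k_out, k_in

def kld(p, q, eps=1e-12):
    p = p.astype(float) + eps
    q = q.astype(float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

x = np.random.default_rng(0).standard_normal(1000)
k_out, k_in = hvg_degrees(x)
edges = np.arange(0.5, 11.5)               # degree bins 1..10
p, _ = np.histogram(k_out, bins=edges)
q, _ = np.histogram(k_in, bins=edges)
print(kld(p, q))                           # ~0 for a time-reversible series
```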
Why didn't Box-Jenkins win (again)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, D.J.; Downing, D.J.
This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.
Time series decomposition methods were applied to meteorological and air quality data and their numerical model estimates. Decomposition techniques express a time series as the sum of a small number of independent modes which hypothetically represent identifiable forcings, thereb...
The MEM of spectral analysis applied to L.O.D.
NASA Astrophysics Data System (ADS)
Fernandez, L. I.; Arias, E. F.
The maximum entropy method (MEM) has been widely applied in polar motion studies, taking advantage of its performance in handling complex time series. The authors used the MEM algorithm to estimate the cross-spectral function in order to compare interannual length-of-day (LOD) time series with Southern Oscillation Index (SOI) and sea surface temperature (SST) series, which are closely related to El Niño-Southern Oscillation (ENSO) events.
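A hedged sketch of a maximum entropy (Burg) spectral estimate, assuming the standard Burg recursion; the model order is an illustrative choice, not the one used by the authors:

```python
import numpy as np

def burg(x, order):
    """Burg recursion: AR coefficients minimizing forward+backward error."""
    x = np.asarray(x, dtype=float)
    f, b = x[1:].copy(), x[:-1].copy()          # forward / backward errors
    a = np.array([1.0])
    e = float(np.mean(x ** 2))                  # prediction error power
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        e *= 1.0 - k ** 2
        f, b = f[1:] + k * b[1:], b[:-1] + k * f[:-1]
    return a, e

def mem_psd(a, e, freqs):
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a))))
    return e / np.abs(z @ a) ** 2

t = np.arange(1024)
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * np.random.default_rng(0).standard_normal(1024)
a, e = burg(x, order=20)
freqs = np.linspace(0.0, 0.5, 256)
print(freqs[np.argmax(mem_psd(a, e, freqs))])   # peak near 0.1 cycles/sample
```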
Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin
NASA Astrophysics Data System (ADS)
Zhang, L.
2011-12-01
Copulas have become a powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas (e.g., the Gumbel-Hougaard, Cook-Johnson and Frank copulas) and meta-elliptical copulas (e.g., the Gaussian and Student-t copulas) have been applied in multivariate hydrological analyses, e.g., multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals, with the time series assumed to consist of independent, identically distributed (i.i.d.) random variables. In reality, however, hydrological time series, especially daily and monthly series, cannot be considered i.i.d. random variables because of the periodicity in the data structure. The stationarity assumption is also in question owing to climate change and land use and land cover (LULC) change in past years. It is therefore necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. Likewise, for the study of the dependence structure of hydrological time series, the assumption of a common type of univariate distribution needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series are studied through a nonstationary time series analysis approach, and the dependence structure of the multivariate monthly hydrological time series is studied through copula theory. For parameter estimation, maximum likelihood estimation (MLE) is applied. To illustrate the method, the univariate time series model and the dependence structure are determined and tested using the monthly discharge time series of the Cuyahoga River Basin.
Forecasting Enrollments with Fuzzy Time Series.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series may be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure which, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in the input time series.
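A minimal sketch of the natural visibility graph and the resulting connectivity series (the per-node link counts analyzed above); the O(n^2) construction favors clarity over speed:

```python
import numpy as np

def visibility_degree_series(x):
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            # i and j see each other if every intermediate point lies below
            # the straight line connecting (i, x[i]) and (j, x[j]).
            m = np.arange(i + 1, j)
            line = x[i] + (x[j] - x[i]) * (m - i) / (j - i)
            if np.all(x[m] < line):
                deg[i] += 1
                deg[j] += 1
    return deg

x = np.random.default_rng(0).standard_normal(300)
k = visibility_degree_series(x)     # the "connectivity time series"
print(k[:10])
```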
Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals
NASA Astrophysics Data System (ADS)
Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.
2018-02-01
Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same direction. So far, it has been established for variance- and runs-based descriptors of RR interval time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to a set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both the global and local versions of this method; in this study global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features both in magnitude, with α+ < α-, where α+ is related to heart rate decelerations and α- to heart rate accelerations, and in the proportion of the signal in which the above inequality holds. A very similar effect is observed if asymmetric noise is added to a symmetric self-affine function. No such phenomena are observed in the same physiological data after shuffling, or in a group of symmetric synthetic time series.
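For orientation, a minimal standard-DFA sketch; the asymmetric variant described above additionally separates segments by the sign of the local trend and fits α+ and α- on the two subsets (not shown):

```python
import numpy as np

def dfa(x, scales=(4, 8, 16, 32, 64)):
    y = np.cumsum(x - np.mean(x))               # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        f2 = []
        for w in range(n_win):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local detrending
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rr = np.random.default_rng(0).standard_normal(3000)
print(dfa(rr))   # ~0.5 for white noise
```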
NASA Astrophysics Data System (ADS)
Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi
2012-10-01
In this Letter, we propose a framework to transform a complex network into a time series. The transformation from complex networks to time series is realized by classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed into periodic time series, small-world networks into noisy periodic time series, and random networks into random time series. We also show that these relationships hold analytically, using circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
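A hedged sketch of one plausible reading of the transformation: shortest-path distances are embedded with classical multidimensional scaling and the leading coordinate, read in node order, is taken as the time series:

```python
import numpy as np

def classical_mds(d, dim=1):
    """d: (n, n) matrix of pairwise graph distances."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ (d.astype(float) ** 2) @ j    # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:dim]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Ring lattice: distance between nodes i and j is min(|i-j|, n-|i-j|).
n = 50
i = np.arange(n)
d = np.abs(i[:, None] - i[None, :])
d = np.minimum(d, n - d)
ts = classical_mds(d, dim=1)[:, 0]               # close to a periodic sequence
print(ts[:5])
```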
Improving GNSS time series for volcano monitoring: application to Canary Islands (Spain)
NASA Astrophysics Data System (ADS)
García-Cañada, Laura; Sevilla, Miguel J.; Pereda de Pablo, Jorge; Domínguez Cerdeña, Itahiza
2017-04-01
The number of permanent GNSS stations has increased significantly in recent years for different geodetic applications, such as volcano monitoring, that require high precision. Coordinate time series are now long enough that different analyses and filters can be applied to improve the GNSS coordinate results. Following this idea, we have processed data from the GNSS permanent stations used by the Spanish Instituto Geográfico Nacional (IGN) for volcano monitoring in the Canary Islands, obtaining time series by the double-difference processing method with Bernese v5.0 for the period 2007-2014. We have identified the characteristics of these time series and obtained models to estimate velocities with greater accuracy and more realistic uncertainties. To improve the results we used two kinds of filters. The first, a spatial filter, was computed from the series of residuals of all stations in the Canary Islands without anomalous behaviour, after removing a linear trend; applying this filter to all sets of coordinates of the permanent stations reduces their dispersion. The second filter accounts for the temporal correlation in the coordinate time series of each station individually. A study of how the estimated velocity evolves with series length demonstrated the need for time series of at least four years. Therefore, for stations with more than four years of data, we calculated the velocity and the characteristic parameters in order to obtain residual time series. This methodology was applied to the GNSS network in El Hierro (Canary Islands) during the 2011-2012 eruption and the subsequent magmatic intrusions (2012-2014). The results show that anomalous behaviour in the coordinates is easier to detect in the new series, making them more useful for detecting crustal deformation in volcano monitoring.
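A hedged sketch of the spatial (common-mode) filter idea described above: detrended residuals of well-behaved stations are stacked, and the epoch-wise mean is subtracted from every station; all names and shapes here are illustrative:

```python
import numpy as np

def detrend(y):
    t = np.arange(len(y))
    return y - np.polyval(np.polyfit(t, y, 1), t)

def spatial_filter(coords):
    """coords: (n_stations, n_epochs) array of daily coordinate residuals."""
    resid = np.array([detrend(c) for c in coords])
    common_mode = resid.mean(axis=0)         # stack over reference stations
    return resid - common_mode

rng = np.random.default_rng(0)
common = np.cumsum(rng.standard_normal(365)) * 0.1   # shared regional signal
stations = common + rng.standard_normal((6, 365))    # signal + station noise
filtered = spatial_filter(stations)
print(stations.std(), filtered.std())                # dispersion is reduced
```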
Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis
NASA Astrophysics Data System (ADS)
Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal
Missing values in time series data are a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is applying the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA, which is robust against outliers. The performance of the new imputation method has been compared with many other established methods. The comparison is done by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA, can provide better imputation in comparison to other methods.
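A minimal sketch of basic (L2-norm) SSA reconstruction, the backbone of the imputation scheme above; L1-SSA replaces the SVD step with an L1-norm decomposition robust to outliers (not shown):

```python
import numpy as np

def ssa_reconstruct(x, window, rank):
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # Hankel matrix
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank]
    # Diagonal (Hankel) averaging back to a series.
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return rec / counts

t = np.linspace(0, 6 * np.pi, 300)
x = np.sin(t) + 0.3 * np.random.default_rng(0).standard_normal(300)
smooth = ssa_reconstruct(x, window=40, rank=2)   # low-rank signal estimate
```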
Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; Nesselroade, John R.
2001-01-01
Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…
Early warning by near-real time disturbance monitoring (Invited)
NASA Astrophysics Data System (ADS)
Verbesselt, J.; Zeileis, A.; Herold, M.
2013-12-01
Near real-time monitoring of ecosystem disturbances is critical for rapidly assessing and addressing impacts on carbon dynamics, biodiversity, and socio-ecological processes. Satellite remote sensing enables cost-effective and accurate monitoring at frequent time steps over large areas. Yet, generic methods to detect disturbances within newly captured satellite images are lacking. We propose a multi-purpose time-series-based disturbance detection approach that identifies and models stable historical variation to enable change detection within newly acquired data. Satellite image time series of vegetation greenness provide a global record of terrestrial vegetation productivity over the past decades. Here, we assess and demonstrate the method by applying it to (1) real-world satellite greenness image time series between February 2000 and July 2011 covering Somalia, to detect drought-related vegetation disturbances, and (2) Landsat image time series, to detect forest disturbances. First, the results illustrate that disturbances are successfully detected in near real-time while being robust to seasonality and noise. Second, major drought-related disturbances corresponding to the most drought-stressed regions of Somalia are detected from mid-2010 onwards. Third, the method can be applied to Landsat image time series having a lower temporal data density. Furthermore, the method can analyze in-situ or satellite time series of biophysical indicators from local to global scale, since it is fast, does not depend on thresholds and does not require time series gap filling. While the data and methods used are appropriate for proof-of-concept development of global-scale disturbance monitoring, specific applications (e.g., drought or deforestation monitoring) mandate integration within an operational monitoring framework. The real-time monitoring method is implemented in an open-source environment and is freely available in the BFAST package for R. Information illustrating how to apply the method to satellite image time series is available at http://bfast.R-Forge.R-project.org/ and in the example section of the bfastmonitor() function within the BFAST package.
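A much-simplified sketch of the monitoring idea (not the BFAST algorithm itself): fit a harmonic season-trend model on a stable history period and flag new observations that fall outside the model's error bounds:

```python
import numpy as np

def design(t, period=23.0):          # e.g., 23 sixteen-day composites per year
    return np.column_stack([np.ones_like(t), t,
                            np.sin(2 * np.pi * t / period),
                            np.cos(2 * np.pi * t / period)])

rng = np.random.default_rng(0)
t = np.arange(300, dtype=float)
y = 0.5 + 0.2 * np.sin(2 * np.pi * t / 23) + 0.02 * rng.standard_normal(300)
y[250:] -= 0.15                                  # disturbance begins at t=250

history, monitor = t < 200, t >= 200
beta, *_ = np.linalg.lstsq(design(t[history]), y[history], rcond=None)
resid_sd = np.std(y[history] - design(t[history]) @ beta)

pred = design(t[monitor]) @ beta
flags = np.abs(y[monitor] - pred) > 3 * resid_sd
print("first flagged epoch:", t[monitor][flags][0] if flags.any() else None)
```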
FATS: Feature Analysis for Time Series
NASA Astrophysics Data System (ADS)
Nun, Isadora; Protopapas, Pavlos; Sim, Brandon; Zhu, Ming; Dave, Rahul; Castro, Nicolas; Pichara, Karim
2017-11-01
FATS facilitates and standardizes feature extraction for time series data; it quickly and efficiently calculates a compilation of many existing light curve features. Users can characterize or analyze an astronomical photometric database, though this library is not necessarily restricted to the astronomical domain and can also be applied to any kind of time series data.
Information and Complexity Measures Applied to Observed and Simulated Soil Moisture Time Series
USDA-ARS?s Scientific Manuscript database
Time series of soil moisture-related parameters provide important insights into the functioning of soil water systems. Analysis of patterns within these time series has been used in several studies. The objective of this work was to compare patterns in observed and simulated soil moisture contents to u...
Time series momentum and contrarian effects in the Chinese stock market
NASA Astrophysics Data System (ADS)
Shi, Huai-Long; Zhou, Wei-Xing
2017-10-01
This paper concentrates on the time series momentum or contrarian effects in the Chinese stock market. We evaluate the performance of the time series momentum strategy applied to major stock indices in mainland China and explore the relation between the performance of time series momentum strategies and some firm-specific characteristics. Our findings indicate that there is a time series momentum effect in the short run and a contrarian effect in the long run in the Chinese stock market. The performances of the time series momentum and contrarian strategies are highly dependent on the look-back and holding periods and firm-specific characteristics.
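A hedged sketch of a time series momentum rule of the kind evaluated above: go long (short) when the past look-back return is positive (negative), then hold; both periods are illustrative:

```python
import numpy as np

def tsmom_returns(prices, lookback=20, hold=5):
    r = np.diff(np.log(prices))
    out = []
    for t in range(lookback, len(r) - hold, hold):
        signal = np.sign(r[t - lookback:t].sum())   # past cumulative return
        out.append(signal * r[t:t + hold].sum())    # realized strategy return
    return np.array(out)

rng = np.random.default_rng(0)
prices = np.exp(np.cumsum(0.01 * rng.standard_normal(2000)))
strat = tsmom_returns(prices)
print(strat.mean(), strat.std())
```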
Testing for intracycle determinism in pseudoperiodic time series.
Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A
2008-06-01
A determinism test is proposed based on the well-known method of surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in pseudoperiodic time series, for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.
Local normalization: Uncovering correlations in non-stationary financial time series
NASA Astrophysics Data System (ADS)
Schäfer, Rudi; Guhr, Thomas
2010-09-01
The measurement of correlations between financial time series is of vital importance for risk management. In this paper we address an estimation error that stems from the non-stationarity of the time series. We put forward a method to rid the time series of local trends and variable volatility, while preserving cross-correlations. We test this method in a Monte Carlo simulation, and apply it to empirical data for the S&P 500 stocks.
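A minimal sketch of local normalization as described above: subtract a rolling mean and divide by a rolling standard deviation; the window width is an illustrative choice:

```python
import numpy as np

def local_normalize(x, width=13):
    half = width // 2
    out = np.empty_like(x, dtype=float)
    for t in range(len(x)):
        w = x[max(0, t - half):t + half + 1]
        out[t] = (x[t] - w.mean()) / (w.std() + 1e-12)
    return out

rng = np.random.default_rng(0)
r = 0.02 * rng.standard_normal(1000) * np.linspace(0.5, 3, 1000)  # volatility ramp
norm = local_normalize(r)            # local trends and volatility removed
print(norm.std())
```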
Smoothing of climate time series revisited
NASA Astrophysics Data System (ADS)
Mann, Michael E.
2008-08-01
We present an easily implemented method for smoothing climate time series, generalizing upon an approach previously described by Mann (2004). The method adaptively weights the three lowest order time series boundary constraints to optimize the fit with the raw time series. We apply the method to the instrumental global mean temperature series from 1850-2007 and to various surrogate global mean temperature series from 1850-2100 derived from the CMIP3 multimodel intercomparison project. These applications demonstrate that the adaptive method systematically out-performs certain widely used default smoothing methods, and is more likely to yield accurate assessments of long-term warming trends.
The short time Fourier transform and local signals
NASA Astrophysics Data System (ADS)
Okumura, Shuhei
In this thesis, I examine the theoretical properties of the short time discrete Fourier transform (STFT). The STFT is obtained by applying the Fourier transform over a fixed-size moving window to the input series. We move the window by one time point at a time, so the windows overlap. I present several theoretical properties of the STFT, applied to various types of complex-valued, univariate time series inputs, and give their outputs in closed form. In particular, just like the discrete Fourier transform, the STFT's modulus time series takes large positive values when the input is a periodic signal. One main point is that a white noise input yields an STFT output that is a complex-valued stationary time series whose time and time-frequency dependence structure, such as the cross-covariance functions, can be derived. Our primary focus is the detection of local periodic signals. I present a method that detects local signals by computing the probability that the squared-modulus STFT time series has consecutive large values exceeding some threshold immediately after an observation below the threshold. We discuss a method to reduce the computation of such probabilities using the Box-Cox transformation and the delta method, and show that it works well in comparison to the Monte Carlo simulation method.
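A hedged sketch of the detection idea using an off-the-shelf STFT: compute the squared modulus on a sliding window and look for runs of consecutive threshold exceedances; threshold and run length are illustrative:

```python
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(0)
x = rng.standard_normal(4000)
x[1500:1900] += np.sin(2 * np.pi * 0.1 * np.arange(400))  # local periodic signal

f, t, z = stft(x, fs=1.0, nperseg=128, noverlap=127)      # window moves 1 point
power = np.abs(z) ** 2
band = np.argmin(np.abs(f - 0.1))                         # frequency of interest
thresh = 5 * np.median(power[band])
above = power[band] > thresh
run = np.convolve(above.astype(int), np.ones(10, int), mode="same")
print("local signal near t =", t[np.argmax(run)])
```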
Analysis of Time-Series Quasi-Experiments. Final Report.
ERIC Educational Resources Information Center
Glass, Gene V.; Maguire, Thomas O.
The objective of this project was to investigate the adequacy of statistical models developed by G. E. P. Box and G. C. Tiao for the analysis of time-series quasi-experiments: (1) The basic model developed by Box and Tiao is applied to actual time-series experiment data from two separate experiments, one in psychology and one in educational…
A Computer Evolution in Teaching Undergraduate Time Series
ERIC Educational Resources Information Center
Hodgess, Erin M.
2004-01-01
In teaching undergraduate time series courses, we have used a mixture of various statistical packages. We have finally been able to teach all of the applied concepts within one statistical package; R. This article describes the process that we use to conduct a thorough analysis of a time series. An example with a data set is provided. We compare…
The detection of local irreversibility in time series based on segmentation
NASA Astrophysics Data System (ADS)
Teng, Yue; Shang, Pengjian
2018-06-01
We propose a strategy for the detection of local irreversibility in stationary time series based on multiple scales. The detection is useful for evaluating the displacement of irreversibility toward local skewness. By means of this method, we can effectively examine the local irreversible fluctuations of time series as the scale changes. The method was applied to simulated nonlinear signals generated by the ARFIMA process and the logistic map, to show how the irreversibility functions react to increasing scale, and also to financial market series, i.e., American, Chinese and European markets. The local irreversibility of the different markets demonstrates distinct characteristics. Simulations and real data support the need to explore local irreversibility.
Modeling seasonal variation of hip fracture in Montreal, Canada.
Modarres, Reza; Ouarda, Taha B M J; Vanasse, Alain; Orzanco, Maria Gabriela; Gosselin, Pierre
2012-04-01
The investigation of the association of climate variables with hip fracture incidence is important in public health. This study examined and modeled the seasonal variation of a monthly population-based hip fracture rate (HFr) time series. The seasonal ARIMA time series modeling approach was used to model monthly HFr time series of female and male patients aged 40-74 and 75+ in Montreal, Québec, Canada, for the period 1993-2004. The correlation coefficients between meteorological variables, such as temperature, snow depth, rainfall depth and day length, and HFr are significant. The nonparametric Mann-Kendall test for trend assessment and the nonparametric Levene's and Wilcoxon's tests for checking the difference in HFr before and after a change point were also used. The seasonality in HFr showed a sharp difference between winter and summer. The trend assessment showed decreasing trends in the HFr of female and male groups, and the nonparametric test indicated a significant change in mean HFr. A seasonal ARIMA (SARIMA) model was applied to HFr time series without trend, and a time-trend ARIMA model (TT-ARIMA) was developed and fitted to HFr time series with a significant trend. Multi-criteria evaluation showed the adequacy of the SARIMA and TT-ARIMA models for modeling seasonal hip fracture time series with and without significant trend. In the time series analysis of HFr in the Montreal region, the effects of the seasonal variation of climate variables on hip fracture are clear. The seasonal ARIMA model is useful for modeling HFr time series without trend; for time series with a significant trend, the TT-ARIMA model should be applied.
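A hedged sketch of fitting a seasonal ARIMA to a monthly rate series with statsmodels; the (p,d,q)(P,D,Q,s) orders below are illustrative, not those identified in the study:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
months = np.arange(144)
hfr = 10 + 3 * np.cos(2 * np.pi * months / 12) + rng.standard_normal(144)

model = SARIMAX(hfr, order=(1, 0, 0), seasonal_order=(1, 1, 1, 12))
res = model.fit(disp=False)
print(res.summary().tables[1])       # estimated AR / seasonal coefficients
forecast = res.forecast(steps=12)    # one-year-ahead monthly forecast
```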
Time Series Discord Detection in Medical Data using a Parallel Relational Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodbridge, Diane; Rintoul, Mark Daniel; Wilson, Andrew T.
Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Since data collected from high-frequency medical sensors include a huge amount of data, storing and processing continuous medical data is an emerging big data area. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference from the rest of the time series subsequences, meaning that it has abnormal or unusual data trends. In this study, we implemented two versions of time series discord detection algorithms on a high-performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates a distance to the nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data for better time efficiency. The study results showed efficient data loading, decoding and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.
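A minimal sketch of the brute-force discord search described above: the discord is the subsequence whose distance to its nearest non-overlapping (non-self) match is largest; the trie-based heuristic prunes this O(n^2) scan:

```python
import numpy as np

def find_discord(x, m):
    n = len(x) - m + 1
    subs = np.array([x[i:i + m] for i in range(n)])
    best_dist, best_idx = -np.inf, -1
    for i in range(n):
        d = np.sqrt(((subs - subs[i]) ** 2).sum(axis=1))
        d[max(0, i - m + 1):i + m] = np.inf   # exclude overlapping self-matches
        nearest = d.min()
        if nearest > best_dist:               # farthest nearest-neighbor wins
            best_dist, best_idx = nearest, i
    return best_idx, best_dist

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 60 * np.pi, 3000)) + 0.1 * rng.standard_normal(3000)
x[1200:1260] += 1.5                           # implant an anomaly
print(find_discord(x, m=60))                  # discord located near index 1200
```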
Information retrieval for nonstationary data records
NASA Technical Reports Server (NTRS)
Su, M. Y.
1971-01-01
A review and critical discussion of the existing methods for the analysis of nonstationary time series are presented, and a new algorithm for splitting nonstationary time series is applied to the analysis of sunspot data.
Time series models on analysing mortality rates and acute childhood lymphoid leukaemia.
Kis, Maria
2005-01-01
In this paper we demonstrate the application of time series models in medical research. Hungarian mortality rates were analysed with autoregressive integrated moving average (ARIMA) models, and seasonal time series models were used to examine data on acute childhood lymphoid leukaemia. Mortality data may be analysed by time series methods such as ARIMA modelling. This is demonstrated by two examples: analysis of the mortality rates of ischemic heart diseases and analysis of the mortality rates of cancers of the digestive system. Mathematical expressions are given for the results of the analysis. The relationships between time series of mortality rates were studied with ARIMA models. Confidence intervals for the autoregressive parameters were calculated by three methods: estimation based on the standard normal distribution, estimation based on White's theory, and the continuous-time estimation. Analysing the confidence intervals of the first-order autoregressive parameters, we may conclude that the intervals obtained with the continuous-time estimation model were much smaller than those of the other estimations. We also present a new approach to analysing the occurrence of acute childhood lymphoid leukaemia by decomposing the time series into components. The periodicity of acute childhood lymphoid leukaemia in Hungary was examined using the seasonal decomposition time series method. The cyclic trend of the dates of diagnosis revealed that a higher percentage of the peaks fell within the winter months than in the other seasons, indicating the seasonal occurrence of childhood leukaemia in Hungary.
Grootswagers, Tijl; Wardle, Susan G; Carlson, Thomas A
2017-04-01
Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analyzing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces, these methods have only recently been applied to time series neuroimaging data such as MEG and EEG to address experimental questions in cognitive neuroscience. In a tutorial style review, we describe a broad set of options to inform future time series decoding studies from a cognitive neuroscience perspective. Using example MEG data, we illustrate the effects that different options in the decoding analysis pipeline can have on experimental results where the aim is to "decode" different perceptual stimuli or cognitive states over time from dynamic brain activation patterns. We show that decisions made at both preprocessing (e.g., dimensionality reduction, subsampling, trial averaging) and decoding (e.g., classifier selection, cross-validation design) stages of the analysis can significantly affect the results. In addition to standard decoding, we describe extensions to MVPA for time-varying neuroimaging data including representational similarity analysis, temporal generalization, and the interpretation of classifier weight maps. Finally, we outline important caveats in the design and interpretation of time series decoding experiments.
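A hedged sketch of time-resolved decoding: at each time point a classifier is cross-validated on the spatial pattern across sensors, giving decoding accuracy as a function of time; data shapes and the classifier are illustrative:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 100, 64, 120
X = rng.standard_normal((n_trials, n_sensors, n_times))
y = np.repeat([0, 1], n_trials // 2)
X[y == 1, :10, 60:] += 0.5          # condition difference emerges at t=60

acc = np.array([
    cross_val_score(LinearDiscriminantAnalysis(), X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
print("peak decoding at t =", acc.argmax())
```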
Phase unwrapping in three dimensions with application to InSAR time series.
Hooper, Andrew; Zebker, Howard A
2007-09-01
The problem of phase unwrapping in two dimensions has been studied extensively in the past two decades, but the three-dimensional (3D) problem has so far received relatively little attention. We develop here a theoretical framework for 3D phase unwrapping and also describe two algorithms for implementation, both of which can be applied to synthetic aperture radar interferometry (InSAR) time series. We test the algorithms on simulated data and find both give more accurate results than a two-dimensional algorithm. When applied to actual InSAR time series, we find good agreement both between the algorithms and with ground truth.
Real-time Series Resistance Monitoring in PV Systems; NREL (National Renewable Energy Laboratory)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deceglie, M. G.; Silverman, T. J.; Marion, B.
We apply the physical principles of a familiar method, suns-Voc, to a new application: the real-time detection of series resistance changes in modules and systems operating outside. The real-time series resistance (RTSR) method that we describe avoids the need for collecting IV curves or constructing full series-resistance-free IV curves. RTSR is most readily deployable at the module level on micro-inverters or module-integrated electronics, but it can also be extended to full strings. Automated detection of series resistance increases can provide early warnings of some of the most common reliability issues, which also pose fire risks, including broken ribbons, broken solder bonds, and contact problems in the junction or combiner box. We describe the method in detail and describe a sample application to data collected from modules operating in the field.
Phase walk analysis of leptokurtic time series.
Schreiber, Korbinian; Modest, Heike I; Räth, Christoph
2018-06-01
Fourier phase information plays a key role in the quantitative description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are here applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. Hereby, we investigate the properties of leptokurtic time series and their influence on the Fourier phases. The phase walk analysis is applied to measured and simulated intermittent time series whose probability density distributions are approximated by power laws: the day-to-day returns of the Dow Jones industrial average, a synthetic time series with tailored nonlinearities mimicking the power-law behavior of the Dow Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Due to the drastically decreased computing time compared to embedding-space methods, the number of surrogate realizations can be increased by orders of magnitude; thereby, the probability distribution of the test statistic can be derived very accurately and parameterized, which allows for much more precise tests of nonlinearity.
Empirical method to measure stochasticity and multifractality in nonlinear time series
NASA Astrophysics Data System (ADS)
Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping
2013-12-01
An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, the emergent markets are far from efficient. Differences about the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.
Boolean network inference from time series data incorporating prior biological knowledge.
Haider, Saad; Pal, Ranadip
2012-01-01
Numerous approaches exist for modeling genetic regulatory networks (GRNs), but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data based on prior biological knowledge of connectivity, constraints on attractor structure, and robust design. We applied our inference approach to 6-time-point transcriptomic data on a Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF) and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs and BNs generated through the proposed approach constructed from transitions of various paths of the synthetic BNs. We have also compared the performance of our algorithm with two existing BN inference algorithms. Through theoretical analysis and simulations, we showed how rarely a BN with plausible biological structure is reached from limited time series data when connectivity is random and the data lack structure. When applied to experimental data and data generated from synthetic BNs, the framework was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of our proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when prior biological knowledge on regulators is limited or not unique.
NASA Astrophysics Data System (ADS)
Cristescu, Constantin P.; Stan, Cristina; Scarlat, Eugen I.; Minea, Teofil; Cristescu, Cristina M.
2012-04-01
We present a novel method for the parameter oriented analysis of mutual correlation between independent time series or between equivalent structures such as ordered data sets. The proposed method is based on the sliding window technique, defines a new type of correlation measure and can be applied to time series from all domains of science and technology, experimental or simulated. A specific parameter that can characterize the time series is computed for each window and a cross correlation analysis is carried out on the set of values obtained for the time series under investigation. We apply this method to the study of some currency daily exchange rates from the point of view of the Hurst exponent and the intermittency parameter. Interesting correlation relationships are revealed and a tentative crisis prediction is presented.
Identifying hidden common causes from bivariate time series: a method using recurrence plots.
Hirata, Yoshito; Aihara, Kazuyuki
2010-01-01
We propose a method for inferring the existence of hidden common causes from observations of bivariate time series. We detect related time series by excessive simultaneous recurrences in the corresponding recurrence plots. We also use the non-coverage of one recurrence plot by the other to rule out the existence of a directional coupling. We apply the proposed method to real wind data.
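A minimal recurrence plot sketch for the test described above: simultaneous recurrences of two series are compared against what each series shows alone; a scalar (non-embedded) version for brevity:

```python
import numpy as np

def recurrence_plot(x, eps):
    # Scalar series for brevity; delay-embedded state vectors are typical.
    d = np.abs(x[:, None] - x[None, :])
    return (d <= eps).astype(int)

rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
x = np.sin(t) + 0.1 * rng.standard_normal(400)   # both driven by a hidden cause
y = np.sin(t) + 0.1 * rng.standard_normal(400)
rx, ry = recurrence_plot(x, 0.2), recurrence_plot(y, 0.2)
joint = (rx * ry).mean()
print(joint, rx.mean() * ry.mean())  # joint >> product hints at a common cause
```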
The application of complex network time series analysis in turbulent heated jets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.
2014-06-15
In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., separating time series from regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase-space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
A novel weight determination method for time series data aggregation
NASA Astrophysics Data System (ADS)
Xu, Paiheng; Zhang, Rong; Deng, Yong
2017-09-01
Aggregation in time series is of great importance in time series smoothing, prediction, and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
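A hedged sketch of the combination described above: exponential-decay weights stand in for the IOWA component (the exact inducing order and decay factor are assumptions), visibility-graph degree weights stand in for VGA, and a mixing coefficient lam combines them linearly.

```python
import numpy as np

def visibility_degrees(x):
    """Node degrees of the natural visibility graph of x (O(n^2) check)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    deg = np.zeros(n)
    for i in range(n):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            line = x[j] + (x[i] - x[j]) * (j - ks) / (j - i)
            if np.all(x[ks] < line):
                deg[i] += 1
                deg[j] += 1
    return deg

def combined_weights(x, lam=0.5, decay=0.9):
    """lam * IOWA-like time-decay weights + (1 - lam) * VGA degree weights."""
    n = len(x)
    w_iowa = decay ** np.arange(n - 1, -1, -1)   # recent points weigh more (assumed form)
    w_iowa /= w_iowa.sum()
    deg = visibility_degrees(x)
    w_vga = deg / deg.sum()
    return lam * w_iowa + (1 - lam) * w_vga

x = np.array([102.3, 101.8, 103.1, 104.0, 103.6, 105.2])   # toy index values
print("aggregated value:", float(np.dot(combined_weights(x), x)))
```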
Estimation of Parameters from Discrete Random Nonstationary Time Series
NASA Astrophysics Data System (ADS)
Takayasu, H.; Nakamura, T.
For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small numbers that are out of the applicability range of the normal distribution. The method is demonstrated for numerical data generated by a known system, and applied to time series of traffic accidents, batting average of a baseball player and sales volume of home electronics.
A Study on Predictive Analytics Application to Ship Machinery Maintenance
2013-09-01
Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be
Non-parametric characterization of long-term rainfall time series
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Pandey, Brij Kishor
2018-03-01
The statistical study of rainfall time series is one of the approaches for efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological forecasting. In the present study, eventual statistics was applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series. The observed trend using the above-mentioned approach was ascertained using the innovative trend analysis method. Innovative trend analysis has been found to be a strong tool to detect the general trend of rainfall time series. The sequential Mann-Kendall test has also been carried out to examine nonlinear trends of the series. The partial sum of cumulative deviation test is also found to be suitable to detect the nonlinear trend. Innovative trend analysis, the sequential Mann-Kendall test, and the partial cumulative deviation test have the potential to detect the general as well as the nonlinear trend of rainfall time series. Annual rainfall analysis suggests that the maximum rise in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous India region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we have performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis has been applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones has a higher departure from homogeneity, and singular spectrum analysis shows results in coherence with the same.
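A minimal sketch of the Mann-Kendall trend test used above (two-sided normal approximation, no tie correction); the synthetic rainfall series is an assumption.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall S statistic and two-sided p-value (no tie correction)."""
    x = np.asarray(x)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0   # continuity correction
    return s, 2 * (1 - norm.cdf(abs(z)))

rng = np.random.default_rng(42)
rain = 800 + 0.5 * np.arange(100) + 50 * rng.normal(size=100)  # weak upward trend
s, p = mann_kendall(rain)
print(f"S = {s}, p = {p:.3f}")
```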
NASA Astrophysics Data System (ADS)
Berx, Barbara; Payne, Mark R.
2017-04-01
Scientific interest in the sub-polar gyre of the North Atlantic Ocean has increased in recent years. The sub-polar gyre has contracted and weakened, and changes in circulation pathways have been linked to changes in marine ecosystem productivity. To aid fisheries and environmental scientists, we present here a time series of the Sub-Polar Gyre Index (SPG-I) based on monthly mean maps of sea surface height. The established definition of the SPG-I is applied, and the first EOF (empirical orthogonal function) and PC (principal component) are presented. Sensitivity to the spatial domain and time series length is explored, but neither is found to be an important factor for the SPG-I's interpretation. Our time series compares well with previously presented indices. The SPG-I time series is freely available online (http://dx.doi.org/10.7489/1806-1), and we invite the community to access, apply, and publish studies using this index time series.
A comparison of the stochastic and machine learning approaches in hydrologic time series forecasting
NASA Astrophysics Data System (ADS)
Kim, T.; Joo, K.; Seo, J.; Heo, J. H.
2016-12-01
Hydrologic time series forecasting is an essential task in water resources management, and it becomes more difficult due to the complexity of the runoff process. Traditional stochastic models such as the ARIMA family have been used as a standard approach in time series modeling and forecasting of hydrological variables. Due to the nonlinearity in hydrologic time series data, machine learning approaches have been studied for their advantage of discovering relevant features of a nonlinear relation among variables. This study aims to compare the predictability of the traditional stochastic model and the machine learning approach. A seasonal ARIMA model was used as the traditional time series model, and a Random Forest model, which combines decision trees in an ensemble using a multiple-predictor approach, was applied as the machine learning approach. In the application, monthly inflow data from 1986 to 2015 of Chungju dam in South Korea were used for modeling and forecasting. In order to evaluate the performance of the models, one-step-ahead and multi-step-ahead forecasting were applied. The root mean squared error and mean absolute error of the two models were compared.
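A compact sketch of the comparison described above, under assumptions: a seasonal ARIMA fit via statsmodels against a Random Forest trained on lagged values, scored by RMSE and MAE on a 24-month hold-out. The model orders, lag count, and the synthetic monthly series are illustrative, not the paper's configuration.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
t = np.arange(360)                                  # 30 years of monthly data
y = 100 + 40 * np.sin(2 * np.pi * t / 12) + 10 * rng.normal(size=t.size)
train, test = y[:-24], y[-24:]

# Stochastic model: SARIMA(1,0,1)(1,0,1)_12 (orders assumed for illustration).
sarima = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)).fit(disp=False)
f_sarima = sarima.forecast(steps=24)

# Machine learning model: Random Forest on the previous 12 months as predictors.
lags = 12
X = np.array([train[i - lags:i] for i in range(lags, len(train))])
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, train[lags:])
hist, f_rf = list(train), []
for _ in range(24):                                 # recursive multi-step forecast
    f_rf.append(rf.predict(np.array(hist[-lags:])[None, :])[0])
    hist.append(f_rf[-1])

for name, f in [("SARIMA", f_sarima), ("RF", np.array(f_rf))]:
    print(name, "RMSE", np.sqrt(mean_squared_error(test, f)),
          "MAE", mean_absolute_error(test, f))
```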
NASA Astrophysics Data System (ADS)
Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris
2018-02-01
We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
Volatility of linear and nonlinear time series
NASA Astrophysics Data System (ADS)
Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo
2005-07-01
Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, u_i, can be detected and quantified by studying the correlations in the magnitude series |u_i|, the "volatility." However, the origin of this empirical observation still remains unclear and the exact relation between the correlations in u_i and the correlations in |u_i| is still unknown. Here we develop analytical relations between the scaling exponent of a linear series u_i and that of its magnitude series |u_i|. Moreover, we find that nonlinear time series exhibit stronger (or equal) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long-range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (that represents the magnitude series) with an uncorrelated time series [that represents the sign series sgn(u_i)]. We apply our techniques to daily deep ocean temperature records from the equatorial Pacific, the region of the El-Niño phenomenon, and find: (i) long-range correlations from several days to several years with 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
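A minimal sketch of the generation scheme described above: a long-range correlated magnitude series, built here by Fourier filtering of white noise (an assumed standard construction), multiplied by an uncorrelated random sign series.

```python
import numpy as np

rng = np.random.default_rng(7)
n, beta = 4096, 0.8        # series length and assumed spectral exponent (S ~ f^-beta)

# Long-range correlated Gaussian series via Fourier filtering of white noise.
freqs = np.fft.rfftfreq(n)
spectrum = np.fft.rfft(rng.normal(size=n))
spectrum[1:] *= freqs[1:] ** (-beta / 2)       # shape the power spectrum
correlated = np.fft.irfft(spectrum, n)

magnitude = np.abs(correlated)                  # long-range correlated magnitudes
signs = rng.choice([-1.0, 1.0], size=n)         # uncorrelated sign series
series = signs * magnitude                      # nonlinear (multifractal-type) series

print("lag-1 autocorrelation of |series|:",
      np.corrcoef(magnitude[:-1], magnitude[1:])[0, 1])
```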
Wu, Zi Yi; Xie, Ping; Sang, Yan Fang; Gu, Hai Ting
2018-04-01
The phenomenon of jump is one of the important external forms of hydrological variability under environmental changes, representing the adaptation of hydrological nonlinear systems to the influence of external disturbances. Presently, related studies mainly focus on methods for identifying the jump positions and jump times in hydrological time series. In contrast, few studies have focused on the quantitative description and classification of jump degree in hydrological time series, which makes it difficult to understand environmental changes and evaluate their potential impacts. Here, we propose a theoretically reliable and easy-to-apply method for the classification of jump degree in hydrological time series, using the correlation coefficient as a basic index. Statistical tests verified the accuracy, reasonability, and applicability of this method. The relationship between the correlation coefficient and the jump degree of a series was described by a mathematical equation obtained by derivation. After that, several thresholds of correlation coefficients under different statistical significance levels were chosen, based on which the jump degree could be classified into five levels: none, weak, moderate, strong and very strong. Finally, our method was applied to five different observed hydrological time series with diverse geographic and hydrological conditions in China. The classification results of jump degrees in those series closely accorded with their physical hydrological mechanisms, indicating the practicability of our method.
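A hedged sketch of the idea: correlate the series with a unit step at the detected jump point and bin the absolute correlation coefficient into five levels. The threshold values here are placeholders; the paper derives its thresholds from statistical significance levels.

```python
import numpy as np

def jump_degree(series, jump_index, thresholds=(0.2, 0.4, 0.6, 0.8)):
    """Correlate the series with a unit step at jump_index and bin the result.

    The thresholds are placeholders; the paper derives them from statistical
    significance levels rather than fixing them a priori.
    """
    step = (np.arange(len(series)) >= jump_index).astype(float)
    r = abs(np.corrcoef(series, step)[0, 1])
    labels = ["no", "weak", "moderate", "strong", "very strong"]
    return r, labels[int(np.searchsorted(thresholds, r))]

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(10, 1, 60), rng.normal(13, 1, 60)])  # jump at t = 60
print(jump_degree(x, 60))
```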
A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data
NASA Astrophysics Data System (ADS)
Awajan, Ahmad Mohd; Ismail, Mohd Tahir
2017-08-01
Recently, forecasting time series has attracted considerable attention in the field of analyzing financial time series data, specifically within the stock market index. Moreover, stock market forecasting is a challenging area of financial time series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winter method (EMD-HW) is used to improve forecasting performance on financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data of 11 countries are used to show the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the EMD-HW forecasting performance is superior to the traditional Holt-Winter forecasting method.
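A hedged sketch of the hybrid: decompose with EMD, forecast each intrinsic mode function with Holt-Winters exponential smoothing, and sum the component forecasts. The PyEMD package (pip install EMD-signal) and the statsmodels implementation are assumed dependencies, and the additive-trend configuration is illustrative rather than the paper's exact setup.

```python
import numpy as np
from PyEMD import EMD                                # assumed dependency: EMD-signal
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def emd_hw_forecast(y, horizon):
    """Forecast y by summing Holt-Winters forecasts of its EMD components."""
    imfs = EMD().emd(np.asarray(y, dtype=float))     # IMFs plus residual trend
    total = np.zeros(horizon)
    for comp in imfs:
        # Additive-trend Holt-Winters per component (seasonality omitted here).
        fit = ExponentialSmoothing(comp, trend="add").fit()
        total += fit.forecast(horizon)
    return total

rng = np.random.default_rng(1)
price = np.cumsum(rng.normal(0.1, 1.0, 500)) + 100   # toy random-walk "index"
print(emd_hw_forecast(price, horizon=5))
```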
Fulcher, Ben D; Jones, Nick S
2017-11-22
Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Application of computational mechanics to the analysis of natural data: an example in geomagnetism.
Clarke, Richard W; Freeman, Mervyn P; Watkins, Nicholas W
2003-01-01
We discuss how the ideal formalism of computational mechanics can be adapted to apply to a noninfinite series of corrupted and correlated data, as is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the concept of effective soficity is introduced. We believe that computational mechanics cannot be applied to a noisy and finite data series without invoking an argument based upon effective soficity. A related distinction between noise and unresolved structure is also defined: noise can only be eliminated by increasing the length of the time series, whereas the resolution of previously unresolved structure only requires the finite memory of the analysis to be increased. The benefits of these concepts are demonstrated in simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to an analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected, interpreted as the diurnal variation due to the rotation of the Earth-based station under an electrical current pattern that is fixed with respect to the Sun-Earth axis, and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, some useful terminology for the discussion of model construction in general is introduced.
A geodetic matched filter search for slow slip with application to the Mexico subduction zone
NASA Astrophysics Data System (ADS)
Rousset, B.; Campillo, M.; Lasserre, C.; Frank, W. B.; Cotte, N.; Walpersdorf, A.; Socquet, A.; Kostoglodov, V.
2017-12-01
Since the discovery of slow slip events, many methods have been successfully applied to model obvious transient events in geodetic time series, such as the widely used network strain filter. Independent seismological observations of tremors or low-frequency earthquakes and repeating earthquakes provide evidence of low-amplitude slow deformation but do not always coincide with clear occurrences of transient signals in geodetic time series. Here we aim to extract the signal corresponding to slow slips hidden in the noise of GPS time series, without using information from independent data sets. We first build a library of synthetic slow slip event templates by assembling a source function with Green's functions for a discretized fault. We then correlate the templates with postprocessed GPS time series. Once the events have been detected in time, we estimate their duration T and magnitude Mw by modeling a weighted stack of GPS time series. An analysis of synthetic time series shows that this method is able to resolve the correct timing, location, T, and Mw of events larger than Mw 6 in the context of the Mexico subduction zone. Applied on a real data set of 29 GPS time series in the Guerrero area from 2005 to 2014, this technique allows us to detect 28 transient events from Mw 6.3 to 7.2 with durations that range from 3 to 39 days. These events have a dominant recurrence time of 40 days and are mainly located at the downdip edges of the Mw>7.5 slow slip events.
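A minimal sketch of the detection step described above: a synthetic transient template slid along a noisy displacement series via normalized cross-correlation. The arctangent-like template shape and the noise model are illustrative assumptions, not the paper's template library or noise treatment.

```python
import numpy as np

def matched_filter(signal, template):
    """Normalized cross-correlation of a template slid along a signal."""
    m = len(template)
    t = (template - template.mean()) / template.std()
    out = np.empty(len(signal) - m + 1)
    for i in range(out.size):
        w = signal[i:i + m]
        out[i] = np.dot((w - w.mean()) / (w.std() + 1e-12), t) / m
    return out

rng = np.random.default_rng(5)
n, m = 2000, 120
template = np.arctan(np.linspace(-4, 4, m))          # slow-slip-like displacement ramp
gps = rng.normal(0, 1.0, n)
gps[800:800 + m] += 2.0 * template                   # hidden transient event
cc = matched_filter(gps, template)
print("best match at sample", int(np.argmax(cc)), "cc =", round(cc.max(), 2))
```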
Time lagged ordinal partition networks for capturing dynamics of continuous dynamical systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCullough, Michael; Iu, Herbert Ho-Ching; Small, Michael
2015-05-15
We investigate a generalised version of the recently proposed ordinal partition time series to network transformation algorithm. First, we introduce a fixed time lag for the elements of each partition that is selected using techniques from traditional time delay embedding. The resulting partitions define regions in the embedding phase space that are mapped to nodes in the network space. Edges are allocated between nodes based on temporal succession, thus creating a Markov chain representation of the time series. We then apply this new transformation algorithm to time series generated by the Rössler system and find that periodic dynamics translate to ring structures whereas chaotic time series translate to band or tube-like structures, thereby indicating that our algorithm generates networks whose structure is sensitive to system dynamics. Furthermore, we demonstrate that simple network measures, including the mean out degree and variance of out degrees, can track changes in the dynamical behaviour in a manner comparable to the largest Lyapunov exponent. We also apply the same analysis to experimental time series generated by a diode resonator circuit and show that the network size, mean shortest path length, and network diameter are highly sensitive to the interior crisis captured in this particular data set.
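A compact sketch of the transformation: time-lagged ordinal patterns become nodes, and temporal succession between consecutive patterns becomes a directed edge, giving a Markov-chain-like representation. The embedding dimension, lag, and test signals are assumptions.

```python
import numpy as np
import networkx as nx

def ordinal_partition_network(x, dim=4, lag=5):
    """Directed network whose nodes are time-lagged ordinal patterns of x."""
    patterns = []
    for i in range(len(x) - (dim - 1) * lag):
        window = x[i:i + dim * lag:lag]
        patterns.append(tuple(np.argsort(window)))   # ordinal pattern of the window
    g = nx.DiGraph()
    for a, b in zip(patterns[:-1], patterns[1:]):    # temporal succession -> edge
        g.add_edge(a, b)
    return g

t = np.linspace(0, 100, 5000)
g_periodic = ordinal_partition_network(np.sin(t))
g_noise = ordinal_partition_network(np.random.default_rng(0).normal(size=5000))
for name, g in [("periodic", g_periodic), ("noise", g_noise)]:
    outs = [d for _, d in g.out_degree()]
    print(name, "nodes:", g.number_of_nodes(), "mean out-degree:", np.mean(outs))
```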
Detection of a sudden change of the field time series based on the Lorenz system.
Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change of a field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked, and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked times indicated that the method could detect every sudden change of the Lorenz path; thus the method is effective. Finally, we used the method to detect the sudden change of the pressure field time series and temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between a field time series and a vector time series; thus, we provide a new method for the detection of a sudden change of a field time series.
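A minimal sketch of the sliding t-test used as the detection step: two adjacent windows slide along the scalar series and a two-sample t statistic flags abrupt mean shifts. The window length and significance level are assumptions.

```python
import numpy as np
from scipy.stats import ttest_ind

def sliding_t_test(x, window=30, alpha=0.01):
    """Indices where the means of adjacent windows differ significantly."""
    hits = []
    for i in range(window, len(x) - window):
        left, right = x[i - window:i], x[i:i + window]
        _, p = ttest_ind(left, right, equal_var=False)
        if p < alpha:
            hits.append(i)
    return hits

rng = np.random.default_rng(11)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(2, 1, 200)])  # shift at t = 200
hits = sliding_t_test(x)
print("first/last flagged index:", hits[0], hits[-1])               # cluster near 200
```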
Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology
NASA Technical Reports Server (NTRS)
Forkel, Matthias; Carvalhais, Nuno; Verbesselt, Jan; Mahecha, Miguel D.; Neigh, Christopher S.R.; Reichstein, Markus
2013-01-01
Changing trends in ecosystem productivity can be quantified using satellite observations of Normalized Difference Vegetation Index (NDVI). However, the estimation of trends from NDVI time series differs substantially depending on analyzed satellite dataset, the corresponding spatiotemporal resolution, and the applied statistical method. Here we compare the performance of a wide range of trend estimation methods and demonstrate that performance decreases with increasing inter-annual variability in the NDVI time series. Trend slope estimates based on annual aggregated time series or based on a seasonal-trend model show better performances than methods that remove the seasonal cycle of the time series. A breakpoint detection analysis reveals that an overestimation of breakpoints in NDVI trends can result in wrong or even opposite trend estimates. Based on our results, we give practical recommendations for the application of trend methods on long-term NDVI time series. Particularly, we apply and compare different methods on NDVI time series in Alaska, where both greening and browning trends have been previously observed. Here, the multi-method uncertainty of NDVI trends is quantified through the application of the different trend estimation methods. Our results indicate that greening NDVI trends in Alaska are more spatially and temporally prevalent than browning trends. We also show that detected breakpoints in NDVI trends tend to coincide with large fires. Overall, our analyses demonstrate that seasonal trend methods need to be improved against inter-annual variability to quantify changing trends in ecosystem productivity with higher accuracy.
Time series with tailored nonlinearities
NASA Astrophysics Data System (ADS)
Räth, C.; Laut, I.
2015-10-01
It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
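A minimal sketch of the core operation: impose a simple constraint on the Fourier phases of an originally linear, uncorrelated Gaussian series and transform back, leaving the amplitudes (and hence the linear two-point correlations) untouched. The particular constraint used here, mixing each phase with its neighbor, is an illustrative assumption rather than the paper's specific constraint set.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096
x = rng.normal(size=n)                       # linear, uncorrelated Gaussian series

spec = np.fft.rfft(x)
amp, phase = np.abs(spec), np.angle(spec)

# Constraint: mix each phase with its neighbor, inducing phase correlations
# while leaving the amplitudes (the linear properties) untouched.
mixed = 0.5 * (phase + np.roll(phase, 1))
mixed[0], mixed[-1] = phase[0], phase[-1]    # keep DC and Nyquist terms real

y = np.fft.irfft(amp * np.exp(1j * mixed), n)

# The power spectra agree; the nonlinearity now lives in the phase structure.
print(np.allclose(np.abs(np.fft.rfft(y)), amp))
```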
Estimating phase synchronization in dynamical systems using cellular nonlinear networks
NASA Astrophysics Data System (ADS)
Sowa, Robert; Chernihovskyi, Anton; Mormann, Florian; Lehnertz, Klaus
2005-06-01
We propose a method for estimating phase synchronization between time series using the parallel computing architecture of cellular nonlinear networks (CNNs). Applying this method to time series of coupled nonlinear model systems and to electroencephalographic time series from epilepsy patients, we show that an accurate approximation of the mean phase coherence R, a bivariate measure for phase synchronization, can be achieved with CNNs using polynomial-type templates.
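For reference, a direct (non-CNN) computation of the mean phase coherence R via Hilbert-transform phases; the coupled and uncoupled test signals are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def mean_phase_coherence(x, y):
    """R = |<exp(i*(phi_x - phi_y))>| with instantaneous Hilbert phases."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

rng = np.random.default_rng(4)
t = np.linspace(0, 60, 6000)
x = np.sin(2 * np.pi * 1.0 * t) + 0.2 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * 1.0 * t + 0.8) + 0.2 * rng.normal(size=t.size)   # phase-locked
z = np.sin(2 * np.pi * 1.3 * t) + 0.2 * rng.normal(size=t.size)         # unlocked
print("locked:", mean_phase_coherence(x, y), "unlocked:", mean_phase_coherence(x, z))
```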
Detection of chaos: New approach to atmospheric pollen time-series analysis
NASA Astrophysics Data System (ADS)
Bianchi, M. M.; Arizmendi, C. M.; Sanchez, J. R.
1992-09-01
Pollen and spores are biological particles that are ubiquitous in the atmosphere and are pathologically significant, causing plant diseases and inhalant allergies. One of the main objectives of aerobiological surveys is forecasting. Prediction models are required in order to apply aerobiological knowledge to medical or agricultural practice; a necessary condition for these models is that the dynamics not be chaotic. The existence of chaos is detected through the analysis of a time series. The time series comprises hourly counts of atmospheric pollen grains obtained using a Burkard spore trap from 1987 to 1989 at Mar del Plata. Abraham's method to obtain the correlation dimension was applied. A low and fractal dimension indicates chaotic dynamics. The predictability of models for atmospheric pollen forecasting is discussed.
Tobias, Robert; Inauen, Jennifer
2010-10-01
Gathering time-series data of behaviors and psychological variables is important to understand, guide, and evaluate behavior-change campaigns and other change processes. However, repeated measurement can affect the phenomena investigated, particularly frequent face-to-face interviews, which are often the only option in developing countries. This article presents three intervention control studies to investigate this issue. Daily diaries in Cuba did not affect behavior or attitudes for persons with intervention but reduced attitudes for persons without intervention. Reactivity of face-to-face interviews in Bolivia was negligible if applied weekly, but strong if applied twice per week. The article concludes with recommendations for gathering time-series data in developing countries.
Time-Series Analysis: A Cautionary Tale
NASA Technical Reports Server (NTRS)
Damadeo, Robert
2015-01-01
Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally, including to Poincaré plots with multiple clusters, and more consistently than the conventional measures, and can address questions regarding potential structure underlying the variability of a data set.
Indispensable finite time corrections for Fokker-Planck equations from time series data.
Ragwitz, M; Kantz, H
2001-12-17
The reconstruction of Fokker-Planck equations from observed time series data suffers strongly from finite sampling rates. We show that previously published results are degraded considerably by such effects. We present correction terms which yield a robust estimation of the diffusion terms, together with a novel method for one-dimensional problems. We apply these methods to time series data of local surface wind velocities, where the dependence of the diffusion constant on the state variable shows a different behavior than previously suggested.
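A minimal sketch of the reconstruction the abstract refers to: drift and diffusion estimated from binned conditional moments of increments (a plain Kramers-Moyal estimate, without the finite-time correction terms the paper develops; the Ornstein-Uhlenbeck test process and bin count are assumptions).

```python
import numpy as np

def drift_diffusion(x, dt, bins=30):
    """Binned conditional moments: D1 = <dx>/dt, D2 = <dx^2>/(2 dt)."""
    dx = np.diff(x)
    edges = np.linspace(x.min(), x.max(), bins + 1)
    idx = np.clip(np.digitize(x[:-1], edges) - 1, 0, bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    d1 = np.array([dx[idx == b].mean() / dt if np.any(idx == b) else np.nan
                   for b in range(bins)])
    d2 = np.array([(dx[idx == b] ** 2).mean() / (2 * dt) if np.any(idx == b) else np.nan
                   for b in range(bins)])
    return centers, d1, d2

# Ornstein-Uhlenbeck test process: true drift -x, true diffusion 0.5.
rng = np.random.default_rng(8)
dt, n = 0.01, 100000
x = np.zeros(n)
for i in range(1, n):
    x[i] = x[i - 1] - x[i - 1] * dt + np.sqrt(2 * 0.5 * dt) * rng.normal()

centers, d1, d2 = drift_diffusion(x, dt)
ok = ~np.isnan(d1)
print("estimated drift slope (true -1):", np.polyfit(centers[ok], d1[ok], 1)[0])
```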
Refined composite multiscale weighted-permutation entropy of financial time series
NASA Astrophysics Data System (ADS)
Zhang, Yongping; Shang, Pengjian
2018-04-01
For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimated values show large fluctuations under slight variations of the data locations, and show a significant distinction only for different lengths of time series. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). Comparing the RCMWPE results with those of other methods on both synthetic data and financial time series, the RCMWPE method shows not only the advantages inherited from MWPE but also lower sensitivity to the data locations, more stability, and much less dependence on the length of time series. Moreover, we present and discuss the results of the RCMWPE method on the daily price return series from Asian and European stock markets. There are significant differences between Asian markets and European markets, and the entropy values of the Hang Seng Index (HSI) are close to but higher than those of the European markets. The reliability of the proposed RCMWPE method has been supported by simulations on generated and real data. It could be applied to a variety of fields to quantify the complexity of systems over multiple scales more accurately.
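A minimal sketch of weighted-permutation entropy at a single scale, the building block that MWPE and RCMWPE coarse-grain over multiple scales: ordinal patterns weighted by the variance of their windows. Embedding parameters and the test signals are assumptions; the multiscale and refined-composite averaging layers are omitted.

```python
import numpy as np
from math import factorial
from collections import defaultdict

def weighted_permutation_entropy(x, dim=3, lag=1):
    """WPE: Shannon entropy of ordinal patterns weighted by window variance."""
    weights = defaultdict(float)
    for i in range(len(x) - (dim - 1) * lag):
        w = x[i:i + dim * lag:lag]
        weights[tuple(np.argsort(w))] += np.var(w)   # amplitude-aware weight
    p = np.array(list(weights.values()))
    p = p / p.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(factorial(dim))                # normalized to [0, 1]

rng = np.random.default_rng(6)
print("noise:", weighted_permutation_entropy(rng.normal(size=5000)))
print("sine :", weighted_permutation_entropy(np.sin(np.linspace(0, 100, 5000))))
```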
NASA Astrophysics Data System (ADS)
Stiller-Reeve, Mathew; Stephenson, David; Spengler, Thomas
2017-04-01
For climate services to be relevant and informative for users, scientific data definitions need to match users' perceptions or beliefs. This study proposes and tests novel yet simple methods to compare beliefs of timing of recurrent climatic events with empirical evidence from multiple historical time series. The methods are tested by applying them to the onset date of the monsoon in Bangladesh, where several scientific monsoon definitions can be applied, yielding different results for monsoon onset dates. It is a challenge to know which monsoon definition compares best with people's beliefs. Time series from eight different scientific monsoon definitions in six regions are compared with respondent beliefs from a previously completed survey concerning the monsoon onset. Beliefs about the timing of the monsoon onset are represented probabilistically for each respondent by constructing a probability mass function (PMF) from elicited responses about the earliest, normal, and latest dates for the event. A three-parameter circular modified triangular distribution (CMTD) is used to allow for the possibility (albeit small) of the onset at any time of the year. These distributions are then compared to the historical time series using two approaches: likelihood scores, and the mean and standard deviation of time series of dates simulated from each belief distribution. The methods proposed give the basis for further iterative discussion with decision-makers in the development of eventual climate services. This study uses Jessore, Bangladesh, as an example and finds that a rainfall definition, applying a 10 mm day-1 threshold to NCEP-NCAR reanalysis (Reanalysis-1) data, best matches the survey respondents' beliefs about monsoon onset.
NASA Astrophysics Data System (ADS)
Xie, Hong-Bo; Dokos, Socrates
2013-06-01
We present a hybrid symplectic geometry and central tendency measure (CTM) method for detection of determinism in noisy time series. CTM is effective for detecting determinism in short time series and has been applied in many areas of nonlinear analysis. However, its performance significantly degrades in the presence of strong noise. In order to circumvent this difficulty, we propose to use symplectic principal component analysis (SPCA), a new chaotic signal de-noising method, as the first step to recover the system dynamics. CTM is then applied to determine whether the time series arises from a stochastic process or has a deterministic component. Results from numerical experiments, ranging from six benchmark deterministic models to 1/f noise, suggest that the hybrid method can significantly improve detection of determinism in noisy time series by about 20 dB when the data are contaminated by Gaussian noise. Furthermore, we apply our algorithm to study the mechanomyographic (MMG) signals arising from contraction of human skeletal muscle. Results obtained from the hybrid symplectic principal component analysis and central tendency measure demonstrate that the skeletal muscle motor unit dynamics can indeed be deterministic, in agreement with previous studies. However, the conventional CTM method was not able to definitely detect the underlying deterministic dynamics. This result on MMG signal analysis is helpful in understanding neuromuscular control mechanisms and developing MMG-based engineering control applications.
Clustering Financial Time Series by Network Community Analysis
NASA Astrophysics Data System (ADS)
Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio
In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.
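A hedged sketch of the pipeline: build a similarity network over return series and partition it with community detection. For brevity, plain correlation stands in for the paper's symbolic-dynamics similarity metric, and networkx's greedy modularity routine stands in for its community analysis; the toy two-sector returns are assumptions.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(9)
# Toy returns: two latent "sectors" driving three stocks each.
f1, f2 = rng.normal(size=(2, 500))
returns = {f"A{i}": 0.8 * f1 + 0.4 * rng.normal(size=500) for i in range(3)}
returns.update({f"B{i}": 0.8 * f2 + 0.4 * rng.normal(size=500) for i in range(3)})

g = nx.Graph()
names = list(returns)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        w = np.corrcoef(returns[a], returns[b])[0, 1]
        if w > 0.3:                        # keep only clearly similar pairs
            g.add_edge(a, b, weight=w)

for community in greedy_modularity_communities(g, weight="weight"):
    print(sorted(community))               # recovers the two sectors
```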
Entropy of electromyography time series
NASA Astrophysics Data System (ADS)
Kaufman, Miron; Zurcher, Ulrich; Sung, Paul S.
2007-12-01
A nonlinear analysis based on Renyi entropy is applied to electromyography (EMG) time series from back muscles. The time dependence of the entropy of the EMG signal exhibits a crossover from a subdiffusive regime at short times to a plateau at longer times. We argue that this behavior characterizes complex biological systems. The plateau value of the entropy can be used to differentiate between healthy and low back pain individuals.
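For reference, a minimal Renyi entropy computation from a histogram of signal values; the order q, the binning, and the EMG-like toy signal are assumptions (the paper tracks the time dependence of such an entropy over growing windows).

```python
import numpy as np

def renyi_entropy(x, q=2.0, bins=50):
    """Renyi entropy of order q from a normalized histogram of x."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))        # Shannon limit as q -> 1
    return np.log(np.sum(p ** q)) / (1.0 - q)

rng = np.random.default_rng(10)
emg_like = rng.normal(size=20000) * (1 + 0.5 * np.sin(np.linspace(0, 20, 20000)))
# Entropy evaluated over growing windows (the paper tracks this time dependence).
for n in (200, 2000, 20000):
    print(n, round(renyi_entropy(emg_like[:n]), 3))
```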
Krstacic, Goran; Krstacic, Antonija; Smalcelj, Anton; Milicic, Davor; Jembrek-Gostovic, Mirjana
2007-04-01
Dynamic analysis techniques may quantify abnormalities in heart rate variability (HRV) based on nonlinear and fractal analysis (chaos theory). The article emphasizes the clinical and prognostic significance of dynamic changes in short-time series applied to patients with coronary heart disease (CHD) during the exercise electrocardiogram (ECG) test. The subjects were included in the series after complete cardiovascular diagnostic data. Series of R-R and ST-T intervals were obtained from digitally sampled exercise ECG data. Rescaled range analysis was used to determine the fractal dimension of the intervals. To quantify the fractal long-range correlation properties of heart rate variability, the detrended fluctuation analysis technique was used. Approximate entropy (ApEn) was applied to quantify the regularity and complexity of the time series, as well as the unpredictability of fluctuations in the time series. It was found that the short-term fractal scaling exponent (alpha(1)) is significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04; P < 0.001). The patients with CHD had a higher fractal dimension in each exercise test program separately, as well as in the exercise program overall. ApEn was significantly lower in the CHD group in both R-R and ST-T ECG intervals (P < 0.001). The nonlinear dynamic methods could have clinical and prognostic applicability also in short-time ECG series. Dynamic analysis based on chaos theory during the exercise ECG test revealed multifractal time series in CHD patients, who lose normal fractal characteristics and regularity in HRV. Nonlinear analysis techniques may complement traditional ECG analysis.
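A minimal sketch of detrended fluctuation analysis, the technique named above for the fractal scaling exponent: integrate the series, remove a linear trend in windows of each size, and fit the log-log slope of fluctuation against window size. The scale range and test signal are assumptions.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """DFA scaling exponent alpha from the slope of log F(s) vs log s."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                     # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(2, 6, 12, base=2).astype(int))
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:                            # remove a linear trend per window
            a, b = np.polyfit(t, seg, 1)
            f2.append(np.mean((seg - (a * t + b)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(12)
print("white noise alpha ~ 0.5:", round(dfa_exponent(rng.normal(size=8000)), 2))
```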
Validation of a national hydrological model
NASA Astrophysics Data System (ADS)
McMillan, H. K.; Booker, D. J.; Cattoën, C.
2016-10-01
Nationwide predictions of flow time-series are valuable for development of policies relating to environmental flows, calculating reliability of supply to water users, or assessing risk of floods or droughts. This breadth of model utility is possible because various hydrological signatures can be derived from simulated flow time-series. However, producing national hydrological simulations can be challenging due to strong environmental diversity across catchments and a lack of data available to aid model parameterisation. A comprehensive and consistent suite of test procedures to quantify spatial and temporal patterns in performance across various parts of the hydrograph is described and applied to quantify the performance of an uncalibrated national rainfall-runoff model of New Zealand. Flow time-series observed at 485 gauging stations were used to calculate Nash-Sutcliffe efficiency and percent bias when simulating between-site differences in daily series, between-year differences in annual series, and between-site differences in hydrological signatures. The procedures were used to assess the benefit of applying a correction to the modelled flow duration curve based on an independent statistical analysis. They were used to aid understanding of climatological, hydrological and model-based causes of differences in predictive performance by assessing multiple hypotheses that describe where and when the model was expected to perform best. As the procedures produce quantitative measures of performance, they provide an objective basis for model assessment that could be applied when comparing observed daily flow series with competing simulated flow series from any region-wide or nationwide hydrological model. Model performance varied in space and time with better scores in larger and medium-wet catchments, and in catchments with smaller seasonal variations. Surprisingly, model performance was not sensitive to aquifer fraction or rain gauge density.
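For concreteness, minimal implementations of the two scores used in the evaluation; the observed and simulated arrays are placeholders.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - SSE / variance of observations (1 is perfect, < 0 worse than the mean)."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """PBIAS: overall tendency of simulations to over/underestimate, in percent."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

obs = np.array([3.1, 2.8, 4.0, 5.5, 4.2])   # placeholder daily flows
sim = np.array([2.9, 3.0, 3.8, 5.9, 4.1])
print("NSE:", round(nash_sutcliffe(obs, sim), 3),
      "PBIAS:", round(percent_bias(obs, sim), 2), "%")
```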
Time-series modeling of long-term weight self-monitoring data.
Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka
2015-08-01
Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. The analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data pose several challenges that complicate the analysis; in particular, irregular sampling, missing data, and the existence of periodic (e.g., diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of the weight time series data, missing data and its link to individual behavior, periodic patterns, and weight series segmentation. Being able to understand behavior through weight data and give relevant feedback is desirable and can lead to positive interventions on health behaviors.
Evaluation of nonlinearity and validity of nonlinear modeling for complex time series.
Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo
2007-10-01
Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.
Financial time series analysis based on information categorization method
NASA Astrophysics Data System (ADS)
Tian, Qiang; Shang, Pengjian; Feng, Guochen
2014-12-01
The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the different stock markets changes across time periods, and that the similarity of the two stock markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas from the phylogenetic trees. The results show that we can extract meaningful information from financial markets by this method. The information categorization method can be used not only on physiologic time series but also on financial time series.
31 CFR 344.5 - What other provisions apply to subscriptions for Time Deposit securities?
Code of Federal Regulations, 2011 CFR
2011-07-01
31 CFR 344.5, Money and Finance: Treasury Regulations; U.S. Treasury Securities—State and Local Government Series; Time Deposit Securities. § 344.5 What other provisions apply to subscriptions for Time Deposit securities? (a) When is my subscription due...
31 CFR 344.5 - What other provisions apply to subscriptions for Time Deposit securities?
Code of Federal Regulations, 2012 CFR
2012-07-01
31 CFR 344.5, Money and Finance: Treasury Regulations; U.S. Treasury Securities—State and Local Government Series; Time Deposit Securities. § 344.5 What other provisions apply to subscriptions for Time Deposit securities? (a) When is my subscription due...
31 CFR 344.5 - What other provisions apply to subscriptions for Time Deposit securities?
Code of Federal Regulations, 2013 CFR
2013-07-01
31 CFR 344.5, Money and Finance: Treasury Regulations; U.S. Treasury Securities—State and Local Government Series; Time Deposit Securities. § 344.5 What other provisions apply to subscriptions for Time Deposit securities? (a) When is my subscription due...
31 CFR 344.5 - What other provisions apply to subscriptions for Time Deposit securities?
Code of Federal Regulations, 2010 CFR
2010-07-01
31 CFR 344.5, Money and Finance: Treasury Regulations; U.S. Treasury Securities—State and Local Government Series; Time Deposit Securities. § 344.5 What other provisions apply to subscriptions for Time Deposit securities? (a) When is my subscription due...
31 CFR 344.5 - What other provisions apply to subscriptions for Time Deposit securities?
Code of Federal Regulations, 2014 CFR
2014-07-01
31 CFR 344.5, Money and Finance: Treasury Regulations; Fiscal Service; U.S. Treasury Securities—State and Local Government Series; Time Deposit Securities. § 344.5 What other provisions apply to subscriptions for Time Deposit securities? (a) When is my subscription due...
Signal processing techniques were applied to high-resolution time series data obtained from conductivity loggers placed upstream and downstream of a wastewater treatment facility along a river. Data were collected over 14-60 days and several seasons. The power spectral densit...
Variance fluctuations in nonstationary time series: a comparative study of music genres
NASA Astrophysics Data System (ADS)
Jennings, Heather D.; Ivanov, Plamen Ch.; De Martins, Allan M.; da Silva, P. C.; Viswanathan, G. M.
2004-05-01
An important problem in physics concerns the analysis of audio time series generated by transduced acoustic phenomena. Here, we develop a new method to quantify the scaling properties of the local variance of nonstationary time series. We apply this technique to analyze audio signals obtained from selected genres of music. We find quantitative differences in the correlation properties of high art music, popular music, and dance music. We discuss the relevance of these objective findings in relation to the subjective experience of music.
A harmonic linear dynamical system for prominent ECG feature extraction.
Thi, Ngoc Anh Nguyen; Yang, Hyung-Jeong; Kim, SunHee; Do, Luu Ngoc
2014-01-01
Unsupervised mining of electrocardiography (ECG) time series is a crucial task in biomedical applications. To obtain efficient clustering results, the prominent features extracted by preprocessing analysis of multiple ECG time series need to be investigated. In this paper, a Harmonic Linear Dynamical System is applied to discover vital prominent features by mining the evolving hidden dynamics and correlations in ECG time series. The comprehensible and interpretable features discovered by the proposed feature extraction methodology effectively support the accuracy and reliability of the clustering results. In particular, the empirical evaluation results of the proposed method demonstrate improved clustering performance compared to previous mainstream feature extraction approaches for ECG time series clustering tasks. Furthermore, the experimental results on real-world datasets show scalability, with computation time linear in the duration of the time series.
Strakova, Eva; Zikova, Alice; Vohradsky, Jiri
2014-01-01
A computational model of gene expression was applied to a novel test set of microarray time series measurements to reveal regulatory interactions between transcriptional regulators represented by 45 sigma factors and the genes expressed during germination of a prokaryote Streptomyces coelicolor. Using microarrays, the first 5.5 h of the process was recorded in 13 time points, which provided a database of gene expression time series on genome-wide scale. The computational modeling of the kinetic relations between the sigma factors, individual genes and genes clustered according to the similarity of their expression kinetics identified kinetically plausible sigma factor-controlled networks. Using genome sequence annotations, functional groups of genes that were predominantly controlled by specific sigma factors were identified. Using external binding data complementing the modeling approach, specific genes involved in the control of the studied process were identified and their function suggested.
Maydeu-Olivares, Alberto
2016-01-01
Nesselroade and Molenaar advocate the use of an idiographic filter approach. This is a fixed-effects approach, which may limit the number of individuals that can be simultaneously modeled, and it is not clear how to model the presence of subpopulations. Most important, Nesselroade and Molenaar's proposal appears to be best suited for modeling long time series on a few variables for a few individuals. Long time series are not common in psychological applications. Can it be applied to the usual longitudinal data we face? These are characterized by short time series (four to five points in time), hundreds of individuals, and dozens of variables. If so, what do we gain? Applied settings most often involve between-individual decisions. I conjecture that their approach will not outperform common, simpler, methods. However, when intraindividual decisions are involved, their approach may have an edge.
Stress Corrosion Cracking Study of Aluminum Alloys Using Electrochemical Noise Analysis
NASA Astrophysics Data System (ADS)
Rathod, R. C.; Sapate, S. G.; Raman, R.; Rathod, W. S.
2013-12-01
Stress corrosion cracking studies of aluminum alloys AA2219, AA8090, and AA5456 in heat-treated and non-heat-treated conditions were carried out using the electrochemical noise technique at various applied stresses. Electrochemical noise time series data (corrosion potential vs. time) were obtained for the stressed tensile specimens in 3.5% NaCl aqueous solution at room temperature (27 °C). The values of drop in corrosion potential, total corrosion potential, mean corrosion potential, and hydrogen overpotential were evaluated from the corrosion potential versus time series data. The electrochemical noise time series data were further analyzed with the rescaled range (R/S) analysis proposed by Hurst to obtain the Hurst exponent. According to the results, higher values of the Hurst exponent at increased applied stresses indicated greater susceptibility to stress corrosion cracking, as confirmed in the case of alloys AA2219 and AA8090.
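Rescaled-range analysis has a standard form; a minimal sketch of the Hurst exponent estimate referenced above (the chunk-size grid and fitting details are assumptions):

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis.

    The series is divided into chunks of size n; for each chunk the range
    of the cumulative mean-adjusted sum is divided by the chunk's standard
    deviation. The slope of log(R/S) versus log(n) is H.
    """
    x = np.asarray(x, dtype=float)
    sizes = np.unique(np.logspace(np.log10(min_chunk),
                                  np.log10(len(x) // 2), 12).astype(int))
    log_n, log_rs = [], []
    for n in sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            chunk = x[start:start + n]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation
            s = chunk.std()
            if s > 0:
                rs_vals.append((dev.max() - dev.min()) / s)
        if rs_vals:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_vals)))
    H, _ = np.polyfit(log_n, log_rs, 1)
    return H
```

Values of H above 0.5 indicate persistent, trend-reinforcing fluctuations, which is the behavior the study associates with higher stress corrosion cracking susceptibility.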
Adaptive time-variant models for fuzzy-time-series forecasting.
Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching
2010-12-01
A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue
2017-01-01
Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, which is the fundamental component of MSE, scores the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes of entropy values even if the time series embraces only small changes. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function measuring the similarity of two subsequences with full-range values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of complexity measurement, especially for short time series, compared to MSE and composite multiscale entropy (CMSE). FMSE is thus capable of improving the performance of time series analysis based topology and traffic congestion control techniques. PMID:28383496
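A minimal sketch of sample entropy with a graded match, in the spirit of FMSE: the hard Heaviside threshold is replaced by a sigmoid so near-threshold distances contribute fractionally. The sigmoid and its steepness k are assumptions; the paper's exact similarity function is not reproduced here.

```python
import numpy as np

def soft_sample_entropy(x, m=2, r=0.15, k=20.0):
    """Sample entropy with a graded (sigmoid) 0..1 similarity function."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def weighted_matches(order):
        # Templates of length `order`; keep N-m of them so the m and m+1
        # counts range over the same starting indices.
        templ = np.lib.stride_tricks.sliding_window_view(x, order)[: len(x) - m]
        total = 0.0
        for i in range(len(templ) - 1):
            d = np.abs(templ[i + 1:] - templ[i]).max(axis=1)      # Chebyshev distance
            total += (1.0 / (1.0 + np.exp(k * (d - tol)))).sum()  # graded match
        return total

    return -np.log(weighted_matches(m + 1) / weighted_matches(m))

print(soft_sample_entropy(np.random.default_rng(1).normal(size=1000)))
```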
Altiparmak, Fatih; Ferhatosmanoglu, Hakan; Erdal, Selnur; Trost, Donald C
2006-04-01
An effective analysis of clinical trials data involves analyzing different types of data, such as heterogeneous and high dimensional time series data. Current time series analysis methods generally assume that the series at hand have sufficient length to apply statistical techniques to them. Other ideal-case assumptions are that data are collected in equal length intervals, and that, when comparing time series, the lengths are equal to each other. However, these assumptions are not valid for many real data sets, especially for clinical trials data sets. In addition, the data sources are different from each other, the data are heterogeneous, and the sensitivity of the experiments varies by the source. Approaches for mining time series data need to be revisited, keeping the wide range of requirements in mind. In this paper, we propose a novel approach for information mining that involves two major steps: applying a data mining algorithm over homogeneous subsets of data, and identifying common or distinct patterns over the information gathered in the first step. Our approach is implemented specifically for heterogeneous and high dimensional time series clinical trials data. Using this framework, we propose a new way of utilizing frequent itemset mining, as well as clustering and declustering techniques with novel distance metrics for measuring similarity between time series data. By clustering the data, we find groups of analytes (substances in blood) that are most strongly correlated. Most of these relationships are already known and are verified by the clinical panels; in addition, we identify novel groups that need further biomedical analysis. A slight modification to our algorithm results in an effective declustering of high dimensional time series data, which is then used for "feature selection." Using industry-sponsored clinical trials data sets, we are able to identify a small set of analytes that effectively models the state of normal health.
Ghalyan, Najah F; Miller, David J; Ray, Asok
2018-06-12
Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition as the minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic nearest-neighbor symbol assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather of the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding-block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and a failure application in a polycrystalline alloy material.
Analysis of financial time series using multiscale entropy based on skewness and kurtosis
NASA Astrophysics Data System (ADS)
Xu, Meng; Shang, Pengjian
2018-01-01
There is great interest in studying the dynamic characteristics of financial time series of daily stock closing prices in different regions. Multiscale entropy (MSE) is effective mainly in quantifying the complexity of time series on different time scales. This paper applies a new method for assessing financial stability from the perspective of MSE based on skewness and kurtosis. To better understand which coarse-graining method suits different kinds of stock indexes, we take into account the developmental characteristics of the Asian, North American and European stock markets. We study the volatility of the different financial time series and analyze the similarities and differences of the coarse-grained time series from the perspective of skewness and kurtosis. A correspondence between the entropy values of the stock sequences and the degree of stability of the financial markets was observed. The three stocks with particular characteristics among the eight stock sequences are discussed, and their behavior matches the graphical results of the MSE method. A comparative study is conducted over synthetic and real-world data. Results show that the modified method is more sensitive to changes in dynamics and carries more valuable information, and that the skewness- and kurtosis-based discrimination is both evident and stable.
NASA Astrophysics Data System (ADS)
Baydaroğlu, Özlem; Koçak, Kasım; Duran, Kemal
2018-06-01
Prediction of water amount that will enter the reservoirs in the following month is of vital importance especially for semi-arid countries like Turkey. Climate projections emphasize that water scarcity will be one of the serious problems in the future. This study presents a methodology for predicting river flow for the subsequent month based on the time series of observed monthly river flow with hybrid models of support vector regression (SVR). Monthly river flow over the period 1940-2012 observed for the Kızılırmak River in Turkey has been used for training the method, which then has been applied for predictions over a period of 3 years. SVR is a specific implementation of support vector machines (SVMs), which transforms the observed input data time series into a high-dimensional feature space (input matrix) by way of a kernel function and performs a linear regression in this space. SVR requires a special input matrix. The input matrix was produced by wavelet transforms (WT), singular spectrum analysis (SSA), and a chaotic approach (CA) applied to the input time series. WT convolutes the original time series into a series of wavelets, and SSA decomposes the time series into a trend, an oscillatory and a noise component by singular value decomposition. CA uses a phase space formed by trajectories, which represent the dynamics producing the time series. These three methods for producing the input matrix for the SVR proved successful, while the SVR-WT combination resulted in the highest coefficient of determination and the lowest mean absolute error.
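Of the three input-matrix constructions, the chaotic (phase-space) approach is the easiest to sketch: a time-delay embedding matrix fed to an RBF support vector regression, here via scikit-learn (an assumption; the study does not name its software). The embedding and SVR hyperparameters and the synthetic stand-in series are illustrative only.

```python
import numpy as np
from sklearn.svm import SVR

def embed(series, dim=4, lag=1):
    """Build the SVR input matrix by time-delay (phase-space) embedding:
    each row is [x_t, x_{t+lag}, ..., x_{t+(dim-1)lag}]; the target is the
    next value after the row."""
    rows = [series[i:i + dim * lag:lag] for i in range(len(series) - dim * lag)]
    return np.array(rows), series[dim * lag:]

# Hypothetical monthly-flow stand-in; 'flow' replaces the observed series.
rng = np.random.default_rng(1)
flow = np.sin(np.linspace(0, 60, 864)) + 0.1 * rng.normal(size=864)
X, y = embed(flow)
split = int(0.8 * len(X))
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("MAE:", np.abs(pred - y[split:]).mean())
```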
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronfman, B. H.
Time-series analysis provides a useful tool in the evaluation of public policy outputs. It is shown that the general Box and Jenkins method, when extended to allow for multiple interrupts, enables researchers simultaneously to examine changes in drift and level of a series, and to select the best-fit model for the series. As applied to urban renewal allocations, results show significant changes in the level of the series, corresponding to changes in party control of the Executive. No support is given to the "incrementalism" hypotheses, as no significant changes in drift are found.
A study of stationarity in time series by using wavelet transform
NASA Astrophysics Data System (ADS)
Dghais, Amel Abdoullah Ahmed; Ismail, Mohd Tahir
2014-07-01
In this work the core objective is to apply discrete wavelet transform (DWT) functions, namely the Haar, Daubechies, Symmlet, Coiflet and discrete Meyer approximation wavelets, to non-stationary financial time series data from the US stock market (DJIA30). The data consist of 2048 daily closing-index values from December 17, 2004 until October 23, 2012. The unit root test shows that the data are non-stationary in levels. In order to study the stationarity of a time series, the autocorrelation function (ACF) is used. Results indicate that the Haar function yields the least noisy series compared to the Daubechies, Symmlet, Coiflet and discrete Meyer approximation wavelets. In addition, the original data after decomposition by DWT form a less noisy series than the return time series after decomposition by DWT.
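The same five wavelet families can be compared with the PyWavelets library (an assumption; the study does not name its software); a sketch on a synthetic stand-in for the closing-index series:

```python
import numpy as np
import pywt  # PyWavelets; assumed here, not named in the study

# Compare how much energy each wavelet family places in the detail
# (noise-carrying) coefficients of a price-like series.
rng = np.random.default_rng(2)
prices = np.cumsum(rng.normal(size=2048)) + 100.0  # stand-in for DJIA30 closes

for name in ["haar", "db4", "sym4", "coif4", "dmey"]:
    coeffs = pywt.wavedec(prices, name, level=4)          # [approx, d4, d3, d2, d1]
    detail_energy = sum(np.sum(c**2) for c in coeffs[1:]) / np.sum(prices**2)
    print(f"{name:6s} share of energy in detail coefficients: {detail_energy:.5f}")
```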
Streamflow Prediction based on Chaos Theory
NASA Astrophysics Data System (ADS)
Li, X.; Wang, X.; Babovic, V. M.
2015-12-01
Chaos theory is a popular method in hydrologic time series prediction. The local model (LM) based on this theory utilizes time-delay embedding to reconstruct the phase-space diagram. The efficacy of this method depends on the embedding parameters, i.e. embedding dimension, time lag, and nearest neighbor number, so optimal estimation of these parameters is critical to the application of the local model. However, these embedding parameters are conventionally estimated using Average Mutual Information (AMI) and False Nearest Neighbors (FNN) separately, which may lead to a local optimum and thus limit prediction accuracy. Considering these limitations, this paper applies a local model combined with simulated annealing (SA) to find the global optimum of the embedding parameters, and compares it with another global optimization approach, the Genetic Algorithm (GA). These hybrid methods are applied to daily and monthly streamflow time series for examination. The results show that global optimization helps the local model provide more accurate predictions than local optimization, and the LM combined with SA shows additional advantages in computational efficiency. The proposed scheme can also be applied to other fields, such as prediction of hydro-climatic time series, error correction, etc.
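A compact sketch of the hybrid idea: a nearest-neighbour local model whose embedding dimension, lag and neighbour count are searched jointly by simulated annealing. The move set, cooling schedule and objective below are assumptions, not the authors' exact scheme.

```python
import numpy as np

def knn_forecast_error(series, m, tau, k, test_frac=0.2):
    """One-step local-model forecast error for embedding dimension m,
    time lag tau and neighbour count k."""
    X = np.array([series[i:i + m * tau:tau] for i in range(len(series) - m * tau)])
    y = series[m * tau:]
    split = int(len(X) * (1 - test_frac))
    err = []
    for i in range(split, len(X)):
        d = np.abs(X[:split] - X[i]).max(axis=1)   # Chebyshev distance to history
        nn = np.argsort(d)[:k]                     # k nearest analogue states
        err.append(abs(y[nn].mean() - y[i]))       # zeroth-order local prediction
    return float(np.mean(err))

def anneal_embedding(series, n_iter=200, T0=1.0, seed=0):
    """Joint global search over (m, tau, k) by simulated annealing."""
    rng = np.random.default_rng(seed)
    state = (3, 1, 5)
    cost = knn_forecast_error(series, *state)
    best, best_cost = state, cost
    for it in range(n_iter):
        T = T0 * 0.98 ** it                                    # geometric cooling
        cand = tuple(max(1, s + rng.integers(-1, 2)) for s in state)
        c = knn_forecast_error(series, *cand)
        if c < cost or rng.random() < np.exp((cost - c) / max(T, 1e-9)):
            state, cost = cand, c                              # accept move
            if c < best_cost:
                best, best_cost = cand, c
    return best, best_cost
```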
Visual analytics techniques for large multi-attribute time series data
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.
2008-01-01
Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often business people want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data loading job performance. DBAs need to detect the outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: The color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in a long time series data and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications to mine enterprise data warehouse and customer credit card fraud data to illustrate the wide applicability and usefulness of these techniques.
Sensor-Generated Time Series Events: A Definition Language
Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan
2012-01-01
There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called a posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.
NASA Satellite Data for Seagrass Health Modeling and Monitoring
NASA Technical Reports Server (NTRS)
Spiering, Bruce A.; Underwood, Lauren; Ross, Kenton
2011-01-01
Time series derived information for coastal waters will be used to provide input data for the Fong and Harwell model. The current MODIS land mask limits where the model can be applied; this project will: a) Apply MODIS data with resolution higher than the standard products (250-m vs. 1-km). b) Seek to refine the land mask. c) Explore nearby areas to use as proxies for time series directly over the beds. Novel processing approaches will be leveraged from other NASA projects and customized as inputs for seagrass productivity modeling
Signal processing techniques were applied to high-resolution time series data obtained from conductivity loggers placed upstream and downstream of an oil and gas wastewater treatment facility along a river. Data were collected over 14-60 days. The power spectral density was us...
NASA Astrophysics Data System (ADS)
Zakynthinaki, M. S.; Stirling, J. R.
2007-01-01
Stochastic optimization is applied to the problem of optimizing the fit of a model to the time series of raw physiological (heart rate) data. The physiological response to exercise has been recently modeled as a dynamical system. Fitting the model to a set of raw physiological time series data is, however, not a trivial task. For this reason and in order to calculate the optimal values of the parameters of the model, the present study implements the powerful stochastic optimization method ALOPEX IV, an algorithm that has been proven to be fast, effective and easy to implement. The optimal parameters of the model, calculated by the optimization method for the particular athlete, are very important as they characterize the athlete's current condition. The present study applies the ALOPEX IV stochastic optimization to the modeling of a set of heart rate time series data corresponding to different exercises of constant intensity. An analysis of the optimization algorithm, together with an analytic proof of its convergence (in the absence of noise), is also presented.
NASA Astrophysics Data System (ADS)
Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert
2016-05-01
We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models, which may contain trend and phenology, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated to land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). Number and frequency of cultivation cycles is of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmer interviews, resulting in a Normalized Root Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns. We identified various sources of error in the approach, including low data availability in the 90s and sub-object mixture of land uses. We conclude that the method holds great promise for land use history mapping in the tropics and beyond.
NASA Astrophysics Data System (ADS)
Dutrieux, L.; Jakovac, C. C.; Siti, L. H.; Kooistra, L.
2015-12-01
We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The BFAST framework is used for defining the time-series regression models, which may contain trend and phenology, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated to land use regimes, while the breakpoints then correspond to shifts in regimes. To further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil. Number and frequency of cultivation cycles is of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after abandonment. We applied the method to a Landsat time-series of Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation. We validated the number of cultivation cycles predicted against in-situ information collected from farmer interviews, resulting in a Normalized RMSE of 0.25. Overall the method performed well, producing maps with coherent patterns. We identified various sources of error in the approach, including low data availability in the 90s and sub-object mixture of land uses. We conclude that the method holds great promise for land use history mapping in the tropics and beyond. Spatial and temporal patterns were further analysed with an ecological perspective in a follow-up study. Results show that changes in land use patterns, such as land use intensification and reduced agricultural expansion, reflect the socio-economic transformations that occurred in the region.
Analysis of Land Subsidence Monitoring in Mining Area with Time-Series Insar Technology
NASA Astrophysics Data System (ADS)
Sun, N.; Wang, Y. J.
2018-04-01
Time-series InSAR technology has become a popular land subsidence monitoring method in recent years because of its advantages such as high accuracy, wide coverage, low expenditure, dense monitoring points and freedom from accessibility restrictions. In this paper, we applied two kinds of satellite data, ALOS PALSAR and RADARSAT-2, to obtain subsidence monitoring results for the study area in two time periods using time-series InSAR technology. By analyzing the deformation range, rate and amount, a time-series analysis of land subsidence in the mining area was realized. The results show that InSAR technology can monitor land subsidence over large areas and meet the demands of subsidence monitoring in mining areas.
Investigation of Cepstrum Analysis for Seismic/Acoustic Signal Sensor Range Determination.
1981-01-01
distorted by transmission through a linear system. For example, the effect of multipath and reverberation may be modeled in terms of a signal that is... called the short-time averaged cepstrum. To derive some analytical expressions for short-time average cepstrums we choose some functions of interest... linear process applied to the time series or any equivalent time function. Repiod (period): the amount of time required for one cycle of a time series. Saphe...
Complexity multiscale asynchrony measure and behavior for interacting financial dynamics
NASA Astrophysics Data System (ADS)
Yang, Ge; Wang, Jun; Niu, Hongli
2016-08-01
A stochastic financial price process is proposed and investigated using a finite-range multitype contact dynamical system, in an attempt to study the nonlinear behaviors of real asset markets. The virus-spreading process in a finite-range multitype system is used to imitate the interacting behaviors of diverse investment attitudes in a financial market, and empirical research on the descriptive statistics and autocorrelation behaviors of the return time series is performed for different values of the propagation rates. The multiscale entropy analysis is then adopted to study several differently shuffled return series, including the original return series, the corresponding reversal series, the random shuffled series, the volatility shuffled series and the Zipf-type shuffled series. Furthermore, we propose and compare the multiscale cross-sample entropy and its modification, called composite multiscale cross-sample entropy, and apply them to study the asynchrony of pairs of time series at different time scales.
Distinguishing time-delayed causal interactions using convergent cross mapping
Ye, Hao; Deyle, Ethan R.; Gilarranz, Luis J.; Sugihara, George
2015-01-01
An important problem across many scientific fields is the identification of causal effects from observational data alone. Recent methods (convergent cross mapping, CCM) have made substantial progress on this problem by applying the idea of nonlinear attractor reconstruction to time series data. Here, we expand upon the technique of CCM by explicitly considering time lags. Applying this extended method to representative examples (model simulations, a laboratory predator-prey experiment, temperature and greenhouse gas reconstructions from the Vostok ice core, and long-term ecological time series collected in the Southern California Bight), we demonstrate the ability to identify different time-delayed interactions, distinguish between synchrony induced by strong unidirectional-forcing and true bidirectional causality, and resolve transitive causal chains. PMID:26435402
POD Model Reconstruction for Gray-Box Fault Detection
NASA Technical Reports Server (NTRS)
Park, Han; Zak, Michail
2007-01-01
Proper orthogonal decomposition (POD) is the mathematical basis of a method of constructing low-order mathematical models for the "gray-box" fault-detection algorithm that is a component of a diagnostic system known as beacon-based exception analysis for multi-missions (BEAM). POD has been successfully applied in reducing computational complexity by generating simple models that can be used for control and simulation for complex systems such as fluid flows. In the present application to BEAM, POD brings the same benefits to automated diagnosis. BEAM is a method of real-time or offline, automated diagnosis of a complex dynamic system. The gray-box approach makes it possible to utilize incomplete or approximate knowledge of the dynamics of the system that one seeks to diagnose. In the gray-box approach, a deterministic model of the system is used to filter a time series of system sensor data to remove the deterministic components of the time series from further examination. What is left after the filtering operation is a time series of residual quantities that represent the unknown (or at least unmodeled) aspects of the behavior of the system. Stochastic modeling techniques are then applied to the residual time series. The procedure for detecting abnormal behavior of the system then becomes one of looking for statistical differences between the residual time series and the predictions of the stochastic model.
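A minimal numpy sketch of the gray-box filtering step as described: learn dominant POD modes from nominal snapshots by singular value decomposition, then treat whatever the low-order model cannot reconstruct as the residual series handed to the stochastic stage. The matrix layout (sensors as rows, time as columns) and the rank are assumptions.

```python
import numpy as np

def pod_residual(train_snapshots, test_snapshots, rank=5):
    """Project new data onto the leading POD modes learned from nominal
    data; the unreconstructed part is the residual time series."""
    mean = train_snapshots.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(train_snapshots - mean, full_matrices=False)
    basis = U[:, :rank]                        # dominant POD modes
    centered = test_snapshots - mean
    reconstruction = basis @ (basis.T @ centered)
    return centered - reconstruction           # residuals for stochastic modeling
```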
Huang, Xin; Zeng, Jun; Zhou, Lina; Hu, Chunxiu; Yin, Peiyuan; Lin, Xiaohui
2016-08-31
Time-series metabolomics studies can provide insight into the dynamics of disease development and facilitate the discovery of prospective biomarkers. To improve the performance of early risk identification, a new strategy for analyzing time-series data based on dynamic networks (ATSD-DN) in a systematic time dimension is proposed. In ATSD-DN, the non-overlapping ratio was applied to measure the changes in feature ratios during the process of disease development and to construct dynamic networks. Dynamic concentration analysis and network topological structure analysis were performed to extract early warning information. This strategy was applied to the study of time-series lipidomics data from a stepwise hepatocarcinogenesis rat model. A ratio of lyso-phosphatidylcholine (LPC) 18:1/free fatty acid (FFA) 20:5 was identified as the potential biomarker for hepatocellular carcinoma (HCC). It can be used to classify HCC and non-HCC rats, and the area under the curve values in the discovery and external validation sets were 0.980 and 0.972, respectively. This strategy was also compared with a weighted relative difference accumulation algorithm (wRDA), multivariate empirical Bayes statistics (MEBA) and support vector machine-recursive feature elimination (SVM-RFE). The better performance of ATSD-DN suggests its potential for a more complete presentation of time-series changes and effective extraction of early warning information.
Coyle, R.T.; Barrett, J.M.
1982-05-04
Disclosed is a process for substantially reducing the series resistance of a solar cell having a thick film metal contact assembly thereon while simultaneously removing oxide coatings from the surface of the assembly prior to applying solder therewith. The process includes applying a flux to the contact assembly and heating the cell for a period of time sufficient to substantially remove the series resistance associated with the assembly by etching the assembly with the flux while simultaneously removing metal oxides from said surface of said assembly.
NASA Astrophysics Data System (ADS)
Cannas, Barbara; Fanni, Alessandra; Murari, Andrea; Pisano, Fabio; Contributors, JET
2018-02-01
In this paper, the dynamic characteristics of type-I ELM time-series from the JET tokamak, the world’s largest magnetic confinement plasma physics experiment, have been investigated. The dynamic analysis has been focused on the detection of nonlinear structure in Dα radiation time series. Firstly, the method of surrogate data has been applied to evaluate the statistical significance of the null hypothesis of a static nonlinear distortion of an underlying Gaussian linear process. Several nonlinear statistics have been evaluated, such as the time-delayed mutual information, the correlation dimension and the maximal Lyapunov exponent. The obtained results allow us to reject the null hypothesis, giving evidence of underlying nonlinear dynamics. Moreover, no evidence of low-dimensional chaos has been found; indeed, the analysed time series are better characterized by the power-law sensitivity to initial conditions, which can suggest a motion at the ‘edge of chaos’, at the border between chaotic and regular non-chaotic dynamics. This uncertainty makes it necessary to investigate further the nature of the nonlinear dynamics. For this purpose, a second surrogate test, to distinguish chaotic orbits from pseudo-periodic orbits, has been applied. In this case, we cannot reject the null hypothesis, which means that the ELM time series is possibly pseudo-periodic. In order to reproduce the pseudo-periodic dynamical properties, a state-of-the-art periodic model proposed to reproduce the ELM cycle has been corrupted by dynamical noise, obtaining time series qualitatively in agreement with the experimental time series.
Minimum entropy density method for the time series analysis
NASA Astrophysics Data System (ADS)
Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae
2009-01-01
The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale in which the uncertainty is minimized, hence the pattern is revealed most. The MEDM is applied to the financial time series of Standard and Poor’s 500 index from February 1983 to April 2006. Then the temporal behavior of structure scale is obtained and analyzed in relation to the information delivery time and efficient market hypothesis.
Characterising experimental time series using local intrinsic dimension
NASA Astrophysics Data System (ADS)
Buzug, Thorsten M.; von Stamm, Jens; Pfister, Gerd
1995-02-01
Experimental strange attractors are analysed with the averaged local intrinsic dimension proposed by A. Passamante et al. [Phys. Rev. A 39 (1989) 3640] which is based on singular value decomposition of local trajectory matrices. The results are compared to the values of Kaplan-Yorke and the correlation dimension. The attractors, reconstructed with Takens' delay time coordinates from scalar velocity time series, are measured in the hydrodynamic Taylor-Couette system. A period doubling route towards chaos obtained from a very short Taylor-Couette cylinder yields a sequence of experimental time series where the local intrinsic dimension is applied.
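A sketch in the spirit of the averaged local intrinsic dimension: embed the series, form local trajectory matrices from nearest neighbours, and count the singular values needed to reach a fixed energy fraction. The neighbourhood size, threshold and subsampling are assumptions.

```python
import numpy as np

def local_intrinsic_dimension(series, m=8, k=30, threshold=0.95):
    """Average, over reference points, the number of singular values of the
    local trajectory matrix needed to capture `threshold` of its energy."""
    X = np.array([series[i:i + m] for i in range(len(series) - m)])  # delay embedding
    dims = []
    for i in range(0, len(X), max(1, len(X) // 200)):   # subsample reference points
        d = np.linalg.norm(X - X[i], axis=1)
        local = X[np.argsort(d)[1:k + 1]] - X[i]        # k nearest neighbours, centered
        s = np.linalg.svd(local, compute_uv=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        dims.append(int(np.searchsorted(energy, threshold) + 1))
    return float(np.mean(dims))
```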
Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng
2015-01-01
The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of time series. In order to deal with the weakness associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation which has the strong local search ability into the genetic algorithm to enhance the performance of optimization; besides, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of Quadratic and Rossler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision with certain noise is also satisfactory. It can be concluded that the IGSA algorithm is energy-efficient and superior. PMID:26000011
NASA Astrophysics Data System (ADS)
Li, Xiuming; Sun, Mei; Gao, Cuixia; Han, Dun; Wang, Minggang
2018-02-01
This paper presents the parametric modified limited penetrable visibility graph (PMLPVG) algorithm for constructing complex networks from time series. We modify the penetrable visibility criterion of limited penetrable visibility graph (LPVG) in order to improve the rationality of the original penetrable visibility and preserve the dynamic characteristics of the time series. The addition of view angle provides a new approach to characterize the dynamic structure of the time series that is invisible in the previous algorithm. The reliability of the PMLPVG algorithm is verified by applying it to three types of artificial data as well as the actual data of natural gas prices in different regions. The empirical results indicate that PMLPVG algorithm can distinguish the different time series from each other. Meanwhile, the analysis results of natural gas prices data using PMLPVG are consistent with the detrended fluctuation analysis (DFA). The results imply that the PMLPVG algorithm may be a reasonable and significant tool for identifying various time series in different fields.
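For reference, the plain limited penetrable visibility criterion that PMLPVG modifies can be sketched as follows; the parametric view-angle extension itself is not reproduced here.

```python
import numpy as np

def lpvg_edges(x, penetrable=1):
    """Limited penetrable visibility graph: nodes a < b are linked when at
    most `penetrable` intermediate samples rise above the line of sight
    x_c < x_a + (x_b - x_a) * (c - a) / (b - a)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    edges = []
    for a in range(n - 1):
        for b in range(a + 1, n):
            c = np.arange(a + 1, b)
            sight = x[a] + (x[b] - x[a]) * (c - a) / (b - a)
            if np.sum(x[c] >= sight) <= penetrable:   # tolerate a few blockers
                edges.append((a, b))
    return edges

print(len(lpvg_edges(np.random.default_rng(3).normal(size=50))))
```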
A note on an attempt at more efficient Poisson series evaluation. [for lunar libration
NASA Technical Reports Server (NTRS)
Shelus, P. J.; Jefferys, W. H., III
1975-01-01
A substantial reduction has been achieved in the time necessary to compute lunar libration series. The method involves eliminating many of the trigonometric function calls by a suitable transformation and applying a short SNOBOL processor to the FORTRAN coding of the transformed series, which obviates many of the multiplication operations during the course of series evaluation. It is possible to accomplish similar results quite easily with other Poisson series.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-14
This package contains statistical routines for extracting features from multivariate time-series data which can then be used for subsequent multivariate statistical analysis to identify patterns and anomalous behavior. It calculates local linear or quadratic regression model fits to moving windows for each series and then summarizes the model coefficients across user-defined time intervals for each series. These methods are domain-agnostic, but they have been successfully applied to a variety of domains, including commercial aviation and electric power grid data.
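A minimal sketch of the moving-window regression features described; the window length, polynomial degree and the later summarization over time intervals are left as user choices.

```python
import numpy as np

def window_fit_features(series, window=50, degree=2):
    """Fit a local polynomial (linear: degree=1, quadratic: degree=2) to each
    moving window and return the coefficient tracks, which can then be
    summarized across user-defined intervals for multivariate analysis."""
    t = np.arange(window)
    coeffs = [np.polyfit(t, series[i:i + window], degree)
              for i in range(len(series) - window + 1)]
    return np.array(coeffs)   # one row of model coefficients per window
```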
Measuring Two Decades of Ice Mass Loss using GRACE and SLR
NASA Astrophysics Data System (ADS)
Bonin, J. A.; Chambers, D. P.
2016-12-01
We use Satellite Laser Ranging (SLR) to extend the time series of ice mass change back in time to 1994. The SLR series has far coarser spatial resolution than GRACE, so we apply a constrained inversion technique to better localize the signal. We approximate the likely errors due to SLR's measurement errors combined with the inversion errors from using a low-resolution series, and then estimate the interannual mass change over Greenland and Antarctica.
Mining Recent Temporal Patterns for Event Detection in Multivariate Time Series Data
Batal, Iyad; Fradkin, Dmitriy; Harrison, James; Moerchen, Fabian; Hauskrecht, Milos
2015-01-01
Improving the performance of classifiers using pattern mining techniques has been an active topic of data mining research. In this work we introduce the recent temporal pattern mining framework for finding predictive patterns for monitoring and event detection problems in complex multivariate time series data. This framework first converts time series into time-interval sequences of temporal abstractions. It then constructs more complex temporal patterns backwards in time using temporal operators. We apply our framework to health care data of 13,558 diabetic patients and show its benefits by efficiently finding useful patterns for detecting and diagnosing adverse medical conditions that are associated with diabetes. PMID:25937993
Application of time series analysis for assessing reservoir trophic status
Paris Honglay Chen; Ka-Chu Leung
2000-01-01
This study develops and applies a practical procedure for the time series analysis of reservoir eutrophication conditions. A multiplicative decomposition method is used to determine the trophic variations, including seasonal, cyclical, long-term and irregular changes. The results indicate that (1) there is a long high peak for seven months from April to October...
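A multiplicative decomposition of a monthly series can be sketched with statsmodels (an assumption; the study does not name its software), on hypothetical stand-in data for the trophic indicator:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical monthly chlorophyll-a-like series standing in for the trophic data.
rng = np.random.default_rng(4)
idx = pd.date_range("1990-01", periods=120, freq="MS")
trend = 10 + 0.02 * np.arange(120)
y = pd.Series(trend * (1 + 0.3 * np.sin(2 * np.pi * np.arange(120) / 12))
              * np.exp(0.05 * rng.normal(size=120)), index=idx)

parts = seasonal_decompose(y, model="multiplicative", period=12)
print(parts.seasonal.head(12))   # seasonal factors, e.g. a sustained warm-season high
```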
Detecting chaos in irregularly sampled time series.
Kulp, C W
2013-09-01
Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus, making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented, which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
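SciPy provides the Lomb-Scargle periodogram used here in place of the DFT; a toy sketch on irregularly sampled data (the chaos-detection statistic built on top of the spectrum is not reproduced):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0, 100, 400))                 # irregular sample times
y = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.normal(size=400)

freqs = np.linspace(0.01, 4 * np.pi, 1000)            # angular frequencies to probe
power = lombscargle(t, y - y.mean(), freqs)           # spectrum despite irregular sampling
print("peak at angular frequency:", freqs[np.argmax(power)])  # ~ 2*pi*0.5 = pi
```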
Segmentation of time series with long-range fractal correlations.
Bernaola-Galván, P; Oliver, J L; Hackenberg, M; Coronado, A V; Ivanov, P Ch; Carpena, P
2012-06-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome.
Introduction and application of the multiscale coefficient of variation analysis.
Abney, Drew H; Kello, Christopher T; Balasubramaniam, Ramesh
2017-10-01
Quantifying how patterns of behavior relate across multiple levels of measurement typically requires long time series for reliable parameter estimation. We describe a novel analysis that estimates patterns of variability across multiple scales of analysis suitable for time series of short duration. The multiscale coefficient of variation (MSCV) measures the distance between local coefficient of variation estimates within particular time windows and the overall coefficient of variation across all time samples. We first describe the MSCV analysis and provide an example analytical protocol with corresponding MATLAB implementation and code. Next, we present a simulation study testing the new analysis using time series generated by ARFIMA models that span white noise, short-term and long-term correlations. The MSCV analysis was observed to be sensitive to specific parameters of ARFIMA models varying in the type of temporal structure and time series length. We then apply the MSCV analysis to short time series of speech phrases and musical themes to show commonalities in multiscale structure. The simulation and application studies provide evidence that the MSCV analysis can discriminate between time series varying in multiscale structure and length.
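A compact sketch of the MSCV statistic as described; the absolute-distance form and window handling are assumptions, and the coefficient of variation presumes positive-mean data such as inter-onset intervals.

```python
import numpy as np

def mscv(x, scales):
    """Multiscale coefficient of variation: for each window size, average the
    absolute distance between windowed CV estimates and the overall CV."""
    x = np.asarray(x, dtype=float)
    overall_cv = x.std() / x.mean()          # requires positive-mean data
    out = {}
    for s in scales:
        n_win = len(x) // s
        w = x[:n_win * s].reshape(n_win, s)
        local_cv = w.std(axis=1) / w.mean(axis=1)
        out[s] = np.mean(np.abs(local_cv - overall_cv))
    return out

print(mscv(np.abs(np.random.default_rng(6).normal(1.0, 0.2, size=300)), [10, 25, 50]))
```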
Two States Mapping Based Time Series Neural Network Model for Compensation Prediction Residual Error
NASA Astrophysics Data System (ADS)
Jung, Insung; Koo, Lockjo; Wang, Gi-Nam
2008-11-01
The objective of this paper was to design a human bio-signal prediction system that decreases prediction error, using a two-states-mapping-based time series neural network BP (back-propagation) model. Many applications in industry have applied neural network models, trained in a supervised manner with the error back-propagation algorithm, to time series prediction; however, a residual error between the real value and the prediction result remains. Therefore, we designed a two-states neural network model that compensates for the residual error, which could be used in the prevention of sudden death and of metabolic syndrome diseases such as hypertension and obesity. We determined that most of the simulation cases were satisfied by the two-states-mapping-based time series prediction model. In particular, small-sample time series were predicted more accurately than with the standard MLP model.
Improvements to surrogate data methods for nonstationary time series.
Lucio, J H; Valdés, R; Rodríguez, L R
2012-05-01
The method of surrogate data has been extensively applied to hypothesis testing of system linearity, when only one realization of the system, a time series, is known. Normally, surrogate data should preserve the linear stochastic structure and the amplitude distribution of the original series. Classical surrogate data methods (such as random permutation, amplitude adjusted Fourier transform, or iterative amplitude adjusted Fourier transform) are successful at preserving one or both of these features in stationary cases. However, they always produce stationary surrogates, hence existing nonstationarity could be interpreted as dynamic nonlinearity. Certain modifications have been proposed that additionally preserve some nonstationarity, at the expense of reproducing a great deal of nonlinearity. However, even those methods generally fail to preserve the trend (i.e., global nonstationarity in the mean) of the original series. This is the case of time series with unit roots in their autoregressive structure. Additionally, those methods, based on Fourier transform, either need first and last values in the original series to match, or they need to select a piece of the original series with matching ends. These conditions are often inapplicable and the resulting surrogates are adversely affected by the well-known artefact problem. In this study, we propose a simple technique that, applied within existing Fourier-transform-based methods, generates surrogate data that jointly preserve the aforementioned characteristics of the original series, including (even strong) trends. Moreover, our technique avoids the negative effects of end mismatch. Several artificial and real, stationary and nonstationary, linear and nonlinear time series are examined, in order to demonstrate the advantages of the methods. Corresponding surrogate data are produced with the classical and with the proposed methods, and the results are compared.
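For context, a sketch of the classical iterative amplitude adjusted Fourier transform baseline that the proposed technique modifies; the trend-preserving and end-mismatch-avoiding additions themselves are not reproduced.

```python
import numpy as np

def iaaft_surrogate(x, n_iter=100, seed=0):
    """Iterative AAFT surrogate: alternately impose the original amplitude
    spectrum and the original value distribution on a shuffled copy."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    amp = np.abs(np.fft.rfft(x))             # target linear (spectral) structure
    ranked = np.sort(x)                      # target amplitude distribution
    s = rng.permutation(x)
    for _ in range(n_iter):
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(amp * np.exp(1j * phases), n=len(x))   # match spectrum
        s = ranked[np.argsort(np.argsort(s))]                   # match distribution
    return s
```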
The local properties of ocean surface waves by the phase-time method
NASA Technical Reports Server (NTRS)
Huang, Norden E.; Long, Steven R.; Tung, Chi-Chao; Donelan, Mark A.; Yuan, Yeli; Lai, Ronald J.
1992-01-01
A new approach using phase information to view and study the properties of frequency modulation, wave group structures, and wave breaking is presented. The method is applied to ocean wave time series data and a new type of wave group (containing the large 'rogue' waves) is identified. The method also has the capability of broad applications in the analysis of time series data in general.
Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.
Malkin, Zinovy
2016-04-01
The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the deviations of frequency standards. Over the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics in geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
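The classical (unweighted, scalar, non-overlapped) AVAR on which the reviewed modifications build; a sketch assuming a unit sampling interval:

```python
import numpy as np

def allan_variance(y, taus):
    """Non-overlapped Allan variance: average the series in bins of length
    tau, then take half the mean squared difference of consecutive bin means.
    WAVAR/MAVAR/WMAVAR extend this same core formula."""
    y = np.asarray(y, dtype=float)
    out = {}
    for tau in taus:
        n_bins = len(y) // tau
        means = y[:n_bins * tau].reshape(n_bins, tau).mean(axis=1)
        out[tau] = 0.5 * np.mean(np.diff(means) ** 2)
    return out

# White noise: AVAR should fall roughly as 1/tau.
print(allan_variance(np.random.default_rng(7).normal(size=10000), [1, 4, 16, 64]))
```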
Empirical Investigation of Critical Transitions in Paleoclimate
NASA Astrophysics Data System (ADS)
Loskutov, E. M.; Mukhin, D.; Gavrilov, A.; Feigin, A.
2016-12-01
In this work we apply a new empirical method for the analysis of complex spatially distributed systems to the analysis of paleoclimate data. The method consists of two general parts: (i) revealing the optimal phase-space variables and (ii) constructing an empirical prognostic model from observed time series. The construction of phase-space variables is based on decomposing the data into nonlinear dynamical modes; this was successfully applied to the global SST field and allowed us to clearly separate time scales and reveal a climate shift in the observed data interval [1]. The second part, the Bayesian approach to optimal evolution operator reconstruction from time series, is based on representing the evolution operator as a nonlinear stochastic function implemented by artificial neural networks [2,3]. In this work we focus on the investigation of critical transitions, the abrupt changes in climate dynamics, in much longer time scale processes. It is well known that there were a number of critical transitions on different time scales in the past. Here, we demonstrate the first results of applying our empirical methods to the analysis of paleoclimate variability. In particular, we discuss the possibility of detecting, identifying and predicting such critical transitions by means of nonlinear empirical modeling using paleoclimate record time series. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS).
1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510
2. Molkov, Ya. I., Mukhin, D. N., Loskutov, E. M., & Feigin, A. M. (2012). Random dynamical models from time series. Phys. Rev. E, 85(3).
3. Mukhin, D., Kondrashov, D., Loskutov, E., Gavrilov, A., Feigin, A., & Ghil, M. (2015). Predicting critical transitions in ENSO models. Part II: Spatially dependent models. Journal of Climate, 28(5), 1962-1976. http://doi.org/10.1175/JCLI-D-14-00240.1
Orbital forced frequencies in the 975000 year pollen record from Tenagi Philippon (Greece)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mommersteeg, H.J.P.M.; Young, R.; Wijmstra, T.A.
Frequency analysis was applied to different time series obtained from the 975 ka pollen record of Tenagi Philippon (Macedonia, Greece). These time series are characteristic of different vegetation types related to specific climatic conditions. Time control of the 196 m deep core was based on 11 finite ¹⁴C dates in the upper 17 m, magnetostratigraphy, and correlation with the marine oxygen isotope stratigraphy. Maximum entropy spectrum analysis and Thomson multi-taper spectrum analysis were applied using the complete time series. Periods of 95-99, 40-45, 24.0-25.5 and 19-21 ka, which can be related to orbital forcing, as well as periods of about 68 and 30 ka and of about 15.5, 13.5, 12 and 10.5 ka, were detected. The detected periods of about 68 and 30 ka and 16, 14, 12 and 10.5 ka are likely to be harmonics and combination tones of the periods related to orbital forcing. The period of around 30 ka is possibly a secondary peak of obliquity. To study the stability of the detected periods through time, analysis with a moving window was employed. Signals in the eccentricity band were detected clearly during the last 650 ka. In the precession band, detected periods of about 24 ka show an increase in amplitude during the last 650 ka. The evolution of orbital frequencies during the last 1.0 Ma is in general agreement with the results of other marine and continental time series. Time series related to different climatic settings showed a different response to orbital forcing. Time series of vegetational elements sensitive to changes in net precipitation were forced in the precession and obliquity bands. Changes in precession caused changes in the monsoon system, which indirectly had a strong influence on the climatic history of Greece. Time series of vegetational elements which are more indicative of changes in annual temperature are forced in the eccentricity band. 54 refs., 12 figs., 3 tabs.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-25
...; Special Conditions No. 25-442-SC] Special Conditions: Boeing Model 747-8 Series Airplanes; Overhead Flight... conditions. SUMMARY: These special conditions are issued for Boeing Model 747-8 series airplanes. These... applied for, and was granted, an extension of time for the amended type certificate, which changed the...
Kinetics analysis and quantitative calculations for the successive radioactive decay process
NASA Astrophysics Data System (ADS)
Zhou, Zhiping; Yan, Deyue; Zhao, Yuliang; Chai, Zhifang
2015-01-01
The general radioactive decay kinetics equations with branching were developed and their analytical solutions were derived by the Laplace transform method. The time dependence of all the nuclide concentrations can easily be obtained by applying the equations to any known radioactive decay series. Taking the thorium radioactive decay series as an example, the concentration evolution over time of the various nuclides in the family was obtained by quantitative numerical calculation. The method can be applied to the quantitative prediction and analysis of the daughter nuclides in successive decays with branching in complicated radioactive processes, such as the natural radioactive decay series, nuclear reactors, nuclear waste disposal, nuclear spallation, the synthesis and identification of superheavy nuclides, radioactive ion beam physics and chemistry, etc.
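For a linear chain without branching, the Laplace-transform solution reduces to the classical Bateman equations; the sketch below implements that simplified case (the general branching case described above would add branching ratios to the products), with made-up decay constants and the usual assumption that all decay constants are distinct.

```python
import numpy as np

def bateman(t, lambdas, n1_0=1.0):
    """Bateman solution for a linear decay chain A1 -> A2 -> ... (no branching).

    lambdas[i] is the decay constant of chain member i+1; the chain starts
    from n1_0 atoms of the parent and none of the daughters. Assumes all
    decay constants are distinct (otherwise the denominators vanish).
    """
    t = np.atleast_1d(np.asarray(t, dtype=float))
    lam = np.asarray(lambdas, dtype=float)
    conc = np.zeros((lam.size, t.size))
    for n in range(lam.size):
        prefac = n1_0 * np.prod(lam[:n])        # lambda_1 * ... * lambda_n
        for i in range(n + 1):
            denom = np.prod(np.delete(lam[:n + 1], i) - lam[i])
            conc[n] += np.exp(-lam[i] * t) / denom
        conc[n] *= prefac
    return conc

# Toy three-member chain with well-separated half-lives.
t = np.linspace(0.0, 50.0, 6)
print(bateman(t, lambdas=[0.5, 0.1, 0.02]))
```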
Quantifying complexity of financial short-term time series by composite multiscale entropy measure
NASA Astrophysics Data System (ADS)
Niu, Hongli; Wang, Jun
2015-05-01
It is significant to study the complexity of financial time series since the financial market is a complex evolving dynamic system. Multiscale entropy is a prevailing method used to quantify the complexity of a time series. Because its entropy estimates are less reliable for short-term time series at large time scales, a modification, the composite multiscale entropy (CMSE), is applied here to the financial market. To test its effectiveness, its applications to synthetic white noise and 1/f noise with different data lengths are first reproduced in the present paper. The method is then introduced for the first time to make a reliability test with two Chinese stock indices. When applied to short-term return series, the CMSE method shows advantages in reducing the deviation of entropy estimates and demonstrates more stable and reliable results than the conventional MSE algorithm. Finally, the composite multiscale entropy of six important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.
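A compact sketch of the CMSE idea: at scale s, sample entropy is averaged over all s possible coarse-graining offsets instead of the single offset used by conventional MSE, with the tolerance fixed from the original series. The parameter values (m = 2, r = 0.15) and the white-noise demonstration are illustrative choices, not the paper's settings.

```python
import numpy as np

def sample_entropy(x, m=2, tol=0.2):
    """SampEn(m, tol) with an absolute tolerance and Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    def pair_matches(mm):
        tpl = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.abs(tpl[:, None, :] - tpl[None, :, :]).max(axis=2)
        iu = np.triu_indices(len(tpl), k=1)
        return np.count_nonzero(d[iu] <= tol)
    b, a = pair_matches(m), pair_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def cmse(x, scale, m=2, r=0.15):
    """Composite multiscale entropy: mean SampEn over all coarse-graining
    offsets at a given scale (conventional MSE uses offset 0 only)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()              # tolerance fixed on the original series
    vals = []
    for k in range(scale):
        n = (x.size - k) // scale
        coarse = x[k:k + n * scale].reshape(n, scale).mean(axis=1)
        vals.append(sample_entropy(coarse, m=m, tol=tol))
    return float(np.mean(vals))

white = np.random.default_rng(1).normal(size=1000)
print([round(cmse(white, s), 3) for s in (1, 2, 3)])  # decreases for white noise
```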
Transformation-cost time-series method for analyzing irregularly sampled data
NASA Astrophysics Data System (ADS)
Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen
2015-06-01
Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations—with associated costs—to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
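A toy version of the transformation-cost idea can be sketched under strong simplifications: points of consecutive windows are paired in order (the published method optimizes the pairing), time shifts and amplitude changes are charged linearly, and unmatched points pay a flat add/delete cost. All cost weights and the demonstration signal are arbitrary illustrative choices.

```python
import numpy as np

def transformation_cost(seg1, seg2, c_shift=1.0, c_amp=1.0, c_adddel=2.0):
    """Simplified cost to transform one (times, values) segment into the next.

    Times are taken relative to each window start; points are paired in
    order, which only approximates the optimal matching of the TACTS paper.
    """
    (t1, x1), (t2, x2) = seg1, seg2
    if max(t1.size, t2.size) == 0:
        return 0.0
    n = min(t1.size, t2.size)
    cost = (c_shift * np.abs(t2[:n] - t1[:n]).sum()
            + c_amp * np.abs(x2[:n] - x1[:n]).sum()
            + c_adddel * abs(t1.size - t2.size))
    return cost / max(t1.size, t2.size)

def tacts_series(t, x, window):
    """Regularly sampled cost series from consecutive windows of the data."""
    starts = np.arange(t.min(), t.max() - window, window)
    segs = []
    for a in starts:
        mask = (t >= a) & (t < a + window)
        segs.append((t[mask] - a, x[mask]))       # window-relative times
    return np.array([transformation_cost(segs[i], segs[i + 1])
                     for i in range(len(segs) - 1)])

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 100, 400))             # irregular sampling times
x = np.sin(0.5 * t) + 0.1 * rng.normal(size=t.size)
print(tacts_series(t, x, window=5.0)[:5])
```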
A nonlinear generalization of the Savitzky-Golay filter and the quantitative analysis of saccades
Dai, Weiwei; Selesnick, Ivan; Rizzo, John-Ross; Rucker, Janet; Hudson, Todd
2017-01-01
The Savitzky-Golay (SG) filter is widely used to smooth and differentiate time series, especially biomedical data. However, time series that exhibit abrupt departures from their typical trends, such as sharp waves or steps, which are of physiological interest, tend to be oversmoothed by the SG filter. Hence, the SG filter tends to systematically underestimate physiological parameters in certain situations. This article proposes a generalization of the SG filter to more accurately track abrupt deviations in time series, leading to more accurate parameter estimates (e.g., peak velocity of saccadic eye movements). The proposed filtering methodology models a time series as the sum of two component time series: a low-frequency time series for which the conventional SG filter is well suited, and a second time series that exhibits instantaneous deviations (e.g., sharp waves, steps, or more generally, discontinuities in a higher order derivative). The generalized SG filter is then applied to the quantitative analysis of saccadic eye movements. It is demonstrated that (a) the conventional SG filter underestimates the peak velocity of saccades, especially those of small amplitude, and (b) the generalized SG filter estimates peak saccadic velocity more accurately than the conventional filter. PMID:28813566
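To illustrate point (a) above with the conventional filter (the generalized filter itself is the paper's contribution and is not reproduced here), the following sketch applies scipy's savgol_filter to a synthetic sigmoidal "saccade". The sampling rate, window length and sigmoid parameters are arbitrary illustrative choices.

```python
import numpy as np
from scipy.signal import savgol_filter

# Conventional SG smoothing and differentiation of a noisy saccade-like step.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 501)                       # seconds, 500 Hz sampling
pos = 10 / (1 + np.exp(-(t - 0.5) / 0.01))       # ~10 deg sigmoidal saccade
noisy = pos + 0.05 * rng.normal(size=t.size)

dt = t[1] - t[0]
vel = savgol_filter(noisy, window_length=31, polyorder=3, deriv=1, delta=dt)
print("estimated peak velocity:", vel.max())     # smoothed, hence underestimated
print("true peak velocity:", np.gradient(pos, dt).max())
```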
Process fault detection and nonlinear time series analysis for anomaly detection in safeguards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burr, T.L.; Mullen, M.F.; Wangen, L.E.
In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions, in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material in two different leak scenarios for the three-tank system: a leak without and with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
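The second of the two residual tests described above admits a compact sketch: under fault-free operation the squared Mahalanobis distance of a p-dimensional residual vector is chi-square distributed with p degrees of freedom. The three-"tank" residuals, the leak offset and the significance level below are simulated stand-ins, not the paper's data.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_test(residuals, cov, alpha=0.01):
    """Flag time steps whose residual vector is improbably far from zero.

    residuals: (T, p) array of model-minus-measurement vectors; cov: (p, p)
    residual covariance under fault-free operation. The squared Mahalanobis
    distance is chi-square with p degrees of freedom under the null.
    """
    cov_inv = np.linalg.inv(cov)
    d2 = np.einsum('ti,ij,tj->t', residuals, cov_inv, residuals)
    return d2 > chi2.ppf(1 - alpha, df=residuals.shape[1])

rng = np.random.default_rng(4)
res = rng.normal(scale=0.1, size=(200, 3))       # three tank-level residuals
res[150:] += np.array([0.0, -0.4, 0.0])          # simulated abrupt leak in tank 2
flags = mahalanobis_test(res, cov=0.01 * np.eye(3))
print("first flagged step:", np.argmax(flags))
```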
feets: feATURE eXTRACTOR for tIME sERIES
NASA Astrophysics Data System (ADS)
Cabral, Juan; Sanchez, Bruno; Ramos, Felipe; Gurovich, Sebastián; Granitto, Pablo; VanderPlas, Jake
2018-06-01
feets characterizes and analyzes light-curves from astronomical photometric databases for modelling, classification, data cleaning, outlier detection and data analysis. It uses machine learning algorithms to determine the numerical descriptors that characterize and distinguish the different variability classes of light-curves; these range from basic statistical measures such as the mean or standard deviation to complex time-series characteristics such as the autocorrelation function. The library is not restricted to the astronomical field and could also be applied to any kind of time series. This project is a derivative work of FATS (ascl:1711.017).
Flamm, Christoph; Graef, Andreas; Pirker, Susanne; Baumgartner, Christoph; Deistler, Manfred
2013-01-01
Granger causality is a useful concept for studying causal relations in networks. However, numerical problems occur when applying the corresponding methodology to high-dimensional time series showing co-movement, e.g. EEG recordings or economic data. In order to deal with these shortcomings, we propose a novel method for the causal analysis of such multivariate time series based on Granger causality and factor models. We present the theoretical background, successfully assess our methodology with the help of simulated data and show a potential application in EEG analysis of epileptic seizures. PMID:23354014
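Setting the factor-model step of the paper aside, the pairwise Granger-causality building block can be sketched with statsmodels; the simulated driver series and the lag choice below are illustrative only.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Does y help predict x? Simulate x driven by lagged y, then test.
rng = np.random.default_rng(5)
T = 500
y = rng.normal(size=T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.4 * x[t - 1] + 0.5 * y[t - 1] + 0.1 * rng.normal()

# Column order matters: the test asks whether the second column
# Granger-causes the first.
res = grangercausalitytests(np.column_stack([x, y]), maxlag=2, verbose=False)
print("lag-1 F-test p-value:", res[1][0]['ssr_ftest'][1])
```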
Bayesian dynamic modeling of time series of dengue disease case counts.
Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander
2017-07-01
The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology employs dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results requires a complete analysis of the dengue time series with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors at one- or two-week out-of-sample predictions for most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen; Talpe, Matthieu; Shum, C. K.; Schmidt, Michael
2017-12-01
In recent decades, decomposition techniques have enabled increasingly more applications for dimension reduction, as well as extraction of additional information from geophysical time series. Traditionally, the principal component analysis (PCA)/empirical orthogonal function (EOF) method and, more recently, the independent component analysis (ICA) have been applied to extract statistically orthogonal (uncorrelated) and independent modes, respectively, that represent the maximum variance of time series. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the autocovariance matrix and diagonalizing higher (than two) order statistical tensors from centered time series, respectively. However, the stationarity assumption in these techniques is not justified for many geophysical and climate variables even after removing cyclic components, e.g., the commonly removed dominant seasonal cycles. In this paper, we present a novel decomposition method, the complex independent component analysis (CICA), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA, where (a) we first define a new complex dataset that contains the observed time series in its real part, and their Hilbert transformed series as its imaginary part, (b) an ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex dataset in (a), and finally, (c) the dominant independent complex modes are extracted and used to represent the dominant space and time amplitudes and associated phase propagation patterns. The performance of CICA is examined by analyzing synthetic data constructed from multiple physically meaningful modes in a simulation framework with known truth. Next, global terrestrial water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) gravimetry mission (2003-2016), and satellite radiometric sea surface temperature (SST) data (1982-2016) over the Atlantic and Pacific Oceans are used with the aim of demonstrating signal separations of the North Atlantic Oscillation (NAO) from the Atlantic Multi-decadal Oscillation (AMO), and the El Niño Southern Oscillation (ENSO) from the Pacific Decadal Oscillation (PDO). CICA results indicate that ENSO-related patterns can be extracted from GRACE TWS with an accuracy of 0.5-1 cm in terms of equivalent water height (EWH). The magnitude of errors in extracting NAO or AMO from SST data using the complex EOF (CEOF) approach reaches up to 50% of the signal itself, while it is reduced to 16% when applying CICA. Larger errors with magnitudes of 100% and 30% of the signal itself are found while separating ENSO from PDO using CEOF and CICA, respectively. We thus conclude that CICA is more effective than CEOF in separating non-stationary patterns.
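Step (a) above is easy to sketch: center each series and pair it with its Hilbert transform to build the complex analytic dataset. Since the fourth-order-cumulant complex ICA of step (b) is not available in common libraries, the sketch below substitutes real-valued FastICA applied to the stacked real and imaginary parts; this is only a stand-in for the method described, and the mixing matrix and sources are synthetic.

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.decomposition import FastICA

# Two synthetic source modes mixed into two observed series.
rng = np.random.default_rng(6)
t = np.linspace(0, 20, 2000)
sources = np.array([np.sin(2 * np.pi * 0.5 * t),
                    np.sign(np.sin(2 * np.pi * 0.9 * t))])
mixed = np.array([[1.0, 0.6], [0.4, 1.0]]) @ sources
mixed -= mixed.mean(axis=1, keepdims=True)          # center each series

# Step (a): analytic (complex) dataset = series + i * Hilbert transform.
analytic = hilbert(mixed, axis=1)

# Stand-in for step (b): real ICA on stacked real/imaginary parts.
stacked = np.vstack([analytic.real, analytic.imag]).T   # (time, 2 * channels)
ica = FastICA(n_components=2, random_state=0)
modes = ica.fit_transform(stacked)
print(modes.shape)
```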
Allan deviation analysis of financial return series
NASA Astrophysics Data System (ADS)
Hernández-Pérez, R.
2012-05-01
We perform a scaling analysis of the return series of different financial assets applying the Allan deviation (ADEV), which is used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has been demonstrated to be a robust quantity for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are daily opening price series for assets from different markets spanning around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for the absolute return series at short scales (the first one or two decades) decrease following an approximate scaling relation up to a point that differs for almost every asset, after which the ADEV deviates from scaling; this suggests that the presence of clustering, long-range dependence and non-stationarity signatures in the series drives the results for large observation intervals.
2005-11-01
more random. Autonomous systems can exchange entropy statistics for packet streams with no confidentiality concerns, potentially enabling timely and... analysis began with simulation results, which were validated by analysis of actual data from an Autonomous System (AS). A scale-free network is one... traffic—for example, time series of flux at given nodes and mean path length; outputs the time series from any node queried; calculates...
NASA Astrophysics Data System (ADS)
Li, Jinyang; Shang, Pengjian
2018-07-01
Irreversibility is an important property of time series. In this paper, we propose the higher moments and multiscale Kullback-Leibler divergences to analyze time series irreversibility. The higher moments Kullback-Leibler divergence (HMKLD) can amplify irreversibility and make its variation more obvious, so that many time series whose irreversibility is otherwise hard to detect also show clear variations. We employ simulated data and financial stock data to test and verify this method, and find that the HMKLD of stock data grows in the form of fluctuations. As for the multiscale Kullback-Leibler divergence (MKLD), its behavior in dynamic systems is complex, which makes the underlying regularities of the simulated and stock systems difficult to explore. In the conventional multiscale entropy method the coarse-graining process is non-overlapping; here we apply a different coarse-graining process and obtain a surprising result: when the scales are 4 and 5, the entropies are nearly identical, which demonstrates that MKLD is efficient in displaying characteristics of time series irreversibility.
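One simple irreversibility statistic in the spirit of the paper's base measure can be sketched as the KL divergence between the distributions of forward and time-reversed increments, which is zero for a statistically reversible series; the higher-moments variant would apply the same divergence to powers of the increments, and the multiscale variant to coarse-grained copies. The bin count and regularization below are arbitrary choices.

```python
import numpy as np
from scipy.stats import entropy

def irreversibility_kld(x, bins=30):
    """KL divergence between increment distributions of a series and its
    time reversal (reversal flips the sign of the increments)."""
    inc = np.diff(np.asarray(x, dtype=float))
    edges = np.histogram_bin_edges(np.concatenate([inc, -inc]), bins=bins)
    p, _ = np.histogram(inc, bins=edges)
    q, _ = np.histogram(-inc, bins=edges)
    # Small regularization so the support of q covers that of p.
    p = p + 1e-10
    q = q + 1e-10
    return entropy(p, q)        # scipy normalizes and computes sum p*log(p/q)

rng = np.random.default_rng(7)
walk = np.cumsum(rng.normal(size=5000))        # reversible random walk
x = np.empty(5000); x[0] = 0.3
for i in range(1, 5000):
    x[i] = 4 * x[i - 1] * (1 - x[i - 1])       # logistic map, irreversible
print(irreversibility_kld(walk), irreversibility_kld(x))
```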
A window-based time series feature extraction method.
Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife
2017-10-01
This study proposes a robust similarity score-based time series feature extraction method that is termed as Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly low computational complexity thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with shapelet transform and fast shapelet transform (which constitutes an accelerated variant of the shapelet transform). The results indicate that WTC achieves a slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has a potential to enable medical experts to explore definitive common trends in novel datasets.
NASA Astrophysics Data System (ADS)
Du, Kongchang; Zhao, Ying; Lei, Jiaqiang
2017-09-01
In hydrological time series prediction, singular spectrum analysis (SSA) and discrete wavelet transform (DWT) are widely used as preprocessing techniques for artificial neural network (ANN) and support vector machine (SVM) predictors. These hybrid or ensemble models seem to largely reduce the prediction error. In the current literature, researchers apply these techniques to the whole observed time series and then obtain a set of reconstructed or decomposed time series as inputs to the ANN or SVM. However, through two comparative experiments and mathematical deduction, we found this usage of SSA and DWT in building hybrid models to be incorrect. Since SSA and DWT use 'future' values to perform the calculation, the series generated by SSA reconstruction or DWT decomposition contain information about 'future' values. Such hybrid models therefore report spuriously 'high' prediction performance and may cause large errors in practice.
Testing for nonlinearity in non-stationary physiological time series.
Guarín, Diego; Delgado, Edilson; Orozco, Álvaro
2011-01-01
Testing for nonlinearity is one of the most important preprocessing steps in nonlinear time series analysis. Typically, this is done by means of the linear surrogate data methods. But it is a known fact that the validity of the results heavily depends on the stationarity of the time series. Since most physiological signals are non-stationary, it is easy to falsely detect nonlinearity using the linear surrogate data methods. In this document, we propose a methodology to extend the procedure for generating constrained surrogate time series in order to assess nonlinearity in non-stationary data. The method is based on the band-phase-randomized surrogates, which consist (contrary to the linear surrogate data methods) of randomizing only a portion of the Fourier phases in the high frequency domain. Analysis of simulated time series showed that, in comparison to the linear surrogate data method, our method is able to discriminate between linear stationary, linear non-stationary and nonlinear time series. Applying our methodology to heart rate variability (HRV) records of five healthy patients, we found that nonlinear correlations are present in these non-stationary physiological signals.
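The band-phase-randomized surrogate described above is straightforward to sketch: only Fourier phases above a cutoff frequency are randomized, so slow non-stationary trends survive while fast structure is destroyed. The cutoff value and the test signal below are illustrative assumptions.

```python
import numpy as np

def band_phase_randomized_surrogate(x, f_cut=0.1, rng=None):
    """Surrogate that randomizes Fourier phases only above a cutoff frequency
    (given as a fraction of the sampling rate), preserving the slow,
    possibly non-stationary component of the series."""
    rng = rng or np.random.default_rng()
    x = np.asarray(x, dtype=float)
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size)                 # cycles per sample
    hi = freqs > f_cut
    phases = rng.uniform(0, 2 * np.pi, size=hi.sum())
    X[hi] = np.abs(X[hi]) * np.exp(1j * phases)
    return np.fft.irfft(X, n=x.size)

rng = np.random.default_rng(8)
t = np.arange(2000)
x = 0.002 * t + np.sin(0.3 * t) ** 3 + 0.2 * rng.normal(size=t.size)  # non-stationary
s = band_phase_randomized_surrogate(x, f_cut=0.02, rng=rng)
print(x[:3], s[:3])
```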
Modeling and forecasting of KLCI weekly return using WT-ANN integrated model
NASA Astrophysics Data System (ADS)
Liew, Wei-Thong; Liong, Choong-Yeun; Hussain, Saiful Izzuan; Isa, Zaidi
2013-04-01
The forecasting of weekly returns is one of the most challenging tasks in investment since the time series are volatile and non-stationary. In this study, an integrated model of the wavelet transform and an artificial neural network, WT-ANN, is studied for modeling and forecasting the KLCI weekly return. First, the WT is applied to decompose the weekly return time series in order to eliminate noise. Then, a mathematical model of the time series is constructed using the ANN. The performance of the suggested model is evaluated by the root mean squared error (RMSE), mean absolute error (MAE) and mean absolute percentage error (MAPE). The results show that the WT-ANN model can be considered a feasible and powerful model for time series modeling and prediction.
NASA Astrophysics Data System (ADS)
Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.
2008-11-01
We present here an implementation of a least squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method seems to be a useful improvement for the quantitative analysis of periodicities in non-stationary time series. The principal components determination followed by the least squares iterative regression method was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is the set of sine functions embedded in the series analyzed, in decreasing order of significance: from the most important ones, likely to represent the physical processes involved in the generation of the series, to the less important ones that represent noise components. Taking into account the need for deeper knowledge of the Sun's past history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
Wang, Fang; Wang, Lin; Chen, Yuming
2017-08-31
In order to investigate the time-dependent cross-correlations of fine particulate (PM2.5) series among neighboring cities in Northern China, in this paper, we propose a new cross-correlation coefficient, the time-lagged q-L dependent height cross-correlation coefficient (denoted by ρq(τ, L)), which incorporates the time-lag factor and the fluctuation amplitude information into the analogous height cross-correlation analysis coefficient. Numerical tests are performed to illustrate that the newly proposed coefficient ρq(τ, L) can be used to detect cross-correlations between two series with time lags and to identify the range of fluctuations at which two series possess cross-correlations. Applying the new coefficient to analyze the time-dependent cross-correlations of PM2.5 series between Beijing and the three neighboring cities of Tianjin, Zhangjiakou, and Baoding, we find that time lags between the PM2.5 series with larger fluctuations are longer than those between PM2.5 series with smaller fluctuations. Our analysis also shows that cross-correlations between the PM2.5 series of two neighboring cities are significant and the time lags between two PM2.5 series of neighboring cities are significantly non-zero. These findings provide new scientific support for the view that air pollution in neighboring cities can affect one another not simultaneously but with a time lag.
Takayasu, Hideki; Takayasu, Misako
2017-01-01
We extend the concept of statistical symmetry as the invariance of a probability distribution under transformation to analyze binary sign time series data of price differences from the foreign exchange market. We model segments of the sign time series as Markov sequences and apply a local hypothesis test to evaluate the symmetries of independence and time reversal in different periods of the market. For the test, we derive the probability that a binary Markov process generates a given set of symbol-pair counts. Using such analysis, we could not only segment the time series according to the different behaviors but also characterize the segments in terms of statistical symmetries. As a particular result, we find that the foreign exchange market is essentially time reversible, but that this symmetry is broken when there is a strong external influence. PMID:28542208
NASA Astrophysics Data System (ADS)
Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.
2014-12-01
We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return series of gold, West Texas Intermediate and Brent crude oil prices, and foreign exchange rate data, over a period of 18 years. The cross-correlation has been measured quantitatively from the Hurst scaling exponents and the singularity spectrum. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross-correlation between gold and oil prices shows uncorrelated behavior, while the remaining bivariate time series show persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q<0 and greater than the GHE for q>0, while for one bivariate series the cross-correlation exponent is greater than the GHE for all q values.
Optimizing Use of Water Management Systems during Changes of Hydrological Conditions
NASA Astrophysics Data System (ADS)
Výleta, Roman; Škrinár, Andrej; Danáčová, Michaela; Valent, Peter
2017-10-01
When designing water management systems and their components, there is a need for more detailed research on the hydrological conditions of the river basin whose runoff forms the main source of water for the reservoir. Over the lifetime of a water management system, the hydrological time series never repeat in the same form that served as input for the design of the system components. The design assumes the observed time series to be representative over the period of the system's use. However, this is a rather unrealistic assumption, because the hydrological past will not be exactly repeated over the design lifetime. When designing water management systems, specialists may therefore occasionally face insufficient or oversized capacity designs, or possibly wrong specification of the management rules, which may lead to non-optimal use. It is therefore necessary to establish a comprehensive approach to simulating the fluctuations in interannual runoff (taking into account the alternating dry and wet periods) by means of stochastic modelling techniques in water management practice. The paper deals with the methodological procedure of modelling the mean monthly flows using the stochastic Thomas-Fiering model, modified by the Wilson-Hilferty transformation of the independent random numbers. This transformation is usually applied in the event of significant asymmetry in the observed time series. The methodological procedure was applied to the data acquired at the gauging station of Horné Orešany on the Parná Stream. Observed mean monthly flows for the period 1.11.1980 - 31.10.2012 served as the model input. After estimating the model parameters and the Wilson-Hilferty transformation parameters, synthetic time series of mean monthly flows were simulated. These were compared with the observed hydrological time series using basic statistical characteristics (e.g. mean, standard deviation and skewness) to test the quality of the model simulation. The synthetic hydrological series of monthly flows were created having the same statistical properties as the time series observed in the past. The compiled model was able to take into account the diversity of extreme hydrological situations in the form of synthetic series of mean monthly flows, confirming the occurrence of sets of flows that could occur in the future. The results of stochastic modelling in the form of synthetic time series of mean monthly flows, which take into account the seasonal fluctuations of runoff within the year, could be applicable in engineering hydrology (e.g. for optimum use of an existing water management system, including reassessment of its economic risks).
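A compact sketch of the Thomas-Fiering monthly model with the Wilson-Hilferty transformation used above: each month's flow regresses on the previous month through monthly lag-1 correlations, and the standard normal innovation is skew-transformed. The monthly statistics below are made up; in practice they would be estimated from the observed record.

```python
import numpy as np

def wilson_hilferty(xi, gamma):
    """Transform a standard normal variate xi into one with skewness gamma."""
    if abs(gamma) < 1e-8:
        return xi
    return (2.0 / gamma) * (1.0 + gamma * xi / 6.0 - gamma**2 / 36.0) ** 3 - 2.0 / gamma

def thomas_fiering(means, stds, r, skews, n_months, rng=None):
    """Synthetic monthly flows; means/stds/r/skews are length-12 arrays of
    monthly statistics (r[j] is the lag-1 correlation month j -> j+1)."""
    rng = rng or np.random.default_rng()
    q = np.empty(n_months)
    q[0] = means[0]
    for i in range(1, n_months):
        j, k = (i - 1) % 12, i % 12           # previous / current month index
        b = r[j] * stds[k] / stds[j]          # regression coefficient
        eps = wilson_hilferty(rng.normal(), skews[k])
        q[i] = means[k] + b * (q[i - 1] - means[j]) + eps * stds[k] * np.sqrt(1 - r[j]**2)
    return np.clip(q, 0.0, None)              # flows cannot be negative

rng = np.random.default_rng(3)
means = 5 + 3 * np.sin(2 * np.pi * np.arange(12) / 12)   # synthetic m^3/s
stds = 0.4 * means
r = np.full(12, 0.6)
skews = np.full(12, 1.0)
print(thomas_fiering(means, stds, r, skews, n_months=120, rng=rng)[:6].round(2))
```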
Improved nonlinear prediction method
NASA Astrophysics Data System (ADS)
Adenan, Nur Hamiza; Md Noorani, Mohd Salmi
2014-06-01
The analysis and prediction of time series data have been widely addressed by researchers. Many techniques have been developed and applied in various areas, such as weather forecasting, financial markets and hydrological phenomena, involving data that are contaminated by noise. Various techniques have therefore been introduced to improve the analysis and prediction of time series data. In view of the importance of analysis and the accuracy of prediction results, a study was undertaken to test the effectiveness of an improved nonlinear prediction method for data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. Then, phase space reconstruction is performed on the composite (one-dimensional) data to reconstruct a number of space dimensions. Finally, the local linear approximation method is employed to make a prediction based on the phase space. This improved method was tested on logistic map data series containing 0%, 5%, 10%, 20% and 30% noise. The results show that with the improved method, the predictions were in close agreement with the observed values. The correlation coefficient was close to one when the improved method was applied to data with up to 10% noise. Thus, an improvement for analyzing noisy data without any noise reduction step was introduced to predict time series data.
NASA Astrophysics Data System (ADS)
Watkins, Nicholas; Clarke, Richard; Freeman, Mervyn
2002-11-01
We discuss how the ideal formalism of Computational Mechanics can be adapted to apply to a non-infinite series of corrupted and correlated data, as is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the new concept of effective soficity is introduced. The benefits of these new concepts are demonstrated on simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to the analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected, interpreted as the diurnal variation due to the rotation of the earth-based station under an electrical current pattern that is fixed with respect to the sun-earth axis, and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, a hypothesis is advanced about model construction in general (see also Clarke et al.; arXiv:cond-mat/0110228).
NASA Astrophysics Data System (ADS)
Caro Cuenca, Miguel; Esfahany, Sami Samiei; Hanssen, Ramon F.
2010-12-01
Persistent scatterer radar interferometry (PSI) can provide a wealth of information on surface motion. These methods overcome the major limitations of the predecessor technique, interferometric SAR (InSAR), such as atmospheric disturbances, by detecting scatterers that are only slightly affected by noise. The time span over which surface deformation processes can be observed is limited by the satellite lifetime, which is usually less than 10 years. However, most deformation phenomena last longer. In order to fully monitor and comprehend the observed signal, acquisitions from different sensors can be merged. This is a complex task for one main reason: PSI methods provide estimates that are relative in time to one of the acquisitions, referred to as the master or reference image. Therefore, time series acquired by different sensors will have different reference images and cannot be directly compared or joined unless they are set to the same time reference system. In global terms, the operation of translating from one reference system to another consists of calculating a vertical offset, which is the total deformation that occurs between the two master times. To estimate this offset, different strategies can be applied, for example using additional data such as leveling or GPS measurements. In this contribution, we propose to use a least-squares approach to merge PSI time series without any ancillary information. This method treats the time series individually, i.e. per PS, and requires some knowledge of the deformation signal, for example whether a polynomial would fairly describe the expected behavior. To test the proposed approach, we applied it to the southern Netherlands, where the surface is affected by ground water processes in abandoned mines. The time series were obtained by processing images provided by ERS1/2 and Envisat. The results were validated using in-situ water measurements, which show very high correlation with the deformation time series.
Rigler, E. Joshua
2017-04-26
A theoretical basis and prototype numerical algorithm are provided that decompose regular time series of geomagnetic observations into three components: secular variation, solar quiet, and disturbance. Respectively, these three components correspond roughly to slow changes in the Earth’s internal magnetic field, periodic daily variations caused by quasi-stationary (with respect to the sun) electrical current systems in the Earth’s magnetosphere, and episodic perturbations to the geomagnetic baseline that are typically driven by fluctuations in the solar wind that interacts electromagnetically with the Earth’s magnetosphere. In contrast to similar algorithms applied to geomagnetic data in the past, this one addresses the issue of real-time data acquisition directly by applying a time-causal, exponential smoother with “seasonal corrections” to the data as soon as they become available.
Segmentation of time series with long-range fractal correlations
Bernaola-Galván, P.; Oliver, J.L.; Hackenberg, M.; Coronado, A.V.; Ivanov, P.Ch.; Carpena, P.
2012-01-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome. PMID:23645997
Identification of human operator performance models utilizing time series analysis
NASA Technical Reports Server (NTRS)
Holden, F. M.; Shinners, S. M.
1973-01-01
The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
ERIC Educational Resources Information Center
Zvoch, Keith
2016-01-01
Piecewise growth models (PGMs) were used to estimate and model changes in the preliteracy skill development of kindergartners in a moderately sized school district in the Pacific Northwest. PGMs were applied to interrupted time-series (ITS) data that arose within the context of a response-to-intervention (RtI) instructional framework. During the…
One nanosecond time synchronization using series and GPS
NASA Technical Reports Server (NTRS)
Buennagel, A. A.; Spitzmesser, D. J.; Young, L. E.
1983-01-01
Subnanosecond time synchronization between two remote rubidium frequency standards is verified by a traveling clock comparison. Using a novel, code-ignorant Global Positioning System (GPS) receiver developed at JPL, the SERIES geodetic baseline measurement system is applied to establish the offset between the 1 Hz outputs of the remote standards. Results of the two intercomparison experiments to date are presented, as well as experimental details.
Scott L. Powell; Warren B. Cohen; Sean P. Healey; Robert E. Kennedy; Gretchen G. Moisen; Kenneth B. Pierce; Janet L. Ohmann
2010-01-01
Spatially and temporally explicit knowledge of biomass dynamics at broad scales is critical to understanding how forest disturbance and regrowth processes influence carbon dynamics. We modeled live, aboveground tree biomass using Forest Inventory and Analysis (FIA) field data and applied the models to 20+ year time-series of Landsat satellite imagery to...
Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro
2004-05-01
A delayed coincidence method, time-interval analysis (TIA), has been applied to successive alpha-alpha decay events on the millisecond time-scale. Such decay events are part of the ²²⁰Rn → ²¹⁶Po (T1/2 = 145 ms) (Th-series) and ²¹⁹Rn → ²¹⁵Po (T1/2 = 1.78 ms) (Ac-series) decays. By using TIA in addition to measurement of ²²⁶Ra (U-series) from alpha-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject beta pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N₂ gas. The U- and Th-series, together with the Ac-series, were determined from alpha spectra and TIA carried out immediately after Ra extraction. Using the ²²¹Fr → ²¹⁷At (T1/2 = 32.3 ms) decay process as a tracer, overall yields were estimated by applying TIA to ²²⁵Ra (Np-decay series) at the time of maximum growth. The present method has proven useful for the simultaneous determination of three radioactive decay series in environmental samples.
NASA Astrophysics Data System (ADS)
Guijarro, José A.; López, José A.; Aguilar, Enric; Domonkos, Peter; Venema, Victor; Sigró, Javier; Brunet, Manola
2017-04-01
After the successful inter-comparison of homogenization methods carried out in the COST Action ES0601 (HOME), many methods kept improving their algorithms, suggesting the need for new inter-comparison exercises. However, manual application of the methodologies to a large number of testing networks cannot be afforded without involving the work of many researchers over an extended time. The alternative is to make the comparisons as automatic as possible, as in the MULTITEST project, which, funded by the Spanish Ministry of Economy and Competitiveness, tests homogenization methods by applying them to a large number of synthetic networks of monthly temperature and precipitation. One hundred networks of 10 series were sampled from different master networks containing 100 series of 720 values (60 years times 12 months). Three master temperature networks were built with different degrees of cross-correlation between the series in order to simulate conditions of different station densities or climatic heterogeneity. Three master synthetic networks were also developed for precipitation, this time mimicking the characteristics of three different climates: Atlantic temperate, Mediterranean and monsoonal. Inhomogeneities were introduced in every network sampled from the master networks, and all publicly available homogenization methods that we could run in an automatic way were applied to them: ACMANT 3.0, Climatol 3.0, MASH 3.03, RHTestV4, USHCN v52d and HOMER 2.6. Most of them were tested with different settings, and their comparative results can be inspected in box plots of the root mean squared errors and trend biases computed between the homogenized data and their original homogeneous series. In a first stage, inhomogeneities were applied to the synthetic homogeneous series under five settings of increasing difficulty and realism: i) big shifts in half of the series; ii) the same with a strong seasonality; iii) short-term platforms and local trends; iv) a random number of shifts with random size and location in all series; and v) the same plus a seasonality of random amplitude. The shifts were additive for temperature and multiplicative for precipitation. The second stage is dedicated to studying the impact of the number of series in the networks, seasonalities other than sinusoidal, and the occurrence of simultaneous shifts in a high number of series. Finally, tests will be performed on a longer and more realistic benchmark, with a varying number of missing data over time, similar to that used in the COST Action ES0601. These inter-comparisons will be valuable both to the users and to the developers of the tested packages, who can see how their algorithms behave under varied climate conditions.
A generalized conditional heteroscedastic model for temperature downscaling
NASA Astrophysics Data System (ADS)
Modarres, R.; Ouarda, T. B. M. J.
2014-11-01
This study describes a method for deriving the time-varying second-order moment, or heteroscedasticity, of local daily temperature and its association with large-scale predictors from the Canadian Coupled General Circulation Model (CGCM). This is carried out by applying a multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) approach to construct the conditional variance-covariance structure between the GCM predictors and maximum and minimum temperature time series during 1980-2000. Two MGARCH specifications, namely diagonal VECH and dynamic conditional correlation (DCC), were applied, and 25 GCM predictors were selected for bivariate temperature heteroscedastic modeling. It is observed that the conditional covariance between predictors and temperature is not very strong and mostly depends on the interaction between the random processes governing the temporal variation of the predictors and predictands. The DCC model reveals a time-varying conditional correlation between GCM predictors and temperature time series. No remarkable increasing or decreasing change is observed in the correlation coefficients between GCM predictors and observed temperature during 1980-2000, while a weak winter-summer seasonality is clear for both conditional covariance and correlation. Furthermore, the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) stationarity test and the Brock-Dechert-Scheinkman (BDS) nonlinearity test showed that the GCM predictors, temperature and their conditional correlation time series are nonlinear but stationary during 1980-2000. However, the degree of nonlinearity of the temperature time series is higher than that of most of the GCM predictors.
Deconvolution of mixing time series on a graph
Blocker, Alexander W.; Airoldi, Edoardo M.
2013-01-01
In many applications we are interested in making inference on latent time series from indirect measurements, which are often low-dimensional projections resulting from mixing or aggregation. Positron emission tomography, super-resolution, and network traffic monitoring are some examples. Inference in such settings requires solving a sequence of ill-posed inverse problems, yt = Axt, where the projection mechanism provides information on A. We consider problems in which A specifies mixing on a graph of time series that are bursty and sparse. We develop a multilevel state-space model for mixing time series and an efficient approach to inference. A simple model is used to calibrate regularization parameters that lead to efficient inference in the multilevel state-space model. We apply this method to the problem of estimating point-to-point traffic flows on a network from aggregate measurements. Our solution outperforms existing methods for this problem, and our two-stage approach suggests an efficient inference strategy for multilevel models of multivariate time series. PMID:25309135
A Method for Comparing Multivariate Time Series with Different Dimensions
Tapinos, Avraam; Mendes, Pedro
2013-01-01
In many situations it is desirable to compare dynamical systems based on their behavior. Similarity of behavior often implies similarity of internal mechanisms or dependency on common extrinsic factors. While there are widely used methods for comparing univariate time series, most dynamical systems are characterized by multivariate time series. Yet, comparison of multivariate time series has been limited to cases where they share a common dimensionality. A semi-metric is a distance function that has the properties of non-negativity, symmetry and reflexivity, but not sub-additivity. Here we develop a semi-metric – SMETS – that can be used for comparing groups of time series that may have different dimensions. To demonstrate its utility, the method is applied to dynamic models of biochemical networks and to portfolios of shares. The former is an example of a case where the dependencies between system variables are known, while in the latter the system is treated (and behaves) as a black box. PMID:23393554
Characterization of chaotic attractors under noise: A recurrence network perspective
NASA Astrophysics Data System (ADS)
Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.
2016-12-01
We undertake a detailed numerical investigation to understand how the addition of white and colored noise to a chaotic time series changes the topology and the structure of the underlying attractor reconstructed from the time series. We use the methods and measures of the recurrence plot and the recurrence network generated from the time series for this analysis. We explicitly show that the addition of noise obscures the property of recurrence of trajectory points in the phase space, which is the hallmark of every dynamical system. However, the structure of the attractor is found to be robust even up to high noise levels of 50%. An advantage of recurrence network measures over the conventional nonlinear measures is that they can be applied to short and non-stationary time series data. Using the results obtained from the above analysis, we go on to analyse the light curves from a dominant black hole system and show that the recurrence network measures are capable of identifying the nature of noise contamination in a time series.
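The recurrence matrix underlying both tools is straightforward to sketch: delay-embed the series, threshold pairwise distances, and read the result (with the main diagonal zeroed) as the adjacency matrix of an undirected recurrence network. The embedding parameters and the 10% recurrence-rate threshold below are illustrative choices.

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=2, eps=None):
    """Binary recurrence matrix of a time-delay embedded scalar series; with
    the main diagonal zeroed it doubles as the adjacency matrix of an
    unweighted, undirected recurrence network."""
    x = np.asarray(x, dtype=float)
    n = x.size - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    eps = eps if eps is not None else np.quantile(d, 0.1)   # ~10% recurrence rate
    rec = (d <= eps).astype(int)
    np.fill_diagonal(rec, 0)
    return rec

t = np.linspace(0, 60, 600)
x = np.sin(t) + 0.1 * np.random.default_rng(9).normal(size=t.size)  # noisy cycle
A = recurrence_matrix(x)
print("mean degree:", A.sum(axis=1).mean())
```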
RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.
Stránský, V; Thinová, L
2017-11-01
In 2010, continuous radon measurement was established at the Mladeč Caves in the Czech Republic using the continuous radon monitor RADIM3A. In order to model the radon time series over the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal autoregressive integrated moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the seasonality of the time series, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor.
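For the model class named above, statsmodels provides a direct implementation. The synthetic "radon" and "temperature" series and the exogenous setup below are stand-ins for the measured data, with the (5,1,3) orders echoing the regARIMA(5,1,3) label; the seasonal and lag settings of the actual study are not reproduced.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic stand-ins: a drifting "radon" series weakly coupled to a daily
# "temperature" cycle used as an exogenous regressor.
rng = np.random.default_rng(10)
n = 500
temp = 10 + 5 * np.sin(2 * np.pi * np.arange(n) / 24) + rng.normal(size=n)
radon = np.cumsum(rng.normal(size=n)) - 0.3 * temp + 50

model = SARIMAX(radon, exog=temp.reshape(-1, 1), order=(5, 1, 3))
fit = model.fit(disp=False)
forecast = fit.forecast(steps=24, exog=temp[-24:].reshape(-1, 1))
print(forecast[:5])
```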
On entropy, financial markets and minority games
NASA Astrophysics Data System (ADS)
Zapart, Christopher A.
2009-04-01
The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... the applicable quantitative and liquidity measures contained in the Rule 5300, 5400 and 5500 Series... July 25, 2011, the Commission extended the time period in which to either approve the proposed rule... length of time before applying to list; (iv) applying the price requirement using closing prices, both...
Model-based Clustering of Categorical Time Series with Multinomial Logit Classification
NASA Astrophysics Data System (ADS)
Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea
2010-09-01
A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership, we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme representing a (full) Gibbs-type sampler which involves only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented, which leads to an interesting segmentation of the Austrian labour market.
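The building block of Markov chain clustering is the maximum likelihood estimate of a first-order transition matrix per time series; a minimal sketch (illustrative only, with the mixture over such matrices then estimated by MCMC as the abstract describes) follows.

```python
import numpy as np

def transition_matrix(seq, n_states):
    """MLE of a first-order, time-homogeneous transition matrix from a
    categorical sequence coded as integers 0..n_states-1."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

seq = np.array([0, 0, 1, 2, 1, 0, 2, 2, 2, 1])   # toy wage-category sequence
P = transition_matrix(seq, n_states=3)           # row-stochastic estimate
```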
Using First Differences to Reduce Inhomogeneity in Radiosonde Temperature Datasets.
NASA Astrophysics Data System (ADS)
Free, Melissa; Angell, James K.; Durre, Imke; Lanzante, John; Peterson, Thomas C.; Seidel, Dian J.
2004-11-01
The utility of a “first difference” method for producing temporally homogeneous large-scale mean time series is assessed. Starting with monthly averages, the method involves dropping data around the time of suspected discontinuities and then calculating differences in temperature from one year to the next, resulting in a time series of year-to-year differences for each month at each station. These first difference time series are then combined to form large-scale means, and mean temperature time series are constructed from the first difference series. When applied to radiosonde temperature data, the method introduces random errors that decrease with the number of station time series used to create the large-scale time series and increase with the number of temporal gaps in the station time series. Root-mean-square errors for annual means of datasets produced with this method using over 500 stations are estimated at no more than 0.03 K, with errors in trends less than 0.02 K decade^-1 for 1960-97 at 500 mb. For a 50-station dataset, errors in trends in annual global means introduced by the first differencing procedure may be as large as 0.06 K decade^-1 (for six breaks per series), which is greater than the standard error of the trend. Although the first difference method offers significant resource and labor advantages over methods that attempt to adjust the data, it introduces an error in large-scale mean time series that may be unacceptable in some cases.
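The first difference construction is easy to state in code; the sketch below (hypothetical array layout) differences each station's series year to year, averages the differences across stations, and cumulates them back into a large-scale mean series defined up to an additive constant.

```python
import numpy as np

def first_difference_mean(x):
    """x: (n_stations, n_years) monthly-mean temperatures for one calendar
    month, with NaN marking gaps and data dropped around breaks."""
    d = x[:, 1:] - x[:, :-1]            # year-to-year differences per station
    dbar = np.nanmean(d, axis=0)        # large-scale mean difference series
    # integrate back; the level is arbitrary, trends are preserved
    return np.concatenate([[0.0], np.nancumsum(dbar)])

x = np.array([[14.2, 14.5, np.nan, 14.9],
              [13.8, 14.1, 14.3, np.nan]])   # toy 2-station, 4-year panel
mean_series = first_difference_mean(x)
```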
Time Series Forecasting of the Number of Malaysia Airlines and AirAsia Passengers
NASA Astrophysics Data System (ADS)
Asrah, N. M.; Nor, M. E.; Rahim, S. N. A.; Leng, W. K.
2018-04-01
Standard practice in forecasting involves fitting a model and then analysing the residuals. If the distributional behaviour of the time series data is known, it can directly guide model identification, parameter estimation, and model checking. In this paper, we compare the distributional behaviour of the numbers of Malaysia Airlines (MAS) and AirAsia passengers. Previous research showed that AirAsia passenger numbers are governed by geometric Brownian motion (GBM): the data were normally distributed, stationary and independent. GBM was therefore used to forecast the number of AirAsia passengers. The same methods were applied to the MAS data and the results were compared. The MAS data, however, turned out not to be governed by GBM, so the standard time series forecasting approach was applied to them instead. From this comparison, we conclude that the number of AirAsia passengers is always in peak season, unlike that of MAS passengers.
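For reference, a GBM forecast of the kind attributed to the AirAsia series can be sketched as follows (toy code under stated assumptions; the drift and volatility estimators are the usual log-return moments, not necessarily the paper's exact procedure).

```python
import numpy as np

def gbm_forecast(x, horizon, n_paths=1000, seed=0):
    """Simulate GBM continuations of a positive series x (e.g. monthly
    passenger counts, hypothetical) for `horizon` future steps."""
    r = np.diff(np.log(x))
    m, s = r.mean(), r.std(ddof=1)       # mean and std of log-returns
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_paths, horizon))
    log_paths = np.log(x[-1]) + np.cumsum(m + s * z, axis=1)
    return np.exp(log_paths)             # simulated future trajectories

# point forecast and a crude 90% band from the simulated paths:
# paths = gbm_forecast(passengers, horizon=12)
# np.median(paths, axis=0); np.percentile(paths, [5, 95], axis=0)
```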
Scale invariance in chaotic time series: Classical and quantum examples
NASA Astrophysics Data System (ADS)
Landa, Emmanuel; Morales, Irving O.; Stránský, Pavel; Fossion, Rubén; Velázquez, Victor; López Vieyra, J. C.; Frank, Alejandro
Important aspects of chaotic behavior appear in systems of low dimension, as illustrated by the map module 1. It is indeed a remarkable fact that all systems that make a transition from order to disorder display common properties, irrespective of their exact functional form. We discuss evidence for 1/f power spectra in the chaotic time series associated with classical and quantum examples, the one-dimensional map module 1 and the spectrum of 48Ca. A Detrended Fluctuation Analysis (DFA) method is applied to investigate the scaling properties of the energy fluctuations in the spectrum of 48Ca, obtained with a large realistic shell model calculation (ANTOINE code) and with a random shell model (TBRE) calculation, as well as in the time series obtained with the map module 1. We compare the scale invariant properties of the 48Ca nuclear spectrum with similar analyses applied to the RMT ensembles GOE and GDE. A comparison with the corresponding power spectra is made in both cases. The possible consequences of the results are discussed.
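A bare-bones DFA implementation helps fix the idea (illustrative sketch; the windowing and scale choices are assumptions): the series is integrated, each window is linearly detrended, and the scaling exponent is the log-log slope of the fluctuation function, with α ≈ 1 corresponding to 1/f behaviour.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: fluctuation F(s) per window size s."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = []
        for seg in segs:                          # linear detrend per window
            coef = np.polyfit(t, seg, 1)
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.asarray(F)

x = np.random.default_rng(0).standard_normal(4096)   # toy input series
scales = 2 ** np.arange(2, 9)                        # window sizes 4 ... 256
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]  # ~0.5 here
```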
Observability of nonlinear dynamics: normalized results and a time-series approach.
Aguirre, Luis A; Bastos, Saulo B; Alves, Marcela A; Letellier, Christophe
2008-03-01
This paper investigates the observability of nonlinear dynamical systems. Two difficulties associated with previous studies are dealt with. First, a normalized degree of observability is defined. This permits the comparison of different systems, which was not generally possible before. Second, a time-series approach is proposed based on omnidirectional nonlinear correlation functions to rank a set of time series of a system in terms of their potential use to reconstruct the original dynamics without requiring knowledge of the system equations. The two approaches proposed in this paper and a former method were applied to five benchmark systems, and an overall agreement of over 92% was found.
Time series analysis of InSAR data: Methods and trends
NASA Astrophysics Data System (ADS)
Osmanoğlu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cabral-Cano, Enrique
2016-05-01
Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
An Iterative Time Windowed Signature Algorithm for Time Dependent Transcription Module Discovery
Meng, Jia; Gao, Shou-Jiang; Huang, Yufei
2010-01-01
An algorithm for the discovery of time-varying modules using genome-wide expression data is presented here. When applied to large-scale time series data, our method is designed to discover not only the transcription modules but also their timing information, which is rarely annotated by existing approaches. Rather than assuming commonly defined time-constant transcription modules, a module is depicted as a set of genes that are co-regulated during a specific period of time, i.e., a time dependent transcription module (TDTM). A rigorous mathematical definition of TDTM is provided, which serves as an objective function for retrieving modules. Based on the definition, an effective signature algorithm is proposed that iteratively searches for transcription modules in the time series data. The proposed method was tested on simulated systems and applied to human time series microarray data during Kaposi's sarcoma-associated herpesvirus (KSHV) infection. The results have been verified with the Expression Analysis Systematic Explorer. PMID:21552463
Jung, Kwanghee; Takane, Yoshio; Hwang, Heungsun; Woodward, Todd S
2016-06-01
We extend dynamic generalized structured component analysis (GSCA) to enhance its data-analytic capability in structural equation modeling of multi-subject time series data. Time series data of multiple subjects are typically hierarchically structured, where time points are nested within subjects who are in turn nested within a group. The proposed approach, named multilevel dynamic GSCA, accommodates this nested structure in time series data. By explicitly taking the nested structure into account, the proposed method allows investigating subject-wise variability of the loadings and path coefficients by looking at the variance estimates of the corresponding random effects, as well as fixed loadings between observed and latent variables and fixed path coefficients between latent variables. We demonstrate the effectiveness of the proposed approach by applying the method to multi-subject functional neuroimaging data for brain connectivity analysis, where time series measurements are nested within subjects.
The long-term changes in total ozone, as derived from Dobson measurements at Arosa (1948-2001)
NASA Astrophysics Data System (ADS)
Krzyscin, J. W.
2003-04-01
The longest possible total ozone time series (Arosa, Switzerland) is examined for the detection of trends. A two-step procedure is proposed to estimate the long-term (decadal) variations in the ozone time series. The first step consists of a standard least-squares multiple regression applied to the total ozone monthly means to parameterize "natural" (related to oscillations in atmospheric dynamics) variations in the analyzed time series. The standard proxies for the dynamical ozone variations are used, including the 11-year solar activity cycle and indices of the QBO, ENSO and NAO. We use the detrended time series of temperature at 100 hPa and 500 hPa over Arosa to parameterize short-term variations (with time periods < 1 year) in total ozone related to local changes in the meteorological conditions over the station. The second step consists of a smooth-curve fitting to the total ozone residuals (original minus modeled "natural" time series), time differentiation of this curve to obtain local trends, and bootstrapping of the residual time series to estimate the standard error of the local trends. Locally weighted regression and wavelet analysis methodology are used to extract the smooth component out of the residual time series. The time integral over the local trend values provides the cumulative long-term change since the beginning of the data. Examining the pattern of the cumulative change, we see periods of total ozone loss (the end of the 1950s up to the early 1960s, probably the effect of the nuclear bomb tests), recovery (the mid-1960s up to the beginning of the 1970s), apparent decrease (the beginning of the 1970s lasting to the mid-1990s, probably the effect of contamination of the atmosphere by anthropogenic substances containing chlorine), and a kind of stabilization or recovery (starting in the mid-1990s, probably the effect of the Montreal Protocol to eliminate substances depleting the ozone layer). We can also estimate that a full ozone recovery (return to the undisturbed total ozone level of the beginning of the 1970s) is expected around 2050. We propose to calculate both the time series of local trends and the cumulative long-term change instead of a single trend value derived as the slope of a straight-line fit to the data.
The Recalibrated Sunspot Number: Impact on Solar Cycle Predictions
NASA Astrophysics Data System (ADS)
Clette, F.; Lefevre, L.
2017-12-01
Recently, and for the first time since their creation, the sunspot number and group number series were entirely revisited, and a first fully recalibrated version was officially released in July 2015 by the World Data Center SILSO (Brussels). Those reference long-term series are widely used as input data or as a calibration reference by various solar cycle prediction methods. Therefore, past predictions may now need to be redone using the new sunspot series, and methods already used for predicting cycle 24 will require adaptations before attempting predictions of the next cycles. In order to clarify the nature of the applied changes, we describe the different corrections applied to the sunspot and group number series, which affect extended time periods and can reach up to 40%. While some changes simply involve constant scale factors, other corrections vary with time or follow the solar cycle modulation. Depending on the prediction method and on the selected time interval, this can lead to different responses and biases. Moreover, together with the new series, standard error estimates are progressively being added to the new sunspot numbers, which may help derive more accurate uncertainties for predicted activity indices. We conclude on the new round of recalibration now undertaken in the framework of a broad multi-team collaboration articulated around upcoming ISSI workshops. We outline the corrections that can still be expected in the future as part of a permanent upgrading and quality control process. From now on, sunspot-based predictive models should thus be made more adaptable, and regular updates of predictions should become common practice in order to track periodic upgrades of the sunspot number series, just as is done with other modern solar observational series.
Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.
Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa
2017-02-01
Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture (for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments) as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS), known as STL. The data series, daily Poaceae pollen concentrations over the period 2006-2014, was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. The correlation coefficient between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than analysis of the original non-decomposed data series, and for this reason this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
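In practice this pipeline maps closely onto standard tooling; a condensed sketch is shown below (the file name, column name, and parameters are assumptions, and `X_meteo` is a hypothetical predictor matrix), with the residual regression step mirroring the PLSR stage described above.

```python
# Hedged sketch of the STL + PLSR pipeline on a daily pollen series.
import pandas as pd
from statsmodels.tsa.seasonal import STL
from sklearn.cross_decomposition import PLSRegression

pollen = pd.read_csv("poaceae.csv", index_col=0, parse_dates=True)["conc"]
stl = STL(pollen, period=365, robust=True).fit()   # seasonal-trend decomposition
resid = stl.resid

# X_meteo: daily temperature/rainfall predictors aligned with `resid` (assumed)
pls = PLSRegression(n_components=2).fit(X_meteo, resid)
resid_hat = pls.predict(X_meteo).ravel()
forecast = stl.seasonal + resid_hat                # recompose a daily estimate
```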
Detailed Characterization of Hypervelocity Firings in a Long 120-MM Gun
1991-03-01
Figure 14. Experimental Pressure-Time Curves for Round 2L of Second Phase Firing Series. ... series led to similarly good agreement in modeling the second ... Applied Ballistics Branch, and Messrs. R. May, D. Meier, and S. Little of Applied Concepts Corporation are acknowledged for their usual high level of ...
NASA Astrophysics Data System (ADS)
Nahar, Jannatun; Johnson, Fiona; Sharma, Ashish
2018-02-01
Conventional bias correction is usually applied on a grid-by-grid basis, meaning that the resulting corrections cannot address biases in the spatial distribution of climate variables. To solve this problem, a two-step bias correction method is proposed here to correct time series at multiple locations conjointly. The first step transforms the data to a set of statistically independent univariate time series, using a technique known as independent component analysis (ICA). The mutually independent signals can then be bias corrected as univariate time series and back-transformed to improve the representation of spatial dependence in the data. The spatially corrected data are then bias corrected at the grid scale in the second step. The method has been applied to two CMIP5 General Circulation Model simulations for six different climate regions of Australia for two climate variables—temperature and precipitation. The results demonstrate that the ICA-based technique leads to considerable improvements in temperature simulations with more modest improvements in precipitation. Overall, the method results in current climate simulations that have greater equivalency in space and time with observational data.
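One possible reading of the two-step scheme is sketched below; it is an illustration only (all arrays are synthetic, scikit-learn's FastICA stands in for the ICA step, and reusing the unmixing learned on the model data for the observations is a simplifying assumption).

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
model = rng.gamma(2.0, 1.0, size=(1000, 5))   # synthetic GCM series, 5 grid cells
obs = rng.gamma(2.0, 1.2, size=(1000, 5))     # synthetic observations

def quantile_map(x, target):
    """Map the values of x onto the empirical quantiles of target."""
    ranks = np.argsort(np.argsort(x))
    return np.sort(target)[ranks]

# Step 1: correct the statistically independent components, then back-transform
ica = FastICA(n_components=5, random_state=0)
s_model = ica.fit_transform(model)
s_obs = ica.transform(obs)                    # simplifying assumption (see above)
s_corr = np.column_stack([quantile_map(s_model[:, i], s_obs[:, i])
                          for i in range(s_model.shape[1])])
spatial = ica.inverse_transform(s_corr)

# Step 2: ordinary grid-by-grid quantile mapping of the result
corrected = np.column_stack([quantile_map(spatial[:, i], obs[:, i])
                             for i in range(obs.shape[1])])
```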
Dunea, Daniel; Pohoata, Alin; Iordache, Stefania
2015-07-01
The paper presents the screening of various feedforward artificial neural networks (FANN) and wavelet-feedforward neural networks (WFANN) applied to time series of ground-level ozone (O3), nitrogen dioxide (NO2), and particulate matter (PM10 and PM2.5 fractions) recorded at four monitoring stations located in various urban areas of Romania, to identify common configurations with optimal generalization performance. Two distinct model runs were performed: data processing using hourly-recorded time series of airborne pollutants during cold months (O3, NO2, and PM10), when residential heating increases the local emissions, and data processing using 24-h daily averaged concentrations (PM2.5) recorded between 2009 and 2012. Dataset variability was assessed using statistical analysis. The time series were passed through various FANNs. Each time series was also decomposed into four time-scale components using three-level wavelets, which were passed through a FANN and recomposed into a single time series. The agreement between observed and modelled output was evaluated based on statistical significance (the r coefficient and the correlation between errors and data). A Daubechies db3 wavelet with a Rprop-trained FANN (6-4-1) gave positive results for the O3 time series, improving on the exclusive use of the FANN for hourly-recorded series. NO2 was difficult to model due to the specificity of its time series, but wavelet integration improved FANN performance. The Daubechies db3 wavelet did not improve the FANN outputs for the PM10 time series. Both models (FANN/WFANN) overestimated PM2.5 forecasts in the last quarter of the time series; a potential improvement could be the integration of a smoothing algorithm to adjust the PM2.5 model outputs.
Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system
NASA Astrophysics Data System (ADS)
Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.
2017-05-01
We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced and damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated solely based on simulation and/or experimental measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the non-linear element in the problem with a priori knowledge about its position.
The conditional resampling model STARS: weaknesses of the modeling concept and development
NASA Astrophysics Data System (ADS)
Menz, Christoph
2016-04-01
The Statistical Analogue Resampling Scheme (STARS) is based on a modeling concept of Werner and Gerstengarbe (1997). The model uses a conditional resampling technique to create a simulation time series from daily observations. Unlike other time series generators (such as stochastic weather generators), STARS only needs a linear regression specification of a single variable as the target condition for the resampling. Since its first implementation, the algorithm has been extended to allow for a spatially distributed trend signal and to preserve the seasonal cycle and the autocorrelation of the observation time series (Orlovsky, 2007; Orlovsky et al., 2008). This evolved version was successfully used in several climate impact studies. However, a detailed evaluation of the simulations revealed two fundamental weaknesses of the resampling technique employed. 1. Restricting the resampling condition to a single variable can lead to a misinterpretation of the change signal of other variables when the model is applied to a multivariate time series (F. Wechsung and M. Wechsung, 2014). As one example, the short-term correlations between precipitation and temperature (cooling of the near-surface air layer after a rainfall event) can be misinterpreted as a climatic change signal in the simulation series. 2. The model restricts the linear regression specification to the annual mean time series, precluding the specification of seasonally varying trends. To overcome these fundamental weaknesses, the whole algorithm was redeveloped. The poster discusses the main weaknesses of the earlier model implementation and the methods applied to overcome these in the new version. Based on the new model, idealized simulations were conducted to illustrate the enhancement.
78 FR 47529 - Airworthiness Directives; Bombardier, Inc. Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... (AD) for certain Bombardier, Inc. Model CL-600-2B19 (Regional Jet Series 100 & 440) airplanes. This AD... To Shorten Compliance Time The Airline Pilots Association International stated it supports the NPRM.... (c) Applicability This AD applies to Bombardier, Inc. Model CL-600-2B19 (Regional Jet Series 100...
Detecting Land Cover Change by Trend and Seasonality of Remote Sensing Time Series
NASA Astrophysics Data System (ADS)
Oliveira, J. C.; Epiphanio, J. N.; Mello, M. P.
2013-05-01
Natural resource managers demand information on the spatiotemporal dynamics of land use and land cover change, and detecting and characterizing change over time is an initial step toward understanding the mechanisms of change. The purpose of this research is to use the BFAST (Breaks For Additive Seasonal and Trend) approach to detect trend and seasonal changes within Normalized Difference Vegetation Index (NDVI) time series. BFAST integrates the decomposition of time series into trend, seasonal, and noise components with methods for detecting change within time series without the need to select a reference period, set a threshold, or define a change trajectory. BFAST iteratively estimates the time and number of changes, and characterizes change by its magnitude and direction. The general model is of the form Y_t = T_t + S_t + e_t (t = 1, 2, 3, ..., n), where Y_t is the observed data at time t, T_t is the trend component, S_t is the seasonal component, and e_t is the remainder component. In this study, MODIS NDVI time series datasets (MOD13Q1) over 11 years (2000-2010) were used for an intensive agricultural area in Mato Grosso, Brazil. First, a noise-reduction filter (4253H, twice) was applied to the spectral curve of each MODIS pixel, and subsequently each time series was decomposed into seasonal, trend, and remainder components by BFAST. One abrupt change was detected for a single forest pixel, and two abrupt changes in the trend component for a pixel in the agricultural area. Figure 1 shows the number of phenological changes based on the seasonal component for the study area. This paper demonstrates the ability of BFAST to detect long-term phenological change by analyzing time series while accounting for abrupt and gradual changes. The algorithm iteratively estimates the dates and number of changes occurring within the seasonal and trend components, and characterizes changes by extracting the magnitude and direction of change. Changes occurring in the seasonal component indicate phenological changes, while changes occurring in the trend component indicate gradual and abrupt change. BFAST can be used to analyze different types of remotely sensed time series and can be applied to other time series, such as in econometrics, climatology, and hydrology. The algorithm used in this study is available in the bfast package for R from CRAN (http://cran.r-project.org/package=bfast).
Modelling short time series in metabolomics: a functional data analysis approach.
Montana, Giovanni; Berk, Maurice; Ebbels, Tim
2011-01-01
Metabolomics is the study of the complement of small molecule metabolites in cells, biofluids and tissues. Many metabolomic experiments are designed to compare changes observed over time under two or more experimental conditions (e.g. a control and drug-treated group), thus producing time course data. Models from traditional time series analysis are often unsuitable because, by design, only very few time points are available and there are a high number of missing values. We propose a functional data analysis approach for modelling short time series arising in metabolomic studies which overcomes these obstacles. Our model assumes that each observed time series is a smooth random curve, and we propose a statistical approach for inferring this curve from repeated measurements taken on the experimental units. A test statistic for detecting differences between temporal profiles associated with two experimental conditions is then presented. The methodology has been applied to NMR spectroscopy data collected in a pre-clinical toxicology study.
Generalised Pareto distribution: impact of rounding on parameter estimation
NASA Astrophysics Data System (ADS)
Pasarić, Z.; Cindrić, K.
2018-05-01
Problems that occur when common methods (e.g. maximum likelihood and L-moments) for fitting a generalised Pareto (GP) distribution are applied to discrete (rounded) data sets are revealed by analysing a real dry spell duration series. The analysis is subsequently performed on generalised Pareto time series obtained by systematic Monte Carlo (MC) simulations. The solution depends on the following: (1) the actual amount of rounding, as determined by the actual data range (measured by the scale parameter, σ) vs. the rounding increment (Δx), combined with (2) applying a certain (sufficiently high) threshold and considering the series of excesses instead of the original series. For a moderate amount of rounding (e.g. σ/Δx ≥ 4), which is commonly met in practice (at least for dry spell data), and where no threshold is applied, the classical methods work reasonably well. If cutting at the threshold is applied to rounded data, which is actually essential when dealing with a GP distribution, then classical methods applied in a standard way can lead to erroneous estimates, even if the rounding itself is moderate. In this case, it is necessary to adjust the theoretical location parameter for the series of excesses. The other solution is to add an appropriate uniform noise to the rounded data (so-called "jittering"). This, in a sense, reverses the process of rounding; thereafter, it is straightforward to apply the common methods. Finally, if the rounding is too coarse (e.g. σ/Δx ≈ 1), then none of the above recipes works, and specific methods for rounded data should be applied.
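The jittering remedy is simple to express; below is an illustrative sketch (variable values and the threshold are assumptions) that adds uniform noise matched to the rounding increment, takes excesses over a threshold, and fits the GP distribution by maximum likelihood.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
d = np.round(rng.gamma(2.0, 4.0, size=2000))    # toy rounded dry-spell durations
dx, u = 1.0, 10.0                               # rounding increment, threshold

d_jit = d + rng.uniform(-dx / 2, dx / 2, size=d.size)   # "jittering"
exc = d_jit[d_jit > u] - u                              # series of excesses
shape, loc, scale = genpareto.fit(exc, floc=0.0)        # ML fit, location fixed
```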
Rainfall disaggregation for urban hydrology: Effects of spatial consistence
NASA Astrophysics Data System (ADS)
Müller, Hannes; Haberlandt, Uwe
2015-04-01
For urban hydrology, rainfall time series with a high temporal resolution are crucial. Observed time series of this kind are very short in most cases, so they cannot be used. In contrast, time series with lower temporal resolution (daily measurements) exist for much longer periods. The objective is to derive time series with a long duration and a high resolution by disaggregating the time series of the non-recording stations with information from the time series of the recording stations. The multiplicative random cascade model is a well-known disaggregation model for daily time series. For urban hydrology it is often assumed that a day consists of only 1280 minutes in total, as the starting point for the disaggregation process. We introduce a new variant of the cascade model which works without this assumption and also outperforms the existing approach regarding time series characteristics such as wet and dry spell duration, average intensity, fraction of dry intervals and extreme value representation. However, in both approaches rainfall time series of different stations are disaggregated without consideration of surrounding stations, which results in unrealistic spatial patterns of rainfall. We apply a simulated annealing algorithm that has been used successfully for hourly values before. Relative diurnal cycles of the disaggregated time series are resampled to reproduce the spatial dependence of rainfall. To describe spatial dependence we use bivariate characteristics such as probability of occurrence, continuity ratio and coefficient of correlation. The investigation area is a sewage system in northern Germany. We show that the algorithm has the capability to improve spatial dependence. The influence of the chosen disaggregation routine and of the spatial dependence on overflow occurrences and volumes of the sewage system will be analyzed.
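To illustrate the cascade idea (a toy, not the paper's calibrated model): starting from a daily total, each branching step splits every interval's rainfall into two parts with random weights, so eight steps turn a 1280-minute day into 5-minute values while conserving the daily sum.

```python
import numpy as np

def cascade_disaggregate(daily_totals, levels=8, seed=0):
    """Toy multiplicative random cascade: 2**levels sub-intervals per day
    (1280 min -> 5-min steps for levels=8). Real cascades use calibrated
    weight distributions, including explicit 0/1 splits for dry intervals."""
    rng = np.random.default_rng(seed)
    out = []
    for total in daily_totals:
        vals = np.array([total], dtype=float)
        for _ in range(levels):
            w = rng.uniform(0.2, 0.8, size=vals.size)    # assumed weight model
            vals = np.column_stack([vals * w, vals * (1 - w)]).ravel()
        out.append(vals)
    return np.concatenate(out)

fine = cascade_disaggregate([12.4, 0.0, 3.1])    # mm/day -> 5-min depths
```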
Real-Time Series Resistance Monitoring in PV Systems Without the Need for I-V Curves
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deceglie, Michael G.; Silverman, Timothy J.; Marion, Bill
We apply the physical principles of a familiar method, suns-Voc, to a new application: the real-time detection of series resistance changes in modules and systems operating outside. The real-time series resistance (RTSR) method that we describe avoids the need for collecting I-V curves or constructing full series-resistance-free I-V curves. RTSR is most readily deployable at the module level on microinverters or module-integrated electronics, but it can also be extended to full strings. We found that automated detection of series resistance increases can provide early warnings of some of the most common reliability issues, which also pose fire risks, including broken ribbons, broken solder bonds, and contact problems in the junction or combiner box. We also describe the method in detail and describe a sample application to data collected from modules operating in the field.
NASA Astrophysics Data System (ADS)
Marcos-Garcia, Patricia; Pulido-Velazquez, Manuel; Lopez-Nicolas, Antonio
2016-04-01
Extreme natural phenomena, and more specifically droughts, constitute a serious environmental, economic and social issue in southern Mediterranean countries, and are common in the Mediterranean Spanish basins due to the high temporal and spatial rainfall variability. Drought events are characterized by their complexity, being often difficult to identify and quantify both in time and space, and a universally accepted definition does not even exist. This fact, along with future uncertainty about the duration and intensity of the phenomena on account of climate change, makes it necessary to increase knowledge about the impacts of climate change on droughts in order to design management plans and mitigation strategies. The present study aims to evaluate the impact of climate change on both meteorological and hydrological droughts through the use of a generalization of the Standardized Precipitation Index (SPI). We use the Standardized Flow Index (SFI) to assess hydrological drought, using flow time series instead of rainfall time series. In the case of meteorological droughts, the Standardized Precipitation and Evapotranspiration Index (SPEI) has been applied to assess the impact of temperature variability. In order to characterize climate change impacts on droughts, we have used projections from the CORDEX project (Coordinated Regional Climate Downscaling Experiment). Future rainfall and temperature time series for the short (2011-2040) and medium terms (2041-2070) were obtained, applying a quantile mapping method to correct the bias of these time series. Regarding hydrological drought, the Témez hydrological model, a conceptual, lumped model with few parameters, has been applied to simulate the impacts of future temperature and rainfall time series on runoff and river discharges. Nevertheless, it is necessary to point out the time lag between meteorological and hydrological droughts. The case study is the Jucar river basin (Spain), a highly regulated system with a share of 80% of water use for irrigated agriculture. The results show that climate change would increase the historical drought impacts in the river basin. Acknowledgments: The study has been supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) and European FEDER funds.
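For concreteness, a standard SPI computation can be sketched as follows (toy code; the gamma-plus-zeros formulation is the common textbook recipe, and applying the same transform to flow series yields the SFI used above).

```python
import numpy as np
from scipy.stats import gamma, norm

def spi(values):
    """SPI for one calendar month's precipitation series; the identical
    transform applied to streamflow gives the SFI."""
    x = np.asarray(values, dtype=float)
    q0 = np.mean(x == 0)                            # probability of zero rain
    a, loc, b = gamma.fit(x[x > 0], floc=0)         # gamma fit to wet months
    cdf = q0 + (1 - q0) * gamma.cdf(x, a, loc=loc, scale=b)
    return norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))   # map to standard normal

# e.g. spi(january_precip) for a 30-year series of January totals (hypothetical)
```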
NASA Astrophysics Data System (ADS)
Zhang, Yongping; Shang, Pengjian; Xiong, Hui; Xia, Jianan
Time irreversibility is an important property of nonequilibrium dynamic systems. A visibility graph approach was recently proposed, and this approach is generally effective for measuring the time irreversibility of time series. However, its result may be unreliable when dealing with high-dimensional systems. In this work, we consider the joint concept of time irreversibility and adopt the phase-space reconstruction technique to improve this visibility graph approach. Compared with the previous approach, the improved approach gives a more accurate estimate of the irreversibility of time series and is more effective at distinguishing irreversible from reversible stochastic processes. We also use this approach to extract multiscale irreversibility to account for the multiple inherent dynamics of time series. Finally, we apply the approach to detect the multiscale irreversibility of financial time series, and succeed in distinguishing times of financial crisis from plateau periods. In addition, the separation of Asian stock indexes from the other indexes is clearly visible at higher time scales. Simulations and real data support the effectiveness of the improved approach for detecting time irreversibility.
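A minimal directed horizontal visibility graph (HVG) sketch conveys the basic idea (naive O(n²) pass, illustrative only): time irreversibility is then scored by comparing the distributions of links pointing to the future and to the past, e.g. via a Kullback-Leibler divergence.

```python
import numpy as np

def hvg_degrees(x):
    """Out-degrees (links to the future) and in-degrees (links from the past)
    of the directed horizontal visibility graph of series x."""
    n = len(x)
    k_out, k_in = np.zeros(n, int), np.zeros(n, int)
    for i in range(n):
        blocker = -np.inf                     # max of points strictly between
        for j in range(i + 1, n):
            if blocker < x[i] and blocker < x[j]:
                k_out[i] += 1
                k_in[j] += 1
            blocker = max(blocker, x[j])
            if blocker >= x[i]:               # nothing beyond j is visible
                break
    return k_out, k_in

def kl_irreversibility(x, kmax=20):
    k_out, k_in = hvg_degrees(x)
    p = np.bincount(k_out, minlength=kmax)[:kmax] + 1e-12
    q = np.bincount(k_in, minlength=kmax)[:kmax] + 1e-12
    p, q = p / p.sum(), q / q.sum()
    return np.sum(p * np.log(p / q))          # ~0 for a reversible series
```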
Testing the shape of distributions of weather data
NASA Astrophysics Data System (ADS)
Baccon, Ana L. P.; Lunardi, José T.
2016-08-01
The characterization of the statistical distributions of observed weather data is of crucial importance both for the construction and for the validation of weather models, such as weather generators (WGs). An important class of WGs (e.g., the Richardson-type generators) reduces the time series of each variable to a time series of its residual elements, and the residuals are often assumed to be normally distributed. In this work we propose an approach to investigate whether the shape assumed for the distribution of residuals is consistent with the observed data of a given site. Specifically, this procedure tests whether the same distribution shape for the residual noise is maintained over time. The proposed approach is an adaptation to climate time series of a procedure first introduced to test the shapes of distributions of growth rates of business firms aggregated in large panels of short time series. We illustrate the procedure by applying it to the residual time series of maximum temperature at a given location, and investigate the empirical consistency of two assumptions, namely (i) the most common assumption that the distribution of the residuals is Gaussian and (ii) that the residual noise has a time-invariant shape which coincides with the empirical distribution of all the residual noise of the whole time series pooled together.
Analyzing developmental processes on an individual level using nonstationary time series modeling.
Molenaar, Peter C M; Sinclair, Katerina O; Rovine, Michael J; Ram, Nilam; Corneal, Sherry E
2009-01-01
Individuals change over time, often in complex ways. Generally, studies of change over time have combined individuals into groups for analysis, which is inappropriate in most, if not all, studies of development. The authors explain how to identify appropriate levels of analysis (individual vs. group) and demonstrate how to estimate changes in developmental processes over time using a multivariate nonstationary time series model. They apply this model to describe the changing relationships between a biological son and father and a stepson and stepfather at the individual level. The authors also explain how to use an extended Kalman filter with iteration and smoothing estimator to capture how dynamics change over time. Finally, they suggest further applications of the multivariate nonstationary time series model and detail the next steps in the development of statistical models used to analyze individual-level data.
Time series segmentation: a new approach based on Genetic Algorithm and Hidden Markov Model
NASA Astrophysics Data System (ADS)
Toreti, A.; Kuglitsch, F. G.; Xoplaki, E.; Luterbacher, J.
2009-04-01
The subdivision of a time series into homogeneous segments has been performed using various methods applied to different disciplines. In climatology, for example, it is accompanied by the well-known homogenization problem and the detection of artificial change points. In this context, we present a new method (GAMM) based on a Hidden Markov Model (HMM) and a Genetic Algorithm (GA), applicable to series of independent observations (and easily adaptable to autoregressive processes). A left-to-right hidden Markov model, estimating the parameters and the best-state sequence with the Baum-Welch and Viterbi algorithms, respectively, was applied. In order to avoid the well-known dependence of the Baum-Welch algorithm on the initial conditions, a Genetic Algorithm was developed, characterized by mutation, elitism and a crossover procedure implemented with some restrictive rules. Moreover, the function to be minimized was derived following the approach of Kehagias (2004), i.e. the so-called complete log-likelihood. The number of states was determined by applying a two-fold cross-validation procedure (Celeux and Durand, 2008). Since this last issue is complex and influences the whole analysis, a Multi-Response Permutation Procedure (MRPP; Mielke et al., 1981) was added: it tests the model with K+1 states (where K is the number of states of the best model) when its likelihood is close to that of the K-state model. Finally, an evaluation of the GAMM performance, applied as a break detection method in the field of climate time series homogenization, is shown. 1. G. Celeux and J.B. Durand, Comput Stat 2008. 2. A. Kehagias, Stoch Envir Res 2004. 3. P.W. Mielke, K.J. Berry, G.W. Brier, Monthly Wea Rev 1981.
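For readers unfamiliar with the HMM machinery, a compact Viterbi decoder is sketched below (a generic numpy version, not the authors' GAMM code); the paper's left-to-right constraint amounts to an upper-triangular transition matrix.

```python
import numpy as np

def viterbi(obs, log_A, log_B, log_pi):
    """Most probable state path for observation indices `obs`.
    log_A: (K,K) transitions, log_B: (K,M) emissions, log_pi: (K,) initial."""
    T, K = len(obs), log_A.shape[0]
    delta = np.zeros((T, K))
    psi = np.zeros((T, K), dtype=int)
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A      # (from-state, to-state)
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):                  # backtrack
        path[t] = psi[t + 1, path[t + 1]]
    return path
```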
Investigation on Law and Economics Based on Complex Network and Time Series Analysis.
Yang, Jian; Qu, Zhao; Chang, Hui
2015-01-01
The research focuses on the cooperative relationships and strategic tendencies among three mutually interacting parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to obtain quantitative evidence. Moreover, this paper builds a fundamental model describing the particular interaction among them through an evolutionary game. Combining the results of the data analysis with the current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary games to the issue of corporate financing.
Random walker in temporally deforming higher-order potential forces observed in a financial crisis.
Watanabe, Kota; Takayasu, Hideki; Takayasu, Misako
2009-11-01
Basic peculiarities of market price fluctuations are known to be well described by a recently developed random-walk model in a temporally deforming quadratic potential force whose center is given by a moving average of past price traces [M. Takayasu, T. Mizuno, and H. Takayasu, Physica A 370, 91 (2006)]. By analyzing high-frequency financial time series of exceptional events, such as bubbles and crashes, we confirm the appearance of higher-order potential forces in the markets. We show the statistical significance of their existence by applying an information criterion. This type of time series analysis is expected to be widely applicable for detecting nonstationary symptoms in random phenomena.
Fasching, George E.
1977-03-08
An improved high-voltage pulse generator has been provided which is especially useful in ultrasonic testing of rock core samples. An N number of capacitors are charged in parallel to V volts and at the proper instant are coupled in series to produce a high-voltage pulse of N times V volts. Rapid switching of the capacitors from the paralleled charging configuration to the series discharging configuration is accomplished by using silicon-controlled rectifiers (SCRs) which are chain self-triggered following the initial triggering of the first of the rectifiers connected between the first and second of the plurality of charging capacitors. A timing and triggering circuit is provided to properly synchronize triggering pulses to the first SCR at a time when the charging voltage is not being applied to the parallel-connected charging capacitors. Alternate circuits are provided for controlling the application of the charging voltage from a charging circuit to the parallel capacitors, providing a selection of at least two different intervals in which the charging voltage is turned "off" to allow the SCRs connecting the capacitors in series to turn "off" before recharging begins. The high-voltage pulse-generating circuit, including the N capacitors and the corresponding SCRs which connect the capacitors in series when triggered "on", further includes diodes and series-connected inductors between the parallel-connected charging capacitors which allow sufficiently fast charging of the capacitors for a high pulse repetition rate and yet allow considerable control of the decay time of the high-voltage pulses from the pulse-generating circuit.
Escudero, Javier; Hornero, Roberto; Abásolo, Daniel
2009-02-01
The mutual information (MI) is a measure of both linear and nonlinear dependences. It can be applied to a time series and a time-delayed version of the same sequence to compute the auto-mutual information function (AMIF). Moreover, the AMIF rate of decrease (AMIFRD) with increasing time delay in a signal is correlated with its entropy and has been used to characterize biomedical data. In this paper, we aimed at gaining insight into the dependence of the AMIFRD on several signal processing concepts and at illustrating its application to biomedical time series analysis. Thus, we have analysed a set of synthetic sequences with the AMIFRD. The results show that the AMIF decreases more quickly as bandwidth increases and that the AMIFRD becomes more negative as there is more white noise contaminating the time series. Additionally, this metric detected changes in the nonlinear dynamics of a signal. Finally, in order to illustrate the analysis of real biomedical signals with the AMIFRD, this metric was applied to electroencephalogram (EEG) signals acquired with eyes open and closed and to ictal and non-ictal intracranial EEG recordings.
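The AMIF and its rate of decrease are straightforward to estimate with histograms; the sketch below is one common plug-in construction (the bin count and the lag range used for the slope are assumptions).

```python
import numpy as np

def mutual_info(x, y, bins=16):
    """Plug-in mutual information (nats) from a joint histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

def amif(x, max_lag=50):
    """Auto-mutual information function over time delays 1..max_lag."""
    return np.array([mutual_info(x[:-k], x[k:]) for k in range(1, max_lag + 1)])

def amifrd(x, fit_lags=10):
    """AMIF rate of decrease: slope of the initial decline (first few lags)."""
    m = amif(x, max_lag=fit_lags)
    return np.polyfit(np.arange(1, fit_lags + 1), m, 1)[0]
```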
Time Series Decomposition into Oscillation Components and Phase Estimation.
Matsuda, Takeru; Komaki, Fumiyasu
2017-02-01
Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, Gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished by this model in the manner of the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes method, the amplitudes and frequencies of oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and detecting phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.
Assessment of trend and seasonality in road accident data: an Iranian case study.
Razzaghi, Alireza; Bahrampour, Abbas; Baneshi, Mohammad Reza; Zolala, Farzaneh
2013-06-01
Road traffic accidents and their related deaths have become a major concern, particularly in developing countries. Iran has adopted a series of policies and interventions to control the high number of accidents occurring over the past few years. In this study we used a time series model to understand the trend of accidents and to ascertain the viability of applying ARIMA models to data from Taybad city. This is a cross-sectional study using data on accidents occurring in Taybad between 2007 and 2011. We obtained the data from the Ministry of Health (MOH) and used the time series method with a time lag of one month. After plotting the trend, non-stationarity in the mean and the variance was removed using a Box-Cox transformation and a differencing method, respectively. The ACF and PACF plots were used to check stationarity. The traffic accidents in our study had an increasing trend over the five years of study. Based on the ACF and PACF plots obtained after applying the Box-Cox transformation and differencing, the data did not fit a time series model; therefore, neither an ARIMA model nor seasonality was identified. Traffic accidents in Taybad have an upward trend. Although we expected the AR, MA or ARIMA model to capture a seasonal trend, this was not observed in the analysis. Several reasons may have contributed to this situation, such as uncertainty about the quality of the data, weather changes, and behavioural factors that are not taken into account by time series analysis.
Near-Surface Flow Fields Deduced Using Correlation Tracking and Time-Distance Analysis
NASA Technical Reports Server (NTRS)
DeRosa, Marc; Duvall, T. L., Jr.; Toomre, Juri
1999-01-01
Near-photospheric flow fields on the Sun are deduced using two independent methods applied to the same time series of velocity images observed by SOI-MDI on SOHO. Differences in travel times between f modes entering and leaving each pixel, measured using time-distance helioseismology, are used to determine sites of supergranular outflows. Alternatively, correlation tracking analysis of mesogranular scales of motion applied to the same time series is used to deduce the near-surface flow field. These two approaches provide the means to assess the patterns and evolution of horizontal flows on supergranular scales even near disk center, which is not feasible with direct line-of-sight Doppler measurements. We find that the locations of the supergranular outflows seen in flow fields generated from correlation tracking coincide well with the locations of the outflows determined from the time-distance analysis, with a mean correlation coefficient after smoothing of r_s = 0.840. Near-surface velocity field measurements can be used to study the evolution of the supergranular network, as merging and splitting events are observed to occur in these images. The data consist of one 2048-minute time series of high-resolution (0.6" pixels) line-of-sight velocity images taken by MDI on 1997 January 16-18 at a cadence of one minute.
NASA Astrophysics Data System (ADS)
Leite, Argentina; Paula Rocha, Ana; Eduarda Silva, Maria
2013-06-01
Heart Rate Variability (HRV) series exhibit long memory and time-varying conditional variance. This work considers Fractionally Integrated AutoRegressive Moving Average (ARFIMA) models with Generalized AutoRegressive Conditional Heteroscedastic (GARCH) errors. ARFIMA-GARCH models may be used to capture and remove long memory and to estimate the conditional volatility in 24 h HRV recordings. The ARFIMA-GARCH approach is applied to fifteen long-term HRV series available at Physionet, leading to discrimination among normal individuals, heart failure patients, and patients with atrial fibrillation.
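One pragmatic way to emulate the ARFIMA-GARCH pipeline is sketched below under stated assumptions (not the authors' estimator: the memory parameter d would be estimated rather than fixed, the `rr` series is synthetic, and the third-party `arch` package is assumed available): remove long memory by fractional differencing, then fit a GARCH model to what remains.

```python
import numpy as np
from arch import arch_model          # assumed dependency

def fracdiff(x, d, n_weights=200):
    """Apply the fractional differencing operator (1 - B)^d via its
    binomial-expansion weights, truncated at n_weights terms."""
    w = np.zeros(n_weights)
    w[0] = 1.0
    for k in range(1, n_weights):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return np.convolve(x, w, mode="full")[: len(x)]

# stand-in for a 24 h RR-interval series; d fixed for illustration only
rr = 0.8 + 0.05 * np.random.default_rng(0).standard_normal(5000)
short_memory = fracdiff(rr - np.mean(rr), d=0.3)
res = arch_model(short_memory, mean="AR", lags=2,
                 vol="GARCH", p=1, q=1).fit(disp="off")
cond_vol = res.conditional_volatility      # time-varying HRV volatility
```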
Bendel, David; Beck, Ferdinand; Dittmer, Ulrich
2013-01-01
In the presented study climate change impacts on combined sewer overflows (CSOs) in Baden-Wuerttemberg, Southern Germany, were assessed based on continuous long-term rainfall-runoff simulations. As input data, synthetic rainfall time series were used. The applied precipitation generator NiedSim-Klima accounts for climate change effects on precipitation patterns. Time series for the past (1961-1990) and future (2041-2050) were generated for various locations. Comparing the simulated CSO activity of both periods we observe significantly higher overflow frequencies for the future. Changes in overflow volume and overflow duration depend on the type of overflow structure. Both values will increase at simple CSO structures that merely divide the flow, whereas they will decrease when the CSO structure is combined with a storage tank. However, there is a wide variation between the results of different precipitation time series (representative for different locations).
Detection of "noisy" chaos in a time series
NASA Technical Reports Server (NTRS)
Chon, K. H.; Kanters, J. K.; Cohen, R. J.; Holstein-Rathlou, N. H.
1997-01-01
Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both the internal dynamics of the system and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series, and whether this determinism has chaotic attributes. The method relies on fitting a nonlinear autoregressive model to the time series, followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations and applied to heart rate variability data.
NASA Astrophysics Data System (ADS)
Oświęcimka, Paweł; Livi, Lorenzo; Drożdż, Stanisław
2016-10-01
We investigate the scaling of the cross-correlations calculated for two-variable time series containing vertex properties in the context of complex networks. Time series of such observables are obtained by means of stationary, unbiased random walks. We consider three vertex properties that provide, respectively, short-, medium-, and long-range information regarding the topological role of vertices in a given network. In order to reveal the relation between these quantities, we applied the multifractal cross-correlation analysis technique, which provides information about the nonlinear effects in coupling of time series. We show that the considered network models are characterized by unique multifractal properties of the cross-correlation. In particular, it is possible to distinguish between Erdős-Rényi, Barabási-Albert, and Watts-Strogatz networks on the basis of fractal cross-correlation. Moreover, the analysis of protein contact networks reveals characteristics shared with both scale-free and small-world models.
López-Caraballo, C H; Lazzús, J A; Salfate, I; Rojas, P; Rivera, M; Palma-Chilla, L
2015-01-01
An artificial neural network (ANN) based on particle swarm optimization (PSO) was developed for time series prediction. The hybrid ANN+PSO algorithm was applied to the Mackey-Glass chaotic time series in the short term, x(t + 6). The prediction performance was evaluated and compared with other studies available in the literature. Also, we presented properties of the dynamical system via the study of the chaotic behaviour of the predicted time series. Next, the hybrid ANN+PSO algorithm was complemented with a Gaussian stochastic procedure (called stochastic hybrid ANN+PSO) in order to obtain a new estimator of the predictions, which also allowed us to compute the uncertainties of predictions for the noisy Mackey-Glass chaotic time series. Thus, we studied the impact of noise for several cases with a white noise level (σ_N) from 0.01 to 0.1.
Approximating high-dimensional dynamics by barycentric coordinates with linear programming.
Hirata, Yoshito; Shiro, Masanori; Takahashi, Nozomu; Aihara, Kazuyuki; Suzuki, Hideyuki; Mas, Paloma
2015-01-01
The increasing development of novel methods and techniques facilitates the measurement of high-dimensional time series but challenges our ability for accurate modeling and prediction. The use of a general mathematical model requires the inclusion of many parameters, which are difficult to fit given the relatively short high-dimensional time series observed. Here, we propose a novel method to accurately model a high-dimensional time series. Our method extends barycentric coordinates to high-dimensional phase space by employing linear programming and allowing for the approximation errors explicitly. The extension helps to produce free-running time-series predictions that preserve typical topological, dynamical, and/or geometric characteristics of the underlying attractors more accurately than the widely used radial basis function model. The method can be broadly applied, from helping to improve weather forecasting, to creating electronic instruments that sound more natural, and to comprehensively understanding complex biological data.
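The LP step can be sketched with SciPy's `linprog`: express the current state as a convex (barycentric) combination of library states while allowing an explicit L1 approximation error, then propagate the weights to the library's successors. The toy data and formulation details below are assumptions, not the paper's exact program.

```python
# One barycentric-prediction step via linear programming (assumes scipy).
import numpy as np
from scipy.optimize import linprog

def barycentric_step(X, X_next, y):
    K, d = X.shape
    # variables: K weights, then d positive and d negative error slacks
    c = np.concatenate([np.zeros(K), np.ones(2 * d)])   # minimize total L1 error
    A_eq = np.zeros((d + 1, K + 2 * d))
    A_eq[:d, :K] = X.T                                  # X^T w - u+ + u- = y
    A_eq[:d, K:K + d] = -np.eye(d)
    A_eq[:d, K + d:] = np.eye(d)
    A_eq[d, :K] = 1.0                                   # weights sum to one
    b_eq = np.concatenate([y, [1.0]])
    sol = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    w = sol.x[:K]
    return w @ X_next                                   # free-running next-state prediction

# toy usage: a library of past states and their observed successors
traj = np.random.default_rng(1).normal(size=(200, 5))   # stand-in trajectory
X, X_next = traj[:-1], traj[1:]
print(barycentric_step(X, X_next, traj[-1]))
```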
Detecting dynamical changes in time series by using the Jensen Shannon divergence
NASA Astrophysics Data System (ADS)
Mateos, D. M.; Riveaud, L. E.; Lamberti, P. W.
2017-08-01
Most time series in nature are a mixture of signals with deterministic and random dynamics, so the distinction between these two characteristics becomes important. Distinguishing between chaotic and aleatory signals is difficult because they share a wide-band power spectrum, a delta-like autocorrelation function, and other features as well. In general, signals are presented as continuous records and must be discretized before being analyzed. In this work, we introduce different schemes for discretizing time series and for detecting dynamical changes in them. One of the main motivations is to detect transitions between the chaotic and random regimes. The tools used here originate from information theory. The proposed schemes are applied to simulated and real-life signals, showing in all cases a high proficiency for detecting changes in the dynamics of the associated time series.
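One plausible reading of such a scheme, sketched with SciPy: symbolize the signal by binning, slide two adjacent windows, and score their dissimilarity with the Jensen-Shannon divergence; peaks flag candidate dynamical changes. The window length and bin count are illustrative choices.

```python
# Change detection profile from the Jensen-Shannon divergence of adjacent windows.
import numpy as np
from scipy.spatial.distance import jensenshannon

def jsd_profile(x, win=500, bins=16):
    edges = np.histogram_bin_edges(x, bins=bins)     # shared discretization
    out = []
    for t in range(win, len(x) - win):
        p, _ = np.histogram(x[t - win:t], bins=edges, density=True)
        q, _ = np.histogram(x[t:t + win], bins=edges, density=True)
        out.append(jensenshannon(p, q, base=2) ** 2)  # squared distance = divergence
    return np.array(out)

# toy signal: Gaussian regime followed by a uniform regime
x = np.concatenate([np.random.default_rng(2).normal(size=3000),
                    np.random.default_rng(3).uniform(-1, 1, size=3000)])
profile = jsd_profile(x)
print("change point ~ sample", 500 + profile.argmax())
```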
Analysing the Image Building Effects of TV Advertisements Using Internet Community Data
NASA Astrophysics Data System (ADS)
Uehara, Hiroshi; Sato, Tadahiko; Yoshida, Kenichi
This paper proposes a method to measure the effects of TV advertisements using Internet bulletin boards. It aims to clarify how viewers' interest in TV advertisements is reflected in their images of the promoted products. Two kinds of time series data are generated based on the proposed method. The first represents the time series fluctuation of interest in the TV advertisements. The second represents the time series fluctuation of the images of the products. By analysing the correlations between these two time series, we try to clarify the implicit relationship between viewers' interest in the TV advertisements and their images of the promoted products. By applying the proposed method to an Internet bulletin board that deals with a certain cosmetic brand, we show that the images of the products vary depending on differences in the interest in each TV advertisement.
Bayesian dynamic modeling of time series of dengue disease case counts
López-Quílez, Antonio; Torres-Prieto, Alexander
2017-01-01
The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results requires a complete analysis of the dengue time series with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors for one- or two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.
NASA Astrophysics Data System (ADS)
Chorozoglou, D.; Kugiumtzis, D.; Papadimitriou, E.
2018-06-01
Seismic hazard assessment in the area of Greece is attempted by studying the earthquake network structure, e.g., whether it is small-world or random. In this network, a node represents a seismic zone in the study area, and a connection between two nodes is given by the correlation of the seismic activity of the two zones. To investigate the network structure, and particularly the small-world property, the earthquake correlation network is compared with randomized ones. Simulations on multivariate time series of different lengths and numbers of variables show that, for the construction of randomized networks, the method randomizing the time series performs better than methods that directly randomize the original network connections. Based on the appropriate randomization method, the network approach is applied to time series of earthquakes that occurred between main shocks in the territory of Greece spanning the period 1999-2015. The characterization of networks on sliding time windows revealed that small-world structure emerges in the last time interval, shortly before the main shock.
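A hedged sketch of the comparison logic with networkx: threshold a zone-by-zone correlation matrix into a network, then benchmark its clustering against networks rebuilt from randomized series. Circular shifting is shown as one simple series-randomization; the paper compares several schemes, and the threshold and stand-in data are assumptions.

```python
# Correlation network vs. an ensemble built from time-series randomization.
import numpy as np
import networkx as nx

def corr_network(Z, thresh=0.5):                   # Z: zones x time
    C = np.corrcoef(Z)
    G = nx.Graph()
    G.add_nodes_from(range(len(Z)))
    G.add_edges_from((i, j) for i in range(len(Z)) for j in range(i + 1, len(Z))
                     if abs(C[i, j]) > thresh)
    return G

rng = np.random.default_rng(4)
Z = rng.poisson(3, size=(20, 400)).astype(float)   # stand-in zone activity counts
real = nx.average_clustering(corr_network(Z))
null = [nx.average_clustering(corr_network(
            np.array([np.roll(z, rng.integers(1, Z.shape[1])) for z in Z])))
        for _ in range(200)]                       # shifts keep autocorrelation,
print(real, np.percentile(null, 95))               # destroy cross-correlation
```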
Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob
2016-08-01
The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low-order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions, and to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel non-parametric approach to the estimation of directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data.
Modeling BAS Dysregulation in Bipolar Disorder.
Hamaker, Ellen L; Grasman, Raoul P P P; Kamphuis, Jan Henk
2016-08-01
Time series analysis is a technique that can be used to analyze the data from a single subject and has great potential to investigate clinically relevant processes like affect regulation. This article uses time series models to investigate the assumed dysregulation of affect that is associated with bipolar disorder. By formulating a number of alternative models that capture different kinds of theoretically predicted dysregulation, and by comparing these in both bipolar patients and controls, we aim to illustrate the heuristic potential this method of analysis has for clinical psychology. We argue that, not only can time series analysis elucidate specific maladaptive dynamics associated with psychopathology, it may also be clinically applied in symptom monitoring and the evaluation of therapeutic interventions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamışlıoğlu, Miraç; Külahcı, Fatih
Nonlinear time series analysis techniques have broad application in the geoscience and geophysics fields. Modern nonlinear methods provide considerable evidence for explaining seismicity phenomena. In this study, nonlinear time series analysis, fractal analysis, and spectral analysis were carried out to investigate the chaotic behavior of radon gas (222Rn) concentrations released during seismic events. Nonlinear time series analysis methods (Lyapunov exponent, Hurst phenomenon, correlation dimension, and false nearest neighbors) were applied to the East Anatolian Fault Zone (EAFZ), Turkey, and its surroundings, with about 35,136 radon measurements for each region. In this paper, the behavior of 222Rn, which is used in earthquake prediction studies, was investigated.
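For the Hurst phenomenon specifically, a minimal rescaled-range (R/S) sketch is shown below; the window sizes and the input file are assumptions.

```python
# Rescaled-range (R/S) estimate of the Hurst exponent for a radon record.
import numpy as np

def hurst_rs(x, sizes=(16, 32, 64, 128, 256)):
    rs = []
    for n in sizes:
        chunks = x[:len(x) // n * n].reshape(-1, n)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)     # range of cumulative deviations
        s = chunks.std(axis=1)
        rs.append(np.mean(r[s > 0] / s[s > 0]))
    h, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return h                                      # H > 0.5 suggests persistence

radon = np.loadtxt("radon_222.txt")               # hypothetical 222Rn series
print("Hurst exponent ~", hurst_rs(radon))
```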
Rai, Shesh N; Trainor, Patrick J; Khosravi, Farhad; Kloecker, Goetz; Panchapakesan, Balaji
2016-01-01
The development of biosensors that produce time series data will facilitate improvements in biomedical diagnostics and in personalized medicine. The time series produced by these devices often contain characteristic features arising from biochemical interactions between the sample and the sensor. To use such characteristic features for determining sample class, similarity-based classifiers can be utilized. However, the construction of such classifiers is complicated by the variability in the time domains of such series, which renders traditional distance metrics such as Euclidean distance ineffective in distinguishing between biological variance and time domain variance. The dynamic time warping (DTW) algorithm is a sequence alignment algorithm that can be used to align two or more series to facilitate quantifying similarity. In this article, we evaluated the performance of DTW distance-based similarity classifiers for classifying time series that mimic electrical signals produced by nanotube biosensors. Simulation studies demonstrated the positive performance of such classifiers in discriminating between time series containing characteristic features that are obscured by noise in the intensity and time domains. We then applied a DTW distance-based k-nearest neighbors classifier to distinguish the presence/absence of a mesenchymal biomarker in cancer cells in buffy coats in a blinded test. Using a train-test approach, we found that the classifier had high sensitivity (90.9%) and specificity (81.8%) in differentiating between EpCAM-positive MCF7 cells spiked in buffy coats and plain buffy coats.
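A self-contained sketch of the core machinery: the classic O(nm) dynamic-programming DTW distance and a 1-nearest-neighbor rule (the article uses k-NN on biosensor signals; the toy peaks here only mimic the time-domain jitter problem).

```python
# DTW distance plus a 1-NN classifier on jittered toy signals.
import numpy as np

def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(query, train_series, train_labels):
    d = [dtw(query, s) for s in train_series]
    return train_labels[int(np.argmin(d))]

# toy signals: class 0 has an early peak, class 1 a late peak, with timing jitter
rng = np.random.default_rng(5)
t = np.linspace(0, 1, 100)
make = lambda c: np.exp(-((t - (0.3 + 0.4 * c + 0.05 * rng.normal())) ** 2) / 0.01)
train = [make(c) for c in (0, 0, 1, 1)]
print(classify(make(1), train, [0, 0, 1, 1]))     # expected: 1
```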
Multivariate multiscale entropy of financial markets
NASA Astrophysics Data System (ADS)
Lu, Yunfan; Wang, Jun
2017-11-01
In the current process of quantifying the dynamical properties of complex phenomena in the financial market system, multivariate financial time series are of wide concern. In this work, considering the shortcomings and limitations of univariate multiscale entropy in analyzing multivariate time series, the multivariate multiscale sample entropy (MMSE), which can evaluate the complexity in multiple data channels over different timescales, is applied to quantify the complexity of financial markets. Its effectiveness and advantages are demonstrated in numerical simulations with two well-known synthetic noise signals. For the first time, the complexity of four generated trivariate return series for each stock trading hour in the China stock markets is quantified through the interdisciplinary application of this method. We find that the complexity of the trivariate return series in each hour shows a significant decreasing trend as the stock trading time progresses. Further, the shuffled multivariate return series and the absolute multivariate return series are also analyzed. As another new attempt, the complexity of global stock markets (Asia, Europe and America) is quantified by analyzing the multivariate returns from them. Finally, we utilize the multivariate multiscale entropy to assess the relative complexity of normalized multivariate return volatility series with different degrees.
Change Detection in Rough Time Series
2014-09-01
Rough time series follow a distribution that can present significant challenges to conventional statistical tracking techniques. To address this problem, the proposed method applies hybrid fuzzy statistical techniques to series granules instead of to individual measures. Three examples demonstrated the robust nature of the approach.
Visualizing frequent patterns in large multivariate time series
NASA Astrophysics Data System (ADS)
Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.
2011-01-01
The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.
Sample entropy applied to the analysis of synthetic time series and tachograms
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.; Gálvez-Coyt, G. G.; Solís-Montufar, E.
2017-01-01
Entropy is a method of non-linear analysis that allows an estimate of the irregularity of a system. However, there are different types of computational entropy; these were considered and tested in order to obtain one that gives an index of signal complexity taking into account the length of the analysed time series, the computational resources demanded by the method, and the accuracy of the calculation. An algorithm for the generation of fractal time series with a given value of β was used for the characterization of the different entropy algorithms. We obtained a significant variation with series size for most of the algorithms, which could be counterproductive for the study of real signals of different lengths. The chosen method was sample entropy, which shows great independence of series size. With this method, time series of heart interbeat intervals, or tachograms, of healthy subjects and patients with congestive heart failure were analysed. The calculation of sample entropy was carried out for 24-hour tachograms and 6-hour subseries for sleep and wakefulness. The comparison between the two populations shows a significant difference that is accentuated when the patient is sleeping.
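A textbook sample entropy sketch with the common settings m = 2 and r = 0.2 SD; the tolerance, template length, and input file are assumptions rather than the authors' exact configuration.

```python
# Sample entropy: -ln(A/B), where B and A count template matches of
# lengths m and m+1 within tolerance r (Chebyshev distance).
import numpy as np

def sampen(x, m=2, r_frac=0.2):
    x = np.asarray(x, float)
    r = r_frac * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - m)])  # equal counts
        hits = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            hits += np.sum(d <= r)
        return hits
    B, A = count(m), count(m + 1)
    return -np.log(A / B)             # undefined if no matches; fine for a sketch

rr = np.loadtxt("tachogram.txt")      # hypothetical interbeat-interval series
print("SampEn:", sampen(rr))
```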
Inflow forecasting model construction with stochastic time series for coordinated dam operation
NASA Astrophysics Data System (ADS)
Kim, T.; Jung, Y.; Kim, H.; Heo, J. H.
2014-12-01
Dam inflow forecasting is one of the most important tasks in dam operation for effective water resources management and control. In general, stochastic time series models for dam inflow forecasting are applicable only when the data are stationary, because most stochastic processes are based on stationarity. However, recent hydrological data often no longer satisfy stationarity because of climate change. Therefore, a stochastic time series model that can consider seasonality and trend in the data series, the SARIMAX (Seasonal AutoRegressive Integrated Moving Average with eXogenous variables) model, was constructed in this study. This SARIMAX model can increase the performance of stochastic time series models by considering the nonstationary components and external variables such as precipitation. For application, models were constructed for four coordinated dams on the Han river in South Korea with monthly time series data. As a result, the models for each dam have similar performance, and it would be possible to use the model for coordinated dam operation. Acknowledgement: This research was supported by a grant, 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-NH-12-57], from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
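With statsmodels, this model class can be sketched directly; the (1,0,1)x(1,1,1,12) orders, the column names, and the CSV file are illustrative assumptions, not the study's identified model.

```python
# Monthly SARIMAX fit with precipitation as an exogenous regressor.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

df = pd.read_csv("dam_monthly.csv", index_col=0, parse_dates=True)  # hypothetical
model = SARIMAX(df["inflow"], exog=df[["precip"]],
                order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
fit = model.fit(disp=False)
print(fit.summary())

# forecasting requires future exogenous values, e.g. a precipitation outlook;
# reusing the last year is only a naive placeholder
future_precip = df[["precip"]].iloc[-12:].values
print(fit.forecast(steps=12, exog=future_precip))
```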
NASA Astrophysics Data System (ADS)
Kbaier Ben Ismail, Dhouha; Lazure, Pascal; Puillat, Ingrid
2016-10-01
In marine sciences, many fields display high variability over a large range of spatial and temporal scales, from seconds to thousands of years. The long time series recorded in this field, with increasing sampling frequency, are often nonlinear, nonstationary, multiscale and noisy. Their analysis faces new challenges and thus requires the implementation of adequate and specific methods. The objective of this paper is to bring time series analysis methods already applied in econometrics, signal processing, health, etc. to the environmental marine domain, to assess their advantages and drawbacks, and to compare classical techniques with more recent ones. Temperature, turbidity and salinity are important quantities for ecosystem studies. The authors here consider the fluctuations of sea level, salinity, turbidity and temperature recorded by the MAREL Carnot system of Boulogne-sur-Mer (France), a moored buoy equipped with physico-chemical measuring devices working in continuous and autonomous conditions. In order to perform adequate statistical and spectral analyses, it is necessary to know the nature of the time series considered. For this purpose, the stationarity of the series and the occurrence of unit roots are addressed with Augmented Dickey-Fuller tests. For example, harmonic analysis is not relevant for temperature, turbidity and salinity due to their nonstationarity, but is appropriate for the nearly stationary sea level datasets. In order to identify the dominant frequencies associated with the dynamics, the large number of data points provided by the sensors should enable Fourier spectral analysis. The different power spectra show complex variability and reveal an influence of environmental factors such as tides. However, classical spectral analysis, namely the Blackman-Tukey method, requires not only linear and stationary but also evenly-spaced data, and interpolating the time series introduces numerous artifacts. The Lomb-Scargle algorithm is adapted to unevenly-spaced data and is used as an alternative. The limits of the method are also set out. It was found that beyond 50% missing measurements, few significant frequencies are detected, several seasonalities are no longer visible, and even a whole range of high frequencies disappears progressively. Furthermore, two time-frequency decomposition methods, namely wavelets and the Hilbert-Huang Transform (HHT), are applied to the analysis of the entire dataset. Using the Continuous Wavelet Transform (CWT), some properties of the time series are determined. Then, the inertial wave and several low-frequency tidal waves are identified by the application of Empirical Mode Decomposition (EMD). Finally, EMD-based Time Dependent Intrinsic Correlation (TDIC) analysis is applied to consider the correlation between two nonstationary time series.
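The unevenly-spaced-data step can be sketched with SciPy's Lomb-Scargle implementation, which works directly on irregular timestamps and so avoids the interpolation artifacts noted above; the synthetic tidal-like series below is only a stand-in.

```python
# Lomb-Scargle periodogram on irregularly sampled data (assumes scipy).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 365, 5000))                   # days, irregular sampling
y = np.sin(2 * np.pi * t / 0.5175) + 0.5 * rng.normal(size=t.size)  # ~M2 tide

periods = np.linspace(0.2, 30, 20000)                    # candidate periods (days)
omega = 2 * np.pi / periods                              # angular frequencies
power = lombscargle(t, y - y.mean(), omega, normalize=True)
print("dominant period ~ %.4f days" % periods[power.argmax()])  # ~0.5175 expected
```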
Zhang, Fang; Wagner, Anita K; Soumerai, Stephen B; Ross-Degnan, Dennis
2009-02-01
Interrupted time series (ITS) is a strong quasi-experimental research design, which is increasingly applied to estimate the effects of health services and policy interventions. We describe and illustrate two methods for estimating confidence intervals (CIs) around absolute and relative changes in outcomes calculated from segmented regression parameter estimates. We used multivariate delta and bootstrapping methods (BMs) to construct CIs around relative changes in level and trend, and around absolute changes in outcome, based on segmented linear regression analyses of time series data corrected for autocorrelated errors. Using previously published time series data, we estimated CIs around the effect of prescription alerts for medications interacting with warfarin on the rate of prescriptions per 10,000 warfarin users per month. Both the multivariate delta method (MDM) and the BM produced similar results. The BM is preferred for calculating CIs of relative changes in outcomes of time series studies, because it does not require large sample sizes when parameter estimates are obtained correctly from the model. Caution is needed when the sample size is small.
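A sketch of the bootstrapping idea on a simulated interrupted series; residual resampling with plain OLS is shown for brevity, whereas the paper works with autocorrelation-corrected segmented models, so this is illustrative only.

```python
# Percentile bootstrap CI for the relative level change in segmented regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n, t0 = 120, 60                                    # months; intervention at t0
t = np.arange(n)
level = (t >= t0).astype(float)                    # step change in level
slope = np.where(t >= t0, t - t0, 0)               # change in trend
X = sm.add_constant(np.column_stack([t, level, slope]))
y = 50 - 0.05 * t - 5 * level + rng.normal(0, 2, n)   # simulated monthly rates

fit = sm.OLS(y, X).fit()
def rel_change(params):                            # level change relative to the
    return params[2] / (params[0] + params[1] * t0)  # counterfactual level at t0

boot = []
for _ in range(2000):
    y_star = fit.fittedvalues + rng.choice(fit.resid, n, replace=True)
    boot.append(rel_change(sm.OLS(y_star, X).fit().params))
lo, hi = np.percentile(boot, [2.5, 97.5])
print("relative change %.3f, 95%% CI (%.3f, %.3f)" % (rel_change(fit.params), lo, hi))
```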
Time series smoother for effect detection.
You, Cheng; Lin, Dennis K J; Young, S Stanley
2018-01-01
In environmental epidemiology, one often encounters multiple time series with a long-term trend, including seasonality, that cannot be fully adjusted for by the observed covariates. The long-term trend is difficult to separate from the abnormal short-term signals of interest. This paper addresses how to estimate the long-term trend in order to recover short-term signals. Our case study demonstrates that current spline smoothing methods can result in significant positive and negative cross-correlations from the same dataset, depending on how the smoothing parameters are chosen. To circumvent this dilemma, three classes of time series smoothers are proposed to detrend time series data. These smoothers do not require fine tuning of parameters and can be applied to recover short-term signals. The properties of these smoothers are shown with both a case study using a factorial design and a simulation study using datasets generated from the original dataset. General guidelines are provided on how to discover short-term signals from time series with a long-term trend. The benefit of this research is that a problem is identified and the characteristics of possible solutions are determined.
dynGENIE3: dynamical GENIE3 for the inference of gene networks from time series expression data.
Huynh-Thu, Vân Anh; Geurts, Pierre
2018-02-21
The elucidation of gene regulatory networks is one of the major challenges of systems biology. Measurements of genes that are exploited by network inference methods are typically available either in the form of steady-state expression vectors or time series expression data. In our previous work, we proposed the GENIE3 method, which exploits variable importance scores derived from random forests to identify the regulators of each target gene. This method provided state-of-the-art performance on several benchmark datasets, but it could not, however, be applied specifically to time series expression data. We propose here an adaptation of the GENIE3 method, called dynamical GENIE3 (dynGENIE3), for handling both time series and steady-state expression data. The proposed method is evaluated extensively on the artificial DREAM4 benchmarks and on three real time series expression datasets. Although dynGENIE3 does not systematically yield the best performance on each and every network, it is competitive with diverse methods from the literature, while preserving the main advantages of GENIE3 in terms of scalability.
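The core idea can be sketched with scikit-learn: for each target gene, a random forest predicts its temporal derivative plus a decay term from all expression levels, and feature importances are read as putative regulatory edge weights. The decay rate alpha, forest size, and input file are assumptions, not dynGENIE3's published defaults.

```python
# dynGENIE3-style network inference sketch from time series expression data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

expr = np.loadtxt("expression_timeseries.txt")   # hypothetical: time x genes
dt, alpha = 1.0, 0.02                            # sampling step and decay (assumed)
X, n_genes = expr[:-1], expr.shape[1]
edges = np.zeros((n_genes, n_genes))
for i in range(n_genes):
    # target follows the ODE view dx_i/dt + alpha*x_i = f_i(x)
    target = (expr[1:, i] - expr[:-1, i]) / dt + alpha * expr[:-1, i]
    rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, target)
    edges[:, i] = rf.feature_importances_        # column i: putative regulators of gene i
np.fill_diagonal(edges, 0.0)                     # drop self-loops for ranking
print("top putative edge:", np.unravel_index(edges.argmax(), edges.shape))
```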
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen
2015-11-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
NASA Astrophysics Data System (ADS)
Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.
2018-01-01
This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of $10^6$ objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.
Detection of chaotic determinism in time series from randomly forced maps
NASA Technical Reports Server (NTRS)
Chon, K. H.; Kanters, J. K.; Cohen, R. J.; Holstein-Rathlou, N. H.
1997-01-01
Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". Despite this effort, it has been difficult to establish the presence of chaos in time series from biological systems. The output from a biological system is probably the result of both its internal dynamics and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series, and whether this determinism has chaotic attributes, i.e., a positive characteristic exponent that leads to sensitivity to initial conditions. The method relies on fitting a nonlinear autoregressive model to the time series, followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations, and applied to heart rate variability data.
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899-2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of the change in the frequency of ECM categories; before 1986, the appearance of ECMs was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to categorical climatic time series opens up potential new insights for the climate variability and change studies to be performed in the future.
Graphic analysis and multifractal on percolation-based return interval series
NASA Astrophysics Data System (ADS)
Pei, A. Q.; Wang, J.
2015-05-01
A financial time series model is developed and investigated based on the oriented percolation system (one of the statistical physics systems). The nonlinear and statistical behaviors of the return-interval time series are studied for the proposed model and for the real stock market by applying the visibility graph (VG) and multifractal detrended fluctuation analysis (MF-DFA). We investigate the fluctuation behaviors of the return intervals of the model for different parameter settings, and comparatively study these fluctuation patterns with those of real financial data for different threshold values. The empirical research in this work exhibits multifractal features for the corresponding financial time series. Further, the VGs derived from both the simulated data and the real data show small-world, hierarchical, high-clustering, and power-law-tail behaviors in their degree distributions.
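A simple natural visibility graph sketch (O(n^2)): two samples are linked when every intermediate sample lies below the straight line connecting them. The stand-in return-interval series only illustrates the mechanics.

```python
# Natural visibility graph of a series (assumes networkx).
import numpy as np
import networkx as nx

def visibility_graph(x):
    G = nx.Graph()
    G.add_nodes_from(range(len(x)))
    for a in range(len(x)):
        for b in range(a + 1, len(x)):
            # height of the a-b sight line at the intermediate sample times
            line = x[a] + (x[b] - x[a]) * (np.arange(a + 1, b) - a) / (b - a)
            if np.all(x[a + 1:b] < line):
                G.add_edge(a, b)
    return G

intervals = np.diff(np.sort(np.random.default_rng(9).uniform(0, 1, 300)))
G = visibility_graph(intervals)                   # stand-in return-interval series
deg = [d for _, d in G.degree()]
print("mean degree %.2f, max degree %d" % (np.mean(deg), max(deg)))
```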
Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing
NASA Astrophysics Data System (ADS)
Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa
2017-02-01
Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture—for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments—as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS), known as STL. The data series—daily Poaceae pollen concentrations over the period 2006-2014—was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. The correlation between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than analysis of the original non-decomposed data series, and for this reason the procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
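The decomposition step can be sketched with statsmodels' STL; the annual period, robustness flag, column name, and file are illustrative assumptions.

```python
# STL decomposition of a daily pollen series into trend/seasonal/residual parts.
import pandas as pd
from statsmodels.tsa.seasonal import STL

pollen = pd.read_csv("poaceae_daily.csv", index_col=0,
                     parse_dates=True)["conc"]    # hypothetical 2006-2014 data
res = STL(pollen, period=365, robust=True).fit()
seasonal, resid = res.seasonal, res.resid
# seasonal: to compare against flowering phenology;
# resid: the stochastic component the authors go on to model with PLSR
print(res.trend.tail())
```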
Algorithmic complexity of real financial markets
NASA Astrophysics Data System (ADS)
Mansilla, R.
2001-12-01
A new approach to understanding the complex behavior of financial market indexes using tools from thermodynamics and statistical physics is developed. Physical complexity, a quantity rooted in the Kolmogorov-Chaitin theory, is applied to binary sequences built up from real time series of financial market indexes. The study is based on NASDAQ and Mexican IPC data. Different behaviors of this quantity are shown when it is applied to intervals of the series placed before crashes and to intervals when no financial turbulence is observed. The connection between our results and the efficient market hypothesis is discussed.
Correlation filtering in financial time series (Invited Paper)
NASA Astrophysics Data System (ADS)
Aste, T.; Di Matteo, Tiziana; Tumminello, M.; Mantegna, R. N.
2005-05-01
We apply a method to filter relevant information from the correlation coefficient matrix by extracting a network of relevant interactions. This method succeeds in generating networks with the same hierarchical structure as the Minimum Spanning Tree but containing a larger number of links, resulting in a richer network topology allowing loops and cliques. In Tumminello et al.,1 we have shown that this method, applied to a financial portfolio of 100 stocks in the USA equity markets, is quite efficient in filtering relevant information about the clustering of the system and its hierarchical structure, both for the whole system and within each cluster. In particular, we have found that triangular loops and 4-element cliques have important and significant relations with the market structure and properties. Here we apply this filtering procedure to the analysis of correlation in two different kinds of interest rate time series (16 Eurodollar and 34 US interest rates).
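The baseline filter that this method extends can be sketched with networkx: map correlations to the metric d_ij = sqrt(2(1 - rho_ij)) and keep the Minimum Spanning Tree. The richer planar filter retaining loops and cliques requires planarity testing beyond this sketch, and the input file is hypothetical.

```python
# Minimum Spanning Tree filtering of a correlation matrix.
import numpy as np
import networkx as nx

returns = np.loadtxt("rates_returns.txt")        # hypothetical: time x series
rho = np.corrcoef(returns.T)
n = rho.shape[0]
G = nx.Graph()
G.add_weighted_edges_from((i, j, np.sqrt(2 * (1 - rho[i, j])))
                          for i in range(n) for j in range(i + 1, n))
mst = nx.minimum_spanning_tree(G)                # keeps the n-1 most informative links
print(sorted(mst.edges(data="weight"))[:5])
```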
An approach to checking case-crossover analyses based on equivalence with time-series methods.
Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L
2008-03-01
The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.
Analysis of Multispectral Time Series for supporting Forest Management Plans
NASA Astrophysics Data System (ADS)
Simoniello, T.; Carone, M. T.; Costantini, G.; Frattegiani, M.; Lanfredi, M.; Macchiato, M.
2010-05-01
Adequate forest management requires specific plans based on updated and detailed mapping. Multispectral satellite time series have been widely applied to forest monitoring and studies at different scales thanks to their capability of providing synoptic information on some basic parameters descriptive of vegetation distribution and status. As an inexpensive tool for supporting forest management plans in operational contexts, we tested the use of Landsat-TM/ETM time series (1987-2006) in the high Agri Valley (Southern Italy) for planning field surveys as well as for the integration of existing cartography. As a preliminary activity, no-change regression normalization was applied to the time series to make all scenes radiometrically consistent; then all the available data concerning forest maps, municipal boundaries, water basins, rivers, and roads were overlaid in a GIS environment. From the 2006 image we elaborated the NDVI map and analyzed the distribution for each land cover class. To separate the physiological variability and identify the anomalous areas, a threshold on the distributions was applied. To label the non-homogeneous areas, a multitemporal analysis was performed by separating heterogeneity due to cover changes from that linked to basic unit mapping and classification labelling aggregations. Then a map of priority areas was produced to support the field survey plan. To analyze the territorial evolution, historical land cover maps were elaborated by adopting a hybrid classification approach based on a preliminary segmentation, the identification of training areas, and a subsequent maximum likelihood categorization. Such an analysis was fundamental for the general assessment of the territorial dynamics and in particular for the evaluation of the efficacy of past intervention activities.
A Report on Applying EEGnet to Discriminate Human State Effects on Task Performance
2018-01-01
We investigated whether we could identify what task the participant was performing from differences in the recorded brain time series. We modeled the relationship between input data (brain time series) and output labels (task A and task B) as an unknown function, and we found an optimal approximation of that function.
NASA Astrophysics Data System (ADS)
Kelly, Brandon C.; Hughes, Philip A.; Aller, Hugh D.; Aller, Margo F.
2003-07-01
We introduce an algorithm for applying a cross-wavelet transform to analysis of quasi-periodic variations in a time series and introduce significance tests for the technique. We apply a continuous wavelet transform and the cross-wavelet algorithm to the Pearson-Readhead VLBI survey sources using data obtained from the University of Michigan 26 m paraboloid at observing frequencies of 14.5, 8.0, and 4.8 GHz. Thirty of the 62 sources were chosen to have sufficient data for analysis, having at least 100 data points for a given time series. Of these 30 sources, a little more than half exhibited evidence for quasi-periodic behavior in at least one observing frequency, with a mean characteristic period of 2.4 yr and standard deviation of 1.3 yr. We find that out of the 30 sources, there were about four timescales for every 10 time series, and about half of those sources showing quasi-periodic behavior repeated the behavior in at least one other observing frequency.
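A hedged sketch of the cross-wavelet computation with PyWavelets, standing in for the authors' algorithm (their significance tests are not reproduced): transform two flux-like series with a complex Morlet wavelet and multiply one transform by the conjugate of the other.

```python
# Cross-wavelet spectrum of two series (assumes PyWavelets; pip install PyWavelets).
import numpy as np
import pywt

rng1, rng2 = np.random.default_rng(10), np.random.default_rng(11)
t = np.arange(0, 25, 0.05)                       # years; stand-in sampling
f1 = np.sin(2 * np.pi * t / 2.4) + 0.3 * rng1.normal(size=t.size)
f2 = np.sin(2 * np.pi * t / 2.4 + 0.7) + 0.3 * rng2.normal(size=t.size)

scales = np.arange(4, 128)
W1, freqs = pywt.cwt(f1, scales, "cmor1.5-1.0", sampling_period=0.05)
W2, _ = pywt.cwt(f2, scales, "cmor1.5-1.0", sampling_period=0.05)
xwt = W1 * np.conj(W2)                           # cross-wavelet spectrum
idx = np.abs(xwt).mean(axis=1).argmax()          # scale of strongest common power
print("common period ~ %.2f yr" % (1 / freqs[idx]))  # ~2.4 yr for this toy input
```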
Real-time emergency forecasting technique for situation management systems
NASA Astrophysics Data System (ADS)
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
2018-05-01
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by the Improved Brown's method, which applies the fractal dimension to forecast short time series data received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensed data using methods of correlation analysis.
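A sketch of classical Brown double exponential smoothing, the base of the cited Improved Brown's method; the fractal-dimension-driven tuning described in the article is replaced here by a fixed smoothing constant, which is an assumption.

```python
# Brown's double exponential smoothing forecast for a short sensor series.
import numpy as np

def brown_forecast(x, alpha=0.4, horizon=1):
    s1 = s2 = x[0]
    for v in x[1:]:
        s1 = alpha * v + (1 - alpha) * s1        # first smoothing pass
        s2 = alpha * s1 + (1 - alpha) * s2       # second smoothing pass
    a = 2 * s1 - s2                              # level estimate
    b = alpha / (1 - alpha) * (s1 - s2)          # trend estimate
    return a + b * horizon

sensor = np.loadtxt("sensor_short.txt")          # hypothetical short series
print("next-value forecast:", brown_forecast(sensor))
```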
Sarker, Hillol; Tyburski, Matthew; Rahman, Md. Mahbubur; Hovsepian, Karen; Sharmin, Moushumi; Epstein, David H.; Preston, Kenzie L.; Furr-Holden, C. Debra; Milam, Adam; Nahum-Shani, Inbal; al’Absi, Mustafa; Kumar, Santosh
2016-01-01
Management of daily stress can be greatly improved by delivering sensor-triggered just-in-time interventions (JITIs) on mobile devices. The success of such JITIs critically depends on being able to mine the time series of noisy sensor data to find the most opportune moments. In this paper, we propose a time series pattern mining method to detect significant stress episodes in a time series of discontinuous and rapidly varying stress data. We apply our model to 4 weeks of physiological, GPS, and activity data collected from 38 users in their natural environment to discover patterns of stress in real-life. We find that the duration of a prior stress episode predicts the duration of the next stress episode and stress in mornings and evenings is lower than during the day. We then analyze the relationship between stress and objectively rated disorder in the surrounding neighborhood and develop a model to predict stressful episodes.
Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M
2015-08-01
To describe the use and reporting of interrupted time series methods in drug utilization research, we completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in the reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration.
NASA Astrophysics Data System (ADS)
Burrell, A. L.; Evans, J. P.; Liu, Y.
2017-12-01
Dryland degradation is an issue of international significance as dryland regions play a substantial role in global food production. Remotely sensed data provide the only long-term, large-scale record of changes within dryland ecosystems. The Residual Trend, or RESTREND, method is applied to satellite observations to detect dryland degradation. Whilst effective in most cases, it has been shown that the RESTREND method can fail to identify degraded pixels if the relationship between vegetation and precipitation has broken down as a result of severe or rapid degradation. This study presents an extended version of the RESTREND methodology that incorporates the Breaks For Additive Seasonal and Trend (BFAST) method to identify step changes in the time series that are related to significant structural changes in the ecosystem, e.g., land use changes. When applied to Australia, this new methodology, termed Time Series Segmentation and Residual Trend analysis (TSS-RESTREND), was able to detect degradation in 5.25% of pixels compared to only 2.0% for RESTREND alone. This modified methodology was then assessed in two regions with known histories of degradation, where it was found to accurately capture both the timing and directionality of ecosystem change.
Frequency Analysis of MODIS NDVI Time Series for Determining Hotspots of Land Degradation in Mongolia
NASA Astrophysics Data System (ADS)
Nasanbat, E.; Sharav, S.; Sanjaa, T.; Lkhamjav, O.; Magsar, E.; Tuvdendorj, B.
2018-04-01
This study examines whether MODIS NDVI satellite imagery time series can be used to determine hotspots of land degradation across Mongolia. The Mann-Kendall statistical trend analysis was applied to a 16-year MODIS NDVI satellite imagery record, based on 16-day composited temporal data (from May to September) for the growing seasons from 2000 to 2016. We performed a frequency analysis in which the resulting NDVI residual trend pattern enables the determination of negative and positive changes in photosynthetically healthy vegetation. Our results show negative and positive values and generate a map of significant trends. We also examined long-term meteorological parameters for the same period. The results show that positive and negative NDVI trends concur with changes in land cover types, representing an improvement or a degradation in vegetation, respectively. Also, the climate parameters, namely precipitation and air temperature changes over the same time period, seem to have affected large areas of the NDVI trend. The time series trend analysis approach successfully determined hotspots of improvement and degradation due to land degradation and desertification.
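A minimal per-pixel Mann-Kendall sketch (without the tie correction a production implementation would need); the input file is hypothetical.

```python
# Mann-Kendall trend test for one pixel's 16-year NDVI record.
import numpy as np

def mann_kendall_z(x):
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18.0       # no-ties variance
    if s > 0:
        return (s - 1) / np.sqrt(var)
    if s < 0:
        return (s + 1) / np.sqrt(var)
    return 0.0

ndvi = np.loadtxt("pixel_ndvi_2000_2016.txt")    # hypothetical annual record
z = mann_kendall_z(ndvi)
print("z = %.2f -> %s" % (z, "significant at 5%" if abs(z) > 1.96 else "not significant"))
```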
Permutation entropy of finite-length white-noise time series.
Little, Douglas J; Kane, Deb M
2016-08-01
Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ^{2} distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
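A short PE sketch for a finite white-noise series; the closing comment connects the count N to the chi-squared limit above via the standard G-statistic, which is one common way to operationalize that limit.

```python
# Permutation entropy of a finite series, with the white-noise G-statistic.
import numpy as np
from math import factorial, log

def permutation_entropy(x, D=4):
    pats = [tuple(np.argsort(x[i:i + D])) for i in range(len(x) - D + 1)]
    counts = np.array([pats.count(p) for p in set(pats)], float)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum()

x = np.random.default_rng(12).normal(size=2000)  # white-noise test series
D = 4
N = len(x) - D + 1                               # number of ordinal pattern trials
pe = permutation_entropy(x, D)
# For white noise, 2*N*(ln D! - PE) is a G-statistic, approximately chi-squared
# with D! - 1 degrees of freedom -- matching the limit quoted in the abstract.
print("PE = %.4f, ln D! = %.4f, G = %.1f"
      % (pe, log(factorial(D)), 2 * N * (log(factorial(D)) - pe)))
```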
On Stabilizing the Variance of Dynamic Functional Brain Connectivity Time Series.
Thompson, William Hedley; Fransson, Peter
2016-12-01
Assessment of dynamic functional brain connectivity based on functional magnetic resonance imaging (fMRI) data is an increasingly popular strategy to investigate temporal dynamics of the brain's large-scale network architecture. Current practice when deriving connectivity estimates over time is to use the Fisher transformation, which aims to stabilize the variance of correlation values that fluctuate around varying true correlation values. It is, however, unclear how well the stabilization of signal variance performed by the Fisher transformation works for each connectivity time series, when the true correlation is assumed to be fluctuating. This is of importance because many subsequent analyses either assume or perform better when the time series have stable variance or adhere to an approximate Gaussian distribution. In this article, using simulations and analysis of resting-state fMRI data, we analyze the effect of applying different variance stabilization strategies on connectivity time series. We focus our investigation on the Fisher transformation, the Box-Cox (BC) transformation and an approach that combines both transformations. Our results show that, if the intention of stabilizing the variance is to use metrics on the time series, where stable variance or a Gaussian distribution is desired (e.g., clustering), the Fisher transformation is not optimal and may even skew connectivity time series away from being Gaussian. Furthermore, we show that the suboptimal performance of the Fisher transformation can be substantially improved by including an additional BC transformation after the dynamic functional connectivity time series has been Fisher transformed.
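The two transformations discussed above are straightforward to chain in practice. The sketch below Fisher-transforms a simulated sliding-window connectivity series and then applies a Box-Cox transformation; the window length, the simulated ROI signals, and the positivity shift before Box-Cox are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative sliding-window connectivity series: correlations between two
# simulated ROI signals (signals and window length are arbitrary choices).
ts1, ts2 = rng.standard_normal((2, 1000))
w = 60
r = np.array([np.corrcoef(ts1[i:i + w], ts2[i:i + w])[0, 1]
              for i in range(len(ts1) - w)])

z = np.arctanh(r)                      # Fisher r-to-z transformation
# Box-Cox requires strictly positive input, so shift the Fisher-transformed
# series first (one common workaround; the paper's procedure may differ).
z_bc, lam = stats.boxcox(z - z.min() + 1e-6)
print(lam, z_bc.mean())
```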
Investigation on Law and Economics Based on Complex Network and Time Series Analysis
Yang, Jian; Qu, Zhao; Chang, Hui
2015-01-01
The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to obtain quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through an evolutionary game. Combining the results of the data analysis with the current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game theory to the issue of corporate financing. PMID:26076460
Bernaola-Galván, Pedro A; Gómez-Extremera, Manuel; Romance, A Ramón; Carpena, Pedro
2017-09-01
The correlation properties of the magnitude of a time series are associated with nonlinear and multifractal properties and have been applied in a great variety of fields. Here we have obtained the analytical expression of the autocorrelation of the magnitude series (C_{|x|}) of a linear Gaussian noise as a function of its autocorrelation (C_{x}). For both models and natural signals, the deviation of C_{|x|} from its expectation in linear Gaussian noises can be used as an index of nonlinearity that can be applied to relatively short records and does not require the presence of scaling in the time series under study. In an artificial Gaussian multifractal signal model we use this approach to analyze the relation between nonlinearity and multifractality and show that the former implies the latter but the reverse is not true. We also apply this approach to analyze experimental data: heart-beat records during rest and moderate exercise. For each individual subject, we observe higher nonlinearities during rest. This behavior is also observed on average for the analyzed set of 10 semiprofessional soccer players. This result agrees with the fact that other measures of complexity are dramatically reduced during exercise and can shed light on its relationship with the withdrawal of parasympathetic tone and/or the activation of sympathetic activity during physical activity.
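A minimal empirical version of the comparison described above can be sketched as follows: compute the lag autocorrelation of a series and of its magnitude, and compare the latter against its near-zero expectation for linear Gaussian noise. The paper's analytical expression is not reproduced here; the code only shows the two empirical quantities on a simulated null case.

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation at a given positive lag."""
    x = (x - x.mean()) / x.std()
    return np.mean(x[:-lag] * x[lag:])

rng = np.random.default_rng(2)
x = rng.standard_normal(50_000)          # linear Gaussian noise (null case)

for lag in (1, 2, 5):
    c_x = autocorr(x, lag)               # C_x: autocorrelation of the series
    c_abs = autocorr(np.abs(x), lag)     # C_|x|: autocorrelation of its magnitude
    print(lag, round(c_x, 4), round(c_abs, 4))
# For linear Gaussian noise both values fluctuate around zero at all lags;
# a systematic excess of C_|x| over its linear-Gaussian expectation is what
# the paper reads as an index of nonlinearity.
```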
Quantifying Selection with Pool-Seq Time Series Data.
Taus, Thomas; Futschik, Andreas; Schlötterer, Christian
2017-11-01
Allele frequency time series data constitute a powerful resource for unraveling mechanisms of adaptation, because the temporal dimension captures important information about evolutionary forces. In particular, Evolve and Resequence (E&R), the whole-genome sequencing of replicated experimentally evolving populations, is becoming increasingly popular. Based on computer simulations, several studies have proposed experimental parameters to optimize the identification of selection targets. No such recommendations are available for the underlying parameters, selection strength and dominance. Here, we introduce a highly accurate method to estimate selection parameters from replicated time series data, which is fast enough to be applied on a genome scale. Using this new method, we evaluate how experimental parameters can be optimized to obtain the most reliable estimates for selection parameters. We show that the effective population size (Ne) and the number of replicates have the largest impact. Because the number of time points and sequencing coverage had only a minor effect, we suggest that time series analysis is feasible without a major increase in sequencing costs. We anticipate that time series analysis will become routine in E&R studies. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
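A toy version of such time series data can be generated with a Wright-Fisher model, and a crude selection estimate obtained from the slope of the logit-transformed allele frequency, which is roughly linear in time under haploid selection. This is a sketch for intuition only: the paper's estimator is likelihood-based, and all parameters below (p0, s, Ne, generations, replicate count) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(14)

def simulate_wf(p0, s, ne, gens):
    """Haploid Wright-Fisher trajectory: deterministic selection step
    followed by binomial drift with effective population size ne."""
    p, traj = p0, [p0]
    for _ in range(gens):
        w = p * (1 + s) / (p * (1 + s) + (1 - p))
        p = rng.binomial(ne, w) / ne
        p = min(max(p, 1.0 / ne), 1.0 - 1.0 / ne)   # keep the logit finite
        traj.append(p)
    return np.array(traj)

# Replicated trajectories, loosely mimicking an E&R design (values illustrative).
reps = [simulate_wf(p0=0.2, s=0.05, ne=1000, gens=60) for _ in range(5)]

# Crude estimator: logit(p_t) is roughly linear in t with slope s under
# this model, so fit a line per replicate and average the slopes.
t = np.arange(61)
slopes = [np.polyfit(t, np.log(p / (1 - p)), 1)[0] for p in reps]
print(np.mean(slopes))    # should land near the true s = 0.05
```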
NASA Astrophysics Data System (ADS)
Piburn, J.; Stewart, R.; Morton, A.
2017-10-01
Identifying erratic or unstable time series is an area of interest to many fields. Recently, there have been successful developments towards this goal. These newly developed methodologies, however, come from domains where it is typical to have several thousand or more temporal observations. This creates a challenge when attempting to apply them to time series with far fewer temporal observations, such as in socio-cultural understanding, a domain where a typical time series of interest might consist of only 20-30 annual observations. Most existing methodologies simply cannot say anything interesting with so few data points, yet researchers are still tasked to work within the confines of the data. Recently a method for characterizing instability in a time series with limited temporal observations was published. This method, the Attribute Stability Index (ASI), uses an approximate-entropy-based method to characterize a time series' instability. In this paper we propose an explicitly spatially weighted extension of the Attribute Stability Index. By including a mechanism to account for spatial autocorrelation, this work represents a novel approach for the characterization of space-time instability. As a case study we explore national youth male unemployment across the world from 1991-2014.
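Since the ASI builds on approximate entropy, a minimal sketch of ApEn on a short (20-point) annual series is shown below. The series values are invented for illustration, and this is the standard ApEn definition rather than the exact variant used inside the ASI.

```python
import numpy as np

def approx_entropy(x, m=2, r=0.2):
    """Standard approximate entropy ApEn(m, r), with the tolerance r given
    as a fraction of the series standard deviation."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def phi(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        # Chebyshev distance between every pair of length-mm templates
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(dist <= tol, axis=1)     # fraction of matching templates
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Invented 20-point annual series, the short-record regime discussed above.
annual = np.array([8.1, 8.4, 9.0, 12.3, 11.8, 10.9, 10.2, 9.7, 9.9, 10.4,
                   11.1, 13.0, 12.2, 11.5, 11.0, 10.6, 10.8, 11.3, 11.9, 12.5])
print(approx_entropy(annual))   # higher values -> more irregular/unstable series
```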
NASA Astrophysics Data System (ADS)
Müller, H.; Haberlandt, U.
2018-01-01
Rainfall time series of high temporal resolution and spatial density are crucial for urban hydrology. The multiplicative random cascade model can be used for temporal rainfall disaggregation of daily data to generate such time series. Here, the uniform splitting approach with a branching number of 3 in the first disaggregation step is applied. To achieve a final resolution of 5 min, subsequent steps after disaggregation are necessary. Three modifications at different disaggregation levels are tested in this investigation (uniform splitting at Δt = 15 min, linear interpolation at Δt = 7.5 min and at Δt = 3.75 min). Results are compared both with observations and with an often-used approach based on the assumption that time steps of Δt = 5.625 min, as would result if a branching number of 2 were applied throughout, can be replaced with Δt = 5 min (called the 1280 min approach). Spatial consistence is implemented in the disaggregated time series using a resampling algorithm. In total, 24 recording stations in Lower Saxony, Northern Germany, with a 5 min resolution have been used for the validation of the disaggregation procedure. The urban-hydrological suitability is tested with an artificial combined sewer system of about 170 hectares. The results show that all three variations outperform the 1280 min approach regarding the reproduction of wet spell duration, average intensity, fraction of dry intervals and lag-1 autocorrelation. Extreme values with durations of 5 min are also better represented. For durations of 1 h, all approaches show only slight deviations from the observed extremes. The applied resampling algorithm is capable of achieving sufficient spatial consistence. The effects on the urban hydrological simulations are significant. Without spatial consistence, flood volumes of manholes and combined sewer overflow are strongly underestimated. After resampling, results using disaggregated time series as input are in the range of those using observed time series. The best overall performance regarding rainfall statistics is obtained by the method in which the disaggregation process ends at time steps of 7.5 min duration, deriving the 5 min time steps by linear interpolation. With subsequent resampling this method leads to a good representation of manhole flooding and combined sewer overflow volume in the hydrological simulations and outperforms the 1280 min approach.
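A stripped-down version of the disaggregation chain described above can be sketched as a micro-canonical cascade: one branching-3 step (24 h to 8 h), five branching-2 steps (down to 15 min), and a final uniform split into three 5-min steps, i.e. the "uniform splitting at Δt = 15 min" variant. The Dirichlet weights are a simplification; the actual model uses cascade generator weights calibrated to observed rainfall.

```python
import numpy as np

rng = np.random.default_rng(3)

def split(mass, n):
    """Micro-canonical split: divide mass into n parts with random weights
    summing to one (a simplification of the model's calibrated weights)."""
    return list(mass * rng.dirichlet(np.ones(n)))

def disaggregate_day(daily_total):
    """Daily total -> 5-min series: one branching-3 step (24 h -> 8 h),
    five branching-2 steps (8 h -> 15 min), then a uniform split of each
    15-min step into three 5-min steps."""
    boxes = split(daily_total, 3)                       # 3 x 8 h
    for _ in range(5):                                  # 8 h -> 15 min
        boxes = [part for b in boxes for part in split(b, 2)]
    return [b / 3.0 for b in boxes for _ in range(3)]   # 15 min -> 3 x 5 min

series = disaggregate_day(24.0)   # e.g. 24 mm of daily rainfall
print(len(series), sum(series))   # 288 five-minute steps, mass conserved
```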
Røislien, Jo; Winje, Brita
2013-09-20
Clinical studies frequently include repeated measurements of individuals, often for long periods. We present a methodology for extracting common temporal features across a set of individual time series observations. In particular, the methodology explores extreme observations within the time series, such as spikes, as a possible common temporal phenomenon. Wavelet basis functions are attractive in this sense, as they are localized in both time and frequency domains simultaneously, allowing for localized feature extraction from a time-varying signal. We apply wavelet basis function decomposition of individual time series, with corresponding wavelet shrinkage to remove noise. We then extract common temporal features using linear principal component analysis on the wavelet coefficients, before inverse transformation back to the time domain for clinical interpretation. We demonstrate the methodology on a subset of a large fetal activity study aiming to identify temporal patterns in fetal movement (FM) count data in order to explore formal FM counting as a screening tool for identifying fetal compromise and thus preventing adverse birth outcomes. Copyright © 2013 John Wiley & Sons, Ltd.
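The decomposition-shrinkage-PCA chain described above can be sketched with standard tools, assuming PyWavelets and scikit-learn are available; the stand-in count data, wavelet choice (db4), decomposition level, and universal soft threshold are all illustrative assumptions rather than the study's settings.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)

# Illustrative stand-in for the fetal-movement counts: 32 individuals,
# 128 time points each.
X = rng.poisson(3, size=(32, 128)).astype(float)

def shrink(row, wavelet="db4", level=4):
    """Wavelet decomposition with soft-threshold shrinkage (universal threshold)."""
    coeffs = pywt.wavedec(row, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise-scale estimate
    thr = sigma * np.sqrt(2 * np.log(len(row)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return np.concatenate(coeffs)

W = np.array([shrink(row) for row in X])    # denoised wavelet coefficients
pca = PCA(n_components=3).fit(W)            # common features in coefficient space

# Inverse-transform a component back to the time domain for interpretation:
lengths = [len(c) for c in pywt.wavedec(X[0], "db4", level=4)]
comp = np.split(pca.components_[0], np.cumsum(lengths)[:-1])
pattern = pywt.waverec(list(comp), "db4")
print(pattern.shape)
```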
Holmes, E A; Bonsall, M B; Hales, S A; Mitchell, H; Renner, F; Blackwell, S E; Watson, P; Goodwin, G M; Di Simplicio, M
2016-01-26
Treatment innovation for bipolar disorder has been hampered by a lack of techniques to capture a hallmark symptom: ongoing mood instability. Mood swings persist during remission from acute mood episodes and impair daily functioning. The last significant treatment advance remains Lithium (in the 1970s), which aids only a minority of patients. There is no accepted way to establish proof of concept for a new mood-stabilizing treatment. We suggest that combining insights from mood measurement with applied mathematics may provide a step change: repeated daily mood measurement (depression) over a short time frame (1 month) can create individual bipolar mood instability profiles. A time-series approach allows comparison of mood instability pre- and post-treatment. We test a new imagery-focused cognitive therapy treatment approach (MAPP; Mood Action Psychology Programme) targeting a driver of mood instability, and apply these measurement methods in a non-concurrent multiple baseline design case series of 14 patients with bipolar disorder. Weekly mood monitoring and treatment target data improved for the whole sample combined. Time-series analyses of daily mood data, sampled remotely (mobile phone/Internet) for 28 days pre- and post-treatment, demonstrated improvements in individuals' mood stability for 11 of 14 patients. Thus the findings offer preliminary support for a new imagery-focused treatment approach. They also indicate a step in treatment innovation without the requirement for trials in illness episodes or relapse prevention. Importantly, daily measurement offers a description of mood instability at the individual patient level in a clinically meaningful time frame. This costly, chronic and disabling mental illness demands innovation in both treatment approaches (whether pharmacological or psychological) and measurement tools: this work indicates that daily measurements can be used to detect improvement in individual mood stability for treatment innovation (MAPP).
NASA Astrophysics Data System (ADS)
Chen, Yonghong; Bressler, Steven L.; Knuth, Kevin H.; Truccolo, Wilson A.; Ding, Mingzhou
2006-06-01
In this article we consider the stochastic modeling of neurobiological time series from cognitive experiments. Our starting point is the variable-signal-plus-ongoing-activity model. From this model a differentially variable component analysis strategy is developed from a Bayesian perspective to estimate event-related signals on a single trial basis. After subtracting out the event-related signal from recorded single trial time series, the residual ongoing activity is treated as a piecewise stationary stochastic process and analyzed by an adaptive multivariate autoregressive modeling strategy which yields power, coherence, and Granger causality spectra. Results from applying these methods to local field potential recordings from monkeys performing cognitive tasks are presented.
Future mission studies: Forecasting solar flux directly from its chaotic time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.
1991-01-01
The mathematical structure of the programs written to construct a nonlinear predictive model to forecast solar flux directly from its time series, without reference to any underlying solar physics, is presented. The method and the programs are written so that the same technique can be applied to forecast other chaotic time series, such as geomagnetic data, attitude and orbit data, and even financial indexes and stock market data. Perhaps the most important application of this technique to flight dynamics is to model the Goddard Trajectory Determination System (GTDS) output of residuals between the observed position of a spacecraft and the position calculated with no drag (drag flag = off). This would result in a new model of drag working directly from observed data.
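The generic flavour of such a forecaster can be illustrated with a delay-embedding nearest-neighbour predictor: reconstruct the phase space from the scalar series, find past states close to the current one, and average their successors. This is a common local prediction scheme used here as a sketch, not the specific GTDS-based model described above; the toy flux series and all parameters are illustrative.

```python
import numpy as np

def embed(x, dim, tau):
    """Time-delay embedding of a scalar series."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def forecast(x, steps, dim=3, tau=1, k=5):
    """Iterated one-step prediction: average the successors of the k past
    states nearest to the current state in the reconstructed phase space."""
    x = list(x)
    for _ in range(steps):
        states = embed(np.array(x[:-1]), dim, tau)       # states with known successors
        current = np.array(x[-(dim - 1) * tau - 1:][::tau])
        d = np.linalg.norm(states - current, axis=1)
        nn = np.argsort(d)[:k]
        succ = [x[i + (dim - 1) * tau + 1] for i in nn]
        x.append(float(np.mean(succ)))
    return x[-steps:]

t = np.arange(2000)
flux = np.sin(0.07 * t) + 0.5 * np.sin(0.023 * t)   # toy proxy for a flux series
print(forecast(flux, steps=5))
```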
NASA Astrophysics Data System (ADS)
Dong, Keqiang; Gao, You; Jing, Liming
2015-02-01
The presence of cross-correlation in complex systems has long been noted and studied in a broad range of physical applications. We here focus on an aero-engine system as an example of a complex system. By applying the detrended cross-correlation (DCCA) coefficient method to aero-engine time series, we investigate the effects of the data length and the time scale on the detrended cross-correlation coefficients ρ_DCCA(T, s). We then show, for a twin-engine aircraft, that the fuel flow time series from the left and right engines exhibit much stronger cross-correlations than the corresponding exhaust-gas temperature series do.
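The coefficient used above has a standard definition, ρ_DCCA(s) = F²_DCCA(s) / (F_DFA(x, s) · F_DFA(y, s)); a minimal implementation on simulated stand-in series (two noisy copies of a common signal, loosely mimicking the left/right fuel-flow pairing) is sketched below.

```python
import numpy as np

def detrended_covariance(x, y, s):
    """Mean detrended covariance of the integrated series over boxes of size s."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    t = np.arange(s)
    cov = []
    for b in range(len(X) // s):
        xs, ys = X[b * s:(b + 1) * s], Y[b * s:(b + 1) * s]
        # residuals from local linear (least-squares) trends
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        cov.append(np.mean(rx * ry))
    return np.mean(cov)

def rho_dcca(x, y, s):
    """DCCA coefficient: detrended covariance normalized by the DFA fluctuations."""
    return detrended_covariance(x, y, s) / np.sqrt(
        detrended_covariance(x, x, s) * detrended_covariance(y, y, s))

rng = np.random.default_rng(5)
common = rng.standard_normal(4096)
left = common + 0.5 * rng.standard_normal(4096)    # e.g. left-engine fuel flow
right = common + 0.5 * rng.standard_normal(4096)   # e.g. right-engine fuel flow
print([round(rho_dcca(left, right, s), 3) for s in (16, 64, 256)])
```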
Testing the effectiveness of family therapeutic assessment: a case study using a time-series design.
Smith, Justin D; Wolf, Nicole J; Handler, Leonard; Nash, Michael R
2009-11-01
We describe a family Therapeutic Assessment (TA) case study employing 2 assessors, 2 assessment rooms, and a video link. In the study, we employed a daily measures time-series design with a pretreatment baseline and follow-up period to examine the family TA treatment model. In addition to being an illustrative addition to a number of clinical reports suggesting the efficacy of family TA, this study is the first to apply a case-based time-series design to test whether family TA leads to clinical improvement and also illustrates when that improvement occurs. Results support the trajectory of change proposed by Finn (2007), the TA model's creator, who posits that benefits continue beyond the formal treatment itself.
The influence of trading volume on market efficiency: The DCCA approach
NASA Astrophysics Data System (ADS)
Sukpitak, Jessada; Hengpunya, Varagorn
2016-09-01
For a single market, the cross-correlation between market efficiency and trading volume, an indicator of market liquidity, is carefully analysed. The study begins by creating a time series of market efficiency, obtained by applying the time-varying Hurst exponent with a one-year sliding window to daily closing prices. The time series of trading volume corresponding to the same time period is derived from a one-year moving average of daily trading volume. Subsequently, the detrended cross-correlation coefficient is employed to quantify the degree of cross-correlation between the two time series. The cross-correlation coefficients of all considered stock markets were found to be close to 0 and to fall clearly outside the range in which the correlation is considered significant, at almost every time scale. These results show that market liquidity, in terms of trading volume, has little effect on market efficiency.
Brigode, Pierre; Brissette, Francois; Nicault, Antoine; ...
2016-09-06
Over the last decades, different methods have been used by hydrologists to extend observed hydro-climatic time series, based on other data sources, such as tree rings or sedimentological datasets. For example, tree ring multi-proxies have been studied for the Caniapiscau Reservoir in northern Québec (Canada), leading to the reconstruction of flow time series for the last 150 years. In this paper, we applied a new hydro-climatic reconstruction method to the Caniapiscau Reservoir, compared the obtained streamflow time series against time series derived from dendrohydrology by other authors on the same catchment, and studied the natural streamflow variability over the 1881–2011 period in that region. This new reconstruction is based not on natural proxies but on a historical reanalysis of global geopotential height fields, and aims firstly to produce daily climatic time series, which are then used as inputs to a rainfall–runoff model in order to obtain daily streamflow time series. The performance of the hydro-climatic reconstruction was quantified over the observed period and was good in terms of both monthly regimes and interannual variability. The streamflow reconstructions were then compared to two different reconstructions performed on the same catchment using tree ring data series, one focused on mean annual flows and the other on spring floods. In terms of mean annual flows, the interannual variability in the reconstructed flows was similar (except for the 1930–1940 decade), with noteworthy changes seen in wetter and drier years. For spring floods, the reconstructed interannual variabilities were quite similar for the 1955–2011 period, but strongly different between 1880 and 1940. The results emphasize the need to apply different reconstruction methods to the same catchments. Indeed, such comparisons highlight potential differences between available reconstructions and, finally, allow a retrospective analysis of the proposed reconstructions of past hydro-climatological variabilities.
Assessment of New Load Schedules for the Machine Calibration of a Force Balance
NASA Technical Reports Server (NTRS)
Ulbrich, N.; Gisler, R.; Kew, R.
2015-01-01
New load schedules for the machine calibration of a six-component force balance are currently being developed and evaluated at the NASA Ames Balance Calibration Laboratory. One of the proposed load schedules is discussed in the paper. It has a total of 2082 points that are distributed across 16 load series. Several criteria were applied to define the load schedule. It was decided, for example, to specify the calibration load set in force balance format, as this approach greatly simplifies the definition of the lower and upper bounds of the load schedule. In addition, all loads are assumed to be applied in a calibration machine using the one-factor-at-a-time approach. First, all single-component loads are applied in six load series. Then, three two-component load series are applied. They consist of the load pairs (N1, N2), (S1, S2), and (RM, AF). Afterwards, four three-component load series are applied. They consist of the combinations (N1, N2, AF), (S1, S2, AF), (N1, N2, RM), and (S1, S2, RM). In the next step, one four-component load series is applied. It is the load combination (N1, N2, S1, S2). Finally, two five-component load series are applied. They are the load combinations (N1, N2, S1, S2, AF) and (N1, N2, S1, S2, RM). The maximum difference between the loads of two subsequent data points of the load schedule is limited to 33 % of capacity. This constraint helps avoid unwanted load "jumps" in the load schedule that can have a negative impact on the performance of a calibration machine. Only the single- and two-component load series are loaded to 100 % of capacity. This approach was selected because it keeps the total number of calibration points to a reasonable limit while still allowing for the application of some of the more complex load combinations. Data from two of NASA's force balances are used to illustrate important characteristics of the proposed 2082-point calibration load schedule.
An application of HOMER and ACMANT for homogenising monthly precipitation records in Ireland
NASA Astrophysics Data System (ADS)
Coll, John; Curley, Mary; Domonkos, Peter; Aguilar, Enric; Walsh, Seamus; Sweeney, John
2015-04-01
Climate change studies based only on raw long-term data are potentially flawed due to the many breaks introduced from non-climatic sources. Consequently, accurate climate data are an essential prerequisite for climate-related decision making, and quality-controlled, homogenised climate data are becoming integral to European Union Member State efforts to deliver climate services. Ireland has a good repository of monthly precipitation data at approximately 1900 locations stored in the Met Éireann database. The record length at individual precipitation stations varies greatly. However, an audit of the data established the continuous record length at each station and the number of missing months, and on this basis two initial subsets of station series (n = 88 and n = 110) were identified for preliminary homogenisation efforts. The HOMER joint detection algorithm was applied to the combined network of these 198 longer station series on an Ireland-wide basis, where contiguous intact monthly records ranged from ~40 to 71 years (1941-2010). HOMER detected 91 breaks in total in the country-wide analysis, distributed across 63 (~32 %) of the 198 series analysed. In a separate approach, four sub-series clusters (n = 38-61) for the 1950-2010 period were used in a parallel analysis applying both ACMANT and HOMER to a regionalised split of the 198 series. By comparison, ACMANT detected a considerably higher number of breaks across the four regional series clusters: 238, distributed across 123 (~62 %) of the series analysed. These preliminary results indicate a relatively high proportion of detected breaks in the series, a situation not generally reflected in observed later 20th century precipitation records across Europe (Domonkos, 2014). However, this elevated ratio of series with detected breaks (~32 % with HOMER and ~62 % with ACMANT) parallels the break detection rate in a recent analysis of series in the Netherlands (Buishand et al. 2013). In the case of Ireland, the climate is even more markedly maritime than that of the Netherlands and the spatial correlations between the Irish series are high (>0.8). Therefore it is likely that both HOMER and ACMANT are detecting relatively small breaks in the series; e.g. the overall range of correction amplitudes derived by HOMER was small, and corrections were only applied to sections of the corrected series. As Ireland has a relatively dense network of highly correlated station series, we anticipate continued high detection rates as the analysis is extended to incorporate a greater number of station series, and that the ongoing work will quantify the extent of any breaks in Ireland's monthly precipitation series. KEY WORDS: Ireland, precipitation, time series, homogenisation, HOMER, ACMANT. References: Buishand, T.A., De Martino, G., Spreeuw, J.N., Brandsma, T. (2013). Homogeneity of precipitation series in the Netherlands and their trends in the past century. International Journal of Climatology 33:815-833. Domonkos, P. (2014). Homogenisation of precipitation time series with ACMANT. Theoretical and Applied Climatology 118:1-2. DOI 10.1007/s00704-014-1298-5.
Network Analyses for Space-Time High Frequency Wind Data
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Kanevski, Mikhail
2017-04-01
Recently, network science has made important contributions to the analysis, modelling and visualization of complex time series. Numerous methods have been proposed for constructing networks. This work studies spatio-temporal wind data using networks based on the Granger causality test. Furthermore, a visual comparison is carried out for several data frequencies and different sizes of the moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is applied to the resulting time series in order to explore whether there is a long connectivity memory. The results explore the space-time structure of wind data and can be applied to other environmental data. The dataset used presents a challenging case study. It consists of high-frequency (10 min) wind data from 120 measuring stations in Switzerland, for the time period 2012-2013. The distribution of stations covers different geomorphological zones and elevation levels. The results are also compared with the Pearson correlation network.
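A minimal version of the network-construction step described above can be sketched with pairwise Granger tests, assuming statsmodels is available; the toy station series, lag order, and significance level are illustrative, and the paper's exact testing procedure may differ.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(6)

# Toy stand-in for wind-speed series at 4 stations; station 1 lags station 0.
n = 500
s0 = rng.standard_normal(n)
s1 = np.roll(s0, 3) + 0.5 * rng.standard_normal(n)
s2, s3 = rng.standard_normal((2, n))
stations = np.column_stack([s0, s1, s2, s3])

alpha, maxlag = 0.01, 5
k = stations.shape[1]
adj = np.zeros((k, k), dtype=int)
for i in range(k):
    for j in range(k):
        if i == j:
            continue
        # test whether series i Granger-causes series j
        res = grangercausalitytests(stations[:, [j, i]], maxlag=maxlag, verbose=False)
        pmin = min(res[m][0]["ssr_ftest"][1] for m in range(1, maxlag + 1))
        adj[i, j] = int(pmin < alpha)       # directed edge i -> j

print(adj)
```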
NASA Astrophysics Data System (ADS)
Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar
2016-02-01
The cross correlation coefficient has been widely applied in financial time series analysis, in specific, for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems because cross correlation is a reliable measure only if the relationship between the time series is linear. To address the issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another, linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of phase synchronization based MST with cross correlation based MST along selected network measures across temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures. Though both the methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time series or relations among phase shifted time series.
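Both tree constructions mentioned above follow the same recipe: turn a pairwise similarity matrix into distances and extract the MST. The sketch below uses the phase-locking value from Hilbert-transform phases as one simple phase-synchronization measure (the paper's PS estimator may differ) together with SciPy's MST routine; the random-walk series stand in for stock data.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(7)
x = np.cumsum(rng.standard_normal((6, 1024)), axis=1)   # toy stand-in for stock series

# Phase-locking value (PLV) between each pair of instantaneous-phase series.
phase = np.angle(hilbert(x, axis=1))
k = x.shape[0]
plv = np.ones((k, k))
for i in range(k):
    for j in range(i + 1, k):
        plv[i, j] = plv[j, i] = abs(np.mean(np.exp(1j * (phase[i] - phase[j]))))

dist = 1.0 - plv                 # strong synchronization -> short distance
np.fill_diagonal(dist, 0.0)
mst = minimum_spanning_tree(dist)
print(np.transpose(mst.nonzero()))   # the k-1 retained links
```

For the cross-correlation-based tree, the same recipe applies with Mantegna's distance sqrt(2(1 - ρ)) in place of 1 - PLV.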
Clinical time series prediction: towards a hierarchical dynamical system framework
Liu, Zitao; Hauskrecht, Milos
2014-01-01
Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patient in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671
NASA Astrophysics Data System (ADS)
Saturnino, Diana; Langlais, Benoit; Amit, Hagay; Mandea, Mioara; Civet, François; Beucler, Éric
2017-04-01
A complete description of the main geomagnetic field temporal variation is crucial to understand dynamics in the core. This variation, termed secular variation (SV), is known with high accuracy at ground magnetic observatory locations. However the description of its spatial variability is hampered by the globally uneven distribution of the observatories. For the past two decades a global coverage of the field changes has been allowed by satellites. Their surveys of the geomagnetic field have been used to derive and improve global spherical harmonic (SH) models through some strict data selection schemes to minimise external field contributions. But discrepancies remain between ground measurements and field predictions by these models. Indeed, the global models do not reproduce small spatial scales of the field temporal variations. To overcome this problem we propose a modified Virtual Observatory (VO) approach by defining a globally homogeneous mesh of VOs at satellite altitude. With this approach we directly extract time series of the field and its temporal variation from satellite measurements as it is done at observatory locations. As satellite measurements are acquired at different altitudes a correction for the altitude is needed. Therefore, we apply an Equivalent Source Dipole (ESD) technique for each VO and each given time interval to reduce all measurements to a unique location, leading to time series similar to those available at ground magnetic observatories. Synthetic data is first used to validate the new VO-ESD approach. Then, we apply our scheme to measurements from the Swarm mission. For the first time, a 2.5 degrees resolution global mesh of VO times series is built. The VO-ESD derived time series are locally compared to ground observations as well as to satellite-based model predictions. The approach is able to describe detailed temporal variations of the field at local scales. The VO-ESD time series are also used to derive global SH models. Without regularization these models describe well the secular trend of the magnetic field. The derivation of longer VO-ESD time series, as more data will be made available, will allow the study of field temporal variations features such as geomagnetic jerks.
Signal quality and Bayesian signal processing in neurofeedback based on real-time fMRI.
Koush, Yury; Zvyagintsev, Mikhail; Dyck, Miriam; Mathiak, Krystyna A; Mathiak, Klaus
2012-01-02
Real-time fMRI allows analysis and visualization of the brain activity online, i.e. within one repetition time. It can be used in neurofeedback applications where subjects attempt to control an activation level in a specified region of interest (ROI) of their brain. The signal derived from the ROI is contaminated with noise and artifacts, namely with physiological noise from breathing and heart beat, scanner drift, motion-related artifacts and measurement noise. We developed a Bayesian approach to reduce noise and to remove artifacts in real-time using a modified Kalman filter. The system performs several signal processing operations: subtraction of constant and low-frequency signal components, spike removal and signal smoothing. Quantitative feedback signal quality analysis was used to estimate the quality of the neurofeedback time series and the performance of the applied signal processing on different ROIs. The signal-to-noise ratio (SNR) across the entire time series and the group event-related SNR (eSNR) were significantly higher for the processed time series in comparison to the raw data. The applied signal processing improved the t-statistic, increasing the significance of blood oxygen level-dependent (BOLD) signal changes. Accordingly, the contrast-to-noise ratio (CNR) of the feedback time series was improved as well. In addition, the data revealed an increase in localized self-control across feedback sessions. The new signal processing approach provided reliable neurofeedback, performed precise artifact removal, reduced noise, and required minimal manual adjustments of parameters. Advanced and fast online signal processing algorithms considerably increased the quality as well as the information content of the control signal, which in turn resulted in higher contingency in the neurofeedback loop. Copyright © 2011 Elsevier Inc. All rights reserved.
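The core of such a filter can be illustrated with a minimal one-dimensional random-walk-plus-noise Kalman filter. The published system adds spike removal, drift subtraction, and tuned noise models, so the sketch below, with assumed variances q and r, is only the skeleton of the approach.

```python
import numpy as np

def kalman_1d(y, q=1e-4, r=1e-2):
    """Random-walk-plus-noise Kalman filter for a scalar ROI time series.
    q: process-noise variance, r: measurement-noise variance (assumed values)."""
    xhat = np.zeros_like(y)
    xhat[0], p = y[0], 1.0
    for t in range(1, len(y)):
        p_pred = p + q                        # predict (random-walk state model)
        k_gain = p_pred / (p_pred + r)        # update with the new measurement
        xhat[t] = xhat[t - 1] + k_gain * (y[t] - xhat[t - 1])
        p = (1.0 - k_gain) * p_pred
    return xhat

rng = np.random.default_rng(8)
t = np.arange(200)
roi = 0.002 * t + 0.3 * np.sin(2 * np.pi * t / 40) + 0.2 * rng.standard_normal(200)
print(kalman_1d(roi)[:5])   # smoothed online estimate of the ROI signal
```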
Multifractal detrended cross-correlation analysis for two nonstationary signals.
Zhou, Wei-Xing
2008-06-01
We propose a method called multifractal detrended cross-correlation analysis to investigate the multifractal behaviors in the power-law cross-correlations between two time series or higher-dimensional quantities recorded simultaneously, which can be applied to diverse complex systems such as turbulence, finance, ecology, physiology, geophysics, and so on. The method is validated with cross-correlated one- and two-dimensional binomial measures and multifractal random walks. As an example, we illustrate the method by analyzing two financial time series.
Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Yannan; Hou, Zhangshuan; Meng, Da
2016-07-17
In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. They are applied to cross-correlated load time series as well as to their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes
NASA Astrophysics Data System (ADS)
Zhou, Wei-Xing; Sornette, Didier
We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.
Multifractality Signatures in Quasars Time Series. I. 3C 273
NASA Astrophysics Data System (ADS)
Belete, A. Bewketu; Bravo, J. P.; Canto Martins, B. L.; Leão, I. C.; De Araujo, J. M.; De Medeiros, J. R.
2018-05-01
The presence of multifractality in a time series implies different correlations at different time scales, as well as intermittent behaviour that cannot be captured by a single scaling exponent. Identifying a multifractal nature allows for a characterization of the dynamics and of the intermittency of the fluctuations in non-linear and complex systems. In this study, we search for a possible multifractal structure (multifractality signature) in the flux variability of the quasar 3C 273 time series for all electromagnetic wavebands at different observation points, and for the origins of the observed multifractality. This study is intended to highlight how the scaling behaves across the different bands of the selected candidate, which can be used as an additional new technique to group quasars based on the fractal signature observed in their time series, and to determine whether quasars are non-linear physical systems. The Multifractal Detrended Moving Average (MFDMA) algorithm has been used to study the scaling in non-linear, complex and dynamic systems. To achieve this goal, we applied the backward (θ = 0) MFDMA method for one-dimensional signals. We observe weak multifractal (close to monofractal) behaviour in some of the time series of our candidate, except in the mm, UV and X-ray bands. The non-linear temporal correlation is the main source of the observed multifractality in the time series, whereas the heaviness of the distribution contributes less.
Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015
2012-01-01
Background Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015. Methods We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series. Results and discussion Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows for researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series. Conclusion GDP per capita is a necessary tool in population health research, and our development and implementation of a new method has allowed for the most comprehensive known time series to date. PMID:22846561
Bai, Chunmei; Li, Yusong
2014-08-01
Accurately predicting the transport of contaminants in the field is subject to multiple sources of uncertainty due to the variability of geological settings, the complexity of field measurements, and the scarcity of data. Such uncertainties can be amplified when modeling some emerging contaminants, such as engineered nanomaterials, for which a fundamental understanding of their fate and transport is lacking. Typical field work includes collecting concentrations at a certain location for an extended period of time, or measuring the movement of a plume over an extended period of time, which results in a time series of observation data. This work presents an effort to evaluate the possibility of applying time series analysis, particularly autoregressive integrated moving average (ARIMA) models, to forecast contaminant transport and distribution in the subsurface environment. ARIMA modeling was first assessed in terms of its capability to forecast tracer transport at two field sites, which had different levels of heterogeneity. After that, this study evaluated the applicability of ARIMA modeling to predict the transport of engineered nanomaterials at field sites, including field-measured data of nanoscale zero valent iron (nZVI) and numerically generated data for the transport of nano-fullerene aggregates (nC60). This proof-of-concept effort demonstrates the possibility of applying ARIMA to predict contaminant transport in the subsurface environment. Like many other statistical models, ARIMA modeling is only descriptive and not explanatory. The limitations and the challenges associated with applying ARIMA modeling to contaminant transport in the subsurface are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
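A bare-bones version of the forecasting step reads as follows, assuming statsmodels is available; the simulated concentration series and the ARIMA(1,1,1) order are illustrative stand-ins for the field data and the orders identified in the study.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(9)

# Toy stand-in for an observed concentration series: drifting and serially
# correlated, loosely mimicking a breakthrough curve.
n = 120
conc = np.cumsum(0.1 + 0.05 * rng.standard_normal(n))

model = ARIMA(conc[:100], order=(1, 1, 1)).fit()   # order chosen for illustration
forecast = model.forecast(steps=20)                # predict the next 20 samples
print(forecast[:5])
```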
Multiscale entropy-based methods for heart rate variability complexity analysis
NASA Astrophysics Data System (ADS)
Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio
2015-03-01
Physiologic complexity is an important concept to characterize time series from biological systems, which, associated with multiscale analysis, can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The complexity metrics are derived from nonadditive entropy supported by the generation of surrogate data, i.e. SDiff_qmax, q_max and q_zero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the area under the curves was computed for three physiological situations. Heart rate variability (HRV) time series from normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiff_qmax is the most accurate for lower scales, whereas q_max and q_zero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in a wider context.
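For orientation, the sketch below shows the classic multiscale-entropy recipe (coarse-graining followed by sample entropy) on a simulated series; the study's nonadditive-entropy metrics (SDiff_qmax, q_max, q_zero) are different quantities, so this is only the standard baseline that the multiscale machinery builds on.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Classic sample entropy SampEn(m, r) with Chebyshev distance and
    self-matches excluded (tolerance r as a fraction of the SD)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
        return (np.sum(d <= tol) - len(emb)) / 2.0   # drop diagonal self-matches

    return -np.log(matches(m + 1) / matches(m))

def coarse_grain(x, scale):
    """Non-overlapping averages of length `scale` (the multiscale step)."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(10)
rr = rng.standard_normal(1200)    # stand-in for an RR-interval series
print([round(sample_entropy(coarse_grain(rr, s)), 3) for s in (1, 2, 5, 10)])
# For white noise the entropy falls with scale; correlated signals keep it
# roughly constant, which is the usual signature of complexity.
```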
NASA Astrophysics Data System (ADS)
Lasaponara, Rosa; Lanorte, Antonio; Lovallo, Michele; Telesca, Luciano
2015-04-01
Time series can fruitfully support fire monitoring and management, from statistical analysis of fire occurrence (Tuia et al. 2008) to danger estimation (Lasaponara 2005), damage evaluation (Lanorte et al. 2014) and post-fire recovery (Lanorte et al. 2014). In this paper, the time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites located in Galizia (northern Spain) and the Peloponnese (southern Greece), selected for the vast fires which occurred during the summers of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland in the Galizia test site and evergreen forest in the Peloponnese. Time series of MVC-NDVI have been analyzed before and after the occurrence of the fire events. Results obtained for both investigated areas clearly pointed out that the dynamics of the pixel time series before the occurrence of the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the occurrence of the fire feature a higher degree of organization and order. In particular, for the Peloponnese fire this discrimination is more evident than for the Galizia fire. This suggests a clear possibility of discriminating the different post-fire behaviors and dynamics exhibited by the different vegetation covers. References: Lanorte A, Lasaponara R, Lovallo M, Telesca L (2014). Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to characterize vegetation recovery after fire disturbance. International Journal of Applied Earth Observation and Geoinformation 26:441-446. Lanorte A, Danese M, Lasaponara R, Murgante B (2014). Multiscale mapping of burn area and severity using multisensor satellite data and spatial autocorrelation analysis. International Journal of Applied Earth Observation and Geoinformation 20:42-51. Tuia D, Ratle F, Lasaponara R, Telesca L, Kanevski M (2008). Scan statistics analysis of forest fire clusters. Communications in Nonlinear Science and Numerical Simulation 13(8):1689-1694. Telesca L, Lasaponara R (2006). Pre and post fire behavioral trends revealed in satellite NDVI time series. Geophysical Research Letters 33(14). Lasaponara R (2005). Intercomparison of AVHRR based fire susceptibility indicators for the Mediterranean ecosystems of southern Italy. International Journal of Remote Sensing 26(5):853-870.
Information extraction from dynamic PS-InSAR time series using machine learning
NASA Astrophysics Data System (ADS)
van de Kerkhof, B.; Pankratius, V.; Chang, L.; van Swol, R.; Hanssen, R. F.
2017-12-01
Due to the increasing number of SAR satellites, with shorter repeat intervals and higher resolutions, SAR data volumes are exploding. Time series analyses of SAR data, i.e. Persistent Scatterer (PS) InSAR, enable deformation monitoring of the built environment at an unprecedented scale, with hundreds of scatterers per km2, updated weekly. Potential hazards, e.g. due to failure of aging infrastructure, can be detected at an early stage. Yet, this requires the operational processing of billions of measurement points over hundreds of epochs, updating this data set dynamically as new data come in, and testing whether points (start to) behave in an anomalous way. Moreover, the quality of PS-InSAR measurements is ambiguous and heterogeneous, which will yield false positives and false negatives. Such analyses are numerically challenging. Here we extract relevant information from PS-InSAR time series using machine learning algorithms. We cluster (group together) time series with similar behaviour, even though they may not be spatially close, so that the results can be used for further analysis. First we reduce the dimensionality of the dataset in order to be able to cluster the data, since applying clustering techniques to high-dimensional datasets often yields unsatisfactory results. Our approach is to apply t-distributed Stochastic Neighbor Embedding (t-SNE), a machine learning algorithm for dimensionality reduction of high-dimensional data to a 2D or 3D map, and to cluster this result using Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The results show that we are able to detect and cluster time series with similar behaviour, which is the starting point for more extensive analysis into the underlying driving mechanisms. The results of the methods are compared to conventional hypothesis testing as well as to a Self-Organising Map (SOM) approach. Hypothesis testing is robust and takes the stochastic nature of the observations into account, but is time consuming. Therefore, we successively apply our machine learning approach together with the hypothesis testing approach in order to benefit both from the reduced computation time of the machine learning approach and from the robust quality metrics of hypothesis testing. We acknowledge support from NASA AISTNNX15AG84G (PI V. Pankratius).
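The dimensionality-reduction-plus-clustering step described above maps directly onto scikit-learn, as sketched below on toy deformation series with three behaviour classes; the perplexity and the DBSCAN eps/min_samples values are illustrative and would need tuning on real PS-InSAR data.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(11)

# Toy stand-in for PS-InSAR deformation series: three behaviour classes
# (stable, linear subsidence, seasonal), 300 points x 60 epochs.
t = np.arange(60)
stable = rng.standard_normal((100, 60))
subsiding = -0.15 * t + rng.standard_normal((100, 60))
seasonal = 3 * np.sin(2 * np.pi * t / 12) + rng.standard_normal((100, 60))
X = np.vstack([stable, subsiding, seasonal])

emb = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
labels = DBSCAN(eps=3.0, min_samples=10).fit_predict(emb)   # -1 marks noise
print(np.unique(labels, return_counts=True))
```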
NASA Astrophysics Data System (ADS)
Jiao, Quanjun; Zhang, Xiao; Sun, Qi
2018-03-01
The availability of dense time series of Landsat images provides a great opportunity to reconstruct forest disturbance and change history with high temporal resolution, medium spatial resolution and long temporal coverage. This work aims to apply a forest change detection method in Hainan Jianfengling Forest Park using yearly Landsat time-series images. A simple detection method based on the dense time series of Landsat NDVI images is used to reconstruct forest change history (afforestation and deforestation). The mapping result showed a large decrease in the extent of closed forest from the 1980s to the 1990s. From the beginning of the 21st century, we found an increase in forest areas with the implementation of forestry measures, such as the prohibition of cutting and sealing, in our study area. Our findings provide an effective approach for quickly detecting forest changes in tropical original forest, especially afforestation and deforestation, and a comprehensive analysis tool for forest resource protection.
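A deliberately simple rule of the kind referred to above can be sketched on a yearly NDVI stack: flag a pixel as deforested when its NDVI drops a fixed margin below its pre-period mean and stays there, and take the first below-threshold year as the change year. The stack, threshold, and persistence window below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(15)

# Toy yearly NDVI stack (years x rows x cols); real inputs would be Landsat
# composites. A block of pixels is made to "deforest" in year 15.
years, h, w = 30, 50, 50
ndvi = 0.8 + 0.03 * rng.standard_normal((years, h, w))
ndvi[15:, 10:20, 10:20] = 0.35 + 0.03 * rng.standard_normal((15, 10, 10))

baseline = ndvi[:5].mean(axis=0)            # pre-period mean per pixel
below = ndvi < (baseline - 0.2)             # margin of 0.2 NDVI units
deforested = below[-5:].all(axis=0)         # persistently low at series end
change_year = np.where(deforested, below.argmax(axis=0), -1)
print(deforested.sum(), np.unique(change_year[deforested]))
```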
Long series of geomagnetic measurements - unique at satellite era
NASA Astrophysics Data System (ADS)
Mandea, Mioara; Balasis, Georgios
2017-04-01
We have long appreciated that magnetic measurements obtained at Earth's surface are of great value in characterizing geomagnetic field behavior and in probing the deep interior of our planet. Data from the new magnetic satellite missions offer a new, detailed global understanding of the geomagnetic field. However, when our interest moves to long time scales, very long series of measurements play an important role. Here, we first provide an updated series of geomagnetic declination in Paris, shortly after a very special occasion: its value reached zero after some 350 years of westerly values. We take this occasion to emphasize the importance of long series of continuous measurements, particularly when various techniques are used to detect abrupt changes in the geomagnetic field, the geomagnetic jerks. Many novel concepts originating in dynamical systems and information theory have been developed, partly motivated by specific research questions from the geosciences. This continuously extending toolbox of nonlinear time series analysis is a key to understanding the complexity of the geomagnetic field. Here, motivated by these efforts, a series of entropy analyses is applied to geomagnetic field time series with the aim of detecting dynamical complexity changes associated with geomagnetic jerks.
Temporal evolution of total ozone and circulation patterns over European mid-latitudes
NASA Astrophysics Data System (ADS)
Monge Sanz, B. M.; Casale, G. R.; Palmieri, S.; Siani, A. M.
2003-04-01
Linear correlation analysis and the running correlation technique are used to investigate the interannual and interdecadal variations of total ozone (TO) over several mid-latitude European locations. The study includes the longest series of ozone data, that of the Swiss station of Arosa. The TO series have been related to time series of two circulation indices, the North Atlantic Oscillation Index (NAOI) and the Arctic Oscillation Index (AOI). The analysis has been performed with monthly data, using both series containing all months of the year and winter (DJFM) series. Special attention has been given to the winter series, which exhibit very high correlation coefficients with NAOI and AOI; interannual variations of this relationship are studied by applying the running correlation technique. The TO and circulation index data series have also been partitioned into their different time-scale components with the Kolmogorov-Zurbenko method. The long-term components indicate a strong inverse connection between total ozone and circulation patterns over the studied region during the last three decades. However, this relation has not always held: in earlier periods, differences in correlation amplitude and sign are detected.
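The Kolmogorov-Zurbenko (KZ) filter used above for time-scale partitioning is simply k iterated passes of an m-point moving average. A hedged sketch (the window and iteration counts here are illustrative, not those of the study; edge effects near the series ends are ignored):

```python
import numpy as np

def kz_filter(x, m, k):
    """Kolmogorov-Zurbenko filter: k iterations of an m-point moving average (m odd)."""
    w = np.ones(m) / m
    for _ in range(k):
        x = np.convolve(x, w, mode="same")  # "same" keeps length; ends are biased
    return x

# Example: KZ(15, 5) extracts a smooth long-term component of a monthly series.
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * np.arange(360) / 120) + rng.normal(0, 0.5, 360)
long_term = kz_filter(x.copy(), m=15, k=5)
short_term = x - long_term   # residual, shorter time-scale component
```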
Low Streamflow Forecasting using Minimum Relative Entropy
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2013-12-01
Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation such that the relative entropy of the underlying process is minimized, allowing the time series to be forecasted. Different priors, such as uniform, exponential, and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of low streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
NASA Astrophysics Data System (ADS)
Heinemeier, Jan; Jungner, Högne; Lindroos, Alf; Ringbom, Åsa; von Konow, Thorborg; Rud, Niels
1997-03-01
A method for refining lime mortar samples for ¹⁴C dating has been developed. It includes mechanical and chemical separation of mortar carbonate with optical control of the purity of the samples. The method has been applied to a large series of AMS datings of lime mortar from three medieval churches on the Åland Islands, Finland. The datings show convincing internal consistency and confine the construction time of the churches to AD 1280-1380, with a most probable date just before AD 1300. We have also applied the method to the controversial Newport Tower, Rhode Island, USA. Our mortar datings confine the building to colonial times in the 17th century and thus refute claims of a Viking origin of the tower. For the churches, a parallel series of datings of organic (charcoal) inclusions in the mortar shows less reliable results than the mortar samples, which is ascribed to poor association with the construction time.
NASA Astrophysics Data System (ADS)
Gábor Hatvani, István; Kern, Zoltán; Leél-Őssy, Szabolcs; Demény, Attila
2018-01-01
Uneven spacing is a common feature of sedimentary paleoclimate records, in many cases causing difficulties in the application of classical statistical and time series methods. Although special statistical tools do exist to assess unevenly spaced data directly, the transformation of such data into a temporally equidistant time series which may then be examined using commonly employed statistical tools remains, however, an unachieved goal. The present paper, therefore, introduces an approach to obtain evenly spaced time series (using cubic spline fitting) from unevenly spaced speleothem records with the application of a spectral guidance to avoid the spectral bias caused by interpolation and retain the original spectral characteristics of the data. The methodology was applied to stable carbon and oxygen isotope records derived from two stalagmites from the Baradla Cave (NE Hungary) dating back to the late 18th century. To show the benefit of the equally spaced records to climate studies, their coherence with climate parameters is explored using wavelet transform coherence and discussed. The obtained equally spaced time series are available at https://doi.org/10.1594/PANGAEA.875917.
Trading with the Future and Futures Trading. Series on Public Issues No. 14.
ERIC Educational Resources Information Center
Auernheimer, Leonardo
In this booklet, one of a series intended to apply economic principles to major social and political issues of the day, it is proposed that speculation is often misunderstood, particularly in the operation of the futures markets. These are markets in which obligations to consummate sales and purchases at some time in the future are traded at a…
Predicting hepatitis B monthly incidence rates using weighted Markov chains and time series methods.
Shahdoust, Maryam; Sadeghifar, Majid; Poorolajal, Jalal; Javanrooh, Niloofar; Amini, Payam
2015-01-01
Hepatitis B (HB) is a major cause of global mortality. Accurately predicting the trend of the disease can provide an appropriate basis for health policy on disease prevention. This paper aimed to apply three different methods to predict monthly incidence rates of HB. This historical cohort study was conducted on the HB incidence data of Hamadan Province, in the west of Iran, from 2004 to 2012. The Weighted Markov Chain (WMC) method, based on Markov chain theory, and two time series models, Holt Exponential Smoothing (HES) and SARIMA, were applied to the data. The different methods were compared on the percentage of correctly predicted incidence rates. The monthly incidence rates were clustered into two clusters serving as the states of the Markov chain. The correctly predicted percentages for the first and second clusters were (100, 0) for WMC, (84, 67) for HES, and (79, 47) for SARIMA. The overall incidence rate of HBV is estimated to decrease over time. The comparison of the three models indicated that, with respect to the existing seasonal trend and non-stationarity, HES gave the most accurate prediction of the incidence rates.
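As a hedged illustration of the SARIMA step (synthetic monthly incidence, not the Hamadan data; the model orders are assumptions), a seasonal ARIMA can be fitted and forecast with statsmodels:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
months = 108  # stand-in for 2004-2012 monthly data
seasonal = 2 + np.sin(2 * np.pi * np.arange(months) / 12)
y = np.maximum(seasonal + rng.normal(0, 0.3, months), 0)  # monthly incidence rate

# Seasonal ARIMA with a 12-month period; orders (1,0,1)(1,0,1,12) are illustrative.
model = sm.tsa.SARIMAX(y, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12))
fit = model.fit(disp=False)
print(fit.forecast(steps=12))  # next year's predicted incidence rates
```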
Applying dynamic Bayesian networks to perturbed gene expression data.
Dojer, Norbert; Gambin, Anna; Mizera, Andrzej; Wilczyński, Bartek; Tiuryn, Jerzy
2006-05-08
A central goal of molecular biology is to understand the regulatory mechanisms of gene transcription and protein synthesis. Because of their solid basis in statistics, which allows the stochastic aspects of gene expression and noisy measurements to be handled in a natural way, Bayesian networks are attractive for inferring gene interaction structure from microarray experiment data. However, the basic formalism has some disadvantages, e.g. it is sometimes hard to distinguish between the origin and the target of an interaction. Two kinds of microarray experiments yield data particularly rich in information regarding the direction of interactions: time series and perturbation experiments. In order to handle them correctly, the basic formalism must be modified. For example, dynamic Bayesian networks (DBN) apply to time series microarray data. To our knowledge, the DBN technique had not previously been applied in the context of perturbation experiments. We extend the framework of dynamic Bayesian networks in order to incorporate perturbations. Moreover, an exact algorithm for inferring an optimal network is proposed, and a discretization method specialized for time series data from perturbation experiments is introduced. We apply our procedure to realistic simulated data. The results are compared with those obtained by standard DBN learning techniques, and the advantages of using an exact learning algorithm instead of heuristic methods are analyzed. We show that the quality of inferred networks dramatically improves when using data from perturbation experiments. We also conclude that the exact algorithm should be used when possible, i.e. when the considered set of genes is small enough.
NASA Astrophysics Data System (ADS)
Wang, Duan; Podobnik, Boris; Horvatić, Davor; Stanley, H. Eugene
2011-04-01
We propose a modified time lag random matrix theory in order to study time-lag cross correlations in multiple time series. We apply the method to 48 world indices, one for each of 48 different countries. We find long-range power-law cross correlations in the absolute values of returns that quantify risk, and find that they decay much more slowly than cross correlations between the returns. The magnitude of the cross correlations constitutes “bad news” for international investment managers who may believe that risk is reduced by diversifying across countries. We find that when a market shock is transmitted around the world, the risk decays very slowly. We explain these time-lag cross correlations by introducing a global factor model (GFM) in which all index returns fluctuate in response to a single global factor. For each pair of individual time series of returns, the cross correlations between returns (or magnitudes) can be modeled with the autocorrelations of the global factor returns (or magnitudes). We estimate the global factor using principal component analysis, which minimizes the variance of the residuals after removing the global trend. Using random matrix theory, a significant fraction of the world index cross correlations can be explained by the global factor, which supports the utility of the GFM. We demonstrate applications of the GFM in forecasting risks at the world level, and in finding uncorrelated individual indices. We find ten indices that are practically uncorrelated with the global factor and with the remainder of the world indices, which is relevant information for world managers in reducing their portfolio risk. Finally, we argue that this general method can be applied to a wide range of phenomena in which time series are measured, ranging from seismology and physiology to atmospheric geophysics.
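A minimal sketch of the global factor idea (synthetic returns, not the 48 world indices; PCA is done by eigendecomposition of the covariance matrix, and magnitudes stand in for the risk measure):

```python
import numpy as np

rng = np.random.default_rng(3)
T, N = 2000, 10
global_factor = rng.normal(0, 1, T)
loadings = rng.uniform(0.5, 1.5, N)
returns = np.outer(global_factor, loadings) + rng.normal(0, 1, (T, N))

# PCA via covariance eigendecomposition; PC1 approximates the global factor (up to sign).
R = returns - returns.mean(axis=0)
vals, vecs = np.linalg.eigh(R.T @ R / T)
pc1 = R @ vecs[:, -1]                       # eigh sorts ascending, so last = largest
print("corr(PC1, true factor):", round(abs(np.corrcoef(pc1, global_factor)[0, 1]), 3))

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t+lag]."""
    return np.corrcoef(x[:-lag], y[lag:])[0, 1] if lag else np.corrcoef(x, y)[0, 1]

# Cross correlations of magnitudes (a risk proxy) decay slowly with lag.
mags = np.abs(R)
print("|r| cross-correlations at lags 0,1,5,20:",
      [round(lagged_corr(mags[:, 0], mags[:, 1], k), 3) for k in (0, 1, 5, 20)])
```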
Wavelet analysis in ecology and epidemiology: impact of statistical tests
Cazelles, Bernard; Cazelles, Kévin; Chavez, Mario
2014-01-01
Wavelet analysis is now frequently used to extract information from ecological and epidemiological time series. Statistical hypothesis tests are conducted on associated wavelet quantities to assess the likelihood that they are due to a random process. Such random processes represent null models and are generally based on synthetic data that share some statistical characteristics with the original time series. This allows the comparison of null statistics with those obtained from original time series. When creating synthetic datasets, different techniques of resampling result in different characteristics shared by the synthetic time series. Therefore, it becomes crucial to consider the impact of the resampling method on the results. We have addressed this point by comparing seven different statistical testing methods applied with different real and simulated data. Our results show that statistical assessment of periodic patterns is strongly affected by the choice of the resampling method, so two different resampling techniques could lead to two different conclusions about the same time series. Moreover, our results clearly show the inadequacy of resampling series generated by white noise and red noise that are nevertheless the methods currently used in the wide majority of wavelets applications. Our results highlight that the characteristics of a time series, namely its Fourier spectrum and autocorrelation, are important to consider when choosing the resampling technique. Results suggest that data-driven resampling methods should be used such as the hidden Markov model algorithm and the ‘beta-surrogate’ method. PMID:24284892
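The red-noise null model criticized above can be made concrete with a small sketch: fit an AR(1) ("red noise") process to the data and compare a test statistic against the surrogate distribution (illustrative statistic and series; not the seven methods compared in the paper):

```python
import numpy as np

def ar1_surrogate(x, rng):
    """Surrogate series with the lag-1 autocorrelation and variance of x."""
    x = x - x.mean()
    phi = np.corrcoef(x[:-1], x[1:])[0, 1]
    noise = rng.normal(0, x.std() * np.sqrt(1 - phi**2), len(x))
    s = np.zeros(len(x))
    for t in range(1, len(x)):
        s[t] = phi * s[t - 1] + noise[t]
    return s

rng = np.random.default_rng(4)
x = np.sin(2 * np.pi * np.arange(512) / 32) + rng.normal(0, 1, 512)
stat = np.abs(np.fft.rfft(x)).max()          # strength of the dominant periodicity
null = [np.abs(np.fft.rfft(ar1_surrogate(x, rng))).max() for _ in range(200)]
print("p-value ~", np.mean([n >= stat for n in null]))
```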
Binding Isotherms and Time Courses Readily from Magnetic Resonance.
Xu, Jia; Van Doren, Steven R
2016-08-16
Evidence is presented that binding isotherms, simple or biphasic, can be extracted directly from noninterpreted, complex 2D NMR spectra using principal component analysis (PCA) to reveal the largest trend(s) across the series. This approach renders peak picking unnecessary for tracking population changes. In 1:1 binding, the first principal component captures the binding isotherm from NMR-detected titrations in fast, slow, and even intermediate and mixed exchange regimes, as illustrated for phospholigand associations with proteins. Although the sigmoidal shifts and line broadening of intermediate exchange distort binding isotherms constructed conventionally, applying PCA directly to these spectra, along with Pareto scaling, overcomes the distortion. Applying PCA to time-domain NMR data also yields binding isotherms from titrations in fast or slow exchange. The algorithm also readily extracts time courses, such as breathing and heart rate, from magnetic resonance imaging movies of the chest. Similarly, two-step binding processes detected by NMR are easily captured by principal components 1 and 2. PCA obviates the customary focus on specific peaks or regions of images. Applying it directly to a series of complex data will easily delineate binding isotherms, equilibrium shifts, and time courses of reactions or fluctuations.
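A hedged sketch of the core trick: the first principal component of a stack of (here synthetic, 1D) spectra recorded along a titration recovers the binding isotherm without peak picking; real 2D spectra would first be flattened to vectors, and Pareto scaling is omitted here:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
npts, ntitr = 500, 12
free = np.exp(-0.5 * ((np.arange(npts) - 200) / 8) ** 2)   # "free" peak
bound = np.exp(-0.5 * ((np.arange(npts) - 300) / 8) ** 2)  # "bound" peak
frac_bound = np.linspace(0, 1, ntitr)                      # true isotherm
spectra = (np.outer(1 - frac_bound, free) + np.outer(frac_bound, bound)
           + rng.normal(0, 0.01, (ntitr, npts)))

# PC1 scores across the titration series track the bound fraction (up to sign/scale).
pc1 = PCA(n_components=1).fit_transform(spectra).ravel()
print("corr(PC1, true isotherm):", round(abs(np.corrcoef(pc1, frac_bound)[0, 1]), 3))
```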
A probabilistic method for constructing wave time-series at inshore locations using model scenarios
Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.
2014-01-01
Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
Correlation and Stacking of Relative Paleointensity and Oxygen Isotope Data
NASA Astrophysics Data System (ADS)
Lurcock, P. C.; Channell, J. E.; Lee, D.
2012-12-01
The transformation of a depth-series into a time-series is routinely implemented in the geological sciences. This transformation often involves correlation of a depth-series to an astronomically calibrated time-series. Eyeball tie-points with linear interpolation are still regularly used, although these have the disadvantages of being non-repeatable and not based on firm correlation criteria. Two automated correlation methods are compared: the simulated annealing algorithm (Huybers and Wunsch, 2004) and the Match protocol (Lisiecki and Lisiecki, 2002). Simulated annealing seeks to minimize energy (cross-correlation) as "temperature" is slowly decreased. The Match protocol divides records into intervals, applies penalty functions that constrain accumulation rates, and minimizes the sum of the squares of the differences between two series while maintaining the data sequence in each series. Paired relative paleointensity (RPI) and oxygen isotope records, such as those from IODP Site U1308 and/or reference stacks such as LR04 and PISO, are warped using known warping functions, and then the un-warped and warped time-series are correlated to evaluate the efficiency of the correlation methods. Correlations are performed in tandem to simultaneously optimize RPI and oxygen isotope data. Noise spectra are introduced at differing levels to determine correlation efficiency as noise levels change. A third potential method, known as dynamic time warping, involves minimizing the sum of distances between correlated point pairs across the whole series. A "cost matrix" between the two series is analyzed to find a least-cost path through the matrix. This least-cost path is used to nonlinearly map the time/depth of one record onto the depth/time of another. Dynamic time warping can be expanded to more than two dimensions and used to stack multiple time-series. This procedure can improve on arithmetic stacks, which often lose coherent high-frequency content during the stacking process.
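A compact sketch of the dynamic time warping step described above: fill a cost matrix between two series, then the least-cost path through it gives the nonlinear time mapping (toy series; real depth/time correlation would add accumulation-rate constraints):

```python
import numpy as np

def dtw(a, b):
    """Return the DTW distance and the accumulated cost matrix for series a, b."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m], D

t = np.linspace(0, 2 * np.pi, 80)
a = np.sin(t)
b = np.sin(t ** 1.15 / t.max() ** 0.15)   # nonlinearly "warped" copy of a
dist, D = dtw(a, b)
print("DTW distance:", round(dist, 3))
```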
tsiR: An R package for time-series Susceptible-Infected-Recovered models of epidemics.
Becker, Alexander D; Grenfell, Bryan T
2017-01-01
tsiR is an open source software package implemented in the R programming language designed to analyze infectious disease time-series data. The software extends a well-studied and widely-applied algorithm, the time-series Susceptible-Infected-Recovered (TSIR) model, to infer parameters from incidence data, such as contact seasonality, and to forward simulate the underlying mechanistic model. The tsiR package aggregates a number of different fitting features previously described in the literature in a user-friendly way, providing support for their broader adoption in infectious disease research. Also included in tsiR are a number of diagnostic tools to assess the fit of the TSIR model. This package should be useful for researchers analyzing incidence data for fully-immunizing infectious diseases.
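For orientation, a minimal forward simulation of the TSIR recursion that the package fits, I[t+1] ~ Poisson(beta[t mod 26] * S[t] * I[t]^alpha), in Python rather than R; all parameter values (seasonal contact rates, births, the +1 immigration term) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
T, alpha = 520, 0.97                      # biweekly steps, mixing exponent
beta = 1.2e-5 * (1 + 0.25 * np.cos(2 * np.pi * np.arange(26) / 26))  # seasonal contact
births = 1500                             # births per biweek (assumed)
S, I = np.empty(T), np.empty(T)
S[0], I[0] = 180_000, 50
for t in range(T - 1):
    # +1 acts as a small immigration term so the chain does not go extinct.
    lam = beta[t % 26] * S[t] * (I[t] + 1) ** alpha
    I[t + 1] = rng.poisson(lam)
    S[t + 1] = max(S[t] + births - I[t + 1], 0.0)
print("mean biweekly incidence:", round(I.mean(), 1))
```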
RankExplorer: Visualization of Ranking Changes in Large Time Series Data.
Shi, Conglei; Cui, Weiwei; Liu, Shixia; Xu, Panpan; Chen, Wei; Qu, Huamin
2012-12-01
For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations.
Miao, Beibei; Dou, Chao; Jin, Xuebo
2016-01-01
The storage volume of an internet data center is a classical time series, and predicting it is of considerable business value. However, the storage volume series from a data center is always "dirty": it contains noise, missing data, and outliers, so it is necessary to extract the main trend of the storage volume series before further prediction processing. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the "dirty" data; then cubic spline interpolation and averaging are used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experimental results show that the developed method can estimate the main trend of the storage volume series accurately and makes a great contribution to predicting the future volume value. PMID:28090205
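A hedged sketch of the two-stage idea, assuming a scalar local-level Kalman filter to suppress noise and outliers, followed by a cubic spline through a coarse resampling of the filtered series (noise variances q and r are assumptions, not fitted values):

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(7)
t = np.arange(200.0)
truth = 50 + 0.2 * t + 5 * np.sin(t / 20)
obs = truth + rng.normal(0, 2, t.size)
obs[rng.choice(t.size, 8, replace=False)] += rng.normal(0, 25, 8)  # inject outliers

# Local-level Kalman filter: state x, state variance p, process var q, measurement var r.
x, p, q, r = obs[0], 1.0, 0.5, 16.0
filtered = np.empty_like(obs)
for i, z in enumerate(obs):
    p += q                       # predict
    k = p / (p + r)              # Kalman gain
    x += k * (z - x)             # update with observation z
    p *= (1 - k)
    filtered[i] = x

# Resample the filtered series on a coarse grid and spline through it.
knots = np.arange(0, 200, 10)
trend = CubicSpline(knots, filtered[knots])(t)
```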
Movie-maps of low-latitude magnetic storm disturbance
NASA Astrophysics Data System (ADS)
Love, Jeffrey J.; Gannon, Jennifer L.
2010-06-01
We present 29 movie-maps of low-latitude horizontal-intensity magnetic disturbance for the years 1999-2006: 28 recording magnetic storms and 1 magnetically quiescent period. The movie-maps are derived from magnetic vector time series data collected at up to 25 ground-based observatories. Using a technique similar to that used in the calculation of Dst, a quiet time baseline is subtracted from the time series from each observatory. The remaining disturbance time series are shown in a polar coordinate system that accommodates both Earth rotation and the universal time dependence of magnetospheric disturbance. Each magnetic storm recorded in the movie-maps is different. While some standard interpretations about the storm time equatorial ring current appear to apply to certain moments and certain phases of some storms, the movie-maps also show substantial variety in the local time distribution of low-latitude magnetic disturbance, especially during storm commencements and storm main phases. All movie-maps are available at the U.S. Geological Survey Geomagnetism Program Web site (http://geomag.usgs.gov).
Fluctuations in Wikipedia access-rate and edit-event data
NASA Astrophysics Data System (ADS)
Kämpf, Mirko; Tismer, Sebastian; Kantelhardt, Jan W.; Muchnik, Lev
2012-12-01
Internet-based social networks often reflect extreme events in nature and society by drastic increases in user activity. We study and compare the dynamics of the two major complex processes necessary for information spread via the online encyclopedia ‘Wikipedia’, i.e., article editing (information upload) and article access (information viewing) based on article edit-event time series and (hourly) user access-rate time series for all articles. Daily and weekly activity patterns occur in addition to fluctuations and bursting activity. The bursts (i.e., significant increases in activity for an extended period of time) are characterized by a power-law distribution of durations of increases and decreases. For describing the recurrence and clustering of bursts we investigate the statistics of the return intervals between them. We find stretched exponential distributions of return intervals in access-rate time series, while edit-event time series yield simple exponential distributions. To characterize the fluctuation behavior we apply detrended fluctuation analysis (DFA), finding that most article access-rate time series are characterized by strong long-term correlations with fluctuation exponents α≈0.9. The results indicate significant differences in the dynamics of information upload and access and help in understanding the complex process of collecting, processing, validating, and distributing information in self-organized social networks.
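A minimal sketch of the DFA computation used above: build the profile, detrend it linearly in windows of size n, and read the exponent alpha from the slope of log F(n) versus log n (white noise gives alpha near 0.5; the access-rate series above were reported near 0.9):

```python
import numpy as np

def dfa(x, scales):
    y = np.cumsum(x - np.mean(x))              # profile of the series
    F = []
    for n in scales:
        nseg = len(y) // n
        segs = y[:nseg * n].reshape(nseg, n)
        t = np.arange(n)
        res = []
        for s in segs:
            c = np.polyfit(t, s, 1)            # linear detrend per window
            res.append(np.mean((s - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(res)))        # fluctuation function F(n)
    return np.array(F)

rng = np.random.default_rng(8)
x = rng.normal(0, 1, 10000)                    # white noise -> alpha ~ 0.5
scales = np.unique(np.logspace(1, 3, 15).astype(int))
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print("DFA exponent:", round(alpha, 2))
```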
Liao, Fuyuan; Jan, Yih-Kuen
2012-06-01
This paper presents a recurrence network approach for the analysis of skin blood flow dynamics in response to loading pressure. Recurrence is a fundamental property of many dynamical systems, which can be explored in phase spaces constructed from observational time series. A visualization tool of recurrence analysis called recurrence plot (RP) has been proved to be highly effective to detect transitions in the dynamics of the system. However, it was found that delay embedding can produce spurious structures in RPs. Network-based concepts have been applied for the analysis of nonlinear time series recently. We demonstrate that time series with different types of dynamics exhibit distinct global clustering coefficients and distributions of local clustering coefficients and that the global clustering coefficient is robust to the embedding parameters. We applied the approach to study skin blood flow oscillations (BFO) response to loading pressure. The results showed that global clustering coefficients of BFO significantly decreased in response to loading pressure (p<0.01). Moreover, surrogate tests indicated that such a decrease was associated with a loss of nonlinearity of BFO. Our results suggest that the recurrence network approach can practically quantify the nonlinear dynamics of BFO.
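A hedged sketch of a recurrence network: threshold the pairwise distances of time-delay embedded states into an adjacency matrix, then compute a global clustering coefficient (here networkx transitivity, one common definition; the embedding parameters and 10% recurrence rate are illustrative assumptions):

```python
import numpy as np
import networkx as nx

def recurrence_network(x, dim=3, tau=2, eps=None):
    # Time-delay embedding of the scalar series.
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = np.percentile(d, 10)              # recurrence rate ~10%
    A = (d < eps) & ~np.eye(n, dtype=bool)      # threshold; no self-loops
    return nx.from_numpy_array(A.astype(int))

rng = np.random.default_rng(9)
sine = np.sin(np.arange(500) * 0.2)
noise = rng.normal(0, 1, 500)
for name, ts in [("sine", sine), ("noise", noise)]:
    print(name, round(nx.transitivity(recurrence_network(ts)), 3))
```

Regular dynamics yield a markedly higher clustering coefficient than noise, which is the contrast the study exploits for blood flow oscillations.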
Temporal data mining for the quality assessment of hemodialysis services.
Bellazzi, Riccardo; Larizza, Cristiana; Magni, Paolo; Bellazzi, Roberto
2005-05-01
This paper describes the temporal data mining aspects of a research project that deals with the definition of methods and tools for the assessment of the clinical performance of hemodialysis (HD) services, on the basis of the time series automatically collected during hemodialysis sessions. Intelligent data analysis and temporal data mining techniques are applied to gain insight and to discover knowledge on the causes of unsatisfactory clinical results. In particular, two new methods for association rule discovery and temporal rule discovery are applied to the time series. Such methods exploit several pre-processing techniques, comprising data reduction, multi-scale filtering and temporal abstractions. We have analyzed the data of more than 5800 dialysis sessions coming from 43 different patients monitored for 19 months. The qualitative rules associating the outcome parameters and the measured variables were examined by the domain experts, which were able to distinguish between rules confirming available background knowledge and unexpected but plausible rules. The new methods proposed in the paper are suitable tools for knowledge discovery in clinical time series. Their use in the context of an auditing system for dialysis management helped clinicians to improve their understanding of the patients' behavior.
NASA Astrophysics Data System (ADS)
Baldysz, Zofia; Nykiel, Grzegorz; Figurski, Mariusz; Szafranek, Karolina; Kroszczynski, Krzysztof; Araszkiewicz, Andrzej
2015-04-01
In recent years, the GNSS system has begun to play an increasingly important role in research related to climate monitoring. Based on the GPS system, which has the longest operational record in comparison with other systems, and a common computational strategy applied to all observations, long and homogeneous ZTD (Zenith Tropospheric Delay) time series were derived. This paper presents results of an analysis of 16-year ZTD time series obtained from the EPN (EUREF Permanent Network) reprocessing performed by the Military University of Technology. To maintain the uniformity of the data, the analyzed period of time (1998-2013) is exactly the same for all stations: observations carried out before 1998 were removed from the time series, and observations processed using a different strategy were recalculated according to the MUT LAC approach. For all 16-year time series (59 stations), Lomb-Scargle periodograms were created to obtain information about the oscillations in the ZTD time series. Because strong annual oscillations disturb the character of oscillations with smaller amplitudes and thus hinder their investigation, Lomb-Scargle periodograms were also created for time series with the annual oscillations removed, in order to verify the presence of semi-annual, ter-annual, and quarter-annual oscillations. The linear trend and seasonal components were estimated using least squares estimation (LSE), and the Mann-Kendall trend test was used to confirm the presence of the linear trend designated by the LSE method. In order to verify the effect of the length of the time series on the estimated size of the linear trend, two different lengths of ZTD time series were compared. To carry out this comparative analysis, 30 stations which have been operating since 1996 were selected. For these stations two periods were analyzed: a shortened 16-year period (1998-2013) and the full 18-year period (1996-2013). For some stations the additional two years of observations had a significant impact on the size of the linear trend; only for 4 stations was the size of the linear trend exactly the same for both periods. In one case, the trend changed from negative (16-year time series) to positive (18-year time series). The average value of the linear trends for the 16-year time series is 1.5 mm/decade, but their spatial distribution is not uniform. The average value of the linear trends for the 18-year time series is 2.0 mm/decade, with a better spatial distribution and smaller discrepancies.
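A minimal sketch of the Lomb-Scargle step on a synthetic ZTD-like daily series (amplitudes and the period grid are illustrative assumptions; scipy's implementation takes angular frequencies):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(10)
t = np.arange(0, 16 * 365.25)                        # days over 16 years
ztd = (2.4 + 0.10 * np.sin(2 * np.pi * t / 365.25)   # annual cycle
           + 0.02 * np.sin(4 * np.pi * t / 365.25)   # semi-annual cycle
           + rng.normal(0, 0.03, t.size))

periods = np.linspace(60, 800, 2000)                 # candidate periods in days
omega = 2 * np.pi / periods                          # angular frequencies
power = lombscargle(t, ztd - ztd.mean(), omega, normalize=True)
print("strongest period ~", round(periods[np.argmax(power)], 1), "days")
```

Subtracting a fitted annual harmonic before recomputing the periodogram mirrors the paper's removal of the dominant annual oscillation to expose weaker terms.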
Zhang, Hong; Zhang, Sheng; Wang, Ping; Qin, Yuzhe; Wang, Huifeng
2017-07-01
Particulate matter with aerodynamic diameter below 10 μm (PM10) is difficult to forecast because of the uncertainties in describing the emission and meteorological fields. This paper proposes a wavelet-ARMA/ARIMA model to forecast short-term series of PM10 concentrations. It was evaluated in experiments using a 10-year data set of daily PM10 concentrations from 4 stations located in Taiyuan, China. The results indicated the following: (1) PM10 concentrations in Taiyuan had a decreasing trend from 2005 to 2012 but increased in 2013; concentrations also showed an obvious seasonal fluctuation related to coal-fired heating in winter and early spring. (2) Spatial differences among the four stations showed that PM10 concentrations in industrial and heavily trafficked areas were higher than those in residential and suburban areas. (3) Wavelet analysis revealed that the trend variation and the changes of the PM10 concentrations in Taiyuan were complicated; wavelet analysis filters noisy signals and identifies the variation trend and the fluctuation of the PM10 time-series data, while wavelet decomposition and reconstruction reduce the nonstationarity of the data and thus improve prediction accuracy. (4) Compared with the traditional ARMA/ARIMA methods, the proposed wavelet-ARMA/ARIMA method effectively reduces the forecasting error, improves the prediction accuracy, and realizes multiple-time-scale prediction, and it could be efficiently and successfully applied in the PM10 forecasting field.
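A hedged sketch of one common wavelet-ARIMA hybrid (not necessarily the authors' exact variant): split the series into wavelet components, fit an ARIMA to each component, and sum the component forecasts. Wavelet choice, decomposition level, and ARIMA orders are assumptions; boundary handling would need care in practice:

```python
import numpy as np
import pywt
import statsmodels.api as sm

rng = np.random.default_rng(15)
t = np.arange(730)
pm10 = 80 + 30 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 10, 730)  # daily stand-in

level, wavelet, horizon = 3, "db4", 30
coeffs = pywt.wavedec(pm10, wavelet, level=level)
forecast = np.zeros(horizon)
for i in range(len(coeffs)):
    # Reconstruct component i alone by zeroing all other coefficient arrays.
    sel = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    comp = pywt.waverec(sel, wavelet)[: len(pm10)]
    fit = sm.tsa.SARIMAX(comp, order=(2, 0, 1)).fit(disp=False)
    forecast += fit.forecast(steps=horizon)
print(forecast[:5])
```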
NASA Astrophysics Data System (ADS)
Perone, A.; Lombardi, F.; Marchetti, M.; Tognetti, R.; Lasserre, B.
2016-10-01
Tree rings reveal climatic variations through the years, but also the effect of solar activity in influencing the climate on a large scale. In order to investigate the role of solar cycles in climatic variability and to analyse their influence on tree growth, we focused on tree-ring chronologies of Araucaria angustifolia and Araucaria araucana in four study areas: Irati and Curitiba in Brazil, Caviahue in Chile, and Tolhuaca in Argentina. We obtained average tree-ring chronologies of 218, 117, 439, and 849 years for these areas, respectively. Notably, the older chronologies also include the period of the Maunder and Dalton minima. To identify periodicities and trends observable in tree growth, the time series were analysed using spectral, wavelet, and cross-wavelet techniques. Multitaper analysis of annual growth rates identified two cycles with periodicities of 11 years (the Schwabe cycle) and 5.5 years (the second harmonic of the Schwabe cycle). In the Chilean and Argentinian sites, significant agreement between the tree-ring time series and the 11-year solar cycle was found during periods of maximum solar activity. Results also showed oscillations with periods of 2-7 years, probably induced by local environmental variations and possibly also related to El Niño events. Moreover, Morlet complex wavelet analysis was applied to study the most relevant variability factors affecting the tree-ring time series. Finally, we applied cross-wavelet spectral analysis to evaluate the time lags between the tree-ring and sunspot-number time series, as well as the interactions between tree rings, the Southern Oscillation Index (SOI), and temperature and precipitation. Trees sampled in Chile and Argentina showed more evident responses of tree-ring fluctuations to variations in short and long periodicities than the Brazilian ones. These results provide new evidence on the connections among solar activity, climate patterns, and tree rings over the centuries.
NASA Astrophysics Data System (ADS)
Hentze, Konrad; Thonfeld, Frank; Menz, Gunter
2017-10-01
In the discourse on land reform assessments, a significant lack of spatial and time-series data has been identified, especially with respect to Zimbabwe's 'Fast-Track Land Reform Programme' (FTLRP). At the same time, interest persists among land use change scientists in evaluating the causes of land use change and thereby increasing the explanatory power of remote sensing products. This study recognizes these demands and aims to provide input on both levels: evaluating the potential of satellite remote sensing time series to answer questions which evolved after intensive land redistribution efforts in Zimbabwe, and investigating how time-series analysis of the Normalized Difference Vegetation Index (NDVI) can be enhanced to provide information on land use change induced by land reform. To achieve this, two time-series methods are applied to MODIS NDVI data: Seasonal Trend Analysis (STA) and Breakpoint Analysis for Additive Season and Trend (BFAST). In our first analysis, linking agricultural productivity trends to different land tenure regimes shows that regional clustering of trends is more dominant than any relationship between tenure and trend, with a slightly negative slope for all regimes. We demonstrate that clusters of strong negative and positive productivity trends are the result of changing irrigation patterns. To locate emerging and fallow irrigation schemes in semi-arid Zimbabwe, a new multi-method approach is developed which allows mapping changes from bimodal seasonal phenological patterns to unimodal ones and vice versa. With an enhanced breakpoint analysis combining STA and BFAST, we provide a technique that can be applied at large scale to map the status and development of highly productive cropping systems, which are key for food production, national export, and local employment. We therefore conclude that the combination of existing and accessible time-series analysis methods achieves both goals: overcoming demonstrated limitations of MODIS-based trend analysis and enhancing knowledge of Zimbabwe's FTLRP.
NASA Astrophysics Data System (ADS)
Wu, Xiaoping; Abbondanza, Claudio; Altamimi, Zuheir; Chin, T. Mike; Collilieux, Xavier; Gross, Richard S.; Heflin, Michael B.; Jiang, Yan; Parker, Jay W.
2015-05-01
The current International Terrestrial Reference Frame is based on a piecewise linear site motion model and realized by reference epoch coordinates and velocities for a global set of stations. Although linear motions due to tectonic plates and glacial isostatic adjustment dominate geodetic signals, at today's millimeter precisions, nonlinear motions due to earthquakes, volcanic activity, ice mass loss, sea level rise, hydrological changes, and other processes become significant. Monitoring these (sometimes rapid) changes requires consistent and precise realization of the terrestrial reference frame (TRF) quasi-instantaneously. Here, we use a Kalman filter and smoother approach to combine time series from four space geodetic techniques to realize an experimental TRF through weekly time series of geocentric coordinates. In addition to secular, periodic, and stochastic components for station coordinates, the Kalman filter state variables also include daily Earth orientation parameters and transformation parameters from the input data frames to the combined TRF. Local tie measurements among colocated stations are used at their known or nominal epochs of observation, with comotion constraints applied to almost all colocated stations. The filter/smoother approach unifies different geodetic time series in a single geocentric frame. Fragmented and multitechnique tracking records at colocation sites are bridged together to form longer and coherent motion time series. While the time series approach to the TRF reflects the reality of a changing Earth more closely than the linear approximation model, the filter/smoother is computationally powerful and flexible, facilitating the incorporation of other data types and more advanced characterization of the stochastic behavior of geodetic time series.
Ranking streamflow model performance based on Information theory metrics
NASA Astrophysics Data System (ADS)
Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas
2016-04-01
Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to determine whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency metric increased as model complexity increased, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based parameters of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
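A minimal sketch of the symbolization step and of mean information gain under one common definition (the difference of block entropies of lengths L+1 and L); the alphabet size and block length are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def symbolize(x, n_symbols=4):
    """Map values to quantile-based symbols 0..n_symbols-1."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def block_entropy(sym, L):
    """Shannon entropy (bits) of length-L symbol blocks."""
    blocks = Counter(tuple(sym[i:i + L]) for i in range(len(sym) - L + 1))
    p = np.array(list(blocks.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(11)
# Smoothed exponential noise as a rough streamflow-like stand-in.
flow = np.convolve(rng.exponential(1, 4000), np.ones(5) / 5, mode="valid")
sym = symbolize(flow)
mig = block_entropy(sym, 3) - block_entropy(sym, 2)   # mean information gain
print("mean information gain (bits):", round(mig, 3))
```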
NASA Astrophysics Data System (ADS)
Nole, Gabriele; Scorza, Francesco; Lanorte, Antonio; Manzi, Teresa; Lasaponara, Rosa
2015-04-01
This paper presents the development of a tool to integrate time series from active and passive satellite sensors (such as MODIS, SPOT Vegetation, Landsat, ASTER, COSMO, Sentinel) into a virtual laboratory to support studies on landscapes and archaeological landscapes, investigations of environmental change, and the estimation and monitoring of natural and anthropogenic risks. The virtual laboratory comprises both data and open source tools specifically developed for the above-mentioned applications. Results are presented for investigations carried out with the implemented tools on land degradation issues and subtle changes ongoing in forested and natural areas. In detail, MODIS, SPOT Vegetation, and Landsat time series were analyzed by comparing the results of different statistical analyses; the results were integrated with ancillary data and evaluated against field surveys. The comparison of the outputs we obtained for the Basilicata Region from the satellite data analyses and from independent data sets clearly pointed out the reliability of the diverse change analyses we performed, at the pixel level, using MODIS, SPOT Vegetation, and Landsat TM data. Next steps will further advance the current Virtual Laboratory tools, extending the current facilities with new computational algorithms and applying them to other geographic regions. This research was performed within the framework of the project PO FESR Basilicata 2007/2013 - Progetto di cooperazione internazionale MITRA "Remote Sensing tecnologies for Natural and Cultural heritage Degradation Monitoring for Preservation and valorization" funded by Basilicata Region.
NASA Astrophysics Data System (ADS)
Jia, Duo; Wang, Cangjiao; Lei, Shaogang
2018-01-01
Mapping vegetation dynamic types in mining areas is significant for revealing the mechanisms of environmental damage and for guiding ecological construction. Dynamic types of vegetation can be identified by applying interannual normalized difference vegetation index (NDVI) time series. However, phase differences and time shifts in interannual time series decrease mapping accuracy in mining regions. To overcome these problems and to increase the accuracy of mapping vegetation dynamics, an interannual Landsat time series for optimum vegetation growing status was first constructed using the enhanced spatial and temporal adaptive reflectance fusion model algorithm. We then proposed a Markov random field optimized semisupervised Gaussian dynamic time warping kernel-based fuzzy c-means (FCM) clustering algorithm for interannual NDVI time series to map dynamic vegetation types in mining regions. The proposed algorithm was tested in the Shengli and Shendong mining regions, typical representatives of China's open-pit and underground mining regions, respectively. Experiments show that the proposed algorithm can solve the problems of phase differences and time shifts and achieves better performance when mapping vegetation dynamic types. The overall accuracies for the Shengli and Shendong mining regions were 93.32% and 89.60%, respectively, improvements of 7.32% and 25.84% compared with the original semisupervised FCM algorithm.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-09
... Airworthiness Directives; The Boeing Company Model 737-100, -200, -200C, -300, -400, and -500 Series Airplanes..., -200, -200C, -300, -400, and - 500 series airplanes. That AD currently requires a one-time inspection... 16211, March 31, 2006). The existing AD applies to all Model 737-100, -200, -200C, -300, -400, and -500...
Kanai, Masashi; Okamoto, Kazuya; Yamamoto, Yosuke; Yoshioka, Akira; Hiramoto, Shuji; Nozaki, Akira; Nishikawa, Yoshitaka; Yamaguchi, Daisuke; Tomono, Teruko; Nakatsui, Masahiko; Baba, Mika; Morita, Tatsuya; Matsumoto, Shigemi; Kuroda, Tomohiro; Okuno, Yasushi; Muto, Manabu
2017-01-01
Background: We aimed to develop an adaptable prognosis prediction model that could be applied at any time point during the treatment course for patients with cancer receiving chemotherapy, by applying time-series real-world big data. Methods: Between April 2004 and September 2014, 4,997 patients with cancer who had received systemic chemotherapy were registered in a prospective cohort database at Kyoto University Hospital. Of these, 2,693 patients with a death record were eligible for inclusion and were divided into training (n = 1,341) and test (n = 1,352) cohorts. In total, 3,471,521 laboratory values at 115,738 time points, covering 40 laboratory items [e.g., white blood cell counts and albumin (Alb) levels] monitored for 1 year before the death event, were used to construct the prognosis prediction models. All possible prediction models comprising three different items from the 40 laboratory items (C(40,3) = 9,880 models) were generated in the training cohort, and model selection was performed in the test cohort. The fitness of the selected models was externally validated in validation cohorts from three independent settings. Results: A prognosis prediction model utilizing Alb, lactate dehydrogenase, and neutrophils was selected based on its strong ability to predict death events within 1-6 months, and a set of six prediction models corresponding to 1, 2, 3, 4, 5, and 6 months was developed. The area under the curve (AUC) ranged from 0.852 for the 1-month model to 0.713 for the 6-month model. External validation supported the performance of these models. Conclusion: By applying time-series real-world big data, we successfully developed a set of six adaptable prognosis prediction models for patients with cancer receiving chemotherapy. PMID:28837592
Janik, M; Bossew, P; Kurihara, O
2018-07-15
Machine learning is a class of statistical techniques which has proven to be a powerful tool for modelling the behaviour of complex systems, in which response quantities depend on assumed controls or predictors in a complicated way. In this paper, as our first purpose, we propose the application of machine learning to reconstruct incomplete or irregularly sampled time series of indoor radon (222Rn). The physical assumption underlying the modelling is that the Rn concentration in the air is controlled by environmental variables such as air temperature and pressure. The algorithms "learn" from complete sections of the multivariate series, derive a dependence model, and apply it to sections where the controls are available but not the response (Rn), in this way completing the Rn series. Three machine learning techniques are applied in this study, namely random forest, its extension called the gradient boosting machine, and deep learning. For comparison, we apply classical multiple regression in a generalized linear model version. Performance of the models is evaluated through different metrics, and the performance of the gradient boosting machine is found to be superior to that of the other techniques. By applying learning machines, we show, as our second purpose, that missing data or periods of Rn series can be reconstructed and resampled on a regular grid reasonably well, if data for appropriate physical controls are available. The techniques also identify to which degree the assumed controls contribute to imputing missing Rn values. Our third purpose, though no less important from the viewpoint of physics, is identifying to which degree physical, in this case environmental, variables are relevant as Rn predictors, or in other words, which predictors explain most of the temporal variability of Rn. We show that the variables which contribute most to the Rn series reconstruction are temperature, relative humidity, and day of the year. The first two are physical predictors, while "day of the year" is a statistical proxy or surrogate for missing or unknown predictors. Copyright © 2018 Elsevier B.V. All rights reserved.
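A hedged sketch of the imputation idea with a gradient boosting machine, on synthetic data (the predictor set, data-generating relation, and 20% missingness are assumptions, not the authors' setup):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(12)
n = 3000
temp = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
rh = np.clip(60 + rng.normal(0, 10, n), 0, 100)
doy = np.arange(n) % 365
# Synthetic Rn depends on temperature, humidity, and day of year, plus noise.
rn = 40 + 1.5 * temp - 0.2 * rh + 5 * np.sin(2 * np.pi * doy / 365) + rng.normal(0, 5, n)

missing = rng.random(n) < 0.2                 # 20% of Rn values treated as missing
X = np.column_stack([temp, rh, doy])
model = GradientBoostingRegressor().fit(X[~missing], rn[~missing])  # learn on complete rows
rn_filled = rn.copy()
rn_filled[missing] = model.predict(X[missing])                       # impute the gaps
rmse = np.sqrt(np.mean((rn_filled[missing] - rn[missing]) ** 2))
print("RMSE on imputed gaps:", round(rmse, 2))
```

The fitted model's `feature_importances_` attribute gives the per-predictor contribution, mirroring the paper's ranking of temperature, humidity, and day of year.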
Introduction to Time Series Analysis
NASA Technical Reports Server (NTRS)
Hardin, J. C.
1986-01-01
The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.
Panel data analysis of cardiotocograph (CTG) data.
Horio, Hiroyuki; Kikuchi, Hitomi; Ikeda, Tomoaki
2013-01-01
Panel data analysis is a statistical method, widely used in econometrics, which deals with two-dimensional panel data collected over time and over individuals. The cardiotocograph (CTG), which monitors fetal heart rate (FHR) using Doppler ultrasound and uterine contractions using a strain gauge, is commonly used in the intrapartum treatment of pregnant women. Although the relationship between the FHR waveform pattern and outcomes such as umbilical blood gas data at delivery has long been analyzed, no accumulated FHR patterns from a large number of cases exist. Just as time-series economic fluctuations, such as consumption trends, have been studied in econometrics using panel data consisting of time-series and cross-sectional data, we tried to apply this method to CTG data. Panel data composed of symbolized segments of the FHR pattern can be easily handled, and a perinatologist can obtain a whole FHR pattern view from the microscopic level of the time-series FHR data.
Reibling, Nadine
2013-09-01
This paper outlines the capabilities of pooled cross-sectional time series methodology for the international comparison of health system performance in population health. It shows how common model specifications can be improved so that they not only better address the specific nature of time series data on population health but are also more closely aligned with our theoretical expectations of the effect of healthcare systems. Three methodological innovations for this field of applied research are discussed: (1) how dynamic models help us understand the timing of effects, (2) how parameter heterogeneity can be used to compare performance across countries, and (3) how multiple imputation can be used to deal with incomplete data. We illustrate these methodological strategies with an analysis of infant mortality rates in 21 OECD countries between 1960 and 2008 using OECD Health Data. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
Estimating the effective spatial resolution of an AVHRR time series
Meyer, D.J.
1996-01-01
A method is proposed to estimate the spatial degradation of geometrically rectified AVHRR data resulting from misregistration and off-nadir viewing, and to infer the cumulative effect of these degradations over time. Misregistrations are measured using high resolution imagery as a geometric reference, and pixel sizes are computed directly from satellite zenith angles. The influence of neighbouring features on a nominal 1 km by 1 km pixel over a given site is estimated from the above information, and expressed as a spatial distribution whose spatial frequency response is used to define an effective field-of-view (EFOV) for a time series. In a demonstration of the technique applied to images from the Conterminous U.S. AVHRR data set, an EFOV of 3.1 km in the east-west dimension and 19 km in the north-south dimension was estimated for a time series accumulated over a grasslands test site.
FBST for Cointegration Problems
NASA Astrophysics Data System (ADS)
Diniz, M.; Pereira, C. A. B.; Stern, J. M.
2008-11-01
In order to estimate causal relations, time series econometrics has to be aware of spurious correlation, a problem first mentioned by Yule [21]. To solve the problem, one can work with differenced series or use multivariate models like VAR or VEC models. In this case, the analysed series will present a long-run relation, i.e., a cointegration relation. Even though the Bayesian literature about inference on VAR/VEC models is quite advanced, Bauwens et al. [2] highlight that "the topic of selecting the cointegrating rank has not yet given very useful and convincing results." This paper presents the Full Bayesian Significance Test applied to cointegration rank selection tests in multivariate (VAR/VEC) time series models and shows how to implement it using data sets available in the literature as well as simulated ones. A standard non-informative prior is assumed.
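For orientation only: the sketch below runs the classical likelihood-based counterpart of this problem, Johansen's trace test for the cointegration rank, as implemented in statsmodels. It is the standard frequentist baseline the FBST competes with, not an implementation of the FBST itself.

```python
# Classical (non-Bayesian) cointegration rank selection with the
# Johansen trace test in statsmodels.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import select_coint_rank

rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(size=500))          # one common stochastic trend
y = np.column_stack([trend + rng.normal(size=500),
                     0.5 * trend + rng.normal(size=500)])  # rank 1 expected

res = select_coint_rank(y, det_order=0, k_ar_diff=1,
                        method="trace", signif=0.05)
print(res.rank)                                  # estimated cointegration rank
```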
Volatility behavior of visibility graph EMD financial time series from Ising interacting system
NASA Astrophysics Data System (ADS)
Zhang, Bo; Wang, Jun; Fang, Wen
2015-08-01
A financial market dynamics model is developed and investigated using a stochastic Ising system, the Ising model being the most popular ferromagnetic model in statistical physics. Applying two graph-based analyses and the multiscale entropy method, we investigate and compare the statistical volatility behavior of the return time series and the corresponding IMF series derived from the empirical mode decomposition (EMD) method. Real stock market indices are studied comparatively with the simulation data of the proposed model. Further, we find that the degree distribution of the visibility graph for the simulated series has power-law tails, and the assortative network exhibits the mixing pattern property. All these features are in agreement with the real market data; the research confirms that the financial model established by the Ising system is reasonable.
NASA Astrophysics Data System (ADS)
Di Piazza, A.; Cordano, E.; Eccel, E.
2012-04-01
The issue of climate change detection is considered a major challenge. In particular, high temporal resolution climate change scenarios are required in the evaluation of the effects of climate change on agricultural management (crop suitability, yields, risk assessment, etc.), energy production and water management. In this work, a "Weather Generator" technique was used for downscaling climate change scenarios for temperature. An R package (RMAWGEN, Cordano and Eccel, 2011 - available on http://cran.r-project.org) was developed to generate synthetic daily weather conditions using vector autoregressive (VAR) models. The VAR model was chosen for its ability to maintain the temporal and spatial correlations among variables. In particular, observed time series of daily maximum and minimum temperature are transformed into "new" normally-distributed variable time series which are used to calibrate the parameters of a VAR model by ordinary least squares. The implemented algorithm, applied to monthly mean climatic values downscaled from Global Climate Model predictions, can therefore generate several stochastic daily scenarios in which the statistical consistency among series is preserved. Further details are given in the RMAWGEN documentation. An application is presented here using a dataset of daily temperature time series recorded at 41 different sites of the Trentino region for the period 1958-2010. Temperature time series were pre-processed to fill missing values (by a site-specific calibrated Inverse Distance Weighting algorithm, corrected for elevation) and to remove inhomogeneities. Several climatic indices, useful for several impact assessment applications, were taken into account, and their time trends within the time series were analyzed. The indices range from the more classical ones, such as annual mean temperatures, seasonal mean temperatures and their anomalies (from the reference period 1961-1990), to the climate change indices selected from the list recommended by the World Meteorological Organization Commission for Climatology (WMO-CCL) and the Research Programme on Climate Variability and Predictability (CLIVAR) project's Expert Team on Climate Change Detection, Monitoring and Indices (ETCCDMI). Each index was applied both to observed (and processed) data and to synthetic time series produced by the Weather Generator, over the thirty-year reference period 1981-2010, in order to validate the procedure. Climate projections were statistically downscaled for a selection of sites for the two 30-year periods 2021-2050 and 2071-2099 of the European project "Ensembles" multi-model output (scenario A1B). The use of several climatic indices strengthens the trend analysis of both the generated synthetic series and future climate projections.
Analysis of crude oil markets with improved multiscale weighted permutation entropy
NASA Astrophysics Data System (ADS)
Niu, Hongli; Wang, Jun; Liu, Cheng
2018-03-01
Entropy measures have recently been used extensively to study the complexity properties of nonlinear systems. Weighted permutation entropy (WPE) can overcome the ignorance of amplitude information in plain permutation entropy (PE) and shows a distinctive ability to extract complexity information from data having abrupt changes in magnitude. The improved (sometimes called composite) multiscale (MS) method has the advantage of reducing errors and improving accuracy when evaluating multiscale entropy values of time series that are not sufficiently long. In this paper, we combine the merits of WPE and the improved MS method to propose the improved multiscale weighted permutation entropy (IMWPE) method for the complexity investigation of a time series. It is then validated on artificial data (white noise and 1/f noise) and on real market data of Brent and Daqing crude oil. Meanwhile, the complexity properties of crude oil markets are explored for return series, volatility series with multiple exponents, and EEMD-produced intrinsic mode functions (IMFs), which represent different frequency components of the return series. Moreover, the instantaneous amplitude and frequency of Brent and Daqing crude oil are analyzed via the Hilbert transform applied to each IMF.
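A compact sketch of WPE and its composite multiscale extension as described above (plain NumPy); the embedding dimension, delay and the log(m!) normalization are common conventions that may differ in detail from the paper's:

```python
# Weighted permutation entropy: ordinal patterns weighted by the
# variance of each embedding window, normalized by log(m!).
import math
from collections import defaultdict
import numpy as np

def weighted_permutation_entropy(x, m=4, tau=1):
    x = np.asarray(x, dtype=float)
    weights = defaultdict(float)
    for i in range(len(x) - (m - 1) * tau):
        window = x[i:i + m * tau:tau]
        pattern = tuple(np.argsort(window))   # ordinal pattern
        weights[pattern] += np.var(window)    # weight = window variance
    total = sum(weights.values())
    p = np.array([w / total for w in weights.values()])
    return -np.sum(p * np.log(p)) / math.log(math.factorial(m))

def imwpe(x, m=4, scale=5):
    # composite multiscale: average WPE over the `scale` shifted
    # coarse-grainings of the series at this scale
    x = np.asarray(x, dtype=float)
    vals = []
    for k in range(scale):
        n = (len(x) - k) // scale
        coarse = x[k:k + n * scale].reshape(n, scale).mean(axis=1)
        vals.append(weighted_permutation_entropy(coarse, m))
    return np.mean(vals)
```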
NASA Technical Reports Server (NTRS)
Biezad, D. J.; Schmidt, D. K.; Leban, F.; Mashiko, S.
1986-01-01
Single-channel pilot manual control output in closed-tracking tasks is modeled in terms of linear discrete transfer functions which are parsimonious and guaranteed stable. The transfer functions are found by applying a modified superposition time series generation technique. A Levinson-Durbin algorithm is used to determine the filter which prewhitens the input, and a projective (least squares) fit of pulse response estimates is used to guarantee identified model stability. Results from two case studies are compared to previous findings, where the source data are relatively short records, approximately 25 seconds long. Time delay effects and pilot seasonalities are discussed and analyzed. It is concluded that single-channel time series controller modeling is feasible on short records, and that it is important for the analyst to determine a criterion for best time domain fit which allows association of model parameter values, such as pure time delay, with actual physical and physiological constraints. The purpose of the modeling is thus paramount.
Discovering time-lagged rules from microarray data using gene profile classifiers
2011-01-01
Background Gene regulatory networks have an essential role in every process of life. In this regard, the amount of genome-wide time series data is becoming increasingly available, providing the opportunity to discover the time-delayed gene regulatory networks that govern the majority of these molecular processes. Results This paper aims at reconstructing gene regulatory networks from multiple genome-wide microarray time series datasets. In this sense, a new model-free algorithm called GRNCOP2 (Gene Regulatory Network inference by Combinatorial OPtimization 2), which is a significant evolution of the GRNCOP algorithm, was developed using combinatorial optimization of gene profile classifiers. The method is capable of inferring potential time-delay relationships with any span of time between genes from various time series datasets given as input. The proposed algorithm was applied to time series data composed of twenty yeast genes that are highly relevant for the cell-cycle study, and the results were compared against several related approaches. The outcomes have shown that GRNCOP2 outperforms the contrasted methods in terms of the proposed metrics, and that the results are consistent with previous biological knowledge. Additionally, a genome-wide study on multiple publicly available time series data was performed. In this case, the experimentation has exhibited the soundness and scalability of the new method which inferred highly-related statistically-significant gene associations. Conclusions A novel method for inferring time-delayed gene regulatory networks from genome-wide time series datasets is proposed in this paper. The method was carefully validated with several publicly available data sets. The results have demonstrated that the algorithm constitutes a usable model-free approach capable of predicting meaningful relationships between genes, revealing the time-trends of gene regulation. PMID:21524308
Sensitivity analysis of machine-learning models of hydrologic time series
NASA Astrophysics Data System (ADS)
O'Reilly, A. M.
2017-12-01
Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus, the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing the forcing time series and computing the change in the response time series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.
NASA Astrophysics Data System (ADS)
Wang, H.; Cheng, J.
2017-12-01
A method to synthesize natural electric and magnetic time series is proposed, whereby the time series at a local site are derived using an impulse response and a reference (STIR). The method is based on the assumptions that the external source of the magnetic fields is uniform, and that the electric and magnetic fields acquired at the surface satisfy a time-independent linear relation in the frequency domain. According to the convolution theorem, we can synthesize natural electric and magnetic time series using the impulse responses of inter-station transfer functions with a reference. Applying this method, two impulse responses need to be estimated: the quasi-MT impulse response tensor and the horizontal magnetic impulse response tensor. These impulse response tensors relate the local horizontal electric and magnetic components to the horizontal magnetic components at a reference site, respectively. Some clean segments of time series are selected to estimate the impulse responses by the least-squares (LS) method. STIR is similar to STIN (Wang, 2017), but STIR does not need to estimate the inter-station transfer functions, and the synthesized data are more accurate at high frequency, where STIN fails when the inter-station transfer functions are severely contaminated. A test with good-quality MT data shows that the synthetic time series are similar to the natural electric and magnetic time series. For a contaminated AMT example, when this method is used to remove noise present at the local site, the scatter of the MT sounding curves is clearly reduced and the data quality is improved. *This work is funded by the National Key R&D Program of China (2017YFC0804105), the National Natural Science Foundation of China (41604064, 51574250), and the State Key Laboratory of Coal Resources and Safe Mining, China University of Mining & Technology (SKLCRSM16DC09).
Automatic location of L/H transition times for physical studies with a large statistical basis
NASA Astrophysics Data System (ADS)
González, S.; Vega, J.; Murari, A.; Pereira, A.; Dormido-Canto, S.; Ramírez, J. M.; JET-EFDA contributors
2012-06-01
Completely automatic techniques to estimate and validate L/H transition times can be essential in L/H transition analyses. The generation of databases with hundreds of transition times and without human intervention is an important step towards (a) L/H transition physics analysis, (b) validation of L/H theoretical models and (c) creation of L/H scaling laws. An entirely unattended methodology is presented in this paper to build large databases of transition times in JET using time series. The proposed technique has been applied to a dataset of 551 JET discharges between campaigns C21 and C26. For discharges that show a clear signature in the time series, a prediction is made through the localization properties of the wavelet transform. This prediction is accurate, with an uncertainty interval of ±3.2 ms. Discharges without a clear pattern in the time series are handled by an L/H mode classifier based on the discharges with a clear signature. In this case, the estimation error shows a distribution with mean and standard deviation of 27.9 ms and 37.62 ms, respectively. Two different regression methods have been applied to the measurements acquired at the transition times identified by the automatic system. The obtained scaling laws for the threshold power are not significantly different from those obtained using the data at the transition times determined manually by experts. The automatic methods make it possible to perform physical studies with a large number of discharges, showing, for example, that there are statistically different types of transitions characterized by different scaling laws.
Automated Bayesian model development for frequency detection in biological time series.
Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J
2011-06-24
A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.
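As a hedged aside, the Lomb-Scargle periodogram in SciPy is a standard non-Bayesian tool for exactly the irregular-sampling problem raised here; Bayesian Spectrum Analysis can be viewed as a probabilistic generalization of it. A toy example:

```python
# Frequency detection on an irregularly sampled, noisy series with the
# Lomb-Scargle periodogram (scipy.signal.lombscargle).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, size=200))       # non-uniform sampling times
y = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.normal(size=200)

freqs = np.linspace(0.01, 1.0, 2000)             # cycles per time unit
power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)  # angular freqs
print("detected frequency:", freqs[np.argmax(power)])    # close to 0.1
```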
OXYGEN 18 EXCHANGE REACTIONS OF ALDEHYDES AND KETONES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byrn, Marianne; Calvin, Melvin
1965-12-01
Using infrared spectroscopy, the equilibrium exchange times have been determined for a series of ketones, aromatic aldehydes, and β-ketoesters reacting with oxygen-18 enriched water. These exchange times have been evaluated in terms of steric and electronic considerations, and applied to a discussion of the exchange times of chlorophylls a and b and chlorophyll derivatives.
ERIC Educational Resources Information Center
Hoeppner, Bettina B.; Goodwin, Matthew S.; Velicer, Wayne F.; Heltshe, James
2007-01-01
The advent of telemetric devices that sample data extensively over time has facilitated single subject or idiographic research to intensively study a single person over time. One of the challenges of idiographic research is combining single subject results to determine generalizability across subjects. This article demonstrates the first…
NASA Astrophysics Data System (ADS)
Gálvez-Coyt, Gonzalo; Muñoz-Diosdado, Alejandro; Peralta, José; Balderas-López, José; Angulo-Brown, Fernando
2012-06-01
Higuchi's method is a procedure that, if applied appropriately, can determine in a reliable way the fractal dimension D of time series; this fractal dimension permits characterization of the degree of correlation of the series. However, when analyzing some time series with Higuchi's method, there are oscillations at the right-hand side of the graph, which can cause a mistaken determination of the fractal dimension. In this work, an appropriate explanation is given for this type of behaviour. Using the seismogram as a time series and the properties of the P and S waves, it is possible to use the properties of Higuchi's method to detect the arrival of the earthquake shaking stage some seconds in advance, approximately 30-35 s in the case of Mexico City. Thus, we propose Higuchi's method to characterize and detect the P waves in order to estimate the strength of the forthcoming S waves.
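A plain-NumPy sketch of Higuchi's method as summarized above; kmax and the test signal are illustrative choices:

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Higuchi fractal dimension of a 1-D series (minimal sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k_vals = np.arange(1, kmax + 1)
    lengths = []
    for k in k_vals:
        lk = 0.0
        for m in range(k):                     # k sub-curves, offsets m
            idx = np.arange(m, n, k)
            num = len(idx) - 1                 # intervals in this sub-curve
            # normalized curve length of the sub-curve
            lk += np.abs(np.diff(x[idx])).sum() * (n - 1) / (num * k * k)
        lengths.append(lk / k)                 # average over offsets
    # L(k) ~ k^(-D): D is the slope of log L(k) versus log(1/k)
    D, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lengths), 1)
    return D

# sanity check: white noise should give D close to 2
print(higuchi_fd(np.random.default_rng(0).normal(size=5000)))
```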
A Computer Program for the Generation of ARIMA Data
ERIC Educational Resources Information Center
Green, Samuel B.; Noles, Keith O.
1977-01-01
The autoregressive integrated moving average (ARIMA) model has been applied to time series data in psychological and educational research. A program is described that generates ARIMA data of a known order. The program enables researchers to explore statistical properties of ARIMA data and simulate systems producing time dependent observations.…
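A rough modern equivalent of such a generator, assuming statsmodels is acceptable; the AR/MA coefficients are illustrative:

```python
# Generate ARIMA(1,1,1) data of known order: simulate ARMA(1,1),
# then integrate once.
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample

ar = np.array([1, -0.7])     # AR polynomial (1 - 0.7B): phi = 0.7
ma = np.array([1, 0.4])      # MA polynomial (1 + 0.4B): theta = 0.4
rng = np.random.default_rng(42)

arma = arma_generate_sample(ar, ma, nsample=300,
                            distrvs=rng.standard_normal)
arima = np.cumsum(arma)      # d = 1: one round of integration
```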
31 CFR 344.10 - What are Special Zero Interest securities?
Code of Federal Regulations, 2014 CFR
2014-07-01
...-STATE AND LOCAL GOVERNMENT SERIES Special Zero Interest Securities § 344.10 What are Special Zero.... The provisions of subpart B of this part (Time Deposit securities) apply except as specified in... zero interest securities available after October 28, 1996, are zero interest Time Deposit securities...
31 CFR 344.10 - What are Special Zero Interest securities?
Code of Federal Regulations, 2010 CFR
2010-07-01
...-STATE AND LOCAL GOVERNMENT SERIES Special Zero Interest Securities § 344.10 What are Special Zero.... The provisions of subpart B of this part (Time Deposit securities) apply except as specified in... zero interest securities available after October 28, 1996, are zero interest Time Deposit securities...
31 CFR 344.10 - What are Special Zero Interest securities?
Code of Federal Regulations, 2013 CFR
2013-07-01
...-STATE AND LOCAL GOVERNMENT SERIES Special Zero Interest Securities § 344.10 What are Special Zero.... The provisions of subpart B of this part (Time Deposit securities) apply except as specified in... zero interest securities available after October 28, 1996, are zero interest Time Deposit securities...
New insights into soil temperature time series modeling: linear or nonlinear?
NASA Astrophysics Data System (ADS)
Bonakdari, Hossein; Moeeni, Hamid; Ebtehaj, Isa; Zeynoddin, Mohammad; Mahoammadian, Abdolmajid; Gharabaghi, Bahram
2018-03-01
Soil temperature (ST) is an important dynamic parameter, whose prediction is a major research topic in various fields including agriculture, because ST has a critical role in hydrological processes at the soil surface. In this study, a new linear methodology is proposed based on stochastic methods for modeling daily soil temperature (DST). With this approach, the ST series components are determined to carry out modeling and spectral analysis. The results of this process are compared with two linear methods based on seasonal standardization and seasonal differencing in terms of four DST series. The series used in this study were measured at two stations, Champaign and Springfield, at depths of 10 and 20 cm. The results indicate that in all ST series reviewed, the periodic term is the most robust among all components. According to a comparison of the three methods applied to analyze the various series components, it appears that spectral analysis combined with stochastic methods outperformed the seasonal standardization and seasonal differencing methods. In addition to comparing the proposed methodology with linear methods, the ST modeling results were compared with two nonlinear methods in two forms: considering hydrological variables (HV) as input variables, and DST modeling as a time series. In a previous study at the mentioned sites, Kim and Singh (Theor Appl Climatol 118:465-479, 2014) applied the popular Multilayer Perceptron (MLP) neural network and Adaptive Neuro-Fuzzy Inference System (ANFIS) nonlinear methods and considered HV as input variables. The comparison results signify that the relative error projected in estimating DST by the proposed methodology was about 6%, while this value with MLP and ANFIS was over 15%. Moreover, MLP and ANFIS models were employed for DST time series modeling. Due to these models' relatively inferior performance to the proposed methodology, two hybrid models were implemented: the weights and membership functions of MLP and ANFIS, respectively, were optimized with the particle swarm optimization (PSO) algorithm in conjunction with the wavelet transform and the nonlinear methods (Wavelet-MLP & Wavelet-ANFIS). A comparison of the proposed methodology with the individual and hybrid nonlinear models in predicting DST time series indicates that it attains the lowest Akaike Information Criterion (AIC) value, which considers model simplicity and accuracy simultaneously, at different depths and stations. The methodology presented in this study can thus serve as an excellent alternative to the complex nonlinear methods that are normally employed to examine DST.
New Insights into Signed Path Coefficient Granger Causality Analysis.
Zhang, Jian; Li, Chong; Jiang, Tianzi
2016-01-01
Granger causality analysis, a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study mainly focuses on the validity of "signed path coefficient Granger causality," a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causal effect among time series by an order-1 autoregression, and defines a positive or negative coefficient as an "excitatory" or "inhibitory" influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to illustrate that the signed path coefficient method is flawed and untenable, due to the fact that the autoregressive coefficients are not always consistent with the real causal relationships, which would inevitably lead to erroneous conclusions. Overall, our findings suggest that the applicability of this kind of causality analysis is rather limited, and researchers should be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation.
Fractal analysis of the short time series in a visibility graph method
NASA Astrophysics Data System (ADS)
Li, Ruixue; Wang, Jiang; Yu, Haitao; Deng, Bin; Wei, Xile; Chen, Yingyuan
2016-05-01
The aim of this study is to evaluate the performance of the visibility graph (VG) method on short fractal time series. In this paper, time series of fractional Brownian motion (fBm), characterized by different Hurst exponents H, are simulated and then mapped into scale-free visibility graphs, whose degree distributions show the power-law form. Maximum likelihood estimation (MLE) is applied to estimate the power-law indexes of the degree distribution, and in this process the Kolmogorov-Smirnov (KS) statistic is used to test the performance of the estimation, aiming to avoid the influence of the droop head and heavy tail in the degree distribution. As a result, we find that the MLE gives an optimal estimation of the power-law index when the KS statistic reaches its first local minimum. Based on the results from the KS statistic, the relationship between the power-law index and the Hurst exponent is reexamined and then amended to suit short time series. Thus, a method combining VG, MLE and the KS statistic is proposed to estimate Hurst exponents from short time series. Lastly, this paper also offers an example to verify the effectiveness of the combined method. In addition, the corresponding results show that the VG can provide a reliable estimation of Hurst exponents.
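A brute-force sketch of the natural visibility graph mapping described above (plain NumPy, quadratic in series length, so meant for short series only); the MLE/KS power-law fit would then be applied to the resulting degree sequence:

```python
# Natural visibility graph: nodes i and j are connected if every
# intermediate point lies strictly below the line joining
# (i, x_i) and (j, x_j). Adjacent points are always connected.
import numpy as np

def visibility_edges(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            line = x[i] + (x[j] - x[i]) * (k - i) / (j - i)
            if np.all(x[i + 1:j] < line):      # empty range -> True
                edges.append((i, j))
    return edges

def degrees(edges, n):
    deg = np.zeros(n, dtype=int)
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg

x = np.random.default_rng(0).normal(size=500).cumsum()  # Brownian-like path
deg = degrees(visibility_edges(x), len(x))
```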
Miranian, A; Abdollahzade, M
2013-02-01
Local modeling approaches, owing to their ability to model different operating regimes of nonlinear systems and processes by independent local models, seem appealing for modeling, identification, and prediction applications. In this paper, we propose a local neuro-fuzzy (LNF) approach based on the least-squares support vector machines (LSSVMs). The proposed LNF approach employs LSSVMs, which are powerful in modeling and predicting time series, as local models and uses hierarchical binary tree (HBT) learning algorithm for fast and efficient estimation of its parameters. The HBT algorithm heuristically partitions the input space into smaller subdomains by axis-orthogonal splits. In each partitioning, the validity functions automatically form a unity partition and therefore normalization side effects, e.g., reactivation, are prevented. Integration of LSSVMs into the LNF network as local models, along with the HBT learning algorithm, yield a high-performance approach for modeling and prediction of complex nonlinear time series. The proposed approach is applied to modeling and predictions of different nonlinear and chaotic real-world and hand-designed systems and time series. Analysis of the prediction results and comparisons with recent and old studies demonstrate the promising performance of the proposed LNF approach with the HBT learning algorithm for modeling and prediction of nonlinear and chaotic systems and time series.
Wang, Yi Kan; Hurley, Daniel G.; Schnell, Santiago; Print, Cristin G.; Crampin, Edmund J.
2013-01-01
We develop a new regression algorithm, cMIKANA, for inference of gene regulatory networks from combinations of steady-state and time-series gene expression data. Using simulated gene expression datasets to assess the accuracy of reconstructing gene regulatory networks, we show that steady-state and time-series data sets can successfully be combined to identify gene regulatory interactions using the new algorithm. Inferring gene networks from combined data sets was found to be advantageous when using noisy measurements collected with either lower sampling rates or a limited number of experimental replicates. We illustrate our method by applying it to a microarray gene expression dataset from human umbilical vein endothelial cells (HUVECs) which combines time series data from treatment with growth factor TNF and steady state data from siRNA knockdown treatments. Our results suggest that the combination of steady-state and time-series datasets may provide better prediction of RNA-to-RNA interactions, and may also reveal biological features that cannot be identified from dynamic or steady state information alone. Finally, we consider the experimental design of genomics experiments for gene regulatory network inference and show that network inference can be improved by incorporating steady-state measurements with time-series data. PMID:23967277
Hybrid model for forecasting time series with trend, seasonal and calendar variation patterns
NASA Astrophysics Data System (ADS)
Suhartono; Rahayu, S. P.; Prastyo, D. D.; Wijayanti, D. G. P.; Juliyanto
2017-09-01
Most monthly time series data in economics and business in Indonesia and other Muslim countries not only contain trend and seasonal patterns, but are also affected by two types of calendar variation effects, i.e. the effect of the number of working or trading days, and holiday effects. The purpose of this research is to develop a hybrid model, or a combination of several forecasting models, to predict time series that contain trend, seasonal and calendar variation patterns. This hybrid model is a combination of classical models (namely time series regression and ARIMA models) and/or modern methods (artificial intelligence, i.e. Artificial Neural Networks). A simulation study was used to show that the proposed procedure for building the hybrid model works well for forecasting time series with trend, seasonal and calendar variation patterns. Furthermore, the proposed hybrid model is applied to forecasting real data, i.e. monthly data on the inflow and outflow of currency at Bank Indonesia. The results show that the hybrid model tends to provide more accurate forecasts than the individual forecasting models. Moreover, this result is in line with one of the findings of the M3 competition, i.e. that hybrid models on average provide more accurate forecasts than individual models.
Zeng, Nianyin; Wang, Zidong; Li, Yurong; Du, Min; Cao, Jie; Liu, Xiaohui
2013-12-01
In this paper, the expectation maximization (EM) algorithm is applied to the modeling of the nano-gold immunochromatographic assay (nano-GICA) via available time series of the measured signal intensities of the test and control lines. The model for the nano-GICA is developed as the stochastic dynamic model that consists of a first-order autoregressive stochastic dynamic process and a noisy measurement. By using the EM algorithm, the model parameters, the actual signal intensities of the test and control lines, as well as the noise intensity can be identified simultaneously. Three different time series data sets concerning the target concentrations are employed to demonstrate the effectiveness of the introduced algorithm. Several indices are also proposed to evaluate the inferred models. It is shown that the model fits the data very well.
Time series analysis of ozone data in Isfahan
NASA Astrophysics Data System (ADS)
Omidvari, M.; Hassanzadeh, S.; Hosseinibalam, F.
2008-07-01
Time series analysis was used to investigate stratospheric ozone formation and decomposition processes. Different time series methods were applied to detect the reason for extremely high ozone concentrations in each season. The data were converted into a seasonal component and into the frequency domain; the latter was evaluated using Fast Fourier Transform (FFT) spectral analysis. The power density spectrum estimated from the ozone data showed peaks at cycle durations of 22, 20, 36, 186, 365 and 40 days. According to the seasonal component analysis, the largest fluctuations were in 1999 and 2000, and the smallest in 2003. The best correlation between ozone and solar radiation was found in 2000. Other variables, which were not available, caused the fluctuations in 1999 and 2001. The trend of ozone is increasing in 1999 and decreasing in the other years.
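A toy version of the FFT spectral step, on synthetic data with planted annual and 22-day cycles (the ozone series itself is not reproduced here):

```python
# Locate dominant cycles via peaks in the FFT power spectrum of a
# daily series; peak frequencies map to cycle lengths in days.
import numpy as np

rng = np.random.default_rng(3)
days = np.arange(6 * 365)
x = (np.sin(2 * np.pi * days / 365)          # annual cycle
     + 0.5 * np.sin(2 * np.pi * days / 22)   # 22-day cycle
     + rng.normal(scale=0.5, size=days.size))

power = np.abs(np.fft.rfft(x - x.mean())) ** 2
freqs = np.fft.rfftfreq(days.size, d=1.0)    # cycles per day
top = np.argsort(power)[::-1][:2]
print("dominant periods (days):", 1.0 / freqs[top])   # ~365 and ~22
```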
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
Correlates of depression in bipolar disorder
Moore, Paul J.; Little, Max A.; McSharry, Patrick E.; Goodwin, Guy M.; Geddes, John R.
2014-01-01
We analyse time series from 100 patients with bipolar disorder for correlates of depression symptoms. As the sampling interval is non-uniform, we quantify the extent of missing and irregular data using new measures of compliance and continuity. We find that uniformity of response is negatively correlated with the standard deviation of sleep ratings (ρ = –0.26, p = 0.01). To investigate the correlation structure of the time series themselves, we apply the Edelson–Krolik method for correlation estimation. We examine the correlation between depression symptoms for a subset of patients and find that self-reported measures of sleep and appetite/weight show a lower average correlation than other symptoms. Using surrogate time series as a reference dataset, we find no evidence that depression is correlated between patients, though we note a possible loss of information from sparse sampling. PMID:24352942
Fractal dynamics of heartbeat time series of young persons with metabolic syndrome
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.; Alonso-Martínez, A.; Ramírez-Hernández, L.; Martínez-Hernández, G.
2012-10-01
In recent years, many physiological systems have been quantitatively characterized using fractal analysis. We applied it to study the heart rate variability of young subjects with metabolic syndrome (MS); we examined the RR time series (time between two R waves in the ECG) with the detrended fluctuation analysis (DFA) method, Higuchi's fractal dimension method and multifractal analysis to detect the possible presence of heart problems. The results show that although the young persons have MS, the majority do not present alterations in the heart dynamics. However, there were cases where the fractal parameter values differed significantly from the values for healthy people.
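A minimal DFA sketch of the kind applied to RR series above; the scale range and linear detrending order are common defaults, not necessarily the authors' exact settings:

```python
# Detrended fluctuation analysis: the exponent alpha is the slope of
# log F(s) versus log s after local linear detrending at each scale s.
import numpy as np

def dfa(x):
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                       # integrated profile
    scales = np.unique(np.logspace(
        np.log10(4), np.log10(len(x) // 4), 20).astype(int))
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # RMS fluctuation after removing the local linear trend
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        F.append(np.sqrt(np.mean(np.square(resid))))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

# alpha ~ 0.5 for white noise, ~ 1.0 for 1/f noise, ~ 1.5 for Brownian noise
print(dfa(np.random.default_rng(0).normal(size=10000)))
```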
Cardiovascular response to acute stress in freely moving rats: time-frequency analysis.
Loncar-Turukalo, Tatjana; Bajic, Dragana; Japundzic-Zigon, Nina
2008-01-01
Spectral analysis of cardiovascular series is an important tool for assessing the features of the autonomic control of the cardiovascular system. In this experiment, Wistar rats equipped with an intraarterial catheter for blood pressure (BP) recording were exposed to stress induced by blowing air. The problem of non-stationary data was overcome by applying the Smoothed Pseudo Wigner-Ville (SPWV) time-frequency distribution. Spectral analysis was done before stress, during stress, immediately after stress and later in recovery. The spectral indices were calculated for both the systolic blood pressure (SBP) and pulse interval (PI) series. The time evolution of the spectral indices showed a perturbed sympathovagal balance.
Radon anomalies: When are they possible to be detected?
NASA Astrophysics Data System (ADS)
Passarelli, Luigi; Woith, Heiko; Seyis, Cemil; Nikkhoo, Mehdi; Donner, Reik
2017-04-01
Records of the radon noble gas in different environments like soil, air, groundwater, rock, caves, and tunnels typically display cyclic variations including diurnal (S1), semidiurnal (S2) and seasonal components. But there are also cases where these cycles are absent. Interestingly, radon emission can also be affected by transient processes, which inhibit or enhance the radon carrying process at the surface. This results in transient changes in the radon emission rate, which are superimposed on the low and high frequency cycles. The complexity of the spectral content of radon time series makes any statistical analysis aiming at understanding the physical driving processes a challenging task. In the past decades there have been several attempts to relate changes in radon emission rate to physical triggering processes such as earthquake occurrence. One of the problems in this type of investigation is to objectively detect anomalies in the radon time series. In the present work, we propose a simple and objective statistical method for detecting changes in the radon emission rate time series. The method uses non-parametric statistical tests (e.g., Kolmogorov-Smirnov) to compare empirical distributions of the radon emission rate by sequentially applying various time windows to the time series. The statistical test indicates whether two empirical distributions of data originate from the same distribution at a desired significance level. We test the algorithm on synthetic data in order to explore the sensitivity of the statistical test to the sample size. We then apply the test to six radon emission rate recordings from stations located around the Marmara Sea obtained within the MARsite project (MARsite has received funding from the European Union's Seventh Programme for research, technological development and demonstration under grant agreement No 308417). We conclude that the test performs relatively well at identifying transient changes in the radon emission rate, but the results depend strongly on the length of the time window and/or the type of frequency filtering. More importantly, when the raw time series contain cyclic components (e.g. seasonal or diurnal variations), the search for anomalies related to transients becomes meaningless. We conclude that an objective identification of transient changes can be performed only after filtering the raw time series for the physically meaningful frequency content.
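A sketch of the windowed two-sample KS scan in the spirit of the proposed method, using scipy.stats.ks_2samp; the window length and significance level are illustrative, and, as the abstract stresses, the input should already be filtered of cyclic components:

```python
# Scan a series with adjacent windows and flag times where the
# two-sample KS test rejects a common distribution.
import numpy as np
from scipy.stats import ks_2samp

def ks_scan(x, window=200, alpha=0.01):
    """Return indices where adjacent windows differ at level alpha."""
    x = np.asarray(x, dtype=float)
    hits = []
    for t in range(window, len(x) - window):
        stat, p = ks_2samp(x[t - window:t], x[t:t + window])
        if p < alpha:
            hits.append(t)
    return hits

# synthetic test: a mean shift at t = 1000 in (already de-cycled) data
rng = np.random.default_rng(7)
x = np.r_[rng.normal(0, 1, 1000), rng.normal(0.8, 1, 1000)]
print(len(ks_scan(x)))   # flags cluster near the true change point
```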
NASA Astrophysics Data System (ADS)
Olafsdottir, Kristin B.; Mudelsee, Manfred
2013-04-01
Estimation of Pearson's correlation coefficient between two time series, to evaluate the influence of one time-dependent variable on another, is one of the most often used statistical methods in climate sciences. Various methods are used to estimate a confidence interval to support the correlation point estimate. Many of them make strong mathematical assumptions regarding distributional shape and serial correlation, which are rarely met. More robust statistical methods are needed to increase the accuracy of the confidence intervals. Bootstrap confidence intervals are estimated in the Fortran 90 program PearsonT (Mudelsee, 2003), where the main intention was to obtain an accurate confidence interval for the correlation coefficient between two time series by taking into account the serial dependence of the process that generated the data. However, Monte Carlo experiments show that the coverage accuracy for smaller data sizes can be improved. Here we adapt the PearsonT program into a new version, called PearsonT3, by calibrating the confidence interval to increase the coverage accuracy. Calibration is a bootstrap resampling technique which basically performs a second bootstrap loop, or resamples from the bootstrap resamples. Like the non-calibrated bootstrap confidence intervals, it offers robustness against the data distribution. A pairwise moving block bootstrap is used to preserve the serial correlation of both time series. The calibration is applied to standard-error-based bootstrap Student's t confidence intervals. The performance of the calibrated confidence intervals is examined with Monte Carlo simulations and compared with the performance of the confidence intervals without calibration, that is, PearsonT. The coverage accuracy is evidently better for the calibrated confidence intervals, where the coverage error is acceptably small (i.e., within a few percentage points) already for data sizes as small as 20. One form of climate time series is output from numerical models which simulate the climate system. The method is applied to model data from the high resolution ocean model INALT01, where the relationship between the Agulhas Leakage and the North Brazil Current is evaluated. Preliminary results show significant correlation between the two variables when there is a 10-year lag between them, which is more or less the time it takes the Agulhas Leakage water to reach the North Brazil Current. Mudelsee, M., 2003. Estimating Pearson's correlation coefficient with bootstrap confidence interval from serially dependent time series. Mathematical Geology 35, 651-665.
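A sketch of the pairwise moving block bootstrap idea underlying PearsonT/PearsonT3; the calibration loop (the second bootstrap) and the block-length selection rule are omitted for brevity:

```python
import numpy as np

def block_bootstrap_corr_ci(x, y, block=10, n_boot=2000, seed=0):
    """95% percentile bootstrap CI for Pearson's r, pairwise block resampling."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    rng = np.random.default_rng(seed)
    n = len(x)
    n_blocks = -(-n // block)                    # ceiling division
    r = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n - block + 1, size=n_blocks)
        # the SAME block indices are used for both series ("pairwise"),
        # preserving serial and cross correlation within blocks
        idx = np.concatenate([np.arange(s, s + block) for s in starts])[:n]
        r[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    return np.percentile(r, [2.5, 97.5])
```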
A Nonlinear Dynamical Systems based Model for Stochastic Simulation of Streamflow
NASA Astrophysics Data System (ADS)
Erkyihun, S. T.; Rajagopalan, B.; Zagona, E. A.
2014-12-01
Traditional time series methods model the evolution of the underlying process as a linear or nonlinear function of the autocorrelation. These methods capture the distributional statistics but are incapable of providing insights into the dynamics of the process, the potential regimes, and predictability. This work develops a nonlinear dynamical model for the stochastic simulation of streamflows. First, a wavelet spectral analysis is employed on the flow series to isolate dominant orthogonal quasi-periodic time series components. The periodic bands are summed to form the 'signal' component of the time series, the residual being the 'noise' component. Next, the underlying nonlinear dynamics of this combined band time series is recovered. For this, the univariate time series is embedded in a d-dimensional space with an appropriate lag T to recover the state space in which the dynamics unfolds. Predictability is assessed by quantifying the divergence of trajectories in the state space with time, as Lyapunov exponents. The nonlinear dynamics, in conjunction with a k-nearest-neighbor time resampling, is used to simulate the combined band, to which the noise component is added to simulate the time series. We demonstrate this method by applying it to the data at Lees Ferry, which comprise both the paleo-reconstructed and naturalized historic annual flows spanning 1490-2010. We identify interesting dynamics of the signal in the flow series and epochal behavior of predictability. These will be of immense use for water resources planning and management.
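A minimal sketch of the state-space reconstruction step (time-delay embedding), plus the nearest-neighbour lookup a KNN resampler builds on; d, the lag, and the test signal are illustrative:

```python
# Time-delay embedding of a scalar series into a d-dimensional state
# space, followed by a k-nearest-neighbour query on the last state.
import numpy as np

def delay_embed(x, d=3, lag=1):
    n = len(x) - (d - 1) * lag
    return np.column_stack([x[i * lag:i * lag + n] for i in range(d)])

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 60, 3000)) + 0.05 * rng.normal(size=3000)
states = delay_embed(x, d=3, lag=10)

# nearest neighbours of the current (last) state, excluding itself;
# a KNN resampler would draw the next value from these neighbours' successors
dist = np.linalg.norm(states[:-1] - states[-1], axis=1)
knn = np.argsort(dist)[:5]
```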
NASA Astrophysics Data System (ADS)
Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen
2016-04-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.
Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity, and to a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
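One plausible reading of such a measure, sketched in NumPy: discretize the trace into states, estimate the Markov transition matrix, and score predictability as the entropy rate of the transitions. The binning and normalization choices here are assumptions, not the paper's exact definitions:

```python
import numpy as np

def markov_entropy(x, n_states=8):
    """Entropy rate (bits/transition) of a discretized series."""
    x = np.asarray(x, dtype=float)
    # equal-width binning into n_states discrete states
    edges = np.linspace(x.min(), x.max(), n_states + 1)[1:-1]
    states = np.digitize(x, edges)
    # transition counts
    T = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        T[a, b] += 1
    row_sums = T.sum(axis=1, keepdims=True)
    P = np.divide(T, row_sums, out=np.zeros_like(T),
                  where=row_sums > 0)           # row-stochastic matrix
    pi = T.sum(axis=1) / T.sum()                # empirical state frequencies
    H_rows = -np.sum(np.where(P > 0, P * np.log2(P), 0.0), axis=1)
    return float(np.sum(pi * H_rows))           # low value = predictable
```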
Estimating survival rates with time series of standing age‐structure data
Udevitz, Mark S.; Gogan, Peter J.
2012-01-01
It has long been recognized that age‐structure data contain useful information for assessing the status and dynamics of wildlife populations. For example, age‐specific survival rates can be estimated with just a single sample from the age distribution of a stable, stationary population. For a population that is not stable, age‐specific survival rates can be estimated using techniques such as inverse methods that combine time series of age‐structure data with other demographic data. However, estimation of survival rates using these methods typically requires numerical optimization, a relatively long time series of data, and smoothing or other constraints to provide useful estimates. We developed general models for possibly unstable populations that combine time series of age‐structure data with other demographic data to provide explicit maximum likelihood estimators of age‐specific survival rates with as few as two years of data. As an example, we applied these methods to estimate survival rates for female bison (Bison bison) in Yellowstone National Park, USA. This approach provides a simple tool for monitoring survival rates based on age‐structure data.
A time series model: First-order integer-valued autoregressive (INAR(1))
NASA Astrophysics Data System (ADS)
Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.
2017-07-01
Nonnegative integer-valued time series arise in many applications. A time series model, the first-order integer-valued autoregressive (INAR(1)) model, is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on one previous period of the process. The parameter of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows the specification of AR(1). Forecasting in INAR(1) uses the median or a Bayesian forecasting methodology. The median forecasting methodology finds the least integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology forecasts h steps ahead by generating the model parameter and the innovation-term parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 until April 2016.
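A simulation sketch of INAR(1) via binomial thinning, plus a crude Monte Carlo version of the median forecast described above; alpha and the Poisson innovation rate are illustrative:

```python
# INAR(1): X_t = alpha o X_{t-1} + e_t, where alpha o X ~ Binomial(X, alpha)
# (binomial thinning) and e_t is a Poisson innovation.
import numpy as np

def simulate_inar1(alpha=0.6, lam=2.0, n=200, seed=0):
    rng = np.random.default_rng(seed)
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1 - alpha))       # start near stationary mean
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)   # binomial thinning
        x[t] = survivors + rng.poisson(lam)         # plus innovation
    return x

counts = simulate_inar1()
# median forecast one step ahead: the least integer s with
# P(X_{t+1} <= s) >= 0.5, approximated here by Monte Carlo
rng = np.random.default_rng(1)
sim = rng.binomial(counts[-1], 0.6, 10000) + rng.poisson(2.0, 10000)
print(int(np.median(sim)))
```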
Extended AIC model based on high order moments and its application in the financial market
NASA Astrophysics Data System (ADS)
Mao, Xuegeng; Shang, Pengjian
2018-07-01
In this paper, an extended version of the traditional Akaike Information Criterion (AIC) is proposed to detect the volatility of time series by combining it with higher order moments, such as skewness and kurtosis. Since measures considering higher order moments are powerful in many respects, the properties of asymmetry and flatness can be observed. Furthermore, in order to reduce the effect of noise and other incoherent features, we combine the extended AIC algorithm with multiscale wavelet analysis, in which the newly extended AIC algorithm is applied to wavelet coefficients at several scales and the time series are reconstructed by the wavelet transform. After that, we create AIC planes to derive the relationship among AIC values using variance, skewness and kurtosis, respectively. When we test this technique on the financial market, the aim is to analyze the trend and volatility of the closing prices of stock indices and classify them. We also adopt multiscale analysis to measure the complexity of time series over a range of scales. Empirical results show that singularities of time series in the stock market can be detected via the extended AIC algorithm.
Long-Term Stability of Radio Sources in VLBI Analysis
NASA Technical Reports Server (NTRS)
Engelhardt, Gerald; Thorandt, Volkmar
2010-01-01
Positional stability of radio sources is an important requirement for modeling a single source position over the complete span of VLBI data, presently more than 20 years. The stability of radio sources can be verified by analyzing time series of radio source coordinates. One approach is a statistical test for normal distribution of the residuals to the weighted mean for each radio source component of the time series, which allows systematic phenomena in the time series to be detected. Nevertheless, an inspection of rate estimates and of weighted root-mean-square (WRMS) variations about the mean is also necessary. On the basis of the time series computed by the BKG group in the frame of the ICRF2 working group, 226 stable radio sources with an axis stability of 10 μas could be identified. They include 100 ICRF2 axes-defining sources, determined independently of the method applied by the ICRF2 working group. A further 29 stable radio sources with a source structure index of less than 3.0 could also be used to augment the 295 ICRF2 defining sources.
Asymptotic scaling properties and estimation of the generalized Hurst exponents in financial data
NASA Astrophysics Data System (ADS)
Buonocore, R. J.; Aste, T.; Di Matteo, T.
2017-04-01
We propose a method to measure the Hurst exponents of financial time series. The scaling of the absolute moments against the aggregation horizon of real financial processes, and of both uniscaling and multiscaling synthetic processes, converges asymptotically towards linearity in log-log scale. In light of this, we found it appropriate to modify the usual scaling equation via the introduction of a filter function. We devised a measurement procedure that takes the presence of the filter function into account without the need to estimate it directly. We verified that the method is unbiased within the errors by applying it to synthetic time series with known scaling properties. Finally, we show an application to empirical financial time series where we fit the measured scaling exponents via a second- or a fourth-degree polynomial, which, because of theoretical constraints, have respectively only one and two degrees of freedom. We found that on our data set there is no clear preference between the second- and fourth-degree polynomials. Moreover, the study of the filter functions of each time series shows common patterns of convergence depending on the moment degree.
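A minimal sketch of the underlying moment-scaling estimator in Python; it omits the paper's filter-function correction and simply regresses log-moments on log-horizons, so it should be read as the baseline that the authors refine.

```python
import numpy as np

def generalized_hurst(x, q=2, taus=range(1, 20)):
    """Estimate H(q) from the scaling E|x(t+tau) - x(t)|^q ~ tau^(q*H(q)).
    Plain log-log regression; no filter-function correction."""
    taus = np.asarray(list(taus))
    moments = [np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus]
    slope = np.polyfit(np.log(taus), np.log(moments), 1)[0]
    return slope / q

# Brownian motion should give H(2) close to 0.5
rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(10000))
print(generalized_hurst(bm, q=2))
```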
Dynamic correlations at different time-scales with empirical mode decomposition
NASA Astrophysics Data System (ADS)
Nava, Noemi; Di Matteo, T.; Aste, Tomaso
2018-07-01
We introduce a simple approach which combines Empirical Mode Decomposition (EMD) and Pearson's cross-correlations over rolling windows to quantify dynamic dependency at different time scales. The EMD is a tool to separate time series into implicit components which oscillate at different time-scales. We apply this decomposition to intraday time series of the following three financial indices: the S&P 500 (USA), the IPC (Mexico) and the VIX (volatility index USA), obtaining time-varying multidimensional cross-correlations at different time-scales. The correlations computed over a rolling window are compared across the three indices, across the components at different time-scales and across different time lags. We uncover a rich heterogeneity of interactions, which depends on the time-scale and has important lead-lag relations that could have practical use for portfolio management, risk estimation and investment decisions.
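The approach can be sketched in a few lines of Python, assuming the PyEMD package for the decomposition; the two synthetic series below stand in for the intraday index data used in the paper.

```python
import numpy as np
import pandas as pd
from PyEMD import EMD   # assumes the PyEMD (EMD-signal) package is installed

# Two hypothetical intraday series (stand-ins for, e.g., S&P 500 and IPC)
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)
a = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(t.size)
b = np.sin(2 * np.pi * t + 0.5) + 0.3 * rng.standard_normal(t.size)

# Decompose each series into intrinsic mode functions (IMFs)
imfs_a, imfs_b = EMD().emd(a), EMD().emd(b)

# Rolling Pearson correlation between matching IMFs (same time-scale)
window = 200
for i in range(min(len(imfs_a), len(imfs_b))):
    rho = pd.Series(imfs_a[i]).rolling(window).corr(pd.Series(imfs_b[i]))
    print(f"IMF {i}: mean rolling correlation = {rho.mean():.2f}")
```

Lagged dependencies, as studied in the paper, would be obtained by shifting one IMF against the other before computing the rolling correlation.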
Cohen, Michael X
2017-09-27
The number of simultaneously recorded electrodes in neuroscience is steadily increasing, providing new opportunities for understanding brain function, but also new challenges for appropriately dealing with the increase in dimensionality. Multivariate source separation analysis methods have been particularly effective at improving signal-to-noise ratio while reducing the dimensionality of the data and are widely used for cleaning, classifying and source-localizing multichannel neural time series data. Most source separation methods produce a spatial component (that is, a weighted combination of channels to produce one time series); here, this is extended to apply source separation to a time series, with the idea of obtaining a weighted combination of successive time points, such that the weights are optimized to satisfy some criteria. This is achieved via a two-stage source separation procedure, in which an optimal spatial filter is first constructed and then its optimal temporal basis function is computed. This second stage is achieved with a time-delay-embedding matrix, in which additional rows of a matrix are created from time-delayed versions of existing rows. The optimal spatial and temporal weights can be obtained by solving a generalized eigendecomposition of covariance matrices. The method is demonstrated in simulated data and in an empirical electroencephalogram study on theta-band activity during response conflict. Spatiotemporal source separation has several advantages, including defining empirical filters without the need to apply sinusoidal narrowband filters. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
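The key computational step, a generalized eigendecomposition of two covariance matrices, can be sketched in Python with SciPy; the data, covariance windows and embedding depth below are hypothetical stand-ins for the paper's EEG setup.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical multichannel data (channels x time); S is the covariance of
# the window of interest, R a reference (e.g., broadband/baseline) covariance
rng = np.random.default_rng(2)
data = rng.standard_normal((32, 5000))
S = np.cov(data[:, 2000:3000])
R = np.cov(data)

# Generalized eigendecomposition: find w maximizing (w'Sw) / (w'Rw)
evals, evecs = eigh(S, R)
w = evecs[:, -1]                   # spatial filter with the largest ratio
component = w @ data               # one time series: weighted channel sum

# Stage 2 (sketch): time-delay embedding of the component; a second GED on
# covariances of this matrix yields the optimal temporal basis function
delays = 20
emb = np.array([np.roll(component, d) for d in range(delays)])[:, delays:]
```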
Bivariate analysis of floods in climate impact assessments.
Brunner, Manuela Irene; Sikorska, Anna E; Seibert, Jan
2018-03-01
Climate impact studies regarding floods usually focus on peak discharges and a bivariate assessment of peak discharges and hydrograph volumes is not commonly included. A joint consideration of peak discharges and hydrograph volumes, however, is crucial when assessing flood risks for current and future climate conditions. Here, we present a methodology to develop synthetic design hydrographs for future climate conditions that jointly consider peak discharges and hydrograph volumes. First, change factors are derived based on a regional climate model and are applied to observed precipitation and temperature time series. Second, the modified time series are fed into a calibrated hydrological model to simulate runoff time series for future conditions. Third, these time series are used to construct synthetic design hydrographs. The bivariate flood frequency analysis used in the construction of synthetic design hydrographs takes into account the dependence between peak discharges and hydrograph volumes, and represents the shape of the hydrograph. The latter is modeled using a probability density function while the dependence between the design variables peak discharge and hydrograph volume is modeled using a copula. We applied this approach to a set of eight mountainous catchments in Switzerland to construct catchment-specific and season-specific design hydrographs for a control and three scenario climates. Our work demonstrates that projected climate changes have an impact not only on peak discharges but also on hydrograph volumes and on hydrograph shapes both at an annual and at a seasonal scale. These changes are not necessarily proportional which implies that climate impact assessments on future floods should consider more flood characteristics than just flood peaks. Copyright © 2017. Published by Elsevier B.V.
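A minimal sketch of the copula step in Python using only SciPy: a Gumbel copula parameter is estimated from Kendall's tau via the standard relation theta = 1/(1 - tau). The synthetic peak and volume data, the choice of the Gumbel family and the empirical marginal transforms are illustrative assumptions, not the paper's exact fitting procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical annual flood peaks (m^3/s) and hydrograph volumes (Mm^3)
rng = np.random.default_rng(3)
peaks = stats.gumbel_r.rvs(loc=100, scale=30, size=60, random_state=rng)
volumes = 0.5 * peaks + stats.norm.rvs(scale=10, size=60, random_state=rng)

# Empirical marginal transforms (pseudo-observations on (0, 1))
u = stats.rankdata(peaks) / (len(peaks) + 1)
v = stats.rankdata(volumes) / (len(volumes) + 1)

# Moment-type estimate of the Gumbel copula parameter via Kendall's tau
tau, _ = stats.kendalltau(peaks, volumes)
theta = 1.0 / (1.0 - tau)

def gumbel_copula(u, v, theta):
    """Joint non-exceedance probability C(u, v) under the Gumbel copula."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1 / theta)))

print(f"theta = {theta:.2f}, C(0.9, 0.9) = {gumbel_copula(0.9, 0.9, theta):.3f}")
```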
Holmes, E A; Bonsall, M B; Hales, S A; Mitchell, H; Renner, F; Blackwell, S E; Watson, P; Goodwin, G M; Di Simplicio, M
2016-01-01
Treatment innovation for bipolar disorder has been hampered by a lack of techniques to capture a hallmark symptom: ongoing mood instability. Mood swings persist during remission from acute mood episodes and impair daily functioning. The last significant treatment advance remains lithium (in the 1970s), which aids only a minority of patients. There is no accepted way to establish proof of concept for a new mood-stabilizing treatment. We suggest that combining insights from mood measurement with applied mathematics may provide a step change: repeated daily mood measurement (depression) over a short time frame (1 month) can create individual bipolar mood instability profiles. A time-series approach allows comparison of mood instability pre- and post-treatment. We test a new imagery-focused cognitive therapy treatment approach (MAPP; Mood Action Psychology Programme) targeting a driver of mood instability, and apply these measurement methods in a non-concurrent multiple-baseline design case series of 14 patients with bipolar disorder. Weekly mood monitoring and treatment target data improved for the whole sample combined. Time-series analyses of daily mood data, sampled remotely (mobile phone/Internet) for 28 days pre- and post-treatment, demonstrated improvements in individuals' mood stability for 11 of 14 patients. The findings thus offer preliminary support for a new imagery-focused treatment approach. They also indicate a step in treatment innovation without the requirement for trials in illness episodes or relapse prevention. Importantly, daily measurement offers a description of mood instability at the individual patient level in a clinically meaningful time frame. This costly, chronic and disabling mental illness demands innovation in both treatment approaches (whether pharmacological or psychological) and measurement tools: this work indicates that daily measurements can be used to detect improvement in individual mood stability for treatment innovation (MAPP). PMID:26812041
NASA Astrophysics Data System (ADS)
Jiang, Weiping; Deng, Liansheng; Zhou, Xiaohui; Ma, Yifang
2014-05-01
Higher-order ionospheric (HOI) corrections are proposed to become a standard part of precise GPS data analysis. In this study, we investigate in depth the impacts of HOI corrections on coordinate time series by re-processing GPS data from the Crustal Movement Observation Network of China (CMONOC). Nearly 13 years of data are used in our three processing runs: (a) run NO, without HOI corrections; (b) run IG, with both second- and third-order corrections modeled using the International Geomagnetic Reference Field 11 (IGRF11) for the magnetic field; and (c) run ID, the same as IG but with a dipole magnetic model applied. Both spectral analysis and noise analysis are adopted to investigate these effects. Results show that, for CMONOC stations, HOI corrections bring an overall improvement. After the corrections are applied, the noise amplitudes decrease, with the white noise amplitudes showing the more remarkable variation. Low-latitude sites are more strongly affected, and the impacts vary between coordinate components. An analysis of stacked periodograms shows a good match between the seasonal amplitudes and the HOI corrections, indicating that the observed variations in the coordinate time series are related to HOI effects. HOI delays partially explain the seasonal amplitudes in the coordinate time series, especially for the U component. The annual amplitudes of all components decrease for over one-half of the selected CMONOC sites, and the semi-annual amplitudes are even more strongly affected by the corrections. However, when the dipole model is used, the results are not as good as with the IGRF model: HOI delays modeled with the dipole model lead to increased noise amplitudes and can generate false periodic signals, introducing larger residuals and noise rather than effective improvements.
Mayaud, C; Wagner, T; Benischke, R; Birk, S
2014-04-16
The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low-permeability schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis, a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer to which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what was observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify the effects of allogenic recharge and aquifer properties in the results of the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data, a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behavior of the real and the synthetic system allows us to deduce that similar aquifer properties are relevant in both systems. In particular, the heterogeneity of aquifer parameters appears to be a controlling factor. Moreover, the location of the overflow connecting the sub-catchments of the two springs is found to be of primary importance regarding the occurrence of inter-catchment flow. This further supports our current understanding of an overflow zone located in the upper part of the Lurbach karst aquifer. Thus, time series analysis of single events can potentially be used to characterize the transient inter-catchment flow behavior of karst systems.
NASA Astrophysics Data System (ADS)
Zheng, Jinde; Pan, Haiyang; Cheng, Junsheng
2017-02-01
To detect incipient failures of rolling bearings in a timely manner and locate faults accurately, a novel rolling bearing fault diagnosis method is proposed based on composite multiscale fuzzy entropy (CMFE) and ensemble support vector machines (ESVMs). Fuzzy entropy (FuzzyEn), an improvement of sample entropy (SampEn), is a nonlinear method for measuring the complexity of time series. Since FuzzyEn (or SampEn) at a single scale cannot reflect complexity effectively, multiscale fuzzy entropy (MFE) is developed by computing the FuzzyEns of coarse-grained time series, which represent the system dynamics at different scales. However, MFE values are affected by the data length, especially when the data are not long enough. By combining the information of multiple coarse-grained time series at the same scale, the CMFE algorithm proposed in this paper enhances MFE, as well as FuzzyEn. Compared with MFE, CMFE obtains much more stable and consistent values for short time series as the scale factor increases. In this paper CMFE is employed to measure the complexity of vibration signals of rolling bearings and to extract the nonlinear features hidden in those signals, and the physical reasons that make CMFE suitable for rolling bearing fault diagnosis are explored. On this basis, to achieve automatic fault diagnosis, an ensemble-SVM-based multi-classifier is constructed for the intelligent classification of fault features. Finally, the proposed fault diagnosis method is applied to experimental data, and the results indicate that it can effectively distinguish different fault categories and severities of rolling bearings.
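A minimal Python sketch of the CMFE construction described above: fuzzy entropy with an exponential membership function, averaged over all coarse-grained series at a given scale. The parameter defaults (m = 2, r = 0.15 times the standard deviation) are common choices in the entropy literature, not necessarily the paper's.

```python
import numpy as np

def fuzzy_entropy(x, m=2, r=0.15, n=2):
    """Fuzzy entropy with exponential membership exp(-(d/r)^n), where d is
    the Chebyshev distance between baseline-removed embedded vectors."""
    x = np.asarray(x, float)
    r = r * np.std(x)
    def phi(dim):
        N = len(x) - dim
        emb = np.array([x[i:i + dim] for i in range(N)])
        emb = emb - emb.mean(axis=1, keepdims=True)     # remove local baseline
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        sim = np.exp(-(d / r) ** n)
        return (sim.sum() - N) / (N * (N - 1))          # exclude self-matches
    return np.log(phi(m)) - np.log(phi(m + 1))

def cmfe(x, scale, m=2, r=0.15):
    """Composite MFE: average FuzzyEn over the `scale` coarse-grained series
    starting at different offsets, which stabilizes short-data estimates."""
    ents = []
    for k in range(scale):
        n_win = (len(x) - k) // scale
        coarse = x[k:k + n_win * scale].reshape(n_win, scale).mean(axis=1)
        ents.append(fuzzy_entropy(coarse, m, r))
    return np.mean(ents)
```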
Kipiński, Lech; König, Reinhard; Sielużycki, Cezary; Kordecki, Wojciech
2011-10-01
Stationarity is a crucial yet rarely questioned assumption in the analysis of time series of magneto- (MEG) or electroencephalography (EEG). One key drawback of the commonly used tests for stationarity of encephalographic time series is the fact that conclusions on stationarity are only indirectly inferred, either from the Gaussianity of the series (e.g. the Shapiro-Wilk or Kolmogorov-Smirnov test) or from their randomness and the absence of trend using very simple time-series models (e.g. the sign and trend tests by Bendat and Piersol). We present a novel approach to the analysis of the stationarity of MEG and EEG time series by applying modern statistical methods that were specifically developed in econometrics to verify the hypothesis that a time series is stationary. We report our findings from the application of three different stationarity tests, the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test for trend or mean stationarity, the Phillips-Perron (PP) test for the presence of a unit root, and the White test for homoscedasticity, on an illustrative set of MEG data. For five stimulation sessions, we found, already for short epochs of 250 and 500 ms duration, that although the majority of the studied epochs of single MEG trials were usually mean-stationary (KPSS test and PP test), they were classified as nonstationary due to their heteroscedasticity (White test). We also observed that the presence of external auditory stimulation did not significantly affect the findings regarding the stationarity of the data. We conclude that the combination of these tests allows a refined analysis of the stationarity of MEG and EEG time series.
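All three tests are available in standard Python packages, as sketched below; the KPSS and White tests come from statsmodels, and the Phillips-Perron test is assumed to come from the arch package. The 500-sample epoch is a synthetic stand-in for a single-trial MEG segment.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import kpss
from statsmodels.stats.diagnostic import het_white
from arch.unitroot import PhillipsPerron   # assumes the `arch` package

rng = np.random.default_rng(4)
epoch = rng.standard_normal(500)           # stand-in for one MEG epoch

# KPSS: null hypothesis of mean (or trend) stationarity
kpss_stat, kpss_p, _, _ = kpss(epoch, regression="c", nlags="auto")

# Phillips-Perron: null hypothesis of a unit root (non-stationarity)
pp = PhillipsPerron(epoch)

# White test: null hypothesis of homoscedasticity, here applied to the
# demeaned epoch with polynomial time regressors as the explanatory terms
t = np.arange(len(epoch))
X = sm.add_constant(np.column_stack([t, t ** 2]))
white_stat, white_p, _, _ = het_white(epoch - epoch.mean(), X)

print(f"KPSS p={kpss_p:.3f}, PP p={pp.pvalue:.3f}, White p={white_p:.3f}")
```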
Burby, Joshua W.; Lacker, Daniel
2016-01-01
Systems as diverse as the interacting species in a community, alleles at a genetic locus, and companies in a market are characterized by competition (over resources, space, capital, etc.) and adaptation. Neutral theory, built around the hypothesis that individual performance is independent of group membership, has found utility across the disciplines of ecology, population genetics, and economics, both because of the success of the neutral hypothesis in predicting system properties and because deviations from these predictions provide information about the underlying dynamics. However, most tests of neutrality are weak, based on static system properties such as species-abundance distributions or the number of singletons in a sample. Time-series data provide a window onto a system's dynamics, and should furnish tests of the neutral hypothesis that are more powerful for detecting deviations from neutrality and more informative about the type of competitive asymmetry that drives the deviation. Here, we present a neutrality test for time-series data. We apply this test to several microbial and financial time series and find that most of these systems are not neutral. Our test isolates the covariance structure of neutral competition, thus facilitating further exploration of the nature of asymmetry in the covariance structure of competitive systems. Much as neutrality tests from population genetics that use relative abundance distributions have enabled researchers to scan entire genomes for genes under selection, we anticipate our time-series test will be useful for quick significance tests of neutrality across a range of ecological, economic, and sociological systems for which time-series data are available. Future work can use our test to categorize and compare the dynamic fingerprints of particular competitive asymmetries (frequency dependence, volatility smiles, etc.) to improve forecasting and management of complex adaptive systems. PMID:27689714
Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu
2011-01-01
The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of the available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique in an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond those of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage at http://meta.usc.edu/softs/lsa.
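The local similarity core of LSA (without replicates) can be sketched as a short dynamic program in Python: after a normal-score transform, the best positive or negative local run of products is taken over a range of delays. Significance assessment, and the eLSA extension to replicates, are omitted from this sketch.

```python
import numpy as np
from scipy import stats

def local_similarity(x, y, max_delay=3):
    """Basic (unreplicated) local similarity score: the highest positive or
    negative local alignment of two normal-score-transformed series,
    allowing a shift of up to `max_delay` time steps."""
    def nst(v):  # normal-score transform, as used in the LSA literature
        return stats.norm.ppf(stats.rankdata(v) / (len(v) + 1))
    x, y = nst(x), nst(y)
    best = 0.0
    for d in range(-max_delay, max_delay + 1):
        xs = x[max(d, 0):len(x) + min(d, 0)]
        ys = y[max(-d, 0):len(y) + min(-d, 0)]
        pos = neg = 0.0
        for p in xs * ys:
            pos = max(0.0, pos + p)      # best positive local run
            neg = max(0.0, neg - p)      # best anti-correlated local run
            best = max(best, pos, neg)
    return best / len(x)
```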
GPS Time Series and Geodynamic Implications for the Hellenic Arc Area, Greece
NASA Astrophysics Data System (ADS)
Hollenstein, Ch.; Heller, O.; Geiger, A.; Kahle, H.-G.; Veis, G.
The quantification of crustal deformation and its temporal behavior is an important contribution to earthquake hazard assessment. With GPS measurements, especially from continuously operating stations, pre-, co-, post- and interseismic movements can be recorded and monitored. We present results of a continuous GPS network which has been operated in the Hellenic Arc area, Greece, since 1995. In order to obtain coordinate time series of high precision that are representative of crustal deformation, a main goal was to eliminate effects which are not of tectonic origin. By applying different steps of improvement, non-tectonic irregularities were reduced significantly, and the precision was improved by an average of 40%. The improved time series are used to study the crustal movements in space and time. They serve as a basis for the estimation of velocities and for the visualization of the movements in terms of trajectories. Special attention is given to large earthquakes (M>6) which occurred near GPS sites during the measuring time span.
NASA Astrophysics Data System (ADS)
Donner, Reik V.; Potirakis, Stelios M.; Barbosa, Susana M.; Matos, Jose A. O.
2015-04-01
The presence or absence of long-range correlations in environmental radioactivity fluctuations has recently attracted considerable interest. Among a multiplicity of practically relevant applications, identifying and disentangling the environmental factors controlling the variable concentrations of the radioactive noble gas radon is important for estimating its effect on human health and the efficiency of possible measures for reducing the corresponding exposure. In this work, we present a critical re-assessment of a multiplicity of complementary methods that have been previously applied for evaluating the presence of long-range correlations and fractal scaling in environmental radon variations, with a particular focus on the specific properties of the underlying time series. As an illustrative case study, we subsequently re-analyze two high-frequency records of indoor radon concentrations from Coimbra, Portugal, each of which spans several months of continuous measurements at a high temporal resolution of five minutes. Our results reveal that at the study site, radon concentrations exhibit complex multi-scale dynamics with qualitatively different properties at different time-scales: (i) essentially white noise in the high-frequency part (up to time-scales of about one hour), (ii) spurious indications of a non-stationary, apparently long-range correlated process (at time-scales between hours and one day) arising from marked periodic components probably related to tidal frequencies, and (iii) low-frequency variability indicating a true long-range dependent process, which might be dominated by a response to meteorological drivers. In the presence of such multi-scale variability, common estimators of long-range memory in time series are necessarily prone to fail if applied to the raw data without previous separation of time-scales with qualitatively different dynamics. We emphasize that similar properties can be found in other types of geophysical time series (for example, tide gauge records), calling for a careful application of time series analysis tools when studying such data.
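One of the standard estimators re-assessed in such studies is detrended fluctuation analysis (DFA); a minimal sketch follows. As the abstract cautions, a single DFA exponent is only meaningful within a range of scales showing one dynamical regime, so in practice the log-log fit would be restricted per regime.

```python
import numpy as np

def dfa(x, scales):
    """First-order DFA: slope alpha of log F(s) vs log s.
    alpha ~ 0.5 for white noise, ~ 1 for 1/f noise, > 1 if nonstationary."""
    y = np.cumsum(x - np.mean(x))                   # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # detrend each segment with a linear fit, collect squared residuals
        ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segs]
        F.append(np.sqrt(np.mean(ms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(5)
print(dfa(rng.standard_normal(10000), [16, 32, 64, 128, 256, 512]))  # ~0.5
```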
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen
2016-04-01
Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In the last decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the auto-covariance matrix or on diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation; the complex time series contain the observed values in their real part and the temporal rate of variability in their imaginary part; (ii) an ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set from (i); (iii) dominant non-stationary patterns are recognized as independent complex patterns that can be used to represent the space and time amplitude and phase propagations. We present the results of CICA for simulated and real cases, e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. Forootan (PhD-2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD Thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm Forootan and Kusche (JoG-2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86 (7), 477-497, doi: 10.1007/s00190-011-0532-5
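Step (i) is readily illustrated in Python with SciPy's Hilbert transform, which returns the analytic signal whose real part is the observation and whose imaginary part is its quadrature; the subsequent complex ICA step (diagonalization of fourth-order cumulants) is not shown here.

```python
import numpy as np
from scipy.signal import hilbert

# Step (i) of CICA: build a complex series whose real part is the observation
# and whose imaginary part carries the temporal rate of variability
rng = np.random.default_rng(6)
t = np.linspace(0, 20, 2000)
obs = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.standard_normal(t.size)

analytic = hilbert(obs)                  # complex analytic signal
amplitude = np.abs(analytic)             # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))    # instantaneous phase propagation
```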
Lutaif, N.A.; Palazzo, R.; Gontijo, J.A.R.
2014-01-01
Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of a high-energy diet. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of characteristics of the time series, such as autoreference and stationarity, that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model whose concepts match those of the physiological systems involved, and applied it in male HFD rats compared with age-matched male controls on a standard food intake (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution, as a means for stochastic characterization of temperature time series registers, may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases, given its ability to detect small variations in the thermal profile. PMID:24519093
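A minimal sketch of the AR-model step in Python with statsmodels; the temperature register below is synthetic, standing in for the recorded core-temperature series.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

# Hypothetical core-temperature register sampled at regular intervals
rng = np.random.default_rng(7)
temp = 37.0 + np.cumsum(0.01 * rng.standard_normal(500))

# Fit an autoregressive model; order 2 is an illustrative choice
res = AutoReg(temp, lags=2).fit()
print(res.params)        # intercept and AR coefficients
print(res.resid.std())   # innovation scale of the fitted model
```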
A Geodetic Strain Rate Model for the Pacific-North American Plate Boundary, western United States
NASA Astrophysics Data System (ADS)
Kreemer, C.; Hammond, W. C.; Blewitt, G.; Holland, A. A.; Bennett, R. A.
2012-04-01
We present a model of crustal strain rates derived from GPS measurements of horizontal station velocities in the Pacific-North American plate boundary in the western United States. The model reflects a best estimate of present-day deformation from the San Andreas fault system in the west to the Basin and Range province in the east. Of the total 2,846 GPS velocities used in the model, 1,197 are derived by ourselves, and 1,649 are taken from (mostly) published results. The velocities derived by ourselves (the "UNR solution") are estimated from GPS position time-series of continuous and semi-continuous stations for which data are publicly available. We estimated ITRF2005 positions from 2002-2011.5 using JPL's GIPSY-OASIS II software with ambiguity resolution applied using our custom Ambizap software. Only stations with time-series that span at least 2.25 years are considered. We removed from the time-series continental-scale common-mode errors using a spatially-varying filtering technique. Velocity uncertainties (typically 0.1-0.3 mm/yr) assume that the time-series contain flicker plus white noise. We used a subset of stations on the stable parts of the Pacific and North American plates to estimate the Pacific-North American pole of rotation. This pole is applied as a boundary condition to the model, and the North American - ITRF2005 pole is used to rotate our velocities into a North America fixed reference frame. We do not include parts of the time-series that show curvature due to post-seismic deformation after major earthquakes, and we also exclude stations whose time-series display a significant unexplained non-linearity or that are near volcanic centers. Transient effects longer than the observation period (i.e., slow viscoelastic relaxation) are left in the data. We added to the UNR solution velocities from 12 other studies. These velocities are transformed onto the UNR solution's reference frame by estimating and applying a translation and rotation that minimizes the velocity differences at collocated stations. We removed obvious outliers and velocities in areas that we identified to undergo subsidence, likely due to excessive water pumping. For the strain rate calculations we excluded GPS stations with anomalous vertical motion or annual horizontal periodicity, which are indicators of local site instability. First, we used the stations from the UNR solution to create a Delaunay triangulation and estimated the horizontal strain rate components (and rigid-body rotation) for each triangle in a linear least-squares inversion using the horizontal velocities as input. Some level of spatial damping was applied to minimize unnecessary spatial variation in the model parameters. The strain rate estimates were then used as a priori strain rate variances in a method that fits continuous bi-cubic Bessel spline functions through the velocity gradient field while minimizing the weighted misfit to all velocities. A minimal level of spatial smoothing of the variances was applied. The strain rate tensor model is shown by contours of the second invariant of the tensor, which is a measure of the amplitude that is coordinate-frame independent. We also show a map of the tensor style and of the signal-to-noise ratio of the model.
Multiresolution forecasting for futures trading using wavelet decompositions.
Zhang, B L; Coggins, R; Jabri, M A; Dersch, D; Flower, B
2001-01-01
We investigate the effectiveness of a financial time-series forecasting strategy which exploits the multiresolution property of the wavelet transform. A financial series is decomposed into an overcomplete, shift-invariant, scale-related representation. In transform space, each individual wavelet series is modeled by a separate multilayer perceptron (MLP). We apply the Bayesian method of automatic relevance determination to choose short past windows (short-term history) for the inputs to the MLPs at lower scales and long past windows (long-term history) at higher scales. To form the overall forecast, the individual forecasts are then recombined by the linear reconstruction property of the inverse transform with the chosen autocorrelation shell representation, or by another perceptron which learns the weight of each scale in the prediction of the original time series. The forecast results are then passed to a money management system to generate trades.
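A rough Python sketch of the pipeline, assuming PyWavelets for a shift-invariant decomposition and scikit-learn MLPs. With pywt's SWT and a db4 wavelet, summing per-band forecasts only approximates the paper's exact reconstruction via the autocorrelation shell representation, and the window lengths below are illustrative stand-ins for the relevance-determination step.

```python
import numpy as np
import pywt                                   # PyWavelets (assumed installed)
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
series = np.cumsum(rng.standard_normal(1024))      # hypothetical price series

# Shift-invariant (stationary) wavelet decomposition into scale-related bands
coeffs = pywt.swt(series, "db4", level=3)          # [(cA3, cD3), ..., (cA1, cD1)]
bands = [coeffs[0][0]] + [d for _, d in coeffs]    # approximation + details

def windowed(x, w):
    """Lagged-window design matrix: predict x[t] from x[t-w:t]."""
    X = np.array([x[i:i + w] for i in range(len(x) - w)])
    return X, x[w:]

# Long history windows at coarse scales, short windows at fine scales
forecast = 0.0
for band, w in zip(bands, [32, 16, 8, 4]):
    X, y = windowed(band, w)
    mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    mlp.fit(X, y)
    forecast += mlp.predict(band[None, -w:])[0]    # one step ahead per scale

print("one-step-ahead forecast (approximate recombination):", forecast)
```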
MOnthly TEmperature DAtabase of Spain 1951-2010: MOTEDAS. (1) Quality control
NASA Astrophysics Data System (ADS)
Peña-Angulo, Dhais; Cortesi, Nicola; Simolo, Claudia; Stepanek, Peter; Brunetti, Michele; González-Hidalgo, José Carlos
2014-05-01
The HIDROCAES project (Impactos Hidrológicos del Calentamiento Global en España, Spanish Ministry of Research CGL2011-27574-C02-01) is focused on a high-resolution analysis of warming processes in continental Spain during 1951-2010. To this end, the Department of Geography (University of Zaragoza, Spain), the Hydrometeorological Service (Brno Division, Czech Republic) and the ISAC-CNR (Bologna, Italy) are developing the new dataset MOTEDAS (MOnthly TEmperature DAtabase of Spain), for which we present a collection of posters showing (1) the general structure of the dataset and its quality control; (2) analyses of the spatial correlation of monthly mean values of maximum (Tmax) and minimum (Tmin) temperature; (3) the series reconstruction process and the development of a high-resolution grid; (4) initial results of trend analyses of annual, seasonal and monthly mean values. MOTEDAS has been created after exhaustive analyses and quality control of the original digitized data of the Spanish National Meteorological Agency (Agencia Estatal de Meteorología, AEMET). Quality control was applied without any prior reconstruction, i.e. on the original series. From the total number of series stored in the AEMET archives (more than 4680), we selected only those with at least 10 years of data (i.e. 120 months; 3066 series) for quality control and reconstruction (see Poster MOTEDAS 3). Series were screened with internal checks (e.g. Tmax lower than Tmin, upper and lower thresholds of absolute values, etc.) and by comparison with reference series (see Poster MOTEDAS 3 on reconstruction). Data were considered anomalous when the difference between the candidate and reference series was higher than three times the interquartile distance. The total amount of monthly suspicious data recognized and discarded at the end of these analyses was 7832 values for Tmin and 8063 for Tmax; they represent less than 0.8% of the original monthly data for both variables. No spatial pattern was detected in the suspicious data; month by month, Tmin shows a detection maximum in summer months, while Tmax does not show any monthly pattern. Secondly, homogeneity analyses were performed on the series free of anomalous data using an array of tests (SNHT, Bivariate, Student's t and Pettitt) with new reference series calculated from the cleaned data. The tests were applied at monthly, seasonal and annual scales (i.e. 17 times per method). Statistical inhomogeneity detections were accepted as follows: three annual detections (monthly, seasonal, annual) had to be found by the SNHT or Bivariate test; the total number of detections by the four tests had to exceed 5% of the total possible detections per year. Before any correction we examined the candidate and reference series charts. ProClimDB and AnClim software were used throughout the process. The total number of series affected by inhomogeneities was 1013 (Tmax) and 1011 (Tmin), i.e. about one third of the original series were considered inhomogeneous; the series identified as inhomogeneous in Tmax and Tmin usually do not coincide. This apparently small number of affected series compared with previous work may arise because the mean length of the series is around 15-20 years. References: Stepánek P. 2008a. AnClim - software for time series analysis (for Windows 95/NT). Department of Geography, Faculty of Natural Sciences, MU, Brno, 1.47 B. Stepánek P. 2008b. ProClimDB - Software for Processing Climatological Datasets. CHMI, Regional Office, Brno.
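The anomaly rule quoted above, discarding values whose departure from the reference series exceeds three times the interquartile distance, can be sketched in Python as follows; centering the differences on their median is an added assumption of this sketch.

```python
import numpy as np

def flag_anomalous(candidate, reference):
    """Flag monthly values whose departure from the reference series exceeds
    three times the interquartile distance of the differences."""
    diff = candidate - reference
    q1, q3 = np.nanpercentile(diff, [25, 75])
    return np.abs(diff - np.nanmedian(diff)) > 3 * (q3 - q1)

# Hypothetical candidate/reference monthly Tmax series with one gross error
rng = np.random.default_rng(9)
ref = rng.normal(20, 5, 240)
cand = ref + rng.normal(0, 0.5, 240)
cand[100] += 12
print(np.where(flag_anomalous(cand, ref))[0])   # -> [100]
```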
Signatures of ecological processes in microbial community time series.
Faust, Karoline; Bauchinger, Franziska; Laroche, Béatrice; de Buyl, Sophie; Lahti, Leo; Washburne, Alex D; Gonze, Didier; Widder, Stefanie
2018-06-28
Growth rates, interactions between community members, stochasticity, and immigration are important drivers of microbial community dynamics. In sequencing data analysis, such as network construction and community model parameterization, we make implicit assumptions about the nature of these drivers and thereby restrict model outcome. Despite apparent risk of methodological bias, the validity of the assumptions is rarely tested, as comprehensive procedures are lacking. Here, we propose a classification scheme to determine the processes that gave rise to the observed time series and to enable better model selection. We implemented a three-step classification scheme in R that first determines whether dependence between successive time steps (temporal structure) is present in the time series and then assesses with a recently developed neutrality test whether interactions between species are required for the dynamics. If the first and second tests confirm the presence of temporal structure and interactions, then parameters for interaction models are estimated. To quantify the importance of temporal structure, we compute the noise-type profile of the community, which ranges from black in case of strong dependency to white in the absence of any dependency. We applied this scheme to simulated time series generated with the Dirichlet-multinomial (DM) distribution, Hubbell's neutral model, the generalized Lotka-Volterra model and its discrete variant (the Ricker model), and a self-organized instability model, as well as to human stool microbiota time series. The noise-type profiles for all but DM data clearly indicated distinctive structures. The neutrality test correctly classified all but DM and neutral time series as non-neutral. The procedure reliably identified time series for which interaction inference was suitable. Both tests were required, as we demonstrated that all structured time series, including those generated with the neutral model, achieved a moderate to high goodness of fit to the Ricker model. We present a fast and robust scheme to classify community structure and to assess the prevalence of interactions directly from microbial time series data. The procedure not only serves to determine ecological drivers of microbial dynamics, but also to guide selection of appropriate community models for prediction and follow-up analysis.
NASA Astrophysics Data System (ADS)
Saturnino, Diana; Langlais, Benoit; Amit, Hagay; Civet, François; Mandea, Mioara; Beucler, Éric
2018-03-01
A detailed description of the main geomagnetic field and of its temporal variations (i.e., the secular variation or SV) is crucial to understanding the geodynamo. Although the SV is known with high accuracy at ground magnetic observatory locations, the globally uneven distribution of the observatories hampers the determination of a detailed global pattern of the SV. Over the past two decades, satellites have provided global surveys of the geomagnetic field which have been used to derive global spherical harmonic (SH) models through some strict data selection schemes to minimise external field contributions. However, discrepancies remain between ground measurements and field predictions by these models; indeed the global models do not reproduce small spatial scales of the field temporal variations. To overcome this problem we propose to directly extract time series of the field and its temporal variation from satellite measurements as it is done at observatory locations. We follow a Virtual Observatory (VO) approach and define a global mesh of VOs at satellite altitude. For each VO and each given time interval we apply an Equivalent Source Dipole (ESD) technique to reduce all measurements to a unique location. Synthetic data are first used to validate the new VO-ESD approach. Then, we apply our scheme to data from the first two years of the Swarm mission. For the first time, a 2.5° resolution global mesh of VO time series is built. The VO-ESD derived time series are locally compared to ground observations as well as to satellite-based model predictions. Our approach is able to describe detailed temporal variations of the field at local scales. The VO-ESD time series are then used to derive global spherical harmonic models. For a simple SH parametrization the model describes well the secular trend of the magnetic field both at satellite altitude and at the surface. As more data will be made available, longer VO-ESD time series can be derived and consequently used to study sharp temporal variation features, such as geomagnetic jerks.
A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis
NASA Astrophysics Data System (ADS)
Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz
2018-04-01
For the first time, we introduce probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series, to estimate and remove the Common Mode Error (CME) without interpolating missing values. We used data from International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series; CME was then estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of properly filtering GNSS time series with different observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset share fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance of the time series residuals (series with the deterministic model removed), which, compared to the variances of the other PCs (less than 8%), means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the uncertainty of station velocities estimated from the filtered residuals, by 35, 28 and 69% for the North, East, and Up components, respectively. The CME series were also analyzed with respect to environmental mass loading influences on the filtering results. Subtracting environmental loading models from the GNSS residuals reduces the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
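For intuition, a complete-data sketch in Python of the classical stacking step that pPCA generalizes: the first principal component of the residual matrix approximates the CME and is removed. The paper's contribution is to replace this SVD with an EM-fitted probabilistic model so that missing epochs need not be interpolated; the network below is synthetic.

```python
import numpy as np

# Residual position time series (epochs x stations), deterministic model
# removed: a hypothetical shared regional signal plus station noise
rng = np.random.default_rng(10)
epochs, stations = 1000, 25
cme_true = np.cumsum(rng.normal(0, 0.1, epochs))
resid = cme_true[:, None] + rng.normal(0, 1.0, (epochs, stations))

# Complete-data PCA stacking: the 1st PC approximates the CME
U, s, Vt = np.linalg.svd(resid - resid.mean(0), full_matrices=False)
pc1 = U[:, 0] * s[0]                       # common-mode temporal signal
spatial = Vt[0]                            # station responses
filtered = resid - np.outer(pc1, spatial)  # CME-filtered residuals

print(f"1st PC explains {s[0]**2 / np.sum(s**2):.0%} of the residual variance")
```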
IDSP- INTERACTIVE DIGITAL SIGNAL PROCESSOR
NASA Technical Reports Server (NTRS)
Mish, W. H.
1994-01-01
The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on the various algorithms commonly used for digital signal analysis work. The processing of a digital time series to extract information is usually achieved by the application of a number of fairly standard operations. However, it is often desirable to "experiment" with various operations and combinations of operations to explore their effect on the results. IDSP is designed to provide an interactive and easy-to-use system for this type of digital time series analysis. The IDSP operators can be applied in any sensible order (even recursively), and can be applied to single time series or to simultaneous time series. IDSP is being used extensively to process data obtained from scientific instruments onboard spacecraft. It is also an excellent teaching tool for demonstrating the application of time series operators to artificially-generated signals. IDSP currently includes over 43 standard operators. Processing operators provide for Fourier transformation operations, design and application of digital filters, and Eigenvalue analysis. Additional support operators provide for data editing, display of information, graphical output, and batch operation. User-developed operators can be easily interfaced with the system to provide for expansion and experimentation. Each operator application generates one or more output files from an input file. The processing of a file can involve many operators in a complex application. IDSP maintains historical information as an integral part of each file so that the user can display the operator history of the file at any time during an interactive analysis. IDSP is written in VAX FORTRAN 77 for interactive or batch execution and has been implemented on a DEC VAX-11/780 operating under VMS. The IDSP system generates graphics output for a variety of graphics systems. The program requires the use of Versaplot and Template plotting routines and IMSL Math/Library routines. These software packages are not included in IDSP. The virtual memory requirement for the program is approximately 2.36 MB. The IDSP system was developed in 1982 and was last updated in 1986. Versaplot is a registered trademark of Versatec Inc. Template is a registered trademark of Template Graphics Software Inc. IMSL Math/Library is a registered trademark of IMSL Inc.
Time series modeling by a regression approach based on a latent process.
Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice
2009-01-01
Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method performed by a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.
NASA Technical Reports Server (NTRS)
Remsberg, Ellis E.
2009-01-01
Fourteen-year time series of mesospheric and upper stratospheric temperatures from the Halogen Occultation Experiment (HALOE) are analyzed and reported. The data have been binned according to ten-degree-wide latitude zones from 40S to 40N and at 10 altitudes from 43 to 80 km, a total of 90 separate time series. Multiple linear regression (MLR) analysis techniques have been applied to those time series. This study focuses on resolving their 11-yr solar cycle (or SC-like) responses and their linear trend terms. Findings for T(z) from HALOE are compared directly with published results from ground-based Rayleigh lidar and rocketsonde measurements. SC-like responses from HALOE compare well with those from lidar station data at low latitudes. The cooling trends from HALOE also agree reasonably well with those from the lidar data for the concurrent decade. Cooling trends of the lower mesosphere from HALOE are not as large as those from rocketsondes and from lidar station time series of the previous two decades, presumably because the changes in upper stratospheric ozone were near zero during the HALOE time period and did not affect those trends.
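A minimal sketch of the MLR step in Python: the design matrix carries a constant, a linear trend, an SC proxy and annual harmonics, and the coefficients follow from least squares. The series and proxy below are synthetic; the HALOE analyses would use the measured temperature anomalies and an observed solar index.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 14 * 12                                   # fourteen years, monthly
t = np.arange(N) / 12.0                       # time in years
solar = np.sin(2 * np.pi * t / 11.0)          # stand-in 11-yr SC proxy
y = -0.1 * t + 0.5 * solar + 0.3 * rng.standard_normal(N)

X = np.column_stack([
    np.ones(N), t, solar,
    np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),   # annual cycle terms
])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"trend = {coef[1]:.3f} K/yr, SC response = {coef[2]:.3f} K")
```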
Borehole Volumetric Strainmeters Detect Very Long-period Ocean Level Changes in Tokai Area
NASA Astrophysics Data System (ADS)
Takanami, T.; Linde, A. T.; Sacks, S. I.; Kitagawa, G.; Hirata, N.; Rydelek, P. A.
2015-12-01
We detected a clear very long-period strain signal with a predominant period of about 2 months in the data from Sacks-Evertson borehole volumetric strainmeters, which have been operated by the Japan Meteorological Agency (JMA) since 1976 in the Tokai area, Japan, the area of an expected Tokai earthquake. The Earth's surface is always influenced by natural forces such as the earth tide, air pressure, and precipitation, as well as by human-induced sources. In order to decompose the records into their components by maximum likelihood estimation, state-space modeling (Takanami et al., 2013) is applied to the observed time series data for 15 months before and after the M6.5 earthquake that occurred on 11 August 2009 in Suruga Bay. In the analysis, the strain data are decomposed into trend, air pressure, earth tide and precipitation effects and observation noise. Clear long-period strain signals are seen in the normalized trend component time series. Time series data from JMA tide gauges around Suruga Bay are similarly decomposed. Spectral analyses are then applied to the trend components for the same time interval. Comparison of the amplitude peaks in the spectra of both data sets shows that all have a peak at a period of about 1464 hours. Thus, strain changes may be influenced by very long-period ocean level changes; it is necessary to consider this possibility before attributing tectonic significance to such variations.
Refined generalized multiscale entropy analysis for physiological signals
NASA Astrophysics Data System (ADS)
Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian
2018-01-01
Multiscale entropy analysis has become a prevalent complexity measurement and has been successfully applied in various fields. However, it only takes into account the information of mean values (the first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which uses higher moments to coarse-grain a time series, was subsequently proposed, and the variance-based variant MSEσ2 has been implemented. However, MSEσ2 may yield an imprecise or undefined entropy estimate, and its statistical reliability decreases as the scale factor increases. To address this, we developed the refined model RMSEσ2 to improve MSEσ2. Simulations on both white noise and 1/f noise show that RMSEσ2 provides higher entropy reliability and reduces the occurrence of undefined entropy, making it especially suitable for short time series. We also discuss how outliers, data loss, and other signal-processing issues affect RMSEσ2 analysis. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure, and patients with atrial fibrillation, respectively, compared to several popular complexity metrics. The results demonstrate that RMSEσ2-measured complexity (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
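A minimal sketch of the variance-based coarse-graining that underlies MSEσ2 (the paper's refinement is not reproduced; parameters are illustrative):

```python
import numpy as np

def coarse_grain_variance(x, scale):
    """Coarse-grain by the second moment: each coarse point is the variance
    of one non-overlapping window of length `scale` (the MSE-sigma^2 idea)."""
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).var(axis=1, ddof=1)

x = np.random.default_rng(0).standard_normal(10_000)   # white-noise example
for s in (2, 5, 20):
    print(s, coarse_grain_variance(x, s).mean())
```

Sample entropy would then be computed on each coarse-grained series as a function of the scale factor.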
Kwasniok, Frank
2013-11-01
A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. Parameter uncertainty is fully and systematically accounted for. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to the prediction of Arctic sea-ice extent.
NASA Astrophysics Data System (ADS)
Kryanev, A. V.; Ivanov, V. V.; Romanova, A. O.; Sevastyanov, L. A.; Udumyan, D. K.
2018-03-01
This paper considers the problem of separating the trend and the chaotic component of a chaotic time series in the absence of information on the characteristics of the chaotic component. Such a problem arises in nuclear physics, biomedicine, and many other applied fields. The scheme has two stages. At the first stage, smoothing linear splines with different values of the smoothing parameter are used to separate the "trend component." At the second stage, the method of least squares is used to find the unknown variance σ2 of the noise component.
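A minimal sketch of the two-stage scheme under simplifying assumptions (synthetic data, scipy's linear smoothing spline; the authors' exact procedure is not reproduced):

```python
# Stage 1: extract a trend with a smoothing linear spline.
# Stage 2: estimate the noise variance sigma^2 from the residuals.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 500)
x = np.sin(t) + 0.3 * rng.standard_normal(t.size)      # trend + noise-like part

for s in (10.0, 45.0, 200.0):                          # smoothing parameter values
    trend = UnivariateSpline(t, x, k=1, s=s)(t)        # stage 1: linear spline trend
    resid = x - trend
    sigma2 = (resid @ resid) / resid.size              # stage 2: LS variance estimate
    print(f"s={s:6.1f}  sigma^2 ~ {sigma2:.4f}")
```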
Turbulence Time Series Data Hole Filling using Karhunen-Loeve and ARIMA methods
2007-01-01
Only fragments of this report's text survive extraction (arXiv:physics/0701238v1, 22 Jan 2007). The recoverable content indicates that long memory is represented by higher values of the fractional differencing parameter d, and that an ARIMA(0,d,0) model was applied to predict the behaviour of the final section of the data; this simplified ARIMA(0,d,0) model performed better than the linear interpolant but less effectively than the Karhunen-Loève (KL) algorithm.
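The surviving text pairs the Karhunen-Loève approach with an ARIMA(0,d,0) (fractionally integrated noise) model. As a hedged aside, the core of such a model is the fractional differencing filter, whose binomial weights can be generated recursively (a standard construction, not taken from this report; the truncation length is arbitrary):

```python
import numpy as np

def frac_diff(x, d, n_weights=100):
    """Apply (1 - B)^d via truncated binomial-expansion weights."""
    w = [1.0]
    for k in range(1, n_weights):
        w.append(-w[-1] * (d - k + 1) / k)   # w_k = -w_{k-1} (d - k + 1) / k
    w = np.asarray(w)
    return np.convolve(x, w, mode="full")[: len(x)]   # causal filter output

x = np.cumsum(np.random.default_rng(3).standard_normal(500))
print(frac_diff(x, d=0.4)[:5])
```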
Kernel canonical-correlation Granger causality for multiple time series
NASA Astrophysics Data System (ADS)
Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu
2011-04-01
Canonical-correlation analysis, as a multivariate statistical technique, has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It offers clear advantages over the traditional vector autoregressive method, owing to a simplified procedure for detecting causal interactions between multiple time series and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inferences about directed interaction. Its feasibility and effectiveness are verified on simulated data.
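As an illustration of the linear canonical-correlation idea only (the paper's kernel extension and its precise causality index are not reproduced; lags, data, and the summary statistic below are illustrative), directed influence X → Y can be scored from the canonical correlations between the past of X and the present of Y:

```python
import numpy as np
from scipy.linalg import svd, cholesky, solve_triangular

def cca_corrs(A, B):
    """Canonical correlations between column spaces of A and B (whitening form)."""
    A = A - A.mean(0); B = B - B.mean(0)
    Ra = cholesky(A.T @ A / len(A) + 1e-8 * np.eye(A.shape[1]))
    Rb = cholesky(B.T @ B / len(B) + 1e-8 * np.eye(B.shape[1]))
    C = solve_triangular(Ra, (A.T @ B) / len(A), trans='T')   # Ra^{-T} C_ab
    C = solve_triangular(Rb, C.T, trans='T').T                # ... Rb^{-1}
    return svd(C, compute_uv=False)

rng = np.random.default_rng(4)
x = rng.standard_normal(2000)
y = np.roll(x, 1) * 0.8 + 0.5 * rng.standard_normal(2000)    # X drives Y at lag 1

L = 3                                                        # number of lags
past_x = np.column_stack([np.roll(x, k) for k in range(1, L + 1)])[L:-1]
now_y = y[L:-1, None]
rho = cca_corrs(past_x, now_y)
print("GC index X->Y:", -np.log(1 - rho ** 2).sum())         # large when X drives Y
```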
Detecting the sampling rate through observations
NASA Astrophysics Data System (ADS)
Shoji, Isao
2018-09-01
This paper proposes a method to detect the sampling rate of discrete time series of diffusion processes. Using the maximum likelihood estimates of the parameters of a diffusion process, we establish a criterion based on the Kullback-Leibler divergence and thereby estimate the sampling rate. Simulation studies are conducted to check whether the method can detect the sampling rates from data, and their results show good performance in the detection. In addition, the method is applied to a financial time series sampled on a daily basis; the detected sampling rate differs from the conventional rate.
NASA Astrophysics Data System (ADS)
Chen, Wei-Shing
2011-04-01
The aim of the article is to answer the question of whether the Taiwan unemployment rate dynamics are generated by a nonlinear deterministic dynamic process. This paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns of the unemployment rate of Taiwan. The case study uses the time series data of Taiwan's unemployment rate during the period from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of the unemployment transition in Taiwan.
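A minimal recurrence-plot sketch (embedding and threshold chosen arbitrarily; recurrence quantification analysis would then compute measures such as determinism and laminarity from the matrix):

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=1, eps=0.2):
    # time-delay embedding of the scalar series
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (d < eps).astype(int)               # R[i, j] = 1 if states i, j recur

x = np.sin(np.linspace(0, 20 * np.pi, 400))    # illustrative periodic signal
R = recurrence_matrix(x)
print("recurrence rate:", R.mean())            # the simplest RQA measure
```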
Ex-Vivo Cow Skin Viscoelastic Effect for Tribological Aspects in Endoprosthesis
NASA Astrophysics Data System (ADS)
Subhi, K. A.; Tudor, A.; Hussein, E. K.; Wahad, H.; Chisiu, G.
2018-01-01
The viscoelastic behavior of ex-vivo cow skin was experimentally studied by applying load with indenters of different shapes (circular, square, and triangular, all of the same contact area) for different durations (10 s, 30 s, and 60 s). The viscoelastic tests were carried out using a UMT-series instrument (UMT-II, CETR Corporation). The experimental results collected at different operating conditions showed that the cow skin has a higher reaction against the triangular indenter compared to the other shapes. The hysteresis of the cow skin was lower at short loading times and increased as the loading time increased.
Snedden, Gregg A.; Swenson, Erick M.
2012-01-01
Hourly time-series salinity and water-level data are collected at all stations within the Coastwide Reference Monitoring System (CRMS) network across coastal Louisiana. These data, in addition to vegetation and soils data collected as part of CRMS, are used to develop a suite of metrics and indices to assess wetland condition in coastal Louisiana. This document addresses the primary objectives of the CRMS hydrologic analytical team, which were to (1) adopt standard time-series analytical techniques that could effectively assess spatial and temporal variability in hydrologic characteristics across the Louisiana coastal zone on site, project, basin, and coastwide scales and (2) develop and apply an index based on wetland hydrology that can describe the suitability of local hydrology in the context of maximizing the productivity of wetland plant communities. Approaches to quantifying tidal variability (least squares harmonic analysis) and partitioning variability of time-series data to various time scales (spectral analysis) are presented. The relation between marsh elevation and the tidal frame of a given hydrograph is described. A hydrologic index that integrates water-level and salinity data, which are collected hourly, with vegetation data that are collected annually is developed. To demonstrate its utility, the hydrologic index is applied to 173 CRMS sites across the coast, and variability in index scores across marsh vegetation types (fresh, intermediate, brackish, and saline) is assessed. The index is also applied to 11 sites located in three Coastal Wetlands Planning, Protection and Restoration Act projects, and the ability of the index to convey temporal hydrologic variability in response to climatic stressors and restoration measures, as well as the effect that this community may have on wetland plant productivity, is illustrated.
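A minimal sketch of the least-squares harmonic analysis step, assuming only two tidal constituents (a real CRMS analysis would use a fuller constituent set and observed water levels):

```python
# Fit sin/cos pairs at known tidal frequencies (M2 and K1, cycles per hour)
# to an hourly water-level record; data below are synthetic.
import numpy as np

t = np.arange(24 * 60, dtype=float)                    # 60 days, hourly
freqs = [1 / 12.4206, 1 / 23.9345]                     # M2, K1 frequencies
h = 0.5 * np.sin(2 * np.pi * freqs[0] * t) + 0.2 * np.cos(2 * np.pi * freqs[1] * t)

cols = [np.ones_like(t)]
for f in freqs:
    cols += [np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)]
X = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(X, h, rcond=None)
for name, (a, b) in zip(["M2", "K1"], zip(coef[1::2], coef[2::2])):
    print(name, "amplitude:", np.hypot(a, b).round(3))
```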
NASA Astrophysics Data System (ADS)
Zhao, W.; Amelung, F.; Dixon, T. H.; Wdowinski, S.
2012-12-01
Synthetic aperture radar interferometry (InSAR) time series analysis is applied over Vatnajokull, Iceland, using 15 years of ERS data. Ice loss at Vatnajokull has accelerated since the late 1990s, especially after 2000. A clear uplift signal due to ice mass loss is detected. The rebound signal is generally linear and increases slightly after 2000. The relative annual velocity (with GPS station 7485 as reference) is about 12 mm/yr at the ice cap edge, which matches previous studies using GPS. The standard deviation compared to 11 GPS stations in this area is about 2 mm/yr. A relative-value modeling method ignoring the effect of viscous flow is chosen, assuming an elastic half-space earth. The final ice loss estimate of 83 cm/yr matches the climatology model combined with ground observations. Small Baseline Subsets (SBAS) is applied for the time series analysis. Orbit error coupled with the long-wavelength phase trend due to horizontal plate motion is removed using a second-order polynomial model. For simplicity, we do not consider atmospheric delay in this area, because there is no complex topography and small-scale turbulence is largely averaged out in the long-term mean when calculating the annual mean velocity. Some unwrapping error still exists because of low coherence. Other uncertainties include the basic assumption of the ice loss pattern and the spatial variation of the elastic parameters. This is the first time InSAR time series analysis has been applied to an ice mass balance study with detailed error and uncertainty analysis. The success of this application demonstrates InSAR as an option for mass balance studies and is also important for the validation of different ice loss estimation techniques.
Trend analysis of air temperature and precipitation time series over Greece: 1955-2010
NASA Astrophysics Data System (ADS)
Marougianni, G.; Melas, D.; Kioutsioukis, I.; Feidas, H.; Zanis, P.; Anandranistakis, E.
2012-04-01
In this study, a database of air temperature and precipitation time series from the network of the Hellenic National Meteorological Service has been developed in the framework of the project GEOCLIMA, co-financed by the European Union and Greek national funds through the Operational Program "Competitiveness and Entrepreneurship" of the Research Funding Program COOPERATION 2009. Initially, a quality test was applied to the raw data, and missing observations were then imputed with a regularized spatial-temporal expectation-maximization algorithm to complete the climatic record. Next, a quantile-matching algorithm was applied in order to verify the homogeneity of the data. The processed time series were used for the calculation of temporal annual and seasonal trends of air temperature and precipitation. Monthly maximum and minimum surface air temperature and precipitation means at all available stations in Greece were analyzed for temporal trends and spatial variation patterns for the longest common time period of homogeneous data (1955-2010), applying the Mann-Kendall test. The majority of the examined stations showed a significant increase in the summer maximum and minimum temperatures; this could possibly be physically linked to the Etesian winds, because of the less frequent expansion of the low over the southeastern Mediterranean. Summer minimum temperatures have been increasing at a faster rate than summer maximum temperatures, reflecting an asymmetric change of the extreme temperature distributions. Total annual precipitation has decreased significantly at the stations located in western Greece, as well as in the southeast, while the remaining areas exhibit a non-significant negative trend. This reduction is very likely linked to the positive phase of the NAO, which resulted in an increase in the frequency and persistence of anticyclones over the Mediterranean.
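A minimal sketch of the Mann-Kendall trend test applied to an annual series (no tie or autocorrelation corrections, which a production analysis of station data would include):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0       # no-ties variance formula
    z = (s - np.sign(s)) / np.sqrt(var_s)          # continuity-corrected z-score
    return z, 2 * norm.sf(abs(z))                  # z and two-sided p-value

# 56 synthetic "annual" values (e.g., 1955-2010) with a weak positive trend
temps = np.arange(56) * 0.02 + np.random.default_rng(5).standard_normal(56)
print(mann_kendall(temps))
```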
Effects of Helicity on Lagrangian and Eulerian Time Correlations in Turbulence
NASA Technical Reports Server (NTRS)
Rubinstein, Robert; Zhou, Ye
1998-01-01
Taylor series expansions of turbulent time correlation functions are applied to show that helicity influences Eulerian time correlations more strongly than Lagrangian time correlations: to second order in time, the helicity effect on Lagrangian time correlations vanishes, but the helicity effect on Eulerian time correlations is nonzero. Fourier analysis shows that the helicity effect on Eulerian time correlations is confined to the largest inertial range scales. Some implications for sound radiation by swirling flows are discussed.
A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.
Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L
2016-10-01
Entropy is an effective tool for the investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses confirming that the observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with data produced using surrogate methods. While such methods exist for continuous movement, no appropriate method had been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying the fine-scale dynamics of the observed signal while maintaining its macro structural characteristics. Comparison of entropy estimates indicated that observed signals had greater regularity than surrogates and were the result not only of stochastic but also of deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.
A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.
Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas
2018-02-23
We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results on some canonical dynamical models (stochastic processes with and without correlations, chaotic processes), complemented by extensive numerical simulations for a range of processes which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state-of-the-art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines, including cases in neurobiology, finance and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics, and a graph-theoretic extraction procedure allows these features to be used for statistical learning purposes.
Moody, George B; Mark, Roger G; Goldberger, Ary L
2011-01-01
PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
Predicting Flavonoid UGT Regioselectivity
Jackson, Rhydon; Knisley, Debra; McIntosh, Cecilia; Pfeiffer, Phillip
2011-01-01
Machine learning was applied to a challenging and biologically significant protein classification problem: the prediction of flavonoid UGT acceptor regioselectivity from primary sequence. Novel indices characterizing graphical models of residues were proposed and found to be widely distributed among existing amino acid indices and to cluster residues appropriately. UGT subsequences biochemically linked to regioselectivity were modeled as sets of index sequences. Several learning techniques incorporating these UGT models were compared with classifications based on standard sequence alignment scores. These techniques included an application of time series distance functions to protein classification. Time series distances defined on the index sequences were used in nearest neighbor and support vector machine classifiers. Additionally, Bayesian neural network classifiers were applied to the index sequences. The experiments identified improvements over the nearest neighbor and support vector machine classifications relying on standard alignment similarity scores, as well as strong correlations between specific subsequences and regioselectivities. PMID:21747849
Dai, Wei; Fu, Caroline; Khant, Htet A; Ludtke, Steven J; Schmid, Michael F; Chiu, Wah
2014-11-01
Advances in electron cryotomography have provided new opportunities to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase-contrast optics produces images with markedly increased contrast compared with images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods for obtaining 3D structures of cyanophage assembly intermediates in the host by subtomogram alignment, classification and averaging. Acquiring three or four tomographic tilt series takes ∼12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. The time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L.; Vogel, R. M.
2015-12-01
Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
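For the Poisson-GP model, the Generalized Pareto hazard has a particularly simple closed form, h(x) = f(x)/(1 - F(x)) = 1/(σ + ξx); a small sketch with illustrative parameter values:

```python
import numpy as np

def gp_hazard(x, sigma, xi):
    """Hazard of the Generalized Pareto distribution: h(x) = 1 / (sigma + xi*x),
    valid on the support where 1 + xi * x / sigma > 0."""
    return 1.0 / (sigma + xi * x)

x = np.linspace(0, 10, 5)
print(gp_hazard(x, sigma=2.0, xi=0.3))     # decreasing hazard for xi > 0
print(gp_hazard(x, sigma=2.0, xi=-0.1))    # increasing hazard for xi < 0
```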
Anguera, A; Barreiro, J M; Lara, J A; Lizcano, D
2016-01-01
One of the major challenges in the medical domain today is how to exploit the huge amount of data that this field generates. To do this, approaches are required that are capable of discovering knowledge that is useful for decision making in the medical field. Time series are data types that are common in the medical domain and require specialized analysis techniques and tools, especially if the information of interest to specialists is concentrated within particular time series regions, known as events. This research followed the steps specified by the so-called knowledge discovery in databases (KDD) process to discover knowledge from medical time series derived from stabilometric (396 series) and electroencephalographic (200 series) patient electronic health records (EHR). The view offered in the paper is based on the experience gathered as part of the VIIP project. Knowledge discovery in medical time series has a number of difficulties and implications that are highlighted by illustrating the application of several techniques that cover the entire KDD process through two case studies. This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques.
31 CFR 359.15 - When is the composite rate applied to Series I savings bonds?
Code of Federal Regulations, 2010 CFR
2010-07-01
Money and Finance: Treasury; Offering of United States Savings Bonds, Series I; General Information. § 359.15 When is the composite rate applied to Series I savings bonds? The most recently announced composite rate applies to a bond during its...
31 CFR 359.15 - When is the composite rate applied to Series I savings bonds?
Code of Federal Regulations, 2013 CFR
2013-07-01
Money and Finance: Treasury; Offering of United States Savings Bonds, Series I; General Information. § 359.15 When is the composite rate applied to Series I savings bonds? The most recently announced composite rate applies to a bond during its...
31 CFR 359.15 - When is the composite rate applied to Series I savings bonds?
Code of Federal Regulations, 2011 CFR
2011-07-01
Money and Finance: Treasury; Offering of United States Savings Bonds, Series I; General Information. § 359.15 When is the composite rate applied to Series I savings bonds? The most recently announced composite rate applies to a bond during its...
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
Li, Qiongge; Chan, Maria F
2017-01-01
Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural networks (ANNs) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5-year daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has more advantages over ARMA techniques for accurate and effective applicability in the dosimetry and QA field. © 2016 New York Academy of Sciences.
Fontes, Cristiano Hora; Budman, Hector
2017-11-01
A clustering problem involving multivariate time series (MTS) requires the selection of similarity metrics. This paper shows the limitations of the PCA similarity factor (SPCA) as a single metric in nonlinear problems where there are differences in magnitude of the same process variables due to expected changes in operation conditions. A novel method for clustering MTS based on a combination between SPCA and the average-based Euclidean distance (AED) within a fuzzy clustering approach is proposed. Case studies involving either simulated or real industrial data collected from a large scale gas turbine are used to illustrate that the hybrid approach enhances the ability to recognize normal and fault operating patterns. This paper also proposes an oversampling procedure to create synthetic multivariate time series that can be useful in commonly occurring situations involving unbalanced data sets. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
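A minimal sketch of the PCA similarity factor between two multivariate time series (a standard construction retaining k principal components; the paper's fuzzy combination with the average-based Euclidean distance is not reproduced):

```python
import numpy as np

def s_pca(X, Y, k=2):
    """PCA similarity factor: mean squared cosine of the angles between the
    k-dimensional principal subspaces of two multivariate series."""
    def loadings(Z):
        Z = Z - Z.mean(0)
        _, _, Vt = np.linalg.svd(Z, full_matrices=False)
        return Vt[:k].T                        # (n_vars, k) subspace basis
    L, M = loadings(X), loadings(Y)
    return np.sum((L.T @ M) ** 2) / k

rng = np.random.default_rng(6)
X = rng.standard_normal((500, 4))
print("self-similarity:", s_pca(X, X))         # equals 1 by construction
print("vs independent noise:", s_pca(X, rng.standard_normal((500, 4))))
```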
A comment on measuring the Hurst exponent of financial time series
NASA Astrophysics Data System (ADS)
Couillard, Michel; Davison, Matt
2005-03-01
A fundamental hypothesis of quantitative finance is that stock price variations are independent and can be modeled using Brownian motion. In recent years, it was proposed to use rescaled range analysis and its characteristic value, the Hurst exponent, to test for independence in financial time series. Theoretically, independent time series should be characterized by a Hurst exponent of 1/2. However, finite Brownian motion data sets will always give a value of the Hurst exponent larger than 1/2 and without an appropriate statistical test such a value can mistakenly be interpreted as evidence of long term memory. We obtain a more precise statistical significance test for the Hurst exponent and apply it to real financial data sets. Our empirical analysis shows no long-term memory in some financial returns, suggesting that Brownian motion cannot be rejected as a model for price dynamics.
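A minimal rescaled-range sketch for estimating the Hurst exponent (the finite-sample significance correction that the paper derives is not included):

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    n = len(x)
    sizes = np.unique(np.logspace(np.log10(min_chunk), np.log10(n // 2), 10).astype(int))
    rs = []
    for m in sizes:
        chunks = x[: (n // m) * m].reshape(-1, m)
        z = np.cumsum(chunks - chunks.mean(1, keepdims=True), axis=1)
        r = z.max(1) - z.min(1)                # range of cumulative deviations
        s = chunks.std(1, ddof=1)              # chunk standard deviations
        rs.append(np.mean(r / s))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope                               # Hurst exponent estimate

x = np.random.default_rng(7).standard_normal(4096)
print("H for white noise ~", round(hurst_rs(x), 3))
```

The printed value for white noise will typically exceed 0.5, illustrating the finite-sample bias the paper addresses.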
NASA Astrophysics Data System (ADS)
McKinney, B. A.; Crowe, J. E., Jr.; Voss, H. U.; Crooke, P. S.; Barney, N.; Moore, J. H.
2006-02-01
We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual’s response to the smallpox vaccine.
Simulation of Ground Winds Time Series for the NASA Crew Launch Vehicle (CLV)
NASA Technical Reports Server (NTRS)
Adelfang, Stanley I.
2008-01-01
Simulation of wind time series based on power spectrum density (PSD) and spectral coherence models for ground wind turbulence is described. The wind models, originally developed for the Shuttle program, are based on wind measurements at the NASA 150-m meteorological tower at Cape Canaveral, FL. The current application is for the design and/or protection of the CLV from wind effects during on-pad exposure during periods from as long as days prior to launch, to seconds or minutes just prior to launch and seconds after launch. The evaluation of vehicle response to wind will influence the design and operation of constraint systems for support of the on-pad vehicle. Longitudinal and lateral wind component time series are simulated at critical vehicle locations. The PSD model for wind turbulence is a function of mean wind speed, elevation and temporal frequency. Integration of the PSD equation over a selected frequency range yields the variance of the time series to be simulated. The square root of the PSD defines a low-pass filter that is applied to adjust the components of the Fast Fourier Transform (FFT) of Gaussian white noise. The first simulated time series near the top of the launch vehicle is the inverse transform of the adjusted FFT. Simulation of the wind component time series at the nearest adjacent location (and all other succeeding next nearest locations) is based on a model for the coherence between winds at two locations as a function of frequency and separation distance, where the adjacent locations are separated vertically and/or horizontally. The coherence function is used to calculate a coherence weighted FFT of the wind at the next nearest location, given the FFT of the simulated time series at the previous location and the essentially incoherent FFT of the wind at the selected location derived a priori from the PSD model. The simulated time series at each adjacent location is the inverse Fourier transform of the coherence weighted FFT. For a selected design case, the equations, the process and the simulated time series at multiple vehicle stations are presented.
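A minimal sketch of the single-location simulation step, shaping the FFT of Gaussian white noise by the square root of a target PSD (the PSD form below is an arbitrary stand-in, not the report's ground-wind model, and the coherence-weighting step for adjacent locations is omitted):

```python
import numpy as np

rng = np.random.default_rng(8)
n, dt = 8192, 0.1
f = np.fft.rfftfreq(n, dt)
psd = 1.0 / (1.0 + (f / 0.05) ** 2) ** (5 / 6)        # stand-in turbulence PSD
psd[0] = 0.0                                           # zero-mean series

W = np.fft.rfft(rng.standard_normal(n))                # FFT of white noise
series = np.fft.irfft(W * np.sqrt(psd), n)             # PSD-shaped inverse transform
print("simulated series std:", series.std())
```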
Firefly Algorithm in detection of TEC seismo-ionospheric anomalies
NASA Astrophysics Data System (ADS)
Akhoondzadeh, Mehdi
2015-07-01
Anomaly detection in time series of different earthquake precursors is an essential introduction to creating an early warning system with an allowable uncertainty. Since these time series are often nonlinear, complex, and massive, the applied predictor method should be able to detect discord patterns in large data sets in a short time. This study acknowledges the Firefly Algorithm (FA) as a simple and robust predictor to detect the TEC (Total Electron Content) seismo-ionospheric anomalies around the time of some powerful earthquakes, including Chile (27 February 2010), Varzeghan (11 August 2012) and Saravan (16 April 2013). Outstanding anomalies were observed 7 and 5 days before the Chile and Varzeghan earthquakes, respectively, and also 3 and 8 days prior to the Saravan earthquake.
Morimoto, Shimpei; Yahara, Koji
2018-03-01
Protein expression is regulated by the production and degradation of mRNAs and proteins, but the specifics of their relationship are controversial. Although technological advances have enabled genome-wide and time-series surveys of mRNA and protein abundance, recent studies have shown paradoxical results, with most statistical analyses being limited to linear correlation, or analysis of variance applied separately to mRNA and protein datasets. Here, using recently analyzed genome-wide time-series data, we have developed a statistical analysis framework for identifying which types of genes or biological gene groups have significant correlation between mRNA and protein abundance after accounting for potential time delays. Our framework stratifies all genes in terms of the extent of time delay, conducts gene clustering in each stratum, and performs a non-parametric statistical test of the correlation between mRNA and protein abundance in a gene cluster. Consequently, we revealed stronger correlations than previously reported between mRNA and protein abundance in two metabolic pathways. Moreover, we identified a pair of stress-responsive genes (ADC17 and KIN1) that showed a highly similar time series of mRNA and protein abundance. Furthermore, we confirmed the robustness of the analysis framework by applying it to another genome-wide time-series dataset and identifying a cytoskeleton-related gene cluster (keratin 18, keratin 17, and mitotic spindle positioning) that shows similar correlation. The significant correlation and highly similar changes of mRNA and protein abundance suggest a concerted role of these genes in the cellular stress response, which we consider provides an answer to the question of the specific relationships between mRNA and protein in a cell. In addition, our framework for studying the relationship between mRNAs and proteins in a cell will provide a basis for studying specific relationships between mRNA and protein abundance after accounting for potential time delays.
Modified cross sample entropy and surrogate data analysis method for financial time series
NASA Astrophysics Data System (ADS)
Yin, Yi; Shang, Pengjian
2015-09-01
For researching multiscale behaviors from the angle of entropy, we propose a modified cross sample entropy (MCSE) and combine surrogate data analysis with it in order to compute entropy differences between the original dynamics and surrogate series (MCSDiff). MCSDiff is applied to simulated signals to show its accuracy and then employed on US and Chinese stock markets. We illustrate the presence of multiscale behavior in the MCSDiff results and reveal that synchrony is present in the original financial time series, which have intrinsic relations that are destroyed by surrogate data analysis. Furthermore, the multifractal behaviors of the cross-correlations between these financial time series are investigated by the multifractal detrended cross-correlation analysis (MF-DCCA) method, since multifractal analysis is a multiscale analysis. We explore the multifractal properties of the cross-correlations between these US and Chinese markets and show the distinctiveness of NQCI and HSI among the markets in their own regions. It can be concluded that the weaker cross-correlation between US markets provides evidence of a better inner mechanism in the US stock markets than in the Chinese stock markets. Studying the multiscale features and properties of financial time series can provide valuable information for understanding the inner mechanism of financial markets.
NASA Astrophysics Data System (ADS)
Zhao, An; Jin, Ning-de; Ren, Ying-yu; Zhu, Lei; Yang, Xia
2016-01-01
In this article we apply an approach to identify oil-gas-water three-phase flow patterns in a vertical upward 20 mm inner-diameter pipe based on conductance fluctuating signals. We use the approach to analyse signals with long-range correlations by decomposing the signal increment series into magnitude and sign series and extracting their scaling properties. We find that the magnitude series relates to the nonlinear properties of the original time series, whereas the sign series relates to the linear properties. The research shows that the oil-gas-water three-phase flows (slug flow, churn flow, bubble flow) can be classified by a combination of the scaling exponents of the magnitude and sign series. This study provides a new way of characterising the linear and nonlinear properties embedded in oil-gas-water three-phase flows.
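A minimal sketch of the magnitude/sign decomposition of a signal's increment series (illustrative data; the flow-pattern classification itself would apply detrended fluctuation analysis to each series to extract scaling exponents):

```python
import numpy as np

x = np.cumsum(np.random.default_rng(9).standard_normal(1000))   # stand-in signal
inc = np.diff(x)                   # increment series
magnitude = np.abs(inc)            # carries the nonlinear correlations
sign = np.sign(inc)                # carries the linear correlations

def lag1_corr(v):
    return np.corrcoef(v[:-1], v[1:])[0, 1]

print("sign lag-1 corr:", lag1_corr(sign))
print("magnitude lag-1 corr:", lag1_corr(magnitude))
```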
Coupled uncertainty provided by a multifractal random walker
NASA Astrophysics Data System (ADS)
Koohi Lai, Z.; Vasheghani Farahani, S.; Movahed, S. M. S.; Jafari, G. R.
2015-10-01
The aim here is to study the concept of pairing multifractality between time series possessing non-Gaussian distributions. The increasing number of rare events creates "criticality". We show how the pairing between two series is affected by rare events, which we call "coupled criticality". A method is proposed for studying the coupled criticality born out of the interaction between two series, using the bivariate multifractal random walk (BiMRW). This method allows studying the dependence of the coupled criticality on the criticality of each individual system. This approach is applied to data sets of the gold and oil markets, and of inflation and unemployment.
Statistical approaches for studying the wave climate of crossing-sea states
NASA Astrophysics Data System (ADS)
Barbariol, Francesco; Portilla, Jesus; Benetazzo, Alvise; Cavaleri, Luigi; Sclavo, Mauro; Carniel, Sandro
2017-04-01
Surface waves are an important feature of the world's oceans and seas. Their role in air-sea exchanges is well recognized, together with their effects on the upper ocean and lower atmosphere dynamics. Physical processes involving surface waves contribute to driving the Earth's climate which, while experiencing changes at global and regional scales, in turn affects the surface wave climate over the oceans. The assessment of the wave climate at specific locations of the ocean is fruitful for many research fields in the marine and atmospheric sciences and also for human activities in the marine environment. Very often, wind-generated waves (wind-sea) and one or more swell systems occur simultaneously, depending on the complexity of the atmospheric conditions that force the waves. Therefore, a wave climate assessed from the statistical analysis of long time series of integral wave parameters can hardly say anything about the frequency of occurrence of so-called crossing seas, or about their features. Directional wave spectra carry such information, but proper statistical methods to analyze them are needed. In this respect, in order to identify the crossing sea states within the spectral time series and to assess their frequency of occurrence, we exploit two advanced statistical techniques. First, we apply Spectral Partitioning, a well-established method based on a two-step partitioning of the spectrum that allows identification of the individual wave systems and computation of their probability of occurrence in the frequency/direction space. Then, we use Self-Organizing Maps, an unsupervised neural network algorithm that quantizes the time series by autonomously identifying an arbitrary (small) number of wave spectra representing the whole wave climate, each with its frequency of occurrence. This method has been previously applied to time series of wave parameters and is applied here for the first time to directional wave spectra. We analyze the wave climate of one of the most severe regions of the Mediterranean Sea, between north-west Sardinia and the Gulf of Lion, where wave systems coming from different directions quite often superpose. The time series for the analysis is taken from the ERA-Interim Reanalysis dataset, which provides global directional wave spectra at 1° resolution from 1979 up to the present. Results from the two techniques are shown to be consistent, and their comparison points out the contribution that each technique can provide for a more detailed interpretation of the wave climate.
Large-deviation probabilities for correlated Gaussian processes and intermittent dynamical systems
NASA Astrophysics Data System (ADS)
Massah, Mozhdeh; Nicol, Matthew; Kantz, Holger
2018-05-01
In its classical version, the theory of large deviations makes quantitative statements about the probability of outliers when estimating time averages, if time series data are identically independently distributed. We study large-deviation probabilities (LDPs) for time averages in short- and long-range correlated Gaussian processes and show that long-range correlations lead to subexponential decay of LDPs. A particular deterministic intermittent map can, depending on a control parameter, also generate long-range correlated time series. We illustrate numerically, in agreement with the mathematical literature, that this type of intermittency leads to a power law decay of LDPs. The power law decay holds irrespective of whether the correlation time is finite or infinite, and hence irrespective of whether the central limit theorem applies or not.
NASA Astrophysics Data System (ADS)
Korotkov, E. V.; Korotkova, M. A.
2017-01-01
The purpose of this study was to detect latent periodicity in the presence of deletions or insertions in the analyzed data, when the points of deletion or insertion are unknown. A mathematical method was developed to search for periodicity in numerical series, using dynamic programming and random matrices. The developed method was applied to search for periodicity in the Euro/Dollar (Eu/) exchange rate since 2001. The presence of periodicity with a period length equal to 24 h in the analyzed financial series was shown. The periodicity can be detected only when insertions and deletions are taken into account. The results of this study show that the periodicity phase shifts depend on the observation time. The reasons for the existence of the periodicity in the financial series are discussed.
NASA Astrophysics Data System (ADS)
Curceac, S.; Ternynck, C.; Ouarda, T.
2015-12-01
Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach lies in expressing the data as curves. In the present work, the focus is on daily forecasting, and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting, based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied, and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series
Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C.; Golino, Caroline A.; Kemper, Peter; Saha, Margaret S.
2016-01-01
Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features. PMID:27977764
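A minimal sketch of the idea behind the measure, assuming a simple quantile-based state discretization (the paper's exact state construction and normalization are not reproduced):

```python
# Discretize a trace into states, estimate the Markov transition matrix, and
# score predictability by the empirical entropy rate (bits per transition).
import numpy as np

def markov_entropy_rate(x, n_states=4):
    edges = np.quantile(x, np.linspace(0, 1, n_states + 1)[1:-1])
    states = np.digitize(x, edges)                # state index 0..n_states-1
    T = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        T[a, b] += 1
    pi = T.sum(1) / T.sum()                       # empirical state occupancies
    T = T / np.maximum(T.sum(1, keepdims=True), 1)
    logT = np.where(T > 0, np.log2(T, where=T > 0), 0.0)
    return -np.sum(pi[:, None] * T * logT)

x = np.sin(np.linspace(0, 30, 600)) + 0.2 * np.random.default_rng(10).standard_normal(600)
print("entropy rate:", markov_entropy_rate(x))
```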
31 CFR 359.15 - When is the composite rate applied to Series I savings bonds?
Code of Federal Regulations, 2014 CFR
2014-07-01
Money and Finance: Treasury; Offering of United States Savings Bonds, Series I; General Information. § 359.15 When is the composite rate applied to Series I savings bonds? The most recently announced composite rate applies to a bond during its...
Detection of long term persistence in time series of the Neuquen River (Argentina)
NASA Astrophysics Data System (ADS)
Seoane, Rafael; Paz González, Antonio
2014-05-01
In the Patagonian region (Argentina), previous hydrometeorological studies developed using general circulation models show variations in annual mean flows. Future climate scenarios obtained from high-resolution models indicate decreases in total annual precipitation, and these decreases are most important in the Neuquén River basin (23,000 km²). The aim of this study was the estimation of long-term persistence in the Neuquén River basin (Argentina). Variations in the long-range dependence and long memory of the time series were evaluated with the Hurst exponent. We applied rescaled adjusted range analysis (R/S) to time series of river discharges measured from 1903 to 2011; this time series was divided into two subperiods, the first from 1903 to 1970 and the second from 1970 to 2011. Results show a small increase in persistence for the second period. Our results are consistent with those obtained by Koch and Markovic (2007), who observed and estimated an increase of the H exponent for the period 1960-2000 in the Elbe River (Germany). References: Hurst, H. (1951). Long-term storage capacities of reservoirs. Trans. Am. Soc. Civil Engrs., 116:776-808. Koch and Markovic (2007). Evidences for Climate Change in Germany over the 20th Century from the Stochastic Analysis of hydro-meteorological Time Series, MODSIM07, International Congress on Modelling and Simulation, Christchurch, New Zealand.
Lyapunov exponents from CHUA's circuit time series using artificial neural networks
NASA Technical Reports Server (NTRS)
Gonzalez, J. Jesus; Espinosa, Ismael E.; Fuentes, Alberto M.
1995-01-01
In this paper we present the general problem of identifying whether a nonlinear dynamical system has chaotic behavior. If the answer is positive, the system will be sensitive to small perturbations in the initial conditions, which implies that there is a chaotic attractor in its state space. A particular problem is that of identifying a chaotic oscillator. We present an example of three well-known, different chaotic oscillators where we have knowledge of the equations that govern the dynamical systems, and from these we can obtain the corresponding time series. In a similar example we assume that we only know the time series and, finally, in another example we take measurements in Chua's circuit to obtain sample points of the time series. With knowledge of the time series, the phase plane portraits are plotted and from them, by visual inspection, it is concluded whether or not the system is chaotic. This method has the problem of uncertainty and subjectivity, and for that reason a different approach is needed. A quantitative approach is the computation of the Lyapunov exponents. We describe several methods for obtaining them and apply a little-known artificial neural network method to the different examples mentioned above. We end the paper by discussing the importance of the Lyapunov exponents in the interpretation of the dynamic behavior of biological neurons and biological neural networks.
Quantifying the value of redundant measurements at GCOS Reference Upper-Air Network sites
Madonna, F.; Rosoldi, M.; Güldner, J.; ...
2014-11-19
The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of the integrated water vapour (IWV) and water vapour mixing ratio profiles provided by five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations in 2010–2012. Results show that the random uncertainties on the IWV measured with radiosondes, global positioning system, microwave and infrared radiometers, and Raman lidar measurements differed by less than 8%. Comparisons of time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~ 60% by constraining the measurements with those from another instrument. The best reduction of random uncertainty is achieved by conditioning Raman lidar measurements with microwave radiometer measurements. In conclusion, specific instruments are recommended for atmospheric water vapour measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
Road safety forecasts in five European countries using structural time series models.
Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George
2014-01-01
Modeling road safety development is a complex task and needs to consider both the quantifiable impact of specific parameters as well as the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and the (2) latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology is proved to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
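A minimal sketch of the first of these two models, fit with statsmodels (synthetic data; the latent risk model is a bivariate extension incorporating exposure, not shown here):

```python
# Local linear trend model on log fatalities, a standard structural
# time series specification; the data below are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
fatalities = 300 * np.exp(-0.03 * np.arange(40)) * np.exp(0.05 * rng.standard_normal(40))

model = sm.tsa.UnobservedComponents(np.log(fatalities), level="local linear trend")
res = model.fit(disp=False)
print(res.forecast(5))                  # medium-term log-fatality forecast
```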
NASA Astrophysics Data System (ADS)
Genty, Dominique; Massault, Marc
1999-05-01
Twenty-two AMS 14C measurements have been made on a modern stalagmite from SW France in order to reconstruct the 14C activity history of the calcite deposit. Annual growth laminae provide a chronology up to 1919 A.D. Results show that the stalagmite 14C activity time series is sensitive to modern atmospheric 14C activity changes such as those produced by the nuclear weapon tests. The comparison between the two 14C time series shows that the stalagmite time series is damped: its amplitude variation between pre-bomb and post-bomb values is 75% less, and the time delay between the peaks of the two time series is 16 ± 3 years. A model is developed using atmospheric 14C and 13C data, fractionation processes, and three soil organic matter components whose mean turnover rates differ. The linear correlation coefficient between modeled and measured activities is 0.99. These results, combined with two other stalagmite 14C time series already published and compared with local vegetation and climate, demonstrate that most of the carbon transfer dynamics are controlled in the soil by soil organic matter degradation rates. Where vegetation produces debris whose degradation is slow, the fraction of old carbon injected into the system increases, and the observed 14C time series is much more damped, with a longer lag time, than that observed at grassland sites. The same mixing model applied to the 13C shows good agreement (R² = 0.78) between modeled and measured stalagmite δ13C and demonstrates that the Suess effect due to fossil fuel combustion in the atmosphere is recorded in the stalagmite, but with a damped effect due to the SOM degradation rate. The different sources of dead carbon in the seepage water are calculated and discussed.
NASA Astrophysics Data System (ADS)
Xu, Xijin; Tang, Qian; Xia, Haiyue; Zhang, Yuling; Li, Weiqiu; Huo, Xia
2016-04-01
Chaotic time series prediction based on nonlinear systems has shown superior performance in the prediction field. We studied prenatal exposure to polychlorinated biphenyls (PCBs) by chaotic time series prediction, using the least squares self-exciting threshold autoregressive (SEATR) model, in umbilical cord blood in an electronic waste (e-waste) contaminated area. The specific prediction steps based on the proposed methods for prenatal PCB exposure were put forward, and the proposed scheme's validity was further verified by numerical simulation experiments. Experimental results show that: 1) seven kinds of PCB congeners negatively correlate with five different indices of birth status: newborn weight, height, gestational age, Apgar score and anogenital distance; 2) the prenatal PCB-exposed group is at greater risk compared to the reference group; 3) PCBs increasingly accumulate with time in newborns; and 4) the possibility of newborns suffering from related diseases in the future is greater. The desirable numerical simulation results demonstrate the feasibility of applying mathematical models in the environmental toxicology field.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, providing a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows null hypotheses on the origin of the observed interrelationships to be formulated and tested, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
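A minimal sketch of the coincidence-counting idea with an analytic Poisson null (window length, rates, and data are illustrative; the paper's framework also covers directionality, time lags, and more general point-process nulls):

```python
import numpy as np

def coincidence_rate(a_times, b_times, delta_t):
    """Fraction of events in A followed within delta_t by at least one event in B."""
    b = np.sort(b_times)
    hits = [np.any((b > t) & (b <= t + delta_t)) for t in a_times]
    return np.mean(hits)

rng = np.random.default_rng(12)
T, dt = 1000.0, 5.0
floods = np.sort(rng.uniform(0, T, 40))
outbreaks = np.sort(np.concatenate([floods[:20] + rng.uniform(0, dt, 20),
                                    rng.uniform(0, T, 20)]))   # half triggered
obs = coincidence_rate(floods, outbreaks, dt)
null = 1 - np.exp(-len(outbreaks) / T * dt)    # Poisson-process expectation
print(f"observed {obs:.2f} vs null {null:.2f}")
```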

Moorman, J. Randall; Delos, John B.; Flower, Abigail A.; Cao, Hanqing; Kovatchev, Boris P.; Richman, Joshua S.; Lake, Douglas E.
2014-01-01
We have applied principles of statistical signal processing and non-linear dynamics to analyze heart rate time series from premature newborn infants in order to assist in the early diagnosis of sepsis, a common and potentially deadly bacterial infection of the bloodstream. We began with the observation of reduced variability and transient decelerations in heart rate interval time series for hours up to days prior to clinical signs of illness. We find that measurements of standard deviation, sample asymmetry and sample entropy are highly related to imminent clinical illness. We developed multivariable statistical predictive models, and an interface to display the real-time results to clinicians. Using this approach, we have observed numerous cases in which incipient neonatal sepsis was diagnosed and treated without any clinical illness at all. This review focuses on the mathematical and statistical time series approaches used to detect these abnormal heart rate characteristics and present predictive monitoring information to the clinician. PMID:22026974
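Of the three measures named, sample entropy is the least standard to implement; a minimal sketch follows, using the common Chebyshev-distance template counting. The template length m and tolerance r below are conventional defaults, not values taken from the review.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -log of the conditional probability that templates
    matching at length m also match at length m + 1 (self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)              # conventional tolerance
    n = len(x)

    def matches(length):
        # use the same number of templates (n - m) for both lengths
        t = np.array([x[i:i + length] for i in range(n - m)])
        total = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)   # Chebyshev distance
            total += int(np.sum(d <= r))
        return total

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a and b else float("inf")
```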
Calculation of Rate Spectra from Noisy Time Series Data
Voelz, Vincent A.; Pande, Vijay S.
2011-01-01
As the resolution of experiments to measure folding kinetics continues to improve, it has become imperative to avoid bias that may come with fitting data to a predetermined mechanistic model. Towards this end, we present a rate spectrum approach to analyze timescales present in kinetic data. Computing rate spectra of noisy time series data via numerical discrete inverse Laplace transform is an ill-conditioned inverse problem, so a regularization procedure must be used to perform the calculation. Here, we show the results of different regularization procedures applied to noisy multi-exponential and stretched exponential time series, as well as data from time-resolved folding kinetics experiments. In each case, the rate spectrum method recapitulates the relevant distribution of timescales present in the data, with different priors on the rate amplitudes naturally corresponding to common biases toward simple phenomenological models. These results suggest an attractive alternative to the “Occam’s razor” philosophy of simply choosing models with the fewest number of relaxation rates. PMID:22095854
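As an illustration of the ill-conditioned inversion, the sketch below recovers a rate spectrum from a synthetic two-exponential decay by discretizing rates on a logarithmic grid and solving a non-negative least-squares problem; non-negativity is just one simple regularizing prior, and the paper compares several.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
t = np.linspace(0.01, 10.0, 400)                       # time axis (arbitrary units)
y = 0.7 * np.exp(-1.0 * t) + 0.3 * np.exp(-8.0 * t)    # two relaxation rates: 1 and 8
y += rng.normal(0.0, 0.01, t.size)                     # measurement noise

rates = np.logspace(-2, 2, 150)                        # candidate rate grid
K = np.exp(-np.outer(t, rates))                        # kernel K[i, j] = exp(-k_j * t_i)
amplitudes, _ = nnls(K, y)                             # non-negative rate spectrum
dominant = rates[amplitudes > 0.1 * amplitudes.max()]  # clusters near rates 1 and 8
```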
Artificial neural networks applied to forecasting time series.
Montaño Moreno, Juan J; Palmer Pol, Alfonso; Muñoz Gracia, Pilar
2011-04-01
This study offers a description and comparison of the main models of Artificial Neural Networks (ANN) which have proved useful in time series forecasting, together with a standard procedure for the practical application of ANN to this type of task. The Multilayer Perceptron (MLP), Radial Basis Function (RBF), Generalized Regression Neural Network (GRNN), and Recurrent Neural Network (RNN) models are analyzed. With this aim in mind, we use a time series made up of 244 time points. A comparative study establishes that the error made by the four neural network models analyzed is less than 10%. In accordance with the interpretation criteria of this performance, it can be concluded that the neural network models show a close fit regarding their forecasting capacity. The model with the best performance is the RBF, followed by the RNN and MLP. The GRNN model has the worst performance. Finally, we analyze the advantages and limitations of ANN, possible solutions to these limitations, and provide an orientation towards future research.
NASA Astrophysics Data System (ADS)
Zhou, Wei-Xing; Sornette, Didier
2007-07-01
We have recently introduced the “thermal optimal path” (TOP) method to investigate the real-time lead-lag structure between two time series. The TOP method consists in searching for a robust noise-averaged optimal path of the distance matrix along which the two time series have the greatest similarity. Here, we generalize the TOP method by introducing a more general definition of distance which takes into account possible regime shifts between positive and negative correlations. This generalization to track possible changes of correlation signs is able to identify possible transitions from one convention (or consensus) to another. Numerical simulations on synthetic time series verify that the new TOP method performs as expected even in the presence of substantial noise. We then apply it to investigate changes of convention in the dependence structure between the historical volatilities of the USA inflation rate and economic growth rate. Several measures show that the new TOP method significantly outperforms standard cross-correlation methods.
Solar signals detected within neutral atmospheric and ionospheric parameters
NASA Astrophysics Data System (ADS)
Koucka Knizova, Petra; Georgieva, Katya; Mosna, Zbysek; Kozubek, Michal; Kouba, Daniel; Kirov, Boian; Potuzníkova, Katerina; Boska, Josef
2018-06-01
We have analyzed time series of solar data together with atmospheric and ionospheric measurements for solar cycles 19 through 23, according to data availability. For the analyses we used long-term data with 1-day sampling. By means of the Continuous Wavelet Transform (CWT) we found common spectral domains within the solar, atmospheric and ionospheric time series. We then identified intervals during which particular pairs of signals show high coherence by applying Wavelet Transform Coherence (WTC). Despite the wide range of oscillations detected in the individual CWT spectra, WTC revealed only limited domains of high coherence. Wavelet Transform Coherence reveals significant high-power domains with a stable phase difference, for periods of 1 month, 2 months, 6 months, 1 year, 2 years and 3-4 years, between pairs of solar and atmospheric or ionospheric data. The occurrence of the detected domains varies significantly within a given solar cycle (SC) and from one cycle to the next, indicating that solar forcing and/or atmospheric sensitivity change with time.
Efficient Generation and Use of Power Series for Broad Application.
NASA Astrophysics Data System (ADS)
Rudmin, Joseph; Sochacki, James
2017-01-01
A brief history and overview of the Parker-Sochacki method of power series generation is presented. This method generates a power series to order n in O(n²) time for any system of differential equations that has a power series solution. The method is simple enough that novices to differential equations can easily learn it and immediately apply it. Maximal absolute error estimates allow one to determine the number of terms needed to reach a desired accuracy. Ratios of coefficients in a solution with global convergence differ significantly from those for a solution with only local convergence. Divergence of the series prevents one from overlooking poles. The method can always be cast in polynomial form, which allows separation of variables in almost all physical systems, facilitating exploration of hidden symmetries, and is implicitly symplectic.
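A minimal sketch of the coefficient recurrence for the polynomial ODE y' = y² with y(0) = 1 (exact solution 1/(1 - t), so every Maclaurin coefficient should equal 1); the Cauchy product inside the loop is what makes computing n coefficients cost O(n²). This is an illustration of the general idea rather than the full method.

```python
def power_series_coeffs(n, y0=1.0):
    """Maclaurin coefficients c_0..c_n of the solution of y' = y^2."""
    c = [y0]
    for k in range(n):
        # matching coefficients of t^k: (k + 1) c_{k+1} = sum_i c_i c_{k-i}
        cauchy = sum(c[i] * c[k - i] for i in range(k + 1))
        c.append(cauchy / (k + 1))
    return c

print(power_series_coeffs(6))   # -> [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
```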
Hydraulic Fatigue-Testing Machine
NASA Technical Reports Server (NTRS)
Hodo, James D.; Moore, Dennis R.; Morris, Thomas F.; Tiller, Newton G.
1987-01-01
Fatigue-testing machine applies fluctuating tension to number of specimens at same time. When sample breaks, machine continues to test remaining specimens. Series of tensile tests needed to determine fatigue properties of materials performed more rapidly than in conventional fatigue-testing machine.
Dollar$ & $en$e. Part V: What is your added value?
Wilkinson, I
2001-01-01
In Part I of this series, I introduced the concept of memes (1). Memes are ideas or concepts--the information world equivalent of genes. The goal of this series of articles is to infect you with memes, so that you will assimilate, translate, and express them. No matter what our area of expertise or "-ology," we all are in the information business. Our goal is to be in the wisdom business. In the previous papers in this series, I showed that when we convert raw data into wisdom we are moving along a value chain. Each step in the chain adds a different amount of value to the final product: timely, relevant, accurate, and precise knowledge that can be applied to create the ultimate product in the value chain: wisdom. In Part II of this series, I introduced a set of memes for measuring the cost of adding value (2). In Part III of this series, I presented a new set of memes for measuring the added value of knowledge, i.e., intellectual capital (3). In Part IV of this series, I discussed practical knowledge management tools for measuring the value of people, structural, and customer capital (4). In Part V of this series, I will apply intellectual capital and knowledge management concepts at the individual level, to help answer a fundamental question: What is my added value?
Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology
NASA Astrophysics Data System (ADS)
Serinaldi, Francesco; Kilsby, Chris G.; Lombardo, Federico
2018-01-01
The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as 'deterministic components' or 'trends' even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature, mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of some null hypothesis significance tests (NHSTs) for slowly-varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, minima, etc.). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of NHST rationale, and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling. We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of record. Moreover, even though adjusting procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while some others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
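For reference, the sketch below shows the naive Mann-Kendall test on an annual summary series, with no correction for ties or serial correlation; as the paper stresses, applying it in this uncorrected form to autocorrelated hydrological records is exactly the practice at issue.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test (no tie/autocorrelation correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0                   # null variance of S
    z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)   # continuity-corrected
    return z, 2.0 * (1.0 - norm.cdf(abs(z)))                   # statistic, p-value
```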
Luo, Li; Luo, Le; Zhang, Xinli; He, Xiaoli
2017-07-10
Accurate forecasting of hospital outpatient visits is beneficial for the reasonable planning and allocation of healthcare resources to meet medical demand. Given the multiple attributes of daily outpatient visits, such as randomness, cyclicity and trend, time series methods such as ARIMA can be a good choice for forecasting outpatient visits. On the other hand, hospital outpatient visits are also affected by doctors' scheduling, and these effects are not purely random. To account for this structure, this paper presents a new forecasting model that takes cyclicity and the day-of-the-week effect into consideration. We formulate a seasonal ARIMA (SARIMA) model on the daily time series and a single exponential smoothing (SES) model on the day-of-the-week time series, and finally establish a combinatorial model by modifying them. The models are applied to 1 year of daily visit data for urban outpatients in two internal medicine departments of a large hospital in Chengdu, to forecast daily outpatient visits about 1 week ahead. The proposed model is applied to forecast the cross-sectional data for 7 consecutive days of daily outpatient visits over an 8-week period, based on 43 weeks of observation data during 1 year. The results show that the two traditional single models and the combinatorial model are simple to implement and computationally light, while being appropriate for short-term forecast horizons. Furthermore, the combinatorial model captures the comprehensive features of the time series data better. The combinatorial model achieves better prediction performance than the single models, with lower residual variance and a smaller mean residual error, though it needs deeper optimization in the next research step.
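A sketch of the two single models using statsmodels is below. Here daily_visits is a hypothetical pandas Series of daily counts indexed by date, the (p, d, q)(P, D, Q, 7) orders are placeholders for whatever an identification step would select, and the abstract does not specify how the two forecasts are weighted in the combinatorial model, so that step is omitted.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

# daily_visits: hypothetical pandas Series of daily outpatient counts
sarima = SARIMAX(daily_visits, order=(1, 0, 1),
                 seasonal_order=(1, 1, 1, 7)).fit(disp=False)
sarima_fc = sarima.get_forecast(steps=7).predicted_mean

# SES on each day-of-week sub-series captures the scheduling effect
ses_fc = {dow: SimpleExpSmoothing(daily_visits[daily_visits.index.dayofweek == dow])
               .fit().forecast(1).iloc[0]
          for dow in range(7)}
```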
Using spectrotemporal indices to improve the fruit-tree crop classification accuracy
NASA Astrophysics Data System (ADS)
Peña, M. A.; Liao, R.; Brenning, A.
2017-06-01
This study assesses the potential of spectrotemporal indices derived from satellite image time series (SITS) to improve the classification accuracy of fruit-tree crops. Six major fruit-tree crop types in the Aconcagua Valley, Chile, were classified by applying various linear discriminant analysis (LDA) techniques on a Landsat-8 time series of nine images corresponding to the 2014-15 growing season. As features we not only used the complete spectral resolution of the SITS, but also all possible normalized difference indices (NDIs) that can be constructed from any two bands of the time series, a novel approach to derive features from SITS. Due to the high dimensionality of this "enhanced" feature set we used the lasso and ridge penalized variants of LDA (PLDA). Although classification accuracies yielded by the standard LDA applied on the full-band SITS were good (misclassification error rate, MER = 0.13), they were further improved by 23% (MER = 0.10) with ridge PLDA using the enhanced feature set. The most important bands to discriminate the crops of interest were mainly concentrated on the first two image dates of the time series, corresponding to the crops' greenup stage. Despite the high predictor weights provided by the red and near infrared bands, typically used to construct greenness spectral indices, other spectral regions were also found important for the discrimination, such as the shortwave infrared band at 2.11-2.19 μm, sensitive to foliar water changes. These findings support the usefulness of spectrotemporal indices in the context of SITS-based crop type classifications, which until now have been mainly constructed by the arithmetic combination of two bands of the same image date in order to derive greenness temporal profiles like those from the normalized difference vegetation index.
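The enhanced feature set can be generated mechanically; a sketch under the assumption that the time series is stacked as a (layers, rows, cols) array, where the layers concatenate all bands of all image dates:

```python
import numpy as np
from itertools import combinations

def all_ndis(stack):
    """All normalized difference indices (b_i - b_j) / (b_i + b_j) over every
    pair of layers in a (layers, rows, cols) stack; because the layers span
    all image dates, the pairs include cross-date (spectrotemporal) indices."""
    eps = 1e-9                                   # guards against zero-sum pixels
    return np.stack([(stack[i] - stack[j]) / (stack[i] + stack[j] + eps)
                     for i, j in combinations(range(stack.shape[0]), 2)])

# e.g. 9 dates x 6 reflective bands = 54 layers -> 54 * 53 / 2 = 1431 NDIs,
# which is why penalized (lasso/ridge) LDA is needed downstream
```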
Frasca, F; Siani, A M; Casale, G R; Pedone, M; Bratasz, Ł; Strojecki, M; Mleczkowska, A
2017-06-01
The microclimatic monitoring of the historic church of Mogiła Abbey (Kraków, Poland) was carried out to study the impact of environmental parameters on the organic and hygroscopic artworks. Specific indexes were proposed to objectively assess the quality of the time series of temperature (T), relative humidity (RH), and carbon dioxide (CO2) before applying the exploratory data analysis. The series were used to define the historic environmental conditions as stated in the European Standard EN 15757:2010 and with the use of the climate evaluation chart (CEC). It was found that the percentage of time in which T and RH values are within the allowable limits of the ASHRAE (2011) Class B is more than 85%. This means that, for about 15% of the time, there is a high risk of mechanical damage to highly vulnerable objects, mainly due to RH variability. The environment at the chancel proved to be moister than that at the cornice, and fungal growth is possible there. In addition, the time-weighted preservation index (TWPI) was computed to evaluate the life expectancy of the objects, taking into account the environmental conditions of the site under study. The method of analogues, developed to predict the evolution of a system given observations of the past and without knowledge of any equation relating the variables, was proposed and applied to the time series of temperature, relative humidity, and carbon dioxide with a 1-h sampling time to avoid the influence of the autocorrelation.
NASA Astrophysics Data System (ADS)
Cardille, J. A.; Lee, J.
2017-12-01
With the opening of the Landsat archive, there is a dramatically increased potential for creating high-quality time series of land use/land-cover (LULC) classifications derived from remote sensing. Although LULC time series are appealing, their creation is typically challenging in two fundamental ways. First, there is a need to create maximally correct LULC maps for consideration at each time step; and second, there is a need to have the elements of the time series be consistent with each other, without pixels that flip improbably between covers due only to unavoidable, stray classification errors. We have developed the Bayesian Updating of Land Cover - Unsupervised (BULC-U) algorithm to address these challenges simultaneously, and introduce and apply it here for two related but distinct purposes. First, with minimal human intervention, we produced an internally consistent, high-accuracy LULC time series in rapidly changing Mato Grosso, Brazil for a time interval (1986-2000) in which cropland area more than doubled. The spatial and temporal resolution of the 59 LULC snapshots allows users to witness the establishment of towns and farms at the expense of forest. The new time series could be used by policy-makers and analysts to unravel important considerations for conservation and management, including the timing and location of past development, the rate and nature of changes in forest connectivity, the connection with road infrastructure, and more. The second application of BULC-U is to sharpen the well-known GlobCover 2009 classification from 300 m to 30 m, while improving accuracy measures for every class. The greatly improved resolution and accuracy permit a better representation of the true LULC proportions, the use of this map in models, and quantification of the potential impacts of changes. Given that there may easily be thousands and potentially millions of images available to harvest for an LULC time series, it is imperative to build useful algorithms requiring minimal human intervention. Through image segmentation and classification, BULC-U allows us to use both the spectral and spatial characteristics of imagery to sharpen classifications and create time series. It is hoped that this study may allow us and other users of this new method to consider time series across ever larger areas.
Piecewise multivariate modelling of sequential metabolic profiling data.
Rantalainen, Mattias; Cloarec, Olivier; Ebbels, Timothy M D; Lundstedt, Torbjörn; Nicholson, Jeremy K; Holmes, Elaine; Trygg, Johan
2008-02-19
Modelling the time-related behaviour of biological systems is essential for understanding their dynamic responses to perturbations. In metabolic profiling studies, the sampling rate and number of sampling points are often restricted due to experimental and biological constraints. A supervised multivariate modelling approach with the objective to model the time-related variation in the data for short and sparsely sampled time-series is described. A set of piecewise Orthogonal Projections to Latent Structures (OPLS) models are estimated, describing changes between successive time points. The individual OPLS models are linear, but the piecewise combination of several models accommodates modelling and prediction of changes which are non-linear with respect to the time course. We demonstrate the method on both simulated and metabolic profiling data, illustrating how time related changes are successfully modelled and predicted. The proposed method is effective for modelling and prediction of short and multivariate time series data. A key advantage of the method is model transparency, allowing easy interpretation of time-related variation in the data. The method provides a competitive complement to commonly applied multivariate methods such as OPLS and Principal Component Analysis (PCA) for modelling and analysis of short time-series data.
Application of blind source separation to real-time dissolution dynamic nuclear polarization.
Hilty, Christian; Ragavan, Mukundan
2015-01-20
The use of a blind source separation (BSS) algorithm is demonstrated for the analysis of time series of nuclear magnetic resonance (NMR) spectra. This type of data is obtained commonly from experiments where analytes are hyperpolarized using dissolution dynamic nuclear polarization (D-DNP), in both in vivo and in vitro contexts. High signal gains in D-DNP enable rapid measurement of data sets characterizing the time evolution of chemical or metabolic processes. BSS is based on an algorithm that can be applied to separate the different components contributing to the NMR signal and determine the time dependence of the signals from these components. This algorithm requires minimal prior knowledge of the data (notably, no reference spectra need to be provided) and can therefore be applied rapidly. In a time-resolved measurement of the enzymatic conversion of hyperpolarized oxaloacetate to malate, the two signal components are separated into computed source spectra that closely resemble the spectra of the individual compounds. An improvement in the signal-to-noise ratio of the computed source spectra is found compared to the original spectra, presumably resulting from the presence of each signal more than once in the time series. The reconstruction of the original spectra yields the time evolution of the contributions from the two sources, which also corresponds closely to the time evolution of integrated signal intensities from the original spectra. BSS may therefore be an approach for the efficient identification of components and estimation of kinetics in D-DNP experiments, which can be applied at a high level of automation.
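A generic sketch of the idea using FastICA from scikit-learn, which is a standard BSS algorithm but not necessarily the variant used by the authors; spectra is a hypothetical (n_times, n_points) matrix holding one D-DNP spectrum per row.

```python
import numpy as np
from sklearn.decomposition import FastICA

# spectra: hypothetical (n_times, n_points) array, one NMR spectrum per row
ica = FastICA(n_components=2, random_state=0)
time_profiles = ica.fit_transform(spectra)   # (n_times, 2): kinetics of each source
source_spectra = ica.mixing_.T               # (2, n_points): computed source spectra

# spectra ~ time_profiles @ source_spectra + ica.mean_, so the two rows of
# source_spectra should resemble the individual compounds (e.g. substrate
# and product), and time_profiles their reaction kinetics
```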
NASA Astrophysics Data System (ADS)
Larnier, H.; Sailhac, P.; Chambodut, A.
2018-01-01
Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of the continuous wavelet transform. We focus specifically on three types of sources, namely atmospherics, slow tails and whistlers, which cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signatures. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time series with different signal-to-noise ratios to test for robustness. We then apply it to real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, an overseas French territory. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of the electromagnetic waves observed in the processed time series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time series, providing the means to assess the quality of response functions obtained through processing.
NASA Astrophysics Data System (ADS)
Bock, Y.; Fang, P.; Moore, A. W.; Kedar, S.; Liu, Z.; Owen, S. E.; Glasscoe, M. T.
2016-12-01
Detection of time-dependent crustal deformation relies on the availability of accurate surface displacements, proper time series analysis to correct for secular motion, coseismic and non-tectonic instrument offsets, periodic signatures at different frequencies, and a realistic estimate of uncertainties for the parameters of interest. As part of the NASA Solid Earth Science ESDR System (SESES) project, daily displacement time series are estimated for about 2500 stations, focused on tectonic plate boundaries and having a global distribution for accessing the terrestrial reference frame. The "combined" time series are optimally estimated from independent JPL GIPSY and SIO GAMIT solutions, using a consistent set of input epoch-date coordinates and metadata. The longest time series began in 1992; more than 30% of the stations have experienced one or more of 35 major earthquakes with significant postseismic deformation. Here we present three examples of time-dependent deformation that have been detected in the SESES displacement time series. (1) Postseismic deformation is a fundamental time-dependent signal that indicates a viscoelastic response of the crust/mantle lithosphere, afterslip, or poroelastic effects at different spatial and temporal scales. It is critical to identify and estimate the extent of postseismic deformation in both space and time not only for insight into the crustal deformation and earthquake cycles and their underlying physical processes, but also to reveal other time-dependent signals. We report on our database of characterized postseismic motions using a principal component analysis to isolate different postseismic processes. (2) Starting with the SESES combined time series and applying a time-dependent Kalman filter, we examine episodic tremor and slow slip (ETS) in the Cascadia subduction zone. We report on subtle slip details, allowing investigation of the spatiotemporal relationship between slow slip transients and tremor and their underlying physical mechanisms. (3) We present evolving strain dilatation and shear rates based on the SESES velocities for regional subnetworks as a metric for assigning earthquake probabilities and detection of possible time-dependent deformation related to underlying physical processes.
Retrieving hydrological connectivity from empirical causality in karst systems
NASA Astrophysics Data System (ADS)
Delforge, Damien; Vanclooster, Marnik; Van Camp, Michel; Poulain, Amaël; Watlet, Arnaud; Hallet, Vincent; Kaufmann, Olivier; Francis, Olivier
2017-04-01
Because of their complexity, karst systems exhibit nonlinear dynamics. Moreover, if one attempts to model a karst, the hidden behavior complicates the choice of the most suitable model. Therefore, both intense investigation methods and nonlinear data analysis are needed to reveal the underlying hydrological connectivity as a prior for a consistent physically based modelling approach. Convergent Cross Mapping (CCM), a recent method, promises to identify causal relationships between time series belonging to the same dynamical system. The method is based on phase space reconstruction and is suitable for nonlinear dynamics. As an empirical causation detection method, it could be used to highlight the hidden complexity of a karst system by revealing its inner hydrological and dynamical connectivity. Hence, if one can link causal relationships to physical processes, the method should show great potential to support physically based model structure selection. We present the results of numerical experiments using karst model blocks combined in different structures to generate time series from actual rainfall series. CCM is applied between the time series to investigate whether the empirical causation detection is consistent with the hydrological connectivity suggested by the karst model.
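A compact sketch of the cross-mapping step, under the Sugihara-style convention that skill in estimating y from the delay embedding of x indicates that y forces x; the embedding dimension E and delay tau are illustrative choices, and convergence with library size is not shown.

```python
import numpy as np

def ccm_skill(x, y, E=3, tau=1):
    """Correlation between y and its cross-mapped estimate from the
    delay embedding ("shadow manifold") of x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x) - (E - 1) * tau
    M = np.array([x[i:i + (E - 1) * tau + 1:tau] for i in range(n)])
    est = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(M - M[i], axis=1)
        d[i] = np.inf                              # exclude the point itself
        nn = np.argsort(d)[:E + 1]                 # E + 1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))  # exponential distance weights
        est[i] = np.sum(w * y[nn]) / w.sum()
    return np.corrcoef(est, y[:n])[0, 1]
```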
Conceptual recurrence plots: revealing patterns in human discourse.
Angus, Daniel; Smith, Andrew; Wiles, Janet
2012-06-01
Human discourse contains a rich mixture of conceptual information. Visualization of the global and local patterns within this data stream is a complex and challenging problem. Recurrence plots are an information visualization technique that can reveal trends and features in complex time series data. The recurrence plot technique works by measuring the similarity of points in a time series to all other points in the same time series and plotting the results in two dimensions. Previous studies have applied recurrence plotting techniques to textual data; however, these approaches plot recurrence using term-based similarity rather than conceptual similarity of the text. We introduce conceptual recurrence plots, which use a model of language to measure similarity between pairs of text utterances, and the similarity of all utterances is measured and displayed. In this paper, we explore how the descriptive power of the recurrence plotting technique can be used to discover patterns of interaction across a series of conversation transcripts. The results suggest that the conceptual recurrence plotting technique is a useful tool for exploring the structure of human discourse.
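The underlying construction is simple; a sketch for a scalar series is below, where the conceptual variant described in the paper would replace the absolute difference with a model-based similarity between pairs of utterances.

```python
import numpy as np

def recurrence_matrix(x, eps=None):
    """Binary recurrence matrix R[i, j] = 1 where |x_i - x_j| <= eps."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])      # all pairwise distances
    if eps is None:
        eps = 0.1 * np.ptp(x)                # threshold: 10% of series range
    return (d <= eps).astype(int)            # visualize with e.g. plt.imshow
```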
NASA Astrophysics Data System (ADS)
Tsai, Christina; Yeh, Ting-Gu
2017-04-01
Extreme weather events are occurring more frequently as a result of climate change. Recently dengue fever has become a serious issue in southern Taiwan. It may have characteristic temporal scales that can be identified. Some researchers have hypothesized that dengue fever incidences are related to climate change. This study applies time-frequency analysis to time series data concerning dengue fever and hydrologic and meteorological variables. Results of three time-frequency analytical methods - the Hilbert Huang transform (HHT), the Wavelet Transform (WT) and the Short Time Fourier Transform (STFT) are compared and discussed. A more effective time-frequency analysis method will be identified to analyze relevant time series data. The most influential time scales of hydrologic and meteorological variables that are associated with dengue fever are determined. Finally, the linkage between hydrologic/meteorological factors and dengue fever incidences can be established.
Evolution of record-breaking high and low monthly mean temperatures
NASA Astrophysics Data System (ADS)
Anderson, A. L.; Kostinski, A. B.
2011-12-01
We examine the ratio of record-breaking highs to record-breaking lows with respect to the extent of the time series for monthly mean temperatures within the continental United States (1900-2006) and ask the following question: how are record-breaking high and low surface temperatures in the United States affected by the time period considered? We find that the ratio of record-breaking highs to lows in 2006 increases as the time series extend further into the past. For example: in 2006, the ratio of record-breaking highs to record-breaking lows is ≈ 13 : 1 with 1950 as the first year and ≈ 25 : 1 with 1900 as the first year; both ratios are an order of magnitude greater than 3-σ for stationary simulations. We also find that record-breaking events are more sensitive to trends in time series of monthly averages than in time series of the corresponding daily values. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. Correlation coefficients are 0.76 and 0.82 for 1900-2006 and 1950-2006 respectively; 3-σ = 0.3 for pairs of uncorrelated stationary time series. We find similar values for globally distributed time series: 0.87 and 0.92 for 1900-2006 and 1950-2006 respectively. However, the ratios evolve differently: global ratios increase throughout (1920-2006) while continental United States ratios decrease from about 1940 to 1970. (Based on Anderson and Kostinski (2011), Evolution and distribution of record-breaking high and low monthly mean temperatures. Journal of Applied Meteorology and Climatology. doi: 10.1175/JAMC-D-10-05025.1)
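The central quantity is easy to state in code; the sketch below counts, over many station series, whether the final year sets a record relative to a chosen start year. The synthetic trend-plus-noise data merely stand in for the station observations and reproduce the qualitative effect that a longer window inflates the highs-to-lows ratio.

```python
import numpy as np

def final_year_records(series_list, start_index):
    """Counts of series whose last value is a record high / record low
    relative to the window beginning at start_index."""
    highs = sum(s[-1] > np.max(s[start_index:-1]) for s in series_list)
    lows = sum(s[-1] < np.min(s[start_index:-1]) for s in series_list)
    return highs, lows

rng = np.random.default_rng(2)
stations = [rng.normal(0, 1, 107) + 0.01 * np.arange(107)   # weak warming trend
            for _ in range(1000)]                           # 107 "years", 1900-2006
for start in (0, 50):                                       # start year 1900 vs 1950
    h, l = final_year_records(stations, start)
    print(start, h / max(l, 1))   # the ratio grows as the window extends back
```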
Coherence and Chaos Phenomena in Josephson Oscillators for Superconducting Electronics.
1989-01-25
A dissipative term represents loss due to the surface resistance of the superconducting films, and γ is the spatially uniform bias current (normalized). The junction dynamics are expanded as a series of time-dependent Fourier spatial components. The simplest case considered is that in which no external magnetic field is applied to the junction.
New Insights into Signed Path Coefficient Granger Causality Analysis
Zhang, Jian; Li, Chong; Jiang, Tianzi
2016-01-01
Granger causality analysis, as a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study mainly focuses on the validity of “signed path coefficient Granger causality,” a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causality effect among the time series by an order-1 autoregression, and defines a positive or negative coefficient as an “excitatory” or “inhibitory” influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to illustrate that the signed path coefficient method is flawed and untenable, because the autoregressive coefficients are not always consistent with the real causal relationships, which would inevitably lead to erroneous conclusions. Overall, our findings suggest that the applicability of this kind of causality analysis is rather limited, and researchers should be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation. PMID:27833547
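For concreteness, the criticized estimator amounts to reading the sign of a lagged regression coefficient; a sketch of that order-1 estimation is below, the paper's point being precisely that this sign need not reflect a true excitatory or inhibitory relationship.

```python
import numpy as np

def signed_path_coefficient(x, y):
    """Coefficient of x_{t-1} in an order-1 autoregression of y_t on
    x_{t-1} and y_{t-1}; a positive/negative sign is what the criticized
    method reads as an "excitatory"/"inhibitory" influence of x on y."""
    X = np.column_stack([x[:-1], y[:-1], np.ones(len(x) - 1)])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return beta[0]
```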
Time series modeling of live-cell shape dynamics for image-based phenotypic profiling.
Gordonov, Simon; Hwang, Mun Kyung; Wells, Alan; Gertler, Frank B; Lauffenburger, Douglas A; Bathe, Mark
2016-01-01
Live-cell imaging can be used to capture spatio-temporal aspects of cellular responses that are not accessible to fixed-cell imaging. As the use of live-cell imaging continues to increase, new computational procedures are needed to characterize and classify the temporal dynamics of individual cells. For this purpose, here we present the general experimental-computational framework SAPHIRE (Stochastic Annotation of Phenotypic Individual-cell Responses) to characterize phenotypic cellular responses from time series imaging datasets. Hidden Markov modeling is used to infer and annotate morphological state and state-switching properties from image-derived cell shape measurements. Time series modeling is performed on each cell individually, making the approach broadly useful for analyzing asynchronous cell populations. Two-color fluorescent cells simultaneously expressing actin and nuclear reporters enabled us to profile temporal changes in cell shape following pharmacological inhibition of cytoskeleton-regulatory signaling pathways. Results are compared with existing approaches conventionally applied to fixed-cell imaging datasets, and indicate that time series modeling captures heterogeneous dynamic cellular responses that can improve drug classification and offer additional important insight into mechanisms of drug action. The software is available at http://saphire-hcs.org.
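A rough sketch of the state-annotation step, using the hmmlearn package as a stand-in (SAPHIRE's own Bayesian HMM machinery, including how it chooses the number of states, is more involved); features is a hypothetical (n_frames, n_features) array of image-derived shape measurements for one cell.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

# features: hypothetical (n_frames, n_features) shape-measurement series
model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100,
                    random_state=0)
model.fit(features)                               # one model fitted per cell
states = model.predict(features)                  # morphological state labels
switching = np.mean(states[1:] != states[:-1])    # state-switching frequency
```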
Computing the multifractal spectrum from time series: an algorithmic approach.
Harikrishnan, K P; Misra, R; Ambika, G; Amritkar, R E
2009-12-01
We show that the existing methods for computing the f(alpha) spectrum from a time series can be improved by using a new algorithmic scheme. The scheme relies on the basic idea that the smooth convex profile of a typical f(alpha) spectrum can be fitted with an analytic function involving a set of four independent parameters. While the standard existing schemes [P. Grassberger et al., J. Stat. Phys. 51, 135 (1988); A. Chhabra and R. V. Jensen, Phys. Rev. Lett. 62, 1327 (1989)] generally compute only an incomplete f(alpha) spectrum (usually the top portion), we show that this can be overcome by an algorithmic approach, which is automated to compute the D(q) and f(alpha) spectra from a time series for any embedding dimension. The scheme is first tested with the logistic attractor with known f(alpha) curve and subsequently applied to higher-dimensional cases. We also show that the scheme can be effectively adapted for analyzing practical time series involving noise, with examples from two widely different real world systems. Moreover, some preliminary results indicating that the set of four independent parameters may be used as diagnostic measures are also included.
The coupling analysis between stock market indices based on permutation measures
NASA Astrophysics Data System (ADS)
Shi, Wenbin; Shang, Pengjian; Xia, Jianan; Yeh, Chien-Hung
2016-04-01
Many information-theoretic methods have been proposed for analyzing the coupling dependence between time series. Quantifying the correlation between financial sequences is significant, since the financial market is a complex, evolving dynamical system. Recently, we developed a new permutation-based entropy, called cross-permutation entropy (CPE), to detect the coupling structures between two synchronous time series. In this paper, we extend the CPE method to weighted cross-permutation entropy (WCPE), to address some of CPE's limitations, mainly its inability to differentiate between distinct patterns of a certain motif and the sensitivity of patterns close to the noise floor. It shows more stable and reliable results than CPE when applied to spiky data and AR(1) processes. In addition, we adapt the CPE method to infer the complexity of short-length time series by freely changing the time delay, and test it with Gaussian random series and random walks. The modified method shows advantages in reducing deviations of the entropy estimate compared with the conventional one. Finally, the weighted cross-permutation entropy of eight important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.
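As background, plain single-series permutation entropy, the ordinal building block that CPE and WCPE extend to pairs of synchronous series (WCPE additionally weighting each pattern by local amplitude), can be sketched as follows.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=3, delay=1):
    """Normalized Shannon entropy of length-m ordinal patterns."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * delay
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i:i + (m - 1) * delay + 1:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    return float(-np.sum(p * np.log(p)) / np.log(factorial(m)))  # in [0, 1]
```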
The application of neural networks to myoelectric signal analysis: a preliminary study.
Kelly, M F; Parker, P A; Scott, R N
1990-03-01
Two neural network implementations are applied to myoelectric signal (MES) analysis tasks. The motivation behind this research is to explore more reliable methods of deriving control for multidegree of freedom arm prostheses. A discrete Hopfield network is used to calculate the time series parameters for a moving average MES model. It is demonstrated that the Hopfield network is capable of generating the same time series parameters as those produced by the conventional sequential least squares (SLS) algorithm. Furthermore, it can be extended to applications utilizing larger amounts of data, and possibly to higher order time series models, without significant degradation in computational efficiency. The second neural network implementation involves using a two-layer perceptron for classifying a single site MES based on two features, specifically the first time series parameter, and the signal power. Using these features, the perceptron is trained to distinguish between four separate arm functions. The two-dimensional decision boundaries used by the perceptron classifier are delineated. It is also demonstrated that the perceptron is able to rapidly compensate for variations when new data are incorporated into the training set. This adaptive quality suggests that perceptrons may provide a useful tool for future MES analysis.
Fractality of profit landscapes and validation of time series models for stock prices
NASA Astrophysics Data System (ADS)
Yi, Il Gu; Oh, Gabjin; Kim, Beom Jun
2013-08-01
We apply a simple trading strategy to various time series of real and artificial stock prices to understand the origin of the fractality observed in the resulting profit landscapes. The strategy contains only two parameters p and q, and the sell (buy) decision is made when the log return is larger (smaller) than p (-q). We discretize the unit square (p,q) ∈ [0,1] × [0,1] into an N × N square grid and the profit Π(p,q) is calculated at the center of each cell. We confirm the previous finding that local maxima in profit landscapes are scattered in a fractal-like fashion: the number M of local maxima follows the power-law form M ˜ Na, but the scaling exponent a is found to differ for different time series. From comparisons of real and artificial stock prices, we find that the fat-tailed return distribution is closely related to the exponent a ≈ 1.6 observed for real stock markets. We suggest that the fractality of the profit landscape characterized by a ≈ 1.6 can be a useful measure to validate time series models for stock prices.
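A sketch of the strategy and the landscape grid is below; series is a hypothetical 1-D array of log prices, and details beyond the abstract (long-only positions, entering on the buy signal and exiting on the sell signal) are assumptions.

```python
import numpy as np

def profit(log_price, p, q):
    """Buy when the one-step log return drops below -q, sell when it
    exceeds p; accumulate log returns while holding (long-only assumed)."""
    holding, pnl = False, 0.0
    for ret in np.diff(log_price):
        if holding:
            pnl += ret
        if not holding and ret < -q:
            holding = True
        elif holding and ret > p:
            holding = False
    return pnl

N = 50                                    # N x N grid over the unit square
centres = (np.arange(N) + 0.5) / N        # cell-centre values of p and q
# series: hypothetical log-price array
landscape = np.array([[profit(series, p, q) for q in centres] for p in centres])
# counting local maxima of `landscape` versus N then estimates the exponent a
```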
Cross-sample entropy of foreign exchange time series
NASA Astrophysics Data System (ADS)
Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao
2010-11-01
The correlation of foreign exchange rates in currency markets is investigated based on empirical data for DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for the period from 1995 to 2002. The cross-SampEn (cross-sample entropy) method is used to compare the returns of every two exchange rate time series to assess their degree of asynchrony. The method for calculating the confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for every two of the exchange rate time series in the periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of every two of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before it. Comparison with the correlation coefficient shows that cross-SampEn is superior in describing the correlation between time series.
NASA Astrophysics Data System (ADS)
Dash, Y.; Mishra, S. K.; Panigrahi, B. K.
2017-12-01
Prediction of northeast/post-monsoon rainfall, which occurs during October, November and December (OND) over the Indian peninsula, is a challenging task due to the dynamic nature of the uncertain, chaotic climate. It is imperative to elucidate this issue by examining the performance of different machine learning (ML) approaches. The prime objective of this research is to compare a) statistical prediction using historical rainfall observations and global atmosphere-ocean predictors like Sea Surface Temperature (SST) and Sea Level Pressure (SLP) and b) empirical prediction based on a time series analysis of past rainfall data without using any other predictors. Initially, ML techniques were applied to SST and SLP data (1948-2014) obtained from the NCEP/NCAR reanalysis monthly means provided by the NOAA ESRL PSD. Later, this study investigated the applicability of ML methods using the OND rainfall time series for 1948-2014 and forecasted up to 2018. The predicted values of the aforementioned methods were verified using observed time series data collected from the Indian Institute of Tropical Meteorology, and the results revealed good performance of the ML algorithms with minimal error scores. Thus, both statistical and empirical methods are found useful for long-range climatic projections.
Reconstruction of network topology using status-time-series data
NASA Astrophysics Data System (ADS)
Pandey, Pradumn Kumar; Badarla, Venkataramana
2018-01-01
Uncovering the heterogeneous connection pattern of a networked system from available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and is known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. The dependency between the diffusion dynamics and the structure of the network can be utilized to retrieve the connection pattern from the diffusion data. Information about the network structure can help to devise control of the dynamics on the network. In this paper, we consider the problem of network reconstruction from the available status-time-series (STS) data using matrix analysis. The proposed method of network reconstruction from STS data is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. The high accuracy and efficiency of the proposed reconstruction procedure define the novelty of the method. Our proposed method outperforms a compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same procedure of network reconstruction is applied to weighted networks, where the ordering of the edges is identified with high accuracy.
NASA Astrophysics Data System (ADS)
Sharma, A. K.; Hubert-Moy, L.; Betbederet, J.; Ruiz, L.; Sekhar, M.; Corgne, S.
2016-08-01
Monitoring land use and land cover, and more particularly irrigated cropland dynamics, is of great importance for water resources management and land use planning. The objective of this study was to evaluate the combined use of multi-temporal optical and radar data with high spatial resolution in order to improve the precision of irrigated crop identification by taking into account information on crop phenological stages. SAR and optical parameters were derived from time series of seven quad-pol RADARSAT-2 and four Landsat-8 images acquired over the Berambadi catchment, South India, during the monsoon crop season at the growth stages of the turmeric crop. To select the best parameters for discriminating turmeric crops, an analysis of covariance (ANCOVA) was applied to all the time-series parameters, and the most discriminant ones were classified using the Support Vector Machine (SVM) technique. Results show that in the absence of optical images, polarimetric parameters derived from SAR time series can be used for turmeric area estimates, and that the combined use of SAR and optical parameters can improve the classification accuracy for identifying turmeric.
37 CFR 1.78 - Claiming benefit of earlier filing date and cross-references to other applications.
Code of Federal Regulations, 2012 CFR
2012-07-01
... such prior-filed application, identifying it by application number (consisting of the series code and.... These time periods are not extendable. Except as provided in paragraph (a)(3) of this section, the... application. The time periods in this paragraph do not apply if the later-filed application is: (A) An...
A method to predict streamflow for ungauged basins of the Mid-Atlantic Region, USA was applied to the Rappahannock watershed in Virginia, USA. The method separates streamflow time series into magnitude and time sequence components. It uses the regionalized flow duration curve (RF...
Anwar, A R; Muthalib, M; Perrey, S; Galka, A; Granert, O; Wolff, S; Deuschl, G; Raethjen, J; Heute, U; Muthuraman, M
2012-01-01
Directionality analysis of signals originating from different parts of the brain during motor tasks has gained a lot of interest. Since brain activity can be recorded over time, methods of time series analysis can be applied to medical time series as well. Granger causality is a method for finding a causal relationship between time series. Such causality can be referred to as a directional connection and is not necessarily bidirectional. The aim of this study is to differentiate between different motor tasks on the basis of activation maps and also to understand the nature of the connections present between different parts of the brain. In this paper, three different motor tasks (finger tapping, simple finger sequencing, and complex finger sequencing) are analyzed. Time series for each task were extracted from functional magnetic resonance imaging (fMRI) data, which have very good spatial resolution and can look into the sub-cortical regions of the brain. Activation maps based on fMRI images show that, in the case of complex finger sequencing, most parts of the brain are active, unlike finger tapping, during which only limited regions show activity. Directionality analysis of time series extracted from the contralateral motor cortex (CMC), supplementary motor area (SMA), and cerebellum (CER) shows bidirectional connections between these parts of the brain. In the cases of simple and complex finger sequencing, the strongest connections originate from the SMA and CMC, while connections originating from the CER in either direction are the weakest in magnitude during all paradigms.
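A minimal example of one pairwise test using statsmodels, with cmc and sma standing in for region-averaged fMRI time series (hypothetical arrays); the column order follows the statsmodels convention that the second column is tested as a Granger cause of the first, and maxlag=3 is an illustrative choice.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# cmc, sma: hypothetical 1-D arrays of region-averaged fMRI signals
data = np.column_stack([cmc, sma])        # tests: does SMA Granger-cause CMC?
results = grangercausalitytests(data, maxlag=3)
p_values = {lag: res[0]["ssr_ftest"][1] for lag, res in results.items()}
```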
A Four-Stage Hybrid Model for Hydrological Time Series Forecasting
Di, Chongli; Yang, Xiaohua; Wang, Xiaochao
2014-01-01
Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782
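A sketch of the first two stages using the PyEMD package (an assumed dependency, published on PyPI as EMD-signal); runoff is a hypothetical 1-D hydrological series, dropping the highest-frequency IMF is a simplistic stand-in for the paper's denoising step, and the RBFNN and LNN stages are omitted.

```python
import numpy as np
from PyEMD import EMD, EEMD    # assumed dependency: the EMD-signal package

s = np.asarray(runoff, dtype=float)     # runoff: hypothetical 1-D series

# stage 1 (denoising): plain EMD, then discard the highest-frequency IMF
imfs = EMD().emd(s)
denoised = s - imfs[0]                  # simplistic noise removal, for illustration

# stage 2 (decomposition): EEMD of the denoised series into IMF components
eemd_imfs = EEMD(trials=100).eemd(denoised)
# stage 3 would fit one RBF network per row of eemd_imfs; stage 4 would
# combine the component forecasts with a linear neural network
```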
Mariani, Luigi; Zavatti, Franco
2017-09-01
The spectral periods in the North Atlantic Oscillation (NAO), Atlantic Multidecadal Oscillation (AMO) and El Niño Southern Oscillation (ENSO) were analyzed, and it was verified how they imprint a time series of European temperature anomalies (ETA), two European temperature time series and some phenological series (dates of cherry flowering and grapevine harvest). This work had as its reference scenario the linear causal chain MCTP (Macroscale Circulation→Temperature→Phenology of crops) that links oceanic and atmospheric circulation to surface air temperature, which in its turn determines the earliness of appearance of phenological phases of plants. Results show that the three segments of the MCTP causal chain contain cycles with the following central periods in years (the percentage of the 12 analyzed time series showing these cycles is given in brackets): 65 (58%), 24 (58%), 20.5 (58%), 13.5 (50%), 11.5 (58%), 7.7 (75%), 5.5 (58%), 4.1 (58%), 3 (50%), 2.4 (67%). A comparison with short-term spectral peaks of the four El Niño regions (nino1+2, nino3, nino3.4 and nino4) shows that 10 of the 12 series are imprinted by periods around 2.3-2.4 years, while 50-58% of the series are imprinted by El Niño periods of 4-4.2, 3.8-3.9 and 3-3.1 years. The analysis highlights the links among physical and biological variables of the climate system at scales that range from macroscale to microscale, whose knowledge is crucial to reach a suitable understanding of ecosystem behavior. The spectral analysis was also applied to a time series of spring-summer precipitation in order to evaluate the presence of peaks common with the other 12 selected series, with substantially negative results, which leads us to rule out the existence of a linear causal chain MCPP (Macroscale Circulation→Precipitation→Phenology). Copyright © 2017 Elsevier B.V. All rights reserved.
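A crude stand-in for the spectral step, assuming an annual series: linearly detrend, take the FFT periodogram, and read off the peak periods. This is only a schematic of period detection, not the specific spectral method used in the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def spectral_periods(x, dt=1.0):
    """Dominant periods (in units of dt) from the FFT periodogram of a
    linearly detrended series."""
    x = np.asarray(x, dtype=float)
    idx = np.arange(x.size)
    x = x - np.polyval(np.polyfit(idx, x, 1), idx)    # remove linear trend
    power = np.abs(np.fft.rfft(x)) ** 2
    freq = np.fft.rfftfreq(x.size, d=dt)
    peaks, _ = find_peaks(power[1:])                  # skip the zero frequency
    return sorted(1.0 / freq[peaks + 1], reverse=True)
```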
Briët, Olivier J T; Amerasinghe, Priyanie H; Vounatsou, Penelope
2013-01-01
With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring the impact of interventions, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during "consolidation" and "pre-elimination" phases. Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non-Gaussian, non-stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than Gaussian methods and may be more suitable when counts are low.
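The full Bayesian GSARIMA machinery is involved, but the observation-driven idea can be caricatured in a few lines: regress counts on seasonal harmonics plus a lagged log-count term under a negative-binomial likelihood. The sketch below (synthetic data, statsmodels GLM) is a deliberate simplification of the models described above, not a reimplementation of them.

```python
# Observation-driven caricature of a GARMA-type count model:
# negative-binomial regression of monthly cases on seasonal harmonics
# and a lagged log-count (autoregressive) term. Data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 120                                            # ten years of monthly counts
month = np.arange(n)
lam = np.exp(1.5 + 0.8 * np.sin(2 * np.pi * month / 12) - 0.01 * month)
cases = rng.poisson(lam)                           # stand-in for reported cases

y = cases[1:]
X = sm.add_constant(np.column_stack([
    np.sin(2 * np.pi * month[1:] / 12),            # seasonality
    np.cos(2 * np.pi * month[1:] / 12),
    np.log1p(cases[:-1]),                          # autoregressive feedback
]))

fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(fit.params)                                  # harmonic and feedback effects
```

The lagged `log1p` term is what makes the model observation-driven: the conditional mean responds to the previous observed count rather than to a latent state.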
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and that the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than existing time series causality tools. A practicing analyst can find many proposals in the literature for identifying drivers and causal connections in time series data sets, but little research exists on how these tools compare to each other in practice. This work introduces and defines exploratory causal analysis (ECA) to address this issue, along with the concept of data causality in the taxonomy of causal studies introduced in this work. The motivation is to provide a framework for exploring potential causal structures in time series data sets. ECA is applied to several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and with intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise.
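For readers unfamiliar with CCM, the method PAI extends, a bare-bones sketch follows: time-delay embed one series to reconstruct a shadow manifold, then test how well the manifold's nearest neighbours cross-map the other series. The embedding parameters, coupling strength and logistic-map test system below are illustrative assumptions, not the thesis's configuration.

```python
# Minimal convergent cross-mapping (CCM) sketch on coupled logistic maps.
import numpy as np

def embed(x, dim=3, tau=1):
    """Time-delay embedding; rows are points on the shadow manifold."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def ccm_skill(x, y, dim=3, tau=1):
    """Correlation between x and its cross-mapped estimate from y's manifold."""
    My = embed(y, dim, tau)
    target = x[(dim - 1) * tau:]
    est = np.empty_like(target)
    for i, point in enumerate(My):
        d = np.linalg.norm(My - point, axis=1)
        d[i] = np.inf                              # exclude the point itself
        nn = np.argsort(d)[: dim + 1]              # dim+1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))  # distance-based weights
        est[i] = np.dot(w, target[nn]) / w.sum()
    return np.corrcoef(target, est)[0, 1]

# Unidirectionally coupled logistic maps: x drives y, never the reverse.
steps = 500
x, y = np.empty(steps), np.empty(steps)
x[0], y[0] = 0.4, 0.2
for t in range(steps - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.1 * x[t])

# If x drives y, y's manifold should reconstruct x well (high skill).
print("x from M_y:", round(ccm_skill(x, y), 3))
print("y from M_x:", round(ccm_skill(y, x), 3))
```

The counter-intuitive behaviour described in the abstract arises precisely from inferences of this kind, which is what motivates PAI and the leaning as alternatives.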
NASA Astrophysics Data System (ADS)
Wang, Guochao; Wang, Jun
2017-01-01
We investigate the fluctuation behaviors of financial volatility duration dynamics. A new concept of volatility two-component range intensity (VTRI) is developed, which combines the maximal variation range of volatility intensity and the shortest passage time of duration, and can quantify the investment risk in financial markets. To study and describe the nonlinear complex properties of VTRI, a random agent-based financial price model is developed from the finite-range interacting biased voter system. The autocorrelation behaviors and the power-law scaling behaviors of the return time series and the VTRI series are investigated. Then, the complexity of the VTRI series of the real markets and of the proposed model is analyzed by fuzzy entropy (FuzzyEn) and Lempel-Ziv complexity. In this process, we apply cross-fuzzy entropy (C-FuzzyEn) to study the asynchrony of pairs of VTRI series. The empirical results reveal that the proposed model exhibits complex behaviors similar to those of the actual markets and indicate that the proposed stock VTRI series analysis and the financial model are meaningful and feasible to some extent.
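Of the complexity measures named above, Lempel-Ziv complexity is the simplest to sketch. The snippet below counts distinct phrases in an LZ78-style parsing of a binarized series; the median-threshold coarse-graining and the synthetic stand-in for a return/VTRI series are illustrative assumptions (published LZ variants differ in parsing details).

```python
# LZ78-style distinct-phrase count as a Lempel-Ziv complexity estimate
# for a binarized series. The input series is synthetic and illustrative.
import numpy as np

def lz78_phrase_count(bits):
    """Count distinct phrases produced by an LZ78-style parsing."""
    s = "".join(map(str, bits))
    i, count, seen = 0, 0, set()
    while i < len(s):
        j = i + 1
        while j <= len(s) and s[i:j] in seen:
            j += 1                                 # extend until phrase is new
        seen.add(s[i:j])
        count += 1
        i = j
    return count

rng = np.random.default_rng(3)
returns = rng.standard_normal(2000)                # stand-in for a return series
bits = (returns > np.median(returns)).astype(int)  # coarse-grain to 0/1 symbols
c, n = lz78_phrase_count(bits), len(bits)
print("normalized LZ complexity:", round(c * np.log2(n) / n, 3))  # ~1 if random
```

Values well below 1 indicate repetitive, compressible dynamics; fully random sequences approach 1 under this normalization.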
NASA Astrophysics Data System (ADS)
Gill, L. W.; Naughton, O.; Johnston, P. M.; Basu, B.; Ghosh, B.
2013-08-01
This research used continuous water level measurements from five groundwater-fed lakes (or turloughs) in a linked lowland karst network of south Galway in Ireland over a 3-year period in order to elucidate the hydrogeological controls and conduit configurations forming the flooded karstic hydraulic system beneath the ground. The main spring outflow from this network discharges below mean sea level, making it difficult to determine the hydraulic nature of the network using traditional rainfall-spring flow cross-correlation analysis, as has been done in many other studies on karst systems. However, the localised groundwater-surface water interactions (the turloughs) in this flooded lowland karst system can yield information about the nature of the hydraulic connections beneath the ground. Several analytical techniques were applied to the fluctuating turlough water level time series in order to determine the nature of the linkage between the turloughs, as well as the hydraulic pipe configurations at key points, so as to improve the conceptual model of the overall karst network. Initially, simple cross correlations between the different turlough water levels were carried out at different time lags. Frequency analysis of the signals was then carried out using the fast Fourier transform, and both discrete and continuous wavelet analyses were applied to the data sets to characterise these inherently non-stationary time series of fluctuating water levels. The analysis has indicated which turloughs are on the main line conduit system and which are somewhat off-line, and the relative size of the main conduit in the network, including evidence of localised constrictions, as well as clearly showing the tidal influence on the water levels in the three lower turloughs at shallow depths ∼8 km from the main spring outfall at the sea. It has also indicated that the timing of high rainfall events coincident with maximum spring tide levels may promote more consistent, long-duration flooding of the turloughs throughout the winter.
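The first analysis step described above, cross-correlating water-level series at different time lags, is simple enough to sketch; the synthetic hourly levels and the 36-hour routing delay below are assumptions for illustration (the wavelet steps would need an additional library such as PyWavelets).

```python
# Lagged cross-correlation between two water-level series to find the
# delay at which one turlough's level best tracks another's. Series are
# synthetic stand-ins, not the Galway field data.
import numpy as np

rng = np.random.default_rng(4)
n, true_lag = 1000, 36                             # hourly samples, 36 h delay
upstream = np.cumsum(rng.standard_normal(n))       # random-walk stand-in level
downstream = np.roll(upstream, true_lag) + 0.2 * rng.standard_normal(n)

def lagged_corr(a, b, max_lag):
    """Pearson correlation of a against b delayed by each lag in [0, max_lag]."""
    m = len(a)
    return np.array([np.corrcoef(a[: m - lag], b[lag:])[0, 1]
                     for lag in range(max_lag + 1)])

r = lagged_corr(upstream, downstream, max_lag=72)
print("best lag:", int(np.argmax(r)), "h, r =", round(float(r.max()), 3))
```

The lag that maximizes the correlation is a crude estimate of the hydraulic travel time between the two lakes, which is the kind of evidence used above to place turloughs on or off the main conduit line.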